Is there a form of blob (block, page, etc.) that will allow this using the C# API? Machine X could be uploading a file to an Azure blob endpoint while machine Y reads the file in real time. It seems to me that a block blob won't work, because you need to put the block list before you can query the HTTP endpoint for it, but is there a way to query for uncommitted blocks and download those beforehand?
An example of this in practice: a user machine does a handshake with the server and gets a write shared access token and permission to upload the file. Client #1 begins uploading; now say a second client machine requests the file from the server before client #1 has finished the upload. In this case, client #2 gets the relevant details from the server along with a read-only shared access token, and then begins to read the file even though the upload has not finished.
With block blobs, I don't think it is possible to start downloading the blob while it is still being uploaded, simply because no blob exists yet at that point. When you upload blocks for a blob, Azure Storage just stores the byte chunks someplace.
It is only when you commit the block list that Azure Storage creates a block blob, arranging the byte chunks according to the payload of the commit block list operation. Even though Azure Storage lets you see the block list before it is committed, it doesn't expose any API to read the contents of a block.
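For illustration, here is a minimal sketch with the classic WindowsAzure.Storage SDK (the one used elsewhere on this page; the connection string, container, and blob names are placeholders) showing that you can enumerate uncommitted blocks but not read them:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

CloudStorageAccount account = CloudStorageAccount.Parse("<connection string>");
CloudBlobClient client = account.CreateCloudBlobClient();
CloudBlockBlob blob = client.GetContainerReference("mycontainer")
                            .GetBlockBlobReference("myblob");

// Get Block List works before the blob is committed; it returns the IDs
// and sizes of the staged (uncommitted) blocks.
foreach (ListBlockItem block in blob.DownloadBlockList(BlockListingFilter.Uncommitted))
{
    Console.WriteLine("{0}: {1} bytes (committed: {2})", block.Name, block.Length, block.Committed);
}

// There is no corresponding "download block" operation; the bytes only
// become readable once the commit (PutBlockList) creates the blob.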
I don't think a page blob is the correct type for your scenario either (same with append blobs), even though the moment you write a page it is committed to the blob, so other callers can get the page ranges and start downloading the data stored in those pages. However, a page blob's size has to be a multiple of 512 bytes, and not all files uploaded in your application would meet this requirement.
We're just getting started with Azure Storage.
In our scenario we upload to private blobs that we later need to access directly from our client app, e.g. images.
Is there a way to address private blobs in Azure Storage with a URL containing the access key?
Sifting through the MS docs, all I could find so far is simple URL access via the blob URI, e.g. as given by the URI property of the CloudBlockBlob instance when listing blobs via the .NET API.
Naturally, accessing this from a web browser fails because the blob is not public.
However, can we qualify the URL to also include the access key, in order to allow authorized clients to access the blob?
You can generate a SAS URL and token for the private blob. Here's the process for generating this manually in the Azure portal, to test the concept. It will work even if your storage container is private, as it allows temporary, time-limited access to the file using a URL that contains a token in its query string.
Click on your file within the storage container, select the 'Generate SAS' tab, and set the options (permissions, start and expiry time) in the right pane.
This will generate a token, and a URL that includes the token.
You can test downloading the URL as a file by using curl. Use the generated URL that includes the full token and other parameters in the query string, then do this (IMPORTANT: the URL must be in double quotes):
curl "<YOUR_URL>" --output myFileName.txt
Tip: this is also a good method for making files available to an Azure VM. If you need to install a file directly on the VM for any reason (I needed to do this to install an SSL certificate), you can generate the URL and then use curl to download the file on the VM itself, e.g. connect to the VM first with Bastion or SSH, then use curl to download the file somewhere.
This is the API for how you read blobs from storage:
https://learn.microsoft.com/en-us/rest/api/storageservices/get-blob
There is no URL parameter for passing the access key, only the Authorization header value. So you could make the request manually and, e.g., add the resulting data as a base64-encoded image. I would advise against it if at all possible.
You must also be aware that by passing your access key to the client, you are effectively making your blob public anyway. In fact, you would be putting your data at more risk than with anonymous access, since the access key allows more operations than anonymous access does. This also holds true for your Objective-C app, even though the key is much more obfuscated there. SAS is the way to go: create a backend service that creates a defined set of SAS tokens for given resources. It is, however, more effort than simply obfuscating the full access key somewhere.
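As a sketch of that backend piece, assuming the classic SDK and placeholder names and lifetimes, the server could hand out something like this instead of the account key:

using System;
using Microsoft.WindowsAzure.Storage.Blob;

// Hypothetical helper: returns a read-only URL for one blob that expires
// after 15 minutes, so the account key never leaves the server.
public static string GetReadOnlySasUrl(CloudBlobContainer container, string blobName)
{
    CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
    var policy = new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(15) // placeholder lifetime
    };
    return blob.Uri + blob.GetSharedAccessSignature(policy); // the token starts with '?'
}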
See "Features available to anonymous users":
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-manage-access-to-resources
Good morning,
I'm trying to implement Azure Blob Storage for the first time, using their example code. However, my app is throwing a very broad 400 Bad Request error when calling UploadFromStream().
I have done a bunch of searching on this issue. Almost everything I have come across identifies naming conventions of the container or blob as the culprit. This is NOT my issue; I'm using all lowercase, etc.
My code is no different from their example code:
The connection string:
<add key="StorageConnectionString" value="DefaultEndpointsProtocol=https;AccountName=xxxx;AccountKey=xxxxxx;EndpointSuffix=core.windows.net" />
And the code:
// Retrieve storage account from connection string
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
// Create the blob client
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve reference to a previously created container
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
// Retrieve reference to a blob named "myblob"
CloudBlockBlob blob = container.GetBlockBlobReference("myblob");
// Create the container if it doesn't already exist
container.CreateIfNotExists();
// Create or overwrite the "myblob" blob with contents from a local file.
using (var fileStream = System.IO.File.OpenRead(@"D:\Files\logo.png"))
{
blob.UploadFromStream(fileStream);
}
Here are the exception details:
This is all I have to go on. The only other thing I can think of is that I'm running this in my development environment with HTTP, not HTTPS. Not sure if this might be an issue?
EDIT:
Additionally, when attempting to upload a file directly in the Azure portal to the container, I receive a
Validation error for TestAzureFileUpload.txt. Details: "The page blob size must be aligned to a 512-byte boundary. The current file size is 56."
Could this be related to my issue? Am I missing some setting here?
I know I do not have enough to go on here for anyone to identify the exact issue, but I am hoping that someone can at least point me in the right direction to resolve this.
Any help would be appreciated
I used a Premium storage account to test the code and got the same "400 Bad Request" as you.
From the exception details, you can see the "Block blobs are not supported" message.
Here is an image of my exception details
To solve your problem, I think you should know the difference between block blobs and page blobs.
Block blobs are comprised of blocks, each of which is identified by a block ID. You create or modify a block blob by writing a set of blocks and committing them by their block IDs. They hold the discrete storage objects, like jpg, txt, and log files, that you'd typically view as files in your local OS. They are supported by standard storage accounts only.
Page blobs are a collection of 512-byte pages optimized for random read and write operations, such as VHDs. To create a page blob, you initialize it and specify the maximum size it will grow to. Page blobs are really designed for Azure Virtual Machine disks. They are supported by both standard and Premium storage accounts.
You are using Premium Storage, which is currently available only for storing data on disks used by Azure Virtual Machines, which is why block blob uploads are rejected.
So my suggestion is:
If you want your application to support streaming and random access scenarios, and to be able to access application data from anywhere, use block blobs with a standard account.
If you want to lift and shift applications that use native file system APIs to read and write data to persistent disks, or to store data that is not required to be accessed from outside the virtual machine to which the disk is attached, use page blobs.
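To make the distinction concrete, here is a minimal sketch with the classic SDK (container and file names are placeholders):

// Block blob: standard accounts only; any size per upload.
CloudBlockBlob blockBlob = container.GetBlockBlobReference("logo.png");
using (var stream = System.IO.File.OpenRead(@"D:\Files\logo.png"))
{
    blockBlob.UploadFromStream(stream);
}

// Page blob: works on Premium too, but the size is fixed at creation
// and must be aligned to 512 bytes.
CloudPageBlob pageBlob = container.GetPageBlobReference("disk.vhd");
pageBlob.Create(512 * 1024); // 512 KB, a multiple of the 512-byte page size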
Reference link:
Understanding Block Blobs, Append Blobs, and Page Blobs
I have this specific scenario:
1. The user sends me a request which contains a URL to a file in my private repository.
2. The server side catches this request and downloads the file.
3. The server makes some calculation on the downloaded file.
4. The server sends the results back to the client.
I implemented this in the naive way, meaning I download the file (step 2) for each request. In most cases, the user will send the same file, so I thought about a better approach: keep the downloaded file in a short-term "cache".
This means I download the item once and use it for every subsequent user request.
Now the question is: how do I manage those files?
In a perfect world, I would use the downloaded file for up to 30 minutes and not touch it after that. So the candidate solutions are:
Building a file system mechanism to handle files with short lifetimes. Negative: a complex solution.
Using the temporary directory for this job (e.g. Path.GetTempFileName()). Negative: what if the system starts to delete those files in the middle of reading one?
So it seems that each solution has downsides. What do you recommend?
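For what it's worth, here is a minimal sketch of the temp-file idea built on System.Runtime.Caching.MemoryCache, with expiry-driven cleanup. All names are hypothetical, and real code would still have to guard against deleting a file that a reader is in the middle of using:

using System;
using System.IO;
using System.Runtime.Caching; // reference the System.Runtime.Caching assembly

public static class ShortTermFileCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    // Returns a local path for the URL, downloading at most once per 30 minutes.
    public static string GetLocalPath(string url, Func<string, byte[]> download)
    {
        string path = Cache.Get(url) as string;
        if (path != null && File.Exists(path))
            return path;

        path = Path.GetTempFileName();
        File.WriteAllBytes(path, download(url));

        Cache.Set(url, path, new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.UtcNow.AddMinutes(30),
            // Delete the temp file when the entry expires; a production
            // version would need to handle files still being read.
            RemovedCallback = args => File.Delete((string)args.CacheItem.Value)
        });
        return path;
    }
}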
I am trying to find a simple way to list log files which I am storing in an Azure Blob Container, so developers and admins can easily get to dev log information. I am following the information in this API doc https://msdn.microsoft.com/en-us/library/dd135734.aspx but when I go to
https://-my-storage-url-.blob.core.windows.net/dev?comp=list&include={snapshots,metadata,uncommittedblobs,copy}&maxresults=1000
I see one file listed, which is a block blob, but the log files I have generated, which are of type append blob, are not showing. How can I construct this API call to include append blobs?
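If it helps to cross-check from code, the classic .NET SDK lists append blobs alongside block blobs. A sketch follows; the container name is taken from your URL, blobClient is assumed to be an authenticated CloudBlobClient, and the rest are placeholders:

CloudBlobContainer container = blobClient.GetContainerReference("dev");
BlobContinuationToken token = null;
do
{
    BlobResultSegment segment = container.ListBlobsSegmented(
        prefix: null,
        useFlatBlobListing: true,
        blobListingDetails: BlobListingDetails.All, // snapshots, metadata, uncommitted blobs, copy status
        maxResults: 1000,
        currentToken: token,
        options: null,
        operationContext: null);

    foreach (IListBlobItem item in segment.Results)
    {
        // Append blobs come back as CloudAppendBlob instances.
        var appendBlob = item as CloudAppendBlob;
        if (appendBlob != null)
            Console.WriteLine(appendBlob.Name);
    }
    token = segment.ContinuationToken;
} while (token != null);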
I'm having an issue within my application, Pelotonics. When a user downloads a file, the system seems to block all incoming requests until that file is done downloading. What is the proper technique to open a download dialog box (standard from the browser), let the user start downloading the file, and then let the user continue using the application while the file downloads?
The way we're getting the file from the server: we have a separate ASPX page that gets passed a value through the query string, then retrieves the stream of the file from the server. I add the "content-disposition" header to the Response, then loop through the file's stream and write 2 KB chunks to Response.OutputStream. Once that's done, I call Response.End.
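For reference, here is a bare-bones sketch of a handler like the one described. The parameter name, path, and content type are assumptions, and there is no input validation:

// Hypothetical download page code-behind: streams a file in 2 KB chunks.
protected void Page_Load(object sender, EventArgs e)
{
    string fileName = Request.QueryString["file"]; // assumed parameter name
    string path = Server.MapPath("~/Files/" + fileName); // illustration only; validate in real code

    Response.Clear();
    Response.ContentType = "application/octet-stream";
    Response.AddHeader("Content-Disposition", "attachment; filename=" + fileName);

    byte[] buffer = new byte[2048];
    using (var stream = System.IO.File.OpenRead(path))
    {
        int read;
        while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
        {
            Response.OutputStream.Write(buffer, 0, read);
            Response.Flush();
        }
    }
    Response.End();
}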
Watch this for a quick screencast on the issue:
http://www.screencast.com/users/PeloCast/folders/Jing/media/8bb4b1dd-ac66-4f84-a1a3-7fc64cd650c0
by the way, we're in ASP.NET and C#...
Thanks!!!
Daniel
I think ASP.NET allows one simultaneous page execution per session while session state is held; the usual way around the lock is to mark the page's session state as read-only or disabled (EnableSessionState in the @ Page directive).
This is not a very pretty workaround, but it might help if you rewrote the ASP.NET_SessionId value in the request cookie in Application_BeginRequest (in global.asax). Of course, you would then need to do the authentication some other way. I haven't tried this, though.
Another way would be launching a separate thread for the download process, but you would need to find a way to do this without the worker thread closing its resources.
May I ask, is there a reason why you don't just use HttpResponse.TransmitFile?
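For comparison, here is a minimal sketch of that approach (the path is hypothetical). TransmitFile hands the file to IIS without buffering it through your code, so the worker thread is not tied up copying chunks:

Response.Clear();
Response.ContentType = "application/octet-stream";
Response.AddHeader("Content-Disposition", "attachment; filename=report.pdf");
Response.TransmitFile(Server.MapPath("~/Files/report.pdf")); // hypothetical path
Response.End();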