Good morning,
I'm trying to implement Azure Blob Storage for the first time using the example code they provide. However, my app is throwing a very broad 400 Bad Request error when calling UploadFromStream().
I have done a bunch of searching on this issue. Almost everything I have come across identifies the naming conventions of the container or blob as the issue. This is NOT my issue; I'm using all lowercase, etc.
My code is no different from their example code:
The connection string:
<add key="StorageConnectionString" value="DefaultEndpointsProtocol=https;AccountName=xxxx;AccountKey=xxxxxx;EndpointSuffix=core.windows.net" />
And the code:
// Retrieve storage account from connection string
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
// Create the blob client
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve reference to a previously created container
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
// Retrieve reference to a blob named "myblob"
CloudBlockBlob blob = container.GetBlockBlobReference("myblob");
// Create the container if it doesn't already exist
container.CreateIfNotExists();
// Create or overwrite the "myblob" blob with contents from a local file.
using (var fileStream = System.IO.File.OpenRead(@"D:\Files\logo.png"))
{
    blob.UploadFromStream(fileStream);
}
Here are the exception details:
This is all I have to go on. The only other thing I can think of is that I'm running this in my development environment over HTTP, not HTTPS. Not sure if this might be an issue?
EDIT:
Additionally, when attempting to upload a file directly to the container in the Azure portal, I receive a
Validation error for TestAzureFileUpload.txt. Details: "The page blob
size must be aligned to a 512-byte boundary. The current file size is
56."
Could this be related to my issue? Am I missing some setting here?
I know I do not have enough to go on here for anyone to help me identify the exact issue, but I am hoping that someone can at least point me in the right direction to resolve this.
Any help would be appreciated.
I used a Premium storage account to test the code and got the same "400 Bad Request" as you.
From the exception details, you can see the "Block blobs are not supported" message.
Here is an image of my exception details:
To solve your problem, I think you should understand the difference between block blobs and page blobs.
Block blobs are comprised of blocks, each of which is identified by a block ID. You create or modify a block blob by writing a set of blocks and committing them by their block IDs. They are intended for discrete storage objects such as jpg, txt, and log files, the kind of thing you'd typically view as a file in your local OS. They are supported by standard storage accounts only.
Page blobs are a collection of 512-byte pages optimized for random read and write operations, such as VHDs. To create a page blob, you initialize it and specify the maximum size it will grow to. Page blobs are designed for Azure Virtual Machine disks and are supported by both standard and Premium storage accounts.
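To make the 512-byte constraint concrete, here is a minimal sketch (using the same legacy Microsoft.WindowsAzure.Storage SDK and the container variable from your snippet; the blob name and padding logic are made up for illustration) of writing data to a page blob:
// Sketch: page blob sizes and writes must be aligned to 512-byte boundaries,
// which is exactly the constraint behind the portal's validation error.
CloudPageBlob pageBlob = container.GetPageBlobReference("mypageblob");

// Pad the payload up to the next 512-byte boundary.
byte[] data = System.IO.File.ReadAllBytes(@"D:\Files\logo.png");
int paddedLength = ((data.Length + 511) / 512) * 512;
byte[] padded = new byte[paddedLength];
Array.Copy(data, padded, data.Length);

// The blob's total size must be a multiple of 512 bytes.
pageBlob.Create(paddedLength);

using (var stream = new System.IO.MemoryStream(padded))
{
    // Each write must start at a 512-byte offset and cover whole pages.
    pageBlob.WritePages(stream, 0);
}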
Since you are using Premium Storage, which is currently available only for storing data on disks used by Azure Virtual Machines, only page blobs are supported and your block blob upload is rejected.
So my suggestion is:
If you want your application to support streaming and random access scenarios, and to be able to access application data from anywhere, use block blobs with a standard storage account.
If you want to lift and shift applications that use native file system APIs to read and write data to persistent disks, or you want to store data that does not need to be accessed from outside the virtual machine to which the disk is attached, use page blobs.
Reference link:
Understanding Block Blobs, Append Blobs, and Page Blobs
Related
I've looked around but can't seem to find an answer anywhere in the docs, and the IntelliSense documentation for these APIs is nearly identical.
What is the difference between an Azure Storage BlockBlobClient and a BlobClient in the Azure Storage v12 SDK?
Which one should I be using for efficiently uploading file streams to Azure Blob Storage using the Azure Storage v12 .NET SDK?
Is there any difference between these two bits of code and how they get files up to the cloud?
var container = _blobServiceClient.GetBlobContainerClient(containerName);
var blobClient = container.GetBlobClient(filename); // this?
var blockBlockClient = container.GetBlockBlobClient(filename); // or this?
Azure Blob Storage supports three kinds of blobs - Block, Page and Append. While a lot of operations are common for all of these blobs (like delete, copy, lease etc.), there are some operations which are specific to a blob type (like put block, put block list for block blobs). To see operations specific to a particular blob type, please see this: https://learn.microsoft.com/en-us/rest/api/storageservices/operations-on-blobs.
BlobClient provides functionality which can be used for all kinds of blobs.
However, to deal with functionality only available for a specific kind of blob, you will need to use the client specific to that type, e.g. BlockBlobClient for block blobs, AppendBlobClient for append blobs, and PageBlobClient for page blobs.
BlockBlobClient is a specialization: use it when you know you're dealing only with block blobs. It has some functions that apply only to block blobs. More functionality.
BlobClient is a generic implementation: all of its functions can be safely executed on any blob type (block, append, or page). Less functionality.
In your case you can use either. They both have the functions you will need.
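As a rough illustration (a sketch only; localPath is a made-up variable, and the other names come from your snippet), both clients can push a stream up in one call with the v12 SDK, while BlockBlobClient additionally exposes the block-level operations if you ever need them:
// Requires: using System.IO; using Azure.Storage.Blobs; using Azure.Storage.Blobs.Specialized;
var container = _blobServiceClient.GetBlobContainerClient(containerName);

// Option 1: BlobClient - works for any blob type; Upload creates a block blob.
var blobClient = container.GetBlobClient(filename);
using (FileStream stream = File.OpenRead(localPath))
{
    blobClient.Upload(stream, overwrite: true);
}

// Option 2: BlockBlobClient - same simple upload, plus block-level APIs
// (StageBlock / CommitBlockList) when you need finer-grained control.
var blockBlobClient = container.GetBlockBlobClient(filename);
using (FileStream stream = File.OpenRead(localPath))
{
    blockBlobClient.Upload(stream);
}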
I am using log4net in my web application
We are deploying it via Cloud Services (NOT App Services).
My understanding is that I won't be able to access the log files on disk (and further, these files are not persistent anyways).
Everything I've read points to using Blob storage, but I don't see any example code on how to do this. There is a NuGet package,
https://www.nuget.org/packages/log4net.Appender.Azure
but the documentation says it creates a file for each log entry.
What I want is the RollingLogFile.
Do I basically have to create my own? As in, pull down the log4net source code and write my own appender that logs to a cloud storage account instead of disk? That seems like a lot of work; I would have thought someone would have coded this feature already.
Thanks.
This project shares some samples that store log entries in Azure Blob storage using either AzureBlobAppender or AzureAppendBlobAppender for log4net.
Looking at the code, AzureBlobAppender creates a separate XML file for each log entry in Azure Blob storage, whereas AzureAppendBlobAppender stores all the logs generated in one day in a single log file by calling the CloudAppendBlob.AppendBlock method to append a new block of log data to the end of an existing blob.
If you do not want to create an XML file for each log entry, you could try AzureAppendBlobAppender. Its daily file naming is shown below, and a rough sketch of the underlying append call follows the snippet.
private static string Filename(string directoryName)
{
    return string.Format("{0}/{1}.entry.log.xml",
        directoryName,
        DateTime.Today.ToString("yyyy_MM_dd",
            DateTimeFormatInfo.InvariantInfo));
}
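For context, here is a minimal sketch (not the appender's actual code; the container name, blobClient variable, and log entry are made up) of how an entry ends up appended to the daily blob with the legacy Microsoft.WindowsAzure.Storage SDK:
// Sketch: append one log entry to a daily append blob.
CloudBlobContainer container = blobClient.GetContainerReference("logs");
CloudAppendBlob appendBlob = container.GetAppendBlobReference(Filename("myapp"));

// Create the append blob once; later entries are appended to the same blob.
if (!appendBlob.Exists())
{
    appendBlob.CreateOrReplace();
}

byte[] entry = System.Text.Encoding.UTF8.GetBytes("<entry>something happened</entry>\n");
using (var stream = new System.IO.MemoryStream(entry))
{
    appendBlob.AppendBlock(stream);
}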
I am trying to find a simple way to list log files which I am storing in an Azure Blob Container so developers and admins can easily get to dev log information. I am following the information in this API doc https://msdn.microsoft.com/en-us/library/dd135734.aspx but when I go to
https://-my-storage-url-.blob.core.windows.net/dev?comp=list&nclude={snapshots,metadata,uncommittedblobs,copy}&maxresults=1000
I see one file listed, which is a Block Blob, but the log files I have generated, which are of type Append Blob, are not showing. How can I construct this API call to include Append Blobs?
Is there a form of blob (block, page, etc.) that will allow this using the C# API? Machine X could be uploading a file to an Azure blob endpoint while machine Y reads the file in real time. It seems to me that a block blob won't work, because you need to put the block list before you can query the HTTP endpoint for it, but is there a way to query for uncommitted blocks and download those beforehand?
An example of this in practice: a user machine does a handshake with the server, and gets a write shared access token from the server and permission to upload the file. Client #1 begins uploading; now say a second client machine requests the file from the server before client #1 has finished the upload. In this case, client #2 will get the relevant details from the server and a read-only shared access token, and then begin to read the file even though the upload has not finished yet.
With block blobs, I don't think it is possible to start downloading the blob while it is still being uploaded. This is simply because nothing is stored as a blob at that point: when you upload blocks for a blob, Azure Storage just stores the byte chunks someplace.
It is only when you commit the block list that Azure Storage creates a block blob, arranging the byte chunks according to the payload of the commit block list operation. Even though Azure Storage lets you see the block list before it is committed, it doesn't expose any API to read the contents of an uncommitted block.
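To make the sequence concrete, here is a minimal sketch (legacy Microsoft.WindowsAzure.Storage SDK; the file and blob names are made up) of staging blocks and then committing them; the blob only becomes downloadable after PutBlockList succeeds:
// Sketch: upload a file as staged blocks, then commit the block list.
// Requires: using Microsoft.WindowsAzure.Storage.Blob; using System.Collections.Generic;
CloudBlockBlob blob = container.GetBlockBlobReference("bigfile.bin");
var blockIds = new List<string>();

using (var fileStream = System.IO.File.OpenRead(@"D:\Files\bigfile.bin"))
{
    var buffer = new byte[4 * 1024 * 1024]; // 4 MB blocks
    int bytesRead;
    int blockNumber = 0;

    while ((bytesRead = fileStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        // Block IDs must be base64-encoded and all the same length.
        string blockId = Convert.ToBase64String(
            System.Text.Encoding.UTF8.GetBytes(blockNumber.ToString("d6")));
        blockIds.Add(blockId);

        using (var blockStream = new System.IO.MemoryStream(buffer, 0, bytesRead))
        {
            blob.PutBlock(blockId, blockStream, null); // staged only, not yet readable
        }
        blockNumber++;
    }
}

// Only now does the blob exist as a readable object.
blob.PutBlockList(blockIds);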
I don't think a page blob is the right type of blob for your scenario either (the same goes for append blobs), even though the moment you write a page it is committed to the blob and another caller can get the page ranges and start downloading the data stored in those pages. However, a page blob's size has to be a multiple of 512 bytes, and not all files uploaded by your application would meet this requirement.
I'm confused about how to get the size of a blob in Windows Azure.
In my case, I first get the blob reference with CloudBlockBlob blob = container.GetBlockBlobReference(foo); (here foo is the name of the blob, and I'm sure the blob exists). Then I try to get the blob size with blob.Properties.Length; however, it always returns 0. I set a breakpoint at this statement and inspected the blob object: the URI of the blob is correct, so can I infer that the blob is correctly retrieved? Yet all the fields in Properties are either null or 0. I cannot figure out a solution. Is it because I am currently running the app locally against the Storage Emulator, and will it be OK after deployment?
Thanks and Best Regards.
Call blob.FetchAttributes(). GetBlockBlobReference doesn't actually make any calls to the blob service. It just constructs a local object that represents the blob.
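A minimal sketch (reusing the container and foo from the question) of the call order that should populate the length:
CloudBlockBlob blob = container.GetBlockBlobReference(foo);

// Round trip to the blob service to populate blob.Properties (including Length).
blob.FetchAttributes();

long sizeInBytes = blob.Properties.Length;
Console.WriteLine("Blob size: {0} bytes", sizeInBytes);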
I wrote a blog post on this exact issue about 4 days back:
http://blogs.msdn.com/b/avkashchauhan/archive/2012/04/27/windows-azure-blob-size-return-0-even-when-blob-is-accessible-and-downloaded-without-any-problem.aspx