I am developing a web platform to manage the upload and download of files. The front end is developed in React, the back end in ASP.NET, and the uploaded files are stored in Azure Blob Storage containers.
For the upload, I'm using Microsoft's "Azure Storage Client Library" to send files directly from the client to Azure through SAS authentication. This JavaScript library allows me to update a progress bar during the whole process.
For the download, the process is more complicated: the file is first downloaded from Azure to the server (phase 1) and then from the server to the client (phase 2). Phase 1 creates two problems for me:
I cannot display a progress bar to track its progress;
it can take a long time and, at this stage, the client cannot begin its own download.
To solve these problems I would like one of the following solutions:
download the file directly from the client using the JavaScript library, but in this case it is necessary to rename the file;
create a server-to-client communication channel to implement a progress bar for phase 1.
This is my current C# function that serves the download of a file:
using Microsoft.WindowsAzure.Storage.Blob;
private IActionResult DownloadFile(...) {
    ...
    // Phase 1: the whole blob is buffered into server memory before the
    // client download (phase 2) can even begin.
    using (var blobStream = new MemoryStream()) {
        string blobName = ...
        CloudBlobContainer cloudBlobContainer = ...
        CloudBlockBlob blob = cloudBlobContainer.GetBlockBlobReference(blobName);
        blob.DownloadToStream(blobStream);
        return new FileContentResult(blobStream.ToArray(), "application/pdf");
    }
}
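An alternative I am considering for phase 1 is to stream the blob straight into the HTTP response instead of buffering it first. A sketch (untested; the container and blob name are passed in here purely for illustration, and FileStreamResult disposes the blob stream after the response is written):

private async Task<IActionResult> DownloadFileStreaming(CloudBlobContainer cloudBlobContainer, string blobName)
{
    CloudBlockBlob blob = cloudBlobContainer.GetBlockBlobReference(blobName);
    // OpenReadAsync returns a read-only stream over the blob; the response
    // starts as soon as the first bytes arrive from Azure, so phase 2 no
    // longer has to wait for phase 1 to finish.
    Stream blobStream = await blob.OpenReadAsync();
    return new FileStreamResult(blobStream, "application/pdf")
    {
        FileDownloadName = "document.pdf" // placeholder download name
    };
}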
EDIT:
Below is the code I use to generate the SAS token:
private string GetSasReadToken(string connectionString, string containerName) {
    var storageAccount = CloudStorageAccount.Parse(connectionString);
    CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference(containerName);
    var sasConstraints = new SharedAccessBlobPolicy {
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(60),
        Permissions = SharedAccessBlobPermissions.Read
    };
    // Sign with the policy defined above and return the token.
    var sasContainerToken = cloudBlobContainer.GetSharedAccessSignature(sasConstraints);
    return sasContainerToken;
}
In order to make use of Content-Disposition, you will need to generate the SAS token on a blob (currently you're creating a SAS token on a blob container). Then you will need to make use of SharedAccessBlobHeaders and define the content-disposition value there.
Here's the sample code (untested though):
private string GetSasReadToken(string connectionString, string containerName, string blobName) {
    var storageAccount = CloudStorageAccount.Parse(connectionString);
    CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference(containerName);
    CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(blobName);
    var sasConstraints = new SharedAccessBlobPolicy {
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(60),
        Permissions = SharedAccessBlobPermissions.Read
    };
    // The headers baked into the SAS override the blob's own properties,
    // so the download is served with this Content-Disposition.
    var sasHeaders = new SharedAccessBlobHeaders();
    sasHeaders.ContentDisposition = "attachment;filename=<your-download-file-name>";
    var sasBlobToken = cloudBlockBlob.GetSharedAccessSignature(sasConstraints, sasHeaders);
    return sasBlobToken;
}
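Once you have the token, the client can download directly from Azure by appending it to the blob URI. A usage sketch (account, container, and blob names are placeholders):

// Build the direct-download URL to hand to the React client. Opening it
// triggers a download named per the Content-Disposition header above.
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudBlockBlob blob = account.CreateCloudBlobClient()
    .GetContainerReference(containerName)
    .GetBlockBlobReference(blobName);
string sas = GetSasReadToken(connectionString, containerName, blobName);
string downloadUrl = blob.Uri.AbsoluteUri + sas;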
Related
I use the Azure Blob Storage Client Library v11 for .NET.
I wrote a program that our customers can use to upload files. I generate a URL with a SAS token (valid for x days) for each customer, and the customer can upload files using the program. Here is an example URL:
https://storage.blob.core.windows.net/123456789?sv=2019-07-07&sr=c&si=mypolicy&sig=ASDH845378ddsaSDdase324234234rASDSFR
How can I find out whether the SAS token is still valid before the upload is started?
Update:
I have no se claim in my URL.
Here is my code to generate the URL:
var policyName = "mypolicy";
string containerName = "123456789";
// Retrieve storage account information from connection string
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(GetSecret());
// Create a blob client for interacting with the blob service.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Create a container for organizing blobs within the storage account.
CloudBlobContainer container = blobClient.GetContainerReference(containerName);
try
{
// The call below will fail if the sample is configured to use the storage emulator in the connection string, but
// the emulator is not running.
// Change the retry policy for this call so that if it fails, it fails quickly.
BlobRequestOptions requestOptions = new BlobRequestOptions() { RetryPolicy = new NoRetry() };
await container.CreateIfNotExistsAsync(requestOptions, null);
}
catch (StorageException ex)
{
MessageBox.Show(ex.Message, Application.ProductName, MessageBoxButtons.OK, MessageBoxIcon.Error);
return string.Empty;
}
// create the stored policy we will use, with the relevant permissions and expiry time
var storedPolicy = new SharedAccessBlobPolicy()
{
SharedAccessExpiryTime = DateTime.UtcNow.AddDays(7),
Permissions = SharedAccessBlobPermissions.Read |
SharedAccessBlobPermissions.Write |
SharedAccessBlobPermissions.List
};
// get the existing permissions (alternatively create new BlobContainerPermissions())
var permissions = container.GetPermissions();
// optionally clear out any existing policies on this container
permissions.SharedAccessPolicies.Clear();
// add in the new one
permissions.SharedAccessPolicies.Add(policyName, storedPolicy);
// save back to the container
container.SetPermissions(permissions);
// Now we are ready to create a shared access signature based on the stored access policy
var containerSignature = container.GetSharedAccessSignature(null, policyName);
// create the URI a client can use to get access to just this container
return container.Uri + containerSignature;
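For context, here is roughly how the customer program uses that URL (a sketch; the blob name and local path are placeholders):

// Rebuild a container reference from the SAS URL alone; no account key
// is needed on the customer's machine.
var sasContainer = new CloudBlobContainer(new Uri(sasUrl)); // sasUrl = the string returned above
CloudBlockBlob blob = sasContainer.GetBlockBlobReference("report.pdf");
await blob.UploadFromFileAsync(@"C:\temp\report.pdf");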
I have found a solution myself. This blog describes two different SharedAccessSignatures. I have adapted the code so that I now also have the se claim in my URL.
Solution:
protected void GetSharedAccessSignature(
String containerName, String blobName)
{
CloudStorageAccount cloudStorageAccount =
    CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
CloudBlobContainer cloudBlobContainer =
new CloudBlobContainer(containerName, cloudBlobClient);
CloudBlockBlob cloudBlockBlob =
cloudBlobContainer.GetBlockBlobReference(blobName);
SharedAccessPolicy sharedAccessPolicy = new SharedAccessPolicy();
sharedAccessPolicy.Permissions = SharedAccessPermissions.Read;
sharedAccessPolicy.SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-10);
sharedAccessPolicy.SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(40);
String sharedAccessSignature1 =
cloudBlockBlob.GetSharedAccessSignature(sharedAccessPolicy);
String sharedAccessSignature2 =
    cloudBlockBlob.GetSharedAccessSignature(new SharedAccessPolicy(), "adele");
}
The sharedAccessSignature1 contains the se claim.
In the code of my initial question I had used sharedAccessSignature2.
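With the se claim in place, the program can pre-check the token before starting an upload by parsing the expiry out of the query string. A sketch (note this only checks expiry locally; a token tied to a stored policy can still be revoked server-side):

using System;
using System.Web; // HttpUtility

static bool IsSasStillValid(string sasUrl)
{
    // "se" is the signed-expiry claim, e.g. se=2021-01-01T12:00:00Z
    var query = HttpUtility.ParseQueryString(new Uri(sasUrl).Query);
    string expiry = query["se"];
    if (expiry == null)
        return false; // no expiry claim we can check locally
    if (!DateTimeOffset.TryParse(expiry, out var expiresOn))
        return false;
    return expiresOn > DateTimeOffset.UtcNow;
}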
I am trying to rename a container for users when a condition is met. I did some research and found out that there is no rename function for containers in Azure Blob Storage, but there is a way to accomplish this by copying the files and deleting them after the copy. Below is the code I wrote.
string ContainerName = "old-container-name-user1";
string NewContainerName = "new-container-name-user2";
CloudStorageAccount sourceAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
CloudBlobClient sourceblobClient = sourceAccount.CreateCloudBlobClient();
CloudBlobContainer sourceBlobContainer = sourceblobClient.GetContainerReference(ContainerName);
CloudBlobContainer destBlobContainer = sourceblobClient.GetContainerReference(NewContainerName);
CloudBlockBlob blobOriginal = sourceBlobContainer.GetBlockBlobReference(ContainerName);
CloudBlockBlob blobNew = destBlobContainer.GetBlockBlobReference(NewContainerName);
blobNew.StartCopy(blobOriginal);
blobOriginal.Delete();
When I execute this code I got an error message. Below is the error.
Exception User-Unhandled
Microsoft.WindowsAzure.Storage.StorageException:'The remote server
returned an error: (404) Not Found.'
Inner Exception WebException: The remote server returned an error:
(404) Not Found.
When I try "blobNew.StartCopyAsync(blobOriginal)" instead, the code runs through, but when I check the containers in Azure there is no container created. What do you think the problem is? Any tips on how to improve my code? The delete call does not work either.
UPDATE
I updated my code and was able to copy the files from the old container to the new one. Below is the code.
string ContainerName = "old-container-name-user1";
string NewContainerName = "new-container-name-user2";
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference(ContainerName);
CloudBlobContainer destcontainer = blobClient.GetContainerReference(NewContainerName);
destcontainer.CreateIfNotExists(BlobContainerPublicAccessType.Blob);
IEnumerable<IListBlobItem> IE = container.ListBlobs(useFlatBlobListing: true);
foreach (IListBlobItem item in IE)
{
CloudBlockBlob blob = (CloudBlockBlob)item;
CloudBlockBlob destBlob = destcontainer.GetBlockBlobReference(blob.Name);
destBlob.StartCopyAsync(new Uri(GetSharedAccessUri(blob.Name, container)));
}
AccessURI Method
private static string GetSharedAccessUri(string blobName, CloudBlobContainer container)
{
DateTime toDateTime = DateTime.Now.AddMinutes(60);
SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy
{
Permissions = SharedAccessBlobPermissions.Read,
SharedAccessStartTime = null,
SharedAccessExpiryTime = new DateTimeOffset(toDateTime)
};
CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
string sas = blob.GetSharedAccessSignature(policy);
return blob.Uri.AbsoluteUri + sas;
}
Now it's working, but another problem shows up. It says
System.InvalidCastException: 'Unable to cast object of type
'Microsoft.WindowsAzure.Storage.Blob.CloudBlobDirectory' to type
'Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob'.'
But this will be posted in another question. Thanks to our friend Gauriv for pointing out my problem. Check his answer below.
Update 2
By adding useFlatBlobListing: true to IEnumerable<IListBlobItem> IE = container.ListBlobs(blobListingDetails: BlobListingDetails.Metadata); I was able to fix my problem. This line of code is used in my display logic.
Final Code
IEnumerable<IListBlobItem> IE = container.ListBlobs(blobListingDetails: BlobListingDetails.Metadata, useFlatBlobListing: true);
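If directory entries can still appear in a listing, filtering with OfType instead of hard-casting avoids the InvalidCastException. A sketch (requires using System.Linq; variable names match the update code above):

// CloudBlobDirectory items are silently skipped rather than throwing
// on the cast to CloudBlockBlob.
foreach (CloudBlockBlob blob in container.ListBlobs(useFlatBlobListing: true).OfType<CloudBlockBlob>())
{
    CloudBlockBlob destBlob = destcontainer.GetBlockBlobReference(blob.Name);
    destBlob.StartCopy(new Uri(GetSharedAccessUri(blob.Name, container)));
}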
If you look at your code for creating source and destination blob:
CloudBlockBlob blobOriginal = sourceBlobContainer.GetBlockBlobReference(ContainerName);
CloudBlockBlob blobNew = destBlobContainer.GetBlockBlobReference(NewContainerName);
You'll notice that you're passing the names of the blob containers and not the names of the blobs. Because there is no blob in the container with the same name as the container, you're getting a 404 error.
To copy a blob container, what you have to do is list all blobs in the source container and then copy them individually to the destination container. Once all the blobs have been copied, you can delete the source container.
If you want, you can use Microsoft's Storage Explorer to achieve "rename container" functionality. It also works by copying blobs from old container to the renamed container and then deletes the old container.
static void RenameContainer()
{
var connectionString = "DefaultEndpointsProtocol=https;AccountName=account-name;AccountKey=account-key";
var storageAccount = CloudStorageAccount.Parse(connectionString);
var client = storageAccount.CreateCloudBlobClient();
var sourceContainer = client.GetContainerReference("source-container");
var targetContainer = client.GetContainerReference("target-container");
targetContainer.CreateIfNotExists();//Create target container
BlobContinuationToken continuationToken = null;
do
{
Console.WriteLine("Listing blobs. Please wait...");
var blobsResult = sourceContainer.ListBlobsSegmented(prefix: "", useFlatBlobListing: true, blobListingDetails: BlobListingDetails.All, maxResults: 1000, currentToken: continuationToken, options: new BlobRequestOptions(), operationContext: new OperationContext());
continuationToken = blobsResult.ContinuationToken;
var items = blobsResult.Results;
foreach (var item in items)
{
var blob = (CloudBlob)item;
var targetBlob = targetContainer.GetBlobReference(blob.Name);
Console.WriteLine(string.Format("Copying \"{0}\" from \"{1}\" blob container to \"{2}\" blob container.", blob.Name, sourceContainer.Name, targetContainer.Name));
targetBlob.StartCopy(blob.Uri);
}
} while (continuationToken != null);
Console.WriteLine("Deleting source blob container. Please wait.");
//sourceContainer.DeleteIfExists();
Console.WriteLine("Rename container operation complete. Press any key to terminate the application.");
}
You can rename containers with Microsoft's "Microsoft Azure Storage Explorer" (after version 0.8.3).
Actual Answer:
Regarding your error message: if the client application receives an HTTP 404 (Not Found) message from the server, this means that the object the client was attempting to use does not exist in the storage service. There are several possible reasons for this, such as:
- The client or another process previously deleted the object (make sure the name is correct)
- A Shared Access Signature (SAS) authorization issue
- Client-side code does not have permission to access the object
- Network failure
In order to identify the issue in detail, you can add a try/catch block and inspect the actual error.
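For example, a sketch around the failing copy (variable names from your code above):

try
{
    blobNew.StartCopy(blobOriginal);
}
catch (StorageException ex)
{
    // RequestInformation carries the service's verdict, e.g. status 404
    // with ErrorCode "ContainerNotFound" or "BlobNotFound".
    Console.WriteLine(ex.RequestInformation.HttpStatusCode);
    Console.WriteLine(ex.RequestInformation.ExtendedErrorInformation?.ErrorCode);
    Console.WriteLine(ex.RequestInformation.ExtendedErrorInformation?.ErrorMessage);
}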
I am trying to move an append blob to another container after processing it.
I am first just trying to copy it; then I will delete it (unless there is an actual move function?).
I am using C#.
I keep getting a 404:
: The remote server returned an error: (404) Not Found. ---> System.Net.WebException : The remote server returned an error: (404) Not Found.
at System.Net.HttpWebRequest.GetResponse()
I have tried creating a SAS token at both the container and blob level.
private static void copyBlob(messageClass msgPassed, CloudStorageAccount storageAccount)
{
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer sourceContainer = blobClient.GetContainerReference(receiveScanContainer);
//create a SAS on source blob container with "read" permission. We will append this SAS later
var sasToken = sourceContainer.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
Permissions = SharedAccessBlobPermissions.Read,
SharedAccessExpiryTime = DateTime.UtcNow.AddDays(1),
});
CloudBlobContainer targetContainer = blobClient.GetContainerReference(archiveContainer);
CloudAppendBlob sourceBlob = sourceContainer.GetAppendBlobReference(msgPassed.currentName);
var sasToken2 = sourceBlob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
Permissions = SharedAccessBlobPermissions.Read,
SharedAccessExpiryTime = DateTime.UtcNow.AddDays(1),
});
CloudAppendBlob targetBlob = targetContainer.GetAppendBlobReference(msgPassed.currentName);
string name = sourceBlob.Uri.Segments.Last();
CloudAppendBlob destBlob = targetContainer.GetAppendBlobReference(name+sasToken2);
targetBlob.StartCopy(destBlob);
}
OK... I dug into it more.
The code below works.
I was also passing the name of the target container slightly wrong (one letter off from the actual container name).
private static void copyBlob(messageClass msgPassed, CloudStorageAccount storageAccount)
{
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer sourceContainer = blobClient.GetContainerReference(receiveScanContainer);
CloudBlobContainer targetContainer = blobClient.GetContainerReference(archiveContainer);
CloudAppendBlob sourceBlob = sourceContainer.GetAppendBlobReference(msgPassed.currentName);
CloudAppendBlob targetBlob = targetContainer.GetAppendBlobReference(msgPassed.currentName);
targetBlob.StartCopy(sourceBlob);
}
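There is no real server-side move; to finish the "move", the source can be deleted once the copy has completed. A sketch (variable names from the method above):

// Poll the copy status on the target, then remove the source append blob.
targetBlob.FetchAttributes();
while (targetBlob.CopyState.Status == CopyStatus.Pending)
{
    System.Threading.Thread.Sleep(500);
    targetBlob.FetchAttributes();
}
if (targetBlob.CopyState.Status == CopyStatus.Success)
{
    sourceBlob.DeleteIfExists();
}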
I've got a basic web app in C# MVC (I'm new to MVC) which is connected to a database. In that database there is a table with a list of filenames. The files themselves are stored in an Azure Storage blob container.
I've used Scaffolding (creates a controller and view) to show data from my table of filenames and that works fine.
Now I would like to link those filenames to blob storage so that the user can click on them and open the files. How do I achieve this?
Do I edit the index view? Do I get the user to click on a filename and then connect to Azure Storage to open that file? How is this done?
Please note that the files in storage are private and are accessed using the storage key. The files cannot be made public.
Thanks for any advice.
[Update]
I've implemented the Shared Access Signature (SAS) using the code below.
public static string GetSASUrl(string containerName)
{
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference(containerName);
BlobContainerPermissions containerPermissions = new BlobContainerPermissions();
containerPermissions.SharedAccessPolicies.Add("twominutepolicy", new SharedAccessBlobPolicy()
{
SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-1),
SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(2),
Permissions = SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Read
});
containerPermissions.PublicAccess = BlobContainerPublicAccessType.Off;
container.SetPermissions(containerPermissions);
string sas = container.GetSharedAccessSignature(new SharedAccessBlobPolicy(), "twominutepolicy");
return sas;
}
public static string GetSasBlobUrl(string containerName, string fileName, string sas)
{
// Create new storage credentials using the SAS token.
StorageCredentials accountSAS = new StorageCredentials(sas);
// Use these credentials and the account name to create a Blob service client.
CloudStorageAccount accountWithSAS = new CloudStorageAccount(accountSAS, [Enter Account Name], endpointSuffix: null, useHttps: true);
CloudBlobClient blobClientWithSAS = accountWithSAS.CreateCloudBlobClient();
// Retrieve reference to a previously created container.
CloudBlobContainer container = blobClientWithSAS.GetContainerReference(containerName);
// Retrieve reference to a blob named "photo1.jpg".
CloudBlockBlob blockBlob = container.GetBlockBlobReference(fileName);
return blockBlob.Uri.AbsoluteUri + sas;
}
In order to access blobs that are not public, you'll need to use Shared Access Signatures. With those, you create access tokens that are valid for a period of time of your choosing, and you can also restrict access by IP address.
More info here:
https://learn.microsoft.com/en-us/azure/storage/storage-dotnet-shared-access-signature-part-1
As the blobs are not public, you'll need to add an additional step before passing the data to your view: concatenate the SAS token to the blob URI. You can find a very good example here: http://www.dotnetcurry.com/windows-azure/901/protect-azure-blob-storage-shared-access-signature
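Putting it together, the controller can build a ready-to-click URL per filename before rendering the index view. A sketch (the db.Files table, FileName property, and container name are placeholders for your scaffolded model):

public ActionResult Index()
{
    string sas = GetSASUrl("mycontainer");
    var links = db.Files
        .ToList()
        .Select(f => GetSasBlobUrl("mycontainer", f.FileName, sas))
        .ToList();
    // The view then just renders each link, e.g. <a href="@url">Open</a>
    return View(links);
}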
I'm trying to upload an image, which I get from my Android device as a byte array, to my Azure Storage blob by using a web service in ASP.NET.
But I can't figure out how to do this...
Here is my code so far:
[WebMethod]
public string UploadFile(byte[] f, string fileName)
{
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
ConfigurationManager.ConnectionStrings["StorageConnectionString"].ConnectionString);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve a reference to a container.
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
// Create the container if it doesn't already exist.
container.CreateIfNotExists();
container.SetPermissions(
new BlobContainerPermissions
{
PublicAccess = BlobContainerPublicAccessType.Blob
});
// Retrieve reference to a blob named "filename...".
CloudBlockBlob blockBlob = container.GetBlockBlobReference(fileName);
// Create or overwrite the "filename..." blob with contents from a local file.
using (var fileStream = System.IO.File.OpenRead("C:\\filepath"))
{
blockBlob.UploadFromStream(fileStream);
}
return "OK";
}
This code reads the image from a local file path on my computer, and that's not what I want.
I want to use the byte[] array 'f' which I receive from my Android device instead of 'C:\filepath'.
How can I do that?
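A sketch of the byte[] variant: CloudBlockBlob exposes UploadFromByteArray, so the buffer received from the Android device can be uploaded directly, without touching the local disk:

// Replace the File.OpenRead(...) block with a direct upload of the posted bytes.
blockBlob.UploadFromByteArray(f, 0, f.Length);

// Equivalent alternative: wrap the array in a MemoryStream and keep UploadFromStream.
using (var ms = new System.IO.MemoryStream(f))
{
    blockBlob.UploadFromStream(ms);
}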