I have uploaded images a, b, c and d. How do I find the location/address of image b in Azure Blob Storage (I am using C# code)?
I know there is a way to get the list of all images stored in a container, but how should I approach it if the ask is specific to a particular image?
Any guidance would be helpful.
If you're using CloudBlobClient, you can get the primary and secondary locations of the blob:
var storageCredentials = new StorageCredentials(accountName, keyValue);
var cloudStorageAccount = new CloudStorageAccount(storageCredentials, true);
var cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
var container = cloudBlobClient.GetContainerReference(containerName);
var cloudBlockBlob = container.GetBlockBlobReference(blobName); // "a/b/c/d" blob names
Uri primaryLocation = cloudBlockBlob.StorageUri.PrimaryUri;
Uri secondaryLocation = cloudBlockBlob.StorageUri.SecondaryUri;
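If you only need the single address of one image, the blob reference also exposes a Uri property. A minimal sketch (the container name "images" and blob name "b" below are placeholders):
// Sketch: address of one specific blob; "images" and "b" are placeholder names.
var blobB = cloudBlobClient.GetContainerReference("images").GetBlockBlobReference("b");
// Typically https://{account}.blob.core.windows.net/images/b
Uri address = blobB.Uri;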
I use the Azure Blob Storage client library v11 for .NET.
I wrote a program that our customers can use to upload files. I generate a URL with a SAS token (valid for x days) for the customer, and the customer can upload files using the program. Here is an example URL:
https://storage.blob.core.windows.net/123456789?sv=2019-07-07&sr=c&si=mypolicy&sig=ASDH845378ddsaSDdase324234234rASDSFR
How can I find out whether the SAS token is still valid before the upload is started?
Update:
There is no se claim in my URL.
Here is my code to generate the url:
var policyName = "mypolicy";
string containerName = "123456789";
// Retrieve storage account information from connection string
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(GetSecret());
// Create a blob client for interacting with the blob service.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Create a container for organizing blobs within the storage account.
CloudBlobContainer container = blobClient.GetContainerReference(containerName);
try
{
    // The call below will fail if the sample is configured to use the storage emulator in the connection string, but
    // the emulator is not running.
    // Change the retry policy for this call so that if it fails, it fails quickly.
    BlobRequestOptions requestOptions = new BlobRequestOptions() { RetryPolicy = new NoRetry() };
    await container.CreateIfNotExistsAsync(requestOptions, null);
}
catch (StorageException ex)
{
    MessageBox.Show(ex.Message, Application.ProductName, MessageBoxButtons.OK, MessageBoxIcon.Error);
    return string.Empty;
}
// create the stored policy we will use, with the relevant permissions and expiry time
var storedPolicy = new SharedAccessBlobPolicy()
{
    SharedAccessExpiryTime = DateTime.UtcNow.AddDays(7),
    Permissions = SharedAccessBlobPermissions.Read |
                  SharedAccessBlobPermissions.Write |
                  SharedAccessBlobPermissions.List
};
// get the existing permissions (alternatively create new BlobContainerPermissions())
var permissions = container.GetPermissions();
// optionally clear out any existing policies on this container
permissions.SharedAccessPolicies.Clear();
// add in the new one
permissions.SharedAccessPolicies.Add(policyName, storedPolicy);
// save back to the container
container.SetPermissions(permissions);
// Now we are ready to create a shared access signature based on the stored access policy
var containerSignature = container.GetSharedAccessSignature(null, policyName);
// create the URI a client can use to get access to just this container
return container.Uri + containerSignature;
I have found a solution myself. This blog describes two different SharedAccessSignatures. I have adapted the code so that I now also have the se claim in my URL.
Solution:
protected void GetSharedAccessSignature(
    String containerName, String blobName)
{
    CloudStorageAccount cloudStorageAccount =
        CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
    CloudBlobContainer cloudBlobContainer =
        new CloudBlobContainer(containerName, cloudBlobClient);
    CloudBlockBlob cloudBlockBlob =
        cloudBlobContainer.GetBlockBlobReference(blobName);
    SharedAccessPolicy sharedAccessPolicy = new SharedAccessPolicy();
    sharedAccessPolicy.Permissions = SharedAccessPermissions.Read;
    sharedAccessPolicy.SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-10);
    sharedAccessPolicy.SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(40);
    String sharedAccessSignature1 =
        cloudBlockBlob.GetSharedAccessSignature(sharedAccessPolicy);
    String sharedAccessSignature2 =
        cloudBlockBlob.GetSharedAccessSignature(new SharedAccessPolicy(), "adele");
}
The sharedAccessSignature1 contains the se claim.
In the code of my initial question I had used sharedAccessSignature2.
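With the se claim in the URL, the expiry can be checked client-side before starting the upload. A minimal sketch (it only parses the query string and assumes an explicit se value in ISO 8601 UTC; a SAS that keeps its expiry in a stored access policy has no se to read):
static bool IsSasStillValid(string sasUrl)
{
    string query = new Uri(sasUrl).Query.TrimStart('?');
    foreach (string pair in query.Split('&'))
    {
        if (pair.StartsWith("se=", StringComparison.OrdinalIgnoreCase))
        {
            // "se" looks like 2020-01-31T12:00:00Z
            string expiry = Uri.UnescapeDataString(pair.Substring(3));
            DateTimeOffset expiresOn = DateTimeOffset.Parse(
                expiry, null, System.Globalization.DateTimeStyles.AssumeUniversal);
            return expiresOn > DateTimeOffset.UtcNow;
        }
    }
    return false; // no se claim found, validity cannot be determined from the URL alone
}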
I am developing a web platform to manage the upload/download of files. The front-end is developed in React, the back-end in ASP.NET, and Azure Blob Storage containers are used to store the uploaded files.
As for the upload, I'm using the Microsoft Azure Storage Client Library to send files directly from the client to Azure through SAS authentication. This JavaScript library allows me to update a progress bar during the whole process.
As for the download, the process is more complicated: the file is first downloaded from Azure to the server (phase 1, Azure->Server) and then downloaded by the client from the server (phase 2, Server->Client). Phase 1 creates two problems for me:
I cannot display a progress bar to track its progress;
it can take a long time and, at this stage, the client cannot begin the download.
To solve these problems I would like one of the following solutions:
download the file directly from the client using the JavaScript library, but in this case it is necessary to rename the file;
create a server-client communication to implement a progress bar for phase 1.
This is my current C# function to allow the download of a file:
using Microsoft.WindowsAzure.Storage.Blob;
private IActionResult DownloadFile(...) {
    ...
    using (var blobStream = new MemoryStream()) {
        string blobName = ...
        CloudBlobContainer cloudBlobContainer = ...
        CloudBlockBlob blob = cloudBlobContainer.GetBlockBlobReference(blobName);
        blob.DownloadToStream(blobStream);
        return new FileContentResult(blobStream.ToArray(), "application/pdf");
    }
}
EDIT:
Below is the code I use to generate the SAS token:
private string GetSasReadToken(string connectionString, string containerName) {
    var storageAccount = CloudStorageAccount.Parse(connectionString);
    CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference(containerName);
    var sasConstraints = new SharedAccessBlobPolicy {
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(60),
        Permissions = SharedAccessBlobPermissions.Read
    };
    var sasContainerToken = cloudBlobContainer.GetSharedAccessSignature(sasConstraints);
    return sasContainerToken;
}
In order to make use of Content-Disposition, you will need to generate the SAS token on a blob (currently you're creating a SAS token on the blob container). Then you will need to use SharedAccessBlobHeaders and define the content-disposition value there.
Here's the sample code (untested though):
private string GetSasReadToken(string connectionString, string containerName, string blobName) {
    var storageAccount = CloudStorageAccount.Parse(connectionString);
    CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference(containerName);
    CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(blobName);
    var sasConstraints = new SharedAccessBlobPolicy {
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(60),
        Permissions = SharedAccessBlobPermissions.Read
    };
    var sasHeaders = new SharedAccessBlobHeaders();
    sasHeaders.ContentDisposition = "attachment;filename=<your-download-file-name>";
    var sasBlobToken = cloudBlockBlob.GetSharedAccessSignature(sasConstraints, sasHeaders);
    return sasBlobToken;
}
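To let the React client download directly from Azure (skipping phase 1), the token can be appended to the blob URI. A sketch, where "mycontainer" and "report.pdf" are placeholder names:
// Sketch: build the full download URL the browser/React client can open directly.
// The Content-Disposition header baked into the SAS forces a download with the chosen file name.
string sasToken = GetSasReadToken(connectionString, "mycontainer", "report.pdf"); // placeholder names
CloudBlockBlob blob = CloudStorageAccount.Parse(connectionString)
    .CreateCloudBlobClient()
    .GetContainerReference("mycontainer")
    .GetBlockBlobReference("report.pdf");
string downloadUrl = blob.Uri + sasToken; // hand this URL back to the front-end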
I am trying to rename a container for users when a condition is met. I did some research and found out that there is no rename function for containers in Azure Blob Storage, but there is a way to accomplish this by copying the files and deleting them after the copy. Below is the code I made.
string ContainerName = "old-container-name-user1";
string NewContainerName = "new-container-name-user2";
CloudStorageAccount sourceAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
CloudBlobClient sourceblobClient = sourceAccount.CreateCloudBlobClient();
CloudBlobContainer sourceBlobContainer = sourceblobClient.GetContainerReference(ContainerName);
CloudBlobContainer destBlobContainer = sourceblobClient.GetContainerReference(NewContainerName);
CloudBlockBlob blobOriginal = sourceBlobContainer.GetBlockBlobReference(ContainerName);
CloudBlockBlob blobNew = destBlobContainer.GetBlockBlobReference(NewContainerName);
blobNew.StartCopy(blobOriginal);
blobOriginal.Delete();
When I execute this code I got an error message. Below is the error.
Exception User-Unhandled
Microsoft.WindowsAzure.Storage.StorageException:'The remote server
returned an error: (404) Not Found.'
Inner Exception WebException: The remote server returned an error:
(404) Not Found.
When I try "blobNew.StartCopyAsync(blobOriginal)" instead, the code just goes through, but when I check the containers in Azure there is no container created. What do you think the problem is? Any tips on how to improve my code? The Delete function does not work either.
UPDATE
I updated my code and was able to copy the files from the old container to the new one. Below is the code.
string ContainerName = "old-container-name-user1";
string NewContainerName = "new-container-name-user2";
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference(ContainerName);
CloudBlobContainer destcontainer = blobClient.GetContainerReference(NewContainerName);
destcontainer.CreateIfNotExists(BlobContainerPublicAccessType.Blob);
IEnumerable<IListBlobItem> IE = container.ListBlobs(useFlatBlobListing: true);
foreach (IListBlobItem item in IE)
{
    CloudBlockBlob blob = (CloudBlockBlob)item;
    CloudBlockBlob destBlob = destcontainer.GetBlockBlobReference(blob.Name);
    destBlob.StartCopyAsync(new Uri(GetSharedAccessUri(blob.Name, container)));
}
AccessURI Method
private static string GetSharedAccessUri(string blobName, CloudBlobContainer container)
{
    DateTime toDateTime = DateTime.Now.AddMinutes(60);
    SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessStartTime = null,
        SharedAccessExpiryTime = new DateTimeOffset(toDateTime)
    };
    CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
    string sas = blob.GetSharedAccessSignature(policy);
    return blob.Uri.AbsoluteUri + sas;
}
Now it's working, but another problem shows up. It says:
System.InvalidCastException: 'Unable to cast object of type
'Microsoft.WindowsAzure.Storage.Blob.CloudBlobDirectory' to type
'Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob'.'
But this will be posted in another question. Thanks to our friend Gauriv for pointing out my problem. Check his answer below.
Update 2
By adding useFlatBlobListing: true to IEnumerable<IListBlobItem> IE = container.ListBlobs(blobListingDetails: BlobListingDetails.Metadata); I was able to fix my problem. The resulting line of code is shown below.
Final Code
IEnumerable<IListBlobItem> IE = container.ListBlobs(blobListingDetails: BlobListingDetails.Metadata, useFlatBlobListing: true);
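For completeness, here is a sketch of how the flat listing slots into the copy loop (same container, destcontainer and GetSharedAccessUri as above). With a flat listing every returned item is a blob, so the CloudBlobDirectory cast error goes away; the synchronous StartCopy is used so failures surface immediately (the async variant would need to be awaited):
IEnumerable<IListBlobItem> items = container.ListBlobs(blobListingDetails: BlobListingDetails.Metadata, useFlatBlobListing: true);
foreach (CloudBlockBlob blob in items.OfType<CloudBlockBlob>())
{
    // Flat listing returns only blobs (no CloudBlobDirectory), so the filter/cast is safe.
    CloudBlockBlob destBlob = destcontainer.GetBlockBlobReference(blob.Name);
    destBlob.StartCopy(new Uri(GetSharedAccessUri(blob.Name, container)));
}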
If you look at your code for creating source and destination blob:
CloudBlockBlob blobOriginal = sourceBlobContainer.GetBlockBlobReference(ContainerName);
CloudBlockBlob blobNew = destBlobContainer.GetBlockBlobReference(NewContainerName);
You'll notice that you're passing the names of the blob containers and not the name of a blob. Because you don't have a blob in the container with the same name as the container, you're getting the 404 error.
To copy a blob container, what you have to do is list all blobs from the source container and then copy them individually in the destination container. Once all the blobs have been copied, you can delete the source container.
If you want, you can use Microsoft's Storage Explorer to achieve "rename container" functionality. It also works by copying blobs from old container to the renamed container and then deletes the old container.
static void RenameContainer()
{
    var connectionString = "DefaultEndpointsProtocol=https;AccountName=account-name;AccountKey=account-key";
    var storageAccount = CloudStorageAccount.Parse(connectionString);
    var client = storageAccount.CreateCloudBlobClient();
    var sourceContainer = client.GetContainerReference("source-container");
    var targetContainer = client.GetContainerReference("target-container");
    targetContainer.CreateIfNotExists(); // Create target container
    BlobContinuationToken continuationToken = null;
    do
    {
        Console.WriteLine("Listing blobs. Please wait...");
        var blobsResult = sourceContainer.ListBlobsSegmented(prefix: "", useFlatBlobListing: true, blobListingDetails: BlobListingDetails.All, maxResults: 1000, currentToken: continuationToken, options: new BlobRequestOptions(), operationContext: new OperationContext());
        continuationToken = blobsResult.ContinuationToken;
        var items = blobsResult.Results;
        foreach (var item in items)
        {
            var blob = (CloudBlob)item;
            var targetBlob = targetContainer.GetBlobReference(blob.Name);
            Console.WriteLine(string.Format("Copying \"{0}\" from \"{1}\" blob container to \"{2}\" blob container.", blob.Name, sourceContainer.Name, targetContainer.Name));
            targetBlob.StartCopy(blob.Uri);
        }
    } while (continuationToken != null);
    Console.WriteLine("Deleting source blob container. Please wait.");
    //sourceContainer.DeleteIfExists();
    Console.WriteLine("Rename container operation complete. Press any key to terminate the application.");
}
You can rename containers with Microsoft's "Microsoft Azure Storage Explorer" (after version 0.8.3).
Actual Answer:
Regarding your error message: if the client application receives an HTTP 404 (Not Found) message from the server, this means that the object the client was attempting to use does not exist in the storage service. There are several possible reasons for this, such as:
- The client or another process previously deleted the object (make sure the name is correct)
- A Shared Access Signature (SAS) authorization issue
- Client-side code does not have permission to access the object
- Network failure
In order to identify the issue in detail, you can add a try/catch and inspect the actual error.
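For instance, a small sketch around the copy call from the question (same blobNew/blobOriginal references) that surfaces the storage error behind the 404:
try
{
    blobNew.StartCopy(blobOriginal);
    blobOriginal.Delete();
}
catch (StorageException ex)
{
    // RequestInformation carries the HTTP status and the storage error details,
    // e.g. 404 with "BlobNotFound" or "ContainerNotFound", which pinpoints the missing object.
    Console.WriteLine(ex.RequestInformation.HttpStatusCode);
    Console.WriteLine(ex.RequestInformation.HttpStatusMessage);
    Console.WriteLine(ex.RequestInformation.ExtendedErrorInformation?.ErrorCode);
    Console.WriteLine(ex.RequestInformation.ExtendedErrorInformation?.ErrorMessage);
}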
I have created a blob storage account in Azure.
Then I created a container called "MyReport".
Inside the "MyReport" container I created two folders called "Test" and "Live". Under both folders there are many subfolders.
What I want is to get the latest folder created in those folders.
I tried the following:
StorageCredentialsAccountAndKey credentials = new StorageCredentialsAccountAndKey(accountName, accessKey);
CloudStorageAccount acc = new CloudStorageAccount(credentials, true);
CloudBlobClient client = acc.CreateCloudBlobClient();
CloudBlobDirectory container = client.GetBlobDirectoryReference(@"MyReport/Test");
var folders = container.ListBlobs().Where(b => b as CloudBlobDirectory != null).ToList();
In the folders variable I get many folders, but I want the latest one created.
How can I do this?
Update 10/04:
CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("test1");
CloudBlobDirectory myDirectory = cloudBlobContainer.GetDirectoryReference("test");
var myfiles = myDirectory.ListBlobs(useFlatBlobListing: true, blobListingDetails: BlobListingDetails.All).Where(b => b as CloudBlockBlob != null);
var my_lastmodified_blob = myfiles.OfType<CloudBlockBlob>().OrderByDescending(b => b.Properties.LastModified).First();
Console.WriteLine(my_lastmodified_blob.Parent.StorageUri.PrimaryUri.Segments.Last());
The result (there is a "/" at the end of the folder name; you can remove it as per your need):
As per this issue, when listing blobs, the blobs are ordered by comparing blob names char by char (ascending order).
So in your code, just use the ListBlobs method, then use .Last() to fetch the latest one.
Sample code:
// other code
var myblob = container.ListBlobs().Last();
Console.WriteLine(((CloudBlockBlob)myblob).Name);
The result:
Actually, CloudBlobDirectory doesn't hold a LastModified date, but all CloudBlockBlobs within a folder do, so we should decide based on the inner files.
Here is a sample, and it's working for me:
CloudBlobClient client = acc.CreateCloudBlobClient();
var container = client.GetContainerReference(#"seleniumtestreports");
CloudBlobDirectory Directory = container.GetDirectoryReference("DevTests");
var BlobFolders = Directory.ListBlobs().OfType<CloudBlobDirectory>()
    .Select(f => new { cloudBlobDirectory = f,
        LastModified = f.ListBlobs().OfType<CloudBlockBlob>()
            .OrderByDescending(dd => dd.Properties.LastModified)
            .FirstOrDefault().Properties.LastModified })
    .ToList();
var getLastestFolder = BlobFolders.OrderByDescending(s => s.LastModified).FirstOrDefault();
With a clue from @Ivan Yang, I found the answer for this.
The clue was to use BlobRequestOptions. So this works for me:
StorageCredentialsAccountAndKey credentials = new StorageCredentialsAccountAndKey(accountName, accessKey);
CloudStorageAccount acc = new CloudStorageAccount(credentials, true);
CloudBlobClient client = acc.CreateCloudBlobClient();
CloudBlobDirectory container = client.GetBlobDirectoryReference(@"MyReport/Test");
BlobRequestOptions options = new BlobRequestOptions();
options.UseFlatBlobListing = true;
var listblob = container.ListBlobs(options);
var latestFolderAzure = listblob.OfType<CloudBlob>().OrderBy(b => b.Properties.LastModifiedUtc).LastOrDefault()?.Parent.Uri.AbsoluteUri;
I'm trying to upload an image that I get from my Android device as a byte array to my Azure Storage blob, using a web service in ASP.NET.
But I can't figure out how to do this...
Here is my code so far:
[WebMethod]
public string UploadFile(byte[] f, string fileName)
{
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
        ConfigurationManager.ConnectionStrings["StorageConnectionString"].ConnectionString);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    // Retrieve a reference to a container.
    CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
    // Create the container if it doesn't already exist.
    container.CreateIfNotExists();
    container.SetPermissions(
        new BlobContainerPermissions
        {
            PublicAccess = BlobContainerPublicAccessType.Blob
        });
    // Retrieve reference to a blob named "filename...".
    CloudBlockBlob blockBlob = container.GetBlockBlobReference(fileName);
    // Create or overwrite the "filename..." blob with contents from a local file.
    using (var fileStream = System.IO.File.OpenRead("C:\\filepath"))
    {
        blockBlob.UploadFromStream(fileStream);
    }
    return "OK";
}
This code gets the image from a local file path on my computer, and that's not what I want.
I want to use the byte[] array 'f' which I receive from my Android device instead of 'C:\filepath'.
How can I do that?
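One possible approach (a sketch, not a verified answer): since the bytes are already in memory, replace the local file stream with the byte array, either via UploadFromByteArray or a MemoryStream.
// Sketch: upload the byte[] received from the Android device instead of reading a local file.
CloudBlockBlob blockBlob = container.GetBlockBlobReference(fileName);

// Option 1: upload the byte array directly.
blockBlob.UploadFromByteArray(f, 0, f.Length);

// Option 2 (alternative to option 1): wrap the bytes in a stream.
using (var stream = new System.IO.MemoryStream(f))
{
    blockBlob.UploadFromStream(stream);
}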