Shared Access Signature - Signature did not match - C#

Getting an error when trying to use a SAS token.
I generate the token on the container like so:
var client = _account.CreateCloudBlobClient();
var container = client.GetContainerReference("2017-med");
var sas = container.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddYears(5)
});
I append it to the URL of a resource that is in the container and get this error message when I visit the URL:
Signature did not match. String to sign used was r
2023-05-14T05:48:34Z /blob/myblobname/2017-med 2017-07-29
Is it possible to use a container token for resources in the container? Or is there something else at play here?

From my testing: if you open the SAS URI in a browser or in Storage Explorer, note that the container permission you set is Read only.
I suggest adding the List permission to your SAS policy, like below:
var sas = container.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.List,
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddYears(5)
});
Then, when you open that URI in your browser, you can list all the blobs in the container to check whether the SAS URI you generated is correct.
The List Blobs request may be constructed as follows (HTTPS is recommended). Replace myaccount with the name of your storage account:
https://myaccount.blob.core.windows.net/mycontainer?restype=container&comp=list&sastoken
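If you prefer to verify from code instead of the browser, here is a minimal sketch, assuming the classic Microsoft.Azure.Storage.Blob client; the container URI and the SAS placeholder are hypothetical and would come from the code above:

```csharp
using System;
using Microsoft.Azure.Storage.Blob;

class SasCheck
{
    static void Main()
    {
        // Hypothetical values: the container URI and the token returned
        // by GetSharedAccessSignature in the snippet above.
        string containerUri = "https://myaccount.blob.core.windows.net/2017-med";
        string sas = "?sv=...&sig=...";

        // Build a container client from URI + SAS; no account key needed.
        var container = new CloudBlobContainer(new Uri(containerUri + sas));

        // If the SAS has List permission, this enumerates every blob.
        foreach (IListBlobItem item in container.ListBlobs(useFlatBlobListing: true))
        {
            Console.WriteLine(item.Uri);
        }
    }
}
```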
Actually, when you generate a SAS for the container, you can use that SAS to access the blobs inside the container and perform any operation your permissions allow:
string sasUri = container.Uri + "/" + blobName + sas; // blob URI + container SAS token
CloudBlockBlob blob = new CloudBlockBlob(new Uri(sasUri));
Update:
Signature did not match. String to sign used was r 2023-05-14T05:48:34Z /blob/myblobname/2017-med 2017-07-29
From the string-to-sign you provided, I see that your storage account name is myblobname, which is puzzling.
A storage account with that name does in fact exist, so my guess is that you have a string-concatenation error.
You could use var bloburi = container.Uri + sas; to get the full URI.
And if you want to list blobs or perform other operations with the full URI, you can refer to the code I provided above.
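To make the concatenation explicit, here is a sketch of using one container-scoped SAS to address an individual blob; the development-storage connection string and the blob name myfile.jpg are placeholders:

```csharp
using System;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;

class ContainerSasForBlob
{
    static void Main()
    {
        // Placeholder account for illustration.
        var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
        var client = account.CreateCloudBlobClient();
        var container = client.GetContainerReference("2017-med");

        var sas = container.GetSharedAccessSignature(new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
        });

        // The SAS is appended to the *blob* URI, not the container URI,
        // when addressing an individual blob with a container-scoped token.
        var blobUrl = container.Uri + "/" + "myfile.jpg" + sas;
        var blob = new CloudBlockBlob(new Uri(blobUrl));
        Console.WriteLine(blob.Uri);
    }
}
```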

Related

Roles required to use PersistKeysToAzureBlobStorage with AzureServiceTokenProvider

I want my application to be able to initialize the data protection key storage at startup.
But it only works if I use the Azure Storage access key, and not the MSI or the Azure user from my Visual Studio (az login) session as I expected. I have given the user and the VM Scale Set the roles Owner, Contributor, and Storage Blob Data Contributor, without success.
Is it a must to use SAS or the Access Key in order for the blob to be created automatically?
Error when using the token provider (MSI, az login)
Microsoft.AspNetCore.DataProtection.Internal.DataProtectionHostedService:
Information: Key ring failed to load during application startup.
Request Information
StatusMessage:The specified resource does not exist.
ErrorCode:ResourceNotFound
ErrorMessage:The specified resource does not exist.
public void ConfigureServices( IServiceCollection services )
{
    var accessKey = "ddBU/...==";
    var blobUri = new Uri( $"https://mystorageaccount.blob.core.windows.net/mycontainer/keys.xml" );
    var tp = new Microsoft.Azure.Services.AppAuthentication.AzureServiceTokenProvider();
    var token = tp.GetAccessTokenAsync( $"https://mystorageaccount.blob.core.windows.net/" ).Result;

    // OK - creates and updates the blob when necessary
    var sc = new Microsoft.Azure.Storage.Auth.StorageCredentials( "mystorageaccount", accessKey );

    // NOK - can only read the blob
    //sc = new Microsoft.Azure.Storage.Auth.StorageCredentials( token );

    var cbb = new Microsoft.Azure.Storage.Blob.CloudBlockBlob( blobUri, sc );

    services.AddDataProtection()
        .PersistKeysToAzureBlobStorage( cbb )
        .SetApplicationName( "MyFrontends" );
}
The role "Storage blob Data Contributor" does allow PersistKeysToAzureBlobStorage() to create and maintain the key in the Azure Storage Account.
I must have made a mistake while testing; maybe I did not wait long enough after applying the role, or maybe I had not yet created the target container.
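For reference, a sketch of the token-based variant the commented-out line hints at. This is an assumption-laden illustration: it presumes the role assignment has finished propagating and the container exists, and it uses https://storage.azure.com/ as the AAD resource, which is commonly used for storage:

```csharp
using System;
using Microsoft.Azure.Services.AppAuthentication;
using Microsoft.Azure.Storage.Auth;
using Microsoft.Azure.Storage.Blob;

class MsiBlobAccess
{
    static void Main()
    {
        // Acquire an AAD token via MSI / az login.
        var tp = new AzureServiceTokenProvider();
        var token = tp.GetAccessTokenAsync("https://storage.azure.com/").Result;

        // Wrap the raw token so the storage client sends it as a bearer token.
        var sc = new StorageCredentials(new TokenCredential(token));

        // Hypothetical account/container/blob names for illustration.
        var blobUri = new Uri("https://mystorageaccount.blob.core.windows.net/mycontainer/keys.xml");
        var cbb = new CloudBlockBlob(blobUri, sc);
        Console.WriteLine(cbb.Name);
    }
}
```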

Read Parquet file from Azure blob without downloading it locally (C# .NET)

We have a Parquet-format file (500 MB) located in an Azure blob. How can we read the file directly from the blob into memory in C#, e.g. into a DataTable?
I am able to read a Parquet file that is physically located in a folder using the code below.
public void ReadParquetFile()
{
    using (Stream fileStream = System.IO.File.OpenRead("D:/../userdata1.parquet"))
    {
        using (var parquetReader = new ParquetReader(fileStream))
        {
            DataField[] dataFields = parquetReader.Schema.GetDataFields();
            for (int i = 0; i < parquetReader.RowGroupCount; i++)
            {
                using (ParquetRowGroupReader groupReader = parquetReader.OpenRowGroupReader(i))
                {
                    DataColumn[] columns = dataFields.Select(groupReader.ReadColumn).ToArray();
                    DataColumn firstColumn = columns[0];
                    Array data = firstColumn.Data;
                    //int[] ids = (int[])data;
                }
            }
        }
    }
}
(I am able to read a CSV file directly from the blob using a source stream.) Please suggest the fastest method to read the Parquet file directly from the blob.
In my experience, the way to read a Parquet file directly from a blob is: first generate the blob URL with a SAS token, then get a stream from that URL with HttpClient, and finally read the HTTP response stream via ParquetReader.
First, please refer to the sample code below from the section Create a service SAS for a blob of the official document Create a service SAS for a container or blob with .NET, using the Azure Blob Storage SDK for .NET Core.
private static string GetBlobSasUri(CloudBlobContainer container, string blobName, string policyName = null)
{
    string sasBlobToken;

    // Get a reference to a blob within the container.
    // Note that the blob may not exist yet, but a SAS can still be created for it.
    CloudBlockBlob blob = container.GetBlockBlobReference(blobName);

    if (policyName == null)
    {
        // Create a new access policy and define its constraints.
        // Note that the SharedAccessBlobPolicy class is used both to define the parameters of an ad hoc SAS, and
        // to construct a shared access policy that is saved to the container's shared access policies.
        SharedAccessBlobPolicy adHocSAS = new SharedAccessBlobPolicy()
        {
            // When the start time for the SAS is omitted, the start time is assumed to be the time when the storage service receives the request.
            // Omitting the start time for a SAS that is effective immediately helps to avoid clock skew.
            SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24),
            Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Create
        };

        // Generate the shared access signature on the blob, setting the constraints directly on the signature.
        sasBlobToken = blob.GetSharedAccessSignature(adHocSAS);

        Console.WriteLine("SAS for blob (ad hoc): {0}", sasBlobToken);
        Console.WriteLine();
    }
    else
    {
        // Generate the shared access signature on the blob. In this case, all of the constraints for the
        // shared access signature are specified on the container's stored access policy.
        sasBlobToken = blob.GetSharedAccessSignature(null, policyName);

        Console.WriteLine("SAS for blob (stored access policy): {0}", sasBlobToken);
        Console.WriteLine();
    }

    // Return the URI string for the blob, including the SAS token.
    return blob.Uri + sasBlobToken;
}
Then get the HTTP response stream from the URL with the SAS token via HttpClient:
var blobUrlWithSAS = GetBlobSasUri(container, blobName);
var client = new HttpClient();
var stream = await client.GetStreamAsync(blobUrlWithSAS);
Finally, read it via ParquetReader; the code comes from the Reading Data section of the GitHub repo aloneguid/parquet-dotnet:
var options = new ParquetOptions { TreatByteArrayAsString = true };
var reader = new ParquetReader(stream, options);
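One caveat worth noting: ParquetReader needs a seekable stream to locate the file footer, and the stream returned by GetStreamAsync is not seekable, so buffering into a MemoryStream first is safer. Below is a sketch that also loads the first row group into a DataTable, as the question asked; the URL is a placeholder and the API names assume the same parquet-dotnet version as the question's own code:

```csharp
using System;
using System.Data;
using System.IO;
using System.Linq;
using System.Net.Http;
using Parquet;
using Parquet.Data;

class ParquetFromBlob
{
    static void Main()
    {
        // Placeholder SAS-decorated blob URL for illustration.
        var blobUrlWithSAS = "https://myaccount.blob.core.windows.net/mycontainer/userdata1.parquet?sv=...";

        using (var client = new HttpClient())
        using (var http = client.GetStreamAsync(blobUrlWithSAS).Result)
        using (var ms = new MemoryStream())
        {
            // Buffer the HTTP stream so ParquetReader can seek to the footer.
            http.CopyTo(ms);
            ms.Position = 0;

            using (var reader = new ParquetReader(ms))
            {
                DataField[] fields = reader.Schema.GetDataFields();
                var table = new DataTable();
                foreach (var f in fields) table.Columns.Add(f.Name);

                // Read only the first row group here for brevity;
                // loop to reader.RowGroupCount for the full file.
                using (var group = reader.OpenRowGroupReader(0))
                {
                    var columns = fields.Select(group.ReadColumn).ToArray();
                    for (int r = 0; r < columns[0].Data.Length; r++)
                        table.Rows.Add(columns.Select(c => c.Data.GetValue(r)).ToArray());
                }

                Console.WriteLine(table.Rows.Count);
            }
        }
    }
}
```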

C# MVC Web App Service Connect to Azure Storage Blob

I've got a basic web app in C# MVC (I'm new to MVC) which is connected to a database. In that database there is a table with a list of filenames. These files are stored in an Azure Storage blob container.
I've used Scaffolding (creates a controller and view) to show data from my table of filenames and that works fine.
Now I would like to connect those filenames to the blob storage so that the user can click on and open them. How do I achieve this?
Do I edit the index view? Do I get the user to click on a filename and then connect to Azure storage to open that file? How is this done?
Please note that files on storage are private and is accessed using the storage key. Files cannot be made public.
Thanks for any advice.
[Update]
I've implemented the Shared Access Signature (SAS) using the code below.
public static string GetSASUrl(string containerName)
{
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference(containerName);

    BlobContainerPermissions containerPermissions = new BlobContainerPermissions();
    containerPermissions.SharedAccessPolicies.Add("twominutepolicy", new SharedAccessBlobPolicy()
    {
        SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-1),
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(2),
        Permissions = SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Read
    });
    containerPermissions.PublicAccess = BlobContainerPublicAccessType.Off;
    container.SetPermissions(containerPermissions);

    string sas = container.GetSharedAccessSignature(new SharedAccessBlobPolicy(), "twominutepolicy");
    return sas;
}
public static string GetSasBlobUrl(string containerName, string fileName, string sas)
{
    // Create new storage credentials using the SAS token.
    StorageCredentials accountSAS = new StorageCredentials(sas);

    // Use these credentials and the account name to create a Blob service client.
    CloudStorageAccount accountWithSAS = new CloudStorageAccount(accountSAS, [Enter Account Name], endpointSuffix: null, useHttps: true);
    CloudBlobClient blobClientWithSAS = accountWithSAS.CreateCloudBlobClient();

    // Retrieve a reference to a previously created container.
    CloudBlobContainer container = blobClientWithSAS.GetContainerReference(containerName);

    // Retrieve a reference to the requested blob.
    CloudBlockBlob blockBlob = container.GetBlockBlobReference(fileName);
    return blockBlob.Uri.AbsoluteUri + sas;
}
In order to access blobs that are not public, you'll need to use Shared Access Signatures. With those, you create access tokens valid for a period of time of your choosing, and you can also restrict access by IP address.
More info in here:
https://learn.microsoft.com/en-us/azure/storage/storage-dotnet-shared-access-signature-part-1
As they are not public, you'll need to add an additional step before passing the data to your view, which is to concatenate the SAS token to the blob URI. You can find a very good example here: http://www.dotnetcurry.com/windows-azure/901/protect-azure-blob-storage-shared-access-signature
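Putting that together with the helpers from the update above, a controller action could look roughly like the sketch below. This is only an illustration: db, FileRecords, FileName, and the anonymous view-model shape are hypothetical names, while GetSASUrl and GetSasBlobUrl are the helpers shown earlier:

```csharp
// Sketch: build SAS-decorated links before handing the model to the view,
// so each filename in the index becomes a clickable, time-limited URL.
public ActionResult Index()
{
    var files = db.FileRecords.ToList();       // filenames from the database (hypothetical DbSet)
    string sas = GetSASUrl("mycontainer");     // short-lived container SAS

    var model = files.Select(f => new
    {
        f.FileName,
        Url = GetSasBlobUrl("mycontainer", f.FileName, sas)
    }).ToList();

    return View(model);
}
```

In the view, each item's Url can then be rendered as a plain anchor; the link stays valid only while the stored access policy allows.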

Azure container permissions

I'm reading this article.
I have an Azure container called "test" that is set to private.
That container holds a SCORM package, "121/HEEDENNL/story.html".
I'm using the code below to generate read access to that file.
However, story.html needs several other files to run properly.
The story page opens and doesn't return a 403 or 404, but the files it references to make the page work are not loading.
How can I give read access to all the files story.html needs as well?
I thought changing the container's permissions would let that file access the files it needs.
What am I missing here?
public ActionResult ViewContent(int id)
{
    const string pageBlobName = "121/HEEDENNL/story.html";

    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(Common.Constants.Azure.ConnectionStringUrl);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

    //// Retrieve a reference to a container.
    // CloudBlobContainer learningModulContainer = blobClient.GetContainerReference(Common.Constants.Azure.LearningModulesContainerName);
    CloudBlobContainer learningModulContainer = blobClient.GetContainerReference("test");
    PrintBlobs(learningModulContainer);

    CloudBlockBlob myindexfile = learningModulContainer.GetBlockBlobReference(pageBlobName);

    SharedAccessBlobPermissions permission = SharedAccessBlobPermissions.None;
    permission = SharedAccessBlobPermissions.Read;
    var token = GetSasForBlob(myindexfile, permission, 30);

    //this isn't finished.....must get learning module
    var module = DataAccessService.Get<LearningModule>(id);
    var url = $"{Common.Constants.Azure.StorageAccountUrl}{"test"}/{module.ScormPackage.Path.Replace("index_lms", "story")}{token}";

    return Redirect(url); // was Redirect(token) - redirect to the full URL, not just the SAS token
}
public static string GetSasForBlob(CloudBlockBlob blob, SharedAccessBlobPermissions permission, int sasMinutesValid)
{
    // var sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
    var sasToken = blob.Container.GetSharedAccessSignature(new SharedAccessBlobPolicy()
    {
        Permissions = permission,
        SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-15),
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(sasMinutesValid),
    });
    return string.Format(CultureInfo.InvariantCulture, "{0}{1}", blob.Uri, sasToken);
}
How can I get all the files needed for story.html to run properly, be set to read access also?
Firstly, if possible, you could put the CSS, JS, image, and other files that your HTML page references in a container that allows public access.
Secondly, you could reference each resource by its blob URL with a SAS appended, like this in your HTML page:
<link href="https://{storageaccount}.blob.core.windows.net/styles/Style1.css?st=2017-06-15T02%3A27%3A00Z&se=2017-06-30T02%3A27%3A00Z&sp=r&sv=2015-04-05&sr=b&sig=%2FWwN0F4qyoIH97d7znRKo9lcp84S4oahU9RBwHTnlXk%3D" rel="stylesheet" />
Besides, if you'd like to host your web app, you could try Azure App Service.

HttpResponseMessage Redirect to Private Azure Blob Storage

I'm developing an API using ASP.NET.
Is it possible to redirect a user to a private Azure blob? Can I do this using SAS keys or the Azure Blob SDK?
For example I want to do something like this:
var response = Request.CreateResponse(HttpStatusCode.Moved);
response.Headers.Location = new Uri(bloburl);
return response;
Is it possible to access a private blob by putting a key in the URL? Obviously I don't want to expose the master key.
Is it possible to redirect a user to a private azure blob storage? Can i do this using SAS keys or the azure blob SDK?
Yes, it is entirely possible to redirect a user to a private blob. You would need to create a Shared Access Signature (SAS) with at least Read permission and append that SAS token to your blob URL and do a redirect to that URL.
Your code would look something like this:
var cred = new StorageCredentials(accountName, accountKey);
var account = new CloudStorageAccount(cred, true);
var client = account.CreateCloudBlobClient();
var container = client.GetContainerReference("container-name");
var blob = container.GetBlockBlobReference("blob-name");

var sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1) // assuming you want the link to expire after 1 hour
});

var blobUrl = string.Format("{0}{1}", blob.Uri.AbsoluteUri, sasToken);

var response = Request.CreateResponse(HttpStatusCode.Moved);
response.Headers.Location = new Uri(blobUrl); // was "bloburl" - use the SAS-decorated URL built above
return response;

Categories

Resources