I've combed through at least a dozen different versions of this problem but none of them have so far produced an answer to the one I'm having. I'm attempting to develop a proof-of-concept for retrieving an image from Azure BLOB storage using Azurite and a service SAS with a corresponding storage account key. I'm using the well-known storage account credentials for Azurite as indicated here. There appear to be innumerable ways to do this kind of thing in C# but I opted to loosely follow this example and ended up with something like this:
public Uri GetSharedKeySasUrl()
{
    BlobServiceClient serviceClient = new BlobServiceClient(
        new Uri("http://127.0.0.1:10000/devstoreaccount1/"),
        new Azure.Storage.StorageSharedKeyCredential(
            "devstoreaccount1",
            "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="
        )
    );

    var blobContainer = serviceClient.GetBlobContainerClient("images");
    var blobClient = blobContainer.GetBlobClient("00000-pano.jpg");

    BlobSasBuilder sasBuilder = new BlobSasBuilder()
    {
        BlobContainerName = "images",
        BlobName = "00000-pano.jpg",
        Resource = "b",
        StartsOn = DateTimeOffset.UtcNow,
        ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)
    };
    sasBuilder.SetPermissions(BlobSasPermissions.Read);

    Uri sasUri = blobClient.GenerateSasUri(sasBuilder);
    Debug.WriteLine("SAS URI for blob is: {0}", sasUri);
    return sasUri;
}
This produces a URL that looks as though it should work just fine, but whenever I paste it into a browser and attempt to access it I consistently get the following:
<Error>
<Code>AuthorizationFailure</Code>
<Message>Server failed to authenticate the request. Make sure the value of the Authorization header is formed correctly including the signature. RequestId:551902dd-ecb8-4736-9cf1-32406d98c02f Time:2022-04-11T20:18:06.368Z</Message>
</Error>
Conversely, if I generate a SAS signature through the Microsoft Azure Storage Explorer, that URL works perfectly. If I compare this URL to the URL this method generates, both seem to be perfectly well-formatted - I can't see anything that would suggest mine is malformed.
Generated by the method:
http://127.0.0.1:10000/devstoreaccount1/images/00000-pano.jpg?sv=2021-04-10&st=2022-04-11T20%3A17%3A59Z&se=2022-04-11T21%3A17%3A59Z&sr=b&sp=r&sig=bUHI2562NmvvtflOqT5kr0E%2BnZv7Q12PlR%2FGNPmEhL8%3D
Generated in Storage Explorer:
http://127.0.0.1:10000/devstoreaccount1/images/00000-pano.jpg?sv=2018-03-28&st=2022-04-11T20%3A06%3A47Z&se=2022-04-12T20%3A06%3A47Z&sr=b&sp=r&sig=V6N7uWDGgoVx8wirM%2FP1ou2kbg05PB4D%2BG8YQdvS5RU%3D
The only notable difference seems to be the service version each one uses - Storage Explorer uses 2018-03-28 while the method uses 2021-04-10. I've tried assigning the version in the BlobSasBuilder, but it just gets ignored, which is consistent with Microsoft's own remarks that the property is deprecated and that the class always uses the latest supported service version regardless of what you specify. I still don't think that explains the issue, though.
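For reference, one way to confirm which sv value the SDK actually stamps into the token (an illustrative snippet, not part of my original code) is to parse the query string of the generated URI:

// Requires System.Web (HttpUtility is in the shared framework on .NET Core 3.x+).
Uri sasUri = GetSharedKeySasUrl(); // the method shown above
var query = System.Web.HttpUtility.ParseQueryString(sasUri.Query); // handles the leading '?'
Console.WriteLine("SAS service version (sv): " + query["sv"]);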
Any ideas? Thanks in advance for the help.
EDIT
I remain hopelessly stuck on this. I've made numerous attempts to reformulate the code I'm using to construct the SAS URI in hopes that something might work. Not a single one has worked so far. The methods I've tried:
Using the BlobUriBuilder, SasBuilder and StorageSharedKeyCredential to generate the URI
Using the BlobServiceClient with a StorageSharedKeyCredential to derive the BlobContainerClient and in turn the BlobClient, and then the SasBuilder to generate the URI
Same as above using a connection string with the BlobServiceClient in place of a StorageSharedKeyCredential
Same as above using "UseDevelopmentStorage=true" as the connection string with the BlobServiceClient
Signing the string manually using HMACSHA256 and a NameValueCollection to construct the query parameters (sketched just below)
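For reference, the HMAC step of that manual approach generally looks like the sketch below. Only the signing step is shown; the exact string-to-sign layout is version-specific and documented by Microsoft, so the helper name and inputs here are illustrative only:

using System;
using System.Security.Cryptography;
using System.Text;

static string SignSasStringToSign(string stringToSign, string base64AccountKey)
{
    // The account key is base64; the signature is the base64-encoded
    // HMAC-SHA256 of the UTF-8 bytes of the string-to-sign.
    using var hmac = new HMACSHA256(Convert.FromBase64String(base64AccountKey));
    byte[] signature = hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign));
    return Convert.ToBase64String(signature);
}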
I've also experimented with running Azurite using different startup options, including azurite --loose and azurite --skipApiVersionCheck. Neither one makes any difference.
Other things I've tried:
Updating the version of Azurite I'm using
Running Azurite from within Visual Studio as a service dependency
I simply can't understand why this would be such an insurmountable problem. Please help.
Updated Solution:
I have tested Azure.Storage.Blobs package version 12.11.0, and it reproduces the exact issue the OP described in the question.
Changing this to version 12.10.0 solved the problem.
This suggests that version 12.11.0 introduced either a breaking change or a bug.
So until Microsoft provides a fix or an explanation, the workaround is to use version 12.10.0.
Original answer:
Original answer:
I have tested your code and it works; I do not get the error you have. I suggest you either reinstall the latest version of Azurite or set things up the way I did, which worked for me.
To ensure a consistent environment, I used Docker to run Azurite.
I use Windows 11 and Docker Desktop; for installation details, check the Microsoft docs.
docker pull mcr.microsoft.com/azure-storage/azurite
Then
docker run -p 10000:10000 -p 10001:10001 -p 10002:10002 -v c:/azurite:/data mcr.microsoft.com/azure-storage/azurite
With Docker up and running, I used Microsoft Azure Storage Explorer to create a blob container and upload the image 00000-pano.jpg (a flower image).
Then I took your code without changing anything and ran it in my console application with the following package reference:
<PackageReference Include="Azure.Storage.Blobs" Version="12.10.0" />
And running the code, I get the following URL:
http://127.0.0.1:10000/devstoreaccount1/images/00000-pano.jpg?sv=2020-08-04&st=2022-04-20T10%3A47%3A00Z&se=2022-04-20T11%3A47%3A00Z&sr=b&sp=r&sig=RHHe5NJRjhbOjoTeCONqhrRHjjJCygOa8urcwzOWpeE%3D
I copy and paste the URL into my browser and I get the flower without any authentication issues.
The Docker console logs confirmed the requests I was making against the image.
Disclaimer: I have some articles about Azurite and Docker, testing Azure code with Azurite, and mocking Azure storage; these might add value to your further work. (https://itbackyard.com/tag/azurite/)
Update
I found out after submitting a bug to the Azure SDK team that the fix for this is actually being handled on the Azurite side, per these previously identified issues. It appears Azurite simply doesn't support SAS versions 2020-10-02 and newer at this time. Worth keeping an eye on this one for those wishing to develop locally using the latest SAS versions.
Original Answer
It was a versioning issue. For whatever reason the latest version of Azure.Storage.Blobs (12.11.0, at the time of writing) absolutely refuses to work with Azurite, but version 12.10.0 works just fine.
It appears there was some kind of breaking change with the new version. I've submitted a bug to the SDK team, just to add some visibility. In the meantime I suppose I'll just be sticking to version 12.10.0.
Please check the below points:
Please make sure you have been assigned the Storage Blob Data Contributor or Storage Blob Data Reader role. Navigate to Storage account -> Access Control (IAM) -> Add role assignment, or ask the admin to assign that role.
Also check the access level of the container and change it to allow blob read access if it is private.
In my case, I am also an admin on my storage account and have read, write and create SAS permissions enabled.
Please check if there is any timezone difference on the local computer (an out-of-sync timestamp).
Also please check if your storage account is firewall enabled:
Azure Portal -> Storage Account -> Networking -> check "Allow access from" (All networks / Selected networks).
If it is "Selected networks", the storage account is firewall enabled.
If the storage account is firewall enabled, that may be the cause of the error, so please check that your client is whitelisted for access; note that by default the Azurite endpoint is 127.0.0.1:10000.
Also check if the SAS has expired.
Please also check with the https endpoint.
SO reference: add a file to Azurite
The easiest way is to use a connection string in the app config file:
<appSettings>
<add key="StorageConnectionString" value="UseDevelopmentStorage=true" />
</appSettings>
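A minimal sketch of consuming that setting (this assumes the System.Configuration.ConfigurationManager package on .NET Core; on .NET Framework it is built in):

using System.Configuration;
using Azure.Storage.Blobs;

// Reads the connection string added to appSettings above; "UseDevelopmentStorage=true"
// is understood by the SDK and resolves to the local Azurite/emulator endpoints.
string storageConnectionString = ConfigurationManager.AppSettings["StorageConnectionString"];
BlobServiceClient serviceClient = new BlobServiceClient(storageConnectionString);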
NOTE: Keep track of similar ongoing Azurite issues on GitHub.
With connection string:
var connectionString = "DefaultEndpointsProtocol=https;AccountName=.........";
BlobServiceClient _blobServiceClient = new BlobServiceClient(connectionString);

// Alternative (commented out): generate an account-level SAS instead of a blob SAS.
// var blobUri = _blobServiceClient.GenerateAccountSasUri(
//     AccountSasPermissions.Read, DateTimeOffset.UtcNow.AddDays(10),
//     AccountSasResourceTypes.Object);
// var sasCredential = new AzureSasCredential(blobUri.Query);

var blobContainerClient = _blobServiceClient.GetBlobContainerClient("imagecont"); // container name
var blobClient = blobContainerClient.GetBlobClient("computeimage.jpg"); // my image blob name

var sasBuilder = new BlobSasBuilder()
{
    BlobContainerName = blobClient.BlobContainerName,
    BlobName = blobClient.Name,
    Resource = "b", // b for blob, c for container
    StartsOn = DateTimeOffset.UtcNow,
    ExpiresOn = DateTimeOffset.UtcNow.AddHours(2),
};
sasBuilder.SetPermissions(BlobSasPermissions.Read); // read permissions

Uri sasUri = blobClient.GenerateSasUri(sasBuilder);
Console.WriteLine("SAS URI for blob is: {0}", sasUri);
With connection string and account key (SAS key), get the SAS token of the blob:
var connectionString = "DefaultEndpointsProtocol=h.......";
BlobServiceClient _blobServiceClient = new BlobServiceClient(connectionString);
var storageAccountName = " ";
var blobContainerClient = _blobServiceClient.GetBlobContainerClient("imagecont");
var blobClient = blobContainerClient.GetBlobClient("computeimage.jpg"); // blob name

// Here we use the account key: StorageSharedKeyCredential(storageAccountName, accountKey)
StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, "hmjkxxxxxxX3A==");

var sasBuilder = new BlobSasBuilder()
{
    BlobContainerName = blobClient.BlobContainerName,
    BlobName = blobClient.Name,
    Resource = "b", // b for blob
    StartsOn = DateTimeOffset.UtcNow,
    ExpiresOn = DateTimeOffset.UtcNow.AddHours(2),
};
sasBuilder.SetPermissions(BlobSasPermissions.Read); // read permissions

string sasToken = sasBuilder.ToSasQueryParameters(sharedKeyCredential).ToString();
Console.WriteLine("SAS token for blob is: {0}", sasToken);
OUTPUT SAS TOKEN:
Form the URL: https://<storage account name>.blob.core.windows.net/<container>/<blob name, e.g. computeimage.jpg>?<SAS token here>
And use that endpoint to get/retrieve the image, as sketched below.
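Using the blobClient and sasToken from the snippet above, forming and fetching that URL could look roughly like this (a sketch; the HTTP call assumes an async context):

// Compose the blob URI plus the SAS token as its query string.
var uriBuilder = new UriBuilder(blobClient.Uri) { Query = sasToken };
Uri blobSasUri = uriBuilder.Uri;

// Retrieve the image over plain HTTP using the SAS-authorized URL.
using var http = new System.Net.Http.HttpClient();
byte[] imageBytes = await http.GetByteArrayAsync(blobSasUri); // inside an async method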
OUTPUT: RETRIEVED IMAGE BOTH WAYS (FROM URL AND BY FORMING URL FROM SAS TOKEN)
REFERENCES:
azure-storage-net/issues(GITHUB)
How to access Azure blob using SAS in C#
How to get a Shared Access Signature on a Blob
I experienced the exact same problem as OP but with slightly different technologies:
Azurite 3.16.0 (Docker image)
Azure Storage Blob Java SDK 12.15.0
#azure/storage-blob JavaScript SDK 12.9.0
In my application, I generate the SAS using the Java SDK and then pass it to a JavaScript app running in the browser which then uploads a file to an Azure Storage blob container.
My SAS generator code is explicitly using BlobServiceVersion.V2021_04_10
My SAS generator code:
BlobContainerClient container = new BlobContainerClientBuilder()
.credential(azuriteStorageCredential())
.endpoint(endpoint)
.containerName(containerName)
.serviceVersion(BlobServiceVersion.V2021_04_10)
.buildClient();
BlobClient blob = container.getBlobClient(blobName);
OffsetDateTime expiration = OffsetDateTime
.now()
.plusMinutes(30);
BlobSasPermission permissions = new BlobSasPermission()
.setCreatePermission(true);
BlobServiceSasSignatureValues values = new BlobServiceSasSignatureValues(expiration, permissions)
.setContentType("video/mp4")
.setProtocol(SasProtocol.HTTPS_HTTP);
String sas = blob.generateSas(values);
String url = blob.getBlobUrl() + "?" + sas; // the final blob url passed to my javascript app
JavaScript code
const options = {
blobHTTPHeaders: {
blobContentType: 'video/mp4'
}
};
const client = new BlockBlobClient(url);
client.uploadData(file, options);
I too used the well-known storage account credentials for Azurite and kept getting "Server failed to authenticate the request. Make sure the value of the Authorization header is formed correctly including the signature" and couldn't, for the life of me, figure out why Azurite was rejecting the SAS.
I turned on verbose logging in Azurite and noticed the signatures didn't match but couldn't determine why.
I too went through the process of trying a bunch of different ways to do the same thing and then eventually stumbled upon this StackOverflow post and decided to try upgrading the libraries.
I upgraded to:
Azurite 3.17.1
Azure Storage Blob Java SDK 12.16.0
and amazingly... it finally worked!
So it turned out the problem was either in Azurite or in the Java SDK...
I didn't bother to independently downgrade them to try to figure out which was actually the problem but I figured I would post this here for anyone experiencing a similar issue using these same SDKs + Azurite.
Update: based on this comment, it looks like the bug was in Azurite and was fixed in 3.17.0
I want to initiate an Azure PIM request using C#/.NET.
I already found a PowerShell function to do this:
New-AzurePIMRequest ... inside the "PIMTools" package: https://www.powershellgallery.com/packages/PIMTools/0.4.0.0
This works just fine and of course I could just execute a PowerShell script containing it from within my C# application. But I would prefer to achieve the same natively, using a NuGet package or a library from within my application.
Is there a package that allows me to achieve the same from within C#?
Those PIMTools are just wrapping some existing PowerShell modules. You can check the details here: https://github.com/janegilring/PIMTools/blob/main/functions/New-AzurePIMRequest.ps1
As you can see, they mainly use the AzureADPreview module, which gives access to the Microsoft Graph endpoint. Microsoft Graph is a RESTful web API that enables you to access Microsoft Cloud service resources.
You can find the Graph SDK here: https://github.com/microsoftgraph/msgraph-sdk-dotnet
With the SDK installed you can use something like this to issue a PIM Request:
var graphClient = new GraphServiceClient(new DefaultAzureCredential());
var privilegedRoleAssignmentRequest = new PrivilegedRoleAssignmentRequestObject
{
Duration = "2",
Reason = "DevWork",
AssignmentState = "Active",
RoleId = "b24988ac-6180-42a0-ab88-20f7382dd24c",
};
await graphClient.PrivilegedRoleAssignmentRequests
.Request()
.AddAsync(privilegedRoleAssignmentRequest);
Note: You might have to use the /beta endpoint of the SDK to get PIM working. However, APIs under the /beta version in Microsoft Graph are subject to change; use of these APIs in production applications is not supported.
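A hedged sketch of switching to the beta endpoint; this assumes the pre-v5 Microsoft.Graph SDK, where the client exposes a settable BaseUrl, so verify it against the SDK version you actually use:

var graphClient = new GraphServiceClient(new DefaultAzureCredential());
// Assumption: on Microsoft.Graph 3.x/4.x the request base URL can be redirected
// to the beta endpoint; newer SDK generations ship a separate Microsoft.Graph.Beta package.
graphClient.BaseUrl = "https://graph.microsoft.com/beta";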
I have the task of making a .NET Core 3.1 console app that will run in a Linux docker container on the AWS platform and connect to an Azure File Store to read and write files. I am a C# programmer but have not had anything to do with the world of containers or Azure as yet.
I have received an Azure connection string in the following format:
DefaultEndpointsProtocol=https;AccountName=[ACCOUNT_NAME_HERE];AccountKey=[ACCOUNT_KEY_HERE];EndpointSuffix=core.windows.net
But following the examples I have seen online like this:
https://learn.microsoft.com/en-us/visualstudio/azure/vs-azure-tools-connected-services-storage?view=vs-2019
Right click on project in VS2019 and add Connected Service.
Select Azure Storage from the list.
Connect to your Azure Storage account
For step 3 you need to use an Azure Account login email/pass.
I don't have that, I just have a connection string.
I have found examples like the following:
http://www.mattruma.com/adventures-with-azure-storage-accessing-a-file-with-a-shared-access-signature/
https://www.c-sharpcorner.com/article/implement-azure-file-storage-using-asp-net-core-console-application/
But these both use:
Microsoft.Azure.Storage.Common
and under dependencies it lists .NET Standard and .NET Framework. I don't think these will run in the Linux docker container. Once I have worked out how docker containers work, I will do a test to confirm this.
Can anyone shed some light on how I can, from a .NET Core 3.1 console app running in a Linux docker container on the AWS platform, connect to an Azure File Store to read and write files using the Azure connection string format outlined above?
If your concern is how to add service dependencies via Connected Services: without the Azure account login email/password, you cannot add the service.
From your description, it seems you just want to read and write files in a storage file share from a C# console app. So you don't need to add a connected service to the project; just adding the code to your console app is enough.
Here is a simple example:
using System;
using System.IO;
using System.Text;
using Azure;
using Azure.Storage.Files.Shares;
namespace ConsoleApp4
{
    class Program
    {
        static void Main(string[] args)
        {
            string con_str = "DefaultEndpointsProtocol=https;AccountName=0730bowmanwindow;AccountKey=xxxxxx;EndpointSuffix=core.windows.net";
            string sharename = "test";
            string filename = "test.txt";
            string directoryname = "testdirectory";

            ShareServiceClient shareserviceclient = new ShareServiceClient(con_str);
            ShareClient shareclient = shareserviceclient.GetShareClient(sharename);
            ShareDirectoryClient sharedirectoryclient = shareclient.GetDirectoryClient(directoryname);

            // write data.
            ShareFileClient sharefileclient_in = sharedirectoryclient.CreateFile(filename, 1000);
            string filecontent_in = "This is the content of the file.";
            byte[] byteArray = Encoding.UTF8.GetBytes(filecontent_in);
            MemoryStream stream1 = new MemoryStream(byteArray);
            stream1.Position = 0;
            sharefileclient_in.Upload(stream1);

            // read data.
            ShareFileClient sharefileclient_out = sharedirectoryclient.GetFileClient(filename);
            Stream stream2 = sharefileclient_out.Download().Value.Content;
            StreamReader reader = new StreamReader(stream2);
            string filecontent_out = reader.ReadToEnd();
            Console.WriteLine(filecontent_out);
        }
    }
}
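One note I would add (not part of the original sample): the share and directory must already exist before CreateFile succeeds, so you may want to create them first:

// Creates the share and directory only if they are missing; safe to call repeatedly.
shareclient.CreateIfNotExists();
sharedirectoryclient.CreateIfNotExists();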
There is a C# example, client-libraries-usage-csharp, showing how to use the library.
And there is an example of how to set an environment variable:
export GOOGLE_APPLICATION_CREDENTIALS="/home/user/Downloads/[FILE_NAME].json"
How do I set credentials for Google Speech-to-Text without setting the environment variable?
Something like this:
var credentials = ...create(file.json);
var speech = SpeechClient.Create(credentials);
using Grpc.Auth;
then
string keyPath = "key.json";
GoogleCredential googleCredential;
using (Stream m = new FileStream(keyPath, FileMode.Open))
googleCredential = GoogleCredential.FromStream(m);
var channel = new Grpc.Core.Channel(SpeechClient.DefaultEndpoint.Host,
googleCredential.ToChannelCredentials());
var speech = SpeechClient.Create(channel);
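For completeness, a hypothetical use of the speech client created above (the file name, sample rate and language are assumptions for illustration):

using System;
using Google.Cloud.Speech.V1;

var config = new RecognitionConfig
{
    Encoding = RecognitionConfig.Types.AudioEncoding.Linear16,
    SampleRateHertz = 16000,
    LanguageCode = "en-US"
};
var audio = RecognitionAudio.FromFile("audio.raw");

// Run a synchronous recognition request and print the transcripts.
var response = speech.Recognize(config, audio);
foreach (var result in response.Results)
    foreach (var alternative in result.Alternatives)
        Console.WriteLine(alternative.Transcript);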
Unless you're running your application on a GCP service, there's no other way to obtain Service Account credentials for client libraries than to set the environment variable.
GCP client libraries use a strategy called Application Default Credentials (ADC) to find your application's credentials.
By default, the client library will use the JSON pointed to by the environment variable. If the JSON is not found but your app is running on App Engine, Compute Engine or Kubernetes Engine, then your application will use the credentials of the default service account (for example, the App Engine default service account, if your application is running on App Engine.)
SpeechClient.Create() no longer accepts a credentials parameter as of version 2.7.0, but I found the following solution:
var client = new SpeechClientBuilder { JsonCredentials = "..." }.Build()
JsonCredentials accepts a string with the json content.
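A slightly fuller sketch, assuming the credentials live in a local key.json file:

using System.IO;
using Google.Cloud.Speech.V1;

// JsonCredentials takes the raw JSON text, so read the key file into a string first.
var client = new SpeechClientBuilder
{
    JsonCredentials = File.ReadAllText("key.json")
}.Build();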
I have the following very basic TTS code running on my local server
using System.Speech.Synthesis;
...
SpeechSynthesizer reader = new SpeechSynthesizer();
reader.Speak("This is a test");
This code has a dependency on System.Speech for which I have added a Reference in my VS 2015 project.
It works fine, but from what I have read, and from trying it, I know this will not work when the code is hosted on Azure.
I have read several posts on SO asking whether it is actually possible to do TTS on Azure. Certainly 2 years ago it did not appear to be possible: How to get System.Speech on windows azure websites?
All roads seem to lead to the Microsoft Speech API
https://azure.microsoft.com/en-gb/marketplace/partners/speechapis/speechapis/
I have signed up and have gotten my private and sec keys for calling into this API.
However my question is this. How do I actually call the SpeechAPI? What do I have to change in the simple code example above so that this will work when running on azure?
The speech API you referred to in the Azure marketplace is part of a Microsoft AI project called ProjectOxford, which offers an array of APIs for computer vision, speech and language.
These are all RESTful APIs, meaning that you will be constructing HTTP requests to send to a hosted online service in the cloud.
The speech-to-text documentation is available here and you can find sample code for various clients on github. Specifically for C# you can see some code in this sample project.
Please note that ProjectOxford is still in preview (Beta). Additional support for using these APIs can be found on the ProjectOxford MSDN forum.
But just to give you an idea of what your program will look like (taken from the above code sample on GitHub):
AccessTokenInfo token;
// Note: Sign up at http://www.projectoxford.ai for the client credentials.
Authentication auth = new Authentication("Your ClientId goes here", "Your Client Secret goes here");
...
token = auth.GetAccessToken();
...
string requestUri = "https://speech.platform.bing.com/synthesize";
var cortana = new Synthesize(new Synthesize.InputOptions()
{
RequestUri = new Uri(requestUri),
// Text to be spoken.
Text = "Hi, how are you doing?",
VoiceType = Gender.Female,
// Refer to the documentation for complete list of supported locales.
Locale = "en-US",
// You can also customize the output voice. Refer to the documentation to view the different
// voices that the TTS service can output.
VoiceName = "Microsoft Server Speech Text to Speech Voice (en-US, ZiraRUS)",
// Service can return audio in different output format.
OutputFormat = AudioOutputFormat.Riff16Khz16BitMonoPcm,
AuthorizationToken = "Bearer " + token.access_token,
});
cortana.OnAudioAvailable += PlayAudio;
cortana.OnError += ErrorHandler;
cortana.Speak(CancellationToken.None).Wait();
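The PlayAudio and ErrorHandler methods wired up above come from that same GitHub sample; roughly, they look like the sketch below (GenericEventArgs<T> is a helper type defined in the sample project, and saving the stream to a file here is my substitution for actually playing the audio):

private static void PlayAudio(object sender, GenericEventArgs<Stream> args)
{
    // The service streams back audio in the requested output format
    // (RIFF 16 kHz 16-bit mono PCM above); here it is simply written to disk.
    using (var file = File.Create("output.wav"))
    {
        args.EventData.CopyTo(file);
    }
}

private static void ErrorHandler(object sender, GenericEventArgs<Exception> args)
{
    Console.WriteLine("Synthesis error: {0}", args.EventData.Message);
}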