I am using the C#/.NET API to work with Azure File Storage but cannot successfully list all files in a file share. My code fails with:
Microsoft.WindowsAzure.Storage: Server failed to authenticate the
request. Make sure the value of Authorization header is formed
correctly including the signature.
The following code works perfectly, so my connection to the file share 'temp' is fine:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
CloudFileShare share = fileClient.GetShareReference("temp");
CloudFile f = share.GetRootDirectoryReference().GetFileReference("Report-461fab0e-068e-42f0-b480-c5744272e103-8-14-2018.pdf");
log.Info("size " + f.StreamMinimumReadSizeInBytes.ToString());
The code below results in the authentication error above:
FileContinuationToken continuationToken = null;
do
{
    var response = await share.GetRootDirectoryReference().ListFilesAndDirectoriesSegmentedAsync(continuationToken);
    // response.Results contains this segment's files and directories.
    continuationToken = response.ContinuationToken;
}
while (continuationToken != null);
Any help would be appreciated.
Thanks.
Using key 1 instead of key 2 resolved the issue.
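For context, a minimal sketch of the connection string both snippets rely on; every value is a placeholder, with the storage account's key 1 used as AccountKey:
// Sketch: connection string format consumed by CloudStorageAccount.Parse (placeholders only).
string connectionString =
    "DefaultEndpointsProtocol=https;" +
    "AccountName=<storage-account-name>;" +
    "AccountKey=<key-1-value>;" +
    "EndpointSuffix=core.windows.net";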
I am using Azure Blob Storage to copy a Dropbox file, but when I try to copy the file via its URL I get a 500 error and TotalBytes is -1.
I am using the StartCopy method of the WindowsAzure.Storage package, but copyStatus.TotalBytes comes back as -1 and the copy does not work.
I have tried all of the following URL variants:
https://dl.dropboxusercontent.com/s/1v9re1dozilpdgi/1_32min.mp4?dl=0
https://dl.dropboxusercontent.com/s/1v9re1dozilpdgi/1_32min.mp4?dl=1
https://www.dropbox.com/s/1v9re1dozilpdgi/1_32min.mp4?dl=0
Can you please help me solve this issue? Does anything need to change in the URL, or is there another way to copy Dropbox media to Azure Blob Storage?
I am using .NET Framework 4.8 with C#.
Sample Code:
string url = "https://dl.dropboxusercontent.com/s/1v9re1dozilpdgi/1_32min.mp4?dl=0";
Uri fileUri = new Uri(url);
string filename = "test-file.mp4";
var account = CloudStorageAccount.Parse(connectionstring);
var blobClient = account.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("test-container");
var blob = container.GetBlockBlobReference(filename);
blob.DeleteIfExists();
blob.StartCopy(fileUri);
var refBlob = (CloudBlockBlob)container.GetBlobReferenceFromServer(filename);
var fileLength = refBlob.CopyState.TotalBytes ?? 0;
while (refBlob.CopyState.Status == CopyStatus.Pending)
{
refBlob = (CloudBlockBlob)container.GetBlobReferenceFromServer(filename);
var copyStatus = refBlob.CopyState;
}
Error message: 500 InternalServerError "Copy failed."
We need to construct the CloudBlockBlob from the URI directly instead of using GetBlockBlobReference, because GetBlockBlobReference takes a blob name, not a URL, in its constructor.
For more information, please refer to the SO thread suggested by Tobias Tengler and this blog: Azure – Upload and Download data using C#.NET.
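Separately, a minimal sketch of the download-then-upload route that the linked blog's title suggests, in case the server-side copy from Dropbox keeps failing; the dl=1 form of the link, the HttpClient download, and the names below are assumptions:
// Sketch: download the Dropbox file over HTTPS and stream it into the block blob.
private static async Task CopyDropboxFileAsync(string connectionString)
{
    string url = "https://dl.dropboxusercontent.com/s/1v9re1dozilpdgi/1_32min.mp4?dl=1";
    var account = CloudStorageAccount.Parse(connectionString);
    var container = account.CreateCloudBlobClient().GetContainerReference("test-container");
    await container.CreateIfNotExistsAsync();
    var blob = container.GetBlockBlobReference("test-file.mp4");

    using (var http = new HttpClient())                    // System.Net.Http
    using (var stream = await http.GetStreamAsync(url))
    {
        await blob.UploadFromStreamAsync(stream);          // streams the download into the blob
    }
}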
In general, I want to get files from an Azure Blob Storage account using C#/.NET, and I have an SPN name.
I have tried the following ways to do it.
Note: the authentication key is stored in Key Vault, and the Azure admin has provided us with only the SPN name.
Way 1:
private static async Task<string> GetAccessToken()
{
    string accessToken = "";
    try
    {
        var authContext = new AuthenticationContext($"https://login.windows.net/{TenantID}");
        var credential = new ClientCredential("{ClientID}", "{SPN Name}");
        var result = await authContext.AcquireTokenAsync("https://storage.azure.com", credential);
        if (result == null)
        {
            throw new Exception("Failed to authenticate via ADAL");
        }
        accessToken = result.AccessToken;
    }
    catch (Exception ex)
    {
        System.Diagnostics.Trace.WriteLine("Exception for get Blob container" + ex.Message);
    }
    return accessToken;
}
Way 2:
private static async Task<string> GetAccessToken()
{
    var serviceTokenProvider = new AzureServiceTokenProvider();
    var keyVaultClient = new KeyVaultClient(new KeyVaultClient.AuthenticationCallback(serviceTokenProvider.KeyVaultTokenCallback));
    SecretBundle secretValue = null;
    try
    {
        secretValue = await keyVaultClient.GetSecretAsync("{KeyVault URI}", "{SPN_Name}");
    }
    catch (Exception kex)
    {
        System.Diagnostics.Trace.WriteLine("Exception for get Blob container" + kex.Message);
    }
    return secretValue?.Value;
}
The following code accesses files from Azure Storage:
public static async Task<List<IListBlobItem>> GetBlobContainer(string containerName)
{
    var token = await GetAccessToken();
    TokenCredential tokenCredential = new TokenCredential(token);

    // here I am getting all files from the last day
    DateTime date = DateTime.Now.AddDays(-1);

    StorageCredentials objectCredentials = new StorageCredentials(tokenCredential);
    CloudBlobClient blobClient = new CloudBlobClient(new Uri("{Storage Account URI}"), objectCredentials);

    // container
    CloudBlobContainer blobContainer = blobClient.GetContainerReference(containerName);
    IEnumerable<IListBlobItem> listOfBlob = blobContainer.ListBlobs().OfType<CloudBlob>()
        .Where(b => b.Properties.LastModified > date)
        .OrderByDescending(b => b.Properties.LastModified);

    // check here whether you get any items
    return listOfBlob.ToList();
}
Way 2 gives this error:
Parameters: Connection String: [No connection string specified], Resource: https://vault.azure.net, Authority: https://login.windows.net/{tenantID}. Exception Message: Tried the following 3 methods to get an access token, but none of them worked.

Parameters: Connection String: [No connection string specified], Resource: https://vault.azure.net, Authority: https://login.windows.net/{tenantID}. Exception Message: Tried to get token using Managed Service Identity. Unable to connect to the Instance Metadata Service (IMDS). Skipping request to the Managed Service Identity (MSI) token endpoint.

Parameters: Connection String: [No connection string specified], Resource: https://vault.azure.net, Authority: https://login.windows.net/{tenantID}. Exception Message: Tried to get token using Visual Studio. Access token could not be acquired. Exception for Visual Studio token provider Microsoft.Asal.TokenService.exe: TS003: Error, TS005: No accounts found. Please go to Tools->Options->Azure Services Authentication, and add an account to be authenticated to Azure services during development.

Parameters: Connection String: [No connection string specified], Resource: https://vault.azure.net, Authority: https://login.windows.net/{tenantID}. Exception Message: Tried to get token using Azure CLI. Access token could not be acquired. 'az' is not recognized as an internal or external command, operable program or batch file.
Please correct me if I have followed the wrong steps.
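For what it's worth, the Way 2 error above indicates that AzureServiceTokenProvider could not find any local credentials (no managed identity, no Visual Studio sign-in, no Azure CLI). A minimal sketch of one workaround for local development, assuming the SPN's application (client) ID and secret are available; every value below is a placeholder:
// Sketch: pass explicit service-principal credentials to AzureServiceTokenProvider
// instead of relying on MSI / Visual Studio / Azure CLI. Placeholder values only.
private static async Task<string> GetSecretWithExplicitSpnAsync()
{
    var connection = "RunAs=App;AppId={ClientID};TenantId={TenantID};AppKey={ClientSecret}";
    var tokenProvider = new AzureServiceTokenProvider(connection);
    var keyVaultClient = new KeyVaultClient(new KeyVaultClient.AuthenticationCallback(tokenProvider.KeyVaultTokenCallback));

    // GetSecretAsync takes the vault base URL and the secret name.
    var secret = await keyVaultClient.GetSecretAsync("https://{vault-name}.vault.azure.net", "{SecretName}");
    return secret.Value;
}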
I am trying to rename a container for users when a condition is met. I did some research and found out that there is no rename function for containers in Azure Blob Storage, but there is a way to accomplish this by copying the blobs and deleting them after the copy. Below is the code I made.
string ContainerName = "old-container-name-user1";
string NewContainerName = "new-container-name-user2";
CloudStorageAccount sourceAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
CloudBlobClient sourceblobClient = sourceAccount.CreateCloudBlobClient();
CloudBlobContainer sourceBlobContainer = sourceblobClient.GetContainerReference(ContainerName);
CloudBlobContainer destBlobContainer = sourceblobClient.GetContainerReference(NewContainerName);
CloudBlockBlob blobOriginal = sourceBlobContainer.GetBlockBlobReference(ContainerName);
CloudBlockBlob blobNew = destBlobContainer.GetBlockBlobReference(NewContainerName);
blobNew.StartCopy(blobOriginal);
blobOriginal.Delete();
When I execute this code I got an error message. Below is the error.
Exception User-Unhandled
Microsoft.WindowsAzure.Storage.StorageException:'The remote server
returned an error: (404) Not Found.'
Inner Exception WebException: The remote server returned an error:
(404) Not Found.
When I also try blobNew.StartCopyAsync(blobOriginal), the code just runs through, but when I check the containers in Azure there is no container created. What do you think the problem is? Any tips on how to improve my code? The Delete function does not work either.
UPDATE
I updated my code and was able to copy the files from the old container to the new one. Below is the code.
string ContainerName = "old-container-name-user1";
string NewContainerName = "new-container-name-user2";
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference(ContainerName);
CloudBlobContainer destcontainer = blobClient.GetContainerReference(NewContainerName);
destcontainer.CreateIfNotExists(BlobContainerPublicAccessType.Blob);
IEnumerable<IListBlobItem> IE = container.ListBlobs(useFlatBlobListing: true);
foreach (IListBlobItem item in IE)
{
    CloudBlockBlob blob = (CloudBlockBlob)item;
    CloudBlockBlob destBlob = destcontainer.GetBlockBlobReference(blob.Name);
    // Note: StartCopyAsync is not awaited here, so the loop does not wait for each copy.
    destBlob.StartCopyAsync(new Uri(GetSharedAccessUri(blob.Name, container)));
}
AccessURI Method
private static string GetSharedAccessUri(string blobName, CloudBlobContainer container)
{
DateTime toDateTime = DateTime.Now.AddMinutes(60);
SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy
{
Permissions = SharedAccessBlobPermissions.Read,
SharedAccessStartTime = null,
SharedAccessExpiryTime = new DateTimeOffset(toDateTime)
};
CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
string sas = blob.GetSharedAccessSignature(policy);
return blob.Uri.AbsoluteUri + sas;
}
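As a side note, a minimal sketch of awaiting the copy and polling until it completes, assuming a method placed in the same class as GetSharedAccessUri above; the method name is illustrative:
// Sketch: start the server-side copy from a SAS URL and wait for it to finish.
private static async Task CopyAndWaitAsync(CloudBlobContainer source, CloudBlobContainer dest, string blobName)
{
    CloudBlockBlob destBlob = dest.GetBlockBlobReference(blobName);

    // Await the request so failures surface here instead of being swallowed.
    await destBlob.StartCopyAsync(new Uri(GetSharedAccessUri(blobName, source)));

    // Poll the destination blob until the copy leaves the Pending state.
    await destBlob.FetchAttributesAsync();
    while (destBlob.CopyState.Status == CopyStatus.Pending)
    {
        await Task.Delay(TimeSpan.FromSeconds(1));
        await destBlob.FetchAttributesAsync();
    }
}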
Now it's working, but another problem shows up. It says:
System.InvalidCastException: 'Unable to cast object of type
'Microsoft.WindowsAzure.Storage.Blob.CloudBlobDirectory' to type
'Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob'.'
But this will be posted in another question. Thanks to our friend Gauriv for pointing out my problem. Check his answer below.
Update 2
By adding useFlatBlobListing: true to IEnumerable<IListBlobItem> IE = container.ListBlobs(blobListingDetails: BlobListingDetails.Metadata); I was able to fix my problem. This line of code was in my display code.
Final Code
IEnumerable<IListBlobItem> IE = container.ListBlobs(blobListingDetails: BlobListingDetails.Metadata, useFlatBlobListing: true);
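For reference, a hedged alternative if the hierarchical listing is kept: filtering with OfType (System.Linq) skips the CloudBlobDirectory entries that caused the cast exception above:
// Sketch: without a flat listing, ListBlobs also returns CloudBlobDirectory items;
// OfType<CloudBlockBlob>() keeps only the blobs and avoids the InvalidCastException.
foreach (CloudBlockBlob blob in container.ListBlobs(blobListingDetails: BlobListingDetails.Metadata).OfType<CloudBlockBlob>())
{
    Console.WriteLine(blob.Name);
}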
If you look at your code for creating source and destination blob:
CloudBlockBlob blobOriginal = sourceBlobContainer.GetBlockBlobReference(ContainerName);
CloudBlockBlob blobNew = destBlobContainer.GetBlockBlobReference(NewContainerName);
You'll notice that you're passing the names of the blob containers and not the names of the blobs. Because there is no blob in the container with the same name as the container, you're getting a 404 error.
To copy a blob container, what you have to do is list all blobs from the source container and then copy them individually in the destination container. Once all the blobs have been copied, you can delete the source container.
If you want, you can use Microsoft's Storage Explorer to achieve "rename container" functionality. It also works by copying blobs from the old container to the renamed container and then deleting the old container.
static void RenameContainer()
{
var connectionString = "DefaultEndpointsProtocol=https;AccountName=account-name;AccountKey=account-key";
var storageAccount = CloudStorageAccount.Parse(connectionString);
var client = storageAccount.CreateCloudBlobClient();
var sourceContainer = client.GetContainerReference("source-container");
var targetContainer = client.GetContainerReference("target-container");
targetContainer.CreateIfNotExists();//Create target container
BlobContinuationToken continuationToken = null;
do
{
Console.WriteLine("Listing blobs. Please wait...");
var blobsResult = sourceContainer.ListBlobsSegmented(prefix: "", useFlatBlobListing: true, blobListingDetails: BlobListingDetails.All, maxResults: 1000, currentToken: continuationToken, options: new BlobRequestOptions(), operationContext: new OperationContext());
continuationToken = blobsResult.ContinuationToken;
var items = blobsResult.Results;
foreach (var item in items)
{
var blob = (CloudBlob)item;
var targetBlob = targetContainer.GetBlobReference(blob.Name);
Console.WriteLine(string.Format("Copying \"{0}\" from \"{1}\" blob container to \"{2}\" blob container.", blob.Name, sourceContainer.Name, targetContainer.Name));
targetBlob.StartCopy(blob.Uri);
}
} while (continuationToken != null);
Console.WriteLine("Deleting source blob container. Please wait.");
//sourceContainer.DeleteIfExists();
Console.WriteLine("Rename container operation complete. Press any key to terminate the application.");
}
You can rename containers with Microsoft's "Microsoft Azure Storage Explorer" (after version 0.8.3).
Actual Answer:
Regarding your error message: if the client application receives an HTTP 404 (Not Found) message from the server, it means the object the client was attempting to use does not exist in the storage service. There are several possible reasons for this, such as:
· The client or another process previously deleted the object (Make sure name is correct)
· A Shared Access Signature (SAS) authorization issue
· Client-side code does not have permission to access the object
· Network failure
In order to identify the issue in detail, you can add a try/catch and inspect the actual error.
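A minimal sketch of that, using the blob variables from your snippet; the properties shown are on Microsoft.WindowsAzure.Storage's StorageException:
// Sketch: catch StorageException to see the HTTP status and the storage error code.
try
{
    blobNew.StartCopy(blobOriginal);
    blobOriginal.Delete();
}
catch (StorageException ex)
{
    Console.WriteLine(ex.RequestInformation.HttpStatusCode);                       // e.g. 404
    Console.WriteLine(ex.RequestInformation.HttpStatusMessage);
    Console.WriteLine(ex.RequestInformation.ExtendedErrorInformation?.ErrorCode);  // may be null
}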
I'm using this approach to encrypt files and store them in Azure block blobs. I would like to copy the encrypted blob to another blob storage account and decrypt it in the process. I know it's possible to do a "copy blob" operation which runs entirely inside Azure asynchronously and doesn't download the blob contents through my local computer in transit. I believe this is accomplished through the CloudBlockBlob.StartCopy method. But is it possible to do that with an encrypted file and decrypt it in transit to the other storage account?
Following that link above, my code looks like the following. blob.OpenRead works but blob2.StartCopy doesn't work.
BlobEncryptionPolicy policy = new BlobEncryptionPolicy(null, cloudResolver);
BlobRequestOptions options = new BlobRequestOptions() { EncryptionPolicy = policy };
CloudBlockBlob blob = container.GetBlockBlobReference("MyFile.txt");
//var blobStream = blob.OpenRead(null, options); //this works
CloudBlockBlob blob2 = container2.GetBlockBlobReference("MyFile2.txt");
blob2.StartCopy(blob, null, null, options, null); //this fails with: The remote server returned an error: (404) Not Found.
The answer is that encryption is done in the storage client library, so if you copy the blob to a new storage account it will still be encrypted.
The reason your code is failing is that the source blob is in a private container. For cross-account copy to work, the source blob must be publicly accessible. Within the same storage account, you can copy a blob from a private container. AFAIK, the error has nothing to do with encryption.
What you could do is create a SAS URL on the source blob and then use the following override of StartCopy method:
public string StartCopy(
Uri source,
AccessCondition sourceAccessCondition = null,
AccessCondition destAccessCondition = null,
BlobRequestOptions options = null,
OperationContext operationContext = null
)
Here's the sample code to do so:
private static void StartCopyAcrossAccount()
{
var sourceAccount = new CloudStorageAccount(new StorageCredentials("source-account-name", "source-account-key"), true);
var sourceContainer = sourceAccount.CreateCloudBlobClient().GetContainerReference("source-container");
var sourceBlob = sourceContainer.GetBlockBlobReference("blob-name");
var sourceBlobSas = sourceBlob.GetSharedAccessSignature(new Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPolicy()
{
SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
Permissions = Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPermissions.Read
});
var sourceBlobSasUrl = sourceBlob.Uri.AbsoluteUri + sourceBlobSas;
var targetAccount = new CloudStorageAccount(new StorageCredentials("target-account-name", "target-account-key"), true);
var targetContainer = targetAccount.CreateCloudBlobClient().GetContainerReference("target-container");
var targetBlob = targetContainer.GetBlockBlobReference("blob-name");
var copyId = targetBlob.StartCopy(new Uri(sourceBlobSasUrl), null, null);
}
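On the decrypt-in-transit part of the question: the copy operation moves the bytes as-is, so the target stays encrypted. A minimal sketch of decrypting instead, assuming it is acceptable to stream the data through the client; it reuses the container, container2, and options variables from the snippet above and is not a server-side copy:
// Sketch: read through the client-side encryption policy (decrypts locally),
// then upload the plaintext to the target blob without an encryption policy.
CloudBlockBlob encryptedSource = container.GetBlockBlobReference("MyFile.txt");
CloudBlockBlob plainTarget = container2.GetBlockBlobReference("MyFile2.txt");

using (var decryptedStream = encryptedSource.OpenRead(null, options))
{
    plainTarget.UploadFromStream(decryptedStream);
}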
I'm looking to start an Azure runbook from a c# application which will be hosted on an Azure web app.
I'm using certificate authentication (in an attempt just to test that I can connect and retrieve some data)
Here's my code so far:
var cert = ConfigurationManager.AppSettings["mgmtCertificate"];
var creds = new Microsoft.Azure.CertificateCloudCredentials("<my-sub-id>",
new X509Certificate2(Convert.FromBase64String(cert)));
var client = new Microsoft.Azure.Management.Automation.AutomationManagementClient(creds, new Uri("https://management.core.windows.net/"));
var content = client.Runbooks.List("<resource-group-id>", "<automation-account-name>");
Every time I run this, no matter what certificate I use I get the same error:
An unhandled exception of type 'Hyak.Common.CloudException' occurred in Microsoft.Threading.Tasks.dll
Additional information: ForbiddenError: The server failed to authenticate the request. Verify that the certificate is valid and is associated with this subscription.
I've tried downloading the settings file which contains the automatically generated management certificate you get when you spin up the Azure account... nothing I do will let me talk to the Azure subscription at all.
Am I missing something fundamental here?
Edit: some additional info...
So I decided to create an application and use the JWT authentication method.
I've added an application, given the application permissions to the Azure Service Management API and ensured the user is a co-administrator and I still get the same error, even with the token...
const string tenantId = "xx";
const string clientId = "xx";
var context = new AuthenticationContext(string.Format("https://login.windows.net/{0}", tenantId));
var user = "<user>";
var pwd = "<pass>";
var userCred = new UserCredential(user, pwd);
var result = context.AcquireToken("https://management.core.windows.net/", clientId, userCred);
var token = result.CreateAuthorizationHeader().Substring("Bearer ".Length); // Token comes back fine and I can inspect and see that it's valid for 1 hour - all looks ok...
var sub = "<subscription-id>";
var creds = new TokenCloudCredentials(sub, token);
var client = new AutomationManagementClient(creds, new Uri("https://management.core.windows.net/"));
var content = client.Runbooks.List("<resource-group>", "<automation-id>");
I've also tried using other Azure libs (like auth, datacentre etc) and I get the same error:
ForbiddenError: The server failed to authenticate the request. Verify that the certificate is valid and is associated with this subscription.
I'm sure it's just 1 tickbox I need to tick buried somewhere in that monolithic Management Portal but I've followed a few tutorials on how to do this and they all end up with this error...
public async Task StartAzureRunbook()
{
try
{
var subscriptionId = "azure subscription Id";
string base64cer = "****long string here****"; //taken from http://stackoverflow.com/questions/24999518/azure-api-the-server-failed-to-authenticate-the-request
var cert = new X509Certificate2(Convert.FromBase64String(base64cer));
var client = new Microsoft.Azure.Management.Automation.AutomationManagementClient(new CertificateCloudCredentials(subscriptionId, cert));
var ct = new CancellationToken();
var content = await client.Runbooks.ListByNameAsync("MyAutomationAccountName", "MyRunbookName", ct);
var firstOrDefault = content?.Runbooks.FirstOrDefault();
if (firstOrDefault != null)
{
var operation = client.Runbooks.Start("MyAutomationAccountName", new RunbookStartParameters(firstOrDefault.Id));
}
}
catch (Exception ex)
{
Console.WriteLine(ex.ToString());
}
}
Also in portal:
1) Application is multitenant
2) Permissions to other applications section - Windows Azure Service Manager - Delegated permissions "Access Azure Service Management(preview)"
Ensure that your management certificate has a private key and was not made from the .CER file. The fact that you're not supplying a password when generating the X509Certificate2 object makes me think you're using the public key only.
Ensure that your management certificate's public key (.CER file) has been uploaded to the Azure management portal (legacy version, Management Certificates area).
Use CertificateCloudCredentials and not any other credential type.
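A minimal sketch of loading a certificate that includes the private key, assuming a password-protected .pfx export; the path, password, and subscription id are placeholders:
// Sketch: load a .pfx export that contains the private key
// (a .cer file holds only the public key, which is not enough here).
var cert = new X509Certificate2(@"C:\certs\management.pfx", "pfx-password");
var creds = new CertificateCloudCredentials("<subscription-id>", cert);
var client = new AutomationManagementClient(creds, new Uri("https://management.core.windows.net/"));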
OK, stupid really, but one of the tutorials I followed suggested installing the prerelease version of the libs. Installing the preview (0.15.2-preview) fixed the issue!