Azure File Storage Create File 404 Error - c#

I have the code below, which attempts to create a blank file on my Azure File Storage account:
CloudStorageAccount sa = CloudStorageAccount.Parse(connectionString);
var fc = sa.CreateCloudFileClient();
var share = fc.GetShareReference("uploadparking");
share.CreateIfNotExists();
var rootDirectory = share.GetRootDirectoryReference();
var subDirectory = rootDirectory.GetDirectoryReference("valuationrequests");
subDirectory.CreateIfNotExists();
var uri = new Uri(subDirectory.Uri.ToString() + "/file.txt");
var file = new CloudFile(uri);
file.Create(0);
On the final line I am getting the following exception:
'Microsoft.WindowsAzure.Storage.StorageException' occurred in Microsoft.WindowsAzure.Storage.dll
Additional information: The remote server returned an error: (404) Not Found.
I'm not sure what it can't find. It shouldn't be trying to find the file, as it's the one creating it. I have confirmed the directories exist.
If anyone knows how I can go about creating a file successfully, please let me know. I've checked the tutorials, and sadly they only show how to download a file, not upload one.

I believe the documentation is incorrect. The documentation only mentions that the URI should be absolute. It fails to mention that if you're using an absolute URI, you should also pass storage credentials, or the URI should include a Shared Access Signature with at least Create permission, in order to create a file.
You should try using the following overload of the CloudFile constructor, which takes the URI plus storage credentials: https://learn.microsoft.com/en-us/dotnet/api/microsoft.windowsazure.storage.file.cloudfile.-ctor?view=azurestorage-8.1.3#Microsoft_WindowsAzure_Storage_File_CloudFile__ctor_System_Uri_Microsoft_WindowsAzure_Storage_Auth_StorageCredentials_.
So your code would be:
CloudStorageAccount sa = CloudStorageAccount.Parse(connectionString);
var fc = sa.CreateCloudFileClient();
var share = fc.GetShareReference("uploadparking");
share.CreateIfNotExists();
var rootDirectory = share.GetRootDirectoryReference();
var subDirectory = rootDirectory.GetDirectoryReference("valuationrequests");
subDirectory.CreateIfNotExists();
var uri = new Uri(subDirectory.Uri.ToString() + "/file.txt");
var file = new CloudFile(uri, sa.Credentials);
file.Create(0);
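Without credentials on the CloudFile instance, the request goes out anonymously; since the File service does not support anonymous access, the service reports the file as not found rather than returning an authorization error, which is why the 404 is so misleading.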
The other alternative would be to create a Shared Access Signature (SAS) token on the share and use a SAS URL when creating an instance of CloudFile. In this case your code would be:
CloudStorageAccount sa = CloudStorageAccount.Parse(connectionString);
var fc = sa.CreateCloudFileClient();
var share = fc.GetShareReference("uploadparking");
share.CreateIfNotExists();
var rootDirectory = share.GetRootDirectoryReference();
var subDirectory = rootDirectory.GetDirectoryReference("valuationrequests");
subDirectory.CreateIfNotExists();
SharedAccessFilePolicy policy = new SharedAccessFilePolicy()
{
    Permissions = SharedAccessFilePermissions.Create,
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15)
};
var sasToken = share.GetSharedAccessSignature(policy);
var uri = new Uri(subDirectory.Uri.ToString() + "/file1.txt" + sasToken);
var file = new CloudFile(uri);
file.Create(0);
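Note that this SAS carries only the Create permission, which is enough for file.Create; if you intend to write content to the file afterwards, you would also want to include SharedAccessFilePermissions.Write in the policy.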

Related

How to copy a file to a SharePoint site

I have to copy a file to a SharePoint site.
I have found that the only authentication that works is with the AuthenticationManager.
So this works:
var authManager = new AuthenticationManager();
var ctx = authManager.GetWebLoginClientContext(strHexagon);
Web web = ctx.Web;
User user = web.CurrentUser;
ctx.Load(web);
ctx.Load(user);
ctx.ExecuteQuery();
lbxInfo.Items.Add(web.Title);
lbxInfo.Items.Add(user.LoginName);
Now, after having authenticated, I need to copy a file to the SharePoint site.
I have seen that there is ctx.Web.SaveFileToLocal, but what if I have to copy from local to SharePoint?
Thanks
You can use the OfficeDevPnP.Core library:
string str1_Url = "...";                           // SharePoint site URL
string str2_FileSource_Full = @"C:\temp\A.txt";
string str3_FileDestination_NameExt = "B.txt";
string str4_TopDestination_Folder = "...";         // SharePoint site title folder
string str5_TopDestination_SubFolder = "...";      // folder, e.g. Production
string str6_TopDestination_AllSubFolders = "...";  // subfolder, e.g. Test
// AuthenticationManager -> ByPasses Multi-Factor Authentication
var authManager = new AuthenticationManager();
var ctx = authManager.GetWebLoginClientContext(str1_Url);
// Web & User definitions
Web web = ctx.Web;
User user = web.CurrentUser;
FileCreationInformation newFile = new FileCreationInformation();
newFile.Content = System.IO.File.ReadAllBytes(str2_FileSource_Full);
// Rename the destination file
newFile.Url = str3_FileDestination_NameExt;
Microsoft.SharePoint.Client.List docs = web.Lists.GetByTitle(str4_TopDestination_Folder);
// Selects a Folder inside the root one
Microsoft.SharePoint.Client.Folder folder = docs.RootFolder.Folders.GetByUrl(str5_TopDestination_SubFolder);
folder.Folders.Add(str6_TopDestination_AllSubFolders);
var targetFolder = folder.Folders.GetByUrl(str6_TopDestination_AllSubFolders);
// Uploads a file to the targetFolder
newFile.Overwrite = true;
Microsoft.SharePoint.Client.File uploadFile = targetFolder.Files.Add(newFile);
// Executes query
ctx.Load(docs);
ctx.Load(uploadFile);
ctx.Load(web);
ctx.Load(user);
ctx.ExecuteQuery();
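Note that CSOM batches operations: nothing is actually sent to SharePoint until ctx.ExecuteQuery() runs, so the folder creation and the file upload above are executed together in a single round trip.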

Getting an error when copying a Dropbox URL to Azure Blob Storage

I am using Azure Blob Storage to copy a Dropbox file. But when I try to copy the file via its URL, I get a 500 error and TotalBytes is -1.
I am using the StartCopy method of the WindowsAzure.Storage package, but copyStatus.TotalBytes comes back as -1 and the copy does not work.
I have tried all of the following URL variants:
https://dl.dropboxusercontent.com/s/1v9re1dozilpdgi/1_32min.mp4?dl=0
https://dl.dropboxusercontent.com/s/1v9re1dozilpdgi/1_32min.mp4?dl=1
https://www.dropbox.com/s/1v9re1dozilpdgi/1_32min.mp4?dl=0
Can you please help me solve this issue? Does anything need to change in the URL, or is there another way to copy Dropbox media to Azure Blob Storage?
I am using .NET Framework 4.8 with C#.
Sample code:
string url = "https://dl.dropboxusercontent.com/s/1v9re1dozilpdgi/1_32min.mp4?dl=0";
Uri fileUri = new Uri(url);
string filename = "test-file.mp4";
var account = CloudStorageAccount.Parse(connectionstring);
var blobClient = account.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("test-container");
var blob = container.GetBlockBlobReference(filename);
blob.DeleteIfExists();
blob.StartCopy(fileUri);
var refBlob = (CloudBlockBlob)container.GetBlobReferenceFromServer(filename);
var fileLength = refBlob.CopyState.TotalBytes ?? 0;
while (refBlob.CopyState.Status == CopyStatus.Pending)
{
refBlob = (CloudBlockBlob)container.GetBlobReferenceFromServer(filename);
var copyStatus = refBlob.CopyState;
}
Error message: 500 InternalServerError "Copy failed."
You need to construct the CloudBlockBlob directly instead of using GetBlockBlobReference, because GetBlockBlobReference takes a blob name, not a URL, in its constructor.
For more information please refer to the SO thread suggested by @Tobias Tengler, and this blog: Azure – Upload and Download data using C#.NET.
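As an illustration, here is a minimal sketch of that approach, reusing the question's connectionstring and container. The polling interval is an assumption, and whether Dropbox serves the raw bytes at a given link (rather than an HTML preview page or a redirect) is a separate issue from the C# side shown here; the source must be publicly downloadable for a server-side copy to succeed.
string sourceUrl = "https://dl.dropboxusercontent.com/s/1v9re1dozilpdgi/1_32min.mp4?dl=1";
var account = CloudStorageAccount.Parse(connectionstring);
var container = account.CreateCloudBlobClient().GetContainerReference("test-container");
// Build the destination blob from its own URI plus credentials,
// rather than handing the source URL to GetBlockBlobReference.
var destBlob = new CloudBlockBlob(new Uri(container.Uri + "/test-file.mp4"), account.Credentials);
destBlob.StartCopy(new Uri(sourceUrl));
destBlob.FetchAttributes(); // refreshes CopyState
while (destBlob.CopyState.Status == CopyStatus.Pending)
{
    Thread.Sleep(1000); // avoid hammering the service while polling
    destBlob.FetchAttributes();
}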

C# S3 bucket: TransferUtilityDownloadRequest works but TransferUtilityDownloadDirectoryRequest does not

I am using TransferUtilityDownload and TransferUtilityDownloadDirectory to download a single file and a full directory. However, even though I am using the same bucket name format, it works for the single file but returns 403 Access Denied for the directory (same problem with listing objects):
string bucketName = "my-bucket-us-east-1-prod";
string UnscheduledIn = "abc/butter/input_butter_11nov2019/unscheduled";
AmazonS3Client client = new AmazonS3Client(RegionEndpoint.USEast1);
// request for object download
var request = new TransferUtilityDownloadRequest();
// request for directory download
var drequest = new TransferUtilityDownloadDirectoryRequest();
//This request for single file download
request.BucketName = bucketName + "/" + UnscheduledIn;
request.FilePath = "D:\\input\\" + "test.csv";
request.Key = "test.csv";
//This request for directory download
drequest.BucketName = bucketName + "/" + UnscheduledIn;
drequest.S3Directory = "unscheduled";
drequest.LocalDirectory = "D:\\input\\";
drequest.DownloadFilesConcurrently = true;
TransferUtility fileTransferUtility = new TransferUtility(new AmazonS3Client(RegionEndpoint.USEast1));
// This one works
fileTransferUtility.Download(request);
// This one does not work
fileTransferUtility.DownloadDirectory(drequest);
A 403 Access Denied error is usually caused by a wrong bucket or directory name (when the request cannot find the bucket or directory; this is a known issue). However, the bucket name and directory name are correct. I wonder if I am missing something in the formatting or in some property setup?
A quick note: this version also returns the same 403 error:
//This request for directory download
drequest.BucketName = bucketName;
drequest.S3Directory = UnscheduledIn;
drequest.LocalDirectory = "D:\\input\\";
drequest.DownloadFilesConcurrently = true;
It looks like there is some issue with the bucket name and the S3 directory path. Update your code with this piece of code:
//This request for directory download
drequest.BucketName = bucketName;
drequest.S3Directory = '/' + UnscheduledIn;
drequest.LocalDirectory = "D:\\input";
drequest.DownloadFilesConcurrently = true;
Update:
In general, a 403 Forbidden comes from the server when authentication fails or a permission is missing. Note that DownloadDirectory first has to list the objects under the prefix, so the credentials also need list permission on the bucket itself, not just GetObject on the keys, which is why a single-file download can succeed while the directory download fails.
Please check that your bucket policy allows the download:
{
    "Sid": "AllowAllS3ActionsInUserFolder",
    "Effect": "Allow",
    "Action": ["s3:*"],
    "Resource": [
        "arn:aws:s3:::my-bucket-us-east-1-prod/abc/butter/input_butter_11nov2019/*",
        "arn:aws:s3:::my-bucket-us-east-1-prod/abc/butter/input_butter_11nov2019/unscheduled/*"
    ]
}
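As a quick way to check the list permission separately from the download, here is a minimal sketch, assuming the same bucketName and UnscheduledIn values as in the question:
// Listing keeps the bucket name and key prefix separate, just as DownloadDirectory does internally
var s3 = new AmazonS3Client(RegionEndpoint.USEast1);
var listRequest = new ListObjectsV2Request
{
    BucketName = bucketName,      // bucket only, no prefix appended
    Prefix = UnscheduledIn + "/"  // the "directory" is just a key prefix
};
ListObjectsV2Response listResponse = s3.ListObjectsV2(listRequest);
foreach (S3Object obj in listResponse.S3Objects)
    Console.WriteLine(obj.Key);   // if this throws 403, the policy lacks list permission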

How to download a file from Azure Blob Storage

When I try to download an image file from Azure Blob Storage, it throws the exception "The given path's format is not supported". My code block is below:
StorageCredentials creds = new StorageCredentials(accountName, accountKey);
CloudStorageAccount account = new CloudStorageAccount(creds, useHttps: true);
CloudBlobClient client = account.CreateCloudBlobClient();
container = client.GetContainerReference(blobName);
CloudBlockBlob blockBlob = container.GetBlockBlobReference(MyPath);
await blockBlob.DownloadToFileAsync(Path, FileMode.OpenOrCreate);
using (var fileStream = System.IO.File.Create(Path))
{
await blockBlob.DownloadToStreamAsync(fileStream);
}
If I provide a local path such as "C:\Users\Joy\Downloads", as below:
var localPath = @"C:\Users\Joy\Downloads\user.jpg";
await blockBlob.DownloadToFileAsync(localPath, FileMode.OpenOrCreate);
using (var fileStream = System.IO.File.Create(localPath))
{
await blockBlob.DownloadToStreamAsync(fileStream);
}
It can be copied into the corresponding location, but I couldn't download the file to my custom location.
According to your description, I enabled public read access to my blobs to check this issue. I created a console application; you could refer to the following code snippet for downloading the file while maintaining the virtual directory structure in your local file system:
CloudBlockBlob blockBlob = new CloudBlockBlob(new Uri("https://brucchstorage.blob.core.windows.net/images/2017/11/28/lake.jpeg"));
var localPath = Path.Combine(/*your custom root folder for storing file(s)*/AppDomain.CurrentDomain.BaseDirectory, $"downloads\\{blockBlob.Name}"); // blockBlob.Name = 2017/11/28/lake.jpeg
var rootDir = new FileInfo(localPath).Directory;
if (!rootDir.Exists) // make sure the parent directory exists
    rootDir.Create();
await blockBlob.DownloadToFileAsync(localPath, FileMode.Create);
// OR
using (var fs = new FileStream(localPath, FileMode.Create))
{
    await blockBlob.DownloadToStreamAsync(fs);
}
Moreover, you could construct the CloudBlockBlob instance with a URI that contains a SAS token, as follows:
https://brucchstorage.blob.core.windows.net/images/2017/11/28/lake.jpeg?st=2017-11-28T06%3A28%3A00Z&se=2017-11-29T06%3A28%3A00Z&sp=r&sv=2015-12-11&sr=b&sig=15NAaRB43C%2BniIZZe8gAvFl7LY%2BS6K7DNyjLflpvgBg%3D
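For reference, a minimal sketch of generating such a SAS URL yourself, assuming you hold the account key (the creds from the question) and the container and blob names used above:
var container = new CloudStorageAccount(creds, useHttps: true)
    .CreateCloudBlobClient()
    .GetContainerReference("images");
var blob = container.GetBlockBlobReference("2017/11/28/lake.jpeg");
var sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,             // read-only
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(24) // short-lived expiry
});
var blockBlob = new CloudBlockBlob(new Uri(blob.Uri + sasToken)); // no separate credentials needed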
For more details, you could follow the Azure Storage documentation on shared access signatures.

How to export SQL Database directly to blob storage programmatically

I need to programmatically back up/export a SQL Database (either in Azure, or a compatible one on-prem) to Azure Storage, and restore it to another SQL Database. I would like to use only NuGet packages for code dependencies, since I cannot guarantee that either the build or production servers will have the Azure SDK installed. I cannot find any code examples for something that I assume would be a common action. The closest I found was this:
https://blog.hompus.nl/2013/03/13/backup-your-azure-sql-database-to-blob-storage-using-code/
But, this code exports to a local bacpac file (requiring RoleEnvironment, an SDK-only object). I would think there should be a way to directly export to Blob Storage, without the intermediary file. One thought was to create a Stream, and then run:
services.ExportBacpac(stream, "dbnameToBackup")
And then write the stream to storage; however, a MemoryStream wouldn't work, since this could be a massive database (100-200 GB).
What would be a better way to do this?
Based on my test, the Microsoft Azure SQL Management Library 0.51.0-prerelease supports exporting the SQL database's .bacpac file directly to Azure Storage.
We can use sqlManagementClient.ImportExport.Export(resourceGroup, azureSqlServer, azureSqlDatabase, exportRequestParameters) to export the .bacpac file to Azure Storage.
However, ImportExport is not present in the latest version of the Microsoft Azure SQL Management Library SDK, so we can only use the 0.51.0-prerelease SDK.
For more details about how to use the Microsoft Azure SQL Management Library to export the SQL backup to Azure Blob Storage, refer to the steps and code below.
Prerequisites:
Register an app in Azure AD and create a service principal for it. For detailed steps on how to register the app and get an access token, please refer to the documentation.
Detailed code:
Notice: Replace clientId, tenantId, secretKey, and subscriptionId with your registered Azure AD information. Replace azureSqlDatabase, resourceGroup, azureSqlServer, adminLogin, adminPassword, storageKey, and storageAccount with your own SQL database and storage values.
static void Main(string[] args)
{
    var subscriptionId = "xxxxxxxx";
    var clientId = "xxxxxxxxx";
    var tenantId = "xxxxxxxx";
    var secretKey = "xxxxx";
    var azureSqlDatabase = "data base name";
    var resourceGroup = "Resource Group name";
    var azureSqlServer = "xxxxxxx"; // testsqlserver
    var adminLogin = "user";
    var adminPassword = "password";
    var storageKey = "storage key";
    var storageAccount = "storage account";
    var baseStorageUri = $"https://{storageAccount}.blob.core.windows.net/brandotest/"; // with container name, ending with "/"
    var backName = azureSqlDatabase + "-" + $"{DateTime.UtcNow:yyyyMMddHHmm}" + ".bacpac"; // backup .bacpac file name
    var backupUrl = baseStorageUri + backName;
    ImportExportOperationStatusResponse exportStatus = new ImportExportOperationStatusResponse();
    try
    {
        ExportRequestParameters exportRequestParameters = new ExportRequestParameters
        {
            AdministratorLogin = adminLogin,
            AdministratorLoginPassword = adminPassword,
            StorageKey = storageKey,
            StorageKeyType = "StorageAccessKey",
            StorageUri = new Uri(backupUrl)
        };
        SqlManagementClient sqlManagementClient = new SqlManagementClient(new Microsoft.Azure.TokenCloudCredentials(subscriptionId, GetAccessToken(tenantId, clientId, secretKey)));
        var export = sqlManagementClient.ImportExport.Export(resourceGroup, azureSqlServer, azureSqlDatabase, exportRequestParameters); // start the export operation
        while (exportStatus.Status != Microsoft.Azure.OperationStatus.Succeeded) // poll until the operation succeeds
        {
            Thread.Sleep(1000 * 60); // wait one minute between polls
            exportStatus = sqlManagementClient.ImportExport.GetImportExportOperationStatus(export.OperationStatusLink);
        }
        Console.WriteLine($"Export DataBase {azureSqlDatabase} to Storage {storageAccount} Successfully");
    }
    catch (Exception exception)
    {
        // todo: handle/log the failure
    }
}

private static string GetAccessToken(string tenantId, string clientId, string secretKey)
{
    var authenticationContext = new AuthenticationContext($"https://login.windows.net/{tenantId}");
    var credential = new ClientCredential(clientId, secretKey);
    var result = authenticationContext.AcquireTokenAsync("https://management.core.windows.net/", credential);
    if (result == null)
    {
        throw new InvalidOperationException("Failed to obtain the JWT token");
    }
    var token = result.Result.AccessToken;
    return token;
}
The result looks like this:
1. Send a request to tell the SQL server to start exporting to Azure Blob Storage.
2. Keep sending requests to monitor the status of the export operation.
3. The export operation finishes.
Here's an idea:
Pass the stream to the .ExportBacpac method but hold a reference to it on a different thread, where you regularly empty and reset the stream so that there's no memory overflow. I'm assuming here that Dac does not have any means to access the stream while it is being filled.
The thing you have to take care of yourself, though, is thread safety: MemoryStreams are not thread-safe by default, so you'd have to write your own locking mechanisms around .Position and .CopyTo. I've not tested this, but if you handle locking correctly I'd assume the .ExportBacpac method won't throw any errors while the other thread accesses the stream.
Here's a very simple example as pseudo-code just outlining my idea:
ThreadSafeStream stream = new ThreadSafeStream();
Task task = new Task(async (exitToken) =>
{
    MemoryStream partialStream = new MemoryStream();
    // Check if backup completed
    if (...)
    {
        exitToken.Trigger();
    }
    stream.CopyToThreadSafe(partialStream);
    stream.PositionThreadSafe = 0;
    AzureService.UploadToStorage(partialStream);
    await Task.Delay(500); // Play around with this - it shouldn't take too long to copy the stream
});
services.ExportBacpac(stream, "dbnameToBackup");
await TimerService.RunTaskPeriodicallyAsync(task, 500);
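To make the locking idea concrete, here is a minimal sketch of what such a wrapper could look like. ThreadSafeStream is a hypothetical helper, not a library type; for full safety the exporter's writes are routed through the same lock by overriding Write:
// A MemoryStream whose position, copy, and write operations share one lock.
// Hypothetical sketch, untested against DacServices.
public class ThreadSafeStream : MemoryStream
{
    private readonly object _gate = new object();

    public long PositionThreadSafe
    {
        get { lock (_gate) { return Position; } }
        set { lock (_gate) { Position = value; } }
    }

    public void CopyToThreadSafe(Stream destination)
    {
        lock (_gate)
        {
            long saved = Position;
            Position = 0;
            CopyTo(destination); // drain the buffered bytes
            Position = saved;
        }
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        lock (_gate) { base.Write(buffer, offset, count); } // serialize the exporter's writes too
    }
}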
It's similar to Brando's answer, but this one uses a stable package from NuGet:
using Microsoft.WindowsAzure.Management.Sql;
Using the same variables as in Brando's answer, the code will be like this:
var azureSqlServer = "xxxxxxx" + ".database.windows.net";
var azureSqlServerName = "xxxxxxx";
SqlManagementClient managementClient = new SqlManagementClient(new TokenCloudCredentials(subscriptionId, GetAccessToken(tenantId, clientId, secretKey)));
var exportParams = new DacExportParameters()
{
    BlobCredentials = new DacExportParameters.BlobCredentialsParameter()
    {
        StorageAccessKey = storageKey,
        Uri = new Uri(baseStorageUri)
    },
    ConnectionInfo = new DacExportParameters.ConnectionInfoParameter()
    {
        ServerName = azureSqlServer,
        DatabaseName = azureSqlDatabase,
        UserName = adminLogin,
        Password = adminPassword
    }
};
var exportResult = managementClient.Dac.Export(azureSqlServerName, exportParams);
You can use Microsoft.Azure.Management.Fluent to export your database to a .bacpac file and store it in a blob. To do this, there are a few things you need to do.
1. Create an Azure Active Directory (AAD) application and a service principal that can access resources. Follow this link for a comprehensive guide.
2. From the first step, you are going to need the "Application (client) ID", "Client Secret", and "Tenant ID".
3. Install the Microsoft.Azure.Management.Fluent NuGet package, and import the Microsoft.Azure.Management.Fluent, Microsoft.Azure.Management.ResourceManager.Fluent, and Microsoft.Azure.Management.ResourceManager.Fluent.Authentication namespaces.
4. Replace the placeholders in the code snippet below with proper values for your use case.
5. Enjoy!
var principalClientID = "<Applicaiton (Client) ID>";
var principalClientSecret = "<ClientSecret>";
var principalTenantID = "<TenantID>";
var sqlServerName = "<SQL Server Name (without '.database.windows.net')>";
var sqlServerResourceGroupName = "<SQL Server Resource Group>";
var databaseName = "<Database Name>";
var databaseLogin = "<Database Login>";
var databasePassword = "<Database Password>";
var storageResourceGroupName = "<Storage Resource Group>";
var storageName = "<Storage Account>";
var storageBlobName = "<Storage Blob Name>";
var bacpacFileName = "myBackup.bacpac";
var credentials = new AzureCredentialsFactory().FromServicePrincipal(principalClientID, principalClientSecret, principalTenantID, AzureEnvironment.AzureGlobalCloud);
var azure = await Azure.Authenticate(credentials).WithDefaultSubscriptionAsync();
var storageAccount = await azure.StorageAccounts.GetByResourceGroupAsync(storageResourceGroupName, storageName);
var sqlServer = await azure.SqlServers.GetByResourceGroupAsync(sqlServerResourceGroupName, sqlServerName);
var database = await sqlServer.Databases.GetAsync(databaseName);
await database.ExportTo(storageAccount, storageBlobName, bacpacFileName)
.WithSqlAdministratorLoginAndPassword(databaseLogin, databasePassword)
.ExecuteAsync();
