Azure Blob timeouts after encryption was applied - c#

I started encrypting my Azure blobs, and now I occasionally get "500 - The request timed out" on blob operations (both downloads and uploads) after the request hangs for about 30 seconds. After one of these timeouts, no other blob operation works through the app unless I restart my Azure website; once restarted, everything runs as expected for a while.
Example: if I access an encrypted image through my application (I'm using Web API to pull it and display it to the user) it shows up fine, but if I try to access the same file hours later, the request hangs and eventually times out. After that, I get the same issue when accessing any other file through my web app. However, if I access the blob's direct URL, I can still reach the file (even though it's encrypted and therefore useless).
I cannot say with certainty what causes this or exactly when it starts, as I am not the only person accessing the app, so the problem may well have begun before my failed request. I never had issues like this before encryption was applied, nor did I have any while testing encryption locally.
Any idea why this is happening, or how I can prevent it? I am attaching my code below in case it helps:
public async Task<Tuple<string, string>> UploadToStorage(CloudBlobContainer container, Stream stream, string reference, string contentType, byte[] byteArray = null)
{
    var blockBlob = container.GetBlockBlobReference(reference);
    blockBlob.Properties.ContentType = contentType;
    var cloudResolver = new KeyVaultKeyResolver(GetToken);
    var rsa = await cloudResolver.ResolveKeyAsync(new BlobConfig().BlobKeyVault, CancellationToken.None);
    var policy = new BlobEncryptionPolicy(rsa, null);
    var options = new BlobRequestOptions { EncryptionPolicy = policy };
    if (byteArray != null) blockBlob.UploadFromByteArray(byteArray, 0, byteArray.Length, null, options);
    else blockBlob.UploadFromStream(stream, stream.Length, null, options);
    return Tuple.Create(new Config().BaseUrl + "/api/blobs/" + container.Name + "/" + reference, blockBlob.Properties.ContentType);
}
public BlobDto DownloadBlob(string container, string filename)
{
    var account = CloudStorageAccount.Parse(new BlobConfig().StorageConnectionString);
    var blobClient = account.CreateCloudBlobClient();
    var blobContainer = blobClient.GetContainerReference(container);
    var blob = blobContainer.GetBlockBlobReference(filename);
    var cloudResolver = new KeyVaultKeyResolver(GetToken);
    var policy = new BlobEncryptionPolicy(null, cloudResolver);
    var options = new BlobRequestOptions { EncryptionPolicy = policy };
    var m = new MemoryStream();
    blob.DownloadToStream(m, null, options);
    return new BlobDto { Blob = m.ToArray(), BlobContentType = blob.Properties.ContentType };
}
private static async Task<string> GetToken(string authority, string resource, string scope)
{
    var config = new BlobConfig();
    var clientCredential = new ClientCredential(config.BlobClientId, config.BlobClientSecret);
    var authContext = new AuthenticationContext(authority);
    var result = await authContext.AcquireTokenAsync(resource, clientCredential);
    if (result == null) throw new InvalidOperationException("Failed to obtain the access token");
    return result.AccessToken;
}

I replaced all the synchronous blob-related calls with asynchronous ones, in particular the download call. I also changed from a stream to a byte array, but I don't think that's relevant:
await blob.DownloadToByteArrayAsync(byteArray, 0, null, options, null);
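For completeness, the reworked download method ended up looking roughly like this (a sketch assuming the same BlobConfig, BlobDto and GetToken members shown above; the exact final code may differ):
public async Task<BlobDto> DownloadBlobAsync(string container, string filename)
{
    var account = CloudStorageAccount.Parse(new BlobConfig().StorageConnectionString);
    var blobClient = account.CreateCloudBlobClient();
    var blobContainer = blobClient.GetContainerReference(container);
    var blob = blobContainer.GetBlockBlobReference(filename);

    var cloudResolver = new KeyVaultKeyResolver(GetToken);
    var policy = new BlobEncryptionPolicy(null, cloudResolver);
    var options = new BlobRequestOptions { EncryptionPolicy = policy };

    using (var m = new MemoryStream())
    {
        // Awaiting the storage call keeps ASP.NET thread-pool threads free instead of
        // blocking them, which is what appeared to cause the hanging requests.
        await blob.DownloadToStreamAsync(m, null, options, null);
        return new BlobDto { Blob = m.ToArray(), BlobContentType = blob.Properties.ContentType };
    }
}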

Azure Function and SharePoint webhook: not getting changes from SharePoint list

I'm using a couple of Azure Functions with a SharePoint webhook.
The first function saves the messages from the SharePoint webhook to an Azure Storage queue. This is the function:
[FunctionName("QueueFunction")]
public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = null)]HttpRequestMessage req, TraceWriter log)
{
log.Info($"Webhook was triggered!");
// Grab the validationToken URL parameter
string validationToken = req.GetQueryNameValuePairs()
.FirstOrDefault(q => string.Compare(q.Key, "validationtoken", true) == 0)
.Value;
// If a validation token is present, we need to respond within 5 seconds by
// returning the given validation token. This only happens when a new
// web hook is being added
if (validationToken != null)
{
log.Info($"Validation token {validationToken} received");
var response = req.CreateResponse(HttpStatusCode.OK);
response.Content = new StringContent(validationToken);
return response;
}
log.Info($"SharePoint triggered our webhook...great :-)");
var content = await req.Content.ReadAsStringAsync();
log.Info($"Received following payload: {content}");
var notifications = JsonConvert.DeserializeObject<ResponseModel<NotificationModel>>(content).Value;
log.Info($"Found {notifications.Count} notifications");
if (notifications.Count > 0)
{
// get the cloud storage account
string queueName = "MYQUEUE";
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(Environment.GetEnvironmentVariable("AzureWebJobsStorage"));
CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();
CloudQueue queue = queueClient.GetQueueReference(queueName);
await queue.CreateIfNotExistsAsync();
// store each notification as a queue item
foreach (var notification in notifications)
{
string message = JsonConvert.SerializeObject(notification);
log.Info($"Adding to {queueName}: {message}");
await queue.AddMessageAsync(new CloudQueueMessage(message));
log.Info($"added.");
}
// if we get here we assume the request was well received
return new HttpResponseMessage(HttpStatusCode.OK);
}
// assumed fallback (not shown in the original snippet): no notifications in the payload
return new HttpResponseMessage(HttpStatusCode.BadRequest);
}
The message is correctly added to the queue.
Then I have another function, triggered by the queue. This is its code:
[FunctionName("OCRFunction")]
public static void Run([QueueTrigger("MYQUEUE", Connection = "QueueConn")]string myQueueItem, TraceWriter log)
{
log.Info($"C# Queue trigger function processed: {myQueueItem}");
string siteUrl = "https://MYSHAREPOINT.sharepoint.com/sites/MYSITE";
log.Info($"Processing notifications...");
string json = myQueueItem;
var data = (JObject)JsonConvert.DeserializeObject(json);
string notificationResource = data["resource"].Value<string>();
ClientContext SPClientContext = LoginSharePoint(siteUrl);
log.Info($"Logged in SharePoint");
GetChanges(SPClientContext, notificationResource, log);
}
public static ClientContext LoginSharePoint(string BaseUrl)
{
// Login using UserOnly Credentials (User Name and User PW)
ClientContext cntReturn;
string myUserName = config["spUN"];
string myPassword = config["spPWD"];
SecureString securePassword = new SecureString();
foreach (char oneChar in myPassword) securePassword.AppendChar(oneChar);
SharePointOnlineCredentials myCredentials = new SharePointOnlineCredentials(myUserName, securePassword);
cntReturn = new ClientContext(BaseUrl);
cntReturn.Credentials = myCredentials;
return cntReturn;
}
static void GetChanges(ClientContext SPClientContext, string ListId, TraceWriter log)
{
Web spWeb = SPClientContext.Web;
List myList = spWeb.Lists.GetByTitle("MY LIST");
SPClientContext.Load(myList);
SPClientContext.ExecuteQuery();
ChangeQuery myChangeQuery = GetChangeQueryNew(ListId);
var allChanges = myList.GetChanges(myChangeQuery);
SPClientContext.Load(allChanges);
SPClientContext.ExecuteQuery();
log.Info($"---- Changes found : " + allChanges.Count());
foreach (Change oneChange in allChanges)
{
if (oneChange is ChangeItem)
{
int myItemId = (oneChange as ChangeItem).ItemId;
log.Info($"---- Changed ItemId : " + myItemId);
ListItem myItem = myList.GetItemById(myItemId);
Microsoft.SharePoint.Client.File myFile = myItem.File;
ClientResult<System.IO.Stream> myFileStream = myFile.OpenBinaryStream();
SPClientContext.Load(myFile);
SPClientContext.ExecuteQuery();
byte[] myFileBytes = ConvertStreamToByteArray(myFileStream);
[...] SOME CODE HERE [...]
myItem["OCRText"] = myText;
myItem.Update();
SPClientContext.ExecuteQuery();
log.Info($"---- Text Analyze OCR added to SharePoint Item");
}
}
}
public static ChangeQuery GetChangeQueryNew(string ListId)
{
ChangeToken lastChangeToken = new ChangeToken();
lastChangeToken.StringValue = string.Format("1;3;{0};{1};-1", ListId, DateTime.Now.AddMinutes(-1).ToUniversalTime().Ticks.ToString());
ChangeToken newChangeToken = new ChangeToken();
newChangeToken.StringValue = string.Format("1;3;{0};{1};-1", ListId, DateTime.Now.ToUniversalTime().Ticks.ToString());
ChangeQuery myChangeQuery = new ChangeQuery(false, false);
myChangeQuery.Item = true; // Get only Item changes
myChangeQuery.Add = true; // Get only the new Items
myChangeQuery.ChangeTokenStart = lastChangeToken;
myChangeQuery.ChangeTokenEnd = newChangeToken;
return myChangeQuery;
}
public static Byte[] ConvertStreamToByteArray(ClientResult<System.IO.Stream> myFileStream)
{
Byte[] bytReturn = null;
using (System.IO.MemoryStream myFileMemoryStream = new System.IO.MemoryStream())
{
if (myFileStream != null)
{
myFileStream.Value.CopyTo(myFileMemoryStream);
bytReturn = myFileMemoryStream.ToArray();
}
}
return bytReturn;
}
public static async Task<TextAnalyzeOCRResult> GetAzureTextAnalyzeOCR(byte[] myFileBytes)
{
TextAnalyzeOCRResult resultReturn = new TextAnalyzeOCRResult();
HttpClient client = new HttpClient();
client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "XXXXXXXXXXXXXXXXXXXX");
string requestParameters = "language=unk&detectOrientation=true";
/* OCR API */
string uri = "https://MYOCRSERVICE.cognitiveservices.azure.com/vision/v3.0/ocr" + "?" + requestParameters;
string contentString = string.Empty;
HttpResponseMessage response;
using (ByteArrayContent content = new ByteArrayContent(myFileBytes))
{
content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
response = await client.PostAsync(uri, content);
contentString = await response.Content.ReadAsStringAsync();
resultReturn = JsonConvert.DeserializeObject<TextAnalyzeOCRResult>(contentString);
return resultReturn;
}
}
Before the current two-function approach, I used a single function that handled the notifications and ran some code to update a field in my SharePoint list. That approach had problems when I received many notifications from SharePoint, so I decided to use a queue, as suggested in the Microsoft documentation. The single-function solution worked fine when one notification was received, and my SharePoint list items were updated without problems.
To avoid problems with multiple notifications, I split the work into two functions: one registering notifications in a queue, and the other performing the operations and updating the SharePoint field.
The first function, QueueFunction, works fine; the second one triggers correctly but does not get any changes from the SharePoint list, even if I add just one item.
I have checked the GetChanges code to find out why it always returns no changes, but it is the same code I used when I had only one function, so I can't understand why the behaviour changed.
What's wrong with my approach? Is there something I can do to correct the second function?
Summarizing the solution from the comments for future reference:
Use a function to save the message in a queue and then have an Azure WebJob do the processing; the problem was that the function's running time could exceed 5 minutes.
Note that the default timeout of an Azure Function on the Consumption plan is 5 minutes; the default timeouts for the different plans are listed on this page.
If you need a longer timeout, you can set the functionTimeout property in the function app's host.json (it cannot exceed the maximum timeout for the plan), or move the function app to a higher plan such as the Premium plan or an App Service plan.
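For example, a minimal host.json along these lines raises the limit (the 10-minute value is only an illustration; check the limits for your plan and runtime version):
{
  "functionTimeout": "00:10:00"
}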

Azure - Programmatically Create Storage Account

I have tried the following code to create a new storage account in Azure:
Getting the token (success - I received a token):
var cc = new ClientCredential("clientId", "clientSecret");
var context = new AuthenticationContext("https://login.windows.net/subscription");
var result = context.AcquireTokenAsync("https://management.azure.com/", cc);
Create cloud storage credentials:
var credential = new TokenCloudCredentials("subscription", token);
Create the cloud storage account (fails):
using (var storageClient = new StorageManagementClient(credentials))
{
await storageClient.StorageAccounts.CreateAsync(new StorageAccountCreateParameters
{
Label = "samplestorageaccount",
Location = LocationNames.NorthEurope,
Name = "myteststorage",
AccountType = "RA-GRS"
});
}
Error:
ForbiddenError: The server failed to authenticate the request. Verify
that the certificate is valid and is associated with this
subscription.
I am not sure if this is one of those misleading messages or if I misconfigured something in Azure?
As far as I know, Azure currently provides two storage management libraries:
Microsoft.Azure.Management.Storage
Microsoft.WindowsAzure.Management.Storage
Microsoft.Azure.Management.Storage is used to create ARM (Azure Resource Manager) storage accounts.
Microsoft.WindowsAzure.Management.Storage is used to create classic (Azure Service Management) storage accounts.
I guess you want to create ARM storage but used the "Microsoft.WindowsAzure.Management.Storage" library. Since "Microsoft.WindowsAzure.Management.Storage" uses a certificate to authenticate requests, you get this error. If you want to know how to use "Microsoft.WindowsAzure.Management.Storage" to create classic storage, refer to this article.
Assuming you want to create ARM storage, install the "Microsoft.Azure.Management.Storage" NuGet package.
For more details, refer to the following code.
static void Main(string[] args)
{
var subscriptionId = "your subscriptionId";
var clientId = "your client id";
var tenantId = "your tenantid";
var secretKey = "secretKey";
StorageManagementClient StorageManagement = new StorageManagementClient(new Microsoft.Azure.TokenCloudCredentials(subscriptionId, GetAccessToken(tenantId, clientId, secretKey)));
var re = StorageManagement.StorageAccounts.CreateAsync("groupname", "storage name", new Microsoft.Azure.Management.Storage.Models.StorageAccountCreateParameters() {
Location = LocationNames.NorthEurope,
AccountType = Microsoft.Azure.Management.Storage.Models.AccountType.PremiumLRS
},new CancellationToken() { }).Result;
Console.ReadKey();
}
static string GetAccessToken(string tenantId, string clientId, string secretKey)
{
var authenticationContext = new AuthenticationContext($"https://login.windows.net/{tenantId}");
var credential = new ClientCredential(clientId, secretKey);
var result = authenticationContext.AcquireTokenAsync("https://management.core.windows.net/",
credential);
if (result == null)
{
throw new InvalidOperationException("Failed to obtain the JWT token");
}
var token = result.Result.AccessToken;
return token;
}

How to export SQL Database directly to blob storage programmatically

I need to programmatically backup/export a SQL Database (either in Azure, or a compatible-one on-prem) to Azure Storage, and restore it to another SQL Database. I would like to use only NuGet packages for code dependencies, since I cannot guarantee that either the build or production servers will have the Azure SDK installed. I cannot find any code examples for something that I assume would be a common action. The closest I found was this:
https://blog.hompus.nl/2013/03/13/backup-your-azure-sql-database-to-blob-storage-using-code/
But, this code exports to a local bacpac file (requiring RoleEnvironment, an SDK-only object). I would think there should be a way to directly export to Blob Storage, without the intermediary file. One thought was to create a Stream, and then run:
services.ExportBacpac(stream, "dbnameToBackup")
And then write the stream to storage; however a Memory Stream wouldn't work--this could be a massive database (100-200 GB).
What would be a better way to do this?
Based on my test, the Microsoft Azure SQL Management Library 0.51.0-prerelease supports exporting the SQL database .bacpac file directly to Azure Storage.
We can use sqlManagementClient.ImportExport.Export(resourceGroup, azureSqlServer, azureSqlDatabase, exportRequestParameters) to export the .bacpac file to Azure Storage.
However, ImportExport is not available in the latest version of the Microsoft Azure SQL Management Library SDK, so we can only use the 0.51.0-prerelease SDK.
For details on how to use the Microsoft Azure SQL Management Library to export the SQL backup to Azure Blob Storage, refer to the steps and code below.
Prerequisites:
Register an app in Azure AD and create a service principal for it. For detailed steps on how to register the app and get an access token, refer to the documentation.
Code:
Note: replace clientId, tenantId, secretKey, and subscriptionId with your registered Azure AD information, and replace azureSqlDatabase, resourceGroup, azureSqlServer, adminLogin, adminPassword, storageKey, and storageAccount with your own SQL database and storage details.
static void Main(string[] args)
{
var subscriptionId = "xxxxxxxx";
var clientId = "xxxxxxxxx";
var tenantId = "xxxxxxxx";
var secretKey = "xxxxx";
var azureSqlDatabase = "data base name";
var resourceGroup = "Resource Group name";
var azureSqlServer = "xxxxxxx"; //testsqlserver
var adminLogin = "user";
var adminPassword = "password";
var storageKey = "storage key";
var storageAccount = "storage account";
var baseStorageUri = $"https://{storageAccount}.blob.core.windows.net/brandotest/"; // include the container name, ending with "/"
var backName = azureSqlDatabase + "-" + $"{DateTime.UtcNow:yyyyMMddHHmm}" + ".bacpac"; // backup file name
var backupUrl = baseStorageUri + backName;
ImportExportOperationStatusResponse exportStatus = new ImportExportOperationStatusResponse();
try
{
ExportRequestParameters exportRequestParameters = new ExportRequestParameters
{
AdministratorLogin = adminLogin,
AdministratorLoginPassword = adminPassword,
StorageKey = storageKey,
StorageKeyType = "StorageAccessKey",
StorageUri = new Uri(backupUrl)
};
SqlManagementClient sqlManagementClient = new SqlManagementClient(new Microsoft.Azure.TokenCloudCredentials(subscriptionId, GetAccessToken(tenantId, clientId, secretKey)));
var export = sqlManagementClient.ImportExport.Export(resourceGroup, azureSqlServer, azureSqlDatabase,
exportRequestParameters); //do export operation
while (exportStatus.Status != Microsoft.Azure.OperationStatus.Succeeded) // poll until the operation succeeds
{
Thread.Sleep(1000 * 60);
exportStatus = sqlManagementClient.ImportExport.GetImportExportOperationStatus(export.OperationStatusLink);
}
Console.WriteLine($"Export DataBase {azureSqlDatabase} to Storage {storageAccount} Succesfully");
}
catch (Exception exception)
{
//todo
}
}
private static string GetAccessToken(string tenantId, string clientId, string secretKey)
{
var authenticationContext = new AuthenticationContext($"https://login.windows.net/{tenantId}");
var credential = new ClientCredential(clientId, secretKey);
var result = authenticationContext.AcquireTokenAsync("https://management.core.windows.net/",
credential);
if (result == null)
{
throw new InvalidOperationException("Failed to obtain the JWT token");
}
var token = result.Result.AccessToken;
return token;
}
The result looks like this:
1. Send the request to tell the SQL server to start exporting to Azure Blob Storage.
2. Keep sending requests to monitor the export operation status.
3. The export operation finishes.
Here's an idea:
Pass the stream to the .ExportBacpac method but hold a reference to it on a different thread where you regularly empty and reset the stream so that there's no memory overflow. I'm assuming here that Dac does not have any means to access the stream while it is being filled.
The thing you have to take care of yourself, though, is thread safety: MemoryStreams are not thread-safe by default, so you'd have to write your own locking mechanisms around .Position and .CopyTo. I've not tested this, but if you handle locking correctly I'd assume the .ExportBacpac method won't throw any errors while the other thread accesses the stream.
Here's a very simple example as pseudo-code just outlining my idea:
ThreadSafeStream stream = new ThreadSafeStream();
Task task = new Task(async (exitToken) => {
MemoryStream partialStream = new MemoryStream();
// Check if backup completed
if (...)
{
exitToken.Trigger();
}
stream.CopyToThreadSafe(partialStream);
stream.PositionThreadSafe = 0;
AzureService.UploadToStorage(partialStream);
await Task.Delay(500); // Play around with this - it shouldn't take too long to copy the stream
});
services.ExportBacpac(stream, "dbnameToBackup");
await TimerService.RunTaskPeriodicallyAsync(task, 500);
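For reference, here is one way such a wrapper could look. This is a minimal sketch of a hypothetical ThreadSafeStream (the class and its ...ThreadSafe members are not a real API, just the names used in the pseudo-code above), assuming coarse lock-based synchronization is acceptable:
using System.IO;

public class ThreadSafeStream : MemoryStream
{
    private readonly object _sync = new object();

    // Thread-safe counterpart of Position, as referenced by PositionThreadSafe above.
    public long PositionThreadSafe
    {
        get { lock (_sync) { return Position; } }
        set { lock (_sync) { Position = value; } }
    }

    // Serialize writes so the exporting thread and the uploading thread don't interleave.
    public override void Write(byte[] buffer, int offset, int count)
    {
        lock (_sync) { base.Write(buffer, offset, count); }
    }

    // Copy the current contents to another stream under the same lock, restoring Position afterwards.
    public void CopyToThreadSafe(Stream destination)
    {
        lock (_sync)
        {
            long saved = Position;
            Position = 0;
            CopyTo(destination);
            Position = saved;
        }
    }
}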
It's similar to Brando's answer, but this one uses a stable package:
using Microsoft.WindowsAzure.Management.Sql;
NuGet
Using the same variables as in Brando's answer, the code looks like this:
var azureSqlServer = "xxxxxxx"+".database.windows.net";
var azureSqlServerName = "xxxxxxx";
SqlManagementClient managementClient = new SqlManagementClient(new TokenCloudCredentials(subscriptionId, GetAccessToken(tenantId, clientId, secretKey)));
var exportParams = new DacExportParameters()
{
BlobCredentials = new DacExportParameters.BlobCredentialsParameter()
{
StorageAccessKey = storageKey,
Uri = new Uri(baseStorageUri)
},
ConnectionInfo = new DacExportParameters.ConnectionInfoParameter()
{
ServerName = azureSqlServer,
DatabaseName = azureSqlDatabase,
UserName = adminLogin,
Password = adminPassword
}
};
var exportResult = managementClient.Dac.Export(azureSqlServerName, exportParams);
You can use Microsoft.Azure.Management.Fluent to export your database to a .bacpac file and store it in a blob. To do this, there are a few things you need to do.
Create an Azure AD (Azure Active Directory) application and a service principal that can access resources. Follow this link for a comprehensive guide.
From the first step, you will need the "Application (client) ID", "Client Secret", and "Tenant ID".
Install the Microsoft.Azure.Management.Fluent NuGet package, and import the Microsoft.Azure.Management.Fluent, Microsoft.Azure.Management.ResourceManager.Fluent, and Microsoft.Azure.Management.ResourceManager.Fluent.Authentication namespaces.
Replace the placeholders in the code snippets below with proper values for your use case.
Enjoy!
var principalClientID = "<Application (Client) ID>";
var principalClientSecret = "<ClientSecret>";
var principalTenantID = "<TenantID>";
var sqlServerName = "<SQL Server Name (without '.database.windows.net')>";
var sqlServerResourceGroupName = "<SQL Server Resource Group>";
var databaseName = "<Database Name>";
var databaseLogin = "<Database Login>";
var databasePassword = "<Database Password>";
var storageResourceGroupName = "<Storage Resource Group>";
var storageName = "<Storage Account>";
var storageBlobName = "<Storage Blob Name>";
var bacpacFileName = "myBackup.bacpac";
var credentials = new AzureCredentialsFactory().FromServicePrincipal(principalClientID, principalClientSecret, principalTenantID, AzureEnvironment.AzureGlobalCloud);
var azure = await Azure.Authenticate(credentials).WithDefaultSubscriptionAsync();
var storageAccount = await azure.StorageAccounts.GetByResourceGroupAsync(storageResourceGroupName, storageName);
var sqlServer = await azure.SqlServers.GetByResourceGroupAsync(sqlServerResourceGroupName, sqlServerName);
var database = await sqlServer.Databases.GetAsync(databaseName);
await database.ExportTo(storageAccount, storageBlobName, bacpacFileName)
.WithSqlAdministratorLoginAndPassword(databaseLogin, databasePassword)
.ExecuteAsync();

Win 10 zip upload reorders blocks of data

Our system has been used to upload millions of files over several years. The clients use the following code to send an authentication token and zip file to our WEB API on Windows Server 2008 R2. On our Windows 7 devices, the system works great. As we are attempting to move to Windows 10 devices, we have suddenly encountered an issue where the received file has blocks of data in a different order than the source file. The problem only occurs about half of the time, which makes it very difficult to track down.
client code (.NET 4.5)
private static void UploadFile(string srcFile, string username, string password)
{
if (File.Exists(srcFile))
{
ConnectionUtilities connUtil = new ConnectionUtilities();
string authToken = connUtil.GetAuthToken(username, password);
using (HttpContent authContent = new StringContent(authToken))
{
using (HttpContent fileStreamContent = new ByteArrayContent(File.ReadAllBytes(srcFile)))
{
FileInfo fi = new FileInfo(srcFile);
using (HttpClient client = new HttpClient())
using (MultipartFormDataContent formData = new MultipartFormDataContent())
{
client.DefaultRequestHeaders.ExpectContinue = false;
formData.Add(authContent, "auth");
formData.Add(fileStreamContent, "data", fi.Name);
var response = client.PostAsync(ConfigItems.hostName + "UploadData", formData).Result;
if (response.IsSuccessStatusCode)
{
File.Delete(srcFile);
}
}
}
}
}
}
WEB API code (.NET 4.5.2)
public async Task<HttpResponseMessage> PostUploadData()
{
if (Request.Content.IsMimeMultipartContent())
{
MultipartFormDataStreamProvider streamProvider = new MultipartFormDataStreamProvider(HttpContext.Current.Server.MapPath("~/app_data"));
await Request.Content.ReadAsMultipartAsync(streamProvider);
string auth = streamProvider.FormData["auth"];
if (auth != null)
{
auth = HttpUtility.UrlDecode(auth);
}
if (Util.IsValidUsernameAndPassword(auth))
{
string username = Util.GetUsername(auth);
foreach (var file in streamProvider.FileData)
{
DirectoryInfo di = new DirectoryInfo(ConfigurationManager.AppSettings["DataRoot"]);
di = di.CreateSubdirectory(username);
string contentFileName = file.Headers.ContentDisposition.FileName;
di = di.CreateSubdirectory("storage");
FileInfo fi = new FileInfo(file.LocalFileName);
string destFileName = Path.Combine(di.FullName, contentFileName);
File.Move(fi.FullName, destFileName);
}
return new HttpResponseMessage(HttpStatusCode.OK);
}
}
return new HttpResponseMessage(HttpStatusCode.ServiceUnavailable);
}
The problem initially manifests as a zipped file that can't open in Windows. Only by doing a hexadecimal compare did we determine that the file was all there, just not in the same order as the original.
Any thoughts on what might be causing the blocks of data to be reordered?
P.S. I know the HttpClient is not being used as effectively as possible.
After some long and tedious testing (Yay, scientific method) we determined that our web content filter software was causing the issue.

Decrypt and copy blob to other blob storage account?

I'm using this approach to encrypt files and store them in Azure block blobs. I would like to copy the encrypted blob to another blob storage account and decrypt it in the process. I know it's possible to do a "copy blob" operation which runs entirely inside Azure asynchronously and doesn't download the blob contents through my local computer in transit. I believe this is accomplished through the CloudBlockBlob.StartCopy method. But is it possible to do that with an encrypted file and decrypt it in transit to the other storage account?
Following that link above, my code looks like the following. blob.OpenRead works but blob2.StartCopy doesn't work.
BlobEncryptionPolicy policy = new BlobEncryptionPolicy(null, cloudResolver);
BlobRequestOptions options = new BlobRequestOptions() { EncryptionPolicy = policy };
CloudBlockBlob blob = container.GetBlockBlobReference("MyFile.txt");
//var blobStream = blob.OpenRead(null, options); //this works
CloudBlockBlob blob2 = container2.GetBlockBlobReference("MyFile2.txt");
blob2.StartCopy(blob, null, null, options, null); //this fails with: The remote server returned an error: (404) Not Found.
The answer is that encryption is done in the storage client library, so if you copy a blob to a new storage account it will still be encrypted.
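In other words, a server-side copy cannot decrypt for you. If the goal is a decrypted copy in the other account, one option is to download through the client-side encryption policy and upload the plaintext; here is a minimal sketch using the same container, container2 and cloudResolver objects as in the question (note the data flows through the client, unlike StartCopy):
var policy = new BlobEncryptionPolicy(null, cloudResolver);
var options = new BlobRequestOptions { EncryptionPolicy = policy };

var sourceBlob = container.GetBlockBlobReference("MyFile.txt");
var targetBlob = container2.GetBlockBlobReference("MyFile2.txt");

using (var buffer = new MemoryStream())
{
    // Decrypts client-side via the encryption policy.
    sourceBlob.DownloadToStream(buffer, null, options);
    buffer.Position = 0;
    // Uploads the decrypted bytes; no encryption policy on the target.
    targetBlob.UploadFromStream(buffer);
}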
The reason your code is failing is that the source blob is in a private container. For a cross-account copy to work, the source blob must be publicly accessible (or reachable via a SAS URL, as below). Within the same storage account, you can copy a blob from a private container. AFAIK, the error has nothing to do with encryption.
What you could do is create a SAS URL on the source blob and then use the following override of StartCopy method:
public string StartCopy(
Uri source,
AccessCondition sourceAccessCondition = null,
AccessCondition destAccessCondition = null,
BlobRequestOptions options = null,
OperationContext operationContext = null
)
Here's the sample code to do so:
private static void StartCopyAcrossAccount()
{
    var sourceAccount = new CloudStorageAccount(new StorageCredentials("source-account-name", "source-account-key"), true);
    var sourceContainer = sourceAccount.CreateCloudBlobClient().GetContainerReference("source-container");
    var sourceBlob = sourceContainer.GetBlockBlobReference("blob-name");
    var sourceBlobSas = sourceBlob.GetSharedAccessSignature(new Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPolicy()
    {
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
        Permissions = Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPermissions.Read
    });
    var sourceBlobSasUrl = sourceBlob.Uri.AbsoluteUri + sourceBlobSas;
    var targetAccount = new CloudStorageAccount(new StorageCredentials("target-account-name", "target-account-key"), true);
    var targetContainer = targetAccount.CreateCloudBlobClient().GetContainerReference("target-container");
    var targetBlob = targetContainer.GetBlockBlobReference("blob-name");
    var copyId = targetBlob.StartCopy(new Uri(sourceBlobSasUrl), null, null);
}
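Note that StartCopy only schedules an asynchronous server-side copy. If you need to wait for it to finish, you could poll the destination blob; a small illustrative sketch (not part of the original answer):
private static void WaitForCopy(CloudBlockBlob targetBlob)
{
    // Refresh the blob's properties, including CopyState, until the copy completes.
    targetBlob.FetchAttributes();
    while (targetBlob.CopyState.Status == CopyStatus.Pending)
    {
        System.Threading.Thread.Sleep(1000);
        targetBlob.FetchAttributes();
    }
}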
