The app runs on several on-premises servers and regularly uploads files to Azure Blob Storage in East US. Now an instance of the app has to be deployed in the Australian region, and as a result the upload time to the cloud has increased drastically.
I tested whether Azure Front Door could help and found that downloads from Blob Storage are about 5 times faster through the Azure Front Door link. Now I am struggling to change the C# code to upload files via Azure Front Door. I tried using the suffix "azurefd.net" instead of "core.windows.net" in the connection string, but it does not help. Could somebody give me a hint on how to upload files to Azure Blob Storage via Azure Front Door in C#?
Since the Storage connection string only encodes the storage endpoint (core.windows.net), you cannot substitute the Front Door endpoint (azurefd.net) into it. I integrated an Azure Storage account with Front Door and am able to access the blob files in the storage account through the Front Door URL, but uploading to Blob Storage through Front Door does not work by editing the connection string in C#: the SDK builds its request URLs from the storage endpoint embedded in the connection string.
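If you still want upload requests to flow through Front Door, one workaround is to skip the connection string entirely and construct the client from a full blob URI on the Front Door hostname with a SAS token appended, since BlobClient accepts such a URI directly. This is only a sketch under assumptions: the <FRONT-DOOR> endpoint, the testaccess container, the file path, and the SAS value are placeholders, and it presumes your Front Door route forwards PUT requests unmodified to the storage origin:
using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

class FrontDoorUploadSketch
{
    static async Task Main()
    {
        // Hypothetical values: replace with your Front Door endpoint,
        // container, blob name, and a SAS token with write permission.
        var blobUri = new Uri(
            "https://<FRONT-DOOR>.azurefd.net/testaccess/t1.txt?<sharedAccessSignature>");

        // BlobClient accepts a full URI (SAS included), so no connection
        // string -- and therefore no core.windows.net endpoint -- is needed.
        var blobClient = new BlobClient(blobUri);

        using FileStream file = File.OpenRead(@"C:\Downloads\t1.txt");
        await blobClient.UploadAsync(file, overwrite: true);
    }
}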
Unfortunately, Front Door provides no benefit for uploads. For the test I used
PUT requests as described here: https://learn.microsoft.com/en-us/rest/api/storageservices/put-blob
PUT https://<entityName>.azurefd.net/<containerName>/<blobName>?<sharedAccessSignature>
x-ms-version: 2020-10-02
x-ms-blob-type: BlockBlob
< C:\Downloads\t1.txt
and compared times for the storage account endpoint and the Azure Front Door endpoint. There is no difference in upload speed.
The code I used for the test:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

namespace SandboxV2
{
    class Program
    {
        static async Task Main()
        {
            string frontDoorUrl = "https://<FRONT-DOOR>.azurefd.net";
            string storageUrl = "https://{STORAGE-ACCOUNT}.blob.core.windows.net";
            string sasString = "...";

            Console.Write("File Path: ");
            string filePath = Console.ReadLine();

            await RunUploadTestAsync(filePath, frontDoorUrl, sasString, "-fd");
            await RunUploadTestAsync(filePath, storageUrl, sasString, "-storage");
        }

        private static async Task RunUploadTestAsync(string filePath, string rootUrl, string sasString, string suffix)
        {
            string blobName = Path.GetFileNameWithoutExtension(filePath) + suffix + Path.GetExtension(filePath);
            Console.WriteLine(rootUrl);
            string containerName = "testaccess";

            var speeds = new List<double>();
            var times = new List<TimeSpan>();
            for (int i = 0; i < 5; i++)
            {
                var t1 = DateTime.UtcNow;
                var statusCode = await UploadAsSingleBlock(filePath, rootUrl, blobName, containerName, sasString);
                var t2 = DateTime.UtcNow;

                var time = t2 - t1;
                var speed = new FileInfo(filePath).Length / time.TotalSeconds / 1024 / 1024 * 8;
                speeds.Add(speed);
                times.Add(time);
                Console.WriteLine($"Code: {statusCode}. Time: {time}. Speed: {speed}");
            }
            Console.WriteLine($"Average time: {TimeSpan.FromTicks((long)times.Select(t => t.Ticks).Average())}. Average speed: {speeds.Average()}.");
        }

        private static async Task<HttpStatusCode> UploadAsSingleBlock(string filePath, string rootUrl, string blobName, string containerName, string sasString)
        {
            var request = new HttpRequestMessage(HttpMethod.Put, $"{rootUrl}/{containerName}/{blobName}?{sasString}");
            request.Headers.Add("x-ms-version", "2020-10-02");
            request.Headers.Add("x-ms-blob-type", "BlockBlob");

            HttpResponseMessage response;
            using (var fileStream = new FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.Read))
            {
                request.Content = new StreamContent(fileStream);
                using (var client = new HttpClient())
                    response = await client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead);
            }
            return response.StatusCode;
        }
    }
}
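One caveat on this test harness: it creates a new HttpClient for every upload. For a short, sequential benchmark that is harmless, but in long-running code you would normally reuse a single HttpClient instance to avoid socket exhaustion.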
I want to get a specific block from an Azure block blob with the block ID. Is this even possible?
Something like:
var blockBlob = new BlockBlobClient(connectionString, containerName, blobName);
var blocklist = await GetBlobBlockList(blobName, cancellationToken);
var firstBlock = blocklist.First();
var memStream = new MemoryStream();
await blockBlob.DownloadStreamingAsync(memStream, firstBlock.Name);
I want to get a specific block from an azure block blob with the blockId, is this even possible?
It should be possible to do so, however it won't be as simple as the sample code in your question.
Here's what you would need to do:
Fetch the list of blocks. Each element in the list has a block ID and the size of the block.
Assuming you want the data for block n, iterate over the list from block 0 to block n - 1 and add up the size of each block.
Next, call DownloadRangeToStreamAsync(Stream, Nullable<Int64>, Nullable<Int64>), where the offset value is the sum of the block sizes calculated in step 2 and the length value is the size of the block you wish to download. A sketch of these steps follows this list.
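Here is a minimal sketch of those three steps, assuming the classic SDK (Microsoft.Azure.Storage.Blob), where DownloadRangeToStreamAsync lives on CloudBlockBlob; the blockBlob parameter and the zero-based index n are supplied by the caller:
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Azure.Storage.Blob;

static class BlockRangeSketch
{
    // Downloads the bytes of the n-th committed block of 'blockBlob'.
    public static async Task<byte[]> DownloadBlockAsync(CloudBlockBlob blockBlob, int n)
    {
        // Step 1: fetch the block list (block ID + size per block).
        var blocks = (await blockBlob.DownloadBlockListAsync()).ToList();

        // Step 2: the block's offset is the sum of the sizes of the blocks before it.
        long offset = blocks.Take(n).Sum(b => b.Length);
        long length = blocks[n].Length;

        // Step 3: download just that byte range.
        using (var ms = new MemoryStream())
        {
            await blockBlob.DownloadRangeToStreamAsync(ms, offset, length);
            return ms.ToArray();
        }
    }
}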
To try this out, you first need a block blob. Create a container under the storage account and upload a file into it; the blob's type (Block blob) is shown in its properties in the portal. To get the block ID, you can use the code below:
using System;
using System.Linq;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Azure.Storage.Blobs.Specialized;

class Program
{
    static async Task Main(string[] args)
    {
        string connectionString = "DefaultEndpointsProtocol=https;AccountName=<account-name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net";
        string containerName = "***";
        string blobName = "***.txt";

        BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
        BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);

        // Block lists are exposed on BlockBlobClient, not BlobClient.
        BlockBlobClient blockBlobClient = containerClient.GetBlockBlobClient(blobName);
        BlockList blockList = await blockBlobClient.GetBlockListAsync(BlockListType.All);

        string firstBlockId = blockList.CommittedBlocks.First().Name;
        Console.WriteLine("First block ID: " + firstBlockId);
    }
}
After getting the block ID, you can fetch that specific block with the code below. DownloadStreamingAsync does not accept a block ID directly, so the code first translates the block ID into a byte range using the block list:
using System;
using System.IO;
using System.Threading.Tasks;
using Azure;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Azure.Storage.Blobs.Specialized;

class Program
{
    static async Task Main(string[] args)
    {
        string connectionString = "DefaultEndpointsProtocol=https;AccountName=<account-name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net";
        string containerName = "mycontainer---";
        string blobName = "myblob**";
        string blockId = "1234567**";

        BlockBlobClient blockBlobClient = new BlobServiceClient(connectionString)
            .GetBlobContainerClient(containerName)
            .GetBlockBlobClient(blobName);

        // Walk the committed block list, summing sizes until the target block is found.
        BlockList blockList = await blockBlobClient.GetBlockListAsync(BlockListType.Committed);
        long offset = 0, length = 0;
        foreach (BlobBlock block in blockList.CommittedBlocks)
        {
            if (block.Name == blockId) { length = block.Size; break; }
            offset += block.Size;
        }
        if (length == 0)
            throw new InvalidOperationException($"Block {blockId} not found.");

        // Download only the byte range occupied by that block.
        BlobDownloadStreamingResult result = await blockBlobClient.DownloadStreamingAsync(
            new BlobDownloadOptions { Range = new HttpRange(offset, length) });

        using MemoryStream blockData = new MemoryStream();
        await result.Content.CopyToAsync(blockData);

        Console.WriteLine("Block data:");
        Console.WriteLine(Convert.ToBase64String(blockData.ToArray()));
    }
}
In the code above, replace the connection string, container name, blob name, and block ID with your own values. By following this procedure, I was able to retrieve the block successfully.
I am in the process of moving applications to Azure using Azure Function Apps and Blob Storage. I'm trying to figure out a way to do encryption/decryption in Azure. When doing the same in our on-premises environment, I used the PgpCore package, but that is when you can point to a local file. I'm struggling to figure out a way to do that with Azure Blob Storage; all the examples seem to use local files.
I see paid-for products like DidiSoft, but I'm trying to stick with a free option if possible.
Currently I'm working on the decrypt side. The concept: a PGP file is delivered to my Blob Storage. An Azure Function with a blob trigger watches for *.pgp files and calls another Function App dedicated to PGP work. That PGP Function App decrypts the file right back into the same blob storage, this time as a .txt file. Another blob trigger watches for .txt files and runs the code that processes them; that trigger is already done and working. My PGP Function App already calls Key Vault to get any public/private keys needed, but what I'm not finding is a good way to do the actual encrypt/decrypt in Azure.
Anyone have any suggestions or examples to go look at?
We follow a similar process: download a PGP file (via SFTP in our case) and load it into Blob Storage. Then we call an Azure Function that uses PgpCore to decrypt the PGP files in Blob Storage.
Here is the Decrypt code in the Azure Function:
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using PgpCore;
using Azure.Storage.Blobs;

namespace AF_Services
{
    public static class PGPDecryptBlob
    {
        [FunctionName("PGPDecryptBlob")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
            ILogger log)
        {
            log.LogInformation($"C# HTTP trigger function {nameof(PGPDecryptBlob)} processed a request.");

            string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
            dynamic data = JsonConvert.DeserializeObject(requestBody);

            try
            {
                // Create TMP directory
                Directory.CreateDirectory(IOHelper.TempAppDirectoryName);

                string sourceContainerName = data?.sourceContainerName;
                log.LogInformation($"Source container name: {sourceContainerName}");
                string sourceBlobName = data?.sourceBlobName;
                log.LogInformation($"Source blob name: {sourceBlobName}");
                string targetContainerName = data?.targetContainerName;
                log.LogInformation($"Target container name: {targetContainerName}");
                string targetBlobName = data?.targetBlobName;
                log.LogInformation($"Target blob name: {targetBlobName}");

                if (sourceContainerName == null)
                    throw new ArgumentNullException("Required parameter sourceContainerName not provided");
                if (sourceBlobName == null)
                    throw new ArgumentNullException("Required parameter sourceBlobName not provided");
                if (targetContainerName == null)
                    throw new ArgumentNullException("Required parameter targetContainerName not provided");
                if (targetBlobName == null)
                    throw new ArgumentNullException("Required parameter targetBlobName not provided");

                string keyBlobName = SettingsHelper.Get(Settings.PrivateKeyBlobName);
                string passPhrase = SettingsHelper.Get(Settings.PassPhrase);
                string blobAccountConnStr = SettingsHelper.Get(Settings.BlobConnectionString);

                // TMP file names
                var temp_sourceFileName = IOHelper.BuildTempFileName(BlobHelper.StripPath(sourceBlobName));
                var temp_targetFileName = IOHelper.BuildTempFileName(BlobHelper.StripPath(targetBlobName));
                var temp_keyFileName = IOHelper.BuildTempFileName(BlobHelper.StripPath(keyBlobName));

                // download Blob to TMP
                using (var sourceStream = new FileStream(temp_sourceFileName, FileMode.Create))
                {
                    var sourceBlobClient = new BlobClient(blobAccountConnStr, sourceContainerName, sourceBlobName);
                    await sourceBlobClient.DownloadToAsync(sourceStream);
                }

                // download key to TMP
                using (var keyStream = new FileStream(temp_keyFileName, FileMode.Create))
                {
                    var keyBlobClient = new BlobClient(blobAccountConnStr, sourceContainerName, keyBlobName);
                    await keyBlobClient.DownloadToAsync(keyStream);
                }

                // Decrypt
                using (var pgp = new PGP())
                {
                    using (FileStream inputFileStream = new FileStream(temp_sourceFileName, FileMode.Open))
                    {
                        using (Stream outputFileStream = File.Create(temp_targetFileName))
                        {
                            using (Stream privateKeyStream = new FileStream(temp_keyFileName, FileMode.Open))
                            {
                                await pgp.DecryptStreamAsync(inputFileStream, outputFileStream, privateKeyStream, passPhrase);
                            }
                        }
                    }
                }

                // write to target blob
                using (var decryptStream = new FileStream(temp_targetFileName, FileMode.Open))
                {
                    var targetBlobClient = new BlobClient(blobAccountConnStr, targetContainerName, targetBlobName);
                    await targetBlobClient.UploadAsync(decryptStream, true);
                    return new OkObjectResult(new { Status = "Decrypted", BlobInfo = targetBlobClient });
                }
            }
            catch (Exception ex)
            {
                return new BadRequestObjectResult(new { RequestBody = data, Exception = ex });
            }
            finally
            {
                Directory.Delete(IOHelper.TempAppDirectoryName, true);
            }
        }
    }
}
The Encrypt code is similar, just in the opposite direction.
UPDATE:
The sourceContainerName, sourceBlobName, targetContainerName, and targetBlobName are passed in the JSON body of the Azure Function call.
The PrivateKeyBlobName, PassPhrase, and BlobConnectionString are stored in the Azure Function App's Settings.
The PGP work has to happen locally, so the IOHelper class (code below) is for working with the Azure Function's local file system.
public static class IOHelper
{
    private static string _TempAppDirectoryName;

    public static string TempAppDirectoryName
    {
        get
        {
            if (String.IsNullOrWhiteSpace(_TempAppDirectoryName))
            {
                var ts = DateTime.UtcNow.Ticks.ToString();
                _TempAppDirectoryName = Path.Combine(Path.GetTempPath(), "{YourLocalDirectoryName}", ts);
            }
            return _TempAppDirectoryName;
        }
    }

    public static string BuildTempFileName(string fileName)
    {
        return Path.Combine(TempAppDirectoryName, fileName);
    }
}
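The function also references SettingsHelper and BlobHelper, which aren't shown. They are thin wrappers; here is a minimal sketch of what they might look like (an assumption, not the original code; the Settings values mirror the app setting names used above):
using System;
using System.IO;

public enum Settings { PrivateKeyBlobName, PassPhrase, BlobConnectionString }

public static class SettingsHelper
{
    // Reads a value from the Function App's application settings,
    // which surface as environment variables at runtime.
    public static string Get(Settings setting) =>
        Environment.GetEnvironmentVariable(setting.ToString());
}

public static class BlobHelper
{
    // Blob names may contain virtual directories ("folder/file.pgp");
    // keep only the file name for use in local TMP paths.
    public static string StripPath(string blobName) =>
        Path.GetFileName(blobName);
}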
I have the following test setup:
One test Azure blob storage account
Local folder with ~3000 small files (200 bytes each)
When I execute azcopy command:
azcopy copy --recursive "c:\localDir\*" "https://BLOBConnectionString"
it takes ~2 seconds to copy data.
When I run the following C# code:
ServicePointManager.Expect100Continue = false;
ServicePointManager.DefaultConnectionLimit = 32;
TransferManager.Configurations.ParallelOperations = 32;
var account = CloudStorageAccount.Parse("https://BLOBConnectionString");
CloudBlobClient client = account.CreateCloudBlobClient();
CloudBlobContainer container = client.GetContainerReference("container");
await container.CreateIfNotExistsAsync();
CloudBlobDirectory destinationBlob = container.GetDirectoryReference("data");
await TransferManager.UploadDirectoryAsync(@"c:\localDir\", destinationBlob);
it takes ~1 minute to copy the same amount of data.
I expected approximately the same latency from the C# code.
I tried this in my environment and got the results below:
Code:
using System;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;
using Microsoft.Azure.Storage.DataMovement;

namespace fastercpy
{
    class Program
    {
        public static void Main()
        {
            string storageConnectionString = "< Connection string >";
            CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
            CloudBlobClient blobClient = account.CreateCloudBlobClient();
            CloudBlobContainer blobContainer = blobClient.GetContainerReference("test");
            blobContainer.CreateIfNotExists();

            string sourcePath = "C:\\Users\\v-vsettu\\Documents\\Venkat";
            CloudBlobDirectory destinationBlob = blobContainer.GetDirectoryReference("data");
            TransferManager.Configurations.ParallelOperations = 64;

            // Set up the transfer context and track the upload progress
            SingleTransferContext context = new SingleTransferContext();
            context.ProgressHandler = new Progress<TransferStatus>((progress) =>
            {
                Console.WriteLine("Bytes uploaded: {0}", progress.BytesTransferred);
            });

            var task = TransferManager.UploadDirectoryAsync(sourcePath, destinationBlob);
            task.Wait();
        }
    }
}
You used TransferManager.Configurations.ParallelOperations = 32; try TransferManager.Configurations.ParallelOperations = 64 in your code instead, which should speed up the process.
The Microsoft Azure Storage Data Movement Library was created for fast uploading, downloading, and copying of Azure Storage blobs and files.
Reference:
Transfer data with the Data Movement library for .NET - Azure Storage | Microsoft Learn
I built an ASP.NET MVC API hosted on IIS on Windows 10 Pro (a VM on Azure: 4 GB RAM, 2 CPUs). In it I call an .exe (wkhtmltopdf) to convert an HTML page to an image and save it locally. Everything works fine, except that after a number of calls to the API the RAM usage goes crazy. Investigating with Task Manager, I saw a process called IIS Worker Process that takes more RAM every time the API is called. Of course I wrapped my System.Diagnostics.Process instance usage inside a using statement so it is disposed (IDisposable is implemented), but it still consumes more and more RAM, and after a while the server becomes laggy and unresponsive (it has only 4 GB of RAM, after all). I noticed that after some number of minutes (10, 15, 20 maybe) this IIS Worker Process calms down in terms of RAM usage.
Here is my code, which is pretty straightforward:
Gets base64 encoded url
Decodes it
Uses wkhtmltoimage.exe to convert it to image
Saves it locally
Reads the byte array
Creates a blob in Azure with the image
Returns json with the url
public async Task<ActionResult> Index(string url)
{
    object oJSON = new { url = string.Empty };
    if (!string.IsNullOrEmpty(value: url))
    {
        try
        {
            byte[] EncodedData = Convert.FromBase64String(s: url);
            string DecodedURL = Encoding.UTF8.GetString(bytes: EncodedData);

            using (Process proc = new Process())
            {
                proc.StartInfo.FileName = wkhtmltopdfExecutablePath;
                proc.StartInfo.Arguments = $"--encoding utf-8 \"{DecodedURL}\" {LocalImageFilePath}";
                proc.Start();
                proc.WaitForExit();
                oJSON = new { procStatusCode = proc.ExitCode };
            }

            if (System.IO.File.Exists(path: LocalImageFilePath))
            {
                byte[] pngBytes = System.IO.File.ReadAllBytes(path: LocalImageFilePath);
                System.IO.File.Delete(path: LocalImageFilePath);
                string ImageURL = await CreateBlob(blobName: $"{BlobName}.png", data: pngBytes);
                oJSON = new { url = ImageURL };
            }
        }
        catch (Exception ex)
        {
            Debug.WriteLine(value: ex);
        }
    }
    return Json(data: oJSON, behavior: JsonRequestBehavior.AllowGet);
}

private async Task<string> CreateBlob(string blobName, byte[] data)
{
    string ConnectionString = "DefaultEndpointsProtocol=https;AccountName=" + AzureStorrageAccountName + ";AccountKey=" + AzureStorageAccessKey + ";EndpointSuffix=core.windows.net";
    CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(connectionString: ConnectionString);
    CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
    CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference(containerName: AzureBlobContainer);
    await cloudBlobContainer.CreateIfNotExistsAsync();

    BlobContainerPermissions blobContainerPermissions = await cloudBlobContainer.GetPermissionsAsync();
    blobContainerPermissions.PublicAccess = BlobContainerPublicAccessType.Container;
    await cloudBlobContainer.SetPermissionsAsync(permissions: blobContainerPermissions);

    CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(blobName: blobName);
    cloudBlockBlob.Properties.ContentType = "image/png";
    using (Stream stream = new MemoryStream(buffer: data))
    {
        await cloudBlockBlob.UploadFromStreamAsync(source: stream);
    }
    return cloudBlockBlob.Uri.AbsoluteUri;
}
Here are some resources I've been reading that seem related to this issue, but they have not helped much:
Investigating ASP.Net Memory Dumps for Idiots (like Me)
ASP.NET app eating memory. Application / Session objects the reason?
IIS Worker Process using a LOT of memory?
Run dispose method upon asp.net IIS app restart
IIS: Idle Timeout vs Recycle
UPDATE:
if (System.IO.File.Exists(path: LocalImageFilePath))
{
    string BlobName = Guid.NewGuid().ToString(format: "n");
    string ImageURL = string.Empty;
    using (FileStream fileStream = new FileStream(LocalImageFilePath, FileMode.Open))
    {
        ImageURL = await CreateBlob(blobName: $"{BlobName}.png", dataStream: fileStream);
    }
    System.IO.File.Delete(path: LocalImageFilePath);
    oJSON = new { url = ImageURL };
}
The most likely cause of your pain is the allocation of large byte arrays:
byte[] pngBytes = System.IO.File.ReadAllBytes(path: LocalImageFilePath);
The easiest change to make, to try to encourage the GC to compact the Large Object Heap, is to set GCSettings.LargeObjectHeapCompactionMode to CompactOnce at the end of the method. That might help.
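A minimal sketch of that setting (an assumption of how you might wire it in; the compaction only happens during the next blocking gen-2 collection, which is why a forced GC.Collect() is shown):
using System;
using System.Runtime;

static class LohCompaction
{
    // Request a one-off compaction of the Large Object Heap; it takes
    // effect during the next blocking gen-2 collection.
    public static void CompactOnce()
    {
        GCSettings.LargeObjectHeapCompactionMode =
            GCLargeObjectHeapCompactionMode.CompactOnce;
        GC.Collect();
    }
}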
But, a better idea would be to remove the need for the large array altogether. To do this, change:
private async Task<string> CreateBlob(string blobName, byte[] data)
to instead be:
private async Task<string> CreateBlob(string blobName, FileStream data)
And then later use:
await cloudBlockBlob.UploadFromStreamAsync(source: data);
In the caller, you'll need to stop using ReadAllBytes and instead open a FileStream to read the file.
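Put together, the stream-based CreateBlob might look like the following sketch, based on the original method with only the parameter and the upload call changed (the container-permission lines are omitted for brevity, and the dataStream parameter name matches the caller in the UPDATE above):
private async Task<string> CreateBlob(string blobName, Stream dataStream)
{
    string ConnectionString = "DefaultEndpointsProtocol=https;AccountName=" + AzureStorrageAccountName + ";AccountKey=" + AzureStorageAccessKey + ";EndpointSuffix=core.windows.net";
    CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(connectionString: ConnectionString);
    CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
    CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference(containerName: AzureBlobContainer);
    await cloudBlobContainer.CreateIfNotExistsAsync();

    CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(blobName: blobName);
    cloudBlockBlob.Properties.ContentType = "image/png";

    // Stream straight from the file: no large byte[] is ever allocated,
    // so nothing lands on the Large Object Heap.
    await cloudBlockBlob.UploadFromStreamAsync(source: dataStream);
    return cloudBlockBlob.Uri.AbsoluteUri;
}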
I am trying to get a COMPLETE example of downloading a file from Azure Blob Storage using the .DownloadToStreamAsync() method, wired up to a progress bar.
I've found references to old implementations of the Azure Storage SDK, but they don't compile with the newer SDK (which has implemented these async methods) or don't work with the current NuGet packages:
https://blogs.msdn.microsoft.com/avkashchauhan/2010/11/03/uploading-a-blob-to-azure-storage-with-progress-bar-and-variable-upload-block-size/
https://blogs.msdn.microsoft.com/kwill/2013/03/05/asynchronous-parallel-blob-transfers-with-progress-change-notification-2-0/
I'm a newbie to async/await threading in .NET and was wondering if someone could help me take the code below (in a Windows Forms app) and show how I can hook into the progress of the file download. I see some examples don't use the .DownloadToStream method and instead download chunks of the blob file, but I wondered, since these new ...Async() methods exist in the newer Storage SDKs, whether there is something smarter that could be done.
So, assuming the code below works (non-async), what additionally would I have to do to use the blockBlob.DownloadToStreamAsync(fileStream) method? Is this even the right use of it, and how can I get the progress?
Ideally I am after any way I can hook the progress of the blob download so I can update a Windows Forms UI on big downloads, so if the below is not the right way, please enlighten me :)
// Retrieve storage account from connection string.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));

// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

// Retrieve reference to a previously created container.
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");

// Retrieve reference to a blob named "photo1.jpg".
CloudBlockBlob blockBlob = container.GetBlockBlobReference("photo1.jpg");

// Save blob contents to a file.
using (var fileStream = System.IO.File.OpenWrite(@"path\myfile"))
{
    blockBlob.DownloadToStream(fileStream);
}
Using the awesome method (downloading 1 MB chunks) kindly suggested by Gaurav, I have implemented it using a BackgroundWorker so I can update the UI as the download progresses.
The main part inside the do loop, which downloads the range to a stream and then writes the stream to the file system, I haven't touched from the original example, but I have added code to update the worker progress and to listen for worker cancellation (to abort the download). Not sure if this could be the issue?
For completeness, below is everything inside the worker_DoWork method:
public void worker_DoWork(object sender, DoWorkEventArgs e)
{
    object[] parameters = e.Argument as object[];
    string localFile = (string)parameters[0];
    string blobName = (string)parameters[1];
    string blobContainerName = (string)parameters[2];
    CloudBlobClient client = (CloudBlobClient)parameters[3];

    try
    {
        int segmentSize = 1 * 1024 * 1024; // 1 MB chunk
        var blobContainer = client.GetContainerReference(blobContainerName);
        var blob = blobContainer.GetBlockBlobReference(blobName);
        blob.FetchAttributes();
        blobLengthRemaining = blob.Properties.Length;
        blobLength = blob.Properties.Length;
        long startPosition = 0;
        do
        {
            long blockSize = Math.Min(segmentSize, blobLengthRemaining);
            byte[] blobContents = new byte[blockSize];
            using (MemoryStream ms = new MemoryStream())
            {
                blob.DownloadRangeToStream(ms, startPosition, blockSize);
                ms.Position = 0;
                ms.Read(blobContents, 0, blobContents.Length);
                using (FileStream fs = new FileStream(localFile, FileMode.OpenOrCreate))
                {
                    fs.Position = startPosition;
                    fs.Write(blobContents, 0, blobContents.Length);
                }
            }
            startPosition += blockSize;
            blobLengthRemaining -= blockSize;

            if (blobLength > 0)
            {
                decimal totalSize = Convert.ToDecimal(blobLength);
                decimal downloaded = totalSize - Convert.ToDecimal(blobLengthRemaining);
                decimal blobPercent = (downloaded / totalSize) * 100;
                worker.ReportProgress(Convert.ToInt32(blobPercent));
            }

            if (worker.CancellationPending)
            {
                e.Cancel = true;
                blobDownloadCancelled = true;
                return;
            }
        }
        while (blobLengthRemaining > 0);
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}
This is working, but on bigger files (30 MB, for example) I sometimes get a "can't write to file as open in another process" error and the process fails.
Using your code:
using (var fileStream = System.IO.File.OpenWrite(@"path\myfile"))
{
    blockBlob.DownloadToStream(fileStream);
}
It is not possible to show progress, because control returns from that call only when the download is complete. The DownloadToStream function internally splits a large blob into chunks and downloads the chunks.
What you need to do is download these chunks yourself, using the DownloadRangeToStream method. I answered a similar question some time back that you may find useful: Azure download blob part.
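For what it's worth, later releases of the storage client (WindowsAzure.Storage 8.x+ and its successor Microsoft.Azure.Storage.Blob) added a DownloadToStreamAsync overload that accepts an IProgress<StorageProgress>, so progress can be reported without manual chunking. A minimal sketch, assuming one of those package versions and a CloudBlockBlob as in the question:
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.Storage.Blob;
using Microsoft.Azure.Storage.Core.Util;

static class ProgressDownloadSketch
{
    // Downloads 'blockBlob' to 'localFile', reporting 0-100 percent values.
    public static async Task DownloadWithProgressAsync(
        CloudBlockBlob blockBlob, string localFile, IProgress<int> percent)
    {
        // The blob length is needed to turn bytes-transferred into a percentage.
        await blockBlob.FetchAttributesAsync();
        long totalBytes = blockBlob.Properties.Length;

        var progressHandler = new Progress<StorageProgress>(p =>
            percent.Report((int)(p.BytesTransferred * 100 / totalBytes)));

        using (var fileStream = File.OpenWrite(localFile))
        {
            // Overload: (target, accessCondition, options, operationContext,
            //            progressHandler, cancellationToken)
            await blockBlob.DownloadToStreamAsync(
                fileStream, null, null, null, progressHandler, CancellationToken.None);
        }
    }
}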