I'm using Azure Blob Storage to save some files. I'm having issues downloading a blob to a stream and I'm not sure why. I don't get any errors, just an empty stream. I've verified that the file exists in the container, and I've even run code to list all the files in the container. Any help would be greatly appreciated.
private async Task<MemoryStream> GetMemoryStreamAsync(string fileName)
{
    var storageAccountName = Environment.GetEnvironmentVariable("storage_account_name");
    var storageAccountKey = Environment.GetEnvironmentVariable("storage_access_key");
    var storageContainerName = Environment.GetEnvironmentVariable("storage_container_name");

    CloudStorageAccount storageAccount = new CloudStorageAccount(new Microsoft.WindowsAzure.Storage.Auth.StorageCredentials(storageAccountName, storageAccountKey), true);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference(storageContainerName);
    CloudBlockBlob blockBlob = container.GetBlockBlobReference(fileName);

    MemoryStream stream = new MemoryStream();
    await blockBlob.DownloadToStreamAsync(stream);
    return stream;
}
DownloadToStreamAsync leaves the stream's position at the end of the downloaded data, so you'll need to set the position back to zero before returning the stream so that the consumer reads it from the beginning.
MemoryStream stream = new MemoryStream();
await blockBlob.DownloadToStreamAsync(stream);
stream.Position = 0;
return stream;
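As a quick check on the consumer side, the returned stream can now be read from the start (the blob name below is just a placeholder):
// Hypothetical blob name, only to illustrate reading the returned stream.
using (MemoryStream stream = await GetMemoryStreamAsync("example.txt"))
using (var reader = new StreamReader(stream))
{
    Console.WriteLine(stream.Length);            // non-zero once the download succeeded
    string text = await reader.ReadToEndAsync(); // reads from the beginning because Position == 0
}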
Related
I am trying to update a JSON file that is located in Azure Blob Storage. When the program does the PUT call, the saved file looks like this:
Zona de especial protecci\u00F3n
The accents and other special characters are the problem, but that only happens when I download the file from the Azure portal UI; if I do a GET from Postman it does not happen. This is my code:
SemanticDictionaryContent semanticDictionaryContent = new SemanticDictionaryContent()
{
    Name = entity.Id + JSON_EXTENSION,
    Content = BinaryData.FromObjectAsJson(entity)
};
Created a storage account in Azure.
Created a container in Azure.
Uploaded a JSON file to the container using the code below.
I used the approach below for uploading, downloading, and editing the JSON file.
public static bool Upload()
{
    try
    {
        var containerName = "mycontainer";
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConnectionString);
        CloudBlobClient client = storageAccount.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference(containerName);
        var isCreated = container.CreateIfNotExists();
        container.SetPermissions(new BlobContainerPermissions { PublicAccess = BlobContainerPublicAccessType.Blob });

        using (FileStream fileStream = File.Open(@"C:\Tools\local.settings.json", FileMode.Open))
        using (MemoryStream memoryStream = new MemoryStream())
        {
            fileStream.CopyTo(memoryStream);

            var fileName = "local.settings.json";
            CloudBlockBlob blob = container.GetBlockBlobReference(fileName);

            // Look up the MIME type from the registry (Windows only); fall back to a generic type.
            string mimeType = "application/unknown";
            string ext = fileName.Contains(".") ? System.IO.Path.GetExtension(fileName).ToLower() : "." + fileName;
            Microsoft.Win32.RegistryKey regKey = Microsoft.Win32.Registry.ClassesRoot.OpenSubKey(ext);
            if (regKey != null && regKey.GetValue("Content Type") != null) mimeType = regKey.GetValue("Content Type").ToString();

            // Rewind before uploading so the whole buffer is written.
            memoryStream.Seek(0, SeekOrigin.Begin);
            blob.Properties.ContentType = mimeType;
            blob.UploadFromStream(memoryStream);
        }
        return true;
    }
    catch (Exception)
    {
        throw;
    }
}
The JSON file was uploaded successfully.
I then updated the JSON file in Azure and uploaded it again using the code below.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConnectionString);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
CloudBlockBlob jsonBlob = container.GetBlockBlobReference("local.settings.json");

// Download the JSON, update a couple of properties, and upload it back.
string jsonText = jsonBlob.DownloadText();
dynamic jsonData = JsonConvert.DeserializeObject(jsonText);
jsonData.property1 = "Property1";
jsonData.property2 = "Property2";
jsonBlob.UploadText(JsonConvert.SerializeObject(jsonData));
Finally, I downloaded the JSON file from Azure manually and did not find any escaped special characters.
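As a side note on the original symptom: BinaryData.FromObjectAsJson serializes with System.Text.Json, whose default encoder writes non-ASCII characters as escape sequences such as \u00F3. A raw download shows those escapes literally, while a client that pretty-prints the JSON (such as Postman) renders them back as accented characters. A minimal sketch of relaxing the escaping, assuming the entity type serializes cleanly with System.Text.Json:
using System.Text.Encodings.Web;
using System.Text.Json;

// Write accented characters literally as UTF-8 instead of \uXXXX escapes.
// UnsafeRelaxedJsonEscaping is only appropriate when the content is consumed as UTF-8 JSON.
var jsonOptions = new JsonSerializerOptions
{
    Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping
};

SemanticDictionaryContent semanticDictionaryContent = new SemanticDictionaryContent()
{
    Name = entity.Id + JSON_EXTENSION,
    Content = BinaryData.FromObjectAsJson(entity, jsonOptions)
};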
Context: encrypting and decrypting an audio file (.wav) in Azure Storage.
Issue: "inputStream must be seek-able" when encrypting with "await pgp.EncryptStreamAsync(sourceStream, outputStream);".
I'm not a C# developer :)
Thank you for your help.
Here is the code I'm using:
static async Task Main()
{
    string connectionString = Environment.GetEnvironmentVariable("AZURE_STORAGE_CONNECTION_STRING");

    // Container and blob to encrypt
    string containerName = "audioinput";
    string filename = "abc.wav";

    // Create a BlobServiceClient and get the container/blob clients
    BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
    BlobContainerClient sourcecontainer = blobServiceClient.GetBlobContainerClient(containerName);
    BlobClient blobClient = sourcecontainer.GetBlobClient(filename);

    if (sourcecontainer.Exists())
    {
        // Download blob to MemoryStream
        var sourceStream = new MemoryStream();
        await blobClient.DownloadToAsync(sourceStream);
        sourceStream.Position = 0;

        // Output stream for the encrypted data
        await using var outputStream = new MemoryStream();

        // Get encryption keys (path shortened here as in the original)
        EncryptionKeys encryptionKeys;
        using (Stream publicKeyStream = new FileStream(@"...\public.asc", FileMode.Open))
            encryptionKeys = new EncryptionKeys(publicKeyStream);

        PGP pgp = new PGP(encryptionKeys);
        await pgp.EncryptStreamAsync(sourceStream, outputStream);
    }
    else
    {
        Console.WriteLine("container doesn't exist");
    }
}
Below is the method where I read a CSV file from an Azure blob container and later call a function to copy the contents into table storage.
Now my requirement has changed a bit: the .csv file will be compressed to a .gz file in the blob container. I would like to know how I can modify the code below so that I can read the .gz file, decompress it, and then pass the contents on as I already do.
public async Task<string> ReadStream(string BlobcontainerName, string fileName, string connectionString)
{
    var contents = await DownloadBlob(BlobcontainerName, fileName, connectionString);
    string data = Encoding.UTF8.GetString(contents.ToArray());
    return data;
}

foreach (var files in recFiles) // recFiles is the list of CSV files
{
    string data = await ReadStream(containerName, files.Name, connectionString);
}
public async Task<MemoryStream> DownloadBlob(string containerName, string fileName, string connectionString)
{
    MemoryStream memoryStream = new MemoryStream();
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
    CloudBlobClient serviceClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = serviceClient.GetContainerReference(containerName);
    CloudBlockBlob blob = container.GetBlockBlobReference(fileName);
    if (blob.Exists())
    {
        using (memoryStream = new MemoryStream())
        {
            await blob.DownloadToStreamAsync(memoryStream);
        }
    }
    return memoryStream;
}
Try this code:
public async Task<MemoryStream> DownloadBlob(string containerName, string fileName, string connectionString)
{
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
    CloudBlobClient serviceClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = serviceClient.GetContainerReference(containerName);
    CloudBlockBlob blob = container.GetBlockBlobReference(fileName);

    var decompressedStream = new MemoryStream();
    if (blob.Exists())
    {
        // Download the compressed blob first, then decompress it into the stream we return.
        using (var compressedStream = new MemoryStream())
        {
            await blob.DownloadToStreamAsync(compressedStream);
            compressedStream.Position = 0; // rewind before reading the compressed data back

            // GZipStream lives in System.IO.Compression; in Decompress mode it is read from, not written to.
            using (var gZipStream = new GZipStream(compressedStream, CompressionMode.Decompress))
            {
                await gZipStream.CopyToAsync(decompressedStream);
            }
        }
        decompressedStream.Position = 0; // rewind so callers read from the beginning
    }
    return decompressedStream;
}
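With that change the existing ReadStream method can stay as it is, since it already converts the returned MemoryStream into a UTF-8 string; only the blob name passed in now points at the compressed file (the .gz name below is just an example):
string data = await ReadStream(containerName, "recfile.csv.gz", connectionString); // hypothetical .gz blob name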
I'm doing PGP encryption for a CSV file. Below is the code where I'm stuck. Basically, the code works if the public key is in a local text file; however, when the same file is in Azure Blob Storage and I download its contents into a MemoryStream and pass that as the parameter, it does not work. In short, File.OpenRead works but the MemoryStream does not. Please help.
public static PgpPublicKey ReadPublicKey12()
{
    var containerName = "pgpkeys";
    string storageConnection = CloudConfigurationManager.GetSetting("StorageConnnection");
    CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(storageConnection);
    CloudBlobClient blobClient = cloudStorageAccount.CreateCloudBlobClient();
    CloudBlobContainer cloudBlobContainer = blobClient.GetContainerReference(containerName);
    CloudBlockBlob blockBlob = cloudBlobContainer.GetBlockBlobReference("keyPublic.txt");

    Stream inputStream = new MemoryStream();
    blockBlob.DownloadToStream(inputStream);
    // inputStream = File.OpenRead(@"C:\PGPTest\keyPublic1234.txt");
    inputStream = PgpUtilities.GetDecoderStream(inputStream);

    PgpPublicKeyRingBundle pgpPub = new PgpPublicKeyRingBundle(inputStream);
    foreach (PgpPublicKeyRing kRing in pgpPub.GetKeyRings())
    {
        foreach (PgpPublicKey k in kRing.GetPublicKeys())
        {
            if (k.IsEncryptionKey)
                return k;
        }
    }
    throw new ArgumentException("Can't find encryption key in key ring.");
}
If you don't reset the stream position to zero (inputStream.Position = 0;), the stream is still positioned at the end of the downloaded data, so the PGP decoder reads zero bytes from it. You need to add the reset as shown below.
var containerName = "pgpkeys";
string storageConnection = CloudConfigurationManager.GetSetting("StorageConnnection");
CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(storageConnection);
CloudBlobClient blobClient = cloudStorageAccount.CreateCloudBlobClient();
CloudBlobContainer cloudBlobContainer = blobClient.GetContainerReference(containerName);
CloudBlockBlob blockBlob = cloudBlobContainer.GetBlockBlobReference("keyPublic.txt");

Stream inputStream = new MemoryStream();
blockBlob.DownloadToStream(inputStream);
inputStream.Position = 0; // rewind: DownloadToStream leaves the position at the end of the data
inputStream = PgpUtilities.GetDecoderStream(inputStream);

PgpPublicKeyRingBundle pgpPub = new PgpPublicKeyRingBundle(inputStream);
foreach (PgpPublicKeyRing kRing in pgpPub.GetKeyRings())
{
    foreach (PgpPublicKey k in kRing.GetPublicKeys())
    {
        Console.WriteLine("Obtained key from BLOB");
        if (k.IsEncryptionKey)
            return k;
    }
}
throw new ArgumentException("Can't find encryption key in key ring.");
I have an ASP.NET Core application that needs to send a stream (posted by the client) to Microsoft Cognitive Services to get an ID, and then send the same stream to Azure Blob Storage as a backup; the file name should be the ID received from Cognitive Services.
But it seems the MemoryStream ms is closed after being used by faceServiceClient: an error occurs at the second "ms.Position = 0" statement saying "Cannot access a closed stream".
public static async Task CreatPerson(string _key, HttpRequest _req)
{
    var faceServiceClient = new FaceServiceClient(_key);
    using (MemoryStream ms = new MemoryStream())
    {
        _req.Body.CopyTo(ms);
        ms.Position = 0;
        var facesTask = faceServiceClient.AddFaceToFaceListAsync("himlens", ms);

        // init azure blob
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(AZURE_STORE_CONN_STR);
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
        CloudBlobContainer container = blobClient.GetContainerReference("xxx");

        var faces = await facesTask;
        var blob = container.GetBlockBlobReference(faces.PersistedFaceId.ToString());

        ms.Position = 0; // Error here: "Cannot access a closed stream"
        await blob.UploadFromStreamAsync(ms);
    }
}
I'm confused about this; can anybody help me solve the problem?
Thanks!
ms.Position = 0;//Error Here
To fix it easily, you could create a new instance of MemoryStream and copy the value from ms into it, then upload that copy to your blob storage. The code below is for your reference.
using (MemoryStream ms = new MemoryStream())
{
    _req.Body.CopyTo(ms);
    ms.Position = 0;

    // new code which I added: take a copy of the stream before the Face API consumes it
    MemoryStream ms2 = new MemoryStream();
    ms.CopyTo(ms2);
    ms.Position = 0;

    var facesTask = faceServiceClient.AddFaceToFaceListAsync("himlens", ms);

    // init azure blob
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(AZURE_STORE_CONN_STR);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference("xxx");

    var faces = await facesTask;
    var blob = container.GetBlockBlobReference(faces.PersistedFaceId.ToString());

    // Code which I modified: upload the copy, with its position reset to the start
    ms2.Position = 0;
    await blob.UploadFromStreamAsync(ms2);
}
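An equivalent approach (a sketch, not part of the original answer) is to capture the request body once as a byte array and give each consumer its own MemoryStream over those bytes, which sidesteps the closed-stream issue entirely:
using (MemoryStream ms = new MemoryStream())
{
    _req.Body.CopyTo(ms);
    byte[] bytes = ms.ToArray(); // snapshot of the uploaded bytes

    // Each consumer gets an independent stream, so it does not matter if one of them closes its stream.
    var faces = await faceServiceClient.AddFaceToFaceListAsync("himlens", new MemoryStream(bytes));

    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(AZURE_STORE_CONN_STR);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference("xxx");
    var blob = container.GetBlockBlobReference(faces.PersistedFaceId.ToString());

    // A fresh stream over the same bytes; its position starts at 0.
    await blob.UploadFromStreamAsync(new MemoryStream(bytes));
}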