I'm rewriting my C# code to use Azure Blob Storage instead of the filesystem. So far I've had no problems rewriting the code for normal file operations, but I have some code that does an async write from a stream:
using (var stream = await Request.Content.ReadAsStreamAsync())
{
    FileStream fileStream = new FileStream(@"c:\test.txt", FileMode.Create, FileAccess.Write, FileShare.None);
    await stream.CopyToAsync(fileStream).ContinueWith(
        (copyTask) =>
        {
            fileStream.Close();
        });
}
I need to change the above to use Azure CloudBlockBlob or CloudBlobStream, but I can't find a way to declare a stream object that CopyToAsync can write to.
You would want to use the UploadFromStreamAsync method on CloudBlockBlob. Here's some sample code to do so (I have not tried running this code, though):
var cred = new StorageCredentials(accountName, accountKey);
var account = new CloudStorageAccount(cred, true);
var blobClient = account.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("container-name");
var blob = container.GetBlockBlobReference("blob-name");
using (var stream = await Request.Content.ReadAsStreamAsync())
{
    stream.Position = 0;
    await blob.UploadFromStreamAsync(stream);
}
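If you specifically want a stream object that CopyToAsync can write to, CloudBlockBlob also exposes OpenWriteAsync, which returns a writable CloudBlobStream. A minimal sketch under the same setup as above (again untested; disposing the blob stream is what commits the upload):

using (var stream = await Request.Content.ReadAsStreamAsync())
// OpenWriteAsync returns a writable CloudBlobStream
using (var blobStream = await blob.OpenWriteAsync())
{
    await stream.CopyToAsync(blobStream);
} // disposing blobStream commits the blob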
I'm uploading files to Azure Blob Storage with the .NET package, specifying the encoding iso-8859-1. The stream looks fine in memory, but when I upload it to blob storage it ends up with corrupted characters that apparently could not be converted to that encoding. It is as if the file were stored in a corrupted state: when I download it again and inspect it, the characters are all messed up. Here is the code I'm using.
public static async Task<bool> UploadFileFromStream(this CloudStorageAccount account, string containerName, string destBlobPath, string fileName, Stream stream, Encoding encoding)
{
    if (account is null) throw new ArgumentNullException(nameof(account));
    if (string.IsNullOrEmpty(containerName)) throw new ArgumentException("message", nameof(containerName));
    if (string.IsNullOrEmpty(destBlobPath)) throw new ArgumentException("message", nameof(destBlobPath));
    if (stream is null) throw new ArgumentNullException(nameof(stream));

    stream.Position = 0;
    CloudBlockBlob blob = GetBlob(account, containerName, $"{destBlobPath}/{fileName}");
    blob.Properties.ContentType = FileUtils.GetFileContentType(fileName);

    using var reader = new StreamReader(stream, encoding);
    var ct = await reader.ReadToEndAsync();
    await blob.UploadTextAsync(ct, encoding ?? Encoding.UTF8, AccessCondition.GenerateEmptyCondition(), new BlobRequestOptions(), new OperationContext());
    return true;
}
This is the file just before uploading it
<provinciaDatosInmueble>Sevilla</provinciaDatosInmueble>
<inePoblacionDatosInmueble>969</inePoblacionDatosInmueble>
<poblacionDatosInmueble>Valencina de la Concepción</poblacionDatosInmueble>
and this is the file after the upload
<provinciaDatosInmueble>Sevilla</provinciaDatosInmueble>
<inePoblacionDatosInmueble>969</inePoblacionDatosInmueble>
<poblacionDatosInmueble>Valencina de la Concepci�n</poblacionDatosInmueble>
The encoding I pass in the encoding parameter is ISO-8859-1. Does anybody know why Blob Storage seems to ignore the encoding I'm specifying? Thanks in advance!
We were able to achieve this using Azure.Storage.Blobs instead of WindowsAzure.Storage, which is a legacy Storage SDK. Below is the code that worked for us.
class Program
{
    static async Task Main(string[] args)
    {
        string sourceContainerName = "<Source_Container_Name>";
        string destBlobPath = "<Destination_Path>";
        string fileName = "<Source_File_name>";

        BlobServiceClient blobServiceClient = new BlobServiceClient("<Your_Connection_String>");
        BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(sourceContainerName);
        BlobClient blobClientSource = containerClient.GetBlobClient(fileName);
        BlobClient blobClientDestination = containerClient.GetBlobClient(destBlobPath);

        // Reading from the source blob
        var line = " ";
        if (await blobClientSource.ExistsAsync())
        {
            var response = await blobClientSource.DownloadAsync();
            using (StreamReader streamReader = new StreamReader(response.Value.Content))
            {
                line = await streamReader.ReadToEndAsync();
            }
        }

        // Writing to the destination blob
        var content = Encoding.UTF8.GetBytes(line);
        using (var ms = new MemoryStream(content))
            blobClientDestination.Upload(ms);
    }
}
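Note that this answer round-trips the text through UTF-8. If the file must stay in ISO-8859-1, one option (my sketch, not from the answer above, reusing blobClientDestination and line) is to encode the text yourself and upload the raw bytes, so no SDK-level text handling can re-encode them:

// Encode explicitly; blob storage then stores the bytes verbatim
var latin1 = Encoding.GetEncoding("ISO-8859-1");
byte[] bytes = latin1.GetBytes(line);
using (var ms = new MemoryStream(bytes))
{
    await blobClientDestination.UploadAsync(ms, overwrite: true);
}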
The following code creates a zip file from S3 objects by pulling them into memory and writing the final product to a file on disk. However, it has been observed to corrupt a few files (out of thousands) while creating the zip. I've checked: there is nothing wrong with the files that got corrupted during the process, because the same files get zipped properly by other means. Any suggestions to fine-tune the code?
Code:
public static async Task S3ToZip(List<string> pdfBatch, string zipPath, IAmazonS3 s3Client)
{
    FileStream fileStream = new FileStream(zipPath, FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.ReadWrite);
    using (ZipArchive archive = new ZipArchive(fileStream, ZipArchiveMode.Update, true))
    {
        foreach (var file in pdfBatch)
        {
            GetObjectRequest request = new GetObjectRequest
            {
                BucketName = "sample-bucket",
                Key = file
            };
            using GetObjectResponse response = await s3Client.GetObjectAsync(request);
            using Stream responseStream = response.ResponseStream;
            ZipArchiveEntry zipFileEntry = archive.CreateEntry(file.Split('/')[^1]);
            using Stream zipEntryStream = zipFileEntry.Open();
            await responseStream.CopyToAsync(zipEntryStream);
            zipEntryStream.Seek(0, SeekOrigin.Begin);
            zipEntryStream.CopyTo(fileStream);
        }
        archive.Dispose();
        fileStream.Close();
    }
}
Don't call Dispose() or Close() explicitly; let using do all the work. And you don't need to write anything to fileStream: writing to the ZipArchiveEntry stream does it under the hood. You also need to use FileMode.Create to guarantee that your file is always truncated before being written to. Also, as you are only creating an archive, not updating it, you should use ZipArchiveMode.Create to enable memory-efficient streaming (thanks to @canton7 for some deep diving into the details of the zip archive format).
public static async Task S3ToZip(List<string> pdfBatch, string zipPath, IAmazonS3 s3Client)
{
    using FileStream fileStream = new FileStream(zipPath, FileMode.Create, FileAccess.ReadWrite, FileShare.ReadWrite);
    using ZipArchive archive = new ZipArchive(fileStream, ZipArchiveMode.Create, true);
    foreach (var file in pdfBatch)
    {
        GetObjectRequest request = new GetObjectRequest
        {
            BucketName = "sample-bucket",
            Key = file
        };
        using GetObjectResponse response = await s3Client.GetObjectAsync(request);
        using Stream responseStream = response.ResponseStream;
        ZipArchiveEntry zipFileEntry = archive.CreateEntry(file.Split('/')[^1]);
        using Stream zipEntryStream = zipFileEntry.Open();
        await responseStream.CopyToAsync(zipEntryStream);
    }
}
I am trying to take a file, split it into pieces, and then push each new smaller file piece to Azure. I have tried writing a MemoryStream to Azure, but that causes the file to upload immediately, and the file is basically empty. I have tried using a BufferedStream, which allows the data to be sent as I am writing to it, but I am not sure how to end the stream. I have tried closing each of the different streams I am using, but that does not work, as it results in a stream-closed exception. Any idea how to mark the stream as complete so the Azure library will know to finish the file upload?
It does work to wait until the full file is built and then upload the memory stream, but I would like to be able to write to it while it is uploading if possible.
CloudBlobClient blobClient = StorageAccount.CreateCloudBlobClient();
CloudBlobContainer blobContainer = blobClient.GetContainerReference("containerName");
using (FileStream fileStream = File.Open(path, FileMode.Open))
{
    int key = 0;
    CsvWriter csvWriter = null;
    MemoryStream memoryStream = null;
    BufferedStream bufferedStream = null;
    StreamWriter streamWriter = null;
    Task uploadTask = null;
    using (var reader = new StreamReader(fileStream))
    using (var csv = new CsvReader(reader, CultureInfo.InvariantCulture))
    {
        csv.Read();
        csv.ReadHeader();
        await foreach (MyModel row in csv.GetRecordsAsync<MyModel>())
        {
            if (row.KeyColumn != key)
            {
                if (memoryStream != null)
                {
                    //Wait for the current upload to finish
                    await csvWriter.FlushAsync();
                    csvWriter.Dispose();
                    await uploadTask;
                }
                //Start a new upload
                key = row.KeyColumn;
                memoryStream = new MemoryStream();
                bufferedStream = new BufferedStream(memoryStream);
                streamWriter = new StreamWriter(bufferedStream);
                csvWriter = new CsvWriter(streamWriter, CultureInfo.InvariantCulture);
                csvWriter.WriteHeader<MyModel>();
                await csvWriter.FlushAsync();
                CloudBlockBlob blockBlob = blobContainer.GetBlockBlobReference($"file_{key}.csv");
                uploadTask = blockBlob.UploadFromStreamAsync(bufferedStream);
            }
            csvWriter.WriteRecord(row);
            await csvWriter.FlushAsync();
        }
        if (memoryStream != null)
        {
            await csvWriter.FlushAsync();
            csvWriter.Dispose();
            await uploadTask;
        }
    }
}
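No answer is shown for this one, but one way to get "write while uploading" with the WindowsAzure.Storage SDK is CloudBlockBlob.OpenWriteAsync, which returns a writable CloudBlobStream; committing (or disposing) that stream is what tells the library the upload is finished. A rough, untested sketch of the per-key section under that assumption:

CloudBlockBlob blockBlob = blobContainer.GetBlockBlobReference($"file_{key}.csv");
// OpenWriteAsync returns a writable CloudBlobStream; data is uploaded as you write
CloudBlobStream blobStream = await blockBlob.OpenWriteAsync();
var streamWriter = new StreamWriter(blobStream);
var csvWriter = new CsvWriter(streamWriter, CultureInfo.InvariantCulture);
csvWriter.WriteHeader<MyModel>();
// ... write the records for this key as they arrive ...
await csvWriter.FlushAsync();
await streamWriter.FlushAsync();
// This is what marks the stream as complete: commit finalizes the blob
await blobStream.CommitAsync();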
I have an ASP.NET Core application that needs to send a stream (the stream is posted by the client) to Microsoft Cognitive Services to get an ID, and then send the same stream to Azure Blob Storage for backup; the file name should be the ID received from Cognitive Services.
But it seems the MemoryStream ms is closed after being used by faceServiceClient: an error occurred at the second "ms.Position = 0" statement saying "Cannot access a closed stream".
public static async Task CreatPerson(string _key, HttpRequest _req)
{
    var faceServiceClient = new FaceServiceClient(_key);
    using (MemoryStream ms = new MemoryStream())
    {
        _req.Body.CopyTo(ms);
        ms.Position = 0;
        var facesTask = faceServiceClient.AddFaceToFaceListAsync("himlens", ms);
        //init azure blob
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(AZURE_STORE_CONN_STR);
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
        CloudBlobContainer container = blobClient.GetContainerReference("xxx");
        var faces = await facesTask;
        var blob = container.GetBlockBlobReference(faces.PersistedFaceId.ToString());
        ms.Position = 0;//Error Here
        await blob.UploadFromStreamAsync(ms);
    }
}
I'm confused about this. Can anybody help me solve it?
Thanks!
ms.Position = 0;//Error Here
To easily fix it, you could create a new instance of MemoryStream and copy the value from ms, then upload that to your blob storage. The code below is for your reference.
using (MemoryStream ms = new MemoryStream())
{
    _req.Body.CopyTo(ms);
    ms.Position = 0;
    //new code which I added: copy the request body into a second stream
    MemoryStream ms2 = new MemoryStream();
    ms.CopyTo(ms2);
    ms.Position = 0;
    var facesTask = faceServiceClient.AddFaceToFaceListAsync("himlens", ms);
    //init azure blob
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(AZURE_STORE_CONN_STR);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference("xxx");
    var faces = await facesTask;
    var blob = container.GetBlockBlobReference(faces.PersistedFaceId.ToString());
    //Code which I modified: upload the copy, since ms was closed by the SDK
    ms2.Position = 0;
    await blob.UploadFromStreamAsync(ms2);
}
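For reference, the underlying issue is presumably that AddFaceToFaceListAsync closes the stream it consumes. An equivalent, slightly shorter workaround (my sketch, not from the answer above) is to snapshot the bytes once and hand each consumer its own MemoryStream:

using (MemoryStream ms = new MemoryStream())
{
    _req.Body.CopyTo(ms);
    byte[] bytes = ms.ToArray(); // snapshot the request body
    // Each consumer gets its own stream, so neither can close the other's
    var faces = await faceServiceClient.AddFaceToFaceListAsync("himlens", new MemoryStream(bytes));
    var blob = container.GetBlockBlobReference(faces.PersistedFaceId.ToString());
    await blob.UploadFromStreamAsync(new MemoryStream(bytes));
}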
I am trying to save an image from the web to local storage to be manipulated later, but it appears to be corrupt, and attempting to open it with an external application fails. Opening the image in the web browser works completely normally. Thanks for any help.
var client = new HttpClient();
var clientResponse = await client.GetByteArrayAsync(imageUri);
var temp = ApplicationData.Current.TemporaryFolder;
StorageFile file;
if ((await temp.GetFilesAsync()).Any(f => f.Name == "temp_image.png")) {
    file = await temp.GetFileAsync("temp_image.png");
} else {
    file = await temp.CreateFileAsync("temp_image.png");
}
using (var fs = await file.OpenReadAsync())
using (var writer = new DataWriter(fs)) {
    writer.WriteBytes(clientResponse);
}
You have to call StoreAsync to flush the DataWriter's buffer, and the file needs to be opened for writing (OpenReadAsync returns a read-only stream):
using (var fs = await file.OpenAsync(FileAccessMode.ReadWrite))
using (var writer = new DataWriter(fs)) {
    writer.WriteBytes(clientResponse);
    await writer.StoreAsync();
}
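As a side note (not part of the original answer), for a one-shot write like this, Windows.Storage.FileIO does the open/write/flush in a single call:

// Simpler alternative: write the downloaded bytes in one call
await FileIO.WriteBytesAsync(file, clientResponse);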