I am trying to take a file, split it into pieces, and then push each new smaller file piece to Azure. I have tried writing a MemoryStream to Azure, but that causes the upload to start immediately, while the stream is still basically empty. I have tried using a BufferedStream, which allows the data to be sent as I am writing to it, but I am not sure how to end the stream. Closing each of the different streams I am using does not work either, as it results in a stream-closed exception. Any idea how to mark the stream as complete so the Azure library will know to finish the file upload?
It does work to wait until the full file is built and then upload the MemoryStream, but I would like to be able to write to it while it is uploading if possible.
CloudBlobClient blobClient = StorageAccount.CreateCloudBlobClient();
CloudBlobContainer blobContainer = blobClient.GetContainerReference("containerName");
using (FileStream fileStream = File.Open(path, FileMode.Open))
{
    int key = 0;
    CsvWriter csvWriter = null;
    MemoryStream memoryStream = null;
    BufferedStream bufferedStream = null;
    StreamWriter streamWriter = null;
    Task uploadTask = null;
    using (var reader = new StreamReader(fileStream))
    using (var csv = new CsvReader(reader, CultureInfo.InvariantCulture))
    {
        csv.Read();
        csv.ReadHeader();
        await foreach (MyModel row in csv.GetRecordsAsync<MyModel>())
        {
            if (row.KeyColumn != key)
            {
                if (memoryStream != null)
                {
                    // Wait for the current upload to finish
                    await csvWriter.FlushAsync();
                    csvWriter.Dispose();
                    await uploadTask;
                }
                // Start a new upload
                key = row.KeyColumn;
                memoryStream = new MemoryStream();
                bufferedStream = new BufferedStream(memoryStream);
                streamWriter = new StreamWriter(bufferedStream);
                csvWriter = new CsvWriter(streamWriter, CultureInfo.InvariantCulture);
                csvWriter.WriteHeader<MyModel>();
                await csvWriter.NextRecordAsync();
                await csvWriter.FlushAsync();
                CloudBlockBlob blockBlob = blobContainer.GetBlockBlobReference($"file_{key}.csv");
                uploadTask = blockBlob.UploadFromStreamAsync(bufferedStream);
            }
            csvWriter.WriteRecord(row);
            await csvWriter.NextRecordAsync();
            await csvWriter.FlushAsync();
        }
        if (memoryStream != null)
        {
            await csvWriter.FlushAsync();
            csvWriter.Dispose();
            await uploadTask;
        }
    }
}
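One way to get that "stream is complete" signal is the newer Azure.Storage.Blobs package instead of the legacy SDK used above: BlockBlobClient.OpenWriteAsync returns a writable stream that uploads blocks as you write and commits the blob when the stream is disposed. Below is a minimal sketch of one chunk, assuming the same MyModel/CsvHelper setup; the connection string and names are placeholders, and key comes from the surrounding loop.
using System.Globalization;
using System.IO;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Specialized;
using CsvHelper;

var containerClient = new BlobContainerClient("<connection-string>", "containerName");
BlockBlobClient blockBlob = containerClient.GetBlockBlobClient($"file_{key}.csv");

// The stream returned by OpenWriteAsync stages blocks as data is written;
// disposing it commits the block list, which is what marks the blob complete.
using (Stream blobStream = await blockBlob.OpenWriteAsync(overwrite: true))
using (var streamWriter = new StreamWriter(blobStream))
using (var csvWriter = new CsvWriter(streamWriter, CultureInfo.InvariantCulture))
{
    csvWriter.WriteHeader<MyModel>();
    await csvWriter.NextRecordAsync();
    // ... write the records for this key ...
    await csvWriter.FlushAsync();
}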
I created the functionality to get documents from blob storage and then add them to a zip file for download.
[HttpPost]
public FileContentResult DownloadDocumentsByDocIDZIP(List<int> documentIDs)
{
    List<Document> docs = new List<Document>();
    foreach (int doc in documentIDs)
    {
        if (doc != 0)
        {
            Document document = documentService.GetDocumentByID(doc, false);
            docs.Add(document);
        }
    }
    MemoryStream outms = new MemoryStream();
    using (ZipArchive zar = new ZipArchive(outms, ZipArchiveMode.Create, false))
    {
        foreach (Document docu in docs)
        {
            if (docu != null)
            {
                byte[] documentdata = documentService.DownloadDocumentData(docu.DocumentID);
                string name = docu.DocumentNiceName ?? docu.DocumentFileName;
                byte[] unzipped = documentdata;
                ZipArchiveEntry entry = zar.CreateEntry(name);
                Stream str = entry.Open();
                MemoryStream ms = new MemoryStream(unzipped);
                ms.CopyTo(str);
            }
        }
        outms.Seek(0, SeekOrigin.Begin);
    }
    var outdata = outms.ToArray();
    var result = File(outdata, "application/zip", "documents.zip");
    return result;
}
When I hit the function via AJAX, it fails at
ZipArchiveEntry entry = zar.CreateEntry(name);
and I'm given the exception:
System.IO.IOException: 'Entries cannot be created while previously created entries are still open.'
So I added str.Close():
using (ZipArchive zar = new ZipArchive(outms, ZipArchiveMode.Create, false))
{
    foreach (Document docu in docs)
    {
        if (docu != null)
        {
            byte[] documentdata = documentService.DownloadDocumentData(docu.DocumentID);
            string name = docu.DocumentNiceName ?? docu.DocumentFileName;
            byte[] unzipped = documentdata;
            ZipArchiveEntry entry = zar.CreateEntry(name);
            Stream str = entry.Open();
            MemoryStream ms = new MemoryStream(unzipped);
            ms.CopyTo(str);
            str.Close();
        }
    }
    outms.Seek(0, SeekOrigin.Begin);
}
var outdata = outms.ToArray();
var result = File(outdata, "application/zip", "documents.zip");
return result;
Now it creates the file, but when you try to unzip it after download, WinZip gives an error: unable to seek to beginning of Central Directory.
Can someone please assist? I have no idea what I'm doing wrong.
You have to dispose the entry stream before adding a new entry to the zip, but the real problem is that you call Seek on the stream while the archive is still open: ZipArchive writes its central directory at the current stream position when it is disposed, so rewinding the stream first makes the directory overwrite the start of the archive. Try the following code:
using (ZipArchive zar = new ZipArchive(outms, ZipArchiveMode.Create, false))
{
    foreach (Document docu in docs)
    {
        if (docu != null)
        {
            byte[] documentdata = documentService.DownloadDocumentData(docu.DocumentID);
            string name = docu.DocumentNiceName ?? docu.DocumentFileName;
            byte[] unzipped = documentdata;
            ZipArchiveEntry entry = zar.CreateEntry(name);
            using (Stream str = entry.Open())
            {
                str.Write(unzipped, 0, unzipped.Length);
            }
        }
    }
    //outms.Seek(0, SeekOrigin.Begin); // This causes "Error: unable to seek to beginning of Central Directory."
}
var outdata = outms.ToArray();
var result = File(outdata, "application/zip", "documents.zip");
return result;
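As a side note, outms.ToArray() copies the whole archive into a second buffer. Once the ZipArchive using block has closed, a hypothetical variant can hand the MemoryStream to the framework directly; this assumes MVC's File(Stream, ...) helper, which returns a FileStreamResult, so the action's return type would need to widen to FileResult.
// Safe only after the ZipArchive has been disposed, because disposing
// the archive is what writes the central directory into outms.
outms.Seek(0, SeekOrigin.Begin);
return File(outms, "application/zip", "documents.zip"); // FileStreamResult disposes outms when the response completes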
I'm uploading files to Azure Blob Storage with the .NET package, specifying the encoding ISO-8859-1. The stream looks fine in memory, but when I upload it to blob storage the file ends up with corrupted characters that seemingly could not be converted to that encoding. It is as if the file gets stored in a corrupted state, and when I download it again and check it, the characters are all messed up. Here is the code I'm using:
public static async Task<bool> UploadFileFromStream(this CloudStorageAccount account, string containerName, string destBlobPath, string fileName, Stream stream, Encoding encoding)
{
    if (account is null) throw new ArgumentNullException(nameof(account));
    if (string.IsNullOrEmpty(containerName)) throw new ArgumentException("message", nameof(containerName));
    if (string.IsNullOrEmpty(destBlobPath)) throw new ArgumentException("message", nameof(destBlobPath));
    if (stream is null) throw new ArgumentNullException(nameof(stream));

    stream.Position = 0;
    CloudBlockBlob blob = GetBlob(account, containerName, $"{destBlobPath}/{fileName}");
    blob.Properties.ContentType = FileUtils.GetFileContentType(fileName);

    using var reader = new StreamReader(stream, encoding);
    var ct = await reader.ReadToEndAsync();

    await blob.UploadTextAsync(ct, encoding ?? Encoding.UTF8, AccessCondition.GenerateEmptyCondition(), new BlobRequestOptions(), new OperationContext());
    return true;
}
This is the file just before uploading it:
<provinciaDatosInmueble>Sevilla</provinciaDatosInmueble>
<inePoblacionDatosInmueble>969</inePoblacionDatosInmueble>
<poblacionDatosInmueble>Valencina de la Concepción</poblacionDatosInmueble>
and this is the file after the upload:
<provinciaDatosInmueble>Sevilla</provinciaDatosInmueble>
<inePoblacionDatosInmueble>969</inePoblacionDatosInmueble>
<poblacionDatosInmueble>Valencina de la Concepci�n</poblacionDatosInmueble>
The encoding I pass in the parameter is ISO-8859-1. Does anybody know why Blob Storage seems to ignore the encoding I'm specifying? Thanks in advance!
We were able to achieve this using Azure.Storage.Blobs instead of WindowsAzure.Storage, which is a legacy Storage SDK. Below is the code that worked for us.
class Program
{
    static async Task Main(string[] args)
    {
        string sourceContainerName = "<Source_Container_Name>";
        string destBlobPath = "<Destination_Path>";
        string fileName = "<Source_File_name>";

        BlobServiceClient blobServiceClient = new BlobServiceClient("<Your_Connection_String>");
        BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(sourceContainerName);
        BlobClient blobClientSource = containerClient.GetBlobClient(fileName);
        BlobClient blobClientDestination = containerClient.GetBlobClient(destBlobPath);

        // Reading from the source blob
        var line = " ";
        if (await blobClientSource.ExistsAsync())
        {
            var response = await blobClientSource.DownloadAsync();
            using (StreamReader streamReader = new StreamReader(response.Value.Content))
            {
                line = await streamReader.ReadToEndAsync();
            }
        }

        // Writing to the destination blob
        var content = Encoding.UTF8.GetBytes(line);
        using (var ms = new MemoryStream(content))
            await blobClientDestination.UploadAsync(ms);
    }
}
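Note that this answer re-encodes the text as UTF-8. If the file must stay in ISO-8859-1 byte-for-byte, a sketch of an alternative (still assuming Azure.Storage.Blobs; the connection string, container, and blob names are placeholders, and stream is the original source stream) is to upload the raw bytes untouched and record the charset in the blob's content type so downloaders don't assume UTF-8:
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

var blobClient = new BlobClient("<Your_Connection_String>", "<Container_Name>", "<Blob_Name>");

// Upload the original bytes without re-encoding and declare the charset,
// so the stored file is identical to the source.
stream.Position = 0; // assumes a seekable source stream
await blobClient.UploadAsync(stream, new BlobUploadOptions
{
    HttpHeaders = new BlobHttpHeaders { ContentType = "application/xml; charset=iso-8859-1" }
});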
I'm having a strange problem with this piece of code, which basically zips files (docs) and uploads them to blob storage.
v11 SDK:
var blockBlobClient = new BlockBlobClient(ConnectionString, ContainerName, "test-blob.zip");

// Saved zip is valid:
// using (FileStream zipStream = new FileStream(@"C:\Users\artur\Desktop\test-local.zip", FileMode.OpenOrCreate))
// Uploaded zip is invalid:
using (var stream = await blockBlobClient.OpenWriteAsync(true))
using (var archive = new ZipArchive(stream, ZipArchiveMode.Create))
{
    var readmeEntry = archive.CreateEntry("Readme.txt");
    using (StreamWriter writer = new StreamWriter(readmeEntry.Open()))
    {
        writer.WriteLine("Information about this package.");
        writer.WriteLine("========================");
    }
    await stream.FlushAsync();
}
v12 SDK:
var blobClient = new BlobClient(ConnectionString, InputContainerName, "test-blob.zip");

using var stream = new MemoryStream();
using (var archive = new ZipArchive(stream, ZipArchiveMode.Create))
{
    var readmeEntry = archive.CreateEntry("Readme.txt");
    using StreamWriter writer = new StreamWriter(readmeEntry.Open());
    {
        writer.WriteLine("Information about this package.");
        writer.WriteLine("========================");
        await writer.FlushAsync();
    }
    stream.Position = 0;
    await blobClient.UploadAsync(stream, true);
    await stream.FlushAsync();
}
Saving the zip file locally produces a valid zip (164 bytes). Saving the zip to blob storage (using the storage emulator) produces an invalid zip (102 bytes). I can't figure out why.
Here is the correct code.
The problem was premature disposal of the inner stream by ZipArchive. Note that in my code below I have passed leaveOpen as true while creating the ZipArchive, since we are already disposing the stream in the outer using. Also, for the v11 code I have switched to a MemoryStream instead of the blob's OpenWrite stream, since OpenWrite gave no way to set the stream position back to 0. And you don't need any Flush :)
v11 SDK:
var blockBlobClient = new BlockBlobClient(ConnectionString, ContainerName, "test-blob.zip");

using var stream = new MemoryStream();
using (var archive = new ZipArchive(stream, ZipArchiveMode.Create, leaveOpen: true))
{
    var readmeEntry = archive.CreateEntry("Readme.txt");
    using (StreamWriter writer = new StreamWriter(readmeEntry.Open()))
    {
        writer.WriteLine("Information about this package.");
        writer.WriteLine("========================");
    }
}
stream.Position = 0;
await blockBlobClient.UploadAsync(stream);
v12 SDK:
var blobClient = new BlobClient(ConnectionString, InputContainerName, "test-blob.zip");

using var stream = new MemoryStream();
using (var archive = new ZipArchive(stream, ZipArchiveMode.Create, leaveOpen: true))
{
    var readmeEntry = archive.CreateEntry("Readme.txt");
    using (StreamWriter writer = new StreamWriter(readmeEntry.Open()))
    {
        writer.WriteLine("Information about this package.");
        writer.WriteLine("========================");
    }
}
stream.Position = 0;
await blobClient.UploadAsync(stream, true);
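To sanity-check the uploaded archive, a small follow-up sketch (assuming the same blobClient as above) downloads the blob and enumerates the zip entries; an invalid central directory would throw here:
// Hypothetical verification step: download the blob and list its entries.
using var downloaded = new MemoryStream();
await blobClient.DownloadToAsync(downloaded);
downloaded.Position = 0;
using (var archive = new ZipArchive(downloaded, ZipArchiveMode.Read))
{
    foreach (var entry in archive.Entries)
        Console.WriteLine($"{entry.FullName} ({entry.Length} bytes)");
}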
I'm rewriting my C# code to use Azure Blob Storage instead of the filesystem. So far there have been no problems rewriting the code for normal file operations, but I have some code that uses an async write from a stream:
using (var stream = await Request.Content.ReadAsStreamAsync())
{
    FileStream fileStream = new FileStream(@"c:\test.txt", FileMode.Create, FileAccess.Write, FileShare.None);
    await stream.CopyToAsync(fileStream).ContinueWith(
        (copyTask) =>
        {
            fileStream.Close();
        });
}
I need to change the above to use Azure CloudBlockBlob or CloudBlobStream, but I can't find a way to declare a stream object that CopyToAsync can write to.
You would want to use the UploadFromStreamAsync method on CloudBlockBlob. Here's some sample code to do so (I have not tried running this code, though):
var cred = new StorageCredentials(accountName, accountKey);
var account = new CloudStorageAccount(cred, true);
var blobClient = account.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("container-name");
var blob = container.GetBlockBlobReference("blob-name");

using (var stream = await Request.Content.ReadAsStreamAsync())
{
    stream.Position = 0;
    await blob.UploadFromStreamAsync(stream);
}
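If you specifically want a writable destination stream for CopyToAsync, CloudBlockBlob also exposes OpenWriteAsync, which returns a CloudBlobStream. An untested sketch, reusing the blob variable from the sample above:
using (var stream = await Request.Content.ReadAsStreamAsync())
using (var blobStream = await blob.OpenWriteAsync())
{
    // CloudBlobStream is a writable target for CopyToAsync;
    // CommitAsync (or Dispose) finalizes the blob in storage.
    await stream.CopyToAsync(blobStream);
    await blobStream.CommitAsync();
}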
I read data from a printer like this:
using (Stream stream = client.GetStream())
{
    using (MemoryStream ms = new MemoryStream())
    {
        stream.CopyTo(ms);
        client.Close();
        byte[] result = ms.ToArray();
        ...
        using (var memoryStream = new MemoryStream(result))
        {
            using (var package = Package.Open(memoryStream, FileMode.Open))
            {
                var packageUri = new Uri("memorystream://printstream");
                PackageStore.AddPackage(packageUri, package);
                var xpsDocument = new XpsDocument(package, CompressionOption.Fast, packageUri.OriginalString);
                return xpsDocument.GetFixedDocumentSequence(); // NULL
I also tried saving the stream directly to a file and then reading it with File.ReadAllBytes(filePath), but the result is the same.
After I save the stream to a file, I can open it via XpsViewer without any problem.
Strange, but it's working. I had to use XpsConverter.exe and convert the .xps file to .xps (i.e., re-write it through the converter). Here is @AXMIM's solution.