Write Zip file to AWS as a stream - c#

Using C#, I would like to create a zip file in AWS S3, add file entries to it, then close the stream. System.IO.Compression.ZipArchive can be created from a System.IO.Stream. Is it possible to get a writeable stream into an S3 bucket? I am using the .NET SDK for S3.

An object uploaded to S3 with a single PUT must have a known size when the request is made. Since the size of the zip file won't be known until the stream is closed, you can't do exactly what you are asking. You would have to create the zip file locally and then upload it to S3.
The closest you can get is S3's multipart upload. Use a MemoryStream as the underlying stream for the ZipArchive, and each time you add a file to the archive, check whether the MemoryStream has grown past 5 MB (the minimum size for every part except the last). If it has, take the buffered bytes from the MemoryStream and upload them as a new part to S3. Then clear the MemoryStream and continue adding files to the zip archive.
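A minimal sketch of that approach against SDK v3's async API might look like the following. The bucket, key, and input folder are placeholders, and one caveat is not handled here: ZipArchive over a seekable MemoryStream may seek backwards to patch entry headers, so production code usually wraps the buffer in a forward-only stream wrapper. Treat this as an outline of the part-flushing idea, not a drop-in implementation.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;

const long PartSize = 5 * 1024 * 1024; // S3 minimum for every part except the last

var client = new AmazonS3Client();
var init = await client.InitiateMultipartUploadAsync(new InitiateMultipartUploadRequest
{
    BucketName = "my-bucket",   // placeholder
    Key = "archive.zip"         // placeholder
});

var parts = new List<PartETag>();
int partNumber = 1;
var buffer = new MemoryStream();

using (var zip = new ZipArchive(buffer, ZipArchiveMode.Create, leaveOpen: true))
{
    foreach (string path in Directory.EnumerateFiles(@"C:\input")) // placeholder
    {
        var entry = zip.CreateEntry(Path.GetFileName(path));
        using (var entryStream = entry.Open())
        using (var file = File.OpenRead(path))
            file.CopyTo(entryStream);

        if (buffer.Length >= PartSize)
            await UploadBufferedPartAsync(); // flush a full part, then keep zipping
    }
} // disposing the ZipArchive writes the central directory into the buffer

await UploadBufferedPartAsync(); // last part may legally be under 5 MB

await client.CompleteMultipartUploadAsync(new CompleteMultipartUploadRequest
{
    BucketName = "my-bucket",
    Key = "archive.zip",
    UploadId = init.UploadId,
    PartETags = parts
});

async Task UploadBufferedPartAsync()
{
    if (buffer.Length == 0) return;
    buffer.Position = 0;
    var response = await client.UploadPartAsync(new UploadPartRequest
    {
        BucketName = "my-bucket",
        Key = "archive.zip",
        UploadId = init.UploadId,
        PartNumber = partNumber,
        InputStream = buffer,
        PartSize = buffer.Length
    });
    parts.Add(new PartETag(partNumber, response.ETag));
    partNumber++;
    buffer.SetLength(0); // reuse the buffer for the next part
}
```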

You'll probably want to take a look at this answer here for an existing discussion around this.
This doc page seems to suggest that there is an Upload method that can take a stream (with S3 taking care of re-assembling the multi-part upload). Although that page is for version 1 of the SDK, so it might not be available in version 3.
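In version 3 the equivalent lives in Amazon.S3.Transfer; a short sketch, with path and bucket names as placeholders (note this still requires the zip to be finished first, since TransferUtility reads the whole stream):

```csharp
using System.IO;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Transfer;

// TransferUtility splits large streams into a multipart upload internally.
var transfer = new TransferUtility(new AmazonS3Client());
using (var zipStream = File.OpenRead(@"C:\temp\archive.zip")) // placeholder path
{
    await transfer.UploadAsync(zipStream, "my-bucket", "archive.zip");
}
```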

Related

ZipArchive Read From Unseekable Stream Without Buffering to Memory

Is there a way to read a zip file from a network stream without buffering it entirely in memory? I'd like to avoid downloading the entire file before starting to process its contents, to save on processing time.
I'm using .Net Core 3.1
The ZipArchive class as shown here will buffer the stream into memory if it's not seekable. So the only way to avoid buffering large zip files into memory is to first download the file to the local file system and open a FileStream, which is seekable.
This is because the central directory of the zip, the part of the file that lists all the contents and their locations, is located at the end of the file. So the class needs to jump around between different parts of the zip to extract its contents.
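Under that constraint, the download-first approach sketched below is about the best you can do. The URL is a placeholder; the point is that the temp-file FileStream is seekable, so ZipArchive can jump to the central directory without buffering the archive in memory.

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Net.Http;
using System.Threading.Tasks;

// Copy the (unseekable) network stream to a temp file, then open that.
string tempPath = Path.GetTempFileName();
using (var http = new HttpClient())
using (var network = await http.GetStreamAsync("https://example.com/archive.zip")) // placeholder
using (var file = File.Create(tempPath))
    await network.CopyToAsync(file);

using (var zip = ZipFile.OpenRead(tempPath)) // FileStream underneath: seekable
{
    foreach (var entry in zip.Entries)
        Console.WriteLine($"{entry.FullName} ({entry.Length} bytes)");
}
File.Delete(tempPath);
```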

Generate and stream Zip archive without storing it in the filesystem or reading into memory first

How can I asynchronously take multiple existing streams (from the db), add them to a zip archive stream and return it in asp.net web api 2?
The key difference with the other "duplicate" question is how to do this in a streaming fashion without writing it to a temp file or buffer it completely in memory first.
It looks like you can't do this directly
Writing to ZipArchive using the HttpContext OutputStream
The HTTP response stream needs to support seeking for a zip to be written directly to it, which it doesn't. You will need to write to a temporary file by the looks of it.
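A sketch of that temp-file workaround: build the zip over a seekable FileStream, then copy the finished bytes to the non-seekable output. Here `outputStream` stands in for the HTTP response stream, which is an assumption about how you'd wire it into your controller.

```csharp
using System.IO;
using System.IO.Compression;

// Build the zip in a temp file (seekable), then copy it to the output stream.
void WriteZipTo(Stream outputStream, string[] sourceFiles)
{
    string tempPath = Path.GetTempFileName();
    try
    {
        using (var temp = File.Create(tempPath))
        using (var zip = new ZipArchive(temp, ZipArchiveMode.Create))
        {
            foreach (var path in sourceFiles)
                zip.CreateEntryFromFile(path, Path.GetFileName(path));
        }
        using (var temp = File.OpenRead(tempPath))
            temp.CopyTo(outputStream); // only forward writes; no seeking needed
    }
    finally
    {
        File.Delete(tempPath);
    }
}
```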

Buffering stream of byte array

I am using DropNet library to download files from Dropbox.
public Stream GetFileStream(string path)
{
return new MemoryStream(dropboxClient.GetFile(path));
}
I am facing a problem downloading large files, because the DropNet library returns a byte array, which I then convert to a stream (for other logic) using a MemoryStream. That is not good, because I have to download the whole file into server memory before completing my logic.
I am trying to find a way to receive those files as a stream instead.
I looked at the BufferedStream class, but creating a new BufferedStream requires a stream first. I can't figure out the best solution for my problem.
The DropNet API does not expose a streaming way to retrieve files. You must wait for the entire file to be downloaded before you can use it. If you want to be able to read the stream as it comes in, you will need to use a different library, modify an existing one, or write your own.
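If you do go the write-your-own route, the usual HttpClient pattern is to ask for the response headers only and read the body lazily. This sketch omits all the Dropbox specifics (URL, auth headers) and just shows the streaming pattern:

```csharp
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

public static class StreamingDownloader
{
    // Returns a stream whose body is pulled from the network as you read it,
    // instead of buffering the whole file into a byte[] first.
    public static async Task<Stream> GetFileStreamAsync(HttpClient http, string url)
    {
        var response = await http.GetAsync(url, HttpCompletionOption.ResponseHeadersRead);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStreamAsync();
    }
}
```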

Cannot read zip file from HttpInputStream using DotNetZip 1.9

I am trying to use DotNetZip 1.9 to read an uploaded zip file in Asp.Net MVC 3.
I already verified that the HttpPostedFileBase object I receive is fine. I can save it to disk and then unzip it. However, saving to disk first seemed wasteful since I should be able to unzip from memory directly.
From MSDN, the HttpPostedFileBase.InputStream Property "gets a Stream object that points to an uploaded file to prepare for reading the contents of the file".
According to the DotNetZip references, ZipFile.Read() can accept a Stream object. So I tried it, and DotNetZip throws a BadReadException. I have attached screenshots showing the problem.
(Screenshot: the problem unzipping from the HttpInputStream.)
(Screenshot: the value of the InputStream; its Length matches that of the uploaded zip file.)
Help anyone? Thx
I suspect that the ZipFile.IsZipFile method call has advanced your InputStream's position, so when you try to read the stream later it no longer looks like a valid zip file. Try sticking a
fileData.InputStream.Position = 0;
just after verifying that the stream is a valid zip file, and just before the using clause in which you attempt to read it. This will reset the stream position to the beginning.
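In context, the suggested fix would look roughly like this; `fileData` is the HttpPostedFileBase from the controller action, and the extraction target path is a placeholder.

```csharp
using System.IO;
using System.Web;
using Ionic.Zip; // DotNetZip

Stream input = fileData.InputStream;
if (ZipFile.IsZipFile(input, testExtract: false)) // this read moves the position
{
    input.Position = 0; // rewind before the real read
    using (ZipFile zip = ZipFile.Read(input))
    {
        foreach (ZipEntry entry in zip)
            entry.Extract(@"C:\target", ExtractExistingFileAction.OverwriteSilently);
    }
}
```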

Appending bytes using Amazon S3 .Net SDK

I have the following piece of code which works great for a simple file upload. But let's say I wanted to append to an existing file or simply upload random chunks of bytes, like the first and last 10 bytes? Is this even possible with the official SDK?
PutObjectRequest request = new PutObjectRequest();
FileStream fs = new FileStream(@"C:\myFolder\MyFile.bin", FileMode.Open);
request.WithInputStream(fs);
request.WithBucketName(bucketName);
request.WithKey(keyName);
client.PutObject(request);
fs.Close();
There is no way to append data to existing objects in S3. You have to overwrite the entire file.
Although, in saying that, it is possible to a degree with Amazon's multipart ("large file") upload support. With this, uploads are broken into chunks and reassembled on S3. But you have to do it as part of a single transfer, and it is only for large files.
This previous answer no longer appears to be the case. You can now manage an append-like process by using an existing object as the initial part of a multipart upload, then deleting the previous object once the transfer is done.
See:
http://docs.aws.amazon.com/AmazonS3/latest/dev/CopyingObjctsUsingLLNetMPUapi.html
http://docs.aws.amazon.com/AmazonS3/latest/API/mpUploadUploadPartCopy.html
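A sketch of that append-like flow with the .NET SDK's UploadPartCopy support. All bucket, key, and file names are placeholders, and the existing object must be at least 5 MB since it becomes a non-final part:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;

var client = new AmazonS3Client();
var init = await client.InitiateMultipartUploadAsync(new InitiateMultipartUploadRequest
{
    BucketName = "my-bucket",
    Key = "file.bin" // the new, "appended" object
});

// Part 1: copy the existing object server-side, without re-uploading its bytes.
var copy = await client.CopyPartAsync(new CopyPartRequest
{
    SourceBucket = "my-bucket",
    SourceKey = "file.bin.old", // existing object; must be >= 5 MB
    DestinationBucket = "my-bucket",
    DestinationKey = "file.bin",
    UploadId = init.UploadId,
    PartNumber = 1
});

// Part 2: the new bytes to append.
var append = await client.UploadPartAsync(new UploadPartRequest
{
    BucketName = "my-bucket",
    Key = "file.bin",
    UploadId = init.UploadId,
    PartNumber = 2,
    FilePath = @"C:\myFolder\NewBytes.bin" // placeholder
});

await client.CompleteMultipartUploadAsync(new CompleteMultipartUploadRequest
{
    BucketName = "my-bucket",
    Key = "file.bin",
    UploadId = init.UploadId,
    PartETags = new List<PartETag>
    {
        new PartETag(1, copy.ETag),
        new PartETag(2, append.ETag)
    }
});
// Then delete "file.bin.old" if you no longer need it.
```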
