Zip files which are located in IsolatedStorage - c#

My application needs to download multiple files in Silverlight. Because I don't want to ask the user for permission to save each file separately, I save the files in IsolatedStorage first, then zip them all into one file and ask for save permission only once.
I therefore used SharpZipLib to zip the files located in IsolatedStorage. The problem is that SharpZipLib only accepts a file name/address as a ZipEntry:
ZipEntry z = new ZipEntry(name);
and since the files are located in IsolatedStorage, I don't have an address for them.
I saw a sample on creating a zip from/to a memory stream or byte array, but I can't use it for multiple files.
Please help me find a way to use SharpZipLib, or suggest another way to download multiple files without asking for permission multiple times.

The name in ZipEntry z = new ZipEntry(name); is a logical/relative name inside your zip; you can set it to anything you want.
So as long as you can re-open your IsolatedStorage files as a Stream, you should be able to use SharpZipLib.
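For example, a minimal sketch of that idea, assuming the downloaded files sit at the root of the application's IsolatedStorage store; the archive name download.zip is just a placeholder:
using System.IO;
using System.IO.IsolatedStorage;
using ICSharpCode.SharpZipLib.Zip;

// Zip every file in IsolatedStorage into one archive, also kept in IsolatedStorage.
using (var store = IsolatedStorageFile.GetUserStoreForApplication())
using (var zipTarget = store.CreateFile("download.zip"))
using (var zipOut = new ZipOutputStream(zipTarget))
{
    foreach (string name in store.GetFileNames("*"))
    {
        if (name == "download.zip") continue;        // skip the archive itself

        zipOut.PutNextEntry(new ZipEntry(name));     // logical name inside the zip
        using (var source = store.OpenFile(name, FileMode.Open))
        {
            var buffer = new byte[4096];
            int read;
            while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
                zipOut.Write(buffer, 0, read);       // stream the file into the entry
        }
        zipOut.CloseEntry();
    }
    zipOut.Finish();
}
You can then hand the single zipped file to a SaveFileDialog, so the user is prompted only once.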

Related

Stream files directly to new ZIP archive to save memory with FluentFTP

In my file manager, I need to offer the capability of downloading files. I need to be able to select both individual files and directories. This could be an example:
/www/index.html
/www/images/
/www/styles.css
If I select those 3 items (2 files and 1 folder), I need to add them all to a ZIP archive. I already have a working example, where I utilize DownloadFolder() and DownloadFile(). However, it goes like this:
Download each file to disk
If there are any folders, recursively look through them and download those files to their respective folders (automatically done)
Call System.IO.Compression.ZipFile.CreateFromDirectory() to ZIP the downloaded files to a ZIP archive
Delete the downloaded files from before
Stream the ZIP file back using new FileStream(zipFile, FileMode.Open, FileAccess.Read, FileShare.None, 4096, FileOptions.DeleteOnClose) so the ZIP file gets deleted afterwards
This is quite bad, because I need to first download the files, add them to an archive, delete the files I just downloaded, stream the archive to the user, and then finally delete the archive to clean up. What would be better:
Tell FluentFTP which files to stream
Create a ZIP archive ON DISK
Add each file and directory recursively to the archive
Stream the archive back and delete the file afterwards
By doing this, I should be able to produce very, very large archives (100+ GB if that's the case), and all I would have to care about is temporary storage until the archive has been deleted.
I wasn't able to find any information on how to do this, so something tells me I need to call the GetListing() method with the FtpListOption.Recursive flag, then create each directory "manually", and finally call the Download() method, which returns a stream.
Are there any better ways, though?
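A hedged sketch of that GetListing() + stream-copy idea, writing each remote file straight into a ZipArchive entry on the output stream. The member names (FtpObjectType, OpenRead) follow recent FluentFTP versions and may differ in older ones; StreamFtpTreeToZip and remoteRoot are made-up names for illustration:
using System.IO;
using System.IO.Compression;
using FluentFTP;

void StreamFtpTreeToZip(FtpClient client, string remoteRoot, Stream output)
{
    using (var archive = new ZipArchive(output, ZipArchiveMode.Create, leaveOpen: true))
    {
        // One recursive listing instead of downloading everything to disk first.
        foreach (FtpListItem item in client.GetListing(remoteRoot, FtpListOption.Recursive))
        {
            if (item.Type != FtpObjectType.File) continue;

            // The path relative to the root becomes the entry name, so the folder layout survives.
            string entryName = item.FullName.Substring(remoteRoot.Length).TrimStart('/');
            ZipArchiveEntry entry = archive.CreateEntry(entryName);

            using (Stream entryStream = entry.Open())
            using (Stream remote = client.OpenRead(item.FullName))
            {
                remote.CopyTo(entryStream);   // remote file -> zip entry, no temp file
            }
        }
    }
}
With ZipArchiveMode.Create the underlying stream only needs to be writable, so output could even be the HTTP response stream, which would avoid the temporary archive on disk entirely.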

How to efficiently write multiple data ranges from one file on the internet simultaneously into one file

I want to have multiple network stream threads writing/downloading into one file simultaneously.
So, e.g., you have one file and download the ranges:
0-1000
1001-2002
2003-3004...
And I want them all to write their received bytes into one file as efficiently as possible.
Right now I download each range into its own file and combine the parts into the final file once they are all finished.
If possible, I would like them all to write into one file to reduce disk usage; I feel this could all be done better.
You could use persisted memory-mapped files, see https://learn.microsoft.com/en-us/dotnet/standard/io/memory-mapped-files
Persisted files are memory-mapped files that are associated with a source file on a disk. When the last process has finished working with the file, the data is saved to the source file on the disk. These memory-mapped files are suitable for working with extremely large source files.
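A minimal sketch of that approach, assuming the total size and the ranges are known up front; DownloadRange is a hypothetical placeholder for an HTTP request with a Range header:
using System;
using System.IO;
using System.IO.MemoryMappedFiles;
using System.Threading.Tasks;

class RangeDownloadSketch
{
    static async Task Main()
    {
        var ranges = new (long From, long To)[] { (0, 1000), (1001, 2002), (2003, 3004) };
        const long totalSize = 3005;   // last range end + 1

        // Persisted memory-mapped file backed by the target file on disk;
        // every worker writes into its own view, so there are no part files to merge later.
        using (var mmf = MemoryMappedFile.CreateFromFile("result.bin", FileMode.Create, null, totalSize))
        {
            var tasks = Array.ConvertAll(ranges, range => Task.Run(() =>
            {
                long length = range.To - range.From + 1;
                using (var view = mmf.CreateViewStream(range.From, length))
                using (Stream response = DownloadRange(range.From, range.To))
                {
                    response.CopyTo(view);   // bytes land directly at the right offset
                }
            }));
            await Task.WhenAll(tasks);
        }
    }

    // Placeholder standing in for the real ranged HTTP download.
    static Stream DownloadRange(long from, long to) => new MemoryStream(new byte[to - from + 1]);
}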

Efficiently extracting all files from a zip file

What is the most efficient way to extract all the files from a zip file (and store them in a dictionary file_name->contents) using DotNetZip? The zip is in a slow network location, so I want to make sure it is (a) downloaded and (b) decompressed only once.
There is not much to do here other than:
1) download the file
2) unzip it locally
You need (1) to avoid an expensive permission check on every network access.
Just one more point: make sure that you download/unzip into a location where the current user has read/write permission.
For example, it could be:
var path = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData),
    APP_NAME);
which on Windows 7 results in C:\ProgramData\APP_NAME
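Put together, a sketch of the download-once/extract-once flow with DotNetZip; the network share path and the archive.zip name are placeholders:
using System.Collections.Generic;
using System.IO;
using Ionic.Zip;

string localCopy = Path.Combine(path, "archive.zip");
File.Copy(@"\\slow-server\share\archive.zip", localCopy, true);   // one network read

var contents = new Dictionary<string, byte[]>();
using (ZipFile zip = ZipFile.Read(localCopy))                     // one decompression pass
{
    foreach (ZipEntry entry in zip)
    {
        if (entry.IsDirectory) continue;

        using (var ms = new MemoryStream())
        {
            entry.Extract(ms);                                     // decompress into memory
            contents[entry.FileName] = ms.ToArray();
        }
    }
}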

Create a ZIP file without entries touching the disk?

I'm trying to create a program that has the capability of creating a zipped package containing files based on user input.
I don't need any of those files written to the hard drive before they're zipped, as that would be unnecessary. So how do I create these files without actually writing them to the hard drive, and then have them zipped?
I'm using DotNetZip.
See the documentation here, specifically the example called "Create a zip using content obtained from a stream":
using (ZipFile zip = new ZipFile())
{
    ZipEntry e = zip.AddEntry("Content-From-Stream.bin", "basedirectory", StreamToRead);
    e.Comment = "The content for entry in the zip file was obtained from a stream";
    zip.AddFile("Readme.txt");
    zip.Save(zipFileToCreate);
}
If your files are not already in a stream format, you'll need to convert them to one. You'll probably want to use a MemoryStream for that.
I use SharpZipLib, but if DotNetZip can do everything against a basic System.IO.Stream, then yes, just feed it a MemoryStream to write to.
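If you want the archive itself to stay in memory as well, a small sketch assuming the entry content is already available as a byte[] or stream; the entry name generated/hello.txt and the variable zipBytes are made up:
using System.IO;
using System.Text;
using Ionic.Zip;

using (var zip = new ZipFile())
using (var output = new MemoryStream())
{
    byte[] generatedContent = Encoding.UTF8.GetBytes("hello");
    zip.AddEntry("generated/hello.txt", generatedContent);   // entry content from memory
    zip.Save(output);                                         // archive written to memory, not disk

    byte[] zipBytes = output.ToArray();                       // e.g. send this as a download
}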
Writing to the hard disk shouldn't be avoided just because it seems unnecessary; that's backwards. Unless it is a requirement that the entire zipping process happen in memory, keep the data out of memory by writing to the hard disk.
The hard disk is better suited for storing large amounts of data than memory is. If by some chance your zip file ends up being around a gigabyte in size, your application could croak or at least cause a system slowdown. If you write directly to the hard drive, the zip could be several gigabytes in size without causing an issue.

Upload a file only to unzip it, without writing the file on the server

In my ASP.NET application I would like to let the user pick a zip file on their local machine. Then I would like to get this file on the server (without saving it on the server), unzip it, process it, and save the processed data to the database.
What should I do to accomplish this task? Thank you for any hints.
I would use a zip library to read the uploaded zip file stream and process the files contained inside. I have previously used SharpZipLib for this purpose. See the SharpZipLib examples for more information.
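A minimal sketch of that idea, reading the posted file's stream with SharpZipLib's ZipInputStream so nothing is written to the server's disk; ProcessUpload and SaveToDatabase are hypothetical names:
using System.IO;
using ICSharpCode.SharpZipLib.Zip;

// uploadStream would be e.g. FileUpload.PostedFile.InputStream in a Web Forms page.
public void ProcessUpload(Stream uploadStream)
{
    using (var zipStream = new ZipInputStream(uploadStream))
    {
        ZipEntry entry;
        while ((entry = zipStream.GetNextEntry()) != null)
        {
            if (!entry.IsFile) continue;

            using (var ms = new MemoryStream())
            {
                zipStream.CopyTo(ms);            // contents of the current entry only
                SaveToDatabase(entry.Name, ms.ToArray());
            }
        }
    }
}

// Placeholder for whatever processing/persistence you need.
void SaveToDatabase(string fileName, byte[] data) { /* ... */ }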
