How to add a file to a zip using SharpZipLib in C#

I am trying to add a file to an existing zip using SharpZipLib in C#.
When I run the code, the zip gets overwritten, i.e. all files previously in the zip are deleted and only the new file is left in the archive.
using (FileStream fileStream = File.Open("D:/Work/Check.zip", FileMode.Open, FileAccess.ReadWrite))
using (ZipOutputStream zipToWrite = new ZipOutputStream(fileStream))
{
    zipToWrite.SetLevel(9);
    using (FileStream newFileStream = File.OpenRead("D:/Work/file1.txt"))
    {
        byte[] byteBuffer = new byte[newFileStream.Length - 1];
        newFileStream.Read(byteBuffer, 0, byteBuffer.Length);

        ZipEntry entry = new ZipEntry("file1.txt");
        zipToWrite.PutNextEntry(entry);
        zipToWrite.Write(byteBuffer, 0, byteBuffer.Length);
        zipToWrite.CloseEntry();
        zipToWrite.Finish();
        zipToWrite.Close();
    }
}
Can anyone tell me what the issue is in the above code? Why does the zip get overwritten?

Have a look here:
http://wiki.sharpdevelop.net/SharpZipLib_Updating.ashx
ZipOutputStream always writes a brand-new archive, which is why your existing entries disappear. To update an existing zip you need the ZipFile class and call:
zipFile.BeginUpdate();
// add file..
zipFile.CommitUpdate();
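A minimal sketch of the whole update, assuming the paths from the question (ZipFile.Add takes the file on disk plus the entry name to store it under; in older SharpZipLib versions call Close() instead of relying on using):

using ICSharpCode.SharpZipLib.Zip;

using (ZipFile zipFile = new ZipFile("D:/Work/Check.zip"))
{
    zipFile.BeginUpdate();
    // Add the existing file on disk as an entry named "file1.txt".
    zipFile.Add("D:/Work/file1.txt", "file1.txt");
    zipFile.CommitUpdate();
}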

Related

How to Compress Large Files C#

I am using this method to compress files and it works great until I get to a file that is 2.4 GB; then it gives me an overflow error:
void CompressThis(string inFile, string compressedFileName)
{
    FileStream sourceFile = File.OpenRead(inFile);
    FileStream destinationFile = File.Create(compressedFileName);

    byte[] buffer = new byte[sourceFile.Length];
    sourceFile.Read(buffer, 0, buffer.Length);

    using (GZipStream output = new GZipStream(destinationFile, CompressionMode.Compress))
    {
        output.Write(buffer, 0, buffer.Length);
    }

    // Close the files.
    sourceFile.Close();
    destinationFile.Close();
}
What can I do to compress huge files?
You should not read the whole file into memory. Use Stream.CopyTo instead. This method reads the bytes from the current stream and writes them to another stream using a specified buffer size (81920 bytes by default).
You also don't need to close Stream objects explicitly if you use the using keyword.
void CompressThis(string inFile, string compressedFileName)
{
    using (FileStream sourceFile = File.OpenRead(inFile))
    using (FileStream destinationFile = File.Create(compressedFileName))
    using (GZipStream output = new GZipStream(destinationFile, CompressionMode.Compress))
    {
        sourceFile.CopyTo(output);
    }
}
You can find a more complete example on Microsoft Docs (formerly MSDN).
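For the reverse direction, the same streaming pattern works. Here is a minimal sketch (the method name is hypothetical, not from the question) that decompresses without buffering the whole file either:

void DecompressThis(string compressedFileName, string outFile)
{
    using (FileStream sourceFile = File.OpenRead(compressedFileName))
    using (FileStream destinationFile = File.Create(outFile))
    using (GZipStream input = new GZipStream(sourceFile, CompressionMode.Decompress))
    {
        // CopyTo streams in chunks, so even multi-gigabyte files stay out of memory.
        input.CopyTo(destinationFile);
    }
}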
You're trying to allocate all of this into memory. That just isn't necessary; you can feed the input stream directly into the output stream.
An alternative solution, for the zip format, that doesn't buffer the file in memory:
using (var sourceFileStream = new FileStream(this.GetFilePath(sourceFileName), FileMode.Open))
using (var destinationStream = new FileStream(this.GetFilePath(zipFileName), FileMode.Create, FileAccess.ReadWrite))
using (var archive = new ZipArchive(destinationStream, ZipArchiveMode.Create, true))
{
    var file = archive.CreateEntry(sourceFileName, CompressionLevel.Optimal);
    using (var entryStream = file.Open())
    {
        await sourceFileStream.CopyToAsync(entryStream);
    }
}
This solution writes directly from the input stream to the output stream; note that the enclosing method must be declared async because of the await.

File Stream file saving

I have an image in a Canvas element that I get in code-behind in ASP.NET. Now I want to save it to a folder in my project, but the FileStream always saves it to the C drive. What do I do?
[WebMethod()]
public void SaveUser(string imageData)
{
    //Create image to local machine.
    string fileNameWitPath = path + "4200020789506" + ".png";
    using (FileStream fs = new FileStream(fileNameWitPath, FileMode.Create))
    {
        using (BinaryWriter bw = new BinaryWriter(fs))
        {
            byte[] data = Convert.FromBase64String(imageData);
            bw.Write(data);
            bw.Close();
        }
    }
    // Save fileNameWitPath variable to Database.
}
Here is an example of how I save files to an Images folder in my project directory.
var fileName = "4200020789506.png";
var base64String = SOME_REALLY_LONG_STRING;
using (var s = new MemoryStream(Convert.FromBase64String(base64String)))
using (var f = new FileStream(Path.Combine(Server.MapPath("~/Images"), fileName), FileMode.Create, FileAccess.Write))
{
    s.CopyTo(f);
}
Here's what I do and it works well. For you, filePath + filename corresponds to fileNameWitPath. Do this for each file you have. Hope it works for you. If you need further info, I'd be glad to help.
using (var stream = File.Create(filePath + filename))
{
    attachment.ContentObject.DecodeTo(stream, cancel.Token);
}
I can only imagine your path variable points to your C:\ drive.
You need to set the path variable equal to the location you want, for instance:
public void SaveUser(string imageData)
{
    path = @"C:\YourCustomFolder\"; // your path needs to point to the directory you want to save to

    //Create image to local machine.
    string fileNameWitPath = path + "4200020789506" + ".png";

    // Check if the directory exists; if not, create it.
    if (!Directory.Exists(path))
        Directory.CreateDirectory(path);

    using (FileStream fs = new FileStream(fileNameWitPath, FileMode.Create))
    {
        using (BinaryWriter bw = new BinaryWriter(fs))
        {
            byte[] data = Convert.FromBase64String(imageData);
            bw.Write(data);
            bw.Close();
        }
    }
    // Save fileNameWitPath variable to Database.
}
I also included a check to see whether your directory exists; if it does not, it will create a folder called 'YourCustomFolder' on your C drive, where it will save the images.
If you would like to save your image to a folder inside your project, I would recommend using Server.MapPath("~/YourFolderInApplication").
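For instance, a minimal sketch of the same method writing into a project folder (this assumes the class exposes Server, as an asmx WebService does, and the ~/Images folder name is only illustrative):

public void SaveUser(string imageData)
{
    // Map a folder inside the web application instead of an absolute C:\ path.
    string path = Server.MapPath("~/Images/");

    if (!Directory.Exists(path))
        Directory.CreateDirectory(path);

    string fileNameWitPath = Path.Combine(path, "4200020789506.png");
    File.WriteAllBytes(fileNameWitPath, Convert.FromBase64String(imageData));

    // Save fileNameWitPath variable to Database.
}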

gzipstream memory stream to file

I am trying to compress JSON files using Gzip compression to be sent to another location. It needs to process 5,000 - 10,000 files daily, and I don't need the compressed version of the file on the local machine (they are actually being transferred to AWS S3 for long-term archiving).
Since I don't need them, I am trying to compress to a memory stream and then use that to write to AWS, rather than compressing each one to disk. Whenever I try to do this, the files are broken (as in, when I open them in 7-Zip and try to open the JSON file inside, I get "Data error: File is broken").
The same thing happens when I try to write the memory stream to a local file, so I'm trying to solve that for now. Here's the code:
string[] files = Directory.GetFiles(@"C:\JSON_Logs");

foreach (string file in files)
{
    FileInfo fileToCompress = new FileInfo(file);
    using (FileStream originalFileStream = fileToCompress.OpenRead())
    {
        using (MemoryStream compressedMemStream = new MemoryStream())
        {
            using (GZipStream compressionStream = new GZipStream(compressedMemStream, CompressionMode.Compress))
            {
                originalFileStream.CopyTo(compressionStream);
                compressedMemStream.Seek(0, SeekOrigin.Begin);

                FileStream compressedFileStream = File.Create(fileToCompress.FullName + ".gz");
                //Eventually this will be the AWS transfer, but that's not important here
                compressedMemStream.WriteTo(compressedFileStream);
            }
        }
    }
}
Rearrange your using statements so the GZipStream is definitely done by the time you read the memory stream contents:
foreach (string file in files)
{
    FileInfo fileToCompress = new FileInfo(file);
    using (MemoryStream compressedMemStream = new MemoryStream())
    {
        using (FileStream originalFileStream = fileToCompress.OpenRead())
        using (GZipStream compressionStream = new GZipStream(
            compressedMemStream,
            CompressionMode.Compress,
            leaveOpen: true))
        {
            originalFileStream.CopyTo(compressionStream);
        }

        compressedMemStream.Seek(0, SeekOrigin.Begin);
        using (FileStream compressedFileStream = File.Create(fileToCompress.FullName + ".gz"))
        {
            //Eventually this will be the AWS transfer, but that's not important here
            compressedMemStream.WriteTo(compressedFileStream);
        }
    }
}
Disposing a stream takes care of flushing and closing it.
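Once the MemoryStream holds the finished gzip data, the eventual AWS transfer could look roughly like the sketch below. This is only an assumption about the upload step: it presumes the AWS SDK for .NET (AWSSDK.S3) is referenced and credentials are configured, and the bucket name is a placeholder, not from the question.

using Amazon.S3;
using Amazon.S3.Transfer;

// Sketch only: bucket name and key are illustrative.
var transferUtility = new TransferUtility(new AmazonS3Client());
compressedMemStream.Seek(0, SeekOrigin.Begin);
transferUtility.Upload(compressedMemStream, "my-archive-bucket", fileToCompress.Name + ".gz");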

Get Filestream from ZipArchive

I have a function that requires a FileStream as input.
I want to hand several files to that function, which I get from uploaded zip files.
Is it possible to create the FileStream without extracting the file to a temporary folder?
I imagine something like this:
string path = @"C:\somepathtomyzip";
string filePath = "nameofimagefile";

using (ZipArchive archive = ZipFile.OpenRead(path))
{
    ZipArchiveEntry entry = archive.GetEntry(filePath);
    // generate FileStream from entry
    myFunction(filestreamIneed);
}
You can use ZipArchiveEntry.Open() and copy the output from the returned Stream instance to a FileStream instance:
using (ZipArchive archive = ZipFile.OpenRead(path))
{
    ZipArchiveEntry entry = archive.GetEntry(filePath);
    using (Stream entryStream = entry.Open())
    using (var fileStream = new FileStream(fileName, FileMode.CreateNew, FileAccess.ReadWrite))
    {
        // Copy the entry's contents into the FileStream on disk.
        entryStream.CopyTo(fileStream);
    }
}
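If myFunction really does need a FileStream, a rough sketch building on the code above: copy the entry into the file, rewind the stream, and hand the same FileStream to the function (fileName is a placeholder for wherever you want the extracted copy to live):

using (ZipArchive archive = ZipFile.OpenRead(path))
{
    ZipArchiveEntry entry = archive.GetEntry(filePath);
    using (Stream entryStream = entry.Open())
    using (var fileStream = new FileStream(fileName, FileMode.CreateNew, FileAccess.ReadWrite))
    {
        entryStream.CopyTo(fileStream);
        fileStream.Seek(0, SeekOrigin.Begin); // rewind so the function reads from the start
        myFunction(fileStream);               // the FileStream is still open and now populated
    }
}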

Create new FileStream out of a byte array

I am attempting to create a new FileStream object from a byte array. I'm sure that made no sense at all so I will try to explain in further detail below.
Tasks I am completing:
1) Reading the source file which was previously compressed
2) Decompressing the data using GZipStream
3) copying the decompressed data into a byte array.
What I would like to change:
1) I would like to be able to use File.ReadAllBytes to read the decompressed data.
2) I would then like to create a new FileStream object using this byte array.
In short, I want to do this entire operation using byte arrays. One of the parameters for GZipStream is a stream of some sort, so I figured I was stuck using a FileStream. But if some method exists where I can create a new instance of a FileStream from a byte array, then I should be fine.
Here is what I have so far:
FolderBrowserDialog fbd = new FolderBrowserDialog(); // Shows a browser dialog
fbd.ShowDialog();

// Path to directory of files to compress and decompress.
string dirpath = fbd.SelectedPath;
DirectoryInfo di = new DirectoryInfo(dirpath);

foreach (FileInfo fi in di.GetFiles())
{
    zip.Program.Decompress(fi);
}

// Get the stream of the source file.
using (FileStream inFile = fi.OpenRead())
{
    //Create the decompressed file.
    string outfile = @"C:\Decompressed.exe";
    {
        using (GZipStream Decompress = new GZipStream(inFile, CompressionMode.Decompress))
        {
            byte[] b = new byte[blen.Length];
            Decompress.Read(b, 0, b.Length);
            File.WriteAllBytes(outfile, b);
        }
    }
}
Thanks for any help!
Regards,
Evan
It sounds like you need to use a MemoryStream.
Since you don't know how many bytes you'll be reading from the GZipStream, you can't really allocate an array for the decompressed data up front. Instead, read the compressed file into a byte array, wrap it in a MemoryStream, and decompress from that.
const int BufferSize = 65536;
byte[] compressedBytes = File.ReadAllBytes("compressedFilename");

// create memory stream
using (var mstrm = new MemoryStream(compressedBytes))
{
    using (var inStream = new GZipStream(mstrm, CompressionMode.Decompress))
    {
        using (var outStream = File.Create("outputfilename"))
        {
            var buffer = new byte[BufferSize];
            int bytesRead;
            while ((bytesRead = inStream.Read(buffer, 0, BufferSize)) != 0)
            {
                outStream.Write(buffer, 0, bytesRead);
            }
        }
    }
}
Here is what I ended up doing. I realize that I did not give sufficient information in my question, and I apologize for that, but I do know the size of the file I need to decompress, as I use it earlier in my program. This buffer is referred to as "blen".
string fi = @"C:\Path To Compressed File";

// Get the stream of the source file.
// using (FileStream inFile = fi.OpenRead())
using (MemoryStream infile1 = new MemoryStream(File.ReadAllBytes(fi)))
{
    //Create the decompressed file.
    string outfile = @"C:\Decompressed.exe";
    {
        using (GZipStream Decompress = new GZipStream(infile1, CompressionMode.Decompress))
        {
            byte[] b = new byte[blen.Length];
            Decompress.Read(b, 0, b.Length);
            File.WriteAllBytes(outfile, b);
        }
    }
}
