GZipStream Decompress not unzipping complete file - c#

I am trying to unzip .gz files using GZipStream. It only extracts 12 KB of the file, but if I use WinZip or 7-Zip to unzip the same .gz file, the extracted file is 753 KB. See the code below that I am using; any pointers on where I am going wrong?
public bool StartLibZipExtraction()
{
bool returnResult = false;
try
{
DirectoryInfo directorySelected = new DirectoryInfo(m_ProjectPath);
foreach (FileInfo fileToDecompress in directorySelected.GetFiles("*.gz"))
{
// check if this is the file to UNZIP
if (fileToDecompress.Name.Equals(m_ZipFileName))
{
returnResult = true; // unzip successful
Agent.LogInfo("~~~ Unzipping file: " + fileToDecompress.ToString() + " ~~~");
Decompress(fileToDecompress);
break;
}
}
}
catch(Exception e)
{
Agent.LogInfo("*** StartLibZipExtraction() Error: " + e.Message + " ***");
return false;
}
return returnResult;
}
public void Decompress(FileInfo fileToDecompress)
{
using (FileStream originalFileStream = fileToDecompress.OpenRead())
{
string currentFileName = fileToDecompress.FullName;
string newFileName = currentFileName.Remove(currentFileName.Length - fileToDecompress.Extension.Length);
using (FileStream decompressedFileStream = File.Create(newFileName))
{
using (GZipStream decompressionStream = new GZipStream(originalFileStream, CompressionMode.Decompress))
{
decompressionStream.CopyTo(decompressedFileStream);
Agent.LogInfo(decompressionStream.ToString());
Agent.LogInfo("Decompressed: {0}" + fileToDecompress.Name);
}
}
}
}
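For what it's worth, the last four bytes of a .gz file are the gzip ISIZE footer (the uncompressed length modulo 2^32), so a quick diagnostic like the sketch below can log what size the archive itself claims and compare it with the 12 KB you are getting. This is only a hedged diagnostic sketch reusing the question's Agent.LogInfo and fileToDecompress, not a fix:
// Diagnostic sketch: read the gzip ISIZE footer (uncompressed size mod 2^32).
// Note: for a multi-member .gz this reports only the last member's size.
using (FileStream fs = fileToDecompress.OpenRead())
{
    fs.Seek(-4, SeekOrigin.End);
    byte[] footer = new byte[4];
    fs.Read(footer, 0, footer.Length);
    uint expectedSize = BitConverter.ToUInt32(footer, 0); // gzip stores ISIZE little-endian
    Agent.LogInfo("ISIZE says the uncompressed data should be " + expectedSize + " bytes");
}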

Related

Download SFTP files that starts with prefix name using SSH.NET in C#

I am using the SSH.NET library to download a file from an SFTP server. When I give it the full file name, it works. But I want to download the file by a prefix; in that folder the prefix is POS_ETH_SE7*, and there will always be exactly one matching file. After I download it, I move it to another folder. Here is my method:
var auth = new PasswordAuthenticationMethod(username, password);
var connectionInfo = new ConnectionInfo(ipAddress, port, auth);
// Upload File
using (var sftp = new SftpClient(connectionInfo))
{
string pathLocalFile =
Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.Desktop),
"POS_ETH_SE7.ics");
sftp.Connect();
Console.WriteLine("Downloading {0}", remoteFilePath);
using (Stream fileStream = File.OpenWrite(pathLocalFile))
using (StreamWriter writer = new StreamWriter(fileStream))
{
try
{
sftp.DownloadFile(remoteFilePath, fileStream);
}
catch (SftpPathNotFoundException ex)
{
}
}
try
{
var inFile = sftp.Get(remoteFilePath);
inFile.MoveTo(remoteMoveFileToPath + "/POS_ETH_SE7.xml");
}
catch (SftpPathNotFoundException ex)
{
Console.WriteLine("\nnothing to update...\n");
}
sftp.Disconnect();
}
Start with the code from the following question and add the additional constraint on the file name prefix.
Downloading a directory using SSH.NET SFTP in C#
const string prefix = "POS_ETH_SE7";
IEnumerable<SftpFile> files = client.ListDirectory(remotePath);
files = files.Where(file => file.Name.StartsWith(prefix));
foreach (SftpFile file in files)
{
string pathLocalFile = Path.Combine(localPath, file.Name);
using (var stream = File.Create(pathLocalFile))
{
client.DownloadFile(file.FullName, stream);
}
// If you want to archive the downloaded files:
string archivePath = remoteMoveFileToPath + "/" + file.Name;
client.RenameFile(file.FullName, archivePath);
}
Or use a more powerful SFTP library. For example, with my WinSCP .NET assembly, you can do the same with a single call to Session.GetFilesToDirectory:
session.GetFilesToDirectory(remotePath, localPath, prefix + "*").Check();
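For context, a minimal sketch of the WinSCP session setup that the one-liner above assumes; the host key fingerprint is a placeholder, and ipAddress, port, username, password, remotePath, localPath and prefix are reused from the code above:
using WinSCP;

var sessionOptions = new SessionOptions
{
    Protocol = Protocol.Sftp,
    HostName = ipAddress,
    PortNumber = port,
    UserName = username,
    Password = password,
    SshHostKeyFingerprint = "ssh-rsa 2048 xxxxxxxxxxx...", // placeholder, must match your server
};

using (var session = new Session())
{
    session.Open(sessionOptions);
    // "POS_ETH_SE7*" acts as a file mask, so only files starting with the prefix are downloaded
    session.GetFilesToDirectory(remotePath, localPath, prefix + "*").Check();
}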
using (var sftp = new SftpClient(connectionInfo))
{
sftp.Connect();
IEnumerable<SftpFile> files = sftp.ListDirectory(configSftpClient.remoteFilePath);
files = files.Where(file => file.Name.StartsWith(configSftpClient.filePrefix));
foreach (SftpFile file in files)
{
string pathLocalFile = Path.Combine(configSftpClient.localFilePath, file.Name);
try
{
using (var stream = File.Create(pathLocalFile))
{
sftp.DownloadFile(file.FullName, stream);
var movableFile = sftp.Get(file.FullName);
Console.WriteLine(file.FullName);
movableFile.MoveTo(configSftpClient.remoteMoveFileToPath + "/" + file.Name);
stream.Close();
}
}
catch(Exception ex)
{
Console.WriteLine("file used by other");
}
}
}

Transfer files directly from FTP to Azure File Storage without keeping them locally in memory or disk

I have to transfer files from FTP to Azure File Storage. My code works fine, but it transfers the files through memory, which is not best practice: first I read the FTP stream into a byte array in memory, then I upload that array to Azure File Storage.
I know it would be better to do this asynchronously, but I don't know whether that is possible or how to do it.
My code:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using System.Configuration;
using Microsoft.WindowsAzure.Storage.File;
using System.IO;
using Microsoft.Azure;
using System.Net;
namespace TransferFtpToAzure
{
class Program
{
public static void Main(string[] args)
{
List<FileName> sourceFileList = new List<FileName>();
List<FileName> targetFileList = new List<FileName>();
string targetShareReference = ConfigurationManager.AppSettings["AzureShare"];
string targetDirectoryReference = ConfigurationManager.AppSettings["Environment"] + "/" + Enums.AzureFolders.Mos + "/" + Enums.AzureFolders.In;
string sourceURI = (ConfigurationManager.AppSettings["FtpConnectionString"] + ConfigurationManager.AppSettings["Environment"].ToUpper() +"/"+ Enums.FtpFolders.Mos + "/").Replace("\\","/");
string sourceUser = ConfigurationManager.AppSettings["FtpServerUserName"];
string sourcePass = ConfigurationManager.AppSettings["FtpServerPassword"];
getFileLists(sourceURI, sourceUser, sourcePass, sourceFileList, targetShareReference, targetDirectoryReference, targetFileList);
Console.WriteLine(sourceFileList.Count + " files found!");
targetFileList.Sort(); // sort first: CheckLists uses BinarySearch, which requires a sorted list
CheckLists(sourceFileList, targetFileList);
Console.WriteLine(sourceFileList.Count + " unique files on sourceURI" + Environment.NewLine + "Attempting to move them.");
foreach (var file in sourceFileList)
{
try
{
CopyFile(file.fName, sourceURI, sourceUser, sourcePass, targetShareReference, targetDirectoryReference);
}
catch
{
Console.WriteLine("There was move error with : " + file.fName);
}
}
}
public class FileName : IComparable<FileName>
{
public string fName { get; set; }
public int CompareTo(FileName other)
{
return fName.CompareTo(other.fName);
}
}
public static void CheckLists(List<FileName> sourceFileList, List<FileName> targetFileList)
{
for (int i = 0; i < sourceFileList.Count; i++)
{
if (targetFileList.BinarySearch(sourceFileList[i]) >= 0) // >= 0: BinarySearch returns 0 when the match is the first element
{
sourceFileList.RemoveAt(i);
i--;
}
}
}
public static void getFileLists(string sourceURI, string sourceUser, string sourcePass, List<FileName> sourceFileList, string targetShareReference, string targetDirectoryReference, List<FileName> targetFileList)
{
string line = "";
/////////Source FileList
FtpWebRequest sourceRequest;
sourceRequest = (FtpWebRequest)WebRequest.Create(sourceURI);
sourceRequest.Credentials = new NetworkCredential(sourceUser, sourcePass);
sourceRequest.Method = WebRequestMethods.Ftp.ListDirectory;
sourceRequest.UseBinary = true;
sourceRequest.KeepAlive = false;
sourceRequest.Timeout = -1;
sourceRequest.UsePassive = true;
FtpWebResponse sourceResponse = (FtpWebResponse)sourceRequest.GetResponse();
//Creates a list (fileList) of the file names
using (Stream responseStream = sourceResponse.GetResponseStream())
{
using (StreamReader reader = new StreamReader(responseStream))
{
line = reader.ReadLine();
while (line != null)
{
var fileName = new FileName
{
fName = line
};
sourceFileList.Add(fileName);
line = reader.ReadLine();
}
}
}
/////////////Target FileList
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
//var test = fileClient.ListShares();
CloudFileShare fileShare = fileClient.GetShareReference(targetShareReference);
if (fileShare.Exists())
{
CloudFileDirectory rootDirectory = fileShare.GetRootDirectoryReference();
if (rootDirectory.Exists())
{
CloudFileDirectory customDirectory = rootDirectory.GetDirectoryReference(targetDirectoryReference);
if (customDirectory.Exists())
{
var fileCollection = customDirectory.ListFilesAndDirectories().OfType<CloudFile>();
foreach (var item in fileCollection)
{
var fileName = new FileName
{
fName = item.Name
};
targetFileList.Add(fileName);
}
}
}
}
}
public static void CopyFile(string fileName, string sourceURI, string sourceUser, string sourcePass, string targetShareReference, string targetDirectoryReference)
{
try
{
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(sourceURI + fileName);
request.Method = WebRequestMethods.Ftp.DownloadFile;
request.Credentials = new NetworkCredential(sourceUser, sourcePass);
FtpWebResponse response = (FtpWebResponse)request.GetResponse();
Stream responseStream = response.GetResponseStream();
Upload(fileName, ToByteArray(responseStream), targetShareReference, targetDirectoryReference);
responseStream.Close();
}
catch
{
Console.WriteLine("There was an error with :" + fileName);
}
}
public static Byte[] ToByteArray(Stream stream)
{
MemoryStream ms = new MemoryStream();
byte[] chunk = new byte[4096];
int bytesRead;
while ((bytesRead = stream.Read(chunk, 0, chunk.Length)) > 0)
{
ms.Write(chunk, 0, bytesRead);
}
return ms.ToArray();
}
public static bool Upload(string FileName, byte[] Image, string targetShareReference, string targetDirectoryReference)
{
try
{
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
//var test = fileClient.ListShares();
CloudFileShare fileShare = fileClient.GetShareReference(targetShareReference);
if (fileShare.Exists())
{
CloudFileDirectory rootDirectory = fileShare.GetRootDirectoryReference();
if (rootDirectory.Exists())
{
CloudFileDirectory customDirectory = rootDirectory.GetDirectoryReference(targetDirectoryReference);
if (customDirectory.Exists())
{
var cloudFile = customDirectory.GetFileReference(FileName);
using (var stream = new MemoryStream(Image, writable: false))
{
cloudFile.UploadFromStream(stream);
}
}
}
}
return true;
}
catch
{
return false;
}
}
}
}
If I understand you correctly, you want to avoid storing the file in memory between the download and upload.
For that see:
Azure function to copy files from FTP to blob storage.
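If you stay with FtpWebRequest and the Azure Files SDK from the question, you can avoid the intermediate byte array by copying the FTP response stream straight into the cloud file. A minimal sketch (not the linked Azure Function approach), assuming the FTP server reports a usable ContentLength and reusing fileName, sourceURI, the credentials and the customDirectory reference from the question's CopyFile/Upload methods:
// Sketch: stream the FTP download directly into Azure File Storage, no byte[] buffer.
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(sourceURI + fileName);
request.Method = WebRequestMethods.Ftp.DownloadFile;
request.Credentials = new NetworkCredential(sourceUser, sourcePass);

using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
using (Stream ftpStream = response.GetResponseStream())
{
    CloudFile cloudFile = customDirectory.GetFileReference(fileName);
    // Azure Files needs the final size up front, so this relies on ContentLength being reported
    using (var azureStream = cloudFile.OpenWrite(response.ContentLength))
    {
        ftpStream.CopyTo(azureStream); // copies in small chunks instead of one big array
    }
}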
Using an Azure Storage file share, this is the only way it worked for me without loading the entire ZIP into memory. I tested it with a 3 GB ZIP file (containing thousands of files, or one big file) and memory/CPU usage stayed low and stable. I hope it helps!
var zipFiles = _directory.ListFilesAndDirectories()
.OfType<CloudFile>()
.Where(x => x.Name.ToLower().Contains(".zip"))
.ToList();
foreach (var zipFile in zipFiles)
{
using (var zipArchive = new ZipArchive(zipFile.OpenRead()))
{
foreach (var entry in zipArchive.Entries)
{
if (entry.Length > 0)
{
CloudFile extractedFile = _directory.GetFileReference(entry.Name);
using (var entryStream = entry.Open())
{
byte[] buffer = new byte[16 * 1024];
using (var ms = extractedFile.OpenWrite(entry.Length))
{
int read;
while ((read = entryStream.Read(buffer, 0, buffer.Length)) > 0)
{
ms.Write(buffer, 0, read);
}
}
}
}
}
}
}

Zip complete folder using System.IO.Compression

I have an example where all the files in the folder are zipped, but not the folder itself. [This code is from MSDN.]
using System;
using System.IO;
using System.IO.Compression;
namespace zip
{
public class Program
{
public static void Main()
{
string directoryPath = #"c ------------------------------------------------------------------------ :\users\public\reports";
DirectoryInfo directorySelected = new DirectoryInfo(directoryPath);
foreach (FileInfo fileToCompress in directorySelected.GetFiles())
{
Compress(fileToCompress);
}
foreach (FileInfo fileToDecompress in directorySelected.GetFiles("*.gz"))
{
Decompress(fileToDecompress);
}
}
public static void Compress(FileInfo fileToCompress)
{
using (FileStream originalFileStream = fileToCompress.OpenRead())
{
if ((File.GetAttributes(fileToCompress.FullName) & FileAttributes.Hidden) != FileAttributes.Hidden & fileToCompress.Extension != ".gz")
{
using
(FileStream compressedFileStream = File.Create(fileToCompress.FullName + ".gz"))
{
using (GZipStream compressionStream = new GZipStream(compressedFileStream, CompressionMode.Compress))
{
originalFileStream.CopyTo(compressionStream);
Console.WriteLine("Compressed {0} from {1} to {2} bytes.", fileToCompress.Name, fileToCompress.Length.ToString(), compressedFileStream.Length.ToString());
}
}
}
}
}
public static void Decompress(FileInfo fileToDecompress)
{
using (FileStream originalFileStream = fileToDecompress.OpenRead())
{
string currentFileName = fileToDecompress.FullName;
string newFileName = currentFileName.Remove(currentFileName.Length - fileToDecompress.Extension.Length);
using (FileStream decompressedFileStream = File.Create(newFileName))
{
using (GZipStream decompressionStream = new GZipStream(originalFileStream, CompressionMode.Decompress))
{
decompressionStream.CopyTo(decompressedFileStream);
Console.WriteLine("Decompressed: {0}", fileToDecompress.Name);
}
}
}
}
}
}
I don't think you can zip a complete folder using System.IO.Compression, you can only compress files inside the folder. You can use DotNetZip instead. It is a 100% managed code library that can be used in any .NET application - Console, Winforms, WPF, ASP.NET, Sharepoint, Web services apps, and so on.
Download developer's kit package from http://dotnetzip.codeplex.com/Release/ProjectReleases.aspx.
Reference the necessary assemblies, including the DotNetZip DLL, in your application and do as follows:
string[] MainDirs = Directory.GetDirectories(@"c:\users\public\reports");
for (int i = 0; i < MainDirs.Length; i++)
{
using (ZipFile zip = new ZipFile())
{
zip.UseUnicodeAsNecessary = true;
zip.AddDirectory(MainDirs[i]);
zip.CompressionLevel = Ionic.Zlib.CompressionLevel.BestCompression;
zip.Comment = "This zip was created at " + System.DateTime.Now.ToString("G");
zip.Save(string.Format("test{0}.zip", i));
}
}
Hope this helps,
Thanks
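For completeness, and as the ZipFile example further down this page shows, .NET Framework 4.5 and later can also zip a whole folder without a third-party library via ZipFile.CreateFromDirectory (add a reference to System.IO.Compression.FileSystem):
using System.IO.Compression;

// Zips the folder and everything in it into a single .zip (paths reused from the MSDN sample above).
ZipFile.CreateFromDirectory(@"c:\users\public\reports", @"c:\users\public\reports.zip");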

Create a file then create a zip and move it to another directory

I use this simple code for log files.
private string LogFile
{
get
{
if (String.IsNullOrEmpty(this.LogFile1))
{
string fn = "\\log.txt";
int count = 0;
while (File.Exists(fn))
{
fn = fn + "(" + count++ + ").txt";
}
this.LogFile1 = fn;
}
return this.LogFile1;
}
}
How can I move every log file into another directory (folder) and archive it as a .zip?
This will run once per day, so I will have one log file per day.
File moving:
public static void Move()
{
string path = "";
string path2 = "";
try
{
if (!File.Exists(path))
{
using (FileStream fs = File.Create(path)) { }
}
if (File.Exists(path2))
File.Delete(path2);
File.Move(path, path2);
}
catch (Exception e)
{
Console.WriteLine("The process failed: {0}", e.ToString());
}
}
To move files, you can use the static Move method of the File class. And for zipping, you can look at the GZipStream or ZipArchive class.
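For example, a minimal sketch with hypothetical paths (assuming System.IO, System.IO.Compression and a reference to System.IO.Compression.FileSystem) that zips the day's log into an archive folder and removes the original:
string logFile = @"c:\logs\log.txt";    // hypothetical log location
string archiveDir = @"c:\logs\archive"; // hypothetical archive folder
Directory.CreateDirectory(archiveDir);
string archiveZip = Path.Combine(archiveDir, DateTime.Now.ToString("yyyy-MM-dd") + ".zip");

using (ZipArchive archive = ZipFile.Open(archiveZip, ZipArchiveMode.Create))
{
    // stores log.txt inside the zip under its own file name
    archive.CreateEntryFromFile(logFile, Path.GetFileName(logFile));
}
File.Delete(logFile); // the zipped copy in the archive folder now replaces the plain log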
If you want built-in Windows .zip handling, then check this out:
https://msdn.microsoft.com/en-us/library/system.io.compression.zipfile(v=vs.110).aspx
using System;
using System.IO;
using System.IO.Compression;
namespace ConsoleApplication
{
class Program
{
static void Main(string[] args)
{
string startPath = @"c:\example\start";
string zipPath = @"c:\example\result.zip";
string extractPath = @"c:\example\extract";
ZipFile.CreateFromDirectory(startPath, zipPath);
ZipFile.ExtractToDirectory(zipPath, extractPath);
}
}
}
// for moving
File.Move(SourceFile, DestinationFile); // store in dateTime directory to move file.
//method for zip file
private static void CompressFile(string path)
{
using (FileStream sourceFile = File.OpenRead(path))
using (FileStream destinationFile = File.Create(path + ".gz"))
using (GZipStream output = new GZipStream(destinationFile, CompressionMode.Compress))
{
Console.WriteLine("Compressing {0} to {1}.", sourceFile.Name, destinationFile.Name);
// CopyTo streams the data in chunks, so the whole file is never held in a single byte array
sourceFile.CopyTo(output);
}
}

How to decompress .bz2 file in C#?

I am developing a WPF application and using SharpZipLib to compress and decompress files. I can easily decompress .zip files using the following code:
public static void UnZip(string SrcFile, string DstFile, string safeFileName, int bufferSize)
{
//ICSharpCode.SharpZipLib.Zip.UseZip64.Off;
FileStream fileStreamIn = new FileStream(SrcFile, FileMode.Open, FileAccess.Read);
ZipInputStream zipInStream = new ZipInputStream(fileStreamIn);
string rootDirectory = string.Empty;
if (safeFileName.Contains(".zip"))
{
rootDirectory = safeFileName.Replace(".zip", string.Empty);
}
else
{
rootDirectory = safeFileName;
}
Directory.CreateDirectory(App.ApplicationPath + rootDirectory);
while (true)
{
ZipEntry entry = zipInStream.GetNextEntry();
if (entry == null)
break;
if (entry.Name.Contains("/"))
{
string[] folders = entry.Name.Split('/');
string lastElement = folders[folders.Length - 1];
var folderList = new List<string>(folders);
folderList.RemoveAt(folders.Length - 1);
folders = folderList.ToArray();
string folderPath = "";
foreach (string str in folders)
{
folderPath = folderPath + "/" + str;
if (!Directory.Exists(App.ApplicationPath + rootDirectory + "/" + folderPath))
{
Directory.CreateDirectory(App.ApplicationPath + rootDirectory + "/" + folderPath);
}
}
if (!string.IsNullOrEmpty(lastElement))
{
folderPath = folderPath + "/" + lastElement;
WriteToFile(DstFile + rootDirectory + @"\" + folderPath, bufferSize, zipInStream, rootDirectory, entry);
}
}
else
{
WriteToFile(DstFile + rootDirectory + @"\" + entry.Name, bufferSize, zipInStream, rootDirectory, entry);
}
}
zipInStream.Close();
fileStreamIn.Close();
}
private static void WriteToFile(string DstFile, int bufferSize, ZipInputStream zipInStream, string rootDirectory, ZipEntry entry)
{
FileStream fileStreamOut = new FileStream(DstFile, FileMode.OpenOrCreate, FileAccess.Write);
int size;
byte[] buffer = new byte[bufferSize];
do
{
size = zipInStream.Read(buffer, 0, buffer.Length);
fileStreamOut.Write(buffer, 0, size);
} while (size > 0);
fileStreamOut.Close();
}
But the same code does not work with .bz2 files. It throws an error at this line:
ZipEntry entry = zipInStream.GetNextEntry();
The error is "Wrong Local header signature: 0x26594131". How should I decompress a .bz2 file? Can you please provide any code or a link that would help me resolve this issue?
While you use a ZipInputStream for .zip files, you should use a BZip2InputStream for .bz2 files (and GZipInputStream for .gz files etc.).
Unlike zip (and RAR and tar), bz2 and gzip are plain byte-stream compressors. They have no concept of a container format like the aforementioned, which is why it fails on GetNextEntry. (In other words, a bz2 or gzip file holds at most one entry.)
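So instead of ZipInputStream and GetNextEntry, just copy the single decompressed stream out. A minimal sketch using SharpZipLib's BZip2InputStream (the method and file names are illustrative):
using System.IO;
using ICSharpCode.SharpZipLib.BZip2;

public static void DecompressBz2(string srcFile, string dstFile)
{
    using (FileStream input = File.OpenRead(srcFile))
    using (BZip2InputStream bz2 = new BZip2InputStream(input))
    using (FileStream output = File.Create(dstFile))
    {
        // one .bz2 file = one compressed stream, so a straight copy is all that is needed
        bz2.CopyTo(output);
    }
}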
