How to attach files to a zip using the DotNetZip library in C#

I am using the DotNetZip library to create a zip file. My files are saved in a database; I read each row, load the file data into memory, and save the memory stream into the zip file. The code below works fine.
using System.IO;
using Ionic.Zip;

private void button2_Click(object sender, EventArgs e)
{
    using (SqlConnection sqlConn = new SqlConnection(@"Data Source=BBATRIDIP\SQLSERVER2008R2;Initial Catalog=test;Integrated Security=True"))
    {
        string query = "SELECT Name, ContentType, Data FROM [TestTable]";
        SqlCommand cmd = new SqlCommand(query, sqlConn);
        cmd.Connection.Open();
        System.IO.MemoryStream memStream = null;
        ZipFile zip = new ZipFile();
        // Split the zip into multiple segment files of 1 MB each.
        zip.MaxOutputSegmentSize = 1024 * 1024;
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                byte[] data = (byte[])reader["Data"];
                memStream = new System.IO.MemoryStream(data);
                string strFile = reader["Name"].ToString() + "\\" + reader["ContentType"].ToString();
                ZipEntry ze = zip.AddEntry(strFile, memStream);
            }
        }
        zip.Save(@"e:\MyCustomZip.zip");
        memStream.Dispose();
        MessageBox.Show("Job Done");
        // You can also save the zip to a memory stream; Save has an overload
        // that writes to a Stream instead of a file on disk.
    }
}
But if I change the code a bit, a corrupted zip file is created. The problem occurs if I move this line: zip.Save(@"e:\MyCustomZip.zip");
using (SqlConnection sqlConn = new SqlConnection(@"Data Source=BBATRIDIP\SQLSERVER2008R2;Initial Catalog=test;Integrated Security=True"))
{
    string query = "SELECT Name, ContentType, Data FROM [TestTable]";
    SqlCommand cmd = new SqlCommand(query, sqlConn);
    cmd.Connection.Open();
    System.IO.MemoryStream memStream = null;
    ZipFile zip = new ZipFile();
    // Split the zip into multiple segment files of 1 MB each.
    zip.MaxOutputSegmentSize = 1024 * 1024;
    zip.Save(@"e:\MyCustomZip.zip"); // moved before the loop - this produces the corrupted zip
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            byte[] data = (byte[])reader["Data"];
            memStream = new System.IO.MemoryStream(data);
            string strFile = reader["Name"].ToString() + "\\" + reader["ContentType"].ToString();
            ZipEntry ze = zip.AddEntry(strFile, memStream);
        }
    }
    memStream.Dispose();
    MessageBox.Show("Job Done");
}
I want the zip to be created at zero KB initially and to grow gradually as I add streams to it in the loop. You can see in the code above where I tried to do that, but I had no success. What am I doing wrong?
Another issue
DotNetZip's compression ratio doesn't seem good. I zipped a 152 KB doc file with DotNetZip, and the resulting zip file was 136 KB. Is there any tweak that produces a smaller zip file? Please share the knowledge. Thanks.

So, looking at the documentation for ZipFile.Save, we see the following:
Use this when creating a new zip file, or when updating a zip archive.
So it seems that you need to call this repeatedly to get the "update" behaviour you are looking for. As such, you just need to move your Save into the loop:
//...
ZipFile zip = new ZipFile();
zip.MaxOutputSegmentSize = 1024 * 1024;
using (SqlDataReader reader = cmd.ExecuteReader())
{
    while (reader.Read())
    {
        byte[] data = (byte[])reader["Data"];
        memStream = new System.IO.MemoryStream(data);
        string strFile = reader["Name"].ToString() + "\\" + reader["ContentType"].ToString();
        ZipEntry ze = zip.AddEntry(strFile, memStream);
        zip.Save(@"e:\MyCustomZip.zip");
    }
}
With regards to your second question (never a good idea on SO; try to stick to one problem per question):
Without knowing what you are compressing, it's very difficult to speculate about what kind of compression ratio you might achieve. If the data is already compressed (e.g. image/video formats such as JPEG or H.264), zipping isn't going to give you any gains whatsoever. Your only hint about the nature of the content is that it is a "doc". If you're talking about a modern Word document (.docx), this is already a zip-compressed folder structure, so gains will be minimal. If it's another kind of doc, perhaps it contains embedded (compressed) media? That will also be relatively incompressible.
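If you still want to trade speed for size, DotNetZip does let you choose the compression level per archive. A minimal sketch (CompressionLevel and its values come from the Ionic.Zlib namespace that ships with DotNetZip; the input file name is hypothetical):

using Ionic.Zip;
using Ionic.Zlib;

using (ZipFile zip = new ZipFile())
{
    // BestCompression is the slowest but tightest setting (deflate level 9).
    zip.CompressionLevel = CompressionLevel.BestCompression;
    zip.AddFile(@"e:\MyDocument.doc"); // hypothetical input file
    zip.Save(@"e:\MyCustomZip.zip");
}

Even at the highest level, already-compressed content will barely shrink.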

Related

Stored exe file in SQL is not executable after being loaded from SQL Server

We are trying to store an executable (.exe) file in SQL Server. We get no errors when writing or reading; the file we stored just doesn't work after we download it back.
This is how we store the file:
databaseFilePut(@"FilePath", con, dt.Rows[0].ItemArray[0].ToString(), "ASIL");
And this is the inside of the function:
public static void databaseFilePut(string varFilePath, SqlConnection con, string version, string OFSET)
{
    byte[] file;
    using (var stream = new FileStream(varFilePath, FileMode.Open, FileAccess.Read))
    using (var reader = new BinaryReader(stream))
    {
        file = reader.ReadBytes((int)stream.Length);
    }
    using (var sqlWrite = new SqlCommand("UPDATE ERP_TOOL_UPDATE SET Version=@Version1, ExeDosyasi=@ExeDosyasi1, OFSET=@OFSET1", con))
    {
        sqlWrite.Parameters.AddWithValue("@Version1", (double.Parse(version) + 1).ToString());
        sqlWrite.Parameters.Add("@ExeDosyasi1", SqlDbType.VarBinary, file.Length).Value = file;
        sqlWrite.Parameters.AddWithValue("@OFSET1", "ASIL");
        sqlWrite.ExecuteNonQuery();
    }
}
After saving to the database, this is what the data looks like:
0x4D5A90000300000004000000FFFF0000B8000.... and goes on.
After reading it back, we try to recreate the stored exe with this code:
SqlCommand com = new SqlCommand("SELECT ExeDosyasi FROM ERP_TOOL_UPDATE WHERE OFSET = 'ASIL'", con);
com.CommandType = CommandType.Text;
SqlDataReader reader = com.ExecuteReader();
reader.Read();
byte[] blob;
byte[] blob2;
blob = (byte[])reader[0];
blob2 = System.Text.Encoding.Default.GetBytes(System.Text.Encoding.Unicode.GetString(blob));
using (var fs = new FileStream(@"C:\Users\Bilal\Desktop\ERPAnalizTool.exe", FileMode.Create, FileAccess.Write))
{
    fs.Write(blob2, 0, blob2.Length);
    fs.Flush();
}
We are not getting any errors; it saves the file. The problem is that the file comes out slightly smaller, and when we try to run it, it doesn't run - as if it had never been an exe.
Any help would be appreciated. Thank you all.
Your problem is the following line:
blob2 = System.Text.Encoding.Default.GetBytes(System.Text.Encoding.Unicode.GetString(blob));
The blob variable contains the bytes you wrote to the database, which is the content of the file read in the databaseFilePut method. There is no reason at all to convert it to a Unicode string and then to the system default encoding (Windows-1252 on my system). The data is not a string, it is binary. The double conversion will simply produce a mangled byte sequence.
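To see why the round trip is destructive, here is a minimal sketch (using the first bytes of the hex dump shown above) demonstrating that a byte[] does not survive the Unicode/Default conversion pair:

using System.Text;

byte[] original = { 0x4D, 0x5A, 0x90, 0x00 }; // start of the "MZ" executable header
// GetString interprets every 2 bytes as one UTF-16 code unit: U+5A4D, U+0090.
string asText = Encoding.Unicode.GetString(original);
// U+5A4D is a CJK ideograph with no Windows-1252 mapping, so it encodes as '?'.
byte[] roundTripped = Encoding.Default.GetBytes(asText);
// roundTripped now has 2 bytes instead of 4 - which also explains why the
// saved file comes out slightly smaller than the original.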
Simply write the blob variable to disk:
blob = (byte[])reader[0];
File.WriteAllBytes(@"C:\Users\Bilal\Desktop\ERPAnalizTool.exe", blob);

C#: file corrupted after downloading as byte array

I have a project in which I store documents as blobs in a MySQL database, and download them as byte arrays using C#:
public static byte[] GetFile(string fileName)
{
    conn.Open();
    MySqlCommand cmd = conn.CreateCommand();
    cmd.CommandText = ...
    using (MySqlDataReader reader = cmd.ExecuteReader())
    {
        reader.Read();
        if (reader.HasRows)
        {
            return Util.ObjectToByteArray(reader["Content"]);
        }
        ...
    }
    ...
}

public static byte[] ObjectToByteArray(object obj)
{
    BinaryFormatter bf = new BinaryFormatter();
    using (var ms = new MemoryStream())
    {
        bf.Serialize(ms, obj);
        return ms.ToArray();
    }
}
I upload the files like this:
byte[] newFile = File.ReadAllBytes(fileName);
and download like this:
File.WriteAllBytes(path + "\\" + selectedFileName,
    DocumentTable.GetFile(selectedFileName));
but when I download the files, they are corrupted and cannot be opened (Excel files, for example; some other types can be opened). The extension of the downloaded file seems correct, but I get the message that "the file extension or file format are not valid".
The corruption happens in ObjectToByteArray: reader["Content"] already returns a byte[], and passing it through BinaryFormatter.Serialize wraps the raw bytes in a .NET serialization envelope rather than returning them unchanged. I recommend using the GetStream() method of the data reader instead:
content = reader.GetStream(1);
and simply returning the stream's contents to your file writer.
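For example, a minimal sketch of GetFile rewritten along those lines (the SELECT statement is hypothetical, since the original command text is elided above; column 1 is assumed to be Content):

public static byte[] GetFile(string fileName)
{
    conn.Open();
    MySqlCommand cmd = conn.CreateCommand();
    cmd.CommandText = "SELECT Name, Content FROM Documents WHERE Name = @name"; // hypothetical query
    cmd.Parameters.AddWithValue("@name", fileName);
    using (MySqlDataReader reader = cmd.ExecuteReader())
    {
        if (reader.Read())
        {
            using (Stream dbStream = reader.GetStream(1)) // the Content column
            using (var ms = new MemoryStream())
            {
                dbStream.CopyTo(ms);
                return ms.ToArray(); // raw bytes - no BinaryFormatter involved
            }
        }
    }
    return null;
}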

SqlFileStream append new data create new file

I am sending a file to a SqlFileStream in parts, like this:
public void StreamFile(int storageId, Stream stream)
{
    var select = string.Format(
        @"SELECT TOP(1) Content.PathName(),
        GET_FILESTREAM_TRANSACTION_CONTEXT() FROM FileStorage WHERE FileStorageID={0}",
        storageId);
    using (var conn = new SqlConnection(this.ConnectionString))
    {
        conn.Open();
        var sqlTransaction = conn.BeginTransaction(IsolationLevel.ReadCommitted);
        string serverPath;
        byte[] serverTxn;
        using (var cmd = new SqlCommand(select, conn))
        {
            cmd.Transaction = sqlTransaction;
            using (SqlDataReader rdr = cmd.ExecuteReader())
            {
                rdr.Read();
                serverPath = rdr.GetSqlString(0).Value;
                serverTxn = rdr.GetSqlBinary(1).Value;
                rdr.Close();
            }
        }
        this.SaveFile(stream, serverPath, serverTxn);
        sqlTransaction.Commit();
    }
}

private void SaveFile(Stream clientStream, string serverPath, byte[] serverTxn)
{
    const int BlockSize = 512;
    using (var dest = new SqlFileStream(serverPath, serverTxn,
        FileAccess.ReadWrite, FileOptions.SequentialScan, 0))
    {
        var buffer = new byte[BlockSize];
        int bytesRead;
        dest.Seek(dest.Length, SeekOrigin.Begin);
        while ((bytesRead = clientStream.Read(buffer, 0, buffer.Length)) > 0)
        {
            dest.Write(buffer, 0, bytesRead);
        }
    }
    clientStream.Close();
}
I tried uploading a file in 10 parts. When I looked at the folder where the data is held, I saw 10 files: for each additional part, SqlFileStream creates a new file. The last file holds all the data, but the rest just waste space. Is it possible to hold all the data in one file - i.e. to have the last append operation reuse it?
SqlFileStream doesn't support "partial updates". Each time the content changes, a new file is written to disk. These extra files are de-referenced and will eventually be cleaned up, but until then they will affect your backups, logs, etc.
If you want all the data in a single file, you will need to process the entire file in a single stream.
If you don't need to support large files ( > 2 GB) and you expect lots of small partial updates, you might be better off storing the file in a VARBINARY(MAX) column instead.
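If you do go the VARBINARY(MAX) route, T-SQL's .WRITE clause gives you an efficient append, so each incoming part can be written without rewriting the whole value. A minimal sketch (table and column names are assumptions mirroring the question, and the column must have been initialised to a non-NULL value such as 0x, since .WRITE cannot modify a NULL column):

private void AppendChunk(SqlConnection conn, int storageId, byte[] chunk)
{
    // Content.WRITE(@chunk, NULL, 0): a NULL offset appends @chunk
    // to the end of the existing varbinary(max) value.
    const string sql = @"UPDATE FileStorage
                         SET Content.WRITE(@chunk, NULL, 0)
                         WHERE FileStorageID = @id";
    using (var cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.Add("@chunk", SqlDbType.VarBinary, chunk.Length).Value = chunk;
        cmd.Parameters.AddWithValue("@id", storageId);
        cmd.ExecuteNonQuery();
    }
}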

C# Asp.Net Create text file, zip it, and save to Blob - Without writing anything to disk

A complicated one here, well for me anyway :)
Basically, what I would like to achieve is to generate some text, zip the text file within two directories, and then upload it to a MySQL blob field - all without writing anything to disk. I am relatively new to all this, so any pointers are greatly appreciated. Here's what I have sort of put together so far. It obviously crashes and burns, but hopefully it gives a better idea of what I'd like to do. Oh, and I'm currently using DotNetZip :)
public void broadcastItem()
{
    System.IO.MemoryStream ms = new System.IO.MemoryStream();
    System.IO.StreamWriter sw = new System.IO.StreamWriter(ms);
    System.IO.MemoryStream ms2 = new System.IO.MemoryStream();
    sw.Write("Some Text generated and placed in a file");
    sw.Close(); // Text file now created
    using (ZipFile zip = new ZipFile())
    {
        zip.AddDirectory(@"Directory1\Directory2");
        // Zipping within two directories
        ZipEntry e = zip.AddEntry("Test", ms);
        e.Comment = "The content for entry in the zip file was obtained from a stream";
        zip.Comment = "This zip was created at " + System.DateTime.Now.ToString("G");
        zip.Save(ms2); // Trying to save to memory stream
    }
    try
    {
        OdbcConnection Server = new OdbcConnection("DSN=CentralServer");
        Server.Open();
        OdbcCommand DbCommand = Server.CreateCommand();
        DbCommand.CommandText = "INSERT INTO blobtest(blobfield) VALUES(?)";
        OdbcParameter param = new OdbcParameter("@file", SqlDbType.Binary);
        param.Value = ms2;
        DbCommand.Parameters.Add(param);
        DbCommand.ExecuteNonQuery();
        // Trying to save zip file from memory stream to blob field
    }
    catch (Exception ex)
    {
        throw ex;
    }
}
*** EDIT - Moving Closer ***
I can now create a text file and zip it in memory. The problem is that the text doesn't appear in the file - i.e. it's blank - but I now have the file within two directories :)
Amended code below:
public void test3()
{
    MemoryStream ms = new MemoryStream();
    StreamWriter sw = new StreamWriter(ms);
    sw.WriteLine("HELLO!");
    sw.WriteLine("I WANT TO SAVE THIS FILE AS A .TXT FILE WITHIN TWO FOLDERS");
    ms.Position = 0;
    // create the zip archive from the text stored in memory stream ms
    MemoryStream outputMS = new System.IO.MemoryStream();
    ZipOutputStream zipOutput = new ZipOutputStream(outputMS);
    ZipEntry ze = new ZipEntry(@"Directory1\Directory2\example.txt");
    zipOutput.PutNextEntry(ze);
    zipOutput.Write(ms.ToArray(), 0, Convert.ToInt32(ms.Length));
    zipOutput.Finish();
    zipOutput.Close();
    byte[] byteArrayOut = outputMS.ToArray();
    outputMS.Close();
    ms.Close();
    try
    {
        OdbcConnection Server = new OdbcConnection("DSN=CentralServer");
        Server.Open();
        OdbcCommand DbCommand = Server.CreateCommand();
        DbCommand.CommandText = "INSERT INTO blobtest(blobfield) VALUES(?)";
        OdbcParameter param = new OdbcParameter("@file", SqlDbType.Binary);
        param.Value = byteArrayOut;
        DbCommand.Parameters.Add(param);
        DbCommand.ExecuteNonQuery();
        Response.Write(byteArrayOut.ToString());
    }
    catch (Exception ex)
    {
        Response.Write(ex.ToString());
    }
}
You can use the open-source C# compression library SharpZipLib to create an in-memory zip file, as explained here: In Memory compression using SharpZipLib
// zip XElement xdoc and add to request's MTOM value
using (MemoryStream ms = new System.IO.MemoryStream())
{
    xdoc.Save(ms);
    ms.Position = 0;
    // create the ZipEntry archive from the xml doc stored in memory stream ms
    using (MemoryStream outputMS = new System.IO.MemoryStream())
    {
        using (ZipOutputStream zipOutput = new ZipOutputStream(outputMS))
        {
            ZipEntry ze = new ZipEntry("example.xml");
            zipOutput.PutNextEntry(ze);
            zipOutput.Write(ms.ToArray(), 0, Convert.ToInt32(ms.Length));
            zipOutput.Finish();
            zipOutput.Close();
            // add the zip archive to the request
            SubmissionReceiptListAttachmentMTOM = new base64Binary();
            SubmissionReceiptListAttachmentMTOM.Value = outputMS.ToArray();
        }
        outputMS.Close();
    }
    ms.Close();
}
Now you just need to convert the memory stream to a byte array and save it in the database.
Basically, this is the whole compilation of what I wanted to achieve, so I thought I'd put this together for anyone who needs something similar down the line - this is a working piece of code:
public void diskLess()
{
    MemoryStream ms = new MemoryStream();
    StreamWriter sw = new StreamWriter(ms);
    sw.WriteLine("HELLO!");
    sw.WriteLine("I WANT TO SAVE THIS FILE AS A .TXT FILE WITHIN TWO FOLDERS");
    sw.Flush(); // This is required or you get a blank text file :)
    ms.Position = 0;
    // create the zip archive from the txt file in memory stream ms
    MemoryStream outputMS = new System.IO.MemoryStream();
    ZipOutputStream zipOutput = new ZipOutputStream(outputMS);
    ZipEntry ze = new ZipEntry(@"dir1\dir2\whatever.txt");
    zipOutput.PutNextEntry(ze);
    zipOutput.Write(ms.ToArray(), 0, Convert.ToInt32(ms.Length));
    zipOutput.Finish();
    zipOutput.Close();
    byte[] byteArrayOut = outputMS.ToArray();
    outputMS.Close();
    ms.Close();
    try
    {
        OdbcConnection Server = new OdbcConnection("DSN=CentralServer");
        Server.Open();
        OdbcCommand DbCommand = Server.CreateCommand();
        DbCommand.CommandText = "INSERT INTO blobtest(blobfield) VALUES(?)";
        OdbcParameter param = new OdbcParameter("@file", SqlDbType.Binary);
        param.Value = byteArrayOut;
        DbCommand.Parameters.Add(param);
        DbCommand.ExecuteNonQuery();
        Response.Write(byteArrayOut.ToString());
    }
    catch (Exception ex)
    {
        Response.Write(ex.ToString());
    }
}
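For reference, the same disk-less round trip is possible with DotNetZip, the library the question started with. A minimal sketch, assuming the same blobtest table as above:

public byte[] diskLessDotNetZip()
{
    using (var zip = new Ionic.Zip.ZipFile())
    using (var ms = new MemoryStream())
    {
        // AddEntry has an overload taking the entry content as a string,
        // so no intermediate StreamWriter (or Flush call) is needed.
        zip.AddEntry(@"dir1\dir2\whatever.txt", "HELLO! A .TXT FILE WITHIN TWO FOLDERS");
        zip.Save(ms); // save the archive straight into the memory stream
        return ms.ToArray(); // ready to use as the OdbcParameter value
    }
}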

How do I load text files greater than the 64 kb buffersize limit?

I'm trying to load text files (.aspx, .cs, .html, etc.) into a SQL Server 2008 database.
So far I'm able to load all files that are smaller than 64 KB. I have two questions: how do I get around the 64 KB limit, and is the method I'm using the best way to do this?
Thanks for the help.
Database:
file_length int,
file_path varchar(250),
file_string varchar(MAX)
Code:
private static void Load_Files()
{
string source = HttpContext.Current.Server.MapPath("~/website/");
DirectoryInfo di = new DirectoryInfo(source);
FileInfo[] files = di.GetFiles();
foreach (FileInfo f in files)
{
string sourceFile = f.FullName;
FileStream fs_reader = new FileStream(sourceFile, FileMode.Open, FileAccess.Read);
StreamReader reader = new StreamReader(fs_reader);
string content = reader.ReadToEnd();
Int32 file_length = content.Length;
string CS = ConfigurationManager.ConnectionStrings["SQL_CS"].ConnectionString;
SqlConnection SQL_Conn_01 = new SqlConnection(CS);
string SQL_01 = "INSERT INTO Page_File_Store (file_length, file_path, file_string) VALUES (#file_length, #file_path, #file_string)";
SqlCommand SQL_File_Load = new SqlCommand(SQL_01, SQL_Conn_01);
SQL_File_Load.Parameters.Add(new SqlParameter("#file_length", file_length));
SQL_File_Load.Parameters.Add(new SqlParameter("#file_path", sourceFile));
SQL_File_Load.Parameters.Add(new SqlParameter("#file_string", content));
SQL_Conn_01.Open();
SQL_File_Load.ExecuteNonQuery();
SQL_Conn_01.Close();
reader.Close();
}
}
In SQL Server 2008, Microsoft added a new feature called FILESTREAM. It "integrates the SQL Server Database Engine with an NTFS file system by storing varbinary(max) binary large object (BLOB) data as files on the file system". You can read more about FILESTREAM here.
You seem to be doing a lot of extra, unnecessary work - somewhere along the line, something goes wrong.
I tried your scenario with the code here and it works without a hitch - no problems at all, even for text files over 500 KB in size:
// read the whole text of the file in a single operation
// no filestream, memorystream or other messy stuff needed!
string content = File.ReadAllText(sourceFile);
Int32 file_length = content.Length;

// best practice: always put SqlConnection and SqlCommand into using() {..} blocks!
using (SqlConnection _con = new SqlConnection(CS))
{
    string _query =
        "INSERT INTO Page_File_Store (file_length, file_path, file_string) " +
        "VALUES (@file_length, @file_path, @file_string)";
    using (SqlCommand _cmd = new SqlCommand(_query, _con))
    {
        // just add the three parameters with their values
        // ADO.NET figures out the rest (datatypes etc.) by itself!
        _cmd.Parameters.AddWithValue("@file_length", file_length);
        _cmd.Parameters.AddWithValue("@file_path", sourceFile);
        _cmd.Parameters.AddWithValue("@file_string", content);
        _con.Open();
        _cmd.ExecuteNonQuery();
        _con.Close();
    }
}
That should definitely work - try it!
Marc
Does it solve your problem if you replace this:
SQL_File_Load.Parameters.Add(new SqlParameter("#file_string", content));
With this:
SqlParameter contentParameter = new SqlParameter("@file_string", SqlDbType.NVarChar, -1);
contentParameter.Value = content;
SQL_File_Load.Parameters.Add(contentParameter);
The -1 is interpreted as (MAX).
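Since the file_string column is declared varchar(MAX), a variant worth trying is SqlDbType.VarChar with -1, which matches the column type exactly and avoids an implicit nvarchar-to-varchar conversion on the server:

SqlParameter contentParameter = new SqlParameter("@file_string", SqlDbType.VarChar, -1);
contentParameter.Value = content;
SQL_File_Load.Parameters.Add(contentParameter);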
