Corrupt file when writing to Response - C#

I am trying to write a file to the output response stream using ASP.NET. When I download the file, I get an "Errors were found opening the .zip file. You can extract files from this archive, but other programs may not be able to open it. Do you want to try fixing the problems?" error. I am, however, able to cancel this and open the zip file correctly.
What could I be doing wrong?
ASP.NET/C# code:
var pathToFile = @"c:\abc.zip";
string fileName = Path.GetFileName(pathToFile);
byte[] buffer;
using (var fileStream = new FileStream(pathToFile, FileMode.Open))
{
    buffer = new byte[fileStream.Length];
    fileStream.Read(buffer, 0, (int)fileStream.Length);
}
Response.Clear();
Response.Buffer = true;
Response.AppendHeader("content-disposition", string.Format("attachment; filename={0}.zip", fileName));
Response.ContentType = "application/zip";
Response.AppendHeader("content-length", buffer.Length.ToString());
Response.WriteFile(pathToFile);

If you want to read the file into a buffer and write it to the user yourself, do not use Response.WriteFile.
Change to Response.OutputStream.Write and modify your code like this:
var pathToFile = @"c:\abc.zip";
string fileName = Path.GetFileName(pathToFile);
byte[] buffer = new byte[0];
using (var fileStream = new FileStream(pathToFile, FileMode.Open, FileAccess.Read))
{
    buffer = new byte[fileStream.Length];
    fileStream.Read(buffer, 0, buffer.Length);
}
Response.Clear();
Response.AppendHeader("content-disposition", string.Format("attachment; filename={0}", fileName));
Response.ContentType = "application/zip";
Response.OutputStream.Write(buffer, 0, buffer.Length);
Response.Flush();
Response.End();
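If you do not need the file's bytes in memory at all, another option (a minimal sketch, assuming the same pathToFile and fileName as above) is to let ASP.NET stream the file directly with Response.TransmitFile:
Response.Clear();
Response.ContentType = "application/zip";
Response.AppendHeader("content-disposition", string.Format("attachment; filename={0}", fileName));
// TransmitFile streams the file from disk without buffering it in memory first.
Response.TransmitFile(pathToFile);
Response.Flush();
Response.End();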

Related

C# SFTP - downloaded file corrupted and showing a different size compared with the file on the SFTP server

I am trying to download .zip and .xlsx files from an SFTP server. After downloading, when I try to open the zip file, it says the compressed zip file is invalid, and the file size is also larger than the file on the SFTP server (the remote file size).
I am using the code below:
string sFTPHost = "sftphost";
string sFTPDirectory = "file.zip";
string sFTPUser = "username";
string sFTPPassword = "pwd";
string sFTPPort = "22";
ConnectionInfo ConnNfo = new ConnectionInfo(sFTPHost, Convert.ToInt32(sFTPPort), sFTPUser,
    new AuthenticationMethod[]{
        new PasswordAuthenticationMethod(sFTPUser, sFTPPassword),
    }
);
using (var sftp = new SftpClient(ConnNfo))
{
    sftp.Connect();
    MemoryStream ms = new MemoryStream();
    sftp.DownloadFile(sFTPDirectory, ms);
    byte[] feedData = ms.GetBuffer();
    var response = HttpContext.Current.Response;
    response.AddHeader("Content-Disposition", "attachment; filename=filename.zip");
    response.AddHeader("Content-Length", feedData.Length.ToString());
    response.ContentType = "application/octet-stream";
    response.BinaryWrite(feedData);
    sftp.Disconnect();
}
What could be the issue?
MemoryStream.GetBuffer returns the underlying array of the stream, which can (and usually will) contain allocated but unused bytes. The length of the returned buffer matches the stream's current Capacity, which will most likely be larger than the stream's current Length.
From the documentation:
Note that the buffer contains allocated bytes which might be unused.
For example, if the string "test" is written into
the MemoryStream object, the length of the buffer returned
from GetBuffer is 256, not 4, with 252 bytes unused.
You will need to use ToArray instead. Note however that this creates a new array and copies the data into it.
byte[] feedData = ms.ToArray();
var response = HttpContext.Current.Response;
response.AddHeader("Content-Disposition", "attachment; filename=filename.zip");
response.AddHeader("Content-Length", feedData.Length.ToString());
response.ContentType = "application/octet-stream";
response.BinaryWrite(feedData);
Alternatively you should be able to copy from one stream to the other:
var response = HttpContext.Current.Response;
response.AddHeader("Content-Disposition", "attachment; filename=filename.zip");
response.AddHeader("Content-Length", ms.Length.ToString());
response.ContentType = "application/octet-stream";
// rewind stream and copy to response
ms.Position = 0;
ms.CopyTo(response.OutputStream);
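To see the difference, a small illustrative sketch (not from the original answer; the exact GetBuffer length depends on how the stream's capacity grows, 256 being the typical first allocation):
using (var demo = new MemoryStream())
{
    byte[] data = System.Text.Encoding.UTF8.GetBytes("test");
    demo.Write(data, 0, data.Length);
    Console.WriteLine(demo.Length);             // 4 - bytes actually written
    Console.WriteLine(demo.ToArray().Length);   // 4 - ToArray copies exactly Length bytes
    Console.WriteLine(demo.GetBuffer().Length); // typically 256 - the whole internal buffer
}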

Zip files created by DotNetZip using ASP.NET sometimes cause a network error

I'm debugging a rather odd situation involving DotNetZip and ASP.NET. Long story short, the zip files created by the code are downloaded reliably by Firefox, but most other browsers intermittently return a network error. I've examined the code, and it reads about as generically as anything that involves DotNetZip.
Any clues?
Thanks!
EDIT: Here's the complete method. As I mentioned, it's about as generic as it gets:
protected void btnDownloadFolders_Click(object sender, EventArgs e)
{
//Current File path
var diRoot = new DirectoryInfo(_currentDirectoryPath);
var allFiles = Directory.GetFiles(diRoot.FullName, "*.*", SearchOption.AllDirectories);
Response.Clear();
Response.BufferOutput = false;
var archiveName = String.Format("{0}-{1}.zip", diRoot.Name, DateTime.Now.ToString("yyyy-MM-dd HHmmss"));
Response.ContentType = "application/zip";
Response.AddHeader("content-disposition", "inline; filename=\"" + archiveName + "\"");
using (var zip = new ZipFile())
{
foreach (var strFile in allFiles)
{
var strFileName = Path.GetFileName(strFile);
zip.AddFile(strFile,
strFile.Replace("\\" + strFileName, string.Empty).Replace(diRoot.FullName, string.Empty));
}
zip.Save(Response.OutputStream);
}
Response.Close();
}
It could be because you are not sending the content-length. I've seen errors occur when sending files to the browser without it. So create the zip file in a MemoryStream, then save the stream to a byte array so you can send the length in the response as well. Although I can't say for sure that it will fix your specific problem.
byte[] bin;
using (MemoryStream ms = new MemoryStream())
{
using (var zip = new ZipFile())
{
foreach (var strFile in allFiles)
{
var strFileName = Path.GetFileName(strFile);
zip.AddFile(strFile, strFile.Replace("\\" + strFileName, string.Empty).Replace(diRoot.FullName, string.Empty));
}
//save the zip into the memorystream
zip.Save(ms);
}
//save the stream into the byte array
bin = ms.ToArray();
}
//clear the buffer stream
Response.ClearHeaders();
Response.Clear();
Response.Buffer = true;
//set the correct contenttype
Response.ContentType = "application/zip";
//set the filename for the zip file package
Response.AddHeader("content-disposition", "attachment; filename=\"" + archiveName + "\"");
//set the correct length of the data being send
Response.AddHeader("content-length", bin.Length.ToString());
//send the byte array to the browser
Response.OutputStream.Write(bin, 0, bin.Length);
//cleanup
Response.Flush();
HttpContext.Current.ApplicationInstance.CompleteRequest();
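If you want to avoid the extra copy that ToArray makes, a variation of the same idea (a sketch, assuming .NET 4 or later for Stream.CopyTo) takes the length from the MemoryStream and copies it straight to the output while the stream is still in scope:
using (MemoryStream ms = new MemoryStream())
{
    using (var zip = new ZipFile())
    {
        foreach (var strFile in allFiles)
        {
            var strFileName = Path.GetFileName(strFile);
            zip.AddFile(strFile, strFile.Replace("\\" + strFileName, string.Empty).Replace(diRoot.FullName, string.Empty));
        }
        //save the zip into the memorystream
        zip.Save(ms);
    }
    Response.ClearHeaders();
    Response.Clear();
    Response.ContentType = "application/zip";
    Response.AddHeader("content-disposition", "attachment; filename=\"" + archiveName + "\"");
    //the length comes from the stream itself, no intermediate byte array needed
    Response.AddHeader("content-length", ms.Length.ToString());
    //rewind and copy directly to the response
    ms.Position = 0;
    ms.CopyTo(Response.OutputStream);
}
Response.Flush();
HttpContext.Current.ApplicationInstance.CompleteRequest();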

Downloading files in ASP.NET using C#

How do I download a file in ASP.NET?
Here is what I did to upload it:
I upload the file to the website and save the URL to it in a database like this:
string CVPath = null;
if (uploadfiles.HasFile)
{
string file = uploadfiles.FileName;
uploadfiles.PostedFile.SaveAs(Server.MapPath(".") + "//CV//" + file);
CVPath = "~//ProfileImages//" + file;
FileName.InnerText = file;
}
else
CVPath = "";
and then I save the "CVPath" in a database
To download a file, you first need to get its contents into memory. For example, to send a string back as a text file:
MemoryStream ms = new MemoryStream();
TextWriter tw = new StreamWriter(ms);
tw.WriteLine("YourString");
tw.Flush();
byte[] bytes = ms.ToArray();
ms.Close();
Response.Clear();
Response.ContentType = "application/force-download";
Response.AddHeader("content-disposition", "attachment; filename=file.txt");
Response.BinaryWrite(bytes);
Response.End();
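For the scenario in the question, where a path like CVPath is saved to the database, a minimal sketch for sending that stored file back (cvPath is a hypothetical variable holding the value read from the database, assumed to be an application-relative path such as "~/CV/somefile.pdf"):
string physicalPath = Server.MapPath(cvPath);
string downloadName = Path.GetFileName(physicalPath);
Response.Clear();
Response.ContentType = "application/octet-stream";
Response.AddHeader("content-disposition", "attachment; filename=\"" + downloadName + "\"");
// stream the file from disk instead of loading it into memory first
Response.TransmitFile(physicalPath);
Response.End();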

MemoryStream contains bad XML markup when read

What is my issue here? When I write the stream back out to the web and open the file, it contains some of the content, but it is malformed and some of it is missing.
Am I experiencing loss of data due to a logic error?
Note: the readStream and writeStream below mock up what a service will be filling in. I will be receiving a stream to read from the service, and I'll need to write that stream back out.
MemoryStream writeStream = new MemoryStream();
byte[] buffer = new byte[256];
OrderDocument doc = new OrderDocument();
doc.Format = "xml";
doc.DocumentId = "5555555";
doc.Aid = "ZZ";
doc.PrimaryServerPort = "PORT";
MemoryStream readStream = new MemoryStream(doc.GetDocument());
while (readStream != null && readStream.Read(buffer, 0, buffer.Length) > 0)
{
writeStream.Write(buffer, 0, buffer.Length);
}
writeStream.Flush();
writeStream.Position = 0;
Response.Buffer = true;
Response.Clear();
Response.ClearContent();
Response.ClearHeaders();
Response.ContentType = "text/xml";
Response.ClearHeaders();
Response.AddHeader("Content-Disposition", string.Format("attachment; filename={0}.xml", doc.DocumentId));
Response.AddHeader("Content-Length", writeStream.Length.ToString());
Response.BinaryWrite(writeStream.ToArray());
Response.End();
Am I experiencing loss of data due to a logic error?
Yes, probably: the copy loop always writes buffer.Length bytes, even when Read returned fewer, so the output is padded with stale bytes from earlier iterations. You may also try simplifying your code a little; I don't really see the need for multiple memory streams here:
OrderDocument doc = new OrderDocument();
doc.Format = "xml";
doc.DocumentId = "5555555";
doc.Aid = "ZZ";
doc.PrimaryServerPort = "PORT";
byte[] buffer = doc.GetDocument();
Response.Buffer = true;
Response.Clear();
Response.ClearHeaders();
Response.ContentType = "text/xml";
Response.AddHeader("Content-Disposition", string.Format("attachment; filename={0}.xml", doc.DocumentId));
Response.OutputStream.Write(buffer, 0, buffer.Length);
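If you do end up having to copy from the service's stream, a corrected version of the original loop (a sketch; the key change is writing only the number of bytes that Read actually returned) would be:
MemoryStream writeStream = new MemoryStream();
byte[] buffer = new byte[256];
int bytesRead;
while ((bytesRead = readStream.Read(buffer, 0, buffer.Length)) > 0)
{
    // write only the bytes that were read, not the whole buffer
    writeStream.Write(buffer, 0, bytesRead);
}
// or, on .NET 4 and later, simply: readStream.CopyTo(writeStream);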

Create zip from multiple files in memory

I am trying to use SharpZipLib to generate a zip file and let the client download it.
Currently the zip file is created and available on the client machine, but the issue is that download.zip is blank. The files in the specified folder are not present in the zip.
Below is the code I tried.
System.Web.HttpResponse response = System.Web.HttpContext.Current.Response;
ICSharpCode.SharpZipLib.Checksums.Crc32 crc = new ICSharpCode.SharpZipLib.Checksums.Crc32();
//stream directly to client.
ICSharpCode.SharpZipLib.Zip.ZipOutputStream output = new ICSharpCode.SharpZipLib.Zip.ZipOutputStream(response.OutputStream);
output.SetLevel(9);
string[] files = Directory.GetFiles("D:/newfolder/");
for (int i = 0; i < files.Length; i++)
{
ICSharpCode.SharpZipLib.Zip.ZipEntry entry = new ICSharpCode.SharpZipLib.Zip.ZipEntry(files[i].ToString());
entry.DateTime = DateTime.Now;
System.IO.FileStream fs = new System.IO.FileStream(files[i].ToString(), FileMode.Open);
byte[] buffer = new byte[fs.Length];
fs.Read(buffer, 0, buffer.Length);
entry.Size = fs.Length;
fs.Close();
crc.Reset();
crc.Update(buffer);
entry.Crc = crc.Value;
output.PutNextEntry(entry);
output.Write(buffer, 0, buffer.Length);
}
output.Finish();
output.Close();
response.Clear();
response.ContentType = "D:/Work Area/";
response.AddHeader("Content-Disposition", "attachment; filename=" + "download.zip");
response.End();
Can anyone tell me what the issue is? Why are the files not in the .zip?
After browsing the SharpDevelop wiki, I found the following, which may help you resolve the problem:
using ICSharpCode.SharpZipLib.Zip;
// This will accumulate each of the files named in the fileList into a zip file,
// and stream it to the browser.
// This approach writes directly to the Response OutputStream.
// The browser starts to receive data immediately which should avoid timeout problems.
// This also avoids an intermediate memorystream, saving memory on large files.
//
private void DownloadZipToBrowser(List<string> zipFileList)
{
Response.ContentType = "application/zip";
// If the browser is receiving a mangled zipfile, IIS Compression may cause this problem. Some members have found that
// Response.ContentType = "application/octet-stream" has solved this. May be specific to Internet Explorer.
Response.AppendHeader("content-disposition", "attachment; filename=\"Download.zip\"");
Response.CacheControl = "Private";
Response.Cache.SetExpires(DateTime.Now.AddMinutes(3)); // or put a timestamp in the filename in the content-disposition
byte[] buffer = new byte[4096];
ZipOutputStream zipOutputStream = new ZipOutputStream(Response.OutputStream);
zipOutputStream.SetLevel(3); //0-9, 9 being the highest level of compression
foreach (string fileName in zipFileList) {
Stream fs = File.OpenRead(fileName); // or any suitable inputstream
ZipEntry entry = new ZipEntry(ZipEntry.CleanName(fileName));
entry.Size = fs.Length;
// Setting the Size provides WinXP built-in extractor compatibility,
// but if not available, you can set zipOutputStream.UseZip64 = UseZip64.Off instead.
zipOutputStream.PutNextEntry(entry);
int count = fs.Read(buffer, 0, buffer.Length);
while (count > 0) {
zipOutputStream.Write(buffer, 0, count);
count = fs.Read(buffer, 0, buffer.Length);
if (!Response.IsClientConnected) {
break;
}
Response.Flush();
}
fs.Close();
}
zipOutputStream.Close();
Response.Flush();
Response.End();
}
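For reference, a hypothetical call site for that helper (the folder name is just the one used in the question):
// build the file list and stream it to the browser as a zip
List<string> zipFileList = new List<string>(Directory.GetFiles(@"D:\newfolder\"));
DownloadZipToBrowser(zipFileList);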
The CRC32 corresponds to each compressed file, so create and compute it per entry inside the loop:
System.Web.HttpResponse response = System.Web.HttpContext.Current.Response;
response.Clear();
response.ContentType = "application/zip";
response.AddHeader("Content-Disposition", "attachment; filename=" + "download.zip");
//stream directly to client.
ICSharpCode.SharpZipLib.Zip.ZipOutputStream output = new ICSharpCode.SharpZipLib.Zip.ZipOutputStream(response.OutputStream);
output.SetLevel(9);
string[] files = Directory.GetFiles("D:/newfolder/");
for (int i = 0; i < files.Length; i++)
{
    //a fresh CRC for each compressed file
    ICSharpCode.SharpZipLib.Checksums.Crc32 crc = new ICSharpCode.SharpZipLib.Checksums.Crc32();
    ICSharpCode.SharpZipLib.Zip.ZipEntry entry = new ICSharpCode.SharpZipLib.Zip.ZipEntry(files[i].ToString());
    entry.DateTime = DateTime.Now;
    System.IO.FileStream fs = new System.IO.FileStream(files[i].ToString(), FileMode.Open);
    byte[] buffer = new byte[fs.Length];
    fs.Read(buffer, 0, buffer.Length);
    entry.Size = fs.Length;
    fs.Close();
    crc.Update(buffer);
    entry.Crc = crc.Value;
    output.PutNextEntry(entry);
    output.Write(buffer, 0, buffer.Length);
}
output.Finish();
output.Close();
response.End();
