I have this code to download files:
WebClient req = new WebClient();
HttpResponse response = HttpContext.Current.Response;
string filePath = file.Path;
response.Clear();
response.ClearContent();
response.ClearHeaders();
response.Buffer = true;
response.AddHeader("Content-Disposition", "attachment; filename=\"" + _fileName + "\"");
response.ContentType = Path.GetExtension(filePath);
byte[] data = req.DownloadData(filePath);
response.BinaryWrite(data);
response.Flush();
response.Close();
Some of the variables, like file.Path and _fileName, come from a list I created to store information from the upload control in ASP.NET WebForms.
This download code works fine in every browser on my computer, but when I deployed it to another machine it doesn't work at all: Firefox downloads the file but I can't open it, and Chrome shows "Failed - Network Error".
On my computer the files are uploaded to the C: drive, but on that machine they go to a file share that behaves like a C: drive. I added logging to check whether the code was resolving the file properly, and this is what I get:
\\10.50.237.18\opt\Liferay_DATA\eforms\20180126\70edd349-8cd1-4518-a55e-d49dcadb3f7c\0eadc797-7817-4e99-b39a-35f179ec1c5d.pdf
From what I understand, the file is being uploaded properly, but when I try to download it, it doesn't work. I know this is a common question, but I have checked several solutions and I can't seem to make it work. Could you please help me understand what the problem is? Maybe the response headers?
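For comparison, here is a minimal sketch of the same download that reads the bytes straight from the share and sends a real MIME type instead of the raw file extension. It assumes the application pool identity has read access to the UNC path, reuses file.Path and _fileName from the question, and uses the System.IO and System.Web namespaces:
// Sketch only: file.Path and _fileName come from the question; the UNC read assumes share permissions.
HttpResponse response = HttpContext.Current.Response;
string filePath = file.Path;   // e.g. \\10.50.237.18\opt\...\0eadc797-....pdf
response.Clear();
response.ClearHeaders();
response.AddHeader("Content-Disposition", "attachment; filename=\"" + _fileName + "\"");
// Path.GetExtension returns ".pdf", which is not a valid Content-Type; map it to a MIME type instead.
response.ContentType = MimeMapping.GetMimeMapping(filePath);
byte[] data = File.ReadAllBytes(filePath);   // no WebClient needed for a local or UNC path
response.AddHeader("Content-Length", data.Length.ToString());
response.BinaryWrite(data);
response.Flush();
HttpContext.Current.ApplicationInstance.CompleteRequest();   // ends the request without the exception Response.End can throw
MimeMapping.GetMimeMapping requires .NET 4.5 or later; on older frameworks a small extension-to-MIME dictionary would be needed instead.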
I am working with an ASP.NET website that is deployed on a remote server using IIS. When a user logs on to access the site, they have an option to download a file. However, the file gets saved to the remote server rather than the user's local computer.
Is there a way to transfer/copy the file that is saved on the remote server to the user's own local computer? Or some way that I can indicate that the file needs to be saved to the local computer instead of the remote server?
Below is the code I am using to download the files (csv and mat). The mat file download is more complicated than a typical file download and requires a NuGet package extension, so there is not much flexibility in how the file is created and downloaded, which is why I think it may be simpler to transfer the file from the server to the local computer after everything is saved.
//matCheck and csvCheck are booleans that allow the user to pick one or both of the file formats to download as.
if (matCheck == true)
{
//MLArray and MatFileWriter are part of the nuget extension I mentioned above
List<MLArray> mlList = new List<MLArray>();
mlList.Add(structure);
new MatFileWriter(new FileStream(downloadsPath + FN + ".mat", FileMode.Create), mlList, false);
}
if (csvCheck == true)
{
//sb is the variable created in earlier code storing the csv data to download
System.IO.File.WriteAllText(downloadsPath + FN + ".csv", sb.ToString());
var response = System.Web.HttpContext.Current.Response;
response.BufferOutput = true;
response.Clear();
response.ClearHeaders();
response.ContentEncoding = Encoding.Unicode;
response.AddHeader("content-disposition", "attachment;filename=" + FN + ".csv");
response.ContentType = "text/plain";
response.Write(sb.ToString());
response.End();
}
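As a hedged sketch of that idea: once the .mat file has been written to downloadsPath on the server, it can be streamed back to the browser the same way the CSV branch writes its text, for example with TransmitFile (downloadsPath and FN are reused from the question, and the FileStream used by MatFileWriter is assumed to have been closed first):
// Sketch: push the generated .mat file to the browser so it is saved on the user's machine,
// not just on the server's disk.
string matPath = downloadsPath + FN + ".mat";
var response = System.Web.HttpContext.Current.Response;
response.Clear();
response.ClearHeaders();
response.ContentType = "application/octet-stream";
response.AddHeader("Content-Disposition", "attachment; filename=\"" + FN + ".mat\"");
response.TransmitFile(matPath);   // streams the file without loading it all into memory
response.End();
Note that a single HTTP response can carry only one attachment, so if both the .csv and the .mat formats are selected, they would have to be zipped together or fetched in separate requests.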
In one of my projects I saw the code below.
string FileName = Request.QueryString["filename"];
System.Web.HttpResponse response = System.Web.HttpContext.Current.Response;
response.ClearContent();
response.Clear();
response.ContentType = "application/octet-stream";
response.AddHeader("Content-Disposition", "attachment; filename=" + FileName + ";");
response.TransmitFile(Server.MapPath("~/" + FileName));
response.Flush();
response.End();
which is used to download a file from the server. But if someone manually changes the filename in the query string (to web.config, for example), it downloads the config file as well.
So please share your knowledge on how to restrict downloads based on file extension.
That is usually done in IIS. But if you want to do it programmatically:
// Extensions that should never be served for download (compared case-insensitively).
string[] forbiddenExtensions = new string[] { ".config", ".abc", ".xml" };
FileInfo fi = new FileInfo(FileName);
if (forbiddenExtensions.Contains(fi.Extension, StringComparer.OrdinalIgnoreCase))
{
    // throw an error, return a 403, or redirect...
}
First of all, you'll have to ask yourself whether you want to write code to let users download files present on the file system. The web server itself is perfectly fine handling file downloads and preventing access to files that shouldn't be shared.
If you're sure you want to do this, for example because you want some code to run before every download, then you should look at creating a handler instead. Then the web server will still first determine permissions and whatnot, in order to prevent malicious users from downloading sensitive files, and when allowed, your code will run before the download.
That being said, you don't want to serve files for download from your web root. Instead create a dedicated directory, say, "Downloads", and host your files for download from there. Then use filename = Path.GetFileName(filename) to prevent the user from navigating outside that directory (using .. and/or \).
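For illustration, a minimal handler along those lines might look like the sketch below; the Downloads folder name, the filename query-string parameter, and the allowed-extension list are assumptions, not something from the original code:
using System;
using System.IO;
using System.Linq;
using System.Web;

// Hypothetical handler: serves files only from ~/Downloads and only with whitelisted extensions.
public class DownloadHandler : IHttpHandler
{
    private static readonly string[] AllowedExtensions = { ".pdf", ".csv", ".docx" };

    public void ProcessRequest(HttpContext context)
    {
        // Path.GetFileName strips any directory part, so "..\web.config" becomes "web.config".
        string fileName = Path.GetFileName(context.Request.QueryString["filename"] ?? string.Empty);
        string extension = Path.GetExtension(fileName);

        if (string.IsNullOrEmpty(fileName) ||
            !AllowedExtensions.Contains(extension, StringComparer.OrdinalIgnoreCase))
        {
            context.Response.StatusCode = 403;
            return;
        }

        string fullPath = context.Server.MapPath("~/Downloads/" + fileName);
        if (!File.Exists(fullPath))
        {
            context.Response.StatusCode = 404;
            return;
        }

        context.Response.ContentType = "application/octet-stream";
        context.Response.AddHeader("Content-Disposition", "attachment; filename=\"" + fileName + "\"");
        context.Response.TransmitFile(fullPath);
    }

    public bool IsReusable
    {
        get { return false; }
    }
}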
The problem here is that when trying to download an Excel file whose contents are HTML markup, the system takes a long time to download it. Sometimes the IE save dialog does not appear immediately after the download is requested.
The flow is like this (this is legacy code, so I know it looks dumb):
Client requests generation of the Excel file -> create the Excel file on the server -> query data from the DB -> populate the Excel file with HTML markup together with the data from the DB -> send the Excel file to the client over HTTP
Now here comes the problem.
The issue does not always happen; it is intermittent.
Sometimes the IE save dialog does not show at all, and sometimes it takes 6-10 minutes before it appears. What is the cause of this?
Here is how the file is sent over HTTP:
Response.AddHeader("Content-Disposition", "attachment; filename=" + file.Name);
Response.AddHeader("Content-Length", file.Length.ToString());
Response.ContentType = "application/vnd.ms-excel";
Response.WriteFile(file.FullName);
Response.End();
It depends on the size of your Excel file; if it is large, it will take time to generate and download. I suggest zipping the file before sending it back to the client over HTTP.
Please try this:
Response.AddHeader("Content-Disposition", "inline; filename=" + file.Name);
Response.AddHeader("Content-Length", file.Length.ToString());
Response.ContentType = "application/vnd.ms-excel";
Response.WriteFile(file.FullName);
Response.End();
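If you do go the zipping route suggested above, a minimal sketch using System.IO.Compression (requires .NET 4.5+ and a reference to System.IO.Compression.FileSystem; the file variable is reused from the question) might look like this:
// Sketch: compress the generated Excel file and send the archive instead of the raw file.
string zipPath = Path.ChangeExtension(file.FullName, ".zip");
if (File.Exists(zipPath))
{
    File.Delete(zipPath);
}
using (ZipArchive archive = ZipFile.Open(zipPath, ZipArchiveMode.Create))
{
    archive.CreateEntryFromFile(file.FullName, file.Name);
}

Response.AddHeader("Content-Disposition", "attachment; filename=" + Path.GetFileName(zipPath));
Response.AddHeader("Content-Length", new FileInfo(zipPath).Length.ToString());
Response.ContentType = "application/zip";
Response.TransmitFile(zipPath);
Response.End();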
I have a web application which lists DOCX files in a directory. When users click a link, the document opens within the browser using the Word plug-in. Is it at all possible to force the browser to open the document in the full Word application instead of the plug-in?
This application runs on a corporate network and I have no access to amend any settings on the client machines; it can only be done from the web server.
They are currently using IE6 and a browser upgrade is not due for several months.
I suspect the answer is that it can't be done.
I had a similar problem. I achieved a solution using a Generic Handler.
Add a new generic handler (GenericHandler.ashx) to your project and put this code in ProcessRequest:
// Look up the file from the query string (or however you identify it, e.g. by name).
string fileId = context.Request.QueryString["fileId"];
FileInfo file = new FileInfo("C:\\" + fileId + ".docx");
context.Response.Clear();
context.Response.AddHeader("Content-Disposition", "attachment; filename=" + file.Name);
context.Response.AddHeader("Content-Length", file.Length.ToString());
context.Response.ContentType = "application/octet-stream";
context.Response.WriteFile(file.FullName);
So if your handler is called MyHandler.ashx, you can link to the file like this:
<a href="MyHandler.ashx?fileId=1">File 1</a>
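For completeness, a sketch of the whole handler class wrapping that snippet might look like this; the class name is hypothetical and the hard-coded C:\ path is carried over from the code above:
using System.IO;
using System.Web;

public class MyHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // In production you would validate fileId (e.g. with Path.GetFileName) before building the path.
        string fileId = context.Request.QueryString["fileId"];
        FileInfo file = new FileInfo("C:\\" + fileId + ".docx");

        context.Response.Clear();
        context.Response.AddHeader("Content-Disposition", "attachment; filename=" + file.Name);
        context.Response.AddHeader("Content-Length", file.Length.ToString());
        // application/octet-stream tells the browser to save the file instead of handing it to the plug-in.
        context.Response.ContentType = "application/octet-stream";
        context.Response.WriteFile(file.FullName);
    }

    public bool IsReusable
    {
        get { return false; }
    }
}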
My code generates a PDF report and posts it to a remote file share server. All of that works, so I know I have proper read/write access to the share.
Now I am attempting to create an "Open PDF" button which will basically just serve that file from the remote share to the user, so they can open and view the PDF without having to go to the file share itself and dig through it.
I've tried a few things, but no matter what I do I get errors.
I've tried this code, which I used to serve the file when it was on my ASP.NET server and which worked fine, but it won't work when I give it a remote file.
Response.Clear();
Response.ContentType = "Application/pdf";
Response.AppendHeader("Content-Disposition", "attachment; filename=Report_Submission_" + SubmissionID + ".pdf");
String filePath = "\\\\Fileserver\\Directory1\\Directory2\\Directory3" + "\\Report_Submission_" + SubmissionID + ".pdf";
Response.TransmitFile(filePath);
Response.End();
Using that, I get the following exception:
0x800a139e - Microsoft JScript runtime error: Sys.WebForms.PageRequestManagerParserErrorException: The message received from the server could not be parsed.
I also tried doing this:
Response.ContentType = "application/pdf";
Response.WriteFile(filePath);
And I get the same exception.
I've even just tried:
Response.Redirect(filePath);
But that doesn't work either.
Is what I'm trying to do possible? Can you serve a file from a remote file share without copying it to the local server first?
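One common cause of that Sys.WebForms.PageRequestManagerParserErrorException is that the download is triggered from a control inside an UpdatePanel, so the binary response corrupts the partial-postback payload. A hedged sketch of the usual workaround, assuming a button named btnOpenPdf (a hypothetical name) sits inside the UpdatePanel, is to register it for a full postback in the page's code-behind:
protected void Page_Load(object sender, EventArgs e)
{
    // Assumption: btnOpenPdf is the control inside an UpdatePanel that starts the download.
    // Registering it as a full-postback control keeps Response.TransmitFile / Response.End
    // from being wrapped in a partial-postback response that the ScriptManager cannot parse.
    ScriptManager scriptManager = ScriptManager.GetCurrent(this.Page);
    if (scriptManager != null)
    {
        scriptManager.RegisterPostBackControl(btnOpenPdf);
    }
}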
Try this:
Response.AddHeader("Content-Type", "application/octet-stream");
Response.AddHeader("Content-Transfer-Encoding","Binary");
Response.AddHeader("Content-disposition", "attachment; filename=\"sample.pdf\"");
Response.WriteFile(HttpRuntime.AppDomainAppPath + @"path\to\file\sample.pdf");
Response.End();
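Since the question involves a UNC path rather than a file under the application root, a hedged variation (assuming the application pool identity can read the share, and reusing SubmissionID and the path from the question) is to read the remote file first, so any share-access failure surfaces as a normal server-side exception:
// Sketch: read the file from the share, then write the bytes to the response.
string uncPath = @"\\Fileserver\Directory1\Directory2\Directory3\Report_Submission_" + SubmissionID + ".pdf";
byte[] pdfBytes = File.ReadAllBytes(uncPath);

Response.Clear();
Response.AddHeader("Content-Type", "application/pdf");
Response.AddHeader("Content-Disposition", "attachment; filename=\"Report_Submission_" + SubmissionID + ".pdf\"");
Response.AddHeader("Content-Length", pdfBytes.Length.ToString());
Response.BinaryWrite(pdfBytes);
Response.End();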