FtpWebRequest closing the upload stream hangs on large files - C#

I am trying to use an FtpWebRequest to upload some files. This works for smallish files (say <2MB), but when I try to upload a 16MB file, the file uploads successfully, yet when I call request.GetRequestStream().Close(), the code hangs (or times out if the timeout is low enough).
I could just a) not close it and b) not bother to get the response from the server, but that doesn't seem right! See the code below (the same problem occurs with or without SSL).
output.Close() is the line that hangs....
public static void SendFileViaFtp(string file, string url, bool useSsl, ICredentials credentials)
{
    var request = (FtpWebRequest)WebRequest.Create(url + Path.GetFileName(file));
    request.EnableSsl = useSsl;
    request.UseBinary = true;
    request.Credentials = credentials;
    request.Method = WebRequestMethods.Ftp.UploadFile;
    request.Timeout = 10000000;
    request.ReadWriteTimeout = 10000000;
    request.KeepAlive = true;

    var input = File.Open(file, FileMode.Open);
    var output = request.GetRequestStream();
    var buffer = new byte[1024];
    var lastBytesRead = -1;
    var i = 0;
    while (lastBytesRead != 0)
    {
        i++;
        lastBytesRead = input.Read(buffer, 0, 1024);
        Debug.WriteLine(lastBytesRead + " " + i);
        if (lastBytesRead > 0)
        {
            output.Write(buffer, 0, lastBytesRead);
        }
        else
        {
            Debug.WriteLine("Finished");
        }
    }
    input.Close();
    output.Close();
    var response = (FtpWebResponse)request.GetResponse();
    response.Close();
}
Thanks,

Try this:
// after the upload has finished
request.Abort(); // <=== MAGIC PART
// before response.Close()
var response = (FtpWebResponse)request.GetResponse();
response.Close();
taken from here
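For reference, here is roughly how that Abort() call could be folded into the method from the question. This is only a sketch: the ContentLength assignment and the CopyTo buffer size are my own additions, and on some servers GetResponse() after Abort() throws, so it is wrapped in a try/catch.

using System.Diagnostics;
using System.IO;
using System.Net;

public static class FtpUploader
{
    public static void SendFileViaFtp(string file, string url, bool useSsl, ICredentials credentials)
    {
        var request = (FtpWebRequest)WebRequest.Create(url + Path.GetFileName(file));
        request.EnableSsl = useSsl;
        request.UseBinary = true;
        request.Credentials = credentials;
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.ContentLength = new FileInfo(file).Length; // announce the upload size up front

        using (var input = File.OpenRead(file))
        using (var output = request.GetRequestStream())
        {
            input.CopyTo(output, 8192); // stream the file in 8 KB chunks
        } // disposing the request stream here is where the original code hung

        request.Abort(); // the workaround suggested above: tear the connection down explicitly

        try
        {
            using (var response = (FtpWebResponse)request.GetResponse())
            {
                Debug.WriteLine(response.StatusDescription);
            }
        }
        catch (WebException ex)
        {
            // Depending on the server, GetResponse() after Abort() may throw even though
            // the file has already been transferred completely.
            Debug.WriteLine(ex.Message);
        }
    }
}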

Try closing output before input.
Make sure the last buffer write isn't too large, or you could end up writing a few empty bytes.
I don't know if it's necessary, but I always set the request ContentLength to the input file's length.
Here is a good example: http://dotnet-snippets.de/dns/ftp-file-upload-mit-buffer-SID886.aspx
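In code, those three suggestions look roughly like this (a sketch only; request and file are assumed to be set up as in the question):

request.ContentLength = new FileInfo(file).Length; // announce the exact number of bytes to upload

using (var input = File.OpenRead(file))
{
    var output = request.GetRequestStream();
    var buffer = new byte[1024];
    int read;
    while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
    {
        output.Write(buffer, 0, read); // write only the bytes actually read, never a stale full buffer
    }
    output.Close(); // close the request stream before the input file
}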

Related

C# FtpWebRequest - Hangs when finished

.NET Framework 4.6
I have a weird and difficult bug in my code. When I upload a 150 MB ZIP file (slow internet, it takes about 15 minutes to upload), sometimes the task doesn't complete. It never reaches the last Console line and hangs in the progress-bar while loop.
The file on the FTP server is complete and not corrupted. So why does it hang on some files (5 out of 10 times)?
I've tried suggestions from other old posts saying it could be the request config, and also tried closing the ftpStream/fileStream after the while loop, but neither had any effect.
Hope someone can help me out. This is the Task:
Task TaskA = Task.Run(async () =>
{
    // Get file size
    double len = new FileInfo(zipPathFull).Length;
    string resultSize = ConvertToReadable(len);
    ThreadSafeUpdateStatus(String.Format("Zip bestand aangemaakt, uploaden {0}...", resultSize));

    // Send the ZIP over FTP
    Console.WriteLine("Starting FTP commands...");
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://DOMAIN.nl/domains/DOMAIN.nl/public_html/transfer/zips/" + zipfilename + ".zip");
    request.Credentials = new NetworkCredential("USERNAME", "PASSWORD");
    request.Timeout = -1;
    request.KeepAlive = true;
    request.UseBinary = true;
    request.UsePassive = true;
    request.ServicePoint.ConnectionLimit = 1000;
    request.Method = WebRequestMethods.Ftp.UploadFile;

    Console.WriteLine("Uploading over FTP...");
    using (Stream fileStream = File.OpenRead(zipPathFull))
    using (Stream ftpStream = request.GetRequestStream())
    {
        byte[] buffer = new byte[10240];
        int read;
        while ((read = fileStream.Read(buffer, 0, buffer.Length)) > 0)
        {
            Console.WriteLine("Buffer status: " + (int)fileStream.Position);
            ftpStream.Write(buffer, 0, read);
            var percent = 100m * ((decimal)fileStream.Position / fileStream.Length);
            double doublepercentage = (double)percent;
            ThreadSafeUpdateProgress(doublepercentage);
            string CurrentresultSize = ConvertToReadable((double)fileStream.Position);
            ThreadSafeUpdateStatus(String.Format("Zip bestand aangemaakt, uploaden {0}/{1}...", CurrentresultSize, resultSize));
        }
        ftpStream.Close();
        fileStream.Close();
    }
    request.Abort();
    Console.WriteLine("FTP Done");
    .....
});

FTP Upload fails 'The underlying connection was closed: An unexpected error occurred on a receive.'

I'm trying to upload files to an FTP, and have been doing so successfully until today. Here (at work) I've never had issues, but in production on-site we have had sporadic issues; now, today, all of a sudden it won't work 100% of the time and I can't figure this out for the life of me.
I've tried nearly everything I could find.
This is NOT a web service.
I have increased my Timeout and ReadWriteTimeout to -1 for the FtpWebRequest.
I have set the WriteTimeout and ReadTimeout of the stream I use to write the file contents.
We have made sure outbound rules are set to allow the port we're communicating over etc and can confirm we can write/read using FileZilla from the same machine/network.
The files are NOT large. They max at 1MB and are simple text files.
Below is the method I use. Let me know if you need anymore information.
private static void FtpUpload(String username, String password, String address,
    Int32 port, Boolean usePassive, String filePath)
{
    try
    {
        String fileName = Path.GetFileName(filePath);

        FtpWebRequest request = (FtpWebRequest)FtpWebRequest.Create(String.Format("ftp://{0}", address));
        request.Credentials = new NetworkCredential(username, password);
        request.UsePassive = usePassive;
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Timeout = -1;
        request.ReadWriteTimeout = -1;
        request.KeepAlive = false;
        request.Proxy = null;

        Console.WriteLine(String.Format("Uploading {0}...", fileName));

        using (Stream ftpStream = request.GetRequestStream())
        {
            ftpStream.WriteTimeout = -1;
            ftpStream.ReadTimeout = -1;
            using (FileStream file = File.OpenRead(filePath))
            {
                file.CopyTo(ftpStream);
            }
        }
    }
    catch
    {
        throw;
    }
}
EDIT:
This code snippet works. Note that when I update it to use using blocks, it goes back to failing. Could the use of using blocks be the cause?
public static void FtpUpload(
    String username,
    String password,
    String address,
    Int32 port,
    Boolean usePassive,
    String filePath)
{
    string ftpServerIP = String.Format("ftp://{0}", address);
    string ftpUserID = username;
    string ftpPassword = password;
    FileInfo fileInf = new FileInfo(filePath);
    try
    {
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(ftpServerIP);
        request.Credentials = new NetworkCredential(ftpUserID, ftpPassword);
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.UseBinary = false;
        request.UsePassive = false;
        request.ContentLength = fileInf.Length;

        // The buffer size is set to 2 KB
        int buffLength = 2048;
        byte[] buff = new byte[buffLength];
        int contentLen;

        FileStream fs = fileInf.OpenRead();
        Stream strm = request.GetRequestStream();

        contentLen = fs.Read(buff, 0, buffLength);
        while (contentLen != 0)
        {
            // Write content from the file stream to the FTP upload stream
            strm.Write(buff, 0, contentLen);
            contentLen = fs.Read(buff, 0, buffLength);
        }
        strm.Close();
        fs.Close();
    }
    catch
    {
        throw;
    }
}
For anyone else who stumbles across this, the solution contained in
http://www.experts-exchange.com/questions/28693905/FtpWebRequest-The-underlying-connection-as-closed-An-unexpected-error-has-occurred-on-a-receive.html
worked for me; you have to set the following to false:
var request = (FtpWebRequest)WebRequest.Create(finalPath);
request.UseBinary = false;
request.UsePassive = false;
This solved the problem of the FTP upload working locally but failing once deployed to UAT.

FTP over SSL issue

I'm having an issue uploading a file to a specific directory on an FTP site over SSL. I'm using the System.Net.FtpWebRequest class for this purpose. The upload goes through fine, but the file always ends up in the home directory. Any idea what I might be doing wrong? Appreciate your help.
public bool UploadFile(string srcFilePath, string destFilePath = null)
{
    if (String.IsNullOrWhiteSpace(srcFilePath))
        throw new ArgumentNullException("Source FilePath.");
    if (String.IsNullOrWhiteSpace(destFilePath))
        destFilePath = Path.GetFileName(srcFilePath);

    Uri serverUri = GetUri(destFilePath);
    // the serverUri should start with the ftp:// scheme.
    if (serverUri.Scheme != Uri.UriSchemeFtp)
        return false;

    // get the object used to communicate with the server.
    FtpWebRequest request = CreateFtpRequest(serverUri, WebRequestMethods.Ftp.UploadFile);

    // read the file into a byte array
    StreamReader sourceStream = new StreamReader(srcFilePath);
    byte[] fileContents = Encoding.UTF8.GetBytes(sourceStream.ReadToEnd());
    sourceStream.Close();
    request.ContentLength = fileContents.Length;

    // send the bytes to the server
    Stream requestStream = request.GetRequestStream();
    requestStream.Write(fileContents, 0, fileContents.Length);
    requestStream.Close();

    FtpWebResponse response = (FtpWebResponse)request.GetResponse();
    Debug.WriteLine("Response status: {0} - {1}", response.StatusCode, response.StatusDescription);
    return true;
}

private FtpWebRequest CreateFtpRequest(Uri serverUri, string method)
{
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(serverUri);
    request.EnableSsl = true;
    request.UsePassive = true;
    request.UseBinary = true;
    request.KeepAlive = true;
    request.Credentials = new NetworkCredential(_userName, _password);
    request.Method = method;
    return request;
}

private Uri GetUri(string remoteFilePath)
{
    return new Uri(_baseUri, remoteFilePath);
}
OK, finally figured it out. It is a .NET 4.0 framework issue; building the solution against .NET 3.5 worked beautifully.
I hate seeing bugs in new releases of .NET from Microsoft and wasting a lot of quality time figuring them out.
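If rebuilding against .NET 3.5 is not an option, one thing worth trying (an assumption on my part, not something the poster verified) is making the directory part of the URI explicit, using the RFC 1738 %2F escape so the path is treated as absolute on the server rather than relative to the login home directory. The host, path, and credentials below are hypothetical placeholders:

// "%2F" escapes a leading "/", so the path is absolute on the server
// instead of relative to the account's home directory.
Uri absolute = new Uri("ftp://ftp.example.com/%2Fupload/target/file.txt");
Uri relative = new Uri("ftp://ftp.example.com/upload/target/file.txt"); // relative to the home directory

FtpWebRequest request = (FtpWebRequest)WebRequest.Create(absolute);
request.EnableSsl = true;
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Credentials = new NetworkCredential("user", "password");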

Files uploaded to FTP are corrupted once at the destination

I'm creating a simple drag-a-file-and-upload-it-automatically-to-FTP Windows application,
and I'm using the MSDN code to upload the file to the FTP server.
The code is pretty straightforward:
// Get the object used to communicate with the server.
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(String.Format("{0}{1}", FTP_PATH, filenameToUpload));
request.Method = WebRequestMethods.Ftp.UploadFile;
// Options
request.UseBinary = true;
request.UsePassive = false;
// FTP Credentials
request.Credentials = new NetworkCredential(FTP_USR, FTP_PWD);
// Copy the contents of the file to the request stream.
StreamReader sourceStream = new StreamReader(fileToUpload.FullName);
byte[] fileContents = Encoding.UTF8.GetBytes(sourceStream.ReadToEnd());
sourceStream.Close();
request.ContentLength = fileContents.Length;
Stream requestStream = request.GetRequestStream();
requestStream.Write(fileContents, 0, fileContents.Length);
requestStream.Close();
FtpWebResponse response = (FtpWebResponse)request.GetResponse();
writeOutput("Upload File Complete!");
writeOutput("Status: " + response.StatusDescription);
response.Close();
and it does get uploaded to the FTP
The problem is that when I view the file in a browser, or simply download it and try to open it on the desktop, I get a corrupted, unreadable file.
I already used request.UseBinary = false; and request.UsePassive = false; but it does not seem to do any good whatsoever.
What I have found out is that the original file is 122 KB in length, while on the FTP server (and after downloading) it is 219 KB...
What am I doing wrong?
By the way, the uploadFileToFTP() method is running inside a BackgroundWorker, but I don't really think that makes any difference...
You shouldn't use a StreamReader to read binary files, only a Stream.
StreamReader is designed for reading text files only.
Try this:
private static void up(string sourceFile, string targetFile)
{
    try
    {
        string ftpServerIP = ConfigurationManager.AppSettings["ftpIP"];
        string ftpUserID = ConfigurationManager.AppSettings["ftpUser"];
        string ftpPassword = ConfigurationManager.AppSettings["ftpPass"];
        string filename = "ftp://" + ftpServerIP + "//" + targetFile;

        FtpWebRequest ftpReq = (FtpWebRequest)WebRequest.Create(filename);
        ftpReq.UseBinary = true;
        ftpReq.Method = WebRequestMethods.Ftp.UploadFile;
        ftpReq.Credentials = new NetworkCredential(ftpUserID, ftpPassword);

        byte[] b = File.ReadAllBytes(sourceFile);
        ftpReq.ContentLength = b.Length;
        using (Stream s = ftpReq.GetRequestStream())
        {
            s.Write(b, 0, b.Length);
        }

        FtpWebResponse ftpResp = (FtpWebResponse)ftpReq.GetResponse();
        if (ftpResp != null)
        {
            MessageBox.Show(ftpResp.StatusDescription);
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.ToString());
    }
}
The problems are caused by your code decoding the binary data to character data and back to binary data. Don't do this.
Use the UploadFile Method of the WebClient Class:
using (WebClient client = new WebClient())
{
    client.Credentials = new NetworkCredential(FTP_USR, FTP_PWD);
    client.UploadFile(FTP_PATH + filenameToUpload, filenameToUpload);
}

Download/Stream file from URL - ASP.NET

I need to stream a file which will result in a save-as prompt in the browser.
The issue is that the directory the file is located in is virtually mapped, so I am unable to use Server.MapPath to determine its actual location. The directory is not in the same location (or even on the same physical server, on the live boxes) as the website.
I'd like something like the following, but that will allow me to pass a web URL rather than a server file path.
I may end up having to build my file path from a config base path and then append the rest of the path, but hopefully I can do it this way instead.
var filePath = Server.MapPath(DOCUMENT_PATH);
if (!File.Exists(filePath))
return;
var fileInfo = new System.IO.FileInfo(filePath);
Response.ContentType = "application/octet-stream";
Response.AddHeader("Content-Disposition", String.Format("attachment;filename=\"{0}\"", filePath));
Response.AddHeader("Content-Length", fileInfo.Length.ToString());
Response.WriteFile(filePath);
Response.End();
You could use HttpWebRequest to get the file and stream it back to the client. This allows you to get the file with a URL. An example of this that I found (but can't remember where, to give credit) is:
// Create a stream for the file
Stream stream = null;

// This controls how many bytes to read at a time and send to the client
int bytesToRead = 10000;

// Buffer to read bytes in the chunk size specified above
byte[] buffer = new Byte[bytesToRead];

// The number of bytes read
try
{
    // Create a WebRequest to get the file
    HttpWebRequest fileReq = (HttpWebRequest)HttpWebRequest.Create(url);

    // Create a response for this request
    HttpWebResponse fileResp = (HttpWebResponse)fileReq.GetResponse();

    if (fileReq.ContentLength > 0)
        fileResp.ContentLength = fileReq.ContentLength;

    // Get the Stream returned from the response
    stream = fileResp.GetResponseStream();

    // Prepare the response to the client. resp is the client Response
    var resp = HttpContext.Current.Response;

    // Indicate the type of data being sent
    resp.ContentType = MediaTypeNames.Application.Octet;

    // Name the file
    resp.AddHeader("Content-Disposition", "attachment; filename=\"" + fileName + "\"");
    resp.AddHeader("Content-Length", fileResp.ContentLength.ToString());

    int length;
    do
    {
        // Verify that the client is connected.
        if (resp.IsClientConnected)
        {
            // Read data into the buffer.
            length = stream.Read(buffer, 0, bytesToRead);

            // and write it out to the response's output stream
            resp.OutputStream.Write(buffer, 0, length);

            // Flush the data
            resp.Flush();

            // Clear the buffer
            buffer = new Byte[bytesToRead];
        }
        else
        {
            // cancel the download if the client has disconnected
            length = -1;
        }
    } while (length > 0); // Repeat until no data is read
}
finally
{
    if (stream != null)
    {
        // Close the input stream
        stream.Close();
    }
}
Download the URL to bytes and convert the bytes into a stream:
using (var client = new WebClient())
{
    var content = client.DownloadData(url);
    using (var stream = new MemoryStream(content))
    {
        ...
    }
}
I do this quite a bit and thought I could add a simpler answer. I set it up as a simple class here, but I run this every evening to collect financial data on companies I'm following.
class WebPage
{
    public static string Get(string uri)
    {
        string results = "N/A";
        try
        {
            HttpWebRequest req = (HttpWebRequest)WebRequest.Create(uri);
            HttpWebResponse resp = (HttpWebResponse)req.GetResponse();
            StreamReader sr = new StreamReader(resp.GetResponseStream());
            results = sr.ReadToEnd();
            sr.Close();
        }
        catch (Exception ex)
        {
            results = ex.Message;
        }
        return results;
    }
}
In this case I pass in a url and it returns the page as HTML. If you want to do something different with the stream instead you can easily change this.
You use it like this:
string page = WebPage.Get("http://finance.yahoo.com/q?s=yhoo");
Two years later, I used Dallas's answer, but I had to change the HttpWebRequest to FileWebRequest since I was linking to direct files. Not sure if this is the case everywhere, but I figured I'd add it. Also, I removed
var resp = HttpContext.Current.Response
and just used HttpContext.Current.Response in place wherever resp was referenced.
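For clarity, the change described above amounts to something like this. It is only a sketch: the file:// URL is a hypothetical example, and the rest of the streaming loop is assumed to stay as in Dallas's answer.

// Assumes url points at a direct file, e.g. "file://server/share/report.pdf"
FileWebRequest fileReq = (FileWebRequest)WebRequest.Create(url);
FileWebResponse fileResp = (FileWebResponse)fileReq.GetResponse();
Stream stream = fileResp.GetResponseStream();

// ...then copy the bytes to HttpContext.Current.Response exactly as in Dallas's answer,
// referencing HttpContext.Current.Response directly instead of the local resp variable.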
If you are looking for a .NET Core version of @Dallas's answer, use the one below.
Stream stream = null;

// This controls how many bytes to read at a time and send to the client
int bytesToRead = 10000;

// Buffer to read bytes in the chunk size specified above
byte[] buffer = new Byte[bytesToRead];

// The number of bytes read
try
{
    // Create a WebRequest to get the file
    HttpWebRequest fileReq = (HttpWebRequest)HttpWebRequest.Create(@"file url");

    // Create a response for this request
    HttpWebResponse fileResp = (HttpWebResponse)fileReq.GetResponse();

    if (fileReq.ContentLength > 0)
        fileResp.ContentLength = fileReq.ContentLength;

    // Get the Stream returned from the response
    stream = fileResp.GetResponseStream();

    // Prepare the response to the client. resp is the client Response
    var resp = HttpContext.Response;

    // Indicate the type of data being sent
    resp.ContentType = "application/octet-stream";

    // Name the file
    resp.Headers.Add("Content-Disposition", "attachment; filename=test.zip");
    resp.Headers.Add("Content-Length", fileResp.ContentLength.ToString());

    int length;
    do
    {
        // Verify that the client is connected.
        if (!HttpContext.RequestAborted.IsCancellationRequested)
        {
            // Read data into the buffer.
            length = stream.Read(buffer, 0, bytesToRead);

            // and write it out to the response's output stream
            resp.Body.Write(buffer, 0, length);

            // Clear the buffer
            buffer = new Byte[bytesToRead];
        }
        else
        {
            // cancel the download if the client has disconnected
            length = -1;
        }
    } while (length > 0); // Repeat until no data is read
}
finally
{
    if (stream != null)
    {
        // Close the input stream
        stream.Close();
    }
}
I would argue the simplest way to do this in .NET Core is:
using (MemoryStream ms = new MemoryStream())
using (HttpClient client = new HttpClient())
{
    client.GetStreamAsync(url).Result.CopyTo(ms);
    // use ms however you want
}
Now you have the file downloaded as a stream inside ms.
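If the calling method can be made async, the same idea without blocking on .Result might look like this (a sketch only; url is assumed to be in scope and the usual System.IO/System.Net.Http usings are assumed):

using (var ms = new MemoryStream())
using (var client = new HttpClient())
{
    // await the stream instead of blocking on .Result, which can deadlock in some contexts
    using (var remote = await client.GetStreamAsync(url))
    {
        await remote.CopyToAsync(ms);
    }
    ms.Position = 0; // rewind before reading the downloaded bytes
    // use ms however you want
}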
You could try using the DirectoryEntry class with the IIS path prefix:
using (DirectoryEntry de = new DirectoryEntry("IIS://Localhost/w3svc/1/root" + DOCUMENT_PATH))
{
    filePath = de.Properties["Path"].Value;
}

if (!File.Exists(filePath))
    return;

var fileInfo = new System.IO.FileInfo(filePath);
Response.ContentType = "application/octet-stream";
Response.AddHeader("Content-Disposition", String.Format("attachment;filename=\"{0}\"", filePath));
Response.AddHeader("Content-Length", fileInfo.Length.ToString());
Response.WriteFile(filePath);
Response.End();
The accepted solution from Dallas worked for us when using the load balancer on the Citrix NetScaler (without a WAF policy).
The file download does not work through the NetScaler's LB when it is associated with a WAF, because the scenario above (the Content-Length not being correct) is an RFC violation and AppFW resets the connection; this does not happen when no WAF policy is associated.
So what was missing was:
Response.End();
See also:
Trying to stream a PDF file with asp.net is producing a "damaged file"
