How to continue or resume an FTP upload after an internet interruption - C#

I am using the code below (C#, .NET 3.5) to upload a file:
FtpWebRequest request =
(FtpWebRequest)WebRequest.Create("ftp://someweb.mn/altanzulpharm/file12.zip");
request.Method = WebRequestMethods.Ftp.UploadFile;
request.KeepAlive = true;
request.UseBinary = true;
request.Credentials = new NetworkCredential(username, password);
FileStream fs = File.OpenRead(FilePath);
byte[] buffer = new byte[fs.Length];
fs.Read(buffer, 0, buffer.Length);
fs.Close();
Stream ftpstream = request.GetRequestStream();
ftpstream.Write(buffer, 0, buffer.Length);
ftpstream.Close();
But the upload breaks when the internet connection is interrupted. The interruption lasts only a very short time, almost a millisecond, but the upload breaks for good!
Is it possible to continue or resume the upload after the connection is interrupted?

I don't believe FtpWebRequest supports reconnecting after a lost connection. You can resume an upload from a given position if the server supports it (this support is not required and is presumably less common than restart support for downloads).
You'll need to set FtpWebRequest.ContentOffset before starting the upload. Part of the sample from the article:
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(serverUri);
request.ContentOffset = offset;
The internal details of restart in the FTP protocol itself are in RFC 959, section 3.5 - Error recovery and restart. A question showing retry code for download: Downloading from FTP with c#, can't retry when fail
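For illustration, a minimal sketch of what a resumed upload could look like, assuming the server honors ContentOffset for uploads (i.e. supports REST before STOR); the URL, credentials and variables are taken from the question:
// Sketch only: resume an upload, assuming the server supports restarting
// an upload (REST before STOR). URL and credentials as in the question.

// Ask the server how much of the file already arrived.
FtpWebRequest sizeRequest =
    (FtpWebRequest)WebRequest.Create("ftp://someweb.mn/altanzulpharm/file12.zip");
sizeRequest.Method = WebRequestMethods.Ftp.GetFileSize;
sizeRequest.Credentials = new NetworkCredential(username, password);
long offset;
using (FtpWebResponse sizeResponse = (FtpWebResponse)sizeRequest.GetResponse())
{
    offset = sizeResponse.ContentLength;
}

// Upload only the remainder, starting at that offset.
FtpWebRequest request =
    (FtpWebRequest)WebRequest.Create("ftp://someweb.mn/altanzulpharm/file12.zip");
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Credentials = new NetworkCredential(username, password);
request.UseBinary = true;
request.ContentOffset = offset;

using (FileStream fs = File.OpenRead(FilePath))
using (Stream ftpStream = request.GetRequestStream())
{
    fs.Seek(offset, SeekOrigin.Begin); // skip what the server already has
    byte[] buffer = new byte[8192];
    int read;
    while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
        ftpStream.Write(buffer, 0, read);
}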

The only way to resume a transfer after a connection is interrupted with FtpWebRequest is to reconnect and start writing at the end of the file.
For that, use FtpWebRequest.ContentOffset.
A related question with full code (although for download):
How to download FTP files with automatic resume in case of disconnect
Or use an FTP library that can resume the transfer automatically.
For example, the WinSCP .NET assembly does. With it, a resumable upload is as trivial as:
// Set up session options
var sessionOptions = new SessionOptions
{
    Protocol = Protocol.Ftp,
    HostName = "ftp.example.com",
    UserName = "user",
    Password = "mypassword"
};

using (var session = new Session())
{
    // Connect
    session.Open(sessionOptions);

    // Resumable upload
    session.PutFileToDirectory(@"C:\path\file.zip", "/home/user");
}
(I'm the author of WinSCP)

Related

System.Net.WebResponse throwing "System.Net.WebException: The operation has timed out" but connection to web server was made

I have a method which is intended to download a file from an HTTP URL to a byte array:
private static byte[] DownloadFileToByteArrayWorker(HttpWebRequest Request, int bufferLength)
{
    byte[] responseBytes = null;

    // Round up to the nearest multiple of 1024
    bufferLength = AdjustBufferLength(bufferLength);

    Request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
    Request.ServicePoint.Expect100Continue = false;
    Request.Headers.Add(HttpRequestHeader.CacheControl, "no-cache");

    using (var Response = (HttpWebResponse)Request.GetResponse())
    {
        using (Stream ResponseStream = Response.GetResponseStream())
        {
            using (MemoryStream ms = new MemoryStream())
            {
                int count = 0;
                byte[] buf = new byte[bufferLength];
                while ((count = ResponseStream.Read(buf, 0, buf.Length)) > 0)
                {
                    ms.Write(buf, 0, count);
                }
                responseBytes = ms.ToArray();
            }
        }
    }

    return responseBytes;
}
Request.GetResponse() is throwing a time out exception no matter how long I make the Timeout property of the HttpWebRequest. I can verify via my logs that the program is waiting the full Timeout period before erroring out, however, correlating my logs with the web server logs indicates that the web server is sending back a response almost immediately.
An interesting note is that when I access the same web server via the load balancer rather than directly, it downloads the file practically instantly. Also, if I access the URL via the web server directly in a web browser (no proxy needed, btw) I can download the file from individual web servers instantly that way too.
Some additional details:
I am using .NET Framework 4.7 on Windows 2012 R2.
The web server I'm trying to connect to is Apache on RHEL7. I'm not sure about the specific Apache version.
I am connecting to the web server on a specific port which is reserved for HTTP traffic (a separate website is hosted on a different port number for HTTPS)
There's no web proxy
Any suggestions?
As you said your code has a problem only when you call the load balancer,
I think the problem is that your client sends an Expect: 100-continue header but your load balancer doesn't know how to handle it.
That is the reason your client doesn't send all the data right at the beginning of the connection.
You can find more information about 100 continue in the HTTP RFC, section 8.2.3.
To fix the behavior from the client side in C#, you have to add this code:
ServicePointManager.Expect100Continue = false;
You can see the full documentation about this feature here.
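One caveat worth noting (my addition; the host name below is a placeholder): ServicePointManager only affects ServicePoint objects created after the property is set, so the global switch has to run before the first request to that host:
// Global: disable the Expect: 100-continue handshake before the first
// request to the host is created; later ServicePoints inherit the value.
ServicePointManager.Expect100Continue = false;

// Or per request, via its ServicePoint (as the code in the question already does):
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://loadbalancer.example.com/file.bin");
request.ServicePoint.Expect100Continue = false;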

Reusing FtpWebRequest

I'm trying to make a simple method to download a file from an FTP server using FtpWebRequest with the method WebRequestMethods.Ftp.DownloadFile. The problem is that I want to display the progress of the download and thus need to know the file size in advance to be able to calculate the percentage transferred. But when I call GetResponse on the FtpWebRequest, the ContentLength member is -1.
OK - so I get the size of the file in advance using the method WebRequestMethods.Ftp.GetFileSize. No problem. Then, after getting the size, I download the file.
This is where the problem in question appears...
After getting the size I try to reuse the FtpWebRequest and reset the method to WebRequestMethods.Ftp.DownloadFile. This causes a System.InvalidOperationException saying something like "Can't perform this action after sending the request." (may not be the exact wording - translated from the one I get in Swedish).
I've found elsewhere that as long as I set the KeepAlive property to true, it doesn't matter; the connection is kept active. This is what I don't understand... The only object I've created is my FtpWebRequest object. And if I create another one, how can it know what connection to use? And what credentials?
Pseudo code:
Create FtpWebRequest
Set Method property to GetFileSize
Set KeepAlive property to true
Set Credentials property to new NetworkCredential(...)
Get FtpWebResponse from the request
Read and store ContentLength
Now I've got the file size, so it's time to download the file. Setting Method now causes the exception mentioned above. So do I create a new FtpWebRequest? Or is there any way to reset the request for reuse? (Closing the response made no difference.)
I don't understand how to move forward without re-creating the object. I could do that, but it just doesn't feel right. So I'm posting here in the hope of finding the correct way of doing this.
Here's the (non-working) code (inputs are sURI, sDiskName, sUser and sPwd):
FtpWebRequest request = (FtpWebRequest)FtpWebRequest.Create(sURI);
request.Method = WebRequestMethods.Ftp.GetFileSize;
request.Credentials = new NetworkCredential(sUser, sPwd);
request.UseBinary = true;
request.UsePassive = true;
request.KeepAlive = true;
FtpWebResponse resp = (FtpWebResponse)request.GetResponse();
int contLen = (int)resp.ContentLength;
resp.Close();
request.Method = WebRequestMethods.Ftp.DownloadFile;
resp = (FtpWebResponse)request.GetResponse();
Stream inStr = resp.GetResponseStream();
byte[] buff = new byte[16384];
sDiskName = Environment.ExpandEnvironmentVariables(sDiskName);
FileStream file = File.Create(sDiskName);
int readBytesCount;
int readTotal=0;
while ((readBytesCount = inStr.Read(buff, 0, buff.Length)) > 0)
{
    readTotal += readBytesCount;
    toolStripProgressBar1.Value = 100 * readTotal / contLen;
    Application.DoEvents();
    file.Write(buff, 0, readBytesCount);
}
file.Close();
I hope someone can explain how this is supposed to work. Thanks in advance.
I don't think this will be answered so I'm "closing it" by telling you how I solved it.
Well, I didn't really solve it. I did, however, test the download by recreating the FtpWebRequest and noticed that on the FTP server it behaved as I wanted, i.e. only one logon and then my requests executing sequentially.
This is how the code getting the file size and starting the download ended up:
// Start by fetching the file size
FtpWebRequest request = (FtpWebRequest)FtpWebRequest.Create(sURI);
request.Method = WebRequestMethods.Ftp.GetFileSize;
NetworkCredential nc = new NetworkCredential(sUser, sPwd);
request.Credentials = nc;
request.UseBinary = true;
request.UsePassive = true;
request.KeepAlive = true;
// Get the result (size)
FtpWebResponse resp = (FtpWebResponse)request.GetResponse();
Int64 contLen = resp.ContentLength;
// and now download the file
request = (FtpWebRequest)FtpWebRequest.Create(sURI);
request.Method = WebRequestMethods.Ftp.DownloadFile;
request.Credentials = nc;
request.UseBinary = true;
request.UsePassive = true;
request.KeepAlive = true;
resp = (FtpWebResponse)request.GetResponse();
So no answer on whether it's possible to reset the FtpWebRequest for reuse, but at least I know there's no redundant information being transferred.
Thanks to everybody who took an interest and spent time thinking of an answer.
FtpWebRequest can be used for only one request, like getting the file size or downloading the file, but not both. You have to create two FtpWebRequests. Behind the scenes, FtpWebRequest notices that it is the same URL and credentials and will reuse the same FTP connection without closing it, as long as KeepAlive is true, which is the default setting.
This is a sad example of bad design by Microsoft. Instead of letting us explicitly open and close a connection, they want to do it automatically for us and end up confusing everyone.
You're probably going to want to use the async method. Here's the link to the MSDN doc:
http://msdn.microsoft.com/en-us/library/system.net.ftpwebrequest.aspx
GetResponseAsync()
That will keep your application from locking up too, so you won't have to use
Application.DoEvents();
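As a rough sketch of how the download loop could look with GetResponseAsync (requires .NET 4.5 or later; the names sURI, sDiskName, nc, contLen and toolStripProgressBar1 are reused from the question):
// Sketch: download without blocking the UI thread; no Application.DoEvents() needed.
// After each await, execution resumes on the UI thread, so the progress bar
// can be updated directly.
private async Task DownloadFileAsync(string sURI, string sDiskName, NetworkCredential nc, long contLen)
{
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(sURI);
    request.Method = WebRequestMethods.Ftp.DownloadFile;
    request.Credentials = nc;

    using (var resp = (FtpWebResponse)await request.GetResponseAsync())
    using (Stream inStr = resp.GetResponseStream())
    using (FileStream file = File.Create(sDiskName))
    {
        byte[] buff = new byte[16384];
        long readTotal = 0;
        int readBytesCount;
        while ((readBytesCount = await inStr.ReadAsync(buff, 0, buff.Length)) > 0)
        {
            await file.WriteAsync(buff, 0, readBytesCount);
            readTotal += readBytesCount;
            toolStripProgressBar1.Value = (int)(100 * readTotal / contLen);
        }
    }
}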
You could also look at possibly using an alternative FTP library. IMO FtpWebRequest is not exactly the best FTP class. A quick search turned up this library. FTP isn't stateless like HTTP. I prefer libraries that let you create a client, open a connection, and keep the connection alive.
http://sanity-free.org/dist/NullFX.Net-binary.zip
Here's the code example I found:
FtpClient client =
    new FtpClient(
        new IPEndPoint(IPAddress.Loopback, 21),
        new NetworkCredential("test", "testing@localdomain")
    );
client.Connect();
client.Download("testfile.zip", @"C:\downloads\testfile.zip");
The source is there too, so you would be able to possibly attach some events to the read process for your download progress tracking.

FTP through FTP proxy

I am trying to download a file using FTP through an FTP proxy (on my side).
This is the script I am trying to implement in C#:
On Commandline:
ftp -i -s:get.ini CORPORATE_PROXY.com
-----------get.ini------------
CORPORATE_PROXY_USER@CLIENT_FTP.com abc/user_name
CORPORATE_PROXY_PASSWORD
user_name_password
cd pub/linux/knoppix
get packages.txt
bye
-----------get.ini------------
abc/user_name is my user name, which was granted permission to FTP through my corporate proxy.
I want to implement the above script in C#, but after playing with many kinds of code found on the Internet I cannot do it.
FtpWebRequest request = FtpWebRequest.Create(new Uri(@"ftp://" + CORPORATE_PROXY.com + @"/" + Path.GetFileName(fileToUpload))) as FtpWebRequest;
request.UseBinary = true;
request.KeepAlive = false;
request.Method = WebRequestMethods.Ftp.UploadFile;
if (!string.IsNullOrEmpty(CORPORATE_PROXY_USER) && !string.IsNullOrEmpty(CORPORATE_PROXY_PASSWORD ))
request.Credentials = new NetworkCredential(CORPORATE_PROXY_USER, CORPORATE_PROXY_PASSWORD );
//Get physical file
FileInfo fi = new FileInfo(fileToUpload);
Byte[] contents = new Byte[fi.Length];
//Read file
FileStream fs = fi.OpenRead();
fs.Read(contents, 0, Convert.ToInt32(fi.Length));
fs.Close();
request.Proxy = new WebProxy("CLIENT_FTP.com");
request.Proxy.Credentials = new NetworkCredential("abc/user_name", "user_name_password");
//Write file contents to FTP server
Stream rs = request.GetRequestStream();
rs.Write(contents, 0, Convert.ToInt32(fi.Length));
rs.Close();
FtpWebResponse response = request.GetResponse() as FtpWebResponse;
string statusDescription = response.StatusDescription;
response.Close();
return statusDescription;
The main problem is that for the proxy I am using WebProxy, while I suspect I should use an FTPProxy - which I cannot find anywhere. Any ideas which direction I should go, or is WebProxy fine?
In the past I have used Indy Project to get through FTP proxies.
Try using WebRequest instead of FtpWebRequest in your code example.
By doing so, the connection from client to proxy can be HTTP, whereas the connection from proxy to destination server is FTP. The proxy handles the protocol translation; this technique is referred to as FTP over HTTP.
It is also possible to use a native FTP proxy, where both the client-to-proxy and proxy-to-server connections are FTP. Make sure your proxy supports this.
The proxy offers a separate proxy port to serve native FTP proxy connections.
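For illustration, a rough sketch of the FTP-over-HTTP variant, reusing the placeholders from the question (the proxy port is an assumption; note that through an HTTP proxy, FtpWebRequest supports only downloads and directory listings, not uploads):
// Sketch: FTP over HTTP - the client talks HTTP to the corporate proxy,
// which talks FTP to the destination server. Placeholders as in the question.
// Note: through an HTTP proxy, only DownloadFile, ListDirectory and
// ListDirectoryDetails are supported; uploads are not.
FtpWebRequest request =
    (FtpWebRequest)WebRequest.Create("ftp://CLIENT_FTP.com/pub/linux/knoppix/packages.txt");
request.Method = WebRequestMethods.Ftp.DownloadFile;
request.Credentials = new NetworkCredential("abc/user_name", "user_name_password");
request.Proxy = new WebProxy("http://CORPORATE_PROXY.com:8080")
{
    Credentials = new NetworkCredential("CORPORATE_PROXY_USER", "CORPORATE_PROXY_PASSWORD")
};

using (var response = (FtpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    Console.WriteLine(reader.ReadToEnd());
}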

Transferring a file using FTP from one server to another in a different location

I have a requirement to transfer document files (.txt, .xls, .doc, .bmp, .jpg, etc.) from one server to another. Both servers are in different locations. My main application runs on the second server, and I have to deploy this functionality on the first server, from where the files in a fixed folder (say D:\documents) will be transferred to the second server periodically on a timer event.
I am using code like the following:
WebClient wc = new WebClient();
wc.UploadFile("ftp://1.23.153.248//d://ftp.bmp",
#"C:\Documents and Settings\varun\Desktop\ftp.bmp");
I am getting the error
unable to connect to remote server
or sometimes
the underlying connection was closed
Could you tell me what's wrong?
The URI of your FTP location seems to be off: you shouldn't have to double all the forward slashes, and I don't think drive letters are supported.
If you do:
WebClient wc = new WebClient();
wc.UploadFile("ftp://1.23.153.248/ftp.bmp",
#"C:\Documents and Settings\varun\Desktop\ftp.bmp");
The file will be sent to the directory set as the FTP location for the anonymous user. If you configure the FTP service on 1.23.153.248 so that location is D:\, everything should work as planned.
Is your FTP public? If not, you should define credentials like this:
wc.Credentials = new NetworkCredential ("username","password");
before sending the file, and try to remove d: from the FTP path. FTP clients don't have to know where on the server files should be saved... In short, try this:
WebClient wc = new WebClient();
wc.Credentials = new NetworkCredential ("username","password");
wc.UploadFile("ftp://1.23.153.248/ftp.bmp", #"C:\Documents and Settings\varun\Desktop\ftp.bmp");
There are also classes created specially for FTP requests in .NET; here is a sample from MSDN:
// Get the object used to communicate with the server.
FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://www.contoso.com/test.htm");
request.Method = WebRequestMethods.Ftp.UploadFile;
// This example assumes the FTP site uses anonymous logon.
request.Credentials = new NetworkCredential("anonymous", "janeDoe@contoso.com");
// Copy the contents of the file to the request stream.
StreamReader sourceStream = new StreamReader("testfile.txt");
byte [] fileContents = Encoding.UTF8.GetBytes(sourceStream.ReadToEnd());
sourceStream.Close();
request.ContentLength = fileContents.Length;
Stream requestStream = request.GetRequestStream();
requestStream.Write(fileContents, 0, fileContents.Length);
requestStream.Close();
FtpWebResponse response = (FtpWebResponse)request.GetResponse();
Console.WriteLine("Upload File Complete, status {0}", response.StatusDescription);
response.Close();

Error using HttpWebRequest to upload files with PUT

We've got a .NET 2.0 WinForms app that needs to upload files to an IIS6 server via WebDAV. From time to time we get complaints from a remote office that they get one of the following error messages:
The underlying connection was closed: an unexpected error occurred on send.
The underlying connection was closed: an unexpected error occurred on receive.
This only seems to occur with large files (~20 MB plus). I've tested it with a 40 MB file from my home computer and tried putting 'Sleep's in the loop to simulate a slow connection, so I suspect it's down to network issues at their end... but:
The IT at the remote office are no help.
I'd like to rule out the possibility that my code is at fault.
So - can anybody spot any mistakes or suggest any workarounds that might 'bulletproof' the code against this problem? Thanks for any help. A chopped-down version of the code follows:
public bool UploadFile(string localFile, string uploadUrl)
{
    HttpWebRequest req = (HttpWebRequest)WebRequest.Create(uploadUrl);
    try
    {
        req.Method = "PUT";
        req.AllowWriteStreamBuffering = true;
        req.UseDefaultCredentials = Program.WebService.UseDefaultCredentials;
        req.Credentials = Program.WebService.Credentials;
        req.SendChunked = false;
        req.KeepAlive = true;

        Stream reqStream = req.GetRequestStream();
        FileStream rdr = new FileStream(localFile, FileMode.Open, FileAccess.Read);
        byte[] inData = new byte[4096];
        int bytesRead = rdr.Read(inData, 0, inData.Length);
        while (bytesRead > 0)
        {
            reqStream.Write(inData, 0, bytesRead);
            bytesRead = rdr.Read(inData, 0, inData.Length);
        }
        reqStream.Close();
        rdr.Close();

        System.Net.HttpWebResponse response = (HttpWebResponse)req.GetResponse();
        if (response.StatusCode != HttpStatusCode.OK && response.StatusCode != HttpStatusCode.Created)
        {
            MessageBox.Show("Couldn't upload file");
            return false;
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.ToString());
        return false;
    }
    return true;
}
Try setting KeepAlive to false:
req.KeepAlive = false;
This will allow the connection to be closed and reopened, rather than using a persistent connection. I found a lot of references on the Web that suggested this to solve errors similar to yours. This is a relevant link.
Anyway, it is not a good idea to use HTTP PUT (or HTTP POST) to upload large files. It would be better to use FTP or a download/upload manager, which handle retries, connection problems, and timeouts automatically for you. The upload will be faster too, and you could also resume a stopped upload. If you decide to stay with HTTP, you should at least add a retry mechanism, as sketched below. If an upload takes too long, there is a high probability that it will fail due to a proxy, server timeout, firewall, or some other reason that has nothing to do with your code.
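As a rough sketch of such a retry mechanism, wrapping the UploadFile method from the question (the attempt count and delay are arbitrary choices):
// Sketch: retry the upload a few times, pausing between attempts.
// Attempt count and delay are arbitrary.
public bool UploadFileWithRetry(string localFile, string uploadUrl)
{
    const int maxAttempts = 3;
    for (int attempt = 1; attempt <= maxAttempts; attempt++)
    {
        if (UploadFile(localFile, uploadUrl))
            return true;
        if (attempt < maxAttempts)
            System.Threading.Thread.Sleep(5000); // wait before the next try
    }
    return false;
}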
To remove the risk of a bug in your code, try using WebClient:
using (WebClient client = new WebClient())
{
    client.UseDefaultCredentials = Program.WebService.UseDefaultCredentials;
    client.Credentials = Program.WebService.Credentials;
    client.UploadFile(uploadUrl, "PUT", localFile);
}
Maybe try using POST, but the real culprit is probably the content type.
Try setting
req.ContentType = "application/octet-stream";
req.ContentLength = inData.Length;
or look at the code in the accepted answer here: Upload files with HTTPWebrequest (multipart/form-data)
Both my example and the link I provided involve modifying the ContentType - my example is simpler but might not work, as most applications receiving files expect multipart
Please check whether [Enable HTTP Keep-Alives] is set to [on] on the [Web Site] tab in IIS Manager.
The size of the uploads might be limited.
See here for one discussion:
http://www.codeproject.com/KB/aspnet/uploadlargefilesaspnet.aspx
Start by checking some basic configuration. The default values of either of the following settings may cause problems with file uploads - including termination of the connection. I believe IIS 6 would never allow a file upload > 2 GB (even if it could complete, regardless of config). MSDN describes these nicely.
<httpRuntime executionTimeout="30" maxRequestLength="200" />
EDIT: This is ASP.NET config, of course, which assumes you are running your own webdav server or a 3rd party server within ASP.NET. If it's a different webdav server, you'll want to look for the equivalent.
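In the ASP.NET case, raising those limits might look like this sketch (the values are arbitrary; maxRequestLength is in kilobytes, executionTimeout in seconds):
<!-- Sketch: allow uploads up to ~100 MB and give each request 30 minutes. -->
<configuration>
  <system.web>
    <httpRuntime executionTimeout="1800" maxRequestLength="102400" />
  </system.web>
</configuration>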
