I have an application that downloads files from a Unix FTP server. It works fine, but I have a performance problem: files of 1 KB or less take between 2084 and 2400 milliseconds on average to download, while applications like FileZilla download the same files in less than a second each.
Maybe this time is OK for some average users, but it is not acceptable for my application, since I need to download THOUSANDS of files.
I have optimized the code as much as I could:
- The cache and the read buffer are created once, in the constructor of the class.
- I create the network credentials once and re-use them for every file download. I know this works, since the first file takes about 7 s to download, and all subsequent downloads are in the 2 s range.
- I changed the buffer size from 2 KB up to 32 KB. I don't know whether this helps, since the files I'm downloading are under 1 KB, so in theory the buffer is filled with all the data in one read from the network.
Maybe it is not related to the network, but to the way I'm writing the file and/or how Windows handles the write?
Can someone please give me some tips on how to reduce the time to something similar to FileZilla?
I need to reduce the time; otherwise my FTP job will be running 24 hours a day for 3 days to finish its task :(
Many thanks in advance.
The code is here. It's not complete; it just shows the download part.
// Created once in the constructor of my class:
downloadCache = new MemoryStream(2097152);
downloadBuffer = new byte[32768];
public bool downloadFile(string pRemote, string pLocal, out long downloadTime)
{
    FtpWebResponse response = null;
    Stream responseStream = null;
    try
    {
        Stopwatch fileDownloadTime = new Stopwatch();
        downloadTime = 0;
        fileDownloadTime.Start();

        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(pRemote);
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.UseBinary = false;
        request.AuthenticationLevel = AuthenticationLevel.None;
        request.EnableSsl = false;
        request.Proxy = null;
        // I created the credentials once and re-use them for every file I need to download.
        request.Credentials = this.manager.ftpCredentials;

        response = (FtpWebResponse)request.GetResponse();
        responseStream = response.GetResponseStream();
        downloadCache.Seek(0, SeekOrigin.Begin);
        int bytesSize = 0;
        int cachedSize = 0;

        // Always create an empty file first. Needed because WriteCacheToFile only appends.
        using (FileStream fileStream = new FileStream(pLocal, FileMode.Create)) { }

        // Read from the network until the download is completed.
        while (true)
        {
            bytesSize = responseStream.Read(downloadBuffer, 0, downloadBuffer.Length);
            // Flush the cache when the stream ends or the 2 MB cache would overflow.
            if (bytesSize == 0 || 2097152 < cachedSize + bytesSize)
            {
                WriteCacheToFile(pLocal, cachedSize);
                if (bytesSize == 0)
                {
                    break;
                }
                downloadCache.Seek(0, SeekOrigin.Begin);
                cachedSize = 0;
            }
            downloadCache.Write(downloadBuffer, 0, bytesSize);
            cachedSize += bytesSize;
        }

        fileDownloadTime.Stop();
        downloadTime = fileDownloadTime.ElapsedMilliseconds;
        // File downloaded OK.
        return true;
    }
    catch (Exception)
    {
        downloadTime = 0; // the out parameter must be assigned on the failure path too
        return false;
    }
    finally
    {
        if (responseStream != null)
        {
            responseStream.Close();
        }
        if (response != null)
        {
            response.Close();
        }
    }
}

private void WriteCacheToFile(string downloadPath, int cachedSize)
{
    using (FileStream fileStream = new FileStream(downloadPath, FileMode.Append))
    {
        byte[] cacheContent = new byte[cachedSize];
        downloadCache.Seek(0, SeekOrigin.Begin);
        downloadCache.Read(cacheContent, 0, cachedSize);
        fileStream.Write(cacheContent, 0, cachedSize);
    }
}
Sounds to me like your problem is related to Nagle's algorithm used in the TCP client.
You can try turning Nagle's algorithm off, and also set SendChunked to false.
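A minimal sketch of the Nagle part, assuming the FtpWebRequest setup from the question (measure before and after, since the gain depends on the server):
// Disable Nagle's algorithm globally, for service points created from here on.
ServicePointManager.UseNagleAlgorithm = false;

// Or per request, through its service point:
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(pRemote);
request.ServicePoint.UseNagleAlgorithm = false;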
I am basically trying to trace a specific file on my PC, and whenever the file's contents change, it should be uploaded to my FTP server. The problem I'm facing is that it won't let me upload to the FTP server while the file is being written to by another process. I have tried everything, including running as Administrator, but this error (550 Permission denied) keeps coming from the FTP server.
Here is my code:
public static void Trace()
{
    string checksum = "";
    while (true)
    {
        if (CheckMd5(Path) != checksum)
        {
            UploadFile1();
        }
        checksum = CheckMd5(Path);
        Thread.Sleep(5000);
    }
}

public static void UploadFile1()
{
    var ftp1 = new myFTP();
    if (File.Exists(Path))
    {
        var currentTime = CurrentTime; // gets the current time
        ftp1.UploadFile(Path, currentTime);
    }
}

public void UploadFile(string filePath, string CurrentTime)
{
    FtpWebRequest request =
        (FtpWebRequest)WebRequest.Create("ftp://127.0.0.1/" + CurrentTime);
    request.Method = WebRequestMethods.Ftp.UploadFile;
    request.Credentials = new NetworkCredential("user", "password");
    request.UsePassive = true;
    request.UseBinary = true;
    request.KeepAlive = false;
    request.EnableSsl = false;

    FileStream stream = File.Open(filePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
    byte[] buffer = new byte[stream.Length];
    stream.Read(buffer, 0, buffer.Length);
    stream.Close();

    Stream reqStream = request.GetRequestStream();
    reqStream.Write(buffer, 0, buffer.Length);
    reqStream.Close();
}
You should probably restructure those two processes so that the file-changing process fires an event when it is done changing the file, while the FTP-uploading process stops forcing its way in by looping and comparing checksum values (just let it sit dormant and wait for the file-done signal). That approach will improve the performance of your app as a bonus, aside from the accuracy you need.
Aside from that, try the FileSystemWatcher class. You can filter for modification events only; a minimal sketch follows.
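Assuming Path holds the watched file's full path and UploadFile1 is the method from the question; note that Changed often fires more than once per save, so some debouncing may still be needed:
// Watch only the one file and raise Changed on writes.
var watcher = new FileSystemWatcher(
    System.IO.Path.GetDirectoryName(Path),   // directory to watch
    System.IO.Path.GetFileName(Path));       // filter: just this file
watcher.NotifyFilter = NotifyFilters.LastWrite;
watcher.Changed += (sender, e) => UploadFile1();  // upload on each modification
watcher.EnableRaisingEvents = true;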
I just want to download a single file from an FTP server in multiple segments, on multiple threads, using C#.
Is it possible to give ranges for the file download, like with HttpWebRequest?
First, the disclaimer:
Multitasking is not a magical "go faster" bullet. If you apply it to the wrong problem, you end up with code that is more complex and error-prone, more memory-hungry, and actually slower than the plain old single-tasked, sequential approach. One alternate task for a long-running operation is generally mandatory, but massive parallelisation is only useful in very specific circumstances.
Generally, file operations are disk- or network-bound, and multitasking will not speed up disk- or network-bound operations. Indeed it might cause a slowdown, as NCQ and similar features have to straighten out your random access requests. That being said, with networking it sometimes can help: some servers apply a per-connection limit, so splitting the download into multiple segments, each with its own connection, can be a net speedup.
But be certain this is actually the case here. Consider all but point 1 of the Speed Rant.
Assuming FtpWebRequest is still the class you are using, it looks like ContentLength and ContentOffset might be the droids you are looking for. You basically use them like Substring: each connection/sub-request takes X bytes from offset Y.
You can use FtpWebRequest.ContentOffset to specify the starting offset.
But FtpWebRequest.ContentLength is not implemented. To work around that, you have to abort the download once you have received the desired number of bytes.
const string name = "bigfile.dat";
const int chunks = 3;
const string url = "ftp://example.com/remote/path/" + name;
NetworkCredential credentials = new NetworkCredential("username", "password");

Console.WriteLine("Starting...");

// Ask the server for the file size first, so the chunk offsets can be computed.
FtpWebRequest sizeRequest = (FtpWebRequest)WebRequest.Create(url);
sizeRequest.Credentials = credentials;
sizeRequest.Method = WebRequestMethods.Ftp.GetFileSize;
long size = sizeRequest.GetResponse().ContentLength;

Console.WriteLine($"File has {size} bytes");

long chunkLength = size / chunks;
List<Task> tasks = new List<Task>();

for (int chunk = 0; chunk < chunks; chunk++)
{
    int i = chunk; // capture the loop variable for the lambda
    tasks.Add(Task.Run(() =>
    {
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(url);
        request.Credentials = credentials;
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.ContentOffset = chunkLength * i;

        // The last chunk takes whatever remains.
        long toread =
            (i < chunks - 1) ? chunkLength : size - request.ContentOffset;
        Console.WriteLine(
            $"Downloading chunk {i + 1}/{chunks} with {toread} bytes ...");

        using (Stream ftpStream = request.GetResponse().GetResponseStream())
        using (Stream fileStream = File.Create(name + "." + i))
        {
            byte[] buffer = new byte[10240];
            int read;
            // Stop after "toread" bytes, since FtpWebRequest.ContentLength is not implemented.
            while (((read = (int)Math.Min(buffer.Length, toread)) > 0) &&
                   ((read = ftpStream.Read(buffer, 0, read)) > 0))
            {
                fileStream.Write(buffer, 0, read);
                toread -= read;
            }
        }
        Console.WriteLine($"Downloaded chunk {i + 1}/{chunks}");
    }));
}

Console.WriteLine(
    "Started all chunks downloads, waiting for them to complete...");
Task.WaitAll(tasks.ToArray());

Console.WriteLine("Done");
Console.ReadKey();
The default connection limit is 2. Use ServicePoint to change the limit.
request.Method = WebRequestMethods.Ftp.DownloadFile;
ServicePoint sp = request.ServicePoint;
sp.ConnectionLimit = 4;
My software uploads files to my server over FTP, using FtpWebRequest for all uploading. When uploading a 700 MB file, it uploads about 500 MB and then stops. It works fine when uploading smaller files, which upload successfully, but it just won't work properly on large files.

The uploading is done in a BackgroundWorker that reports the upload progress to a progress bar on the main client. When the BackgroundWorker completes, it executes the completed handler. That handler does get executed, but the upload never completes and the progress bar is stuck at about 65 percent; it's as though the client just stops uploading and runs the completed handler as though it had finished.

What could be going wrong here? The upload doesn't complete and the file does not appear on the server. Here is the code that does the uploading:
void UploadFileInBackground_DoWork(object sender, DoWorkEventArgs e)
{
    byte[] data;
    int packetsize = 1024 * 8;
    string Destination = UploadURI + cattext + "/" + ID + ".obj";
    string source = DialogBrower.FileName;

    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(Destination);
    request.Credentials = new NetworkCredential("user", "pass");
    request.Method = WebRequestMethods.Ftp.UploadFile;
    request.UsePassive = true;
    request.UseBinary = true;
    request.KeepAlive = false;

    using (FileStream fs = new FileStream(source, FileMode.Open, FileAccess.Read))
    {
        try
        {
            long filesize = fs.Length;
            long sum = 0;
            int count = 0;
            data = new byte[packetsize];
            Stream reqStream = request.GetRequestStream();

            // Cast to avoid integer division when computing the packet count.
            float totalpackits = (float)filesize / packetsize;
            float weightofpackit = 100 / totalpackits;
            float percentage = 0;

            while (sum < filesize)
            {
                List<string> statusparms = new List<string>();
                count = fs.Read(data, 0, packetsize);
                reqStream.Write(data, 0, count);
                sum += count;
                percentage += weightofpackit;
                int percentagetotal = Convert.ToInt32(Math.Round(percentage));
                statusparms.Add(sum.ToString());
                statusparms.Add(filesize.ToString());
                UploadFileInBackground.ReportProgress(percentagetotal, statusparms);
            }
            reqStream.Close();
            uploadedname = uploadingname;
        }
        finally
        {
            fs.Dispose();
            data = null;
        }
    }
}
Please try this instead:
request.UseBinary = false;
Also, try changing
request.KeepAlive = false;
to
request.KeepAlive = true;
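For clarity, a sketch of the request setup with both suggestions applied (everything else as in the question); note that ASCII mode can alter binary data such as .obj files, so verify the uploaded result:
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(Destination);
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Credentials = new NetworkCredential("user", "pass");
request.UsePassive = true;
request.UseBinary = false; // first suggestion (caution: ASCII mode translates line endings)
request.KeepAlive = true;  // second suggestion: keep the control connection open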
I want to download a file in one method and then continue working with that file, using some data that is stored in variables in the first method.
I know you can use DownloadFileAsync, but then I need to continue my work in the DownloadFileCompleted handler, and the variables can't be reached from there (unless I declare some global ones and use those instead, though that isn't the right way, I suppose).
So I googled and found another way: downloading the file manually, bit by bit. That would suit me quite well. But what I want to know is whether there are other, simpler methods or solutions to my problem.
Or can you play around with the events and achieve something that suits me better? :)
Oh, and please change my question if you can find a better title for it; I couldn't think of one.
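For reference, here is roughly the event route I mean (the names are made up); as far as I can tell a lambda handler can capture the first method's locals, though I'm not sure it's the cleanest approach:
var client = new WebClient();
string localData = "state from the first method"; // stands in for my real variables
client.DownloadFileCompleted += (s, e) =>
{
    // "localData" is captured by the closure, so no globals are needed here.
    Console.WriteLine(localData);
};
client.DownloadFileAsync(new Uri("ftp://example.com/file.bin"), @"C:\temp\file.bin");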
You have to do it piece by piece to update a progress bar. This code does the trick.
public class WebDownloader
{
    private static readonly ILog log = LogManager.GetLogger(typeof(WebDownloader));

    public delegate void DownloadProgressDelegate(int percProgress);

    public static void Download(string uri, string localPath, DownloadProgressDelegate progressDelegate)
    {
        long remoteSize;
        string fullLocalPath; // Full local path including file name if only a directory was provided.

        log.InfoFormat("Attempting to download file (Uri={0}, LocalPath={1})", uri, localPath);

        try
        {
            // Get the name of the remote file.
            Uri remoteUri = new Uri(uri);
            string fileName = Path.GetFileName(remoteUri.LocalPath);

            if (Path.GetFileName(localPath).Length == 0)
                fullLocalPath = Path.Combine(localPath, fileName);
            else
                fullLocalPath = localPath;

            // Have to get the size of the remote object through the web request,
            // as it is not available on remote files (although it does work on local files).
            using (WebResponse response = WebRequest.Create(uri).GetResponse())
            using (Stream stream = response.GetResponseStream())
                remoteSize = response.ContentLength;

            log.InfoFormat("Downloading file (Uri={0}, Size={1}, FullLocalPath={2}).",
                uri, remoteSize, fullLocalPath);
        }
        catch (Exception ex)
        {
            throw new ApplicationException(string.Format("Error connecting to URI (Exception={0})", ex.Message), ex);
        }

        int bytesRead = 0, bytesReadTotal = 0;

        try
        {
            using (WebClient client = new WebClient())
            using (Stream streamRemote = client.OpenRead(new Uri(uri)))
            using (Stream streamLocal = new FileStream(fullLocalPath, FileMode.Create, FileAccess.Write, FileShare.None))
            {
                byte[] byteBuffer = new byte[1024 * 1024 * 2]; // 2 MB buffer, although in testing it only got to 10 KB max usage.
                int perc = 0;
                while ((bytesRead = streamRemote.Read(byteBuffer, 0, byteBuffer.Length)) > 0)
                {
                    bytesReadTotal += bytesRead;
                    streamLocal.Write(byteBuffer, 0, bytesRead);
                    int newPerc = (int)((double)bytesReadTotal / (double)remoteSize * 100);
                    if (newPerc > perc)
                    {
                        perc = newPerc; // update before logging, so the logged value is current
                        log.InfoFormat("...Downloading (BytesRead={0}, Perc={1})...", bytesReadTotal, perc);
                        if (progressDelegate != null)
                            progressDelegate(perc);
                    }
                }
            }
        }
        catch (Exception ex)
        {
            throw new ApplicationException(string.Format("Error downloading file (Exception={0})", ex.Message), ex);
        }

        log.InfoFormat("File successfully downloaded (Uri={0}, BytesDownloaded={1}/{2}, FullLocalPath={3}).",
            uri, bytesReadTotal, remoteSize, fullLocalPath);
    }
}
You will need to spin off a thread to run this code, as it's obviously synchronous.
e.g.
Task.Factory.StartNew(() => Download(...));
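Hypothetical usage, with a placeholder URI and local path:
// Run the download on a worker task and report progress to the console.
Task.Factory.StartNew(() =>
    WebDownloader.Download(
        "ftp://example.com/files/data.bin",
        @"C:\temp",
        perc => Console.WriteLine("Downloaded {0}%", perc)));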
I have multiple files on an FTP server. I do not know the names of these files, except that they are all .xml files.
How do I programmatically download these files using .NET's FtpWebRequest?
Thanks.
Most likely you'll have to issue a command that lists out all the files, then go through each one, downloading it.
Here is some info on getting a directory listing:
http://msdn.microsoft.com/en-us/library/ms229716.aspx
Take a look at the ListDirectory function; it's the equivalent of the NLIST command in FTP. A minimal sketch follows.
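Assuming a placeholder URL and credentials:
// List the directory, keep the .xml entries, then download each one
// with WebRequestMethods.Ftp.DownloadFile.
FtpWebRequest listRequest =
    (FtpWebRequest)WebRequest.Create("ftp://example.com/path/");
listRequest.Method = WebRequestMethods.Ftp.ListDirectory;
listRequest.Credentials = new NetworkCredential("username", "password");

List<string> names = new List<string>();
using (StreamReader reader =
    new StreamReader(listRequest.GetResponse().GetResponseStream()))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        if (line.EndsWith(".xml", StringComparison.OrdinalIgnoreCase))
            names.Add(line);
    }
}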
You'll probably want to use an existing library like this one rather than write your own.
FtpWebRequest __request = (FtpWebRequest)FtpWebRequest.Create(__requestLocation);
__request.Method = WebRequestMethods.Ftp.ListDirectory;
var __response = (FtpWebResponse)__request.GetResponse();
var __output = new List<string>();
using (StreamReader __directoryList = new StreamReader(__response.GetResponseStream()))
{
    string ___line = __directoryList.ReadLine();
    while (___line != null)
    {
        if (!String.IsNullOrEmpty(___line)) { __output.Add(___line); }
        ___line = __directoryList.ReadLine();
    }
}
Getting the target file...
FtpWebRequest __request = null;
FtpWebResponse __response = null;
byte[] __fileBuffer = null;
byte[] __outputBuffer = null;
const int BLOCKSIZE = 8192; // assumed block size; the original constant is defined elsewhere

__request = (FtpWebRequest)FtpWebRequest.Create(__requestLocation);
__request.Method = WebRequestMethods.Ftp.DownloadFile;
__response = (FtpWebResponse)__request.GetResponse();

using (MemoryStream __outputStream = new MemoryStream())
{
    using (Stream __responseStream = __response.GetResponseStream())
    {
        __fileBuffer = new byte[BLOCKSIZE];
        int ___readCount = __responseStream.Read(__fileBuffer, 0, BLOCKSIZE);
        while (___readCount > 0)
        {
            __outputStream.Write(__fileBuffer, 0, ___readCount);
            ___readCount = __responseStream.Read(__fileBuffer, 0, BLOCKSIZE);
        }
        __outputStream.Position = 0;
        __outputBuffer = new byte[__outputStream.Length];
        // Truncate the buffer to only the bytes actually read; store into the output buffer.
        Array.Copy(__outputStream.GetBuffer(), __outputBuffer, __outputStream.Length);
    }
}

try { __response.Close(); } catch { }
__request = null;
__response = null;

return __outputBuffer;
Ripped out of some other code I have, so it probably won't compile and run directly.
I don't know if FtpWebRequest is a strict requirement. If you can use a third-party component, the following code would accomplish your task:
// create client, connect and log in
Ftp client = new Ftp();
client.Connect("ftp.example.org");
client.Login("username", "password");
// download all files in the current directory which matches the "*.xml" mask
// at the server to the 'c:\data' directory
client.GetFiles("*.xml", @"c:\data", FtpBatchTransferOptions.Default);
client.Disconnect();
The code uses Rebex FTP which can be downloaded here.
Disclaimer: I'm involved in the development of this product.