Uploading large file using FtpWebRequest won't complete - C#

The software I designed uploads files to my server over FTP, using FtpWebRequest for all the uploading. When uploading a 700 MB file it transfers about 500 MB and then stops; smaller files upload successfully, but it just won't work properly on large files. The upload runs in a BackgroundWorker that reports progress to a progress bar on the main client. When the BackgroundWorker completes, it executes the completed handler. The completed handler does get executed, but the upload never finishes and the progress bar is stuck at about 65 percent; it's as though the client just stops uploading and runs the completed handler as if the upload had finished. What could be going wrong here? The upload doesn't complete and the file does not appear on the server. Here is the code that does the uploading:
void UploadFileInBackground_DoWork(object sender, DoWorkEventArgs e)
{
    byte[] data;
    int packetsize = 1024 * 8;
    string Destination = UploadURI + cattext + "/" + ID + ".obj";
    string source = DialogBrower.FileName;

    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(Destination);
    request.Credentials = new NetworkCredential("user", "pass");
    request.Method = WebRequestMethods.Ftp.UploadFile;
    request.UsePassive = true;
    request.UseBinary = true;
    request.KeepAlive = false;

    using (FileStream fs = new FileStream(source, FileMode.Open, FileAccess.Read))
    {
        try
        {
            long filesize = fs.Length;
            long sum = 0;
            int count = 0;
            data = new byte[packetsize];
            Stream reqStream = request.GetRequestStream();
            // Cast before dividing, otherwise integer division truncates and
            // the reported percentage drifts on large files.
            float totalpackits = (float)filesize / packetsize;
            float weightofpackit = 100f / totalpackits;
            float percentage = 0;
            while (sum < filesize)
            {
                List<string> statusparms = new List<string>();
                count = fs.Read(data, 0, packetsize);
                reqStream.Write(data, 0, count);
                sum += count;
                percentage += weightofpackit;
                int percentagetotal = Convert.ToInt32(Math.Round(percentage));
                statusparms.Add(sum.ToString());
                statusparms.Add(filesize.ToString());
                UploadFileInBackground.ReportProgress(percentagetotal, statusparms);
            }
            reqStream.Close();
            uploadedname = uploadingname;
        }
        finally
        {
            fs.Dispose();
            data = null;
        }
    }
}

Please try this instead:
request.UseBinary = false;

Also try changing
request.KeepAlive = false;
to
request.KeepAlive = true;
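Two more things worth checking (assumptions on my part, since the question doesn't show the completed handler): FtpWebRequest aborts a transfer that exceeds its default 100-second Timeout, and an exception thrown inside DoWork is not rethrown — it is handed to RunWorkerCompleted in e.Error, which looks exactly like a "silent" early completion. A sketch:

```csharp
// Sketch: relax the timeouts on the same request object as in the question.
// A 700 MB upload can easily exceed the default 100-second Timeout.
request.Timeout = System.Threading.Timeout.Infinite; // no overall deadline
request.ReadWriteTimeout = 30 * 60 * 1000;           // 30 minutes per stream operation

// In the completed handler, check e.Error: a DoWork exception lands there
// instead of crashing the app, so the handler still runs on failure.
void UploadFileInBackground_RunWorkerCompleted(object sender, RunWorkerCompletedEventArgs e)
{
    if (e.Error != null)
        MessageBox.Show(e.Error.ToString()); // shows the real reason the upload stopped
}
```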

Related

Download FTP using FtpWebRequest in Windows Form .Net

I have tried to download an FTP file using C# and have had various problems. What I want to achieve is to show the download progress in a progressBar. It is important that I use Windows Forms and .NET.
I have tried two pieces of code.
My first code works perfectly; that is, I can download the FTP file without problems.
CODE 1
FtpWebRequest dirFtp = (FtpWebRequest)FtpWebRequest.Create(ficFTP);
dirFtp.KeepAlive = true;
dirFtp.UsePassive = UsePassive;
dirFtp.UseBinary = UseBinary;

// User credentials
NetworkCredential cr = new NetworkCredential(user, pass);
dirFtp.Credentials = cr;

FtpWebResponse response = (FtpWebResponse)dirFtp.GetResponse();
long size = (long)response.ContentLength;
Stream responseStream = response.GetResponseStream();
StreamReader reader = new StreamReader(responseStream);

using (FileStream writer = new FileStream(dirLocal, FileMode.Create))
{
    int bufferSize = 2048;
    int readCount;
    byte[] buffer = new byte[bufferSize];

    readCount = responseStream.Read(buffer, 0, bufferSize);
    while (readCount > 0)
    {
        writer.Write(buffer, 0, readCount);
        readCount = responseStream.Read(buffer, 0, bufferSize);
    }
}

lblDescarga.Text = "Downloaded!";
reader.Close();
response.Close();
Problem with this code
My problem with this code is that I can't get the size of the FTP file, so I can't drive the progress bar. In theory this line would tell me the size of my file, but it always returns -1:
long size = (long)response.ContentLength;
Since this did not work as I wanted, I made a post and people recommended the solution from FtpWebRequest FTP download with ProgressBar:
CODE 2
try
{
    const string url = "ftp://185.222.111.11:21/patch/archive.zip";
    NetworkCredential credentials = new NetworkCredential("user", "pass");

    // Query size of the file to be downloaded
    WebRequest sizeRequest = WebRequest.Create(url);
    sizeRequest.Credentials = credentials;
    sizeRequest.Method = WebRequestMethods.Ftp.GetFileSize;
    int size = (int)sizeRequest.GetResponse().ContentLength;

    progressBar1.Invoke(
        (MethodInvoker)(() => progressBar1.Maximum = size));

    // Download the file
    WebRequest request = WebRequest.Create(url);
    request.Credentials = credentials;
    request.Method = WebRequestMethods.Ftp.DownloadFile;

    using (Stream ftpStream = request.GetResponse().GetResponseStream())
    using (Stream fileStream = File.Create(@"C:\tmp\archive.zip"))
    {
        byte[] buffer = new byte[10240];
        int read;
        while ((read = ftpStream.Read(buffer, 0, buffer.Length)) > 0)
        {
            fileStream.Write(buffer, 0, read);
            int position = (int)fileStream.Position;
            progressBar1.Invoke(
                (MethodInvoker)(() => progressBar1.Value = position));
        }
    }
}
catch (Exception e)
{
    MessageBox.Show(e.Message);
}
Problem with this code
The problem with this code is when it gets to this point:
int size = (int)sizeRequest.GetResponse().ContentLength;
Remote server error: (550) File unavailable (e.g., file not found or no access).
The truth is, it makes no sense that I'd lack permission if code 1 works fine; I have the normal FTP permissions. Could someone give me an idea, please?
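One hedged guess (I cannot verify your server): some FTP servers reject the SIZE command in ASCII mode, returning exactly this kind of 550. A sketch that issues the size request with the same URL and credentials as code 1, but forces binary mode (`ficFTP`, `user`, `pass`, and `UsePassive` are the variables from code 1):

```csharp
// Sketch: query the file size in binary mode against the same URL as code 1.
FtpWebRequest sizeRequest = (FtpWebRequest)WebRequest.Create(ficFTP);
sizeRequest.Credentials = new NetworkCredential(user, pass);
sizeRequest.Method = WebRequestMethods.Ftp.GetFileSize;
sizeRequest.UsePassive = UsePassive;
sizeRequest.UseBinary = true; // some servers refuse SIZE in ASCII mode

long size;
using (FtpWebResponse sizeResponse = (FtpWebResponse)sizeRequest.GetResponse())
{
    size = sizeResponse.ContentLength; // length reported by the SIZE command
}
```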

C# - Can't upload a file to FTP server while the file is being written to by another program

I am basically trying to watch a specific file on my PC and, whenever the file's content changes, upload it to my FTP server. The problem I'm facing is that it won't let me upload to my FTP server while the file is currently being written to by another process. I have tried everything, including running as Administrator, but this error (550 Permission denied) comes from the FTP server.
Here is my code:
public static void Trace()
{
    string checksum = "";
    while (true)
    {
        if (CheckMd5(Path) != checksum)
        {
            UploadFile1();
        }
        checksum = CheckMd5(Path);
        Thread.Sleep(5000);
    }
}

public static void UploadFile1()
{
    var ftp1 = new myFTP();
    if (File.Exists(Path))
    {
        var currentTime = CurrentTime; // gets the current time
        ftp1.UploadFile(Path, currentTime);
    }
}

public void UploadFile(string filePath, string CurrentTime)
{
    FtpWebRequest request =
        (FtpWebRequest)WebRequest.Create("ftp://127.0.0.1/" + CurrentTime);
    request.Method = WebRequestMethods.Ftp.UploadFile;
    request.Credentials = new NetworkCredential("user", "password");
    request.UsePassive = true;
    request.UseBinary = true;
    request.KeepAlive = false;
    request.EnableSsl = false;

    FileStream stream = File.Open(filePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
    byte[] buffer = new byte[stream.Length];
    stream.Read(buffer, 0, buffer.Length);
    stream.Close();

    Stream reqStream = request.GetRequestStream();
    reqStream.Write(buffer, 0, buffer.Length);
    reqStream.Close();
}
You should probably restructure those two processes so that the file-changing process fires an event after it's done changing the file, while the FTP-uploading process stops forcing its way in by looping and comparing checksum values (just let it sit dormant and wait for the file-done signal). As a bonus, that approach will improve your app's performance, aside from giving you the accuracy you need.
Aside from that, maybe try using the FileSystemWatcher class. You can filter for modification events only.
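A minimal sketch of the FileSystemWatcher idea (the path and file name are placeholders, and UploadFile1 is the method from the question):

```csharp
// Sketch: react to modification events instead of polling checksums.
var watcher = new FileSystemWatcher(@"C:\watched", "target.dat");
watcher.NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.Size;
watcher.Changed += (s, e) =>
{
    // Changed can fire several times for one save; debounce or re-check
    // the checksum here before uploading.
    UploadFile1();
};
watcher.EnableRaisingEvents = true;
```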

File download in chunks in http-context response C#

I have the below scenario:

1. Client sends a request to Server-1 for a file download.
2. Server-1 sends a request to Server-2 for the file.

To make this work I need to create a mechanism where, once the client sends a request to Server-1, Server-1 requests the file from Server-2, which sends it back as a response output stream in chunks. Server-1 then sends those chunks to the client browser continuously as it keeps receiving them from Server-2.
I have written the code below; theoretically it looks fine, but it still does not work.
The entire file is not downloaded in the client browser; it seems the last chunk is either not transferred to Server-1 or not delivered from Server-1 to the client browser.
Server-1 Code (Where client request for File download)
private void ProccesBufferedResponse(HttpWebRequest webRequest, HttpContext context)
{
    char[] responseChars = null;
    byte[] buffer = null;

    if (webRequest == null)
    {
        logger.Error("Request string is null for Perfios Docs Download at ProccesBufferedResponse()");
    }

    context.Response.Buffer = false;
    context.Response.BufferOutput = false;
    try
    {
        WebResponse webResponse = webRequest.GetResponse();
        context.Response.ContentType = "application/pdf";
        context.Response.AddHeader("Content-disposition", webResponse.Headers["Content-disposition"]);

        StreamReader responseStream = new StreamReader(webResponse.GetResponseStream());
        while (!responseStream.EndOfStream)
        {
            responseChars = new char[responseStream.ToString().ToCharArray().Length];
            responseStream.Read(responseChars, 0, responseChars.Length);
            buffer = Encoding.ASCII.GetBytes(responseChars);

            context.Response.Clear();
            context.Response.OutputStream.Write(buffer, 0, buffer.Length);
            context.Response.Flush();
        }
    }
    catch (Exception ex)
    {
        throw;
    }
    finally
    {
        context.Response.Flush();
        context.Response.End();
    }
}
Server-2 Code (Where Server-1 will send request for file)
private void DownloadInstaPerfiosDoc(int CompanyID, string fileName, string Foldertype)
{
    string folderPath;
    string FilePath;
    int chunkSize = 1024;
    int startIndex = 0;
    int endIndex = 0;
    int length = 0;
    byte[] bytes = null;
    DirectoryInfo dir;

    folderPath = GetDocumentDirectory(CompanyID, Foldertype);
    FilePath = folderPath + "\\" + fileName;
    dir = new DirectoryInfo(folderPath);

    HttpContext.Current.Response.Buffer = false;
    HttpContext.Current.Response.BufferOutput = false;

    if (dir.Exists && dir.GetFiles().Length > 0)
    {
        foreach (var file in dir.GetFiles(fileName))
        {
            FilePath = folderPath + "\\" + file.Name;
            FileStream fsReader = new FileStream(FilePath, FileMode.Open, FileAccess.Read);

            HttpContext.Current.Response.ContentType = "application/pdf";
            HttpContext.Current.Response.AddHeader("Content-disposition", string.Format("attachment; filename = \"{0}\"", fileName));

            int totalChunks = (int)Math.Ceiling((double)fsReader.Length / chunkSize);
            for (int i = 0; i < totalChunks; i++)
            {
                startIndex = i * chunkSize;
                if (startIndex + chunkSize > fsReader.Length)
                    endIndex = (int)fsReader.Length;
                else
                    endIndex = startIndex + chunkSize;

                length = (int)endIndex - startIndex;
                bytes = new byte[length];
                fsReader.Read(bytes, 0, bytes.Length);

                HttpContext.Current.Response.Clear();
                HttpContext.Current.Response.OutputStream.Write(bytes, 0, bytes.Length);
                HttpContext.Current.Response.Flush();
            }
        }
    }
}
Please help me to resolve this issue.
It is possible and feasible. I'll give a pseudo-procedure for you to understand the overall idea.
Server1
download action gets hit
create a request to server2
get the response stream of your server2 request
read the response stream in desired chunk sizes until it's consumed completely
write each chunk (as soon as you read) to current response stream
Server2
download action gets hit
write your stream onto your current response stream however you like
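The Server-1 step could be sketched like this (an assumption-level rewrite built from the question's variables; the plain byte loop replaces the StreamReader, which decodes binary PDF bytes into chars and both corrupts the data and miscomputes lengths):

```csharp
// Sketch: relay Server-2's response to the client in binary chunks.
using (WebResponse webResponse = webRequest.GetResponse())
using (Stream responseStream = webResponse.GetResponseStream())
{
    context.Response.ContentType = "application/pdf";
    context.Response.AddHeader("Content-disposition",
        webResponse.Headers["Content-disposition"]);

    byte[] buffer = new byte[64 * 1024];
    int read;
    // Read raw bytes until the stream is consumed; no char decoding anywhere.
    while ((read = responseStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        context.Response.OutputStream.Write(buffer, 0, read);
        context.Response.Flush(); // push each chunk to the client immediately
    }
}
```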

C# FtpWebRequest progress bar tooltip update with uploaded amount

I'm trying to write the current uploaded amount to my uploadProgress progress bar's tooltip, so that when the user mouses over the progress bar, they can see the tooltip changing, showing the uploaded amount against the file size.
The code I have so far gives me the "busy" icon when I mouse over until the file has finished uploading, and only then shows the uploaded amount and file size.
Could someone help me get this working?
private void uploadFile()
{
    try
    {
        richTextBox1.AppendText("\n\nStarting file upload");

        FtpWebRequest request =
            (FtpWebRequest)WebRequest.Create("ftp://ftpsite.com/public_html/test.htm");
        request.Credentials = new NetworkCredential("username", "password");
        request.UsePassive = true;
        request.UseBinary = true;
        request.KeepAlive = true;
        request.Method = WebRequestMethods.Ftp.UploadFile;

        using (Stream fileStream = File.OpenRead(@"C:\path\testfile.UPLOAD"))
        using (Stream ftpStream = request.GetRequestStream())
        {
            uploadProgress.Invoke(
                (MethodInvoker)delegate {
                    uploadProgress.Maximum = (int)fileStream.Length; });

            byte[] buffer = new byte[10240];
            int read;
            while ((read = fileStream.Read(buffer, 0, buffer.Length)) > 0)
            {
                ftpStream.Write(buffer, 0, read);
                uploadProgress.Invoke(
                    (MethodInvoker)delegate {
                        uploadProgress.Value = (int)fileStream.Position;
                        toolTip1.SetToolTip(
                            uploadProgress, string.Format("{0} MB's / {1} MB's\n",
                                (uploadProgress.Value / 1024d / 1024d).ToString("0.00"),
                                (fileStream.Length / 1024d / 1024d).ToString("0.00")));
                    });
            }
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}
Thanks
Your code works for me, assuming you run uploadFile on a background thread, like:
private void button1_Click(object sender, EventArgs e)
{
    Task.Run(() => uploadFile());
}
See also How can we show progress bar for upload with FtpWebRequest
(though you know that link already)
You just update the tooltip too often, so it flickers.
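One way to reduce the flicker (a sketch; updating only on whole-percent changes is an arbitrary but effective threshold):

```csharp
// Sketch: only touch the ProgressBar/ToolTip when the percentage changes.
int lastPercent = -1;
while ((read = fileStream.Read(buffer, 0, buffer.Length)) > 0)
{
    ftpStream.Write(buffer, 0, read);
    int percent = (int)(fileStream.Position * 100 / fileStream.Length);
    if (percent == lastPercent)
        continue; // skip the UI round trip for sub-percent progress
    lastPercent = percent;

    uploadProgress.Invoke((MethodInvoker)delegate {
        uploadProgress.Value = (int)fileStream.Position;
        toolTip1.SetToolTip(uploadProgress,
            string.Format("{0:0.00} MB / {1:0.00} MB",
                fileStream.Position / 1024d / 1024d,
                fileStream.Length / 1024d / 1024d));
    });
}
```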

C# FtpWebRequest performance

I have an application that downloads files from a Unix FTP server. It works fine; it just has this performance problem: files of 1 KB or less take on average between 2084 and 2400 milliseconds to download, while applications like FileZilla download the same files in less than 1 second each.
Maybe this time is OK for some average users, but it is not acceptable for my application, since I need to download THOUSANDS of files.
I optimized the code as much as I could:
- The cache and the buffer used to read the content are created once, in the constructor of the class.
- I create the network credentials once and re-use them for every file download. I know this works, since the first file takes about 7 s to download and all subsequent downloads are in the 2 s range.
- I varied the buffer size from 2 KB up to 32 KB. I don't know if this helps, since the files I'm downloading are less than 1 KB, so in theory the buffer is filled with all the data in one read from the network.
Maybe it is not related to the network but to the way I'm writing the file, and/or how Windows handles the write?
Can someone please give me some tips on how to reduce the time to something similar to FileZilla's? I need to reduce the time, otherwise my FTP job will be running 24 hours a day for 3 days to finish its task :(
Many thanks in advance.
The code is below. It's not complete; it just shows the download part.
// Created once in the constructor of my class
downloadCache = new MemoryStream(2097152);
downloadBuffer = new byte[32768];

public bool downloadFile(string pRemote, string pLocal, out long downloadTime)
{
    FtpWebResponse response = null;
    Stream responseStream = null;
    downloadTime = 0;
    try
    {
        Stopwatch fileDownloadTime = new Stopwatch();
        fileDownloadTime.Start();

        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(pRemote);
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.UseBinary = false;
        request.AuthenticationLevel = AuthenticationLevel.None;
        request.EnableSsl = false;
        request.Proxy = null;
        // The credentials are created once and re-used for every file
        request.Credentials = this.manager.ftpCredentials;

        response = (FtpWebResponse)request.GetResponse();
        responseStream = response.GetResponseStream();
        downloadCache.Seek(0, SeekOrigin.Begin);

        int bytesSize = 0;
        int cachedSize = 0;

        // Always create an empty file first, because WriteCacheToFile appends
        using (FileStream fileStream = new FileStream(pLocal, FileMode.Create)) { };

        // Download the file until the transfer is completed
        while (true)
        {
            bytesSize = responseStream.Read(downloadBuffer, 0, downloadBuffer.Length);
            if (bytesSize == 0 || 2097152 < cachedSize + bytesSize)
            {
                WriteCacheToFile(pLocal, cachedSize);
                if (bytesSize == 0)
                {
                    break;
                }
                downloadCache.Seek(0, SeekOrigin.Begin);
                cachedSize = 0;
            }
            downloadCache.Write(downloadBuffer, 0, bytesSize);
            cachedSize += bytesSize;
        }

        fileDownloadTime.Stop();
        downloadTime = fileDownloadTime.ElapsedMilliseconds;

        // File downloaded OK
        return true;
    }
    catch (Exception)
    {
        return false;
    }
    finally
    {
        if (response != null)
        {
            response.Close();
        }
        if (responseStream != null)
        {
            responseStream.Close();
        }
    }
}

private void WriteCacheToFile(string downloadPath, int cachedSize)
{
    using (FileStream fileStream = new FileStream(downloadPath, FileMode.Append))
    {
        byte[] cacheContent = new byte[cachedSize];
        downloadCache.Seek(0, SeekOrigin.Begin);
        downloadCache.Read(cacheContent, 0, cachedSize);
        fileStream.Write(cacheContent, 0, cachedSize);
    }
}
Sounds to me like your problem is related to Nagle's algorithm used in the TCP client.
You can try turning Nagle's algorithm off and also set SendChunked to false.
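A sketch of that tweak, plus keeping the control connection open between files, which for thousands of tiny downloads usually matters more than Nagle: with KeepAlive = false every file pays the full FTP login handshake again (treat the exact gains as an assumption to measure):

```csharp
// Sketch: disable Nagle globally before the first request is created,
// then let every request share one authenticated control connection.
ServicePointManager.UseNagleAlgorithm = false;

FtpWebRequest request = (FtpWebRequest)WebRequest.Create(pRemote);
request.Method = WebRequestMethods.Ftp.DownloadFile;
request.Credentials = this.manager.ftpCredentials;
request.Proxy = null;
request.KeepAlive = true;                  // keep the control connection open
request.ConnectionGroupName = "ftp-batch"; // reuse it for subsequent files
request.ServicePoint.ConnectionLimit = 4;  // a few parallel transfers can also help
```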
