I think I have a memory issue with a function that should download a file from an FTP server. The reason I think it's a memory thing is that it works fine while debugging (maybe that gives the garbage collector more time). Yet I thought the using blocks should take care of this...
Just to be clear: the function works when called once, yet calling it several times in a for loop produces the error message: 550 The specified network name is no longer available.
Please help
Asaf
private void downloadFile(string sourceFile, string targetFolder)
{
    string remoteFile = sourceFile.Replace("\\", "//");
    string localFolder = targetFolder + "\\" + sourceFile.Substring(sourceFile.LastIndexOf("\\") + 1);
    string filename = "ftp://" + ftpServerIP + "//" + remoteFile;

    FtpWebRequest ftpReq = (FtpWebRequest)WebRequest.Create(filename);
    ftpReq.Method = WebRequestMethods.Ftp.DownloadFile;
    ftpReq.Credentials = new NetworkCredential(ftpUserID, ftpPassword);
    ftpReq.UseBinary = true;
    ftpReq.Proxy = null;
    ftpReq.KeepAlive = false; // 3. Settings and action

    try
    {
        using (System.Net.FtpWebResponse response = (System.Net.FtpWebResponse)(ftpReq.GetResponse()))
        {
            using (System.IO.Stream responseStream = response.GetResponseStream())
            {
                using (System.IO.FileStream fs = new System.IO.FileStream(localFolder, System.IO.FileMode.Create))
                {
                    Byte[] buffer = new byte[2047];
                    int read = 0;
                    do
                    {
                        read = responseStream.Read(buffer, 0, buffer.Length);
                        fs.Write(buffer, 0, read);
                    } while (read == 0);
                    responseStream.Close();
                    fs.Flush();
                    fs.Close();
                }
                responseStream.Close();
            }
            response.Close();
        }
    }
    catch (WebException ex)
    {
        FtpWebResponse response = (FtpWebResponse)ex.Response;
        Console.Out.WriteLine(response.StatusDescription);
    }
}
There's a bug in the read loop that truncates any file larger than a single buffer: the do/while condition is read == 0, so the loop exits after the first successful read. Use:
int read;
while ((read = responseStream.Read(buffer, 0, buffer.Length)) != 0)
{
    fs.Write(buffer, 0, read);
}
With that change in place I was able to download a number of large files via FTP without encountering exceptions.
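As a side note, if the project targets .NET 4.0 or later, Stream.CopyTo can replace the manual buffer loop entirely. A minimal sketch of the inner part of the method under that assumption, reusing the response and localFolder variables from the code above (and assuming a using System.IO directive):

using (Stream responseStream = response.GetResponseStream())
using (FileStream fs = new FileStream(localFolder, FileMode.Create))
{
    // CopyTo keeps reading until the response stream is exhausted,
    // so a short read cannot truncate the file.
    responseStream.CopyTo(fs);
}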
I have two methods (C# 4.8): one that uploads a file to a server directory using FTP, and one that downloads a file from a server using FTP. Each of these works fine independently. However, when I call the UploadFile(...) method followed by the DownloadFile(...) method, an exception is thrown at the FtpWebRequest.Create(...) call in the DownloadFile(...) method.
I can call DownloadFile(...) repeatedly without problems.
Am I missing something, perhaps a step to close the connection?
The exception is:
Message: "The remote server returned an error: 226-File successfully transferred\r\n226 1.707 seconds (measured here), 4.59 Mbytes per second\r\n."
InnerException: null
Status: ProtocolError
My code is:
public void UploadFile(FileInfo SourceFile, string Domain, string TargetFileName, string Username, string Password)
{
    FtpWebRequest _FTPRequest = (FtpWebRequest)WebRequest.Create(Domain + TargetFileName);
    _FTPRequest.Credentials = new NetworkCredential(Username, Password);
    _FTPRequest.Method = WebRequestMethods.Ftp.UploadFile;
    _FTPRequest.UseBinary = true;
    _FTPRequest.KeepAlive = false;
    _FTPRequest.UsePassive = true;

    using (FileStream fileStream = SourceFile.OpenRead())
    using (FtpWebResponse response = (FtpWebResponse)_FTPRequest.GetResponse())
    using (Stream ftpStream = _FTPRequest.GetRequestStream())
    {
        byte[] buffer = new byte[1024];
        int iRead;
        while ((iRead = fileStream.Read(buffer, 0, buffer.Length)) > 0)
            ftpStream.Write(buffer, 0, iRead);
    }
}
public void DownloadFile(ref byte[] TargetArray, string Domain, string SourceFileName, string Username, string Password)
{
    FtpWebRequest _FTPRequest = (FtpWebRequest)WebRequest.Create(Domain + SourceFileName); // exception is thrown here //
    _FTPRequest.Credentials = new NetworkCredential(Username, Password);
    _FTPRequest.Method = WebRequestMethods.Ftp.DownloadFile;
    _FTPRequest.UseBinary = true;
    _FTPRequest.KeepAlive = false;
    _FTPRequest.UsePassive = true;

    using (FtpWebResponse response = (FtpWebResponse)_FTPRequest.GetResponse())
    using (Stream ftpStream = response.GetResponseStream())
    {
        byte[] buffer = new byte[1024];
        int totalRead = 0;
        int iRead;
        while ((iRead = ftpStream.Read(buffer, 0, buffer.Length)) > 0)
        {
            if (buffer.Length > iRead) Array.Resize<byte>(ref buffer, iRead);
            Array.Resize<byte>(ref TargetArray, TargetArray.Length + iRead);
            buffer.CopyTo(TargetArray, totalRead);
            totalRead += iRead;
        }
    }
}
It appears that calling _FTPRequest.GetResponse() before _FTPRequest.GetRequestStream() actually opened two separate requests, and they were conflicting. I'm not yet clear on why it only affected the subsequent download method, but once I removed that line completely, the methods worked as expected.
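For reference, a minimal sketch of what the UploadFile body looks like with that fix applied (the GetResponse() using block removed, so only one request is in flight):

using (FileStream fileStream = SourceFile.OpenRead())
using (Stream ftpStream = _FTPRequest.GetRequestStream())
{
    byte[] buffer = new byte[1024];
    int iRead;
    while ((iRead = fileStream.Read(buffer, 0, buffer.Length)) > 0)
        ftpStream.Write(buffer, 0, iRead);
}

// Optionally, read the response after the request stream is closed to check the
// server's status; at that point it no longer conflicts with the upload:
// using (FtpWebResponse response = (FtpWebResponse)_FTPRequest.GetResponse()) { }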
I have been trying to download an FTP file using C# and have had various problems. What I want to achieve is to show the download progress in a progress bar. It is important that I use Windows Forms and .NET.
I have tried two pieces of code.
My first code works perfectly; that is, I can download the FTP file without problems.
CODE 1
FtpWebRequest dirFtp = ((FtpWebRequest)FtpWebRequest.Create(ficFTP));
dirFtp.KeepAlive = true;
dirFtp.UsePassive = UsePassive;
dirFtp.UseBinary = UseBinary;

// User credentials
NetworkCredential cr = new NetworkCredential(user, pass);
dirFtp.Credentials = cr;

FtpWebResponse response = (FtpWebResponse)dirFtp.GetResponse();
long size = (long)response.ContentLength;
Stream responseStream = response.GetResponseStream();
StreamReader reader = new StreamReader(responseStream);

using (FileStream writer = new FileStream(dirLocal, FileMode.Create))
{
    int bufferSize = 2048;
    int readCount;
    byte[] buffer = new byte[2048];

    readCount = responseStream.Read(buffer, 0, bufferSize);
    while (readCount > 0)
    {
        writer.Write(buffer, 0, readCount);
        readCount = responseStream.Read(buffer, 0, bufferSize);
    }
}

lblDescarga.Text = "¡Downloaded!";
reader.Close();
response.Close();
Problem with this code
My problem with this code is that I can't get the size of the FTP file in order to drive the progress bar. In theory this line should tell me the size of my file, but it always returns -1:
long size = (long)response.ContentLength;
Since this did not work as I wanted, I made a post, and people recommended this solution: FtpWebRequest FTP download with ProgressBar:
CODE 2
try
{
    const string url = "ftp://185.222.111.11:21/patch/archive.zip";
    NetworkCredential credentials = new NetworkCredential("user", "pass");

    // Query size of the file to be downloaded
    WebRequest sizeRequest = WebRequest.Create(url);
    sizeRequest.Credentials = credentials;
    sizeRequest.Method = WebRequestMethods.Ftp.GetFileSize;
    int size = (int)sizeRequest.GetResponse().ContentLength;

    progressBar1.Invoke(
        (MethodInvoker)(() => progressBar1.Maximum = size));

    // Download the file
    WebRequest request = WebRequest.Create(url);
    request.Credentials = credentials;
    request.Method = WebRequestMethods.Ftp.DownloadFile;

    using (Stream ftpStream = request.GetResponse().GetResponseStream())
    using (Stream fileStream = File.Create(@"C:\tmp\archive.zip"))
    {
        byte[] buffer = new byte[10240];
        int read;
        while ((read = ftpStream.Read(buffer, 0, buffer.Length)) > 0)
        {
            fileStream.Write(buffer, 0, read);
            int position = (int)fileStream.Position;
            progressBar1.Invoke(
                (MethodInvoker)(() => progressBar1.Value = position));
        }
    }
}
catch (Exception e)
{
    MessageBox.Show(e.Message);
}
Problem with this code
The problem with this code is when it gets to this point:
int size = (int)sizeRequest.GetResponse().ContentLength;
The remote server returned an error: (550) File unavailable (e.g., file not found, no access).
It can hardly be a permissions issue if code 1 works fine, and I do have the normal FTP permissions. Could someone give me an idea, please?
I'm using .NET 3.5 and I need to transfer some files by FTP.
I don't want to use temporary files on disk, because I manage everything with MemoryStream and byte arrays.
Reading these articles (article and article), I made my client.
public void Upload(byte[] fileBytes, string remoteFile)
{
    try
    {
        string uri = string.Format("{0}:{1}/{2}", Hostname, Port, remoteFile);

        FtpWebRequest ftp = (FtpWebRequest)WebRequest.Create(uri);
        ftp.Credentials = new NetworkCredential(Username.Normalize(), Password.Normalize());
        ftp.UseBinary = true;
        ftp.UsePassive = true;
        ftp.Method = WebRequestMethods.Ftp.UploadFile;

        using (Stream localFileStream = new MemoryStream(fileBytes))
        {
            using (Stream ftpStream = ftp.GetRequestStream())
            {
                int bufferSize = (int)Math.Min(localFileStream.Length, 2048);
                byte[] buffer = new byte[bufferSize];

                int bytesSent = -1;
                while (bytesSent != 0)
                {
                    bytesSent = localFileStream.Read(buffer, 0, bufferSize);
                    ftpStream.Write(buffer, 0, bufferSize);
                }
            }
        }
    }
    catch (Exception ex)
    {
        LogHelper.WriteLog(logs, "Errore Upload", ex);
        throw;
    }
}
The FTP client connects, writes, and closes correctly without any error. But the written files are corrupted: PDFs cannot be opened, and for DOC/DOCX files Word shows a message about file corruption and tries to restore them.
If I write the same bytes that are passed to the Upload method to a local file, I get a correct file. So the problem must be in the FTP transfer.
byte[] fileBytes = memoryStream.ToArray();
File.WriteAllBytes(@"C:\test.pdf", fileBytes); // --> File OK!
ftpClient.Upload(fileBytes, remoteFile); // --> File CORRUPTED on FTP folder!
You need to use bytesSent in the Write call:
bytesSent = localFileStream.Read(buffer, 0, bufferSize);
ftpStream.Write(buffer, 0, bytesSent);
Otherwise you write too many bytes in the last round.
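Put together, the corrected copy loop would look something like this (a minimal sketch, using the same buffer, bufferSize, and streams as in the Upload method above):

int bytesSent;
while ((bytesSent = localFileStream.Read(buffer, 0, bufferSize)) > 0)
{
    // Write only the bytes actually read; the final Read usually
    // returns fewer than bufferSize bytes.
    ftpStream.Write(buffer, 0, bytesSent);
}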
Can anyone help with a small problem I am having? I have a WCF REST-based service with a function that accepts a stream; this will be used for uploading images/audio/video to the server and then storing them somewhere on the server.
Testing with an image, it appears to work: I select the image in the client, and a few seconds later it appears on the server in the expected location. But when I try to open the image in Windows Picture Viewer (or any image viewer), I get "No Preview Available" and no image to view.
I am assuming it is because I am not recreating the file correctly from the stream.
This is the method on the WCF REST service:
public void PutFileInFolder(int eid, Stream fileContents)
{
    try
    {
        byte[] buffer = new byte[32768];
        MemoryStream ms = new MemoryStream();
        int bytesRead = 0;
        int totalBytesRead = 0;

        do
        {
            bytesRead = fileContents.Read(buffer, 0, buffer.Length);
            totalBytesRead += bytesRead;
            ms.Write(buffer, 0, bytesRead);
        } while (bytesRead > 0);

        // now have the file in a MemoryStream
        // save the file to the user's folder
        FileStream file = new FileStream(@"C:\bd_sites\ttgme\wwwroot\Evidence\{" + ed.LearnerID + @"}\" + ed.EvidenceFileName, FileMode.Create, System.IO.FileAccess.Write);
        byte[] bytes = new byte[ms.Length];
        ms.Read(bytes, 0, (int)ms.Length);
        file.Write(bytes, 0, bytes.Length);
        file.Close();
        ms.Close();
    }
    catch (Exception ex)
    {
        return;
    }
}
And this is the client function for sending the file/image
private void PostFile(EvidenceObject eo)
{
    try
    {
        // Create the REST request.
        string url = ConfigurationManager.AppSettings["serviceUrl"];
        string requestUrl = string.Format("{0}/PutFileInFolder/{0}", 1001);

        HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(requestUrl);
        request.Method = "POST";
        request.ContentType = "text/plain";

        byte[] fileToSend = File.ReadAllBytes(txtFileName.Text);
        request.ContentLength = fileToSend.Length;

        using (Stream requestStream = request.GetRequestStream())
        {
            // Send the file as the request body.
            requestStream.Write(fileToSend, 0, fileToSend.Length);
            requestStream.Close();
        }

        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            Console.WriteLine("HTTP/{0} {1} {2}", response.ProtocolVersion, (int)response.StatusCode, response.StatusDescription);

        MessageBox.Show("File successfully uploaded.", "Upload", MessageBoxButton.OK, MessageBoxImage.Information);
        this.DialogResult = true;
    }
    catch (Exception ex)
    {
        MessageBox.Show("Error during file upload: " + ex.Message, "Upload", MessageBoxButton.OK, MessageBoxImage.Error);
    }
}
I also just tested a video file: the original file plays fine, but when I upload it through the service, the file created on the server won't play.
I am sure it is something really dumb I am doing, but any help is really appreciated.
The problem was the way I was writing to the file stream: I wasn't actually writing out the bytes of the original file, but rather fresh bytes, producing a file of the same size with essentially none of the original content.
This was the change to the code:
//FileStream file = new FileStream(@"C:\bd_sites\ttgme\wwwroot\Evidence\{" + ed.LearnerID + @"}\" + ed.EvidenceFileName, FileMode.Create, System.IO.FileAccess.Write);
//byte[] bytes = new byte[ms.Length];
////ms.Read(buffer, 0, (int)ms.Length);
//file.Write(bytes, 0, bytes.Length);
//file.Close();
//ms.Close();

using (FileStream fs = File.OpenWrite(@"C:\bd_sites\ttgme\wwwroot\Evidence\{" + ed.LearnerID + @"}\" + ed.EvidenceFileName))
{
    ms.WriteTo(fs);
    fs.Close();
    ms.Close();
}
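For what it's worth, the original Read-based version fails because, after the copy loop, ms.Position sits at the end of the MemoryStream, so ms.Read(bytes, 0, (int)ms.Length) reads nothing and bytes stays full of zeros; ms.WriteTo always writes the full contents from the beginning. If you prefer to keep a byte array, a minimal sketch of an equivalent fix (targetPath is a placeholder for the Evidence-folder path built above):

// ToArray copies the MemoryStream's full contents regardless of its current Position.
byte[] bytes = ms.ToArray();

// targetPath stands in for the Evidence-folder path from the code above.
File.WriteAllBytes(targetPath, bytes);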
I'm having some problems copying a zip file from an FTP location. It just copies an empty file, so I think there is something wrong with my use of StreamReader or StreamWriter.
Here is the code:
// read through directory details response
string line = reader.ReadLine();
while (line != null)
{
    if (line.EndsWith("zip")) // "d" = dir don't need "." or ".." dirs
    {
        FtpWebRequest downloadRequest = (FtpWebRequest)FtpWebRequest.Create("ftp://" + ftpHost + line); // new Uri("ftp://" + ftpServerIP + DestinationFolder + fileInf.Name));
        downloadRequest.Credentials = new NetworkCredential(ConfigurationManager.AppSettings["FilesUser"], ConfigurationManager.AppSettings["FilesPass"]);
        downloadRequest.KeepAlive = false;
        downloadRequest.UseBinary = true;
        downloadRequest.Method = WebRequestMethods.Ftp.DownloadFile;

        string folderToWrite = HttpContext.Current.Server.MapPath("~/Routing/RoutingFiles/");
        string folderToSave = HttpContext.Current.Server.MapPath("~/Routing/");

        StreamReader downloadRequestReader = new StreamReader(downloadRequest.GetResponse().GetResponseStream());
        DirectoryInfo downloadDirectory = new DirectoryInfo(folderToWrite);
        FileInfo file = new FileInfo(Path.Combine(downloadDirectory.FullName, line));

        if (!file.Exists)
        {
            StreamWriter writer = new StreamWriter(Path.Combine(folderToWrite, line), false);
            writer.Write(downloadRequestReader.ReadToEnd());
            using (var downloadResponseStream = response.GetResponseStream())
            {
            }
        }
    }
}
By the time it gets to the bottom of that section, the file has been copied but is empty, so I don't think I'm reading the stream correctly for a zip file. Does anyone have any ideas? I've seen talk of FileStream being better for downloading zip files, but I couldn't get that to work either.
Thanks.
Here is an example that downloads a file from an FTP server.
try
{
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(ftpAddr + "test.zip");
    request.Credentials = new NetworkCredential(userName, password);
    request.UseBinary = true; // Use binary to ensure correct delivery!
    request.Method = WebRequestMethods.Ftp.DownloadFile;

    FtpWebResponse response = (FtpWebResponse)request.GetResponse();
    Stream responseStream = response.GetResponseStream();
    FileStream writer = new FileStream("test.zip", FileMode.Create);

    long length = response.ContentLength;
    int bufferSize = 2048;
    int readCount;
    byte[] buffer = new byte[2048];

    readCount = responseStream.Read(buffer, 0, bufferSize);
    while (readCount > 0)
    {
        writer.Write(buffer, 0, readCount);
        readCount = responseStream.Read(buffer, 0, bufferSize);
    }

    responseStream.Close();
    response.Close();
    writer.Close();
}
catch (Exception e)
{
    Console.WriteLine(e.ToString());
}
Edit: I'm sorry for the error in the previous code.
When correcting my previous code, I found the following resource useful: example
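Tying this back to the code in the question: the empty file most likely comes from pushing binary zip data through StreamReader/StreamWriter (which apply text decoding) and from the StreamWriter never being flushed or closed. A minimal sketch of the inner download step using raw streams instead, assuming the same downloadRequest, folderToWrite, and line variables from the question:

using (Stream downloadStream = downloadRequest.GetResponse().GetResponseStream())
using (FileStream fileStream = new FileStream(Path.Combine(folderToWrite, line), FileMode.Create))
{
    byte[] buffer = new byte[2048];
    int read;
    while ((read = downloadStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        // Copy raw bytes; no text decoding is involved.
        fileStream.Write(buffer, 0, read);
    }
}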