I'm using .NET 3.5 and I need to transfer some files by FTP.
I don't want to use temporary files on disk, because I handle everything with MemoryStream and byte arrays.
Following these articles (article and article), I wrote my client.
public void Upload(byte[] fileBytes, string remoteFile)
{
    try
    {
        string uri = string.Format("{0}:{1}/{2}", Hostname, Port, remoteFile);
        FtpWebRequest ftp = (FtpWebRequest)WebRequest.Create(uri);
        ftp.Credentials = new NetworkCredential(Username.Normalize(), Password.Normalize());
        ftp.UseBinary = true;
        ftp.UsePassive = true;
        ftp.Method = WebRequestMethods.Ftp.UploadFile;

        using (Stream localFileStream = new MemoryStream(fileBytes))
        {
            using (Stream ftpStream = ftp.GetRequestStream())
            {
                int bufferSize = (int)Math.Min(localFileStream.Length, 2048);
                byte[] buffer = new byte[bufferSize];
                int bytesSent = -1;
                while (bytesSent != 0)
                {
                    bytesSent = localFileStream.Read(buffer, 0, bufferSize);
                    ftpStream.Write(buffer, 0, bufferSize);
                }
            }
        }
    }
    catch (Exception ex)
    {
        LogHelper.WriteLog(logs, "Errore Upload", ex);
        throw;
    }
}
The FTP client connects, writes and closes correctly, without any error. But the uploaded files are corrupted: PDFs cannot be opened, and for DOC/DOCX Word reports corruption and tries to recover them.
If I write the same bytes passed to the Upload method to a local file, I get a correct file, so the problem must be in the FTP transfer.
byte[] fileBytes = memoryStream.ToArray();
File.WriteAllBytes(@"C:\test.pdf", fileBytes); // --> File OK!
ftpClient.Upload(fileBytes, remoteFile);       // --> File CORRUPTED on FTP folder!
You need to use bytesSent in the Write call:
bytesSent = localFileStream.Read(buffer, 0, bufferSize);
ftpStream.Write(buffer, 0, bytesSent);
Otherwise you write too many bytes in the last round: the final Read usually fills only part of the buffer, but the whole buffer (including leftover bytes from the previous read) still gets sent, which corrupts the end of the file.
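Putting the fix into the copy loop from the question, a minimal sketch looks like this:

using (Stream localFileStream = new MemoryStream(fileBytes))
using (Stream ftpStream = ftp.GetRequestStream())
{
    byte[] buffer = new byte[2048];
    int bytesSent;
    // Write exactly as many bytes as each Read returned; stop when Read returns 0.
    while ((bytesSent = localFileStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        ftpStream.Write(buffer, 0, bytesSent);
    }
}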
Related
I am trying to modify this FTP connection method to upload multiple files over a single connection.
As you can see, I have a while loop that iterates through a file name and file path array, and I have set the WebRequest to keep alive. I am not sure what I should move out of the loop to stop new connections from constantly being opened.
This is the error I am getting:
The remote server returned an error: (550) File unavailable (e.g., file not found, no access).
Thanks in advance!
public string FTPUploadMultipleFiles(string ftpURL, string Username, string Password, string[] filePaths, string[] fileNames)
{
    string result = "OK";
    try
    {
        int Counter = 0;
        if (filePaths.Count() == fileNames.Count())
            while (Counter <= filePaths.Count() - 1)
            {
                FileInfo fileInf = new FileInfo(filePaths[Counter]);
                FtpWebRequest reqFTP;
                // Create FtpWebRequest object from the Uri provided
                reqFTP = (FtpWebRequest)FtpWebRequest.Create(new Uri(ftpURL + "/" + fileNames[Counter]));
                reqFTP.Credentials = new NetworkCredential(Username, Password);
                reqFTP.KeepAlive = true;
                // Specify the command to be executed.
                reqFTP.Method = WebRequestMethods.Ftp.UploadFile;
                // Specify the data transfer type.
                reqFTP.UsePassive = true;
                reqFTP.UseBinary = true;
                // Notify the server about the size of the uploaded file
                reqFTP.ContentLength = fileInf.Length;
                // The buffer size is set to 200 KB
                int buffLength = 204800;
                byte[] buff = new byte[buffLength];
                int contentLen;
                // Open a file stream (System.IO.FileStream) to read the file to be uploaded
                FileStream fs = fileInf.OpenRead();
                try
                {
                    // Stream to which the file to be uploaded is written
                    Stream strm = reqFTP.GetRequestStream();
                    // Read from the file stream one buffer at a time
                    contentLen = fs.Read(buff, 0, buffLength);
                    // Till stream content ends
                    while (contentLen != 0)
                    {
                        // Write content from the file stream to the FTP upload stream
                        strm.Write(buff, 0, contentLen);
                        contentLen = fs.Read(buff, 0, buffLength);
                    }
                    // Close the file stream and the request stream
                    strm.Close();
                    fs.Close();
                    Counter++;
                }
                catch (Exception ex)
                {
                    result = ex.Message;
                }
            }
    }
    catch (Exception ex)
    {
        result = ex.Message;
    }
    return result;
}
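One thing to try (a sketch, assuming the 550 is not a pure permissions problem): keep KeepAlive = true and give every request the same ConnectionGroupName and credentials, so the FtpWebRequest instances created inside the loop share one pooled control connection instead of logging in again for each file. A new request object still has to be created per file; what effectively moves "out of the loop" is the login/connection.

// Sketch: the control connection/login is shared via the connection pool.
for (int i = 0; i < filePaths.Length; i++)
{
    FtpWebRequest req = (FtpWebRequest)WebRequest.Create(new Uri(ftpURL + "/" + fileNames[i]));
    req.Credentials = new NetworkCredential(Username, Password);
    req.Method = WebRequestMethods.Ftp.UploadFile;
    req.UseBinary = true;
    req.UsePassive = true;
    req.KeepAlive = true;                    // keep the control connection open after each upload
    req.ConnectionGroupName = "ftp-uploads"; // same group name => same pooled connection

    using (FileStream fs = File.OpenRead(filePaths[i]))
    using (Stream rs = req.GetRequestStream())
    {
        byte[] buff = new byte[8192];
        int read;
        while ((read = fs.Read(buff, 0, buff.Length)) > 0)
            rs.Write(buff, 0, read);
    }
    // Reading and closing the response completes the STOR command cleanly.
    ((FtpWebResponse)req.GetResponse()).Close();
}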
I have been working on an application that lets the user log in to another website and then download a specified file from that server. So far I have succeeded in logging in and downloading the file, but everything falls apart when it comes to zip files.
Is there any chunk of code that could be helpful in reading .zip files byte by byte, or by using a stream reader?
I'm using DownloadFile(), but it's not returning a correct zip file.
I need a method by which I can read zip files. Can I do it by using ByteReader()?
The code used to download the zip file is:
string filename = "13572_BranchInformationReport_2012-05-22.zip";
string filepath = "C:\\Documents and Settings\\user\\Desktop\\" + filename.ToString();
WebClient client = new WebClient();
string user = "abcd", pass = "password";
client.Credentials = new NetworkCredential(user, pass);
client.Encoding = System.Text.Encoding.UTF8;
try
{
    client.DownloadFile("https://web.site/archive/13572_BranchInformationReport_2012-05-22.zip", filepath);
    Response.Write("Success");
}
catch (Exception ue)
{
    Response.Write(ue.Message);
}
Thanks in advance.
> is there any chunk of code that could be helpful in reading the zip files byte by byte or by using a stream reader?
Absolutely not. StreamReader - and indeed any TextReader - is for reading text content, not binary content. A zip file is not text - it's composed of bytes, not characters.
If you're reading binary content such as zip files, you should be using a Stream rather than a TextReader of any kind.
Note that WebClient.DownloadFile and WebClient.DownloadData can generally make things easier for downloading binary content.
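For example, a minimal sketch with WebClient.DownloadData, reusing the URL and path from the question (no Encoding setting is needed, because the bytes are never treated as text):

WebClient client = new WebClient();
client.Credentials = new NetworkCredential("abcd", "password");
// DownloadData returns the raw bytes; write them to disk untouched.
byte[] zipBytes = client.DownloadData("https://web.site/archive/13572_BranchInformationReport_2012-05-22.zip");
File.WriteAllBytes("C:\\Documents and Settings\\user\\Desktop\\13572_BranchInformationReport_2012-05-22.zip", zipBytes);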
Another simple way to download a zip file:
<asp:HyperLink ID="HyperLink1" runat="server" NavigateUrl="~/DOWNLOAD/Filename.zip">Click To Download</asp:HyperLink>
Another solution:
private void DownloadFile()
{
    string getPath = "DOWNLOAD/FileName.zip";
    System.IO.Stream iStream = null;
    byte[] buffer = new Byte[1024];
    // Length of the file:
    int length;
    // Total bytes to read:
    long dataToRead;
    // Identify the file to download including its path.
    string filepath = Server.MapPath(getPath);
    // Identify the file name.
    string filename = System.IO.Path.GetFileName(filepath);
    try
    {
        // Open the file.
        iStream = new System.IO.FileStream(filepath, System.IO.FileMode.Open,
                                           System.IO.FileAccess.Read, System.IO.FileShare.Read);
        // Total bytes to read:
        dataToRead = iStream.Length;
        // Page.Response.ContentType = "application/vnd.android.package-archive";
        // Page.Response.ContentType = "application/octet-stream";
        Page.Response.AddHeader("Content-Disposition", "attachment; filename=" + filename);
        // Read the bytes.
        while (dataToRead > 0)
        {
            // Verify that the client is connected.
            if (Response.IsClientConnected)
            {
                // Read the data into the buffer.
                length = iStream.Read(buffer, 0, 1024);
                // Write the data to the current output stream.
                Page.Response.OutputStream.Write(buffer, 0, length);
                // Flush the data to the HTML output.
                Page.Response.Flush();
                // buffer = new Byte[1024];
                dataToRead = dataToRead - length;
            }
            else
            {
                // Prevent an infinite loop if the user disconnects.
                dataToRead = -1;
            }
        }
    }
    catch (Exception ex)
    {
        // Trap the error, if any.
        Page.Response.Write(ex.Message);
    }
    finally
    {
        if (iStream != null)
        {
            // Close the file.
            iStream.Close();
            Page.Response.Close();
        }
    }
}
WebRequest objRequest = System.Net.HttpWebRequest.Create(url);
objResponse = objRequest.GetResponse();
byte[] buffer = new byte[32768];
using (Stream input = objResponse.GetResponseStream())
{
    using (FileStream output = new FileStream("test.doc", FileMode.CreateNew))
    {
        int bytesRead;
        while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            output.Write(buffer, 0, bytesRead);
        }
    }
}
This is how I achieved it. Thanks everyone for your help.
Problem: When I FTP upload only one file at a time, the files are uploaded fine. But when I use multiple BackgroundWorkers to upload files to the FTP server, I get this exception:

> ex {"The remote server returned an error: (550) File unavailable (e.g., file not found, no access)."} System.Exception {System.Net.WebException}

And only some of the files get uploaded. I am pretty sure the file exists at that location; in fact, in another run the file it was complaining about gets downloaded fine, but the error shifts to another file.
Code description:
In the code below I am downloading a file from one FTP server and putting it on another. This code is inside a BackgroundWorker DoWork method, and the BackgroundWorkers are created inside a loop.
void imageDownloadWorker_DoWork(object sender, DoWorkEventArgs e)
{
    string[] ftpInfo = (string[])e.Argument;
    try
    {
        /////////////////////////// Downloading ///////////////////////////////////////
        string uri = String.Format("ftp://{0}/{1}/images/{2}", ftpInfo[1], ftpInfo[2], ftpInfo[5]);
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(uri);
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.UseBinary = true;
        request.Credentials = new NetworkCredential(ftpInfo[3], ftpInfo[4]);
        FtpWebResponse response = (FtpWebResponse)request.GetResponse();
        Stream ftpStream = response.GetResponseStream();
        long cl = response.ContentLength;
        int bufferSize = 4096;
        int readCount = 0;
        byte[] buffer = new byte[bufferSize];
        MemoryStream memStream = new MemoryStream();
        readCount = ftpStream.Read(buffer, 0, bufferSize);
        while (readCount > 0)
        {
            memStream.Write(buffer, 0, readCount);
            readCount = ftpStream.Read(buffer, 0, bufferSize);
        }
        response.Close();

        /////////////////////////// Uploading ///////////////////////////////////////
        string uri1 = String.Format("ftp://{0}/{1}/{2}", "127.0.0.1", string.Empty, ftpInfo[5]);
        FtpWebRequest request1 = (FtpWebRequest)WebRequest.Create(uri1);
        request1.Credentials = new NetworkCredential("user", "password");
        request1.KeepAlive = false;
        request1.Method = WebRequestMethods.Ftp.UploadFile;
        request1.UseBinary = true;
        request1.ContentLength = memStream.Length;
        int buffLength = 4096;
        byte[] buff = new byte[buffLength];
        int contentLen;
        // Stream to which the file to be uploaded is written
        Stream strm = request1.GetRequestStream();
        memStream.Seek(0, SeekOrigin.Begin);
        contentLen = memStream.Read(buff, 0, buffLength);
        // Till stream content ends
        while (contentLen != 0)
        {
            // Write content from the memory stream to the FTP upload stream
            strm.Write(buff, 0, contentLen);
            contentLen = memStream.Read(buff, 0, buffLength);
        }
        // Close the file stream and the request stream
        strm.Close();
        ftpStream.Close();
        memStream.Close();
    }
    catch (Exception ex)
    {
        MessageBox.Show("While Downloading File " + ftpInfo[5] + " " + ex.ToString(), "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
        e.Result = null;
        return;
    }
}
Related Thread
Edit:
In FileZilla Server there is an option, General Settings > Performance Settings > Number of Threads. I have set that to 20, but it didn't make any difference.
There may be nothing wrong with your code. That error is a permissions error.
Complete stab in the dark, but does the upload target server have a connection per IP limit? If so you may be falling foul of this by exceeding the concurrent connection limit from a single IP address.
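If that is the cause, one way to test it (a rough sketch; the limit of 2 is arbitrary) is to gate the workers with a semaphore so only a couple of transfers run at the same time:

// Sketch: cap how many BackgroundWorkers talk to the FTP servers concurrently.
static readonly Semaphore ftpSlots = new Semaphore(2, 2);

void imageDownloadWorker_DoWork(object sender, DoWorkEventArgs e)
{
    ftpSlots.WaitOne();          // block until one of the 2 slots is free
    try
    {
        // ... existing download/upload code from the question ...
    }
    finally
    {
        ftpSlots.Release();      // release the slot even if the transfer throws
    }
}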
I'm trying to write a program which will download a few files from an FTP server, zip them up, then upload them again to the same FTP location.
I have got it to attempt to download a file; if that fails, it will try again.
If no errors occur, all files download and the upload works.
If any errors occur when downloading, the re-attempt downloads them, but the upload then fails.
I think the problem comes down to not correctly closing a connection, but I can't for the life of me figure it out.
Here's my code; I've marked where it fails:
Upload:
FileInfo fileInf = new FileInfo("directory" + zip + ".zip");
string uri = "ftp://address" + fileInf.Name;
FtpWebRequest reqFTP2;
reqFTP2 = (FtpWebRequest)FtpWebRequest.Create(new Uri("ftp://address" + fileInf.Name));
reqFTP2.Credentials = new NetworkCredential("username", "password");
reqFTP2.KeepAlive = true;
reqFTP2.Method = WebRequestMethods.Ftp.UploadFile;
reqFTP2.UseBinary = true;
reqFTP2.ContentLength = fileInf.Length;
int buffLength = 2048;
byte[] buff = new byte[buffLength];
int contentLen;
FileStream fs = fileInf.OpenRead();
try
{
    Stream strm = reqFTP2.GetRequestStream(); // FAILS HERE
    contentLen = fs.Read(buff, 0, buffLength);
    while (contentLen != 0)
    {
        strm.Write(buff, 0, contentLen);
        contentLen = fs.Read(buff, 0, buffLength);
    }
    strm.Close();
    fs.Close();
}
catch (Exception ex)
{
}
Download:
int errorOccured = 0;
while (errorOccured < 1)
{
    FileStream outputStream = new FileStream("directory\\" + file, FileMode.Create);
    FtpWebRequest reqFTP = (FtpWebRequest)FtpWebRequest.Create(new Uri("ftp://address/" + file));
    reqFTP.Credentials = new NetworkCredential("username", "password");
    try
    {
        reqFTP.Method = WebRequestMethods.Ftp.DownloadFile;
        reqFTP.UseBinary = true;
        FtpWebResponse response = (FtpWebResponse)reqFTP.GetResponse();
        Stream ftpStream = response.GetResponseStream();
        long cl = response.ContentLength;
        int bufferSize = 2048;
        int readCount;
        byte[] buffer = new byte[bufferSize];
        readCount = ftpStream.Read(buffer, 0, bufferSize);
        while (readCount > 0)
        {
            outputStream.Write(buffer, 0, readCount);
            readCount = ftpStream.Read(buffer, 0, bufferSize);
        }
        ftpStream.Close();
        outputStream.Close();
        response.Close();
        errorOccured++;
    }
    catch (Exception er)
    {
        outputStream.Close();
    }
}
The error
> 504 – Command not implemented for that parameter.
implies that some option you are using is not implemented by the target FTP server. I think your code is resulting in a bizarre request; my suggestion would be to look at the FTP chatter your process creates on the server side. For example, does the server support PASV mode? The FTP protocol in active mode (the default behavior) is always a pain, because it makes the client open a data port and listen for the server to connect back. While most servers support PASV mode transfers, it can become a pain if you don't put them in PASV mode explicitly. So look at the chatter, see if the server is in PASV mode, and if you still have trouble, check whether there are extra spaces passed during the FTP negotiation. FTP is quite dinky and there are several pitfalls. :-)
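As a small sketch of that last point, against the reqFTP2 request from the question: force passive mode up front, and after the upload read the reply so at least part of the server's chatter is visible from the client side.

reqFTP2.UsePassive = true;  // the client opens the data connection instead of listening for one

// ... write the file to reqFTP2.GetRequestStream() as in the question, then:
using (FtpWebResponse resp = (FtpWebResponse)reqFTP2.GetResponse())
{
    Console.WriteLine(resp.BannerMessage);      // greeting sent on connect
    Console.WriteLine(resp.WelcomeMessage);     // reply to the login
    Console.WriteLine(resp.StatusDescription);  // final status of the STOR command
}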
For starters, wrap your streams in using blocks so that they are disposed appropriately.
See MSDN for more details.
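Applied to the download part of the question, that looks roughly like this (a sketch; the address, credentials and paths are the placeholders from the question):

FtpWebRequest reqFTP = (FtpWebRequest)WebRequest.Create(new Uri("ftp://address/" + file));
reqFTP.Credentials = new NetworkCredential("username", "password");
reqFTP.Method = WebRequestMethods.Ftp.DownloadFile;
reqFTP.UseBinary = true;

// Every stream and the response are disposed even if an exception is thrown,
// so the connection is not left half-open for the following upload.
using (FtpWebResponse response = (FtpWebResponse)reqFTP.GetResponse())
using (Stream ftpStream = response.GetResponseStream())
using (FileStream outputStream = new FileStream("directory\\" + file, FileMode.Create))
{
    byte[] buffer = new byte[2048];
    int readCount;
    while ((readCount = ftpStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        outputStream.Write(buffer, 0, readCount);
    }
}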
I'm using the following C# code to FTP a ~40MB CSV file from a remote service provider. Around 50% of the time, the download hangs and eventually times out. In my app log, I get a line like:
> Unable to read data from the transport connection: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond.
When I download the file interactively using a graphical client like LeechFTP, the downloads almost never hang, and complete in about 45 seconds. I'm having a hell of a time understanding what's going wrong.
Can anyone suggest how I can instrument this code to get more insight into what's going on, or a better way to download this file? Should I increase the buffer size? By how much? Avoid the buffered writes to disk and try to swallow the whole file in memory? Any advice appreciated!
...
private void CreateDownloadFile()
{
    _OutputFile = new FileStream(_SourceFile, FileMode.Create);
}

public string FTPDownloadFile()
{
    this.CreateDownloadFile();
    myReq = (FtpWebRequest)FtpWebRequest.Create(new Uri(this.DownloadURI));
    myReq.Method = WebRequestMethods.Ftp.DownloadFile;
    myReq.UseBinary = true;
    myReq.Credentials = new NetworkCredential(_ID, _Password);
    FtpWebResponse myResp = (FtpWebResponse)myReq.GetResponse();
    Stream ftpStream = myResp.GetResponseStream();
    int bufferSize = 2048;
    int readCount;
    byte[] buffer = new byte[bufferSize];
    int bytesRead = 0;
    readCount = ftpStream.Read(buffer, 0, bufferSize);
    while (readCount > 0)
    {
        _OutputFile.Write(buffer, 0, readCount);
        readCount = ftpStream.Read(buffer, 0, bufferSize);
        Console.Write('.'); // show progress on the console
        bytesRead += readCount;
    }
    Console.WriteLine();
    logger.logActivity(" FTP received " + String.Format("{0:0,0}", bytesRead) + " bytes");
    ftpStream.Close();
    _OutputFile.Close();
    myResp.Close();
    return this.GetFTPStatus();
}

public string GetFTPStatus()
{
    return ((FtpWebResponse)myReq.GetResponse()).StatusDescription;
}
I tried to use FtpClient as suggested above and got the same timeout error. FtpClient uses FtpWebRequest, so I must be missing something, but I don't see the point.
After some more research I found that -1 is the value for an infinite timeout.
For my purposes an infinite timeout is acceptable, so I went with that and the problem was solved.
Here is my code:
// Gets file from FTP site.
FtpWebRequest reqFTP;
string fileName = @"c:\downloadDir\localFileName.txt";
FileInfo downloadFile = new FileInfo(fileName);
string uri = "ftp://ftp.myftpsite.com/ftpDir/remoteFileName.txt";
FileStream outputStream = new FileStream(fileName, FileMode.Append);
reqFTP = (FtpWebRequest)FtpWebRequest.Create(new Uri(uri));
reqFTP.Method = WebRequestMethods.Ftp.DownloadFile;
reqFTP.UseBinary = true;
reqFTP.KeepAlive = false;
reqFTP.Timeout = -1;
reqFTP.UsePassive = true;
reqFTP.Credentials = new NetworkCredential("userName", "passWord");
FtpWebResponse response = (FtpWebResponse)reqFTP.GetResponse();
Stream ftpStream = response.GetResponseStream();
long cl = response.ContentLength;
int bufferSize = 2048;
int readCount;
byte[] buffer = new byte[bufferSize];
readCount = ftpStream.Read(buffer, 0, bufferSize);
Console.WriteLine("Connected: Downloading File");
while (readCount > 0)
{
    outputStream.Write(buffer, 0, readCount);
    readCount = ftpStream.Read(buffer, 0, bufferSize);
    Console.WriteLine(readCount.ToString());
}
ftpStream.Close();
outputStream.Close();
response.Close();
Console.WriteLine("Downloading Complete");
I suggest you don't use FtpWebRequest for FTP access. FtpWebRequest is the most brain-dead FTP API I have ever seen.
Currently I use FtpClient: http://www.codeplex.com/ftpclient
I have also had good luck with IndySockets: http://www.indyproject.org/index.en.aspx
I haven't dealt with FTP at the code level, but FTP supports resuming. Maybe you could have it automatically try to resume the transfer when it times out.
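If you want to experiment with resuming, FtpWebRequest has a ContentOffset property for this (it maps to the FTP REST command); a rough sketch, reusing the paths from the code above:

string fileName = @"c:\downloadDir\localFileName.txt";
// Resume from however many bytes are already on disk.
long alreadyHave = File.Exists(fileName) ? new FileInfo(fileName).Length : 0;

FtpWebRequest reqFTP = (FtpWebRequest)WebRequest.Create(new Uri("ftp://ftp.myftpsite.com/ftpDir/remoteFileName.txt"));
reqFTP.Credentials = new NetworkCredential("userName", "passWord");
reqFTP.Method = WebRequestMethods.Ftp.DownloadFile;
reqFTP.UseBinary = true;
reqFTP.ContentOffset = alreadyHave;   // server skips the bytes we already have

using (FtpWebResponse response = (FtpWebResponse)reqFTP.GetResponse())
using (Stream ftpStream = response.GetResponseStream())
using (FileStream outputStream = new FileStream(fileName, FileMode.Append))
{
    byte[] buffer = new byte[2048];
    int readCount;
    while ((readCount = ftpStream.Read(buffer, 0, buffer.Length)) > 0)
        outputStream.Write(buffer, 0, readCount);
}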