I am downloading a file from an FTP location using the code below. It works for all files except those whose names contain international characters.
I have learned that such characters are not allowed in the URI format, but how can I download a file that does exist at that location?
For testing, I have set up a local FTP server under IIS, following this guide:
http://www.online-tech-tips.com/computer-tips/setup-ftp-server-in-windows-iis/
string mat_address = "ftp://localhost/";
StringBuilder result = new StringBuilder();
FtpWebRequest ftp = (FtpWebRequest)WebRequest.Create(mat_address);
ftp.Credentials = new NetworkCredential("userid", "Password");
ftp.Method = WebRequestMethods.Ftp.ListDirectory;
string[] downloadfile = null;
using (FtpWebResponse response = (FtpWebResponse)ftp.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream(), System.Text.Encoding.Default, true))
{
downloadfile = reader.ReadToEnd().Split(new string[] { "\r\n" }, StringSplitOptions.RemoveEmptyEntries);
}
foreach (string d in downloadfile)
{
if (d.Contains("d"))
{
string temp = mat_address + HttpUtility.UrlPathEncode(d);
FtpWebRequest ftp2 = (FtpWebRequest)WebRequest.Create(temp);
ftp2.Credentials = new NetworkCredential("userid", "Password");
ftp2.Method = WebRequestMethods.Ftp.GetDateTimestamp;
ftp2.UseBinary = true;
ftp2.Proxy = null;
ftp2.KeepAlive = false;
ftp2.UsePassive = false;
FtpWebResponse response2 = ftp2.GetResponse() as FtpWebResponse;
DateTime temp1 = response2.LastModified.Date;
if (temp1 > DateTime.Now.AddDays(-10))
{
// Some extra work
}
}
}
I am getting this error:
The remote server returned an error: (550) File unavailable (e.g., file not found, no access).
Below is my FTP root folder with the problematic file name, diá.png.
I am using C# and Visual Studio 2013 for development. What is going wrong? Can someone help?
Update to the question:
I changed the encoding to UTF-8.
Using localhost, everything works fine. But when using FTP servers on international domains, e.g. in Germany and Sweden, the name is read as below.
I am getting the error on the line below:
FtpWebResponse response2 = ftp2.GetResponse() as FtpWebResponse;
Hex value of the file name (suggested by Martin, thanks):
31,30,31,33,36,2D,49,43,4F,4D,20,50,4A,C4,54,54,45,52,59,44,20,70,69,63,74,20,37,38,78,31,31,38,20,61,6E,6E,69,2D,76,65,72,73,61,72,69,75,73,20,5B,77,31,33,32,31,20,78,20,68,39,32,31,5D,20,6D,6D,20,44,49,46,46,55,53,45,2E,50,4E,47,
Most FTP servers should use UTF-8 encoding. So does your local (IIS) server.
So you need to use Encoding.UTF8 when parsing the directory listing.
Though your real/production server seems to be broken in some way. It looks like it uses Windows-1252 encoding for the directory listing, yet it claims (and seems to require) UTF-8 encoding for commands. That clearly (and rightfully) confuses FileZilla. But I do not see why it does not work with FtpWebRequest, as it should use UTF-8 (the server responds positively to the OPTS utf8 command), and you have tried explicitly using the Windows-1252 encoding when parsing the listing.
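To illustrate the mismatch (a minimal sketch; the byte values are taken from the hex dump posted in the question), decoding the same raw listing bytes with the two encodings gives different results, and 0xC4 followed by 'T' is not even a valid UTF-8 sequence:

```csharp
using System;
using System.Text;

class EncodingDemo
{
    static void Main()
    {
        // "PJ?TTERYD" fragment from the listing's hex dump; 0xC4 is 'Ä' in Windows-1252
        byte[] raw = { 0x50, 0x4A, 0xC4, 0x54, 0x54, 0x45, 0x52, 0x59, 0x44 };

        // Windows-1252: every byte maps to a character -> "PJÄTTERYD"
        Console.WriteLine(Encoding.GetEncoding(1252).GetString(raw));

        // UTF-8: 0xC4 starts a two-byte sequence, but 0x54 ('T') is not a valid
        // continuation byte, so the decoder emits the U+FFFD replacement character
        Console.WriteLine(Encoding.UTF8.GetString(raw));
    }
}
```

This is why the listing read with the wrong encoding produces a name the server then refuses with a 550.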
Anyway, as you found (in chat) that WinSCP works, you can try the WinSCP .NET assembly. It will also make your code much simpler:
SessionOptions sessionOptions = new SessionOptions();
sessionOptions.Protocol = Protocol.Ftp;
sessionOptions.HostName = "hostname";
sessionOptions.UserName = "username";
sessionOptions.Password = "password";
using (Session session = new Session())
{
session.Open(sessionOptions);
foreach (RemoteFileInfo fileInfo in session.ListDirectory("/textures").Files)
{
if (fileInfo.Name.Contains("d"))
{
if (fileInfo.LastWriteTime > DateTime.Now.AddDays(-10))
{
string sourcePath =
RemotePath.EscapeFileMask("/textures/" + fileInfo.Name);
session.GetFiles(sourcePath, @"c:\local\path\").Check();
}
}
}
}
Or, even simpler, using file mask with time constraint:
SessionOptions sessionOptions = new SessionOptions();
sessionOptions.Protocol = Protocol.Ftp;
sessionOptions.HostName = "hostname";
sessionOptions.UserName = "username";
sessionOptions.Password = "password";
using (Session session = new Session())
{
session.Open(sessionOptions);
session.GetFiles("/textures/*d*>=10D", @"c:\local\path\").Check();
}
See also WinSCP example How do I transfer new/modified files only?
I'd say you'll have to transform the encoding of the received file name to match the needs of your local file system. Could you post what file name you receive? I think you get an escaped string containing some illegal characters...
Related
I have an annoying problem preventing me from getting a file I need from an FTP server. The file may have different names, so I need to access the folder first and list the files inside before requesting the file directly.
My problem is that I can access this file in FileZilla, for example, which also browses the folder perfectly, but when using an FtpWebResponse instance to list the folder, I get a 550 error:
550 File unavailable (e.g. file not found, no access)
Here is the code:
FtpWebRequest wr = (FtpWebRequest)WebRequest.Create("ftp://ftp.dachser.com/data/edi/kunden/da46168958/out");
wr.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
wr.Credentials = new NetworkCredential("login", "password");
FtpWebResponse response = (FtpWebResponse)wr.GetResponse();
Stream reponseStream = response.GetResponseStream();
StreamReader reader = new StreamReader(reponseStream);
string names = reader.ReadToEnd();
FtpWebResponse response = (FtpWebResponse)wr.GetResponse();
is the line throwing the error.
PS: Production, tests, and FileZilla are on the same domain, using the same internet connection (if that helps).
Thanks for your attention and feedback.
The FileZilla logs:
Logs from my program; the error circled in red isn't related to the FTP error.
When FtpWebRequest interprets the URL, it does not consider the slash that separates the hostname and the path as a part of the path. The extracted path is then used with FTP CWD command as is. That means that the FTP server will resolve the path relatively to your home directory. If your account is not chrooted (the home is not seen as the root by the client), the lack of the leading slash leads to unexpected behaviour.
In your case, you start in /remote/path and with URL like ftp://example.com/remote/path/, it will try to change to remote/path, so ultimately to /remote/path/remote/path. That's not what you want.
Either use a path relative to the home folder, which in your case means using a URL without any path.
Or use an absolute path, for which you need two slashes after the hostname: ftp://example.com//remote/path/.
Also note that a URL to a folder should end with a slash: Why does FtpWebRequest return an empty stream for this existing directory?
For other 550 problems, see FtpWebRequest returns error 550 File unavailable
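For illustration (a minimal sketch; example.com and the paths are placeholders standing in for your server), the two working variants described above look like this:

```csharp
using System.Net;

// Relative to the account's home directory: no path in the URL at all.
// If the home is /remote/path, this lists /remote/path.
var home = (FtpWebRequest)WebRequest.Create("ftp://example.com/");
home.Method = WebRequestMethods.Ftp.ListDirectoryDetails;

// Absolute path: double slash after the hostname, so the FTP CWD command
// receives "/remote/path/" including the leading slash.
var absolute = (FtpWebRequest)WebRequest.Create("ftp://example.com//remote/path/");
absolute.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
```

With a single slash, ftp://example.com/remote/path/ would be resolved relative to the home directory, ultimately pointing at /remote/path/remote/path.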
In 2021, this works on both our Linux and Windows live boxes, reading from an FTP server (on both Windows and Linux).
Note
the main folder on the Windows ftp is web
the main folder on the Linux ftp is public_html
TL;DR:
Bottom line: the URL needs to end with a /
It works:
ftp://ftp.yourdomain.com.br/public_html/
ftp://ftp.yourdomain.com.br//public_html/
ftp://ftp.yourdomain.com.br/web/
ftp://ftp.yourdomain.com.br//web/
It doesn't work:
ftp://ftp.yourdomain.com.br/public_html
ftp://ftp.yourdomain.com.br//public_html
ftp://ftp.yourdomain.com.br/web
ftp://ftp.yourdomain.com.br//web
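As a small convenience (a hypothetical helper, not from the original post), the trailing slash can be enforced before building the request:

```csharp
// Appends the trailing slash that FtpWebRequest needs for directory URLs
static string EnsureTrailingSlash(string url) =>
    url.EndsWith("/") ? url : url + "/";

// e.g. EnsureTrailingSlash("ftp://ftp.yourdomain.com.br/public_html")
//      returns "ftp://ftp.yourdomain.com.br/public_html/"
```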
Usage:
// verify that the directory public_html exists
var url = "/public_html/";
var result = FtpUtil.DoesDirectoryExists(url, "ftp://ftp.yourdomain.com.br", "ftp user here", "ftp password here");
static bool DoesDirectoryExists(string directory, string ftpHost, string ftpUser, string ftpPassword) {
FtpWebRequest ftpRequest = null;
try {
ftpRequest = (FtpWebRequest)WebRequest.Create(new Uri("ftp://" + ftpHost + directory));
ftpRequest.Credentials = new NetworkCredential(ftpUser, ftpPassword);
ftpRequest.UseBinary = true;// optional
ftpRequest.KeepAlive = false;// optional
ftpRequest.UsePassive = true;// optional
ftpRequest.Method = WebRequestMethods.Ftp.ListDirectory;
using (FtpWebResponse response = (FtpWebResponse)ftpRequest.GetResponse()) {
return true;//directory found
}
}
catch (WebException ex) {
if (ex.Response != null) {
FtpWebResponse response = (FtpWebResponse)ex.Response;
if (response.StatusCode == FtpStatusCode.ActionNotTakenFileUnavailable)
return false;// directory not found.
}
return false; // directory not found.
}
finally {
ftpRequest = null;
}
}
I am working on a C# web application and need to download files over FTP to a local folder. The files need a modification date later than a date I specify.
Code:
public static List<FTPLineResult> GetFilesListSortedByDate(string ftpPath, Regex nameRegex, DateTime cutoff, System.Security.Cryptography.X509Certificates.X509Certificate cert)
{
List<FTPLineResult> output = new List<FTPLineResult>();
if (cert != null)
{
FtpWebRequest request = FtpWebRequest.Create(ftpPath) as FtpWebRequest;
request.Credentials = new NetworkCredential("unm", "pwd");
request.ClientCertificates.Add(cert);
ConfigureProxy(request);
request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
FtpWebResponse response = request.GetResponse() as FtpWebResponse;
StreamReader directoryReader = new StreamReader(response.GetResponseStream(), System.Text.Encoding.ASCII);
var parser = new FTPLineParser();
while (!directoryReader.EndOfStream)
{
var result = parser.Parse(directoryReader.ReadLine());
if (!result.IsDirectory && result.DateTime > cutoff && nameRegex.IsMatch(result.Name))
{
output.Add(result);
}
}
// need to ensure the files are sorted in ascending date order
output.Sort(
new Comparison<FTPLineResult>(
delegate(FTPLineResult res1, FTPLineResult res2)
{
return res1.DateTime.CompareTo(res2.DateTime);
}
)
);
}
return output;
}
I have to use certificate (.p12).
How can I do this?
You have to retrieve timestamps of remote files to select those you want.
Unfortunately, there's no really reliable and efficient way to retrieve timestamps using the features offered by the .NET Framework, as it does not support the FTP MLSD command. The MLSD command provides a listing of a remote directory in a standardized machine-readable format, defined by RFC 3659.
Alternatives you can use that are supported by the .NET Framework:
the ListDirectoryDetails method (FTP LIST command) to retrieve details of all files in a directory; you then have to deal with the FTP-server-specific format of the details:
*nix format: Parsing FtpWebRequest ListDirectoryDetails line
DOS/Windows format: C# class to parse WebRequestMethods.Ftp.ListDirectoryDetails FTP response
the GetDateTimestamp method (FTP MDTM command) to individually retrieve the timestamp of each file. The advantage is that the response is standardized by RFC 3659 to YYYYMMDDHHMMSS[.sss]. The disadvantage is that you have to send a separate request for each file, which can be quite inefficient.
const string uri = "ftp://example.com/remote/path/file.txt";
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(uri);
request.Method = WebRequestMethods.Ftp.GetDateTimestamp;
FtpWebResponse response = (FtpWebResponse)request.GetResponse();
Console.WriteLine("{0} {1}", uri, response.LastModified);
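For illustration, here is a deliberately naive parse of one common *nix-style LIST line; real servers vary in this format, which is exactly why dedicated parsers (linked above) exist. The sample line is made up:

```csharp
using System;

// Sample *nix-style LIST line (the exact layout differs between servers):
string line = "-rw-r--r-- 1 ftp ftp 1234 Jun 15 10:30 file.txt";

// Split into at most 9 tokens so the file name keeps any embedded spaces
string[] tokens = line.Split(new[] { ' ' }, 9, StringSplitOptions.RemoveEmptyEntries);

bool isDirectory = line[0] == 'd';  // 'd' flag in the permissions column
string name = tokens[8];            // "file.txt"
```

A DOS/Windows-style listing has a completely different layout, so any such parser has to know which server it is talking to.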
Alternatively, you can use a 3rd-party FTP client implementation that supports the modern MLSD command or that can directly download files given a time constraint.
For example WinSCP .NET assembly supports both MLSD and time constraints.
There's even an example for your specific task: How do I transfer new/modified files only?
The example is for PowerShell, but translates to C# easily:
// Setup session options
SessionOptions sessionOptions = new SessionOptions
{
Protocol = Protocol.Ftp,
HostName = "ftp.example.com",
UserName = "username",
Password = "password",
};
using (Session session = new Session())
{
// Connect
session.Open(sessionOptions);
// Download files created in 2017-06-15 and later
TransferOptions transferOptions = new TransferOptions();
transferOptions.FileMask = "*>=2017-06-15";
session.GetFiles(
"/remote/path/*", @"C:\local\path\", false, transferOptions).Check();
}
Though for a web application, WinSCP is probably not the best solution. You may be able to find another 3rd-party library with similar functionality.
WinSCP also supports authentication with a client certificate. See SessionOptions.TlsClientCertificatePath. But that's really for a separate question.
(I'm the author of WinSCP)
I have seen some answers similar to my question but still could not figure it out.
I am using the code below to let a user upload an MP3 file (over FTP). It worked fine with localhost (a simple WinForms app), but it threw this error when using a remote server (a remote DNN site):
System.IO.FileNotFoundException: Could not find file 'C:\Windows\SysWOW64\inetsrv\Test.mp3'.
I know that if the Test.mp3 file were in that server location it would work, but it was actually at my local C:\Temp\Test.mp3 path. I think FileUpload1 did not give the correct file path. How can I fix this?
protected void btnUpload_Click(object sender, EventArgs e)
{
string url = System.Configuration.ConfigurationManager.AppSettings["FTPUrl"].ToString();
string username = System.Configuration.ConfigurationManager.AppSettings["FTPUserName"].ToString();
string password = System.Configuration.ConfigurationManager.AppSettings["FTPPassWord"].ToString();
string filePath = FileUpload1.PostedFile.FileName;
if (filePath != String.Empty)
UploadFileToFtp(url, filePath, username, password);
}
public static void UploadFileToFtp(string url, string filePath, string username, string password)
{
var fileName = Path.GetFileName(filePath);
var request = (FtpWebRequest)WebRequest.Create(url + fileName);
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Credentials = new NetworkCredential(username, password);
request.UsePassive = true;
request.UseBinary = true;
request.KeepAlive = false;
using (var fileStream = File.OpenRead(filePath))
{
using (var requestStream = request.GetRequestStream())
{
fileStream.CopyTo(requestStream);
requestStream.Close();
}
}
var response = (FtpWebResponse)request.GetResponse();
Console.WriteLine("Upload done: {0}", response.StatusDescription);
response.Close();
}
The HttpPostedFile.FileName is the "fully qualified name of the file on the client".
And I believe most web browsers actually provide the file name only, without any path. So you get Test.mp3 only, and when you try to use such a "relative" path locally on the server, it gets resolved against the current working directory of the web server, which is C:\Windows\SysWOW64\inetsrv.
Instead, access the uploaded content directly using HttpPostedFile.InputStream (copy it to GetRequestStream).
See HttpPostedFile documentation.
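A minimal sketch of that approach, adapted from the question's own upload handler (the FTPUrl, FTPUserName, and FTPPassWord settings and the FileUpload1 control are the ones from the question):

```csharp
using System;
using System.Configuration;
using System.IO;
using System.Net;

protected void btnUpload_Click(object sender, EventArgs e)
{
    string url = ConfigurationManager.AppSettings["FTPUrl"];
    string username = ConfigurationManager.AppSettings["FTPUserName"];
    string password = ConfigurationManager.AppSettings["FTPPassWord"];

    var fileName = Path.GetFileName(FileUpload1.PostedFile.FileName);
    var request = (FtpWebRequest)WebRequest.Create(url + fileName);
    request.Method = WebRequestMethods.Ftp.UploadFile;
    request.Credentials = new NetworkCredential(username, password);

    // Copy the uploaded content straight from the HTTP request body,
    // never touching the web server's file system:
    using (var requestStream = request.GetRequestStream())
    {
        FileUpload1.PostedFile.InputStream.CopyTo(requestStream);
    }
}
```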
I have an FTP and I want to know the files that has been added today.
(In my business rules, there are no updates to the files: files can be added, but then cannot be modified or removed at all.)
I tried this:
FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://172.28.4.7/");
request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
FtpWebResponse response = (FtpWebResponse)request.GetResponse();
Console.WriteLine("{0} {1}", "ftp://172.28.4.7/", response.LastModified);
Stream responseStream = response.GetResponseStream();
StreamReader reader = new StreamReader(responseStream);
But, as expected, in the console I get the date of the last modification.
Could you help me find the most recently added files?
You have to retrieve timestamps of remote files to select those you want (today's files).
Unfortunately, there's no really reliable and efficient way to retrieve timestamps using the features offered by the .NET Framework, as it does not support the FTP MLSD command. The MLSD command provides a listing of a remote directory in a standardized machine-readable format, defined by RFC 3659.
Alternatives you can use that are supported by the .NET Framework:
the ListDirectoryDetails method (FTP LIST command) to retrieve details of all files in a directory; you then have to deal with the FTP-server-specific format of the details:
*nix format: Parsing FtpWebRequest ListDirectoryDetails line
DOS/Windows format: C# class to parse WebRequestMethods.Ftp.ListDirectoryDetails FTP response
the GetDateTimestamp method (FTP MDTM command) to individually retrieve the timestamp of each file. The advantage is that the response is standardized by RFC 3659 to YYYYMMDDHHMMSS[.sss]. The disadvantage is that you have to send a separate request for each file, which can be quite inefficient.
const string uri = "ftp://example.com/remote/path/file.txt";
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(uri);
request.Method = WebRequestMethods.Ftp.GetDateTimestamp;
FtpWebResponse response = (FtpWebResponse)request.GetResponse();
Console.WriteLine("{0} {1}", uri, response.LastModified);
This is what the answer by @tretom shows.
Alternatively, you can use a 3rd-party FTP client implementation that supports the modern MLSD command or that can directly download files given a time constraint.
For example WinSCP .NET assembly supports both MLSD and time constraints.
There's even an example for your specific task: How do I transfer new/modified files only?
The example is for PowerShell, but translates to C# easily:
// Setup session options
SessionOptions sessionOptions = new SessionOptions
{
Protocol = Protocol.Ftp,
HostName = "ftp.example.com",
UserName = "username",
Password = "password",
};
using (Session session = new Session())
{
// Connect
session.Open(sessionOptions);
// Download today's files
TransferOptions transferOptions = new TransferOptions();
transferOptions.FileMask = "*>=" + DateTime.Today.ToString("yyyy-MM-dd");
session.GetFiles(
"/remote/path/*", @"C:\local\path\", false, transferOptions).Check();
}
(I'm the author of WinSCP)
First, get the full directory details using ListDirectoryDetails:
ftpRequest.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
Read the response into a string[].
Then check whether each item in the string[] is a file or a directory by looking for the "DIR" text in it.
After extracting the file names from the string[], request the creation date of each file using:
ftpRequest.Method = WebRequestMethods.Ftp.GetDateTimestamp;
This way you can get the date each file was added to your FTP server.
A possible synchronous solution (it might be useful for someone).
A data container type:
public class Entity
{
public DateTime uploadDate { get; set; }
public string fileName { get; set; }
}
And the lister class:
public class FTPLister
{
private List<Entity> fileList = new List<Entity>();
public List<Entity> ListFilesOnFTP(string ftpAddress, string user, string password)
{
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(ftpAddress);
request.Method = WebRequestMethods.Ftp.ListDirectory;
request.Credentials = new NetworkCredential(user, password);
List<string> tmpFileList = new List<string>();
using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
{
Stream responseStream = response.GetResponseStream();
StreamReader reader = new StreamReader(responseStream);
while (!reader.EndOfStream)
{
tmpFileList.Add(reader.ReadLine());
}
}
Uri ftp = new Uri(ftpAddress);
foreach (var f in tmpFileList)
{
FtpWebRequest req = (FtpWebRequest)WebRequest.Create(new Uri(ftp, f));
req.Method = WebRequestMethods.Ftp.GetDateTimestamp;
req.Credentials = new NetworkCredential(user, password);
using (FtpWebResponse resp = (FtpWebResponse)req.GetResponse())
{
fileList.Add(new Entity() { fileName=f, uploadDate=resp.LastModified });
}
}
fileList = fileList.Where(p => p.uploadDate>=DateTime.Today && p.uploadDate<DateTime.Today.AddDays(1)).ToList();
return fileList;
}
}
I have files on a server that can be accessed from a URL formatted like this:
http:// address/Attachments.aspx?id=GUID
I have access to the GUID and need to be able to download multiple files to the same folder.
If you take that URL and throw it in a browser, you will download the file, and it will have the original file name.
I want to replicate that behavior in C#. I have tried the WebClient class's DownloadFile method, but with that you have to specify a new file name. And even worse, DownloadFile will overwrite an existing file. I know I could generate a unique name for every file, but I'd really like the original.
Is it possible to download a file preserving the original file name?
Update:
Using the fantastic answer below, which uses the WebRequest class, I came up with the following, which works perfectly:
public override void OnAttachmentSaved(string filePath)
{
var webClient = new WebClient();
//get file name
var request = WebRequest.Create(filePath);
var response = request.GetResponse();
var contentDisposition = response.Headers["Content-Disposition"];
const string contentFileNamePortion = "filename=";
var fileNameStartIndex = contentDisposition.IndexOf(contentFileNamePortion, StringComparison.InvariantCulture) + contentFileNamePortion.Length;
var originalFileNameLength = contentDisposition.Length - fileNameStartIndex;
var originalFileName = contentDisposition.Substring(fileNameStartIndex, originalFileNameLength);
//download file
webClient.UseDefaultCredentials = true;
webClient.DownloadFile(filePath, String.Format(@"C:\inetpub\Attachments Test\{0}", originalFileName));
}
Just had to do a little string manipulation to get the actual filename. I'm so excited. Thanks everyone!
As hinted in the comments, the file name is available in the Content-Disposition header. I'm not sure how to get its value when using WebClient, but it's fairly simple with WebRequest:
WebRequest request = WebRequest.Create("http://address/Attachments.aspx?id=GUID");
WebResponse response = request.GetResponse();
string originalFileName = response.Headers["Content-Disposition"];
Stream streamWithFileBody = response.GetResponseStream();
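Note that the raw header value still contains the filename= prefix (and possibly quotes), so it needs parsing. Assuming the server sends a well-formed header, the System.Net.Mime.ContentDisposition class can do that parsing for you (a sketch, with a made-up sample value; response is the WebResponse from the snippet above):

```csharp
using System.Net.Mime;

// e.g. the header value is "attachment; filename=report.pdf"
string headerValue = response.Headers["Content-Disposition"];
var disposition = new ContentDisposition(headerValue);
string originalFileName = disposition.FileName;  // "report.pdf"
```

This avoids the manual IndexOf/Substring arithmetic from the question's update and also handles quoted file names.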