Read folder paths from FTP directory to IEnumerable - C#

I'm currently working on a .NET 4.6 console application. I need to parse a couple of XML files from different directories on my FTP server. I thought the best approach would be to read all file paths and store them in an IEnumerable, to process them further (deserialize the XML files into objects).
The root FTP path looks like this:
string urlFtpServer = @"ftp://128.0.1.70";
File paths look like this:
string file1 = @"ftp://128.0.1.70/MyFolder1/Mainfile.xml";
string file2 = @"ftp://128.0.1.70/MyFolder1/Subfile.xml";
string file3 = @"ftp://128.0.1.70/MyFolder2/Mainfile.xml";
string file4 = @"ftp://128.0.1.70/MyFolder2/Subfile.xml";
string file5 = @"ftp://128.0.1.70/MyFolder3/Mainfile.xml";
My question is, do you know how I can get those specific file paths?
I can currently read the folders of my FTP directory with this code:
static void Main(string[] args)
{
string url = @"ftp://128.0.1.70";
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(url);
request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
request.Credentials = new NetworkCredential("My-User", "mypassword");
FtpWebResponse response = (FtpWebResponse)request.GetResponse();
Stream responseStream = response.GetResponseStream();
StreamReader reader = new StreamReader(responseStream);
Console.WriteLine(reader.ReadToEnd());
Console.WriteLine("Directory List Complete, status {0}", response.StatusDescription);
reader.Close();
response.Close();
Console.ReadKey();
}
Do you know how I can read all file paths from the FTP main directory and possibly store them into a List<string>?
Thank you very much!!

Using FtpWebRequest
The FtpWebRequest does not have any explicit support for recursive file operations (including listing). You have to implement the recursion yourself:
List the remote directory
Iterate the entries, recursing into subdirectories (listing them again, etc.)
The tricky part is to tell files from subdirectories. There's no way to do that in a portable way with the FtpWebRequest. The FtpWebRequest unfortunately does not support the MLSD command, which is the only portable way to retrieve a directory listing with file attributes in the FTP protocol. See also Checking if object on FTP server is file or directory.
Your options are:
Do an operation on a name that is certain to fail for a file and succeed for a directory (or vice versa). E.g. you can try to download the name: if that succeeds, it's a file; if it fails, it's a directory.
You may be lucky, and in your specific case you can tell a file from a directory by its name (e.g. all your files have an extension, while subdirectories do not).
You use a long directory listing (LIST command = ListDirectoryDetails method) and try to parse a server-specific listing. Many FTP servers use a *nix-style listing, where you identify a directory by the d at the very beginning of the entry. But many servers use a different format. The following example uses this approach (assuming the *nix format):
void ListFtpDirectory(
string url, string rootPath, NetworkCredential credentials, List<string> list)
{
FtpWebRequest listRequest = (FtpWebRequest)WebRequest.Create(url + rootPath);
listRequest.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
listRequest.Credentials = credentials;
List<string> lines = new List<string>();
using (FtpWebResponse listResponse = (FtpWebResponse)listRequest.GetResponse())
using (Stream listStream = listResponse.GetResponseStream())
using (StreamReader listReader = new StreamReader(listStream))
{
while (!listReader.EndOfStream)
{
lines.Add(listReader.ReadLine());
}
}
foreach (string line in lines)
{
string[] tokens =
line.Split(new[] { ' ' }, 9, StringSplitOptions.RemoveEmptyEntries);
string name = tokens[8];
string permissions = tokens[0];
string filePath = rootPath + name;
if (permissions[0] == 'd')
{
ListFtpDirectory(url, filePath + "/", credentials, list);
}
else
{
list.Add(filePath);
}
}
}
Use the function like:
List<string> list = new List<string>();
NetworkCredential credentials = new NetworkCredential("user", "mypassword");
string url = "ftp://ftp.example.com/";
ListFtpDirectory(url, "", credentials, list);
Using a 3rd party library
If you want to avoid the trouble of parsing server-specific directory listing formats, use a 3rd party library that supports the MLSD command and/or parsing the various LIST listing formats, and recursive downloads.
For example, with the WinSCP .NET assembly you can list a whole directory tree with a single call to Session.EnumerateRemoteFiles:
// Setup session options
SessionOptions sessionOptions = new SessionOptions
{
Protocol = Protocol.Ftp,
HostName = "ftp.example.com",
UserName = "user",
Password = "mypassword",
};
using (Session session = new Session())
{
// Connect
session.Open(sessionOptions);
// List files
IEnumerable<string> list =
session.EnumerateRemoteFiles("/", null, EnumerationOptions.AllDirectories).
Select(fileInfo => fileInfo.FullName);
}
Internally, WinSCP uses the MLSD command, if supported by the server. If not, it uses the LIST command and supports dozens of different listing formats.
(I'm the author of WinSCP)

Related

Get FTP file details based on datetime in C#

Question: I want to get file details from an FTP server based on some specific datetime, without using any 3rd party library.
Problem: My FTP server contains thousands of files, so getting all of them and filtering afterwards takes time.
Is there any quicker way to do this?
string ftpPath = "ftp://directory/";
// Some expression to match against the files...do they have a consistent
// name? This example would find XML files whose names start with 'test'
Regex matchExpression = new Regex(@"^test.+\.xml$", RegexOptions.IgnoreCase);
// DateFilter
DateTime cutOff = DateTime.Now.AddDays(-10);
List<FTPLineResult> results = FTPHelper.GetFilesListSortedByDate(ftpPath, matchExpression, cutOff);
public static List<FTPLineResult> GetFilesListSortedByDate(string ftpPath, Regex nameRegex, DateTime cutoff)
{
List<FTPLineResult> output = new List<FTPLineResult>();
FtpWebRequest request = FtpWebRequest.Create(ftpPath) as FtpWebRequest;
ConfigureProxy(request);
request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
FtpWebResponse response = request.GetResponse() as FtpWebResponse;
StreamReader directoryReader = new StreamReader(response.GetResponseStream(), System.Text.Encoding.ASCII);
var parser = new FTPLineParser();
while (!directoryReader.EndOfStream)
{
var result = parser.Parse(directoryReader.ReadLine());
if (!result.IsDirectory && result.DateTime > cutoff && nameRegex.IsMatch(result.Name))
{
output.Add(result);
}
}
// need to ensure the files are sorted in ascending date order
output.Sort(
new Comparison<FTPLineResult>(
delegate(FTPLineResult res1, FTPLineResult res2)
{
return res1.DateTime.CompareTo(res2.DateTime);
}
)
);
return output;
}
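For reference, the FTPLineResult and FTPLineParser types used above are not shown in the snippet. A minimal sketch of what they might look like, assuming a *nix-style LIST format (other servers need different parsing), could be:
public class FTPLineResult
{
    public string Name { get; set; }
    public DateTime DateTime { get; set; }
    public bool IsDirectory { get; set; }
}

public class FTPLineParser
{
    // Parses e.g. "-rw-r--r-- 1 user group 1699 Jun  1 14:30 test_file.xml"
    public FTPLineResult Parse(string line)
    {
        string[] tokens =
            line.Split(new[] { ' ' }, 9, StringSplitOptions.RemoveEmptyEntries);
        return new FTPLineResult
        {
            IsDirectory = tokens[0][0] == 'd',
            Name = tokens[8],
            DateTime = ParseTimestamp(tokens[5], tokens[6], tokens[7])
        };
    }

    private static DateTime ParseTimestamp(string month, string day, string timeOrYear)
    {
        // Recent entries use "MMM d HH:mm" (current year implied), older ones "MMM d yyyy"
        string format = timeOrYear.Contains(":") ? "MMM d HH:mm" : "MMM d yyyy";
        return DateTime.ParseExact(
            month + " " + day + " " + timeOrYear, format,
            System.Globalization.CultureInfo.InvariantCulture);
    }
}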
Problem: My FTP server contains 1000s of files so getting all files and after that filtering it takes time.
Is there any quicker way to do this?
No.
The only standard FTP API is the LIST command and its companions. All of these will give you a list of all files in a folder. There's no FTP API to give you files filtered by a timestamp.
Some servers support non-standard file masks in the LIST command.
So they will allow you to return only the *.xml files.
See How to get list of files based on pattern matching using FTP?
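A minimal sketch of that non-standard approach with FtpWebRequest, assuming the server honors a mask appended to the listing URL (many servers simply ignore or reject it):
// Non-standard: some servers accept a mask in the LIST path, e.g. "LIST *.xml"
string url = "ftp://ftp.example.com/directory/*.xml";
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(url);
request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
request.Credentials = new NetworkCredential("user", "password");

using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    while (!reader.EndOfStream)
    {
        Console.WriteLine(reader.ReadLine()); // only the *.xml entries, if the server supports masks
    }
}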
Similar questions:
Download files from FTP if they are created within the last hour
C# - Download files from FTP which have higher last-modified date
I found an alternative solution for my requirement, using FluentFTP.
Explanation:
I am downloading the files from FTP (read permission required) with the same folder structure.
So every time the job/service runs, I can check whether the same file (full path) already exists in the physical path. If it does not exist, it can be considered a new file, and I can take some action for it and download it as well.
It's just an alternative solution.
Code Changes:
private static void GetFiles()
{
using (FtpClient conn = new FtpClient())
{
string ftpPath = "ftp://myftp/";
string downloadFileName = @"C:\temp\FTPTest\";
downloadFileName += "\\";
conn.Host = ftpPath;
//conn.Credentials = new NetworkCredential("ftptest", "ftptest");
conn.Connect();
//Get the full listing recursively (directories and files)
foreach (FtpListItem item in conn.GetListing(conn.GetWorkingDirectory(),
FtpListOption.Modify | FtpListOption.Recursive))
{
// if this is a file
if (item.Type == FtpFileSystemObjectType.File)
{
string localFilePath = downloadFileName + item.FullName;
//Only newly created files will be downloaded.
if (!File.Exists(localFilePath))
{
conn.DownloadFile(localFilePath, item.FullName);
//Do any action here.
Console.WriteLine(item.FullName);
}
}
}
}
}

Upload all files matching wildcard to SFTP

I am using Tamir SharpSSH for transferring files from remote to local and vice versa with no issues.
But when trying to upload multiple XML files via SFTP, I am receiving an error:
Illegal characters in path.
If I try to upload using the exact file name it transfers the file without any issues.
It happens every time I try to upload these two XML files:
KDO_E2D_A21_AA769_20170124_143123.xml
KDO_E2D_A21_AA776_20170130_143010.xml
string ftpURL = "11.11.11.1";
string userName = "Aaaaaa"; //User Name of the SFTP server
string password = "hah4444"; //Password of the SFTP server
int port = 22; //Port No of the SFTP server (if any)
//The directory in SFTP server where the files will be uploaded
string ftpDirectory = "/home/A21sftp/kadoe/";
//Local directory from where the files will be uploaded
string localDirectory = "E:\\Zatpark\\*.xml";
Sftp Connection = new Sftp(ftpURL, userName, password);
Connection.Connect(port);
Connection.Put(localDirectory, ftpDirectory);
Connection.Close();
Do not use Tamir.SharpSSH, it's a dead project. Use an up-to-date SSH/SFTP implementation.
If you switch to a library like SSH.NET, which does not support wildcards, you have to use the Directory.GetFiles method to find the files to upload:
SftpClient client = new SftpClient("example.com", "username", "password");
client.Connect();
string localDirectory = @"E:\Zatpark upload";
string localPattern = "*.xml";
string ftpDirectory = "/dv/inbound/";
string[] files = Directory.GetFiles(localDirectory, localPattern);
foreach (string file in files)
{
using (Stream inputStream = new FileStream(file, FileMode.Open))
{
client.UploadFile(inputStream, ftpDirectory + Path.GetFileName(file));
}
}
Or use a library that does support wildcards.
For example, with the WinSCP .NET assembly (which is not a pure .NET assembly, though), you can do:
SessionOptions sessionOptions = new SessionOptions
{
Protocol = Protocol.Sftp,
HostName = "example.com",
UserName = "username",
Password = "password",
SshHostKeyFingerprint = "ssh-dss ...",
};
using (Session session = new Session())
{
session.Open(sessionOptions);
string ftpDirectory = "/dv/inbound/";
string localDirectory = @"E:\Zatpark upload\*.xml";
session.PutFiles(localDirectory, ftpDirectory).Check();
}
You can have a code template generated in WinSCP GUI.
(I'm the author of WinSCP)
To explain why your code does not work: check the ChannelSftp.glob_local method. Its implementation is strange, if not broken. It basically supports only masks consisting entirely of * and ? characters.

Use "DirectoryInfo" with FTP server

I would use this instruction:
System.IO.DirectoryInfo dir = new System.IO.DirectoryInfo("ftp://192.168.47.1/DocXML");
But I can't.
How can I use "ftp://192.168.47.1/DocXML" with new System.IO.DirectoryInfo()?
This is the code:
System.IO.DirectoryInfo dir = new System.IO.DirectoryInfo(@"\\192.168.47.1\DocXML");
IEnumerable<System.IO.FileInfo> fileList = dir.GetFiles("*.*", System.IO.SearchOption.AllDirectories);
I'm afraid you can't.
Try this instead:
FtpWebRequest req = (FtpWebRequest)WebRequest.Create("ftp://192.168.47.1/DocXML");
req.Credentials = new NetworkCredential("foo", "foo@foo.com");
req.Method = WebRequestMethods.Ftp.ListDirectory;
FtpWebResponse res = (FtpWebResponse)req.GetResponse();
using (StreamReader streamReader = new StreamReader(res.GetResponseStream()))
{
...
}
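The body of the using block is left out above; a minimal sketch of what it might do is to read the name-only listing into a List<string>:
List<string> names = new List<string>();
while (!streamReader.EndOfStream)
{
    names.Add(streamReader.ReadLine()); // one entry per line, e.g. "file1.xml"
}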
If you need structured information about the files in an FTP directory, you have to use a 3rd party library. The .NET framework does not offer such functionality.
Particularly because it does not support the MLSD FTP command, which is the only reliable way to retrieve a machine-readable listing of remote files with their attributes.
There are many 3rd party libraries that allow this.
For example with WinSCP .NET assembly:
// Setup session options
SessionOptions sessionOptions = new SessionOptions
{
Protocol = Protocol.Ftp,
HostName = "example.com",
UserName = "username",
Password = "password",
};
using (Session session = new Session())
{
// Connect
session.Open(sessionOptions);
// Get list of files in the directory
string remotePath = "/remote/path/";
RemoteDirectoryInfo directoryInfo = session.ListDirectory(remotePath);
foreach (RemoteFileInfo fileInfo in directoryInfo.Files)
{
Console.WriteLine("{0} with size {1}, permissions {2} and last modification at {3}",
fileInfo.Name, fileInfo.Length, fileInfo.FilePermissions,
fileInfo.LastWriteTime);
}
}
References:
https://winscp.net/eng/docs/library_session_listdirectory
https://winscp.net/eng/docs/library_remotefileinfo
From your comment and your other question, you seem to actually need to retrieve the oldest file in an FTP directory. For that see:
Download the latest file from an FTP server (C#)
Downloading the most recent file (PowerShell, but translates easily to C#)
Both are for the newest, not the oldest, file. Just replace the .OrderByDescending with .OrderBy in the C# code to get the oldest file.
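For instance, with the WinSCP .NET assembly from above, the oldest file in a directory could be picked roughly like this (a sketch; it assumes an already opened Session as in the earlier examples and a using System.Linq directive):
RemoteFileInfo oldest =
    session.EnumerateRemoteFiles("/remote/path", null, EnumerationOptions.None)
        .OrderBy(fileInfo => fileInfo.LastWriteTime)
        .First();

Console.WriteLine("Oldest file: {0} ({1})", oldest.FullName, oldest.LastWriteTime);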
(I'm the author of WinSCP)
It does not work that way.
I recommend using SFTP instead of FTP. For this I'm using the 3rd party library "SharpSSH".
The following example seems to work:
using System.IO;
using Tamir.SharpSsh;
using Tamir.SharpSsh.jsch;
string ip = "DestinationIp";
string user = "JohnDoe";
string password = "YourPassword";
Sftp sftp = new Tamir.SharpSsh.Sftp(ip, user, password);
sftp.Connect();
FileInfo yourFileInfo = new FileInfo("path");
There's also the possibility to add a private key with
sftp.AddIdentityFile();

C# FTP, how to check if a Path is a File or a Directory?

I have an array that contains some FTP paths, like the following:
"ftp//ip/directory/directory1",
"ftp//ip/directory/directory2",
"ftp//ip/directory/file.txt",
"ftp//ip/directory/directory3",
"ftp//ip/directory/another_file.csv"
How can I find out whether a path is a file or a directory?
Thanks in advance.
Use the LIST command (see RFC 959) to get the details of the items under the specified path. Taking FileZilla Server as an example, the LIST command returns the standard Linux permission format. The first letter indicates whether the requested path is a file or a directory. A simple library written in C# for parsing this can also be found online.
I had the same problem. I worked off of Hugh's answer. You need to make an FTP request like:
request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
Grab it with a StreamReader and put it in a string:
StreamReader reader = new StreamReader(responseStream);
string directoryRaw = null;
try { while (reader.Peek() != -1) { directoryRaw += reader.ReadLine() + "|"; } }
catch (Exception ex) { Console.WriteLine(ex.ToString()); }
When you print this, it is going to look like:
|-rw-r--r-- 1 user user 1699 Jun  1  2015 404.shtml
|drwxr-xr-x 2 user user 4096 Sep  8 19:39 cgi-bin
|drwxr-xr-x 2 user user 4096 Nov  3 10:52 css
The entries are separated by |, so that will be the delimiter for a string split.
If an entry starts with a d and not a -, then it's a directory; otherwise it's a file.
The columns before the file name are all the same width, so for each entry take a substring from position 62 to the end and that will be the file name.
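A minimal sketch of that split-and-check, assuming the *nix-style format shown above (the fixed position 62 is specific to this particular server's column widths):
string[] entries = directoryRaw.Split(new[] { '|' }, StringSplitOptions.RemoveEmptyEntries);
foreach (string entry in entries)
{
    bool isDirectory = entry.StartsWith("d");                        // 'd' = directory, '-' = file
    string name = entry.Length > 62 ? entry.Substring(62) : entry;   // fixed-width columns before the name
    Console.WriteLine("{0}\t{1}", isDirectory ? "DIR" : "FILE", name);
}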
Hope it helps
There's no direct way.
Indirectly you could assume that file names that have no period "." are directories, but that is not always going to be true.
Best is to write the code that consumes these paths carefully, so that it e.g. treats the path as a directory first and, if the FTP server reports an error, treats it as a file.
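A rough sketch of that pattern (a hypothetical helper; it just tries a directory listing and falls back to "file" when the server reports an error, which different servers signal in different ways):
static bool IsProbablyDirectory(string ftpPath, NetworkCredential credentials)
{
    // Try to list the path as a directory; a trailing slash makes the intent explicit
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(ftpPath.TrimEnd('/') + "/");
    request.Method = WebRequestMethods.Ftp.ListDirectory;
    request.Credentials = credentials;
    try
    {
        using (request.GetResponse()) { }
        return true;              // listing succeeded => treat it as a directory
    }
    catch (WebException)
    {
        return false;             // server reported an error => treat it as a file
    }
}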
One way to do it: if we can assume that all files end in an extension and all directories do not, we can use the System.IO.Path.GetExtension() method like this:
public bool IsDirectory(string directory)
{
if(directory == null)
{
throw new ArgumentNullException(); // or however you want to handle null values
}
// GetExtension(string) returns string.Empty when no extension found
return System.IO.Path.GetExtension(directory) == string.Empty;
}
You can use System.IO.Path.GetExtension(path) as a way to check whether your path has a file extension.
Given "ftp://ip/directory/directory1" or "ftp://ip/directory/directory2/", GetExtension will return String.Empty.
This isn't foolproof, though: a file without an extension would break it completely, and a directory with a period in its name could cause issues.
I have found a "hack" to determine the target type.
If you use
request.Method = WebRequestMethods.Ftp.GetFileSize;
on a folder, it will result in an exception:
Unhandled Exception: System.Net.WebException: The remote server returned an error: (550) File unavailable (e.g., file not found, no access).
But using it on a file, it will naturally return its size.
I have created sample code for such a method:
static bool IsFile(string ftpPath)
{
var request = (FtpWebRequest)WebRequest.Create(ftpPath);
request.Method = WebRequestMethods.Ftp.GetFileSize;
request.Credentials = new NetworkCredential("foo", "bar");
try
{
using (var response = (FtpWebResponse)request.GetResponse())
using (var responseStream = response.GetResponseStream())
{
return true;
}
}
catch(WebException ex)
{
return false;
}
}
You might want to alter it, because this one will catch any FTP error.
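For instance, the catch could be narrowed to the 550 response only (a sketch along the same lines):
static bool IsFile(string ftpPath, NetworkCredential credentials)
{
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(ftpPath);
    request.Method = WebRequestMethods.Ftp.GetFileSize;
    request.Credentials = credentials;
    try
    {
        using (request.GetResponse())
        {
            return true;                                   // SIZE succeeded => it's a file
        }
    }
    catch (WebException ex)
    {
        FtpWebResponse response = ex.Response as FtpWebResponse;
        if (response != null &&
            response.StatusCode == FtpStatusCode.ActionNotTakenFileUnavailable)
        {
            return false;                                  // 550 => most likely a directory
        }
        throw;                                             // anything else is a genuine error
    }
}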
I had the same problem, so I used GetFileSize to check whether it's a file or a directory:
var isFile = FtpHelper.IsFile(new Uri("ftp://host/path"), new NetworkCredential("userName", "password"));
using System;
using System.Net;
public static class FtpHelper
{
public static bool IsFile(Uri requestUri, NetworkCredential networkCredential)
{
return GetFtpFileSize(requestUri, networkCredential) != default(long); //It's a Directory if it has no size
}
public static FtpWebRequest GetFtpWebRequest(Uri requestUri, NetworkCredential networkCredential, string method = null)
{
var ftpWebRequest = (FtpWebRequest)WebRequest.Create(requestUri); //Create FtpWebRequest with given Request Uri.
ftpWebRequest.Credentials = networkCredential; //Set the Credentials of current FtpWebRequest.
if (!string.IsNullOrEmpty(method))
ftpWebRequest.Method = method; //Set the Method of FtpWebRequest in case it has a value.
return ftpWebRequest; //Return the configured FtpWebRequest.
}
public static long GetFtpFileSize(Uri requestUri, NetworkCredential networkCredential)
{
//Create ftpWebRequest object with given options to get the File Size.
var ftpWebRequest = GetFtpWebRequest(requestUri, networkCredential, WebRequestMethods.Ftp.GetFileSize);
try { return ((FtpWebResponse)ftpWebRequest.GetResponse()).ContentLength; } //In case of success it'll return the file size.
catch (Exception) { return default(long); } //In case of failure it'll return the default value, to be checked later.
}
}

How can I send the client multiple files to download

I'm using a handler (.ashx) to serve some files. I have a folder where I store ebooks. I name them by the book's PK, and each book may be available in a few different formats:
211.html
211.pdf
211.prc
The following test code successfully downloads one book.
context.Response.ContentType = "application/octet-stream";
context.Response.AppendHeader("Content-Disposition", "attachment;filename=myfile.pdf");
context.Response.TransmitFile(context.Server.MapPath("~/Media/eBooks/212.pdf"));
How can I serve the client the three different formats? (The client's existing organization isn't in a folder.)
I was trying to do something like this:
DirectoryInfo bookDir = new DirectoryInfo(context.Server.MapPath("~/Media/eBooks"));
FileInfo[] f = bookDir.GetFiles();
foreach (var n in f)
{
context.Response.AppendHeader("Content-Disposition", "attachment;filename=myfile.pdf");
context.Response.TransmitFile(context.Server.MapPath("~/Media/eBooks/212.pdf"));
}
But it downloads one file with no file extension.
The only way you can send multiple files in one response is to put them inside an archive package, e.g. a .zip file. That is at least something that can be done with code, using various tools (IIRC there's a zip packager inside the main .NET framework now; otherwise, SharpZipLib will do the job nicely).
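A minimal sketch of the in-framework route, using System.IO.Compression (available from .NET 4.5, with a reference to System.IO.Compression.FileSystem); the book number and file names are just placeholders based on the question:
// Inside the handler class (.ashx code-behind):
// using System.IO;
// using System.IO.Compression;
public void ProcessRequest(HttpContext context)
{
    string[] files =
    {
        context.Server.MapPath("~/Media/eBooks/211.html"),
        context.Server.MapPath("~/Media/eBooks/211.pdf"),
        context.Server.MapPath("~/Media/eBooks/211.prc"),
    };

    context.Response.ContentType = "application/zip";
    context.Response.AppendHeader("Content-Disposition", "attachment;filename=book211.zip");

    // Write the archive directly into the response stream
    using (ZipArchive archive =
        new ZipArchive(context.Response.OutputStream, ZipArchiveMode.Create, leaveOpen: true))
    {
        foreach (string file in files)
        {
            archive.CreateEntryFromFile(file, Path.GetFileName(file));
        }
    }
}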
To send multiple files to be downloaded, you should zip them using SharpZipLib or another file zipping utility; the files are zipped and then a download link can be sent to the client so they can be downloaded at once. The code below uses the ICSharpCode.SharpZipLib.dll library.
You can call this method and pass the files which you want to zip.
public string Makezipfile(string[] files)
{
string[] filenames = new string[files.Length];
for (int i = 0; i < files.Length; i++)
filenames[i] = HttpContext.Current.Request.PhysicalApplicationPath + files[i].Replace(HttpContext.Current.Request.UrlReferrer.ToString(), "");
string DirectoryName = filenames[0].Remove(filenames[0].LastIndexOf('/'));
DirectoryName = DirectoryName.Substring(DirectoryName.LastIndexOf('/') + 1).Replace("\\", "");
try
{
string newFile = HttpContext.Current.Request.PhysicalApplicationPath + "your image directory\\" + DirectoryName + ".zip";
if (File.Exists(newFile))
File.Delete(newFile);
using (ZipFile zip = new ZipFile())
{
foreach (string file in filenames)
{
string newfileName = file.Replace("\\'", "'");
zip.CompressionLevel = 0;
zip.AddFile(newfileName, "");
}
zip.Save(newFile);
}
}
catch (Exception ex)
{
//Console.WriteLine("Exception during processing {0}", ex);
Response.Write(ex);
// No need to rethrow the exception as for our purposes it's handled.
}
string a;
a = "your images/" + DirectoryName + ".zip";
return a;
}
I acknowledge the good zip solutions mentioned here, but alternatively could you make 3 calls to the handler using JavaScript/XHR, requesting a different file format each time?
Admittedly, you are restricted by the number of concurrent requests supported by the browser, though I believe the browser will queue requests over the limit.
The benefit is that the user won't need to deal with a zip file, which may confuse them. Instead they should get 3 separate downloads.
