Get FTP file details based on datetime in C#

Question: I want to get file details from an FTP server based on a specific datetime, without using any 3rd-party library.
Problem: My FTP server contains 1000s of files, so getting all files and then filtering them takes time.
Is there any quicker way to do this?
string ftpPath = "ftp://directory/";
// Some expression to match against the files... do they have a consistent
// name? This example would find XML files whose names start with 'test'.
Regex matchExpression = new Regex(@"^test.+\.xml$", RegexOptions.IgnoreCase);
// Date filter
DateTime cutOff = DateTime.Now.AddDays(-10);
List<FTPLineResult> results = FTPHelper.GetFilesListSortedByDate(ftpPath, matchExpression, cutOff);
public static List<FTPLineResult> GetFilesListSortedByDate(string ftpPath, Regex nameRegex, DateTime cutoff)
{
    List<FTPLineResult> output = new List<FTPLineResult>();
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(ftpPath);
    ConfigureProxy(request);
    request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
    using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
    using (StreamReader directoryReader = new StreamReader(response.GetResponseStream(), System.Text.Encoding.ASCII))
    {
        var parser = new FTPLineParser();
        while (!directoryReader.EndOfStream)
        {
            var result = parser.Parse(directoryReader.ReadLine());
            if (!result.IsDirectory && result.DateTime > cutoff && nameRegex.IsMatch(result.Name))
            {
                output.Add(result);
            }
        }
    }
    // Ensure the files are sorted in ascending date order.
    output.Sort((res1, res2) => res1.DateTime.CompareTo(res2.DateTime));
    return output;
}

Problem: My FTP server contains 1000s of files, so getting all files and then filtering them takes time.
Is there any quicker way to do this?
No.
The only standard FTP listing API is the LIST command and its companions. All of these give you the list of all files in a folder; there's no standard FTP command that returns files filtered by a timestamp.
Some servers do support non-standard file masks in the LIST command,
so they will allow you to return only the *.xml files.
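For example, a minimal sketch of passing a mask to ListDirectoryDetails (the host name and mask are placeholders, and whether the mask is honored at all is entirely server-specific):
// Sketch only: a wildcard in the path is NOT standard FTP and many servers
// will ignore or reject it; timestamps still have to be filtered client-side.
FtpWebRequest request =
    (FtpWebRequest)WebRequest.Create("ftp://ftp.example.com/test*.xml");
request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    while (!reader.EndOfStream)
    {
        Console.WriteLine(reader.ReadLine());
    }
}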
See How to get list of files based on pattern matching using FTP?
Similar questions:
Download files from FTP if they are created within the last hour
C# - Download files from FTP which have higher last-modified date

I found an alternative way to achieve this using FluentFTP.
Explanation:
I download the files from the FTP server (read permission required) into the same folder structure locally.
So every time the job/service runs, it checks whether the same file (full path) already exists in the physical path. If it does not exist, it can be considered a new file, and I can take some action on it and download it as well.
It's just an alternative solution.
Code Changes:
private static void GetFiles()
{
    using (FtpClient conn = new FtpClient())
    {
        string downloadFolder = @"C:\temp\FTPTest\";
        conn.Host = "myftp"; // FluentFTP expects a host name here, not an ftp:// URL
        //conn.Credentials = new NetworkCredential("ftptest", "ftptest");
        conn.Connect();
        // Get all items, recursively, with modification timestamps
        foreach (FtpListItem item in conn.GetListing(conn.GetWorkingDirectory(),
            FtpListOption.Modify | FtpListOption.Recursive))
        {
            // If this is a file
            if (item.Type == FtpFileSystemObjectType.File)
            {
                string localFilePath = downloadFolder + item.FullName;
                // Only newly created files will be downloaded.
                if (!File.Exists(localFilePath))
                {
                    conn.DownloadFile(localFilePath, item.FullName);
                    // Do any action here.
                    Console.WriteLine(item.FullName);
                }
            }
        }
    }
}
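Since the listing already requests modification timestamps (FtpListOption.Modify), the same loop could also filter by date, which gets back to the original question. A minimal sketch, assuming the conn from above and the ten-day cutoff from the first snippet:
DateTime cutOff = DateTime.Now.AddDays(-10);
foreach (FtpListItem item in conn.GetListing(conn.GetWorkingDirectory(),
    FtpListOption.Modify | FtpListOption.Recursive))
{
    // item.Modified is populated because FtpListOption.Modify was requested.
    if (item.Type == FtpFileSystemObjectType.File && item.Modified > cutOff)
    {
        Console.WriteLine("{0} modified {1:u}", item.FullName, item.Modified);
    }
}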

Related

Read folder paths from FTP directory to IEnumerable

I'm currently working on a .NET 4.6 console application. I need to parse a couple of XML files from different directories on my FTP server. I thought the best approach would be to read all file paths and store them in an IEnumerable for further processing (serializing the XML files to objects).
The root FTP path looks like this:
string urlFtpServer = @"ftp://128.0.1.70";
File paths look like this:
string file1 = @"ftp://128.0.1.70/MyFolder1/Mainfile.xml";
string file2 = @"ftp://128.0.1.70/MyFolder1/Subfile.xml";
string file3 = @"ftp://128.0.1.70/MyFolder2/Mainfile.xml";
string file4 = @"ftp://128.0.1.70/MyFolder2/Subfile.xml";
string file5 = @"ftp://128.0.1.70/MyFolder3/Mainfile.xml";
My question is, do you know how I can get those specific file paths?
I can currently read the folders of my FTP directory with this code:
static void Main(string[] args)
{
    string url = @"ftp://128.0.1.70";
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(url);
    request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
    request.Credentials = new NetworkCredential("My-User", "mypassword");
    FtpWebResponse response = (FtpWebResponse)request.GetResponse();
    Stream responseStream = response.GetResponseStream();
    StreamReader reader = new StreamReader(responseStream);
    Console.WriteLine(reader.ReadToEnd());
    Console.WriteLine("Directory List Complete, status {0}", response.StatusDescription);
    reader.Close();
    response.Close();
    Console.ReadKey();
}
Do you know how I can read all file paths from the FTP main directory and possibly store them in a List<string>?
Thank you very much!!
Using FtpWebRequest
The FtpWebRequest does not have any explicit support for recursive file operations (including listing). You have to implement the recursion yourself:
List the remote directory
Iterate the entries, recursing into subdirectories (listing them again, etc.)
The tricky part is to tell the files from the subdirectories. There's no portable way to do that with the FtpWebRequest. The FtpWebRequest unfortunately does not support the MLSD command, which is the only portable way to retrieve a directory listing with file attributes in the FTP protocol. See also Checking if object on FTP server is file or directory.
Your options are:
Do an operation on a file name that is certain to fail for files and succeed for directories (or vice versa). E.g. you can try to download the "name": if that succeeds, it's a file; if it fails, it's a directory.
You may be lucky and, in your specific case, be able to tell a file from a directory by its name (i.e. all your files have an extension, while subdirectories do not).
You use a long directory listing (LIST command = ListDirectoryDetails method) and try to parse a server-specific listing. Many FTP servers use *nix-style listing, where you identify a directory by the d at the very beginning of the entry. But many servers use a different format. The following example uses this approach (assuming the *nix format)
void ListFtpDirectory(
    string url, string rootPath, NetworkCredential credentials, List<string> list)
{
    FtpWebRequest listRequest = (FtpWebRequest)WebRequest.Create(url + rootPath);
    listRequest.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
    listRequest.Credentials = credentials;

    List<string> lines = new List<string>();

    using (FtpWebResponse listResponse = (FtpWebResponse)listRequest.GetResponse())
    using (Stream listStream = listResponse.GetResponseStream())
    using (StreamReader listReader = new StreamReader(listStream))
    {
        while (!listReader.EndOfStream)
        {
            lines.Add(listReader.ReadLine());
        }
    }

    foreach (string line in lines)
    {
        string[] tokens =
            line.Split(new[] { ' ' }, 9, StringSplitOptions.RemoveEmptyEntries);
        string name = tokens[8];
        string permissions = tokens[0];

        string filePath = rootPath + name;

        if (permissions[0] == 'd')
        {
            ListFtpDirectory(url, filePath + "/", credentials, list);
        }
        else
        {
            list.Add(filePath);
        }
    }
}
Use the function like:
List<string> list = new List<string>();
NetworkCredential credentials = new NetworkCredential("user", "mypassword");
string url = "ftp://ftp.example.com/";
ListFtpDirectory(url, "", credentials, list);
Using 3rd party library
If you want to avoid troubles with parsing the server-specific directory listing formats, use a 3rd party library that supports the MLSD command and/or parsing various LIST listing formats; and recursive downloads.
For example, with the WinSCP .NET assembly you can list a whole directory tree with a single call to Session.EnumerateRemoteFiles:
// Setup session options
SessionOptions sessionOptions = new SessionOptions
{
    Protocol = Protocol.Ftp,
    HostName = "ftp.example.com",
    UserName = "user",
    Password = "mypassword",
};

using (Session session = new Session())
{
    // Connect
    session.Open(sessionOptions);

    // List files
    IEnumerable<string> list =
        session.EnumerateRemoteFiles("/", null, EnumerationOptions.AllDirectories).
            Select(fileInfo => fileInfo.FullName);
}
Internally, WinSCP uses the MLSD command, if supported by the server. If not, it uses the LIST command and supports dozens of different listing formats.
(I'm the author of WinSCP)

C# Multiple Download from FTP using parallel task - Duplicate Download issue

I am facing a strange issue. I want to download a list of files from FTP, and I preferred to go with parallel tasks. Below is my code. The issue is that all of the files get downloaded, but duplicate files with different names are generated as well. I am very new to the parallel task concept. Please help me find the issue.
Note: I am using SSH.NET for the SFTP connection and download.
private void ConcurrentDownload()
{
    // Declaring connection information
    PasswordAuthenticationMethod pm = new PasswordAuthenticationMethod("FTPUserName", "Password");
    ConnectionInfo connectionInfo = new ConnectionInfo("FTPHost", 22, "FTPUserName",
        ProxyTypes.Socks5, "127.0.0.1", 8080, string.Empty, string.Empty, pm);
    using (SftpClient sfc = new SftpClient(connectionInfo))
    {
        // Establish the remote connection
        sfc.Connect();

        // Getting remote directory contents
        IEnumerable<SftpFile> sFiles = sfc.ListDirectory(".");

        // Building the file list
        List<string> remotefiles = new List<string>();
        foreach (SftpFile sfile in sFiles)
        {
            if (!sfile.IsDirectory)
            {
                remotefiles.Add(sfile.Name);
            }
        }

        // Parallel download
        Parallel.ForEach(remotefiles.Distinct(), file => DownloadFile(sfc, file));
        sfc.Disconnect();
    }
}

private void DownloadFile(SftpClient sf, string RemoteFileName)
{
    using (Stream ms = File.OpenWrite(RemoteFileName))
    {
        sf.DownloadFile(RemoteFileName, ms);
    }
}
You had better use Distinct, like below:
Parallel.ForEach(remotefiles.Distinct(), file => DownloadFile(sfc, file));
If you have duplicate file names, then when the parallel processing starts on the same file, you will get an exception for those duplicate files.
Also, you are not downloading to a separate location; what you are doing is downloading to the same path as the FTP source file name. Is that correct?
I would use a different download directory, get the file name from the source file, and then download to that location, as below:
private void DownloadFile(SftpClient sf, string RemoteFileName)
{
    string downloadTo = Path.Combine(DownloadDirectoryPath, Path.GetFileName(RemoteFileName));
    using (Stream ms = File.OpenWrite(downloadTo))
    {
        sf.DownloadFile(RemoteFileName, ms);
    }
}
Related Reference : SFTP Async Upload in Parallel

C# FTP, how to check if a Path is a File or a Directory?

I have an array that contains some FTP paths, like the following:
"ftp://ip/directory/directory1",
"ftp://ip/directory/directory2",
"ftp://ip/directory/file.txt",
"ftp://ip/directory/directory3",
"ftp://ip/directory/another_file.csv"
How can I find out if a path is a file or a directory?
Thanks in advance.
Use the LIST command, which is defined in RFC 959, to get details about the items under the specified path. Taking FileZilla Server as an example, the LIST command returns the standard Linux permission format, which you can find here. The first letter indicates whether the requested path is a file or a directory. A simple library written in C# can also be found here.
I had the same problem. I worked off of Hugh's answer. You need to make an FTP request like:
request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
Grab the response with a StreamReader and stick it in a string:
StreamReader reader = new StreamReader(responseStream);
string directoryRaw = null;
try
{
    while (reader.Peek() != -1)
    {
        directoryRaw += reader.ReadLine() + "|";
    }
}
catch (Exception ex)
{
    Console.WriteLine(ex.ToString());
}
When you print this, it is going to look like:
|-rw-r--r-- 1 user user 1699 Jun  1  2015 404.shtml
|drwxr-xr-x 2 user user 4096 Sep  8 19:39 cgi-bin
|drwxr-xr-x 2 user user 4096 Nov  3 10:52 css
The entries are separated by |, so that will be the delimiter for a string split.
If an entry starts with a d and not a -, it's a directory; otherwise it's a file.
The fields before the file name have a fixed width, so for each entry you can take a new string from position 62 to the end, and that will be the file name.
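A minimal sketch of that parsing, assuming the *nix-style listing and the directoryRaw string built above (names containing spaces would need the fixed-width substring instead of the last token):
foreach (string entry in directoryRaw.Split(
    new[] { '|' }, StringSplitOptions.RemoveEmptyEntries))
{
    bool isDirectory = entry.StartsWith("d");
    // Assumes the name is the last whitespace-separated token.
    string name = entry.Substring(entry.LastIndexOf(' ') + 1);
    Console.WriteLine("{0}: {1}", isDirectory ? "dir" : "file", name);
}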
Hope it helps
There's no direct way.
Indirectly, you could assume that file names without a period "." are directories, but that is not always going to be true.
Best is to write the code that consumes these paths carefully, so that it e.g. treats the path as a directory first and, if the FTP server reports an error, treats it as a file.
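A minimal sketch of that approach with FtpWebRequest (the credentials and trailing-slash handling are assumptions, and whether listing a file path actually fails is itself server-dependent):
static bool IsDirectory(string ftpPath, NetworkCredential credentials)
{
    // Ask the server to list the path as if it were a directory.
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(ftpPath.TrimEnd('/') + "/");
    request.Method = WebRequestMethods.Ftp.ListDirectory;
    request.Credentials = credentials;
    try
    {
        using (request.GetResponse())
        {
            return true; // the server listed it, so treat it as a directory
        }
    }
    catch (WebException)
    {
        return false; // the server reported an error, so treat it as a file
    }
}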
One way to do it, if we can assume that all files end in an extension and all directories do not, is to use the System.IO.Path.GetExtension() method like this:
public bool IsDirectory(string directory)
{
    if (directory == null)
    {
        throw new ArgumentNullException(); // or however you want to handle null values
    }

    // GetExtension(string) returns string.Empty when no extension is found
    return System.IO.Path.GetExtension(directory) == string.Empty;
}
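For example, against the paths from the question (assuming the method above is in scope):
Console.WriteLine(IsDirectory("ftp://ip/directory/directory1"));       // True
Console.WriteLine(IsDirectory("ftp://ip/directory/file.txt"));         // False
Console.WriteLine(IsDirectory("ftp://ip/directory/another_file.csv")); // False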
You can use System.IO.Path.GetExtension(path) as a way to check whether your path has a file extension.
Given "ftp://ip/directory/directory1" or "ftp://ip/directory/directory2/", GetExtension will return String.Empty.
This isn't foolproof, though: a file without an extension would break it completely, and a directory with a period in its name could cause issues.
I have found "hack" how to determine target type.
If you will use
request.Method = WebRequestMethods.Ftp.GetFileSize;
on a folder, it will result in Exception
Unhandled Exception: System.Net.WebException: The remote server
returned an erro r: (550) File unavailable (e.g., file not found, no
access).
But using it on file, it will naturally return its size.
I have created sample code for such a method.
static bool IsFile(string ftpPath)
{
    var request = (FtpWebRequest)WebRequest.Create(ftpPath);
    request.Method = WebRequestMethods.Ftp.GetFileSize;
    request.Credentials = new NetworkCredential("foo", "bar");
    try
    {
        using (var response = (FtpWebResponse)request.GetResponse())
        using (var responseStream = response.GetResponseStream())
        {
            return true;
        }
    }
    catch (WebException)
    {
        return false;
    }
}
You might want to alter it, because this version will treat any FTP error as "not a file".
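For instance, a sketch of a stricter variant that only treats the 550 reply as "not a file" and lets everything else propagate (the credentials are placeholders, as above):
static bool IsFile(string ftpPath)
{
    var request = (FtpWebRequest)WebRequest.Create(ftpPath);
    request.Method = WebRequestMethods.Ftp.GetFileSize;
    request.Credentials = new NetworkCredential("foo", "bar");
    try
    {
        using (request.GetResponse())
        {
            return true;
        }
    }
    catch (WebException ex)
    {
        var response = ex.Response as FtpWebResponse;
        if (response != null &&
            response.StatusCode == FtpStatusCode.ActionNotTakenFileUnavailable)
        {
            return false; // 550: not a plain file (or it genuinely doesn't exist)
        }
        throw; // network problems etc. should not be mistaken for directories
    }
}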
I had the same problem, so I used GetFileSize to check whether it's a file or a directory:
var isFile = FtpHelper.IsFile(new Uri("ftpURL"), new NetworkCredential("userName", "password"));
using System;
using System.Net;

public static class FtpHelper
{
    public static bool IsFile(Uri requestUri, NetworkCredential networkCredential)
    {
        // It's a directory if it has no size.
        return GetFtpFileSize(requestUri, networkCredential) != default(long);
    }

    public static FtpWebRequest GetFtpWebRequest(Uri requestUri, NetworkCredential networkCredential, string method = null)
    {
        var ftpWebRequest = (FtpWebRequest)WebRequest.Create(requestUri); // Create FtpWebRequest with the given request Uri.
        ftpWebRequest.Credentials = networkCredential; // Set the credentials of the current FtpWebRequest.

        if (!string.IsNullOrEmpty(method))
            ftpWebRequest.Method = method; // Set the method of the FtpWebRequest in case it has a value.

        return ftpWebRequest; // Return the configured FtpWebRequest.
    }

    public static long GetFtpFileSize(Uri requestUri, NetworkCredential networkCredential)
    {
        // Create an FtpWebRequest with the given options to get the file size.
        var ftpWebRequest = GetFtpWebRequest(requestUri, networkCredential, WebRequestMethods.Ftp.GetFileSize);

        try { return ((FtpWebResponse)ftpWebRequest.GetResponse()).ContentLength; } // On success, return the file size.
        catch (Exception) { return default(long); } // On failure, return the default value so it can be checked later.
    }
}

How to download a file only when the local file is older

I am trying to compare two files, one on the local computer and another on a web server; if the file on the web server is newer, it should be downloaded and overwrite the local one. FileInfo will not take URIs, though, so can someone recommend a way around this, please?
private void checkver()
{
    FileInfo sourceFile = new FileInfo("download.zip");
    if (sourceFile.Exists)
    {
        // This is the part that does not work: FileInfo cannot address an HTTP URL.
        FileInfo destFile = new FileInfo(@"http://www.google.com/download.zip");
        if (destFile.Exists && destFile.LastWriteTime >= sourceFile.LastWriteTime)
        {
            MessageBox.Show("File already up to date");
        }
        else
        {
            MessageBox.Show("File is not up to date");
        }
    }
}
Try using HttpWebRequest and HttpWebResponse:
var request = (HttpWebRequest)WebRequest.Create(@"http://www.google.com/download.zip");
request.Method = "HEAD";

var response = (HttpWebResponse)request.GetResponse();
if (response.LastModified > sourceFile.LastWriteTime)
{
    // create another request to download the whole file
}
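If the remote copy is newer, the download step could look like this minimal sketch (reusing the same URL and the sourceFile from the question; WebClient is just one option):
// Download and overwrite the local file.
using (var client = new WebClient())
{
    client.DownloadFile(@"http://www.google.com/download.zip", sourceFile.FullName);
}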

How can I send the client multiple files to download

I'm using a handler (.ashx) to serve some files. I have a folder where I store ebooks; I name them by the book's PK, and each book may have a few different formats:
211.html
211.pdf
211.prc
The following test code successfully downloads one book.
context.Response.ContentType = "application/octet-stream";
context.Response.AppendHeader("Content-Disposition", "attachment;filename=myfile.pdf");
context.Response.TransmitFile(context.Server.MapPath("~/Media/eBooks/212.pdf"));
How can I serve the client the three different formats? (The client's existing organization isn't in a folder.)
I was trying to do something like this:
DirectoryInfo bookDir = new DirectoryInfo(context.Server.MapPath("~/Media/eBooks"));
FileInfo[] f = bookDir.GetFiles();
foreach (var n in f)
{
    context.Response.AppendHeader("Content-Disposition", "attachment;filename=myfile.pdf");
    context.Response.TransmitFile(context.Server.MapPath("~/Media/eBooks/212.pdf"));
}
But it downloads one file with no file extension.
The only way you can send multiple files in one response is to put them inside an archive package, e.g. a .zip file. That is at least something that can be done in code, using various tools (there is now a zip packager inside the main .NET Framework, in System.IO.Compression; otherwise, SharpZipLib will do the job nicely).
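For instance, a minimal sketch using the built-in System.IO.Compression types (.NET 4.5+; needs references to System.IO.Compression and System.IO.Compression.FileSystem, and the matching using directives); the book number and file names reuse the question's examples:
context.Response.ContentType = "application/zip";
context.Response.AppendHeader("Content-Disposition", "attachment;filename=book211.zip");

// Stream a zip straight into the response; leaveOpen keeps the
// response stream usable after the archive is finalized.
using (var archive = new ZipArchive(
    context.Response.OutputStream, ZipArchiveMode.Create, leaveOpen: true))
{
    foreach (string name in new[] { "211.html", "211.pdf", "211.prc" })
    {
        archive.CreateEntryFromFile(
            context.Server.MapPath("~/Media/eBooks/" + name), name);
    }
}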
To send multiple files to be downloaded, you should zip them using a file-zipping utility; the files are zipped, and then a download link can be sent to the client to download them all at once. Note that while SharpZipLib is one option, the code below actually uses the ZipFile class as found in the DotNetZip (Ionic.Zip) library.
You can call this method and pass the files which you want to zip.
public string Makezipfile(string[] files)
{
    string[] filenames = new string[files.Length];
    for (int i = 0; i < files.Length; i++)
        filenames[i] = HttpContext.Current.Request.PhysicalApplicationPath + files[i].Replace(HttpContext.Current.Request.UrlReferrer.ToString(), "");

    string DirectoryName = filenames[0].Remove(filenames[0].LastIndexOf('/'));
    DirectoryName = DirectoryName.Substring(DirectoryName.LastIndexOf('/') + 1).Replace("\\", "");

    try
    {
        string newFile = HttpContext.Current.Request.PhysicalApplicationPath + "your image directory\\" + DirectoryName + ".zip";
        if (File.Exists(newFile))
            File.Delete(newFile);

        using (ZipFile zip = new ZipFile())
        {
            foreach (string file in filenames)
            {
                string newfileName = file.Replace("\\'", "'");
                zip.CompressionLevel = 0;
                zip.AddFile(newfileName, "");
            }
            zip.Save(newFile);
        }
    }
    catch (Exception ex)
    {
        //Console.WriteLine("Exception during processing {0}", ex);
        Response.Write(ex);
        // No need to rethrow the exception as for our purposes it's handled.
    }

    string a = "your images/" + DirectoryName + ".zip";
    return a;
}
I acknowledge the good zip solutions mentioned here, but alternatively, could you make three calls to the handler using JavaScript/XHR, requesting a different file format each time?
Admittedly, you are restricted by the number of concurrent requests supported by the browser, though I believe the browser will queue requests over the limit.
The benefit is that the user won't need to deal with a zip file, which may confuse them. Instead they should get three separate downloads.
