How can I calculate the size of an FTP folder? Do you know of any tool or programmatic way to do it in C#?
If you have FileZilla, you can use this trick:
Click on the folder(s) whose size you want to calculate.
Click Add files to queue.
This will scan all folders and files and add them to the queue. Then look at the queue pane; below it (on the status bar) you should see a message indicating the total queue size.
You can use the du command in lftp for this purpose, like this:
echo "du -hs ." | lftp example.com 2>&1
This will print the current directory's disk size, including all subdirectories, in human-readable format (-h) and omitting output lines for subdirectories (-s). stderr output is redirected to stdout with 2>&1 so that it is included in the output.
However, lftp is only available for Unix-like systems, so to use it from C# under Windows you would need to install it in the Windows Subsystem for Linux (WSL), Cygwin, or MSYS2. (Thanks to the commenters for the hints!)
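If you go that route, here is a minimal sketch of invoking lftp from C# through WSL. The host name, credentials, and remote path are placeholders, and the exact lftp flags and quoting may need adjusting for your setup:

using System;
using System.Diagnostics;

class LftpDu
{
    static void Main()
    {
        // Ask WSL to run lftp and print a summarized, human-readable size
        // of the remote directory. Adjust user, password, host and path.
        var psi = new ProcessStartInfo
        {
            FileName = "wsl",
            Arguments = "lftp -u user,password -e \"du -hs .; quit\" example.com",
            RedirectStandardOutput = true,
            RedirectStandardError = true,
            UseShellExecute = false
        };

        using (var process = Process.Start(psi))
        {
            // du writes its summary to stdout; any lftp errors go to stderr.
            Console.WriteLine(process.StandardOutput.ReadToEnd());
            Console.Error.WriteLine(process.StandardError.ReadToEnd());
            process.WaitForExit();
        }
    }
}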
The lftp du command documentation is missing from its manpage, but available within the lftp shell with the help du command. For reference, I copy its output here:
lftp :~> help du
Usage: du [options] <dirs>
Summarize disk usage.
 -a, --all             write counts for all files, not just directories
     --block-size=SIZ  use SIZ-byte blocks
 -b, --bytes           print size in bytes
 -c, --total           produce a grand total
 -d, --max-depth=N     print the total for a directory (or file, with --all)
                       only if it is N or fewer levels below the command
                       line argument; --max-depth=0 is the same as
                       --summarize
 -F, --files           print number of files instead of sizes
 -h, --human-readable  print sizes in human readable format (e.g., 1K 234M 2G)
 -H, --si              likewise, but use powers of 1000 not 1024
 -k, --kilobytes       like --block-size=1024
 -m, --megabytes       like --block-size=1048576
 -S, --separate-dirs   do not include size of subdirectories
 -s, --summarize       display only a total for each argument
     --exclude=PAT     exclude files that match PAT
WinSCP (a free GUI client for Microsoft Windows) can show this in its folder properties dialog.
If you just need the work done, SmartFTP might help you; it also ships PHP and ASP scripts that get the total folder size by recursively going through all the files.
You could send the LIST command, which should give you a list of files in the directory and some info about them (fairly certain the size is included), which you could then parse out and add up.
It depends on how you connect to the server, but if you're using the FtpWebRequest class, there's the ListDirectoryDetails method to do this. See here for details and here for some sample code.
Just be aware that if you want the total size including all subdirectories, I think you'll have to enter each subdirectory and call it recursively, so it can be quite slow. Because of that, I'd normally recommend, if possible, having a script on the server calculate the size and return the result in some way (possibly storing it in a file you could download and read). A rough sketch of the recursive LIST approach is included below.
Edit: Or if you just mean that you'd be happy with a tool that does it for you, I think FlashFXP does it, and other advanced FTP clients probably will as well. Or if it's a Unix server, I have a vague memory that you could just log in and type ls -laR or something to get a recursive directory listing.
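For what it's worth, here is a rough, hedged sketch of the recursive LIST approach described above, using FtpWebRequest. It assumes a Unix-style listing where the fifth whitespace-separated field is the size and lines starting with 'd' are directories; real listings vary by server, so treat this as an illustration rather than production code:

using System;
using System.IO;
using System.Linq;
using System.Net;

static class FtpListSize
{
    // Sums file sizes under ftpUrl by parsing LIST output and recursing
    // into subdirectories. Assumes a Unix-style listing format.
    public static long GetDirectorySize(string ftpUrl, NetworkCredential credentials)
    {
        var request = (FtpWebRequest)WebRequest.Create(ftpUrl);
        request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
        request.Credentials = credentials;

        long total = 0;
        using (var response = (FtpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                var fields = line.Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries);
                if (fields.Length < 9)
                    continue; // not a listing line this sketch understands

                var name = string.Join(" ", fields.Skip(8)); // the name may contain spaces
                if (line.StartsWith("d"))
                    total += GetDirectorySize(ftpUrl.TrimEnd('/') + "/" + name, credentials);
                else
                    total += long.Parse(fields[4]); // 5th field is the size in Unix-style listings
            }
        }
        return total;
    }
}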
I use the FTPS library from Alex Pilotti with C# to execute some FTP commands in a few production environments. The library works well, but you have to recursively get a list of files in the directory and add their sizes together to get the result. This can be a bit time consuming on some of our larger servers (sometimes 1-2 min) with complex file structures.
Anyway, this is the method I use with his library:
/// <summary>
/// <para>This will get the size for a directory</para>
/// <para>Can be lengthy to complete on complex folder structures</para>
/// </summary>
/// <param name="pathToDirectory">The path to the remote directory</param>
public ulong GetDirectorySize(string pathToDirectory)
{
    try
    {
        var client = Settings.Variables.FtpClient;
        ulong size = 0;

        if (!IsConnected)
            return 0;

        var dirList = client.GetDirectoryList(pathToDirectory);
        foreach (var item in dirList)
        {
            if (item.IsDirectory)
                size += GetDirectorySize(string.Format("{0}/{1}", pathToDirectory, item.Name));
            else
                size += item.Size;
        }

        return size;
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }

    return 0;
}
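A hypothetical usage from within the same class, assuming the client in Settings.Variables.FtpClient is already connected (the remote path is a placeholder):

// Returns the total size in bytes of everything under /public_html.
ulong totalBytes = GetDirectorySize("/public_html");
Console.WriteLine("Total: {0} bytes", totalBytes);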
The simplest and most efficient way to get an FTP directory's size, including all of its contents, recursively:
var size = FtpHelper.GetFtpDirectorySize("ftpURL", "userName", "password");
using System;
using System.Collections.Generic;
using System.IO;
using System.Net;
using System.Threading;
using System.Threading.Tasks;

public static class FtpHelper
{
    public static long GetFtpDirectorySize(Uri requestUri, NetworkCredential networkCredential, bool recursive = true)
    {
        // Get the files/directories contained in the CURRENT directory.
        var directoryContents = GetFtpDirectoryContents(requestUri, networkCredential);

        long ftpDirectorySize = default(long); // Set the initial size to the default: 0.
        var subDirectoriesList = new List<Uri>(); // Create an empty list to fill later with newly found directories.

        // Loop over every file/directory found in the CURRENT directory.
        foreach (var item in directoryContents)
        {
            // Combine the item path with the CURRENT directory path.
            var itemUri = new Uri(Path.Combine(requestUri.AbsoluteUri + "/", item));

            var fileSize = GetFtpFileSize(itemUri, networkCredential); // Get the item's file size.

            if (fileSize == default(long)) // It has no size, so it's a directory and NOT a file.
                subDirectoriesList.Add(itemUri); // Add this item's Uri to the subdirectories list to get its size later.
            else // It has a size, so it's a file.
                Interlocked.Add(ref ftpDirectorySize, fileSize); // Add the file size to the overall directory size.
        }

        if (recursive) // If recursive is true, also get the size of the subdirectories' files.
            // Get the size of each subdirectory and add it to the overall directory size.
            Parallel.ForEach(subDirectoriesList, (subDirectory) => // Loop over every subdirectory.
                Interlocked.Add(ref ftpDirectorySize, GetFtpDirectorySize(subDirectory, networkCredential, recursive)));

        return ftpDirectorySize; // Return the overall directory size.
    }

    public static long GetFtpDirectorySize(string requestUriString, string userName, string password, bool recursive = true)
    {
        // Initialize Uri/NetworkCredential objects and call the other overload to centralize the code.
        return GetFtpDirectorySize(new Uri(requestUriString), GetNetworkCredential(userName, password), recursive);
    }

    public static long GetFtpFileSize(Uri requestUri, NetworkCredential networkCredential)
    {
        // Create an FtpWebRequest configured to get the file size.
        var ftpWebRequest = GetFtpWebRequest(requestUri, networkCredential, WebRequestMethods.Ftp.GetFileSize);
        try
        {
            using (var ftpWebResponse = (FtpWebResponse)ftpWebRequest.GetResponse())
                return ftpWebResponse.ContentLength; // On success, return the file size.
        }
        catch (Exception) { return default(long); } // On failure, return the default value to check later.
    }

    public static List<string> GetFtpDirectoryContents(Uri requestUri, NetworkCredential networkCredential)
    {
        var directoryContents = new List<string>(); // Create an empty list to fill later.

        // Create an FtpWebRequest configured to list the directory contents.
        var ftpWebRequest = GetFtpWebRequest(requestUri, networkCredential, WebRequestMethods.Ftp.ListDirectory);

        using (var ftpWebResponse = (FtpWebResponse)ftpWebRequest.GetResponse()) // Execute the FtpWebRequest and get its response.
        using (var streamReader = new StreamReader(ftpWebResponse.GetResponseStream())) // Read the directory contents as a stream.
        {
            string line; // Current line of the listing.
            while (!string.IsNullOrEmpty(line = streamReader.ReadLine())) // Read the stream line by line.
                directoryContents.Add(line); // Add the current line to the directory contents list.
        }

        return directoryContents; // Return the full list of directory contents: files/subdirectories.
    }

    public static FtpWebRequest GetFtpWebRequest(Uri requestUri, NetworkCredential networkCredential, string method = null)
    {
        var ftpWebRequest = (FtpWebRequest)WebRequest.Create(requestUri); // Create an FtpWebRequest with the given request Uri.
        ftpWebRequest.Credentials = networkCredential; // Set the credentials of the current FtpWebRequest.

        if (!string.IsNullOrEmpty(method))
            ftpWebRequest.Method = method; // Set the method of the FtpWebRequest in case it has a value.

        return ftpWebRequest; // Return the configured FtpWebRequest.
    }

    public static NetworkCredential GetNetworkCredential(string userName, string password)
    {
        // Create and return a NetworkCredential object with the given user name and password.
        return new NetworkCredential(userName, password);
    }
}
As the answer by @FranckDernoncourt shows, if you want a GUI tool, you can use the WinSCP GUI, particularly its folder properties dialog.
If you need code, you can use WinSCP too. With the WinSCP .NET assembly and its Session.EnumerateRemoteFiles method, it is easy to implement in many languages, including C#.
It is also doable with the .NET built-in FtpWebRequest, but that's a lot more work.
Both are covered in How to get a directories file size from an FTP protocol in a .NET application.
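As a rough illustration, a sketch of the WinSCP .NET assembly approach might look like this. It assumes the WinSCPnet package is referenced; the host name, credentials, and remote path are placeholders:

using System;
using System.Linq;
using WinSCP;

class RemoteFolderSize
{
    static void Main()
    {
        var sessionOptions = new SessionOptions
        {
            Protocol = Protocol.Ftp,
            HostName = "example.com", // placeholder
            UserName = "username",    // placeholder
            Password = "password",    // placeholder
        };

        using (var session = new Session())
        {
            session.Open(sessionOptions);

            // Enumerate everything under the remote directory and sum the file sizes.
            long totalBytes = session
                .EnumerateRemoteFiles("/remote/path", null, EnumerationOptions.AllDirectories)
                .Where(fileInfo => !fileInfo.IsDirectory)
                .Sum(fileInfo => fileInfo.Length);

            Console.WriteLine("Total size: {0} bytes", totalBytes);
        }
    }
}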
You can use the FileZilla client. Download here: https://filezilla-project.org/download.php?type=client
If you want a more readable size format, go to:
Edit -> Settings -> Interface -> Filesize format -> Size formatting -> select Binary prefixes using SI symbols.
When you select a directory, you can see its size.
Just use the FTP SIZE command... (note that SIZE reports the size of a single file, so for a folder you still have to walk its contents and sum the results).
Related
Question: I want to get file details from an FTP server based on a specific datetime, without using any 3rd-party library.
Problem: My FTP server contains thousands of files, so getting all of them and filtering afterwards takes a long time.
Is there any quicker way to do this?
string ftpPath = "ftp://directory/";

// Some expression to match against the files... do they have a consistent
// name? This example would find XML files whose names start with 'test'.
Regex matchExpression = new Regex(@"^test.+\.xml$", RegexOptions.IgnoreCase);

// Date filter
DateTime cutOff = DateTime.Now.AddDays(-10);

List<FTPLineResult> results = FTPHelper.GetFilesListSortedByDate(ftpPath, matchExpression, cutOff);
public static List<FTPLineResult> GetFilesListSortedByDate(string ftpPath, Regex nameRegex, DateTime cutoff)
{
    List<FTPLineResult> output = new List<FTPLineResult>();

    FtpWebRequest request = FtpWebRequest.Create(ftpPath) as FtpWebRequest;
    ConfigureProxy(request);
    request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;

    FtpWebResponse response = request.GetResponse() as FtpWebResponse;
    StreamReader directoryReader = new StreamReader(response.GetResponseStream(), System.Text.Encoding.ASCII);

    var parser = new FTPLineParser();
    while (!directoryReader.EndOfStream)
    {
        var result = parser.Parse(directoryReader.ReadLine());
        if (!result.IsDirectory && result.DateTime > cutoff && nameRegex.IsMatch(result.Name))
        {
            output.Add(result);
        }
    }

    // Need to ensure the files are sorted in ascending date order.
    output.Sort(
        new Comparison<FTPLineResult>(
            delegate(FTPLineResult res1, FTPLineResult res2)
            {
                return res1.DateTime.CompareTo(res2.DateTime);
            }
        )
    );

    return output;
}
Problem: My FTP server contains thousands of files, so getting all of them and filtering afterwards takes a long time.
Is there any quicker way to do this?
No.
The only standard FTP API is the LIST command and its companions. All of these give you a list of all files in a folder. There's no FTP API that gives you files filtered by a timestamp.
Some servers support non-standard file masks in the LIST command, so they will allow you to return only the *.xml files.
See How to get list of files based on pattern matching using FTP?
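For servers that do honor such masks, a rough sketch of the trick (passing the wildcard as part of the request URL) might look like this; whether it works depends entirely on the server, so treat it as an illustration only. The host, credentials, and path are placeholders:

using System;
using System.IO;
using System.Net;

class WildcardListing
{
    static void Main()
    {
        // The wildcard is appended to the URL; servers that honor masks in
        // LIST/NLST will return only matching entries.
        var request = (FtpWebRequest)WebRequest.Create("ftp://example.com/directory/*.xml");
        request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
        request.Credentials = new NetworkCredential("username", "password");

        using (var response = (FtpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
                Console.WriteLine(line); // one listing line per matching entry
        }
    }
}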
Similar questions:
Download files from FTP if they are created within the last hour
C# - Download files from FTP which have higher last-modified date
I have found an alternative solution for my requirement using FluentFTP.
Explanation:
I am downloading the files from FTP (read permission required) with the same folder structure.
So every time the job/service runs, I can check whether the same file (full path) already exists in the physical path. If it does not exist, it can be considered a new file; I can then take some action for it and download it as well.
It's just an alternative solution.
Code Changes:
private static void GetFiles()
{
    using (FtpClient conn = new FtpClient())
    {
        string ftpPath = "ftp://myftp/";
        string downloadFileName = @"C:\temp\FTPTest\";

        conn.Host = ftpPath;
        //conn.Credentials = new NetworkCredential("ftptest", "ftptest");
        conn.Connect();

        // Get all directories and files, recursively, with modification times.
        foreach (FtpListItem item in conn.GetListing(conn.GetWorkingDirectory(),
            FtpListOption.Modify | FtpListOption.Recursive))
        {
            // If this is a file...
            if (item.Type == FtpFileSystemObjectType.File)
            {
                string localFilePath = downloadFileName + item.FullName;

                // Only newly created files will be downloaded.
                if (!File.Exists(localFilePath))
                {
                    conn.DownloadFile(localFilePath, item.FullName);
                    // Do any action here.
                    Console.WriteLine(item.FullName);
                }
            }
        }
    }
}
Using this article from MSDN, I'm trying to search through files in a directory. The problem is, every time I execute the program, I get:
"An unhandled exception of type 'System.OutOfMemoryException' occurred in mscorlib.dll".
I have tried some other options like StreamReader, but I can't get it to work. These files are HUGE. Some of them range upwards of 1.5-2 GB each, and there could be 5 or more files per day.
This code fails:
private static string GetFileText(string name)
{
    var fileContents = string.Empty;

    // If the file has been deleted since we took
    // the snapshot, ignore it and return the empty string.
    if (File.Exists(name))
    {
        fileContents = File.ReadAllText(name);
    }

    return fileContents;
}
Any ideas what could be happening or how to make it read without memory errors?
Entire code (in case you don't want to open the MSDN article)
class QueryContents
{
    public static void Main()
    {
        // Modify this path as necessary.
        string startFolder = @"c:\program files\Microsoft Visual Studio 9.0\";

        // Take a snapshot of the file system.
        System.IO.DirectoryInfo dir = new System.IO.DirectoryInfo(startFolder);

        // This method assumes that the application has discovery permissions
        // for all folders under the specified path.
        IEnumerable<System.IO.FileInfo> fileList = dir.GetFiles("*.*", System.IO.SearchOption.AllDirectories);

        string searchTerm = @"Visual Studio";

        // Search the contents of each file.
        // A regular expression created with the Regex class
        // could be used instead of the Contains method.
        // queryMatchingFiles is an IEnumerable<string>.
        var queryMatchingFiles =
            from file in fileList
            where file.Extension == ".htm"
            let fileText = GetFileText(file.FullName)
            where fileText.Contains(searchTerm)
            select file.FullName;

        // Execute the query.
        Console.WriteLine("The term \"{0}\" was found in:", searchTerm);
        foreach (string filename in queryMatchingFiles)
        {
            Console.WriteLine(filename);
        }

        // Keep the console window open in debug mode.
        Console.WriteLine("Press any key to exit");
        Console.ReadKey();
    }

    // Read the contents of the file.
    static string GetFileText(string name)
    {
        string fileContents = String.Empty;

        // If the file has been deleted since we took
        // the snapshot, ignore it and return the empty string.
        if (System.IO.File.Exists(name))
        {
            fileContents = System.IO.File.ReadAllText(name);
        }

        return fileContents;
    }
}
The problem you're having is based on trying to load multiple gigabytes of text at the same time. If they're text files, you can stream them and just compare one line at a time.
var queryMatchingFiles =
    from file in fileList
    where file.Extension == ".htm"
    let fileLines = File.ReadLines(file.FullName) // lazy IEnumerable<string>
    where fileLines.Any(line => line.Contains(searchTerm))
    select file.FullName;
I would suggest that you are getting an out-of-memory error because of the way the query is written: I believe it has to load the entire text of every file into memory, and none of those strings can be released until the entire file set has been processed. Could you not check for the search term inside the GetFileText function and just return true or false?
If you did that, the file text at least falls out of scope at the end of the function and the GC can recover the memory. It would be better still to rewrite it as a streaming function when you are dealing with large files; then you could stop reading as soon as you come across the search term, and you would never need the entire file in memory. A sketch of that streaming version is shown after the link below.
Previous question on finding a term in an HTML file using a stream
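For illustration, a minimal sketch of such a streaming check might look like this (FileContainsTerm is a hypothetical helper name; it streams the file line by line and returns as soon as the term is found):

using System.IO;

static class FileSearch
{
    // Streams the file line by line and stops as soon as the term is found,
    // so at most one line is held in memory at a time.
    public static bool FileContainsTerm(string path, string searchTerm)
    {
        if (!File.Exists(path))
            return false; // file vanished since the snapshot was taken

        foreach (string line in File.ReadLines(path))
        {
            if (line.Contains(searchTerm))
                return true; // early exit, no need to read the rest
        }

        return false;
    }
}

The LINQ query could then use "where FileSearch.FileContainsTerm(file.FullName, searchTerm)" instead of materializing the whole file text.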
I am trying to create a torrent for the files on my desktop using MonoTorrent. I have tried the code below.
I am able to get the byte code, but I am not able to save it as a torrent; it shows access denied.
string path = "C:/Users/snovaspace12/Desktop/monotorrent-0.90/files";
string savepath = "D:/results";

TorrentCreator nnnn = new TorrentCreator();
nnnn.CreateTorrent(path, savepath);
public void CreateTorrent(string path, string savePath)
{
    // The class used for creating the torrent
    TorrentCreator c = new TorrentCreator();

    // Add one tier which contains two trackers
    //RawTrackerTier tier = new RawTrackerTier();
    //tier.Add("http://localhost/announce");
    //c.Announces.Add(tier);

    c.Comment = "This is the comment";
    c.CreatedBy = "Doug using " + VersionInfo.ClientVersion;
    c.Publisher = "www.aaronsen.com";

    // Set the torrent as private so it will not use DHT or peer exchange.
    // Generally you will not want to set this.
    c.Private = true;

    // Every time a piece has been hashed, this event will fire. It is an
    // asynchronous event, so you have to handle threading yourself.
    c.Hashed += delegate(object o, TorrentCreatorEventArgs e)
    {
        Console.WriteLine("Current File is {0}% hashed", e.FileCompletion);
        Console.WriteLine("Overall {0}% hashed", e.OverallCompletion);
        Console.WriteLine("Total data to hash: {0}", e.OverallSize);
    };

    // ITorrentFileSource can be implemented to provide the TorrentCreator
    // with a list of files which will be added to the torrent metadata.
    // The default implementation takes a path to a single file or a path
    // to a directory. If the path is a directory, all files will be
    // recursively added.
    ITorrentFileSource fileSource = new TorrentFileSource(path);

    // Create the torrent file and save it directly to the specified path.
    // Different overloads of 'Create' can be used to save the data to a Stream
    // or just return it as a BEncodedDictionary (its native format) so it can be
    // processed in memory.
    c.Create(fileSource, savePath);
}
public void Create(ITorrentFileSource fileSource, string savePath)
{
    Check.SavePath(savePath);

    var file = Create(fileSource); // get the BEncodedDictionary
    File.WriteAllBytes(savePath, file.Encode()); // getting the exception here
}
When I checked, the byte array is returned properly, but writing it to the file shows "access is denied".
You’ve probably solved this already but I just encountered the same issue. The solution, at least in my case, was pretty simple.
The problem originated with the savePath parameter in c.Create(fileSource, savePath);
I assumed savePath was a directory where the torrent would be saved. It should be a file path instead. For example, savePath = @"C:\pathtomytorrents\content.torrent"
Hopefully that works for you!
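Applied to the code from the question, the fix is simply to pass a full .torrent file name (the path below is a placeholder):

// savePath must point to the .torrent file itself, not to the output directory.
string savepath = @"D:\results\files.torrent";
nnnn.CreateTorrent(path, savepath);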
I am trying to get the last modified date from an uploaded file, but I need its path. Could someone please show me how I can get the file path?
[HttpGet]
public string uploadfile(string token, string filenameP, DateTime modDate, HttpPostedFileBase file)
{
    MemoryStream target = new MemoryStream();
    file.InputStream.CopyTo(target);
    byte[] data = target.ToArray();

    //ModDate = File.GetLastWriteTimeUtc("Path");
}
You are creating a new file on the server when you upload. The last modified date will be "now" (the time the file is created). There is no way to snoop the user's machine to get this information (which is not part of the file itself). Can't be done with an HTTP form upload.
Now, some file types may contain metadata in the file which may have pertinent information. If you know the file type and it does contain such metadata then you can open the file and have a look.
You just don't. Most (if not all) browsers do not provide this information for security reasons in internet scenarios.
You can read the date with JavaScript (HTML5) and send it as a hidden input field of the form.
Something like
<script>
function handleFileSelect(evt) {
    var files = evt.target.files; // FileList object

    // files is a FileList of File objects. List some properties.
    var output = [];
    for (var i = 0, f; f = files[i]; i++) {
        output.push('<li>' + (f.lastModifiedDate ? f.lastModifiedDate.toLocaleDateString() : 'n/a') + '</li>');
    }
    document.getElementById('list').innerHTML = '<ul>' + output.join('') + '</ul>';
}

document.getElementById('files').addEventListener('change', handleFileSelect, false);
</script>
http://www.html5rocks.com/en/tutorials/file/dndfiles/
I have an array that contains some FTP paths, as follows:
"ftp://ip/directory/directory1",
"ftp://ip/directory/directory2",
"ftp://ip/directory/file.txt",
"ftp://ip/directory/directory3",
"ftp://ip/directory/another_file.csv"
How can I find out if a path is a file or a directory?
Thanks in advance.
Use the LIST command (see RFC 959) to get the details about items under the specified path. Taking FileZilla Server as an example, the LIST command returns the standard Linux permission format, which you can find here. The first letter indicates whether the requested path is a file or a directory. A simple library written in C# can also be found here.
I had the same problem. I worked off of Hugh's answer. You need to make an FtpWebRequest like:
request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
Grab it from a StreamReader and stick it in a string:
StreamReader reader = new StreamReader(responseStream);
string directoryRaw = null;

try
{
    while (reader.Peek() != -1)
    {
        directoryRaw += reader.ReadLine() + "|";
    }
}
catch (Exception ex)
{
    Console.WriteLine(ex.ToString());
}
When you print this, it is going to look like:
|-rw-r--r-- 1 user user 1699 Jun  1  2015 404.shtml
|drwxr-xr-x 2 user user 4096 Sep  8 19:39 cgi-bin
|drwxr-xr-x 2 user user 4096 Nov  3 10:52 css
These are separated by |, so that will be the delimiter for a string split.
If an entry starts with a 'd' and not a '-', it's a directory; otherwise it's a file.
These lines are all the same width before the file name, so take a substring of each entry starting at position 62 to the end, and that will be the file name.
Hope it helps
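A small sketch of that parsing, under the same assumptions the answer makes (Unix-style listing, entries joined with '|'; the fixed column offset can differ between servers):

using System;

static class ListingParser
{
    // Splits the raw '|'-joined listing (directoryRaw from the snippet above)
    // and classifies each entry as a file or a directory.
    public static void PrintEntries(string directoryRaw)
    {
        foreach (string entry in directoryRaw.Split(new[] { '|' }, StringSplitOptions.RemoveEmptyEntries))
        {
            bool isDirectory = entry.StartsWith("d"); // 'd' => directory, '-' => regular file

            // The answer assumes the name starts at a fixed column (62); this varies by server.
            string name = entry.Length > 62 ? entry.Substring(62) : entry;

            Console.WriteLine("{0}: {1}", isDirectory ? "DIR " : "FILE", name);
        }
    }
}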
There's no direct way.
Indirectly, you could assume that names without a period "." are directories, but that is not always going to be true.
It is best to write the code that consumes these paths defensively, e.g. treat the path as a directory first, and if the FTP server reports an error, treat it as a file.
One way to do it, if we can assume that all files end in an extension and all directories do not, is to use the System.IO.Path.GetExtension() method like this:
public bool IsDirectory(string directory)
{
    if (directory == null)
    {
        throw new ArgumentNullException(); // or however you want to handle null values
    }

    // GetExtension(string) returns string.Empty when no extension is found.
    return System.IO.Path.GetExtension(directory) == string.Empty;
}
You can use System.IO.Path.GetExtension(path) as a way to check whether your path has a file extension.
Given "ftp://ip/directory/directory1" or "ftp://ip/directory/directory2/", GetExtension will return String.Empty.
This isn't foolproof, though: a file without an extension would break it completely, and a directory with a period in its name could cause issues.
I have found "hack" how to determine target type.
If you will use
request.Method = WebRequestMethods.Ftp.GetFileSize;
on a folder, it will result in Exception
Unhandled Exception: System.Net.WebException: The remote server
returned an erro r: (550) File unavailable (e.g., file not found, no
access).
But using it on file, it will naturally return its size.
I have create sample code for such method.
static bool IsFile(string ftpPath)
{
    var request = (FtpWebRequest)WebRequest.Create(ftpPath);
    request.Method = WebRequestMethods.Ftp.GetFileSize;
    request.Credentials = new NetworkCredential("foo", "bar");

    try
    {
        using (var response = (FtpWebResponse)request.GetResponse())
        using (var responseStream = response.GetResponseStream())
        {
            return true;
        }
    }
    catch (WebException)
    {
        return false;
    }
}
You might want to alter it, because this one will catch any FTP error.
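A hypothetical usage against the sample paths from the question (the host and the hard-coded credentials in the sample are placeholders):

bool isFile = IsFile("ftp://ip/directory/file.txt");   // expected: true
bool isDir  = IsFile("ftp://ip/directory/directory1"); // expected: false (550 on GetFileSize)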
I had the same problem, so I used GetFileSize to check whether it's a file or a directory:
var isFile = FtpHelper.IsFile("ftpURL", "userName", "password");
using System;
using System.Net;

public static class FtpHelper
{
    public static bool IsFile(Uri requestUri, NetworkCredential networkCredential)
    {
        return GetFtpFileSize(requestUri, networkCredential) != default(long); // It's a directory if it has no size.
    }

    public static bool IsFile(string requestUriString, string userName, string password)
    {
        // Overload matching the usage above: build the Uri/NetworkCredential and delegate.
        return IsFile(new Uri(requestUriString), new NetworkCredential(userName, password));
    }

    public static FtpWebRequest GetFtpWebRequest(Uri requestUri, NetworkCredential networkCredential, string method = null)
    {
        var ftpWebRequest = (FtpWebRequest)WebRequest.Create(requestUri); // Create an FtpWebRequest with the given request Uri.
        ftpWebRequest.Credentials = networkCredential; // Set the credentials of the current FtpWebRequest.

        if (!string.IsNullOrEmpty(method))
            ftpWebRequest.Method = method; // Set the method of the FtpWebRequest in case it has a value.

        return ftpWebRequest; // Return the configured FtpWebRequest.
    }

    public static long GetFtpFileSize(Uri requestUri, NetworkCredential networkCredential)
    {
        // Create an FtpWebRequest configured to get the file size.
        var ftpWebRequest = GetFtpWebRequest(requestUri, networkCredential, WebRequestMethods.Ftp.GetFileSize);
        try
        {
            using (var ftpWebResponse = (FtpWebResponse)ftpWebRequest.GetResponse())
                return ftpWebResponse.ContentLength; // On success, return the file size.
        }
        catch (Exception) { return default(long); } // On failure, return the default value to check later.
    }
}
}