I am listing all directories on a drive in a console app, and it takes forever - well over 10 minutes (I didn't time it exactly, but it was at least that long). I know there are a ton of directories, but is 10+ minutes too long to begin with?
class Program
{
    static void Main(string[] args)
    {
        DirSearch(@"c:\");
        Console.ReadKey();
    }

    static void DirSearch(string sDir)
    {
        try
        {
            foreach (string d in Directory.GetDirectories(sDir))
            {
                Console.WriteLine(d);
                DirSearch(d);
            }
        }
        catch (System.Exception excpt)
        {
            Console.WriteLine(excpt.Message);
        }
    }
}
Yes - don't write the recursion yourself. This is built in; you can use SearchOption.AllDirectories to include all subdirectories in your search:
foreach (string d in Directory.GetDirectories(sDir, "*.*", SearchOption.AllDirectories))
{
    Console.WriteLine(d);
}
Or alternatively use Directory.EnumerateDirectories which yields directory names as it finds them instead of putting all of them in an array first:
foreach (string d in Directory.EnumerateDirectories(sDir, "*.*", SearchOption.AllDirectories))
{
    Console.WriteLine(d);
}
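One caveat: with SearchOption.AllDirectories, a single inaccessible folder (e.g. access denied) can throw and stop the whole enumeration, whereas your per-directory try/catch kept going. If that matters, here is a minimal sketch of an iterative walk that keeps per-directory error handling (illustrative only, not the only way to do it; needs using System, System.Collections.Generic and System.IO):

static void ListAllDirectories(string root)
{
    var pending = new Stack<string>();
    pending.Push(root);

    while (pending.Count > 0)
    {
        string current = pending.Pop();
        string[] children;
        try
        {
            children = Directory.GetDirectories(current);
        }
        catch (UnauthorizedAccessException)
        {
            continue; // skip folders we are not allowed to read
        }

        foreach (string child in children)
        {
            Console.WriteLine(child);
            pending.Push(child);
        }
    }
}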
The folder C:\Users contains 3 subfolders:
C:\Users\hacen
C:\Users\_rafi_000
C:\Users\Public
However, when I call:
DirSearch(@"C:\Users\", "*.jpg");
It outputs all .jpg filenames from Public and hacen, but not from _rafi_000, which is the current user's folder.
Here is the function :
static void DirSearch(string dir, string pattern)
{
    try
    {
        foreach (string f in Directory.GetFiles(dir, pattern))
        {
            Console.WriteLine(f);
        }
        foreach (string d in Directory.GetDirectories(dir))
        {
            DirSearch(d, pattern);
        }
    }
    catch (System.Exception ex)
    {
        //MessageBox.Show(ex.Message);
    }
}
EDIT:
I tried with the code below and it works, so it isn't an access-denied problem:
DirSearch(@"C:\Users\_rafi_000\", "*.jpg");
What I noticed so far is that, unlike the other subfolders, the folder _rafi_000 cannot be renamed when I press F2.
Would this work?
void DirSearch(string dir)
{
    foreach (string f in Directory.GetFiles(dir, "*.JPG"))
    {
        Console.WriteLine(f);
    }
    foreach (string d in Directory.GetDirectories(dir))
    {
        DirSearch(d);
    }
}
Could be related to where the JPGs are stored and how reparse points work in later Windows versions.
I suggest looking at:
Directory Searching: http://msdn.microsoft.com/en-us/library/bb513869.aspx
Reparse Point Info: http://msdn.microsoft.com/en-us/library/aa365503(VS.85).aspx
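If a reparse point (junction/symlink) does turn out to be involved, a minimal hedged sketch of skipping them during the recursion could look like this - same DirSearch shape as above, just checking the standard FileAttributes.ReparsePoint flag:

static void DirSearchSkippingReparsePoints(string dir, string pattern)
{
    foreach (string f in Directory.GetFiles(dir, pattern))
    {
        Console.WriteLine(f);
    }

    foreach (string d in Directory.GetDirectories(dir))
    {
        // Junctions and symlinks carry the ReparsePoint attribute; don't follow them.
        if ((File.GetAttributes(d) & FileAttributes.ReparsePoint) != 0)
        {
            continue;
        }
        DirSearchSkippingReparsePoints(d, pattern);
    }
}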
I ran your code and it works fine on Windows XP:
C:\Users\hacen\bar.jpg
C:\Users\Public\bar1.jpg
C:\Users\_rafi_000\bar2.jpg
Your code is correct.
Perhaps Process Monitor can help?
If the code is fine, it must be something else. I understand that you can run the code directly against the directory that is causing the problem (which is surprising), but I think Process Monitor could help.
How can I append f to a list each time, then write the list into a text file? I tried the following code, but the text file is always empty.
static void DirSearch(string dir, string pattern)
{
    try
    {
        List<String> filesList = new List<string>();
        foreach (string f in Directory.GetFiles(dir, pattern))
        {
            // Console.WriteLine(f); // <- this works correctly
            filesList.Add(f);
        }
        foreach (string d in Directory.GetDirectories(dir))
        {
            //Console.WriteLine(d);
            DirSearch(d, pattern);
        }
        File.WriteAllLines("files.txt", filesList.ToArray());
    }
    catch (System.Exception ex)
    {
        //Console.WriteLine(ex.Message);
    }
}
I suspect the problem is that on each invocation of DirSearch, you're overwriting the file output from previous invocations. If the final invocation has no files (only directories) you'll end up with an empty file.
Options:
Use Directory.GetFiles(dir, pattern, SearchOption.AllDirectories) so you don't need to use recursion at all (see the sketch just after this list).
Use File.AppendAllLines instead of File.WriteAllLines
Build up the list entirely in memory (recursively) and only at the very end call File.WriteAllLines.
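For the first option, a minimal sketch (same method shape, no recursion) could be:

static void DirSearch(string dir, string pattern)
{
    // One call collects every match below dir, and the file is written exactly once.
    string[] files = Directory.GetFiles(dir, pattern, SearchOption.AllDirectories);
    File.WriteAllLines("files.txt", files);
}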
The third approach (building the list in memory) would look something like this:
public static void DirSearch(string dir, string pattern)
{
    List<string> files = new List<string>();
    DirSearchImpl(dir, pattern, files);
    File.WriteAllLines("files.txt", files);
}

private static void DirSearchImpl(string dir, string pattern, List<string> files)
{
    // Simpler than your previous loop...
    files.AddRange(Directory.GetFiles(dir, pattern));

    foreach (var subdirectory in Directory.GetDirectories(dir))
    {
        DirSearchImpl(subdirectory, pattern, files);
    }
}
I'd also suggest changing your exception handling - catching Exception is rarely a good idea, and I'm not sure you really want to keep going if things fail, do you?
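If you do want to keep going past folders you cannot read, a hedged variant of DirSearchImpl could catch only UnauthorizedAccessException and let everything else bubble up (illustrative, not necessarily what you want):

private static void DirSearchImpl(string dir, string pattern, List<string> files)
{
    string[] subdirectories;
    try
    {
        files.AddRange(Directory.GetFiles(dir, pattern));
        subdirectories = Directory.GetDirectories(dir);
    }
    catch (UnauthorizedAccessException)
    {
        return; // skip just this folder; other failures still surface
    }

    foreach (var subdirectory in subdirectories)
    {
        DirSearchImpl(subdirectory, pattern, files);
    }
}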
I wrote a method that needs to find all files within a path, and I want to get all the files using recursion. Here's my current method:
public void doStart(DirectoryInfo dir, string filePattern)
{
    try
    {
        foreach (FileInfo fileInfo in dir.GetFiles(filePattern))
        {
            if (fileFound != null)
            {
                fileFound(fileInfo);
            }
        }
    }
    catch (Exception)
    {
    }

    try
    {
        foreach (DirectoryInfo dirInfo in dir.GetDirectories())
        {
            doStart(dirInfo, filePattern);
        }
    }
    catch (Exception)
    {
    }
}

public void Start(string path, string filePattern)
{
    doStart(new DirectoryInfo(path), filePattern);
}
Is there a better way to write this kind of recursion, or is this good enough?
Try something like this:
string[] filePaths = Directory.GetFiles(dir, "*.filetype", SearchOption.AllDirectories);
This recursively looks through the directory, finds all files with a certain file type ('.filetype') and returns a string array containing all the files found.
Also, I'd recommend not using empty catch blocks, as your application won't let you know when something goes wrong. Either show a message box (or something similar) or log it to a database or a file.
Further, what would your DoStart() method do if there is a subdirectory in a subdirectory? From what I'm seeing, I'd say it only searches on 1 sublevel.
Don't swallow all exceptions. If you need to ignore specific exceptions, catch those but let others bubble up
(style) Methods should be PascalCased (e.g. DoStart and FileFound)
(style) Create an OnFileFound method instead of calling FileFound directly (I assume fileFound is an event handler?)
Other than that it looks fine to me.
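Putting those style points together, a rough sketch might look like the following (this assumes fileFound is meant to be an event of type Action<FileInfo>; adapt it to your actual delegate type):

public event Action<FileInfo> FileFound;

protected virtual void OnFileFound(FileInfo file)
{
    var handler = FileFound;
    if (handler != null)
    {
        handler(file);
    }
}

private void DoStart(DirectoryInfo dir, string filePattern)
{
    try
    {
        foreach (FileInfo fileInfo in dir.GetFiles(filePattern))
        {
            OnFileFound(fileInfo);
        }
        foreach (DirectoryInfo dirInfo in dir.GetDirectories())
        {
            DoStart(dirInfo, filePattern);
        }
    }
    catch (UnauthorizedAccessException)
    {
        // Skip directories we cannot read; anything else is left to bubble up.
    }
}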
Here is an example of true recursion. This will search until there are no more subdirectories to find, unlike Directory.GetFiles with SearchOption.AllDirectories. You can modify it to take search filters as a parameter.
public IEnumerable<string> GetFilesRecursive(string ParentDirectory)
{
    string[] subDirectories = Directory.GetDirectories(ParentDirectory);

    foreach (string file in Directory.GetFiles(ParentDirectory))
    {
        yield return file;
    }

    foreach (string subDirectory in subDirectories)
    {
        foreach (string file in GetFilesRecursive(subDirectory))
        {
            yield return file;
        }
    }
}
Here is my code:
private static void TreeScan(string sDir)
{
    foreach (string d in Directory.GetDirectories(sDir))
    {
        foreach (string f in Directory.GetFiles(d))
        {
            //Save file f
        }
    }
    TreeScan(d, client);
}
The problem is that it doesn't get the FILES of sDir (the starting directory); it only gets the folders and the files in the subfolders.
How can I make it get the files from sDir too?
Don't reinvent the wheel, use the overload of GetFiles that allows you to specify that it searches subdirectories.
string[] files = Directory.GetFiles(path, searchPattern, SearchOption.AllDirectories);
private static void TreeScan(string sDir)
{
    foreach (string f in Directory.GetFiles(sDir))
    {
        //Save f :)
    }
    foreach (string d in Directory.GetDirectories(sDir))
    {
        TreeScan(d);
    }
}
There are some problems with your code. For one, the reason you never saw the files from the root folder is that you recursed before doing any file reads on it. Try this:
public static void Main()
{
    TreeScan(@"C:\someFolder");
}

private static void TreeScan(string sDir)
{
    foreach (string f in Directory.GetFiles(sDir))
        Console.WriteLine("File: " + f); // or some other file processing

    foreach (string d in Directory.GetDirectories(sDir))
        TreeScan(d); // recursive call to get files of directory
}
You have to use
    Directory.GetFiles(targetDirectory);
like in This sample, which contains a complete implementation of what you're looking for.
Your GetFiles loop should be outside the GetDirectories loop. And shouldn't your TreeScan call stay inside the GetDirectories loop? In short, the code should look like this:
private static void TreeScan(string sDir)
{
    foreach (string d in Directory.GetDirectories(sDir))
    {
        TreeScan(d);
    }
    foreach (string f in Directory.GetFiles(sDir))
    {
        //Save file f
    }
}
If you are using .NET 4 or above, the EnumerateFiles method returns files lazily with efficient memory use, whereas GetFiles has to build the whole array up front, which can be expensive on big directories (or whole drives).
var files = Directory.EnumerateFiles(dir.Path, "*.*");
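To stream the entire tree rather than just one level, the same method also takes a SearchOption (a sketch; the root path here is just a placeholder):

foreach (string file in Directory.EnumerateFiles(@"c:\some\root", "*.*", SearchOption.AllDirectories))
{
    Console.WriteLine(file); // each file is yielded as it is found
}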
Possible Duplicate:
How to recursively list all the files in a directory in C#?
I want to list the "sub-path" of files and folders for a given folder (path).
Let's say I have the file C:\files\folder1\subfolder1\file.txt.
If I give the function c:\files\folder1\
I will get:
subfolder1
subfolder1\file.txt
You can use the Directory.GetFiles method to list all files in a folder:
string[] files = Directory.GetFiles(@"c:\files\folder1\",
                                    "*.*",
                                    SearchOption.AllDirectories);
foreach (var file in files)
{
Console.WriteLine(file);
}
Note that the SearchOption parameter can be used to control whether the search is recursive (SearchOption.AllDirectories) or not (SearchOption.TopDirectoryOnly).
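Since the question asks for paths relative to the given folder (e.g. subfolder1\file.txt), one way to get that "sub-path" form is simply to strip the root prefix; a hedged sketch, assuming .NET 4+ and a root that ends with a backslash:

string root = @"c:\files\folder1\";

// Reports both folders and files, relative to root:
// "subfolder1", "subfolder1\file.txt", ...
foreach (string entry in Directory.EnumerateFileSystemEntries(root, "*", SearchOption.AllDirectories))
{
    Console.WriteLine(entry.Substring(root.Length));
}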
Try something like this:
static void Main(string[] args)
{
    DirSearch(@"c:\temp");
    Console.ReadKey();
}

static void DirSearch(string dir)
{
    try
    {
        foreach (string f in Directory.GetFiles(dir))
            Console.WriteLine(f);

        foreach (string d in Directory.GetDirectories(dir))
        {
            Console.WriteLine(d);
            DirSearch(d);
        }
    }
    catch (System.Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
}
String[] subDirectories;
String[] subFiles;
subDirectories = System.IO.Directory.GetDirectories("your path here");
subFiles = System.IO.Directory.GetFiles("your path here");
Use the System.IO.Directory class and its methods
I remember solving a similar problem not too long ago on SO, although it was in VB. Here's the question.