DirectoryInfo.EnumerateFiles(...) causes UnauthorizedAccessException (and other exceptions) - C#

I have recently had a need to enumerate an entire file system looking for specific types of files for auditing purposes. This has caused me to run into several exceptions due to having limited permissions on the file system being scanned. Among them, the most prevalent have been UnauthorizedAccessException and, much to my chagrin, PathTooLongException.
These would not normally be an issue except that they invalidate the IEnumerable, preventing me from being able to complete the scan.
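To illustrate, here is a minimal sketch of the failing pattern (the drive letter is just an example): the first inaccessible or over-long path throws, and the enumeration cannot be resumed.
foreach (var file in new DirectoryInfo(@"C:\").EnumerateFiles("*.txt", SearchOption.AllDirectories))
{
    // The first UnauthorizedAccessException or PathTooLongException thrown by the
    // underlying enumerator terminates the loop; the remaining files are never seen.
    Console.WriteLine(file.FullName);
}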

In order to solve this problem, I have created a replacement File System Enumerator. Although it may not be perfect, it performs fairly quickly and traps the two exceptions that I have run into. It will find any directories or files that match the search pattern passed to it.
// This code is public domain
using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using log4net;
public class FileSystemEnumerable : IEnumerable<FileSystemInfo>
{
private ILog _logger = LogManager.GetLogger(typeof(FileSystemEnumerable));
private readonly DirectoryInfo _root;
private readonly IList<string> _patterns;
private readonly SearchOption _option;
public FileSystemEnumerable(DirectoryInfo root, string pattern, SearchOption option)
{
_root = root;
_patterns = new List<string> { pattern };
_option = option;
}
public FileSystemEnumerable(DirectoryInfo root, IList<string> patterns, SearchOption option)
{
_root = root;
_patterns = patterns;
_option = option;
}
public IEnumerator<FileSystemInfo> GetEnumerator()
{
if (_root == null || !_root.Exists) yield break;
IEnumerable<FileSystemInfo> matches = new List<FileSystemInfo>();
try
{
_logger.DebugFormat("Attempting to enumerate '{0}'", _root.FullName);
foreach (var pattern in _patterns)
{
_logger.DebugFormat("Using pattern '{0}'", pattern);
matches = matches.Concat(_root.EnumerateDirectories(pattern, SearchOption.TopDirectoryOnly))
.Concat(_root.EnumerateFiles(pattern, SearchOption.TopDirectoryOnly));
}
}
catch (UnauthorizedAccessException)
{
_logger.WarnFormat("Unable to access '{0}'. Skipping...", _root.FullName);
yield break;
}
catch (PathTooLongException ptle)
{
_logger.Warn(string.Format(@"Could not process path '{0}\{1}'.", _root.Parent.FullName, _root.Name), ptle);
yield break;
} catch (System.IO.IOException e)
{
// "The symbolic link cannot be followed because its type is disabled."
// "The specified network name is no longer available."
_logger.Warn(string.Format(@"Could not process path (check SymlinkEvaluation rules) '{0}\{1}'.", _root.Parent.FullName, _root.Name), e);
yield break;
}
_logger.DebugFormat("Returning all objects that match the pattern(s) '{0}'", string.Join(",", _patterns));
foreach (var file in matches)
{
yield return file;
}
if (_option == SearchOption.AllDirectories)
{
_logger.DebugFormat("Enumerating all child directories.");
foreach (var dir in _root.EnumerateDirectories("*", SearchOption.TopDirectoryOnly))
{
_logger.DebugFormat("Enumerating '{0}'", dir.FullName);
var fileSystemInfos = new FileSystemEnumerable(dir, _patterns, _option);
foreach (var match in fileSystemInfos)
{
yield return match;
}
}
}
}
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
}
The usage is fairly simple.
//This code is public domain
var root = new DirectoryInfo(@"c:\wherever");
var searchPattern = @"*.txt";
var searchOption = SearchOption.AllDirectories;
var enumerable = new FileSystemEnumerable(root, searchPattern, searchOption);
People are free to use it if they find it useful.
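For example, to work only with the matching files (and skip the matching directories), the result can be filtered with LINQ:
// FileSystemEnumerable yields both DirectoryInfo and FileInfo matches,
// so OfType<FileInfo>() keeps just the files.
var filesOnly = enumerable.OfType<FileInfo>();
foreach (var file in filesOnly)
    Console.WriteLine(file.FullName);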

Here's another way: manage the enumeration iteration yourself:
IEnumerator<string> errFiles=Directory.EnumerateFiles(baseDir, "_error.txt", SearchOption.AllDirectories).GetEnumerator();
while (true)
{
try
{
if (!errFiles.MoveNext())
break;
string errFile = errFiles.Current;
// processing
} catch (Exception e)
{
log.Warn("Ignoring error finding in: " + baseDir, e);
}
}

Related

C#: try foreach catch continue [duplicate]

I am trying to display a list of all files found in the selected directory (and optionally any subdirectories). The problem I am having is that when the GetFiles() method comes across a folder that it cannot access, it throws an exception and the process stops.
How do I ignore this exception (and ignore the protected folder/file) and continue adding accessible files to the list?
try
{
if (cbSubFolders.Checked == false)
{
string[] files = Directory.GetFiles(folderBrowserDialog1.SelectedPath);
foreach (string fileName in files)
ProcessFile(fileName);
}
else
{
string[] files = Directory.GetFiles(folderBrowserDialog1.SelectedPath, "*.*", SearchOption.AllDirectories);
foreach (string fileName in files)
ProcessFile(fileName);
}
lblNumberOfFilesDisplay.Enabled = true;
}
catch (UnauthorizedAccessException) { }
finally {}
You will have to do the recursion manually; don't use AllDirectories - look one folder at a time, then try getting the files from sub-dirs. Untested, but something like below (note this uses a delegate rather than building an array):
using System;
using System.IO;
static class Program
{
static void Main()
{
string path = ""; // TODO
ApplyAllFiles(path, ProcessFile);
}
static void ProcessFile(string path) {/* ... */}
static void ApplyAllFiles(string folder, Action<string> fileAction)
{
foreach (string file in Directory.GetFiles(folder))
{
fileAction(file);
}
foreach (string subDir in Directory.GetDirectories(folder))
{
try
{
ApplyAllFiles(subDir, fileAction);
}
catch
{
// swallow, log, whatever
}
}
}
}
Since .NET Standard 2.1 (.NET Core 3+, .NET 5+), you can now just do:
var filePaths = Directory.EnumerateFiles(@"C:\my\files", "*.xml", new EnumerationOptions
{
IgnoreInaccessible = true,
RecurseSubdirectories = true
});
According to the MSDN docs about IgnoreInaccessible:
Gets or sets a value that indicates whether to skip files or directories when access is denied (for example, UnauthorizedAccessException or SecurityException). The default is true.
The default value is actually true, but I've kept it here just to show the property.
The same overload is available for DirectoryInfo as well.
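For example, the DirectoryInfo version would look roughly like this (the path is illustrative):
var dir = new DirectoryInfo(@"C:\my\files");
var files = dir.EnumerateFiles("*.xml", new EnumerationOptions
{
    IgnoreInaccessible = true,       // skip files/folders that would throw access errors
    RecurseSubdirectories = true     // equivalent to SearchOption.AllDirectories
});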
This simple function works well and meets the question's requirements.
private List<string> GetFiles(string path, string pattern)
{
var files = new List<string>();
var directories = new string[] { };
try
{
files.AddRange(Directory.GetFiles(path, pattern, SearchOption.TopDirectoryOnly));
directories = Directory.GetDirectories(path);
}
catch (UnauthorizedAccessException) { }
foreach (var directory in directories)
try
{
files.AddRange(GetFiles(directory, pattern));
}
catch (UnauthorizedAccessException) { }
return files;
}
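Usage is a single call; the drive and pattern here are just examples:
var txtFiles = GetFiles(@"C:\", "*.txt");
Console.WriteLine($"Found {txtFiles.Count} files");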
A simple way to do this is to use a List for files and a Queue for directories.
It conserves memory. If you use a recursive program for the same task, a very deep directory tree could exhaust the call stack.
The output: the files added to the List are ordered top to bottom (breadth-first) through the directory tree.
public static List<string> GetAllFilesFromFolder(string root, bool searchSubfolders) {
Queue<string> folders = new Queue<string>();
List<string> files = new List<string>();
folders.Enqueue(root);
while (folders.Count != 0) {
string currentFolder = folders.Dequeue();
try {
string[] filesInCurrent = System.IO.Directory.GetFiles(currentFolder, "*.*", System.IO.SearchOption.TopDirectoryOnly);
files.AddRange(filesInCurrent);
}
catch {
// Do Nothing
}
try {
if (searchSubfolders) {
string[] foldersInCurrent = System.IO.Directory.GetDirectories(currentFolder, "*.*", System.IO.SearchOption.TopDirectoryOnly);
foreach (string _current in foldersInCurrent) {
folders.Enqueue(_current);
}
}
}
catch {
// Do Nothing
}
}
return files;
}
Steps:
Enqueue the root in the queue
In a loop, dequeue a folder, add the files in that directory to the list, and enqueue its subfolders.
Repeat until the queue is empty.
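A call might look like this (the folder path is illustrative):
List<string> allFiles = GetAllFilesFromFolder(@"C:\data", true);
Console.WriteLine($"{allFiles.Count} files found");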
See https://stackoverflow.com/a/10728792/89584 for a solution that handles the UnauthorizedAccessException problem.
All the solutions above will miss files and/or directories if any calls to GetFiles() or GetDirectories() are on folders with a mix of permissions.
Here's a full-featured, .NET 2.0-compatible implementation.
You can even alter the yielded List of files to skip over directories in the FileSystemInfo version!
(Beware null values!)
public static IEnumerable<KeyValuePair<string, string[]>> GetFileSystemInfosRecursive(string dir, bool depth_first)
{
foreach (var item in GetFileSystemInfosRecursive(new DirectoryInfo(dir), depth_first))
{
string[] result;
var children = item.Value;
if (children != null)
{
result = new string[children.Count];
for (int i = 0; i < result.Length; i++)
{ result[i] = children[i].Name; }
}
else { result = null; }
string fullname;
try { fullname = item.Key.FullName; }
catch (IOException) { fullname = null; }
catch (UnauthorizedAccessException) { fullname = null; }
yield return new KeyValuePair<string, string[]>(fullname, result);
}
}
public static IEnumerable<KeyValuePair<DirectoryInfo, List<FileSystemInfo>>> GetFileSystemInfosRecursive(DirectoryInfo dir, bool depth_first)
{
var stack = depth_first ? new Stack<DirectoryInfo>() : null;
var queue = depth_first ? null : new Queue<DirectoryInfo>();
if (depth_first) { stack.Push(dir); }
else { queue.Enqueue(dir); }
for (var list = new List<FileSystemInfo>(); (depth_first ? stack.Count : queue.Count) > 0; list.Clear())
{
dir = depth_first ? stack.Pop() : queue.Dequeue();
FileSystemInfo[] children;
try { children = dir.GetFileSystemInfos(); }
catch (UnauthorizedAccessException) { children = null; }
catch (IOException) { children = null; }
if (children != null) { list.AddRange(children); }
yield return new KeyValuePair<DirectoryInfo, List<FileSystemInfo>>(dir, children != null ? list : null);
if (depth_first) { list.Reverse(); }
foreach (var child in list)
{
var asdir = child as DirectoryInfo;
if (asdir != null)
{
if (depth_first) { stack.Push(asdir); }
else { queue.Enqueue(asdir); }
}
}
}
}
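A usage sketch for the string-based overload (the path is illustrative), showing the null checks the author warns about:
foreach (var pair in GetFileSystemInfosRecursive(@"C:\data", depth_first: false))
{
    if (pair.Value == null)
    {
        // The directory (or its name) could not be read.
        Console.WriteLine("Inaccessible: " + (pair.Key ?? "<unknown>"));
        continue;
    }
    Console.WriteLine(pair.Key + ": " + pair.Value.Length + " children");
}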
This should answer the question. I've ignored the issue of going through subdirectories; I'm assuming you have that figured out.
Of course, you don't need to have a separate method for this, but you might find it a useful place to also verify the path is valid and deal with the other exceptions that you could encounter when calling GetFiles().
Hope this helps.
private string[] GetFiles(string path)
{
string[] files = null;
try
{
files = Directory.GetFiles(path);
}
catch (UnauthorizedAccessException)
{
// might be nice to log this, or something ...
}
return files;
}
private void Processor(string path, bool recursive)
{
// leaving the recursive directory navigation out.
string[] files = this.GetFiles(path);
if (null != files)
{
foreach (string file in files)
{
this.Process(file);
}
}
else
{
// again, might want to do something when you can't access the path?
}
}
I prefer using C# framework functions, but the function I need will only be included in .NET 5.0, so I have to write it myself.
// search file in every subdirectory ignoring access errors
static List<string> list_files(string path)
{
List<string> files = new List<string>();
// add the files in the current directory
try
{
string[] entries = Directory.GetFiles(path);
foreach (string entry in entries)
files.Add(System.IO.Path.Combine(path,entry));
}
catch
{
// an exception in directory.getfiles is not recoverable: the directory is not accessible
}
// follow the subdirectories
try
{
string[] entries = Directory.GetDirectories(path);
foreach (string entry in entries)
{
string current_path = System.IO.Path.Combine(path, entry);
List<string> files_in_subdir = list_files(current_path);
foreach (string current_file in files_in_subdir)
files.Add(current_file);
}
}
catch
{
// an exception in directory.getdirectories is not recoverable: the directory is not accessible
}
return files;
}

Handle (skip) UnauthorizedAccessException in EnumerateFiles() LINQ C#

I am trying to write a high performance file system searcher that can search unindexed drives (both local and network) very fast filtering on extensions and keywords. I am trying to achieve this using C#'s DirectoryInfo.EnumerateDirectories(), DirectoryInfo.EnumerateFiles() and LINQ queries. From my testing, this is (by far) the best performing code I could find:
FileInfo[] dirFiles = dirInfo.EnumerateDirectories()
.AsParallel()
.SelectMany(di => di.EnumerateFiles("*.*", SearchOption.AllDirectories)
.Where(fi => EndsWithExtension(fi.Extension)) )
.ToArray();
However, the UnauthorizedAccessException is not handled and when thrown, crashes the entire query.
I've tried various ways as outlined on SO related to this issue, but I found that they are significantly slower in search performance. This second-best method I found working is, for example, over 20 times slower:
try {
foreach (string fileName in Directory.EnumerateFiles(dirInfo.FullName, "*.*", SearchOption.AllDirectories)) {
if (ContainsKeyword(fileName)) {
Results.Add(fileName);
}
}
} catch (Exception e) { /* skip this directory */ }
I would like to skip over the directory when it throws an exception. I've been trying to achieve this with something similar to the following, but I can't get it to work (my knowledge of LINQ and Enumerables is too limited...):
FileInfo[] dirFiles = dirInfo.EnumerateDirectories()
.AsParallel()
.SelectMany(di => di.EnumerateFiles("*.*", SearchOption.AllDirectories)
.SkipExceptions()
.Where(fi => EndsWithExtension(fi.Extension)) )
.ToArray();
public static class Extensions {
public static IEnumerable<T> SkipExceptions<T>(this IEnumerable<T> values) {
using (var enumerator = values.GetEnumerator()) {
bool next = true;
while (next) {
try {
if (enumerator.Current != null)
Console.WriteLine(enumerator.Current.ToString());
next = enumerator.MoveNext();
} catch {
continue;
}
if (next) yield return enumerator.Current;
}
}
}
}
Is it possible to handle (UnauthorizedAccess) exceptions, while still remaining as high performant as the "raw" LINQ query?
Thanks in advance for your help!
A workaround is to call it recursively instead of using SearchOption.AllDirectories. This is actually more efficient in your case because you don't need to load every file in the filesystem into an array. Start with the following helper methods:
List<string> GetDirectoriesRecursive (string parent)
{
var directories = new List<string>();
GetDirectoriesRecursive (directories, parent);
return directories;
}
void GetDirectoriesRecursive (List<string> directories, string parent)
{
directories.Add (parent);
foreach (string child in GetAuthorizedDirectories (parent))
GetDirectoriesRecursive (directories, child);
}
string[] GetAuthorizedDirectories (string dir)
{
try { return Directory.GetDirectories (dir); }
catch (UnauthorizedAccessException) { return new string[0]; }
}
string[] GetAuthorizedFiles (string dir)
{
try { return Directory.GetFiles (dir); }
catch (UnauthorizedAccessException) { return new string[0]; }
}
Then, to get the big files:
var bigFiles =
from dir in GetDirectoriesRecursive ( @"c:\" )
from file in GetAuthorizedFiles (dir)
where new FileInfo (file).Length > 100000000
select file;
Or, to get just their directories:
var foldersWithBigFiles =
from dir in GetDirectoriesRecursive ( @"c:\" )
where GetAuthorizedFiles (dir).Any (f => new FileInfo (f).Length > 100000000 )
select dir;
ANOTHER APPROACH:
string[] directories = Directory.EnumerateDirectories(@"\\testnetwork\abc$", "*.*", SearchOption.AllDirectories).Catch(typeof(UnauthorizedAccessException)).ToArray();
Here is the missing Catch extension method:
static class ExceptionExtensions
{
public static IEnumerable<TIn> Catch<TIn>(
this IEnumerable<TIn> source,
Type exceptionType)
{
using (var e = source.GetEnumerator())
while (true)
{
var ok = false;
try
{
ok = e.MoveNext();
}
catch(Exception ex)
{
if (ex.GetType() != exceptionType)
throw;
continue;
}
if (!ok)
yield break;
yield return e.Current;
}
}
}
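To tie this back to the original parallel query, the same extension can wrap the inner enumeration. This is only a sketch: it assumes the asker's EndsWithExtension helper, and whether the underlying enumerator can keep going after a caught exception should be verified on the target runtime.
FileInfo[] dirFiles = dirInfo.EnumerateDirectories()
    .AsParallel()
    .SelectMany(di => di.EnumerateFiles("*.*", SearchOption.AllDirectories)
                        .Catch(typeof(UnauthorizedAccessException))   // skip entries that throw
                        .Where(fi => EndsWithExtension(fi.Extension)))
    .ToArray();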

How to minimize file search runtime complexity?

I wrote an application which is a custom console that allows execution of various commands. One of the commands finds a file's full path from part of its name. The input data is a string containing the partial or full name of the file.
My question is: how can I minimize the search code's runtime complexity as much as possible?
Here is the command's code:
using CustomConsole.Common;
using System;
using System.Collections.Generic;
using System.IO;
namespace Shell_Commander.Commands
{
class FindFileCommand : ICommand
{
private string _findFileCommandName = "findfile";
public string Name { get { return _findFileCommandName; } set { _findFileCommandName = value; } }
public string Execute(string parameters)
{
var fileLocations = new Dictionary<string, bool>();
try
{
var splittedParameters = parameters.Split(" ");
var initialLocation = splittedParameters[0];
var fileName = splittedParameters[1];
foreach (var filePath in Directory.GetFiles(initialLocation, "*.*", SearchOption.AllDirectories))
{
fileLocations.Add(filePath, false);
if (Path.GetFileName(filePath) == fileName || Path.GetFileNameWithoutExtension(filePath) == fileName)
{
fileLocations[filePath] = true;
}
}
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
bool fileFound = false;
string returnedOutput = "";
foreach (var location in fileLocations.Keys)
{
if (fileLocations[location])
{
returnedOutput += $"The file found in path: {location}\n";
Console.Write(returnedOutput);
fileFound = true;
}
}
if (!fileFound)
{
returnedOutput = "The file not found in this path";
Console.WriteLine(returnedOutput);
return returnedOutput;
}
return returnedOutput;
}
}
}
Example - for the input parameters "c:\temp test", the output can be:
The file found in path: c:\temp\test.json
The file found in path: c:\temp\test.json
The file found in path: c:\temp\test.xml
The file found in path: c:\temp\test.json
The file found in path: c:\temp\test.xml
The file found in path: c:\temp\test\test.json
You can simplify your foreach like this:
var fileLocations = Directory.GetFiles(initialLocation, $"{fileName}.*", SearchOption.AllDirectories);
foreach (var location in fileLocations)
{
returnedOutput += $"The file found in path: {location}\n";
Console.Write(returnedOutput);
}
The rest of the code can also be simplified, for example:
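A condensed Execute along those lines might look like this. It is only a sketch under the same assumptions as the original (space-separated parameters, the ICommand interface), not a drop-in replacement:
public string Execute(string parameters)
{
    var splitParameters = parameters.Split(' ');
    var initialLocation = splitParameters[0];
    var fileName = splitParameters[1];
    var output = "";
    try
    {
        // Let the search pattern do the name matching instead of a Dictionary.
        foreach (var location in Directory.EnumerateFiles(initialLocation, $"{fileName}.*", SearchOption.AllDirectories))
            output += $"The file found in path: {location}\n";
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
    if (output == "")
        output = "The file not found in this path";
    Console.Write(output);
    return output;
}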


How to avoid using foreach loop to get the filelist for different reason

Here is what I am trying to do:
I have remote servers (e.g. svr01, svr02, svr03). Using GetFileList, I read the directory, get all the files, match them against the file name I have, and then copy them to my local drive.
If any files matched, then I am also adding them to an XML file.
I was trying to do it like below:
class Program
{
static void Main(string[] args)
{
var getfiles = new fileshare.Program();
string realname = "*main*";
string Location = "SVR01";
bool anymatch = false;
foreach (var file in getfiles.GetFileList(realname,Location))
{anymatch=true;}
if (anymatch == true)
{ baseMeta(); }
foreach (var file in getfiles.GetFileList(realname,Location))
{getfiles.copytolocal(file.FullName); }
}
private FileInfo[] GetFileList(string pattern,string Location)
{
try
{
switch (Location)
{
case "SVR01":
{
var di = new DirectoryInfo(#"\\SVR01\Dev");
return di.GetFiles(pattern);
}
case "SVR02":
{
var di = new DirectoryInfo(#"\\SVR02\Dev");
return di.GetFiles(pattern);
}
case "SVR03":
{
var di = new DirectoryInfo(#"\\SVR03\Prod");
return di.GetFiles(pattern);
}
default: throw new ArgumentOutOfRangeException();
}
}
catch(Exception ex)
{ Console.Write(ex.ToString());
return null;
}
}
private void copytolocal(string filename)
{
string nameonly = Path.GetFileName(filename);
File.Copy(filename, Path.Combine(@"c:\", nameonly), true);
}
private void baseMeta()
{
XmlWriter xmlWrite = XmlWriter.Create(@"c:\basexml");
xmlWrite.WriteStartElement("job");
xmlWrite.WriteElementString("Name", "test");
xmlWrite.WriteElementString("time", DateTime);
xmlWrite.Close();
}
}
But this piece of code worries me because I am doing the same process two times; can anyone please guide me on how to avoid this?
foreach (var file in getfiles.GetFileList(realname,Location))
{
anymatch=true;}
if (anymatch == true)
{
baseMeta();
}
foreach (var file in getfiles.GetFileList(realname,Location))
{
getfiles.copytolocal(file.FullName);
}
}
I am even trying to find out whether it matches any file, then quit the first foreach loop, generate the baseMeta(), and then go to the next foreach loop to do the rest of the process.
Using LINQ you should be able to easily change your posted code into:
var getfiles = new fileshare.Program();
string realname = "*main*";
string Location = "SVR01";
var fileList = getfiles.GetFileList(realname, Location);
var anymatch = fileList.Any();
if (anymatch) // Or possibly `if (fileList.Any())` if anymatch isn't
// really used anywhere else
baseMeta();
foreach (var file in fileList)
getfiles.copytolocal(file.FullName);
You'll get the greatest benefit by replacing your GetFileList method with:
private IEnumerable<FileInfo> GetFileList(string pattern,string Location)
{
string directory = string.Empty;
switch (Location)
{
case "SVR01":
directory = #"\\SVR01\Dev";
break;
case "SVR02":
directory = #"\\SVR02\Dev";
break;
case "SVR03":
directory = #"\\SVR03\Prod");
break;
default:
throw new ArgumentOutOfRangeException();
}
DirectoryInfo di = null;
try
{
di = new DirectoryInfo(directory);
}
catch(Exception ex)
{
Console.WriteLine(ex.Message);
yield break;
}
foreach(var fi in di.EnumerateFiles(pattern))
yield return fi;
}
Use this
var files = getfiles.GetFileList(realname, Location);
if (files.Length > 0)
{
baseMeta();
foreach(var file in files)
{
getfiles.copytolocal(file.FullName);
}
}
Try this:
Create a method to check for file existence and do everything in a single loop.
Your statement is not very clear about when you will copy and when you won't; use your own condition to decide when to copy or create an XML entry.
What is your anymatch for? If you just want to check whether there is any file, then use:
var fileList = getfiles.GetFileList(realname,Location);
if( fileList.Count() > 0)
{
baseMeta();
}
foreach (var file in fileList)
{
// copy the file if match does not exist..
getfiles.copytolocal(file.FullName);
}
But a foreach loop only iterates the collection if it has any items, so you don't need to care about the count of the files.
If you want to make an entry on every copy, as per your code, then why do you need to check anymatch at all? It will create an entry on every file copy:
foreach (var file in getfiles.GetFileList(realname,Location))
{
baseMeta();
// copy the file
getfiles.copytolocal(file.FullName);
}
