Search files with detail check - C#

I want to search all the files in a folder and its subfolders and find the files that have specific details. How can I do this?
I use the following method, but with it I have to wait until all the files have been collected and then check their details in a loop. That takes a lot of time when I have more than 1000 files.
AllofItems = GetFileList(@"\myfolder").ToArray();
foreach (var item in AllofItems)
{
    var file = ShellFile.FromFilePath(item); // for example C:\myfolder\1.jpg
    if (file.Properties.System.Title.Value.Equals("Empty"))
    {
        coverView.Items.Add(item);
    }
}
and this is the GetFileList method:
public IEnumerable<string> GetFileList(string rootFolderPath)
{
    Queue<string> pending = new Queue<string>();
    pending.Enqueue(rootFolderPath);
    string[] tmp;
    while (pending.Count > 0)
    {
        rootFolderPath = pending.Dequeue();
        try
        {
            tmp = Directory.GetFiles(rootFolderPath);
        }
        catch (DirectoryNotFoundException) { continue; }
        catch (UnauthorizedAccessException) { continue; }
        for (int i = 0; i < tmp.Length; i++)
        {
            yield return tmp[i];
        }
        tmp = Directory.GetDirectories(rootFolderPath);
        for (int i = 0; i < tmp.Length; i++)
        {
            pending.Enqueue(tmp[i]);
        }
    }
}
I want to do both at the same time: check each file's details while the search is still running.

The DirectoryInfo class has useful methods for this.
var dir = new DirectoryInfo(@"C:\myBaseFolder");
FileInfo[] allfiles = dir.GetFiles("*.*", SearchOption.AllDirectories);
This will automatically include all subdirectories.
You could take advantage of the await keyword like this:
private static async Task<FileInfo[]> GetFileList(string rootFolderPath)
{
    return await Task.Run(() =>
    {
        var dir = new DirectoryInfo(rootFolderPath);
        return dir.GetFiles("*.*", SearchOption.AllDirectories);
    });
}
and call it like this
FileInfo[] allFiles = await GetFileList(@"\myfolder");
and don't forget to add the async keyword to the method doing this call.
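To get closer to what the question actually asks (checking each file's details while the search is still running), a minimal sketch could stream paths with Directory.EnumerateFiles and report matches as they are found. This is only a sketch: it assumes .NET Core 3+/.NET 5+ for EnumerationOptions, the Windows API Code Pack ShellFile type already used in the question, and a control named coverView on the form.
// Minimal sketch, not the original poster's code.
// Needs: using System.IO; using Microsoft.WindowsAPICodePack.Shell;
private async Task LoadMatchingFilesAsync(string rootFolderPath)
{
    var options = new EnumerationOptions
    {
        IgnoreInaccessible = true,     // skip folders that would throw UnauthorizedAccessException
        RecurseSubdirectories = true   // include all subfolders
    };

    // Progress<T> is created on the UI thread, so Report() marshals each match back to it.
    var progress = new Progress<string>(path => coverView.Items.Add(path));

    await Task.Run(() =>
    {
        foreach (var path in Directory.EnumerateFiles(rootFolderPath, "*.*", options))
        {
            var file = ShellFile.FromFilePath(path);
            if ("Empty".Equals(file.Properties.System.Title.Value))
                ((IProgress<string>)progress).Report(path);
        }
    });
}
This way matching items appear in coverView as soon as they are found instead of only after the whole scan finishes.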

Related

C#: try foreach catch continue [duplicate]

I am trying to display a list of all files found in the selected directory (and optionally any subdirectories). The problem I am having is that when the GetFiles() method comes across a folder that it cannot access, it throws an exception and the process stops.
How do I ignore this exception (and ignore the protected folder/file) and continue adding accessible files to the list?
try
{
    if (cbSubFolders.Checked == false)
    {
        string[] files = Directory.GetFiles(folderBrowserDialog1.SelectedPath);
        foreach (string fileName in files)
            ProcessFile(fileName);
    }
    else
    {
        string[] files = Directory.GetFiles(folderBrowserDialog1.SelectedPath, "*.*", SearchOption.AllDirectories);
        foreach (string fileName in files)
            ProcessFile(fileName);
    }
    lblNumberOfFilesDisplay.Enabled = true;
}
catch (UnauthorizedAccessException) { }
finally { }
You will have to do the recursion manually; don't use AllDirectories - look one folder at a time, then try getting the files from its sub-dirs. Untested, but something like below (note it uses a delegate rather than building an array):
using System;
using System.IO;

static class Program
{
    static void Main()
    {
        string path = ""; // TODO
        ApplyAllFiles(path, ProcessFile);
    }

    static void ProcessFile(string path) { /* ... */ }

    static void ApplyAllFiles(string folder, Action<string> fileAction)
    {
        foreach (string file in Directory.GetFiles(folder))
        {
            fileAction(file);
        }
        foreach (string subDir in Directory.GetDirectories(folder))
        {
            try
            {
                ApplyAllFiles(subDir, fileAction);
            }
            catch
            {
                // swallow, log, whatever
            }
        }
    }
}
Since .NET Standard 2.1 (.NET Core 3+, .NET 5+), you can now just do:
var filePaths = Directory.EnumerateFiles(@"C:\my\files", "*.xml", new EnumerationOptions
{
    IgnoreInaccessible = true,
    RecurseSubdirectories = true
});
According to the MSDN docs about IgnoreInaccessible:
Gets or sets a value that indicates whether to skip files or directories when access is denied (for example, UnauthorizedAccessException or SecurityException). The default is true.
Default value is actually true, but I've kept it here just to show the property.
The same overload is available for DirectoryInfo as well.
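For illustration, a minimal sketch of the DirectoryInfo variant (again assuming .NET Core 3+/.NET 5+; the folder path is made up):
// Same EnumerationOptions overload, but starting from a DirectoryInfo.
var dir = new DirectoryInfo(@"C:\my\files");
var files = dir.EnumerateFiles("*.xml", new EnumerationOptions
{
    IgnoreInaccessible = true,      // skip anything that is access-denied
    RecurseSubdirectories = true
});
foreach (FileInfo fi in files)
    Console.WriteLine(fi.FullName);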
This simple function works well and meets the question's requirements.
private List<string> GetFiles(string path, string pattern)
{
    var files = new List<string>();
    var directories = new string[] { };
    try
    {
        files.AddRange(Directory.GetFiles(path, pattern, SearchOption.TopDirectoryOnly));
        directories = Directory.GetDirectories(path);
    }
    catch (UnauthorizedAccessException) { }

    foreach (var directory in directories)
    {
        try
        {
            files.AddRange(GetFiles(directory, pattern));
        }
        catch (UnauthorizedAccessException) { }
    }
    return files;
}
A simple way to do this is by using a List for the files and a Queue for the directories.
It conserves memory: a deeply recursive approach doing the same task could overflow the stack on very deep directory trees.
The files are added to the list in breadth-first order, from the top of the directory tree down.
public static List<string> GetAllFilesFromFolder(string root, bool searchSubfolders) {
    Queue<string> folders = new Queue<string>();
    List<string> files = new List<string>();
    folders.Enqueue(root);
    while (folders.Count != 0) {
        string currentFolder = folders.Dequeue();
        try {
            string[] filesInCurrent = System.IO.Directory.GetFiles(currentFolder, "*.*", System.IO.SearchOption.TopDirectoryOnly);
            files.AddRange(filesInCurrent);
        }
        catch {
            // Do Nothing
        }
        try {
            if (searchSubfolders) {
                string[] foldersInCurrent = System.IO.Directory.GetDirectories(currentFolder, "*.*", System.IO.SearchOption.TopDirectoryOnly);
                foreach (string _current in foldersInCurrent) {
                    folders.Enqueue(_current);
                }
            }
        }
        catch {
            // Do Nothing
        }
    }
    return files;
}
Steps:
Enqueue the root folder in the queue.
In a loop, dequeue a folder, add the files in that directory to the list, and add its subfolders to the queue.
Repeat until the queue is empty.
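For example, a typical call of the helper above (the folder path is illustrative):
// Collect every file under C:\myfolder, including subfolders, skipping anything inaccessible.
List<string> files = GetAllFilesFromFolder(@"C:\myfolder", searchSubfolders: true);
Console.WriteLine(files.Count + " files found");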
See https://stackoverflow.com/a/10728792/89584 for a solution that handles the UnauthorizedAccessException problem.
All the solutions above will miss files and/or directories if any calls to GetFiles() or GetDirectories() are on folders with a mix of permissions.
Here's a full-featured, .NET 2.0-compatible implementation.
You can even alter the yielded List of files to skip over directories in the FileSystemInfo version!
(Beware null values!)
public static IEnumerable<KeyValuePair<string, string[]>> GetFileSystemInfosRecursive(string dir, bool depth_first)
{
    foreach (var item in GetFileSystemInfosRecursive(new DirectoryInfo(dir), depth_first))
    {
        string[] result;
        var children = item.Value;
        if (children != null)
        {
            result = new string[children.Count];
            for (int i = 0; i < result.Length; i++)
            { result[i] = children[i].Name; }
        }
        else { result = null; }
        string fullname;
        try { fullname = item.Key.FullName; }
        catch (IOException) { fullname = null; }
        catch (UnauthorizedAccessException) { fullname = null; }
        yield return new KeyValuePair<string, string[]>(fullname, result);
    }
}
public static IEnumerable<KeyValuePair<DirectoryInfo, List<FileSystemInfo>>> GetFileSystemInfosRecursive(DirectoryInfo dir, bool depth_first)
{
    var stack = depth_first ? new Stack<DirectoryInfo>() : null;
    var queue = depth_first ? null : new Queue<DirectoryInfo>();
    if (depth_first) { stack.Push(dir); }
    else { queue.Enqueue(dir); }
    for (var list = new List<FileSystemInfo>(); (depth_first ? stack.Count : queue.Count) > 0; list.Clear())
    {
        dir = depth_first ? stack.Pop() : queue.Dequeue();
        FileSystemInfo[] children;
        try { children = dir.GetFileSystemInfos(); }
        catch (UnauthorizedAccessException) { children = null; }
        catch (IOException) { children = null; }
        if (children != null) { list.AddRange(children); }
        yield return new KeyValuePair<DirectoryInfo, List<FileSystemInfo>>(dir, children != null ? list : null);
        if (depth_first) { list.Reverse(); }
        foreach (var child in list)
        {
            var asdir = child as DirectoryInfo;
            if (asdir != null)
            {
                if (depth_first) { stack.Push(asdir); }
                else { queue.Enqueue(asdir); }
            }
        }
    }
}
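As a small illustration of the note above about altering the yielded list, here is a hedged sketch (the folder path and the "bin" filter are made up) that prunes directories from the traversal by removing them from the yielded list before the enumerator resumes:
// Breadth-first walk; removing a DirectoryInfo from the yielded list here
// stops GetFileSystemInfosRecursive from descending into it.
foreach (var pair in GetFileSystemInfosRecursive(new DirectoryInfo(@"C:\myfolder"), depth_first: false))
{
    var children = pair.Value;            // null if the directory was inaccessible
    if (children == null) continue;

    children.RemoveAll(c => c is DirectoryInfo && c.Name == "bin");  // prune subtrees named "bin"

    foreach (var child in children)
        if (child is FileInfo)
            Console.WriteLine(child.FullName);
}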
This should answer the question. I've ignored the issue of going through subdirectories; I'm assuming you have that figured out.
Of course, you don't need a separate method for this, but you might find it a useful place to also verify that the path is valid and deal with the other exceptions you could encounter when calling GetFiles().
Hope this helps.
private string[] GetFiles(string path)
{
    string[] files = null;
    try
    {
        files = Directory.GetFiles(path);
    }
    catch (UnauthorizedAccessException)
    {
        // might be nice to log this, or something ...
    }
    return files;
}

private void Processor(string path, bool recursive)
{
    // leaving the recursive directory navigation out.
    string[] files = this.GetFiles(path);
    if (null != files)
    {
        foreach (string file in files)
        {
            this.Process(file);
        }
    }
    else
    {
        // again, might want to do something when you can't access the path?
    }
}
I prefer using the C# framework functions, but the one I need will only be included in .NET 5.0, so I had to write it myself.
// search files in every subdirectory, ignoring access errors
static List<string> list_files(string path)
{
    List<string> files = new List<string>();
    // add the files in the current directory
    try
    {
        string[] entries = Directory.GetFiles(path);
        foreach (string entry in entries)
            files.Add(System.IO.Path.Combine(path, entry));
    }
    catch
    {
        // an exception in Directory.GetFiles is not recoverable: the directory is not accessible
    }
    // follow the subdirectories
    try
    {
        string[] entries = Directory.GetDirectories(path);
        foreach (string entry in entries)
        {
            string current_path = System.IO.Path.Combine(path, entry);
            List<string> files_in_subdir = list_files(current_path);
            foreach (string current_file in files_in_subdir)
                files.Add(current_file);
        }
    }
    catch
    {
        // an exception in Directory.GetDirectories is not recoverable: the directory is not accessible
    }
    return files;
}

Handle (skip) UnauthorizedAccessException in EnumerateFiles() LINQ C#

I am trying to write a high-performance file system searcher that can search unindexed drives (both local and network) very fast, filtering on extensions and keywords. I am trying to achieve this using C#'s DirectoryInfo.EnumerateDirectories(), DirectoryInfo.EnumerateFiles() and LINQ queries. From my testing, this is (by far) the best performing code I could find:
FileInfo[] dirFiles = dirInfo.EnumerateDirectories()
    .AsParallel()
    .SelectMany(di => di.EnumerateFiles("*.*", SearchOption.AllDirectories)
        .Where(fi => EndsWithExtension(fi.Extension)))
    .ToArray();
However, the UnauthorizedAccessException is not handled, and when it is thrown it crashes the entire query.
I've tried various ways outlined on SO related to this issue, but I found that they are significantly slower in search performance. The second-best method I found working, for example, is over 20 times slower:
try {
    foreach (string fileName in Directory.EnumerateFiles(dirInfo.FullName, "*.*", SearchOption.AllDirectories)) {
        if (ContainsKeyword(fileName)) {
            Results.Add(fileName);
        }
    }
} catch (Exception e) { /* skip this directory */ }
I would like to skip over the directory when it throws an exception. I've been trying to achieve this with something similar to the code below, but I can't get it to work (my knowledge of LINQ and Enumerables is too limited...):
FileInfo[] dirFiles = dirInfo.EnumerateDirectories()
    .AsParallel()
    .SelectMany(di => di.EnumerateFiles("*.*", SearchOption.AllDirectories)
        .SkipExceptions()
        .Where(fi => EndsWithExtension(fi.Extension)))
    .ToArray();
public static class Extensions {
    public static IEnumerable<T> SkipExceptions<T>(this IEnumerable<T> values) {
        using (var enumerator = values.GetEnumerator()) {
            bool next = true;
            while (next) {
                try {
                    if (enumerator.Current != null)
                        Console.WriteLine(enumerator.Current.ToString());
                    next = enumerator.MoveNext();
                } catch {
                    continue;
                }
                if (next) yield return enumerator.Current;
            }
        }
    }
}
Is it possible to handle (UnauthorizedAccess) exceptions, while still remaining as high performant as the "raw" LINQ query?
Thanks in advance for your help!
A workaround is to walk the directory tree recursively yourself instead of using SearchOption.AllDirectories. This is actually more efficient in your case because you don't need to load every file in the filesystem into an array. Start with the following helper methods:
List<string> GetDirectoriesRecursive(string parent)
{
    var directories = new List<string>();
    GetDirectoriesRecursive(directories, parent);
    return directories;
}

void GetDirectoriesRecursive(List<string> directories, string parent)
{
    directories.Add(parent);
    foreach (string child in GetAuthorizedDirectories(parent))
        GetDirectoriesRecursive(directories, child);
}

string[] GetAuthorizedDirectories(string dir)
{
    try { return Directory.GetDirectories(dir); }
    catch (UnauthorizedAccessException) { return new string[0]; }
}

string[] GetAuthorizedFiles(string dir)
{
    try { return Directory.GetFiles(dir); }
    catch (UnauthorizedAccessException) { return new string[0]; }
}
Then, to get the big files:
var bigFiles =
    from dir in GetDirectoriesRecursive(@"c:\")
    from file in GetAuthorizedFiles(dir)
    where new FileInfo(file).Length > 100000000
    select file;
Or, to get just their directories:
var foldersWithBigFiles =
    from dir in GetDirectoriesRecursive(@"c:\")
    where GetAuthorizedFiles(dir).Any(f => new FileInfo(f).Length > 100000000)
    select dir;
ANOTHER APPROACH:
string[] directories = Directory.EnumerateDirectories(@"\\testnetwork\abc$", "*.*", SearchOption.AllDirectories)
    .Catch(typeof(UnauthorizedAccessException))
    .ToArray();
Here is the missing Catch extension method:
static class ExceptionExtensions
{
    public static IEnumerable<TIn> Catch<TIn>(
        this IEnumerable<TIn> source,
        Type exceptionType)
    {
        using (var e = source.GetEnumerator())
            while (true)
            {
                var ok = false;
                try
                {
                    ok = e.MoveNext();
                }
                catch (Exception ex)
                {
                    if (ex.GetType() != exceptionType)
                        throw;
                    continue;
                }
                if (!ok)
                    yield break;
                yield return e.Current;
            }
    }
}
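As an untested sketch, the Catch extension could also be plugged into the original parallel query from the question; note that whether the underlying file-system enumerator can keep going after throwing depends on the runtime, so the recursive helpers above remain the safer option:
// Sketch only: the inner EnumerateFiles call skips UnauthorizedAccessException via Catch.
// EndsWithExtension is the asker's own filter method from the question.
FileInfo[] dirFiles = dirInfo.EnumerateDirectories()
    .AsParallel()
    .SelectMany(di => di.EnumerateFiles("*.*", SearchOption.AllDirectories)
        .Catch(typeof(UnauthorizedAccessException))
        .Where(fi => EndsWithExtension(fi.Extension)))
    .ToArray();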

listbox don't exist C#

I have this code
public static List<string> GetAllFilesFromFolder(string root, bool searchSubfolders)
{
    Queue<string> folders = new Queue<string>();
    List<string> files = new List<string>();
    folders.Enqueue(root);
    while (folders.Count != 0)
    {
        string currentFolder = folders.Dequeue();
        try
        {
            string[] filesInCurrent = System.IO.Directory.GetFiles(currentFolder, "*.*", System.IO.SearchOption.TopDirectoryOnly);
            files.AddRange(filesInCurrent);
        }
        catch
        {
            // Do Nothing
        }
        try
        {
            if (searchSubfolders)
            {
                string[] foldersInCurrent = System.IO.Directory.GetDirectories(currentFolder, "*.*", System.IO.SearchOption.TopDirectoryOnly);
                foreach (string _current in foldersInCurrent)
                {
                    folders.Enqueue(_current);
                }
            }
        }
        catch
        {
            // Do Nothing
        }
    }
    return files;
}
The code lists all the files into a List<string>, but I need to show them in the ListBox items. I tried
foreach (string foo in files)
    listbox1.Items.Add(foo);
The problem is that it doesn't recognize listbox1 and keeps saying that it doesn't exist. How can I do it? It is a Windows Forms application.
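The usual cause is that the ListBox field only exists on the form instance, so it cannot be referenced from a static method or from another class. A minimal sketch, assuming the control is actually named listBox1 and the call happens inside the form (for example in a button click handler; the folder path is illustrative):
// Inside the Form class; GetAllFilesFromFolder is the static helper shown above.
private void btnSearch_Click(object sender, EventArgs e)
{
    List<string> files = GetAllFilesFromFolder(@"C:\myfolder", searchSubfolders: true);

    listBox1.BeginUpdate();        // avoid repainting after every single item
    listBox1.Items.Clear();
    foreach (string file in files)
    {
        listBox1.Items.Add(file);
    }
    listBox1.EndUpdate();
}
If the helper lives in another class, pass the results back to the form and add them there, or pass the ListBox (or the form) into the method.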

How to avoid using foreach loop to get the filelist for different reason

Here is what I am trying to do:
I have remote servers (e.g. svr01, svr02, svr03). Using GetFileList, I read the directory, get all the files, match them against the file name I have, and copy the matches to my local drive.
If any files matched, I also add them to an XML file.
I was trying to do it like below:
class Program
{
    static void Main(string[] args)
    {
        var getfiles = new fileshare.Program();
        string realname = "*main*";
        string Location = "SVR01";
        bool anymatch = false;

        foreach (var file in getfiles.GetFileList(realname, Location))
        { anymatch = true; }

        if (anymatch == true)
        { baseMeta(); }

        foreach (var file in getfiles.GetFileList(realname, Location))
        { getfiles.copytolocal(file.FullName); }
    }

    private FileInfo[] GetFileList(string pattern, string Location)
    {
        try
        {
            switch (Location)
            {
                case "SVR01":
                {
                    var di = new DirectoryInfo(@"\\SVR01\Dev");
                    return di.GetFiles(pattern);
                }
                case "SVR02":
                {
                    var di = new DirectoryInfo(@"\\SVR02\Dev");
                    return di.GetFiles(pattern);
                }
                case "SVR03":
                {
                    var di = new DirectoryInfo(@"\\SVR03\Prod");
                    return di.GetFiles(pattern);
                }
                default: throw new ArgumentOutOfRangeException();
            }
        }
        catch (Exception ex)
        {
            Console.Write(ex.ToString());
            return null;
        }
    }

    private void copytolocal(string filename)
    {
        string nameonly = Path.GetFileName(filename);
        File.Copy(filename, Path.Combine(@"c:\", nameonly), true);
    }

    private void baseMeta()
    {
        XmlWriter xmlWrite = XmlWriter.Create(@"c:\basexml");
        xmlWrite.WriteStartElement("job");
        xmlWrite.WriteElementString("Name", "test");
        xmlWrite.WriteElementString("time", DateTime.Now.ToString());
        xmlWrite.Close();
    }
}
but this piece of code worries me because I am doing the same work twice. Can anyone please guide me on how to avoid this?
foreach (var file in getfiles.GetFileList(realname, Location))
{ anymatch = true; }

if (anymatch == true)
{
    baseMeta();
}

foreach (var file in getfiles.GetFileList(realname, Location))
{
    getfiles.copytolocal(file.FullName);
}
I am also trying to find a way to exit the first foreach loop as soon as any file matches, generate the baseMeta(), and then go on to the next foreach loop to do the rest of the processing.
Using LINQ you should be able to easily change your posted code into:
var getfiles = new fileshare.Program();
string realname = "*main*";
string Location = "SVR01";

var fileList = getfiles.GetFileList(realname, Location);
var anymatch = fileList.Any();

if (anymatch) // Or possibly `if (fileList.Any())` if anymatch isn't
              // really used anywhere else
    baseMeta();

foreach (var file in fileList)
    getfiles.copytolocal(file.FullName);
You'll get the greatest benefit by replacing your GetFileList method with:
private IEnumerable<FileInfo> GetFileList(string pattern, string Location)
{
    string directory = string.Empty;
    switch (Location)
    {
        case "SVR01":
            directory = @"\\SVR01\Dev";
            break;
        case "SVR02":
            directory = @"\\SVR02\Dev";
            break;
        case "SVR03":
            directory = @"\\SVR03\Prod";
            break;
        default:
            throw new ArgumentOutOfRangeException();
    }

    DirectoryInfo di = null;
    try
    {
        di = new DirectoryInfo(directory);
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
        yield break;
    }

    foreach (var fi in di.EnumerateFiles(pattern))
        yield return fi;
}
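Since this version yields results lazily, the Any() check in the caller only has to pull the first matching file. A short usage sketch reusing the names from the question (note that the foreach enumerates the share a second time):
var fileList = getfiles.GetFileList(realname, Location);   // nothing is enumerated yet
if (fileList.Any())                                        // stops after the first match
    baseMeta();

foreach (var file in fileList)                             // enumerates again for the copy
    getfiles.copytolocal(file.FullName);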
Use this:
var files = getfiles.GetFileList(realname, Location);
if (files.Length > 0)
{
    baseMeta();
    foreach (var file in files)
    {
        getfiles.copytolocal(file.FullName);
    }
}
Try this:
Create a method that checks for the file's existence and do it all in a single loop.
Your statement is not very clear about when you will copy and when you won't; apply your own condition for when to copy or create the XML entry.
What is your anymatch for? If you just want to check whether there is any file, then use:
var fileList = getfiles.GetFileList(realname, Location);
if (fileList.Count() > 0)
{
    baseMeta();
}
foreach (var file in fileList)
{
    // copy the file if a match does not exist..
    getfiles.copytolocal(file.FullName);
}
But a foreach loop only runs through the collection if it has any items, so you don't need to care about the count of the files.
If you want to make an entry on every copy, as in your code, then you don't need to check anymatch at all; this will create an entry on every file copy:
foreach (var file in getfiles.GetFileList(realname, Location))
{
    baseMeta();
    // copy the file
    getfiles.copytolocal(file.FullName);
}
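Putting these suggestions together, a minimal single-pass sketch that avoids enumerating the share twice (baseMeta, copytolocal, and GetFileList are the question's own methods; the metaWritten flag is an assumed addition):
// One pass: write the XML metadata on the first match only, then copy every match.
bool metaWritten = false;
foreach (var file in getfiles.GetFileList(realname, Location))
{
    if (!metaWritten)
    {
        baseMeta();        // create the base XML entry once
        metaWritten = true;
    }
    getfiles.copytolocal(file.FullName);
}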
