Access Path Denied when iterating through C: drive - c#

The problem is that when I start inside a folder, for example my desktop or any other folder, my program works smoothly, but once I try to iterate over every file in every directory and subdirectory with "C:\" as the path, it throws an "Access path denied" error.
If I wrap the iteration in a try/catch, it only shows about 7 files and stops.
My code looks like this:
try
{
    foreach (string file in Directory.EnumerateFiles("C:\\", "*.*", SearchOption.AllDirectories))
    {
        textBox1.AppendText(file + "\r\n");
        cpt++;
        textBox2.AppendText(cpt.ToString() + "\r\n");
    }
}
catch (Exception ex)
{
    MessageBox.Show(ex.Message);
}
return cpt;
The output values for textBox1 and textBox2 are:
C:\appverifUI.dll 1
C:\bootTel.dat 2
C:\DumpStack.log.tmp 3
C:\hiberfil.sys 4
C:\pagefile.sys 5
C:\swapfile.sys 6
C:\vfcompat.dll 7
I need it to work on .NET Framework.

Normal applications can't access every folder on a machine. When they try, they get an Access Denied exception. You can avoid this by using the EnumerateFiles overload that accepts an EnumerationOptions parameter, with IgnoreInaccessible set to true:
var options = new EnumerationOptions
{
    IgnoreInaccessible = true,
    RecurseSubdirectories = true
};
foreach (string file in Directory.EnumerateFiles("C:\\", "*.*", options))
{
    ...
}
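Note that EnumerationOptions only exists on .NET Core 2.1+ / .NET 5+, not on .NET Framework, which the question targets. On .NET Framework you can get the same effect with a manual walk that catches UnauthorizedAccessException per directory. A minimal sketch (SafeEnumerateFiles is just an illustrative name, not a framework API):
// Recursively enumerates files, skipping directories the process cannot read.
// Requires System.Collections.Generic and System.IO.
public static IEnumerable<string> SafeEnumerateFiles(string root, string pattern)
{
    var pending = new Stack<string>();
    pending.Push(root);
    while (pending.Count > 0)
    {
        string dir = pending.Pop();
        string[] files = null;
        try
        {
            files = Directory.GetFiles(dir, pattern);
            foreach (string sub in Directory.GetDirectories(dir))
                pending.Push(sub);
        }
        catch (UnauthorizedAccessException) { /* skip inaccessible directories */ }
        catch (PathTooLongException) { /* skip overly long paths */ }
        if (files == null) continue;
        foreach (string file in files)
            yield return file;
    }
}
The question's loop body would then go inside foreach (string file in SafeEnumerateFiles("C:\\", "*.*")).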

Related

C# File Permissions

I am currently writing a program in C# that will copy all user profile files to an external device (in this case, my home server).
When my code iterates through my files and folders, it throws an UnauthorizedAccessException.
I have Googled this and searched StackOverflow, but I am unable to find a clear solution that doesn't involve terminating my process. The idea is that it should copy the files and folders that have read permissions.
I ran into this when I first started, but I worked around it by limiting which directories I would back up (though I would prefer a full backup).
Here is my code:
FileInfo f = new FileInfo(_configuration.Destination);
if (!f.Directory.Exists)
{
    f.Directory.Create();
}

string[] backupDirectories = new string[]
{
    "Desktop",
    "Documents",
    "Downloads",
    "Favorites",
    "Links",
    "Pictures",
    "Saved Games",
    "Searches",
    "Videos",
    ".git",
    ".android",
    ".IdealC15",
    ".nuget",
    ".oracle_jre_usage",
    ".vs",
    "Contacts"
};

foreach (string dirPath in backupDirectories)
{
    DirectoryInfo dirInfo = new DirectoryInfo(_path + "\\" + dirPath);
    if (dirInfo.Exists)
    {
        foreach (string dirP in Directory.GetDirectories(dirInfo.FullName, "*", SearchOption.AllDirectories))
        {
            DirectoryInfo dirI = new DirectoryInfo(dirP);
            if (dirI.Exists)
            {
                string dir = dirP.Replace(_path, _configuration.Destination);
                try
                {
                    Directory.CreateDirectory(dir);
                    textBox2.Invoke((MethodInvoker) delegate
                    {
                        textBox2.AppendText("Create Directory: " + dir + Environment.NewLine);
                    });
                }
                catch (Exception e)
                {
                    textBox2.Invoke((MethodInvoker) delegate
                    {
                        textBox2.AppendText("Could NOT Create Directory: " + dir + Environment.NewLine);
                    });
                    continue;
                }
                foreach (FileInfo theFile in dirInfo.GetFiles("*", SearchOption.AllDirectories))
                {
                    string newPath = theFile.FullName;
                    string file = newPath.Replace(_path, _configuration.Destination);
                    try
                    {
                        File.Copy(newPath, file, true);
                        textBox2.Invoke((MethodInvoker) delegate
                        {
                            textBox2.AppendText("Create File: " + file + Environment.NewLine);
                        });
                    }
                    catch (Exception ex)
                    {
                        textBox2.Invoke((MethodInvoker) delegate
                        {
                            textBox2.AppendText("Could NOT Create File: " + file + Environment.NewLine);
                        });
                    }
                }
            }
        }
    }
}
I apologise if the code is messy, but I will roughly describe what it is doing. The first bit checks whether the backup folder exists on the external drive.
The second part lists which folders I need to back up (if you're able to fix this and make it back up all directories with permissions, please help me do so).
The first loop iterates over each entry in backupDirectories. The second loop iterates over each subdirectory of that backup directory. The third loop iterates over the files in each of those directories.
The exception is thrown at Directory.GetDirectories(dirInfo.FullName, "*", SearchOption.AllDirectories), and it is trying to access C:\Users\MyName\Documents\My Music. Attempting to access it in Explorer also gives me a permissions error, though it isn't listed in Explorer when I go to "Documents" (I am on Windows 10 Pro).
As I mentioned, since the operating system's authority is higher than the application's, you likely cannot do more than what the operating system allows you to do (that is, to access or not access certain folders).
Thus, folder accessibility is best solved at the operating-system level.
But you can still do two things at the program level to minimize the damage when you search for the items.
The first is to use Directory.GetAccessControl to determine the access level of a directory before you query it. This is not so easy, and there are elaborate answers about it here and also here.
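A rough sketch of that first option, assuming .NET Framework's Directory.GetAccessControl and a hypothetical helper name (the linked answers cover more edge cases):
using System.IO;
using System.Security.AccessControl;
using System.Security.Principal;

// Hypothetical helper: returns true if the current user appears to be allowed
// to list the directory, judging by its ACL.
public static bool CanListDirectory(string path)
{
    try
    {
        DirectorySecurity security = Directory.GetAccessControl(path);
        WindowsIdentity identity = WindowsIdentity.GetCurrent();
        WindowsPrincipal principal = new WindowsPrincipal(identity);
        bool allowed = false;
        foreach (FileSystemAccessRule rule in
                 security.GetAccessRules(true, true, typeof(SecurityIdentifier)))
        {
            SecurityIdentifier sid = (SecurityIdentifier)rule.IdentityReference;
            // Only consider rules that apply to this user or one of its groups.
            if (!sid.Equals(identity.User) && !principal.IsInRole(sid))
                continue;
            if ((rule.FileSystemRights & FileSystemRights.ListDirectory) == 0)
                continue;
            if (rule.AccessControlType == AccessControlType.Deny)
                return false; // an explicit Deny wins
            if (rule.AccessControlType == AccessControlType.Allow)
                allowed = true;
        }
        return allowed;
    }
    catch (UnauthorizedAccessException)
    {
        return false; // cannot even read the ACL
    }
}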
The second is to minimize the damage caused by unauthorized-access issues by using SearchOption.TopDirectoryOnly instead of SearchOption.AllDirectories, combined with a recursive search over all the accessible directories. This is how you can code it:
using System.Collections.Generic;
using System.IO;
using System.Linq;

public static List<string> GetAllAccessibleDirectories(string path, string searchPattern)
{
    List<string> dirPathList = new List<string>();
    try
    {
        // Use TopDirectoryOnly so an inaccessible subdirectory only affects its own branch.
        List<string> childDirPathList = Directory.GetDirectories(path, searchPattern, SearchOption.TopDirectoryOnly).ToList();
        if (childDirPathList == null || childDirPathList.Count <= 0) // this directory has no children
            return null;
        foreach (string childDirPath in childDirPathList) // for each child directory, search recursively
        {
            dirPathList.Add(childDirPath); // add the path
            List<string> grandChildDirPath = GetAllAccessibleDirectories(childDirPath, searchPattern);
            if (grandChildDirPath != null && grandChildDirPath.Count > 0) // this child has children and nothing has gone wrong
                dirPathList.AddRange(grandChildDirPath.ToArray()); // add the grandchildren to the list
        }
        return dirPathList; // return the whole list found at this level
    }
    catch
    {
        return null; // something has gone wrong, return null
    }
}
The function above confines the damage caused by unauthorized access to the subdirectories that actually have the issue. All other accessible directories are still returned.
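To tie this back to the backup code in the question (illustrative only; _path, _configuration and dirInfo are the question's own names): collect the accessible directories first, then copy each directory's files inside its own try/catch, so a single protected folder like "My Music" no longer aborts the whole run.
// Sketch: back up only what is accessible, one directory at a time.
List<string> accessibleDirs = GetAllAccessibleDirectories(dirInfo.FullName, "*") ?? new List<string>();
accessibleDirs.Insert(0, dirInfo.FullName); // include the top-level folder itself
foreach (string sourceDir in accessibleDirs)
{
    string targetDir = sourceDir.Replace(_path, _configuration.Destination);
    Directory.CreateDirectory(targetDir);
    string[] files;
    try
    {
        files = Directory.GetFiles(sourceDir, "*", SearchOption.TopDirectoryOnly);
    }
    catch (UnauthorizedAccessException)
    {
        continue; // we are not allowed to list this directory's files; skip it
    }
    foreach (string sourceFile in files)
    {
        try
        {
            File.Copy(sourceFile, Path.Combine(targetDir, Path.GetFileName(sourceFile)), true);
        }
        catch (Exception)
        {
            // log and keep going
        }
    }
}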

How to prevent Directory.GetFiles to "check" recycle bin and other "unsafe" places?

I have a function in my application that searches for certain files in a certain directory using the GetFiles method:
System.IO.Directory.GetFiles(string path, string searchPattern, System.IO.SearchOption)
It works fine until I choose a drive root (D:\ or C:\ and such) to be searched, because it also tries to access the Recycle Bin and access is then restricted:
Access to the path 'D:\$RECYCLE.BIN\S-1-5-21-106145493-3722843178-2978326776-1010' is denied.
It also needs to be able to search subfolders (SearchOption.AllDirectories).
How can I SKIP such places during the search? There may be other folders whose access is also denied.
I capitalize SKIP because if I wrap the call in a try/catch and an exception is caught, the entire search fails as well.
Thanks. Please clarify anything you need.
EDITED for more clarity.
When recursively scanning a directory tree, say with a recursive method that takes the starting directory as a parameter, you can read the directory's attributes. Then check whether it is a system directory AND NOT a root directory like "C:\": in that case you want to skip it, as it may be, for instance, the recycle bin.
Here's some code that does this, and also catches some common exceptions that occurred while I fiddled with directory scanning.
void scan_dir(string path)
{
    // Exclude some directories according to their attributes
    string[] files = null;
    string skipReason = null;
    var dirInfo = new DirectoryInfo(path);
    var isroot = dirInfo.Root.FullName.Equals(dirInfo.FullName);

    // Root dirs (e.g. "C:\") apparently have the System + Hidden flags set, so we check
    // whether this is a root dir; if it is, we do NOT skip it even though those attributes
    // are present. We must not access non-root system folders/files, or this crashes with
    // UnauthorizedAccessException on folders like $RECYCLE.BIN.
    if (dirInfo.Attributes.HasFlag(FileAttributes.System) && !isroot)
    {
        skipReason = "system file/folder, no access";
    }

    if (null == skipReason)
    {
        try
        {
            files = Directory.GetFiles(path);
        }
        catch (UnauthorizedAccessException ex)
        {
            skipReason = ex.Message;
        }
        catch (PathTooLongException ex)
        {
            skipReason = ex.Message;
        }
    }

    if (null != skipReason)
    {
        // perhaps do some error logging, stating skipReason
        return; // we skip this directory
    }

    foreach (var f in files)
    {
        var fileAttribs = new FileInfo(f).Attributes;
        // do stuff with the file if the attributes are to your liking
    }

    try
    {
        var dirs = Directory.GetDirectories(path);
        foreach (var d in dirs)
        {
            scan_dir(d); // recursive call
        }
    }
    catch (PathTooLongException ex)
    {
        Trace.WriteLine(ex.Message);
    }
}
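For the original question, the scan is then started from the drive root, for example:
scan_dir(@"D:\"); // walks D:\ recursively, skipping $RECYCLE.BIN and other system folders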

Can't delete folders in User's folder recursively. 'System.UnauthorizedAccessException' occurred in mscorlib.dll

I'm trying to delete every folder whose name contains the user's "user name", and its contents, located in C:\Users\User, like so:
foreach (var subdir in directory.GetDirectories().Where(subdir => subdir.Name.ToLower().Contains(Environment.UserName)))
{
    try
    {
        Directory.Delete(subdir.FullName, true);
    }
    catch (Exception exception)
    {
        Console.Write("Deleting " + subdir.FullName + " caused exception: \n" + exception);
    }
}
When I run the Windows Forms binary, I get a 'System.UnauthorizedAccessException' occurred in mscorlib.dll error when it hits the first couple of files. Here's the thing: I'm running it as an admin, I can delete those files in Explorer without an issue (or even a UAC prompt), and there is no process locking/using those files.
What's going on?
Swap Directory.Delete for this call:
// Directory.Delete alternative
public void DeleteDirectory(string targetDir)
{
    File.SetAttributes(targetDir, FileAttributes.Normal);

    string[] files = Directory.GetFiles(targetDir);
    string[] dirs = Directory.GetDirectories(targetDir);

    foreach (string file in files)
    {
        File.SetAttributes(file, FileAttributes.Normal);
        File.Delete(file);
    }
    foreach (string dir in dirs)
    {
        DeleteDirectory(dir);
    }
    Directory.Delete(targetDir, false);
}
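Plugged into the question's loop it would look like this (clearing the read-only/hidden attributes file by file is usually what makes the difference):
foreach (var subdir in directory.GetDirectories().Where(subdir => subdir.Name.ToLower().Contains(Environment.UserName)))
{
    try
    {
        DeleteDirectory(subdir.FullName); // instead of Directory.Delete(subdir.FullName, true)
    }
    catch (Exception exception)
    {
        Console.Write("Deleting " + subdir.FullName + " caused exception: \n" + exception);
    }
}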
Actually, there's another reason for this to happen: within the directory you may have an irritating hidden file named "Thumbs.db", which contains thumbnail information for all your files. Sometimes this file won't be released unless you terminate explorer.exe via Task Manager or shut down your computer, resulting in an "un-deletable" folder.
To get rid of this annoying piece, follow the instructions here :)

C# code to copy files, can this snippet be improved?

While copying around 50 GB of data via a local LAN share, the copy failed at around 10 GB due to a connectivity issue.
I renamed the directory holding the copied 10 GB of data to localRepository and then wrote a C# program to copy files from the remote server to the destination only if they are not found in the local repository; if a file is found there, it is moved from the local repository to the destination folder.
Although the code worked fine and accomplishes the task very well, I wonder: have I written the most efficient code? Can you find any improvements?
string destinationFolder = @"C:\DataFolder";
string remoteRepository = @"\\RemoteComputer\DataFolder";
string localRepository = @"\\LocalComputer\LocalRepository";

protected void Page_Load(object sender, EventArgs e)
{
    foreach (string remoteSrcFile in Directory.EnumerateFiles(remoteRepository, "*.*", SearchOption.AllDirectories))
    {
        bool foundInLocalRepo = false;
        foreach (var localSrcFile in Directory.EnumerateFiles(localRepository, "*.*", SearchOption.AllDirectories))
        {
            if (Path.GetFileName(remoteSrcFile).Equals(Path.GetFileName(localSrcFile)))
            {
                FileInfo localFile = new FileInfo(localSrcFile);
                FileInfo remoteFile = new FileInfo(remoteSrcFile);
                // move this file from the local repository
                if (localFile.Length == remoteFile.Length)
                {
                    try
                    {
                        File.Move(localSrcFile, PrepareDestinationPath(remoteSrcFile));
                        Debug.WriteLine(remoteSrcFile + " moved from local repo");
                    }
                    catch (Exception ex)
                    {
                        Debug.WriteLine(remoteSrcFile + " did not move");
                    }
                    foundInLocalRepo = true;
                    break;
                }
            }
        }
        if (!foundInLocalRepo)
        {
            // copy this file from the remote repository
            try
            {
                File.Copy(remoteSrcFile, PrepareDestinationPath(remoteSrcFile), false);
                Debug.WriteLine(remoteSrcFile + " copied from remote repo");
            }
            catch (Exception ex)
            {
                Debug.WriteLine(remoteSrcFile + " did not copy");
            }
        }
    }
}

private string PrepareDestinationPath(string remoteSrcFile)
{
    string relativePath = remoteSrcFile.Split(new string[] { "DataFolder" }, StringSplitOptions.None)[1];
    string copyPath = Path.GetFullPath(destinationFolder + relativePath);
    Directory.CreateDirectory(Path.GetDirectoryName(copyPath));
    return copyPath;
}
EDIT:
Based on the answer given by Thomas, I am attempting to zip the files.
Traditionally, as end users, we zip a file and then copy it. As a programmer, can we zip and copy the file in parallel? I mean, send the portion that has already been zipped over the wire?
You are doing far too much work with the nested loop.
You should remove the inner foreach and replace it with some code that:
(1) constructs the name of the file that you are looking for,
(2) uses File.Exists() to see if it exists, and then
(3) continues with the same block of code that currently follows the "if (Path.GetFileName(remoteSrcFile)..." condition.
Something like this:
foreach (string remoteSrcFile in Directory.EnumerateFiles(remoteRepository, "*.*", SearchOption.AllDirectories))
{
    string localSrcFile = Path.Combine(localRepository, Path.GetFileName(remoteSrcFile));
    if (File.Exists(localSrcFile))
    {
        ...
    }
}
I would suggest zipping the files before moving them. Take a look at the very simple http://dotnetzip.codeplex.com/
Try zipping 1000 files at a time; that way you don't have to run the for loop as many times and establish new connections etc. each time.
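If DotNetZip is not an option, the framework's built-in ZipFile class (System.IO.Compression.FileSystem, available since .NET Framework 4.5) can do the same job. A minimal sketch with placeholder paths, with the caveat that the archive has to be created on the machine that actually holds the files for this to save any bandwidth:
using System.IO;
using System.IO.Compression; // reference System.IO.Compression.FileSystem

// On the source machine: pack the folder into a single archive...
ZipFile.CreateFromDirectory(@"D:\DataFolder", @"D:\DataFolder.zip",
                            CompressionLevel.Fastest, false); // false: don't include the base directory itself
// ...copy the one archive across the share...
File.Copy(@"D:\DataFolder.zip", @"\\LocalComputer\LocalRepository\DataFolder.zip", true);
// ...and unpack it at the destination.
ZipFile.ExtractToDirectory(@"\\LocalComputer\LocalRepository\DataFolder.zip", @"C:\DataFolder");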

DirectoryNotFoundException when calling Directory.GetDirectories on Environment.SpecialFolder.Favorites due to Domain Folder Redirection

I have some C# code that tries to get the Favorites for the currently logged-in user. The code is part of a taskbar toolbar that gets loaded into the Windows Explorer process. I have a user who is using Windows Vista with UAC enabled on a domain that has either Roaming Profiles or Folder Redirection set up and enabled. When calling Directory.GetDirectories on the Favorites path, it throws "System.IO.DirectoryNotFoundException: Could not find a part of the path 'C:\Users\\Favorites\". Other users on other domains that do not have Roaming Profiles or Folder Redirection set up do not have this issue.
The user also reported that copying the path from the failure logs into the Run prompt fails to load the path, but if they navigate to the path directly using Explorer and then copy and paste that path into the Run prompt, it works. He sent me both paths and they are exactly identical, which doesn't make any sense at all.
My theory is that this is caused by Folder Redirection, where that path actually points to a share on the server but the redirection fails when trying to access the subdirectories (the DirectoryInfo objects returned from dirInfo.GetDirectories). The initial directory works, but all subdirectories of the initial directory fail to redirect correctly.
Has anyone come across a situation like this and/or know a workaround to gain proper access to redirected folders?
private void GetFavorites()
{
    try
    {
        System.IO.DirectoryInfo dirInfo = new System.IO.DirectoryInfo(Environment.GetFolderPath(Environment.SpecialFolder.Favorites));
        AddFavorites(dirInfo);
    }
    catch
    {
    }
}

private void AddFavorites(DirectoryInfo dirInfo)
{
    foreach (System.IO.FileInfo fileInfo in dirInfo.GetFiles("*.url"))
    {
        //string alias = fileInfo.Name.Replace(".url", "");
        if (!ItemsBookmarks.ContainsKey(fileInfo.Name))
            ItemsBookmarks.Add(fileInfo.Name, fileInfo.Name);
    }
    foreach (System.IO.FileInfo fileInfo in dirInfo.GetFiles("*.lnk"))
    {
        if (!ItemsBookmarks.ContainsKey(fileInfo.Name))
            ItemsBookmarks.Add(fileInfo.Name, fileInfo.Name);
    }
    foreach (System.IO.DirectoryInfo objDir in dirInfo.GetDirectories())
    {
        AddFavorites(objDir);
    }
}
Thanks,
John
I believe the problem you are experiencing is related to Reparse Points.
See: http://msdn.microsoft.com/en-us/library/bb513869.aspx
See: What is the best way to check for reparse point in .net (c#)?
The problem can be avoided by using the following syntax:
private void AddFavorites(string dirPath)
{
    try
    {
        foreach (string fileName in Directory.GetFiles(dirPath, "*.*", SearchOption.TopDirectoryOnly))
        {
            //string alias = Path.GetFileName(fileName).Replace(".url", "");
            string name = Path.GetFileName(fileName);
            if (!ItemsBookmarks.ContainsKey(name))
            {
                ItemsBookmarks.Add(name, name);
            }
        }
        foreach (string subDirName in Directory.GetDirectories(dirPath, "*.*", SearchOption.TopDirectoryOnly))
        {
            AddFavorites(subDirName);
        }
    }
    catch
    {
        //error getting files or subdirs... permissions issue?
        //throw
    }
}
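If you also want to detect redirected/reparse-point directories explicitly (as the links above discuss), the attribute check itself is short; a minimal sketch:
// Returns true if the directory is a reparse point (junction, symlink, redirected folder, ...).
private static bool IsReparsePoint(string dirPath)
{
    DirectoryInfo info = new DirectoryInfo(dirPath);
    return (info.Attributes & FileAttributes.ReparsePoint) != 0;
}
// Usage inside the recursive walk, e.g. to skip or special-case such directories:
// if (IsReparsePoint(subDirName)) continue;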
