Is there any way to check whether write permission is available on a given path, which could be either a local folder (c:\temp) or a UNC path (\\server\share)? I can't use try/catch, because I might have write permission but not delete permission, so I wouldn't be able to delete the file I created...
Yes, you can use the FileIOPermission class and the FileIOPermissionAccess enum.
FileIOPermissionAccess.Write:
Access to write to or delete a file or directory. Write access includes deleting and overwriting files or directories.
FileIOPermission f = new FileIOPermission(FileIOPermissionAccess.Write, myPath);
try
{
f.Demand();
//permission to write/delete/overwrite
}
catch (SecurityException s)
{
//there is no permission to write/delete/overwrite
}
You use a permissions demand, thus:
FileIOPermission f2 = new FileIOPermission(FileIOPermissionAccess.Read, "C:\\test_r");
f2.AddPathList(FileIOPermissionAccess.Write | FileIOPermissionAccess.Read, "C:\\example\\out.txt");
try
{
f2.Demand();
// do something useful with the file here
}
catch (SecurityException s)
{
Console.WriteLine(s.Message);
// deal with the lack of permissions here.
}
specifying the permissions you want and the file system object(s) desired. If you don't have the demanded permission, a SecurityException is thrown. More details at
http://support.microsoft.com/kb/315529
http://msdn.microsoft.com/en-us/library/system.security.permissions.fileiopermission.aspx
For a variety of reasons — race conditions being one of them — it's more complicated than it might seem to examine NTFS file system permissions.
Apparently, we figured out a while back that this is a no-op for UNC paths. See this question, Testing a UNC Path's "Accessability", for details.
A little google-fu suggests that this CodeProject class might be of use, though: http://www.codeproject.com/Articles/14402/Testing-File-Access-Rights-in-NET-2-0
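If you would rather inspect the NTFS ACL yourself (roughly what that CodeProject class does), here is a minimal sketch using Directory.GetAccessControl from the .NET Framework. It is a simplification: it only tests the WriteData right, treats any matching deny rule as final, and the CurrentUserHasWriteAccess name is made up for illustration:

using System;
using System.IO;
using System.Security.AccessControl;
using System.Security.Principal;

static bool CurrentUserHasWriteAccess(string path)
{
    try
    {
        AuthorizationRuleCollection rules = Directory.GetAccessControl(path)
            .GetAccessRules(true, true, typeof(SecurityIdentifier));
        WindowsIdentity identity = WindowsIdentity.GetCurrent();
        bool allowed = false;

        foreach (FileSystemAccessRule rule in rules)
        {
            // only rules that apply to the current user or one of its groups matter
            var sid = (SecurityIdentifier)rule.IdentityReference;
            if (!identity.User.Equals(sid) && !identity.Groups.Contains(sid))
                continue;

            if ((rule.FileSystemRights & FileSystemRights.WriteData) == 0)
                continue;

            if (rule.AccessControlType == AccessControlType.Deny)
                return false;      // an explicit deny wins
            allowed = true;
        }
        return allowed;
    }
    catch (UnauthorizedAccessException)
    {
        // we cannot even read the ACL, so assume no write access
        return false;
    }
}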
Related
I need to compress all the folders on my E:\ drive, but when I try with this code, I get this access exception.
I'm new to programming and am trying to learn how I can make this work.
I'm using DotNetZip to compress the directories inside E:\.
Some parts of the code are just copied, I know that... but I'm trying to learn how to make it work.
I have already tried some other solutions to this problem that were answered here.
For example, adding a manifest to the project that requires Administrator permissions to run, and inserting an access control rule to modify the security of E:\. Something I noticed when doing this is that my user has permissions removed from the subfiles and directories of E:\, but without this access rule the same exception keeps occurring.
try
{
    ZipFile zip = new ZipFile();
    zip.AddDirectory(@"E:\");
    zip.Save(@"C:\Users\vitorbento\Desktop\backup.zip");
    Console.WriteLine("Compression finished");
    Console.WriteLine("Done.");
}
catch (UnauthorizedAccessException)
{
    // DirectPath is defined elsewhere in the original code
    FileAttributes attr = (new FileInfo(DirectPath)).Attributes;
    Console.Write("UnauthorizedAccessException: Unable to access file. ");
    if ((attr & FileAttributes.ReadOnly) > 0)
        Console.Write("The file is read-only.");
}
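One rough sketch (untested against your setup) of how to keep going when a folder is inaccessible: walk the tree yourself and add files to the archive one at a time, skipping anything that throws. The AddTreeSkippingDenied name is made up; ZipFile.AddFile and ZipFile.Save are standard DotNetZip (Ionic.Zip) methods:

using System;
using System.Collections.Generic;
using System.IO;
using Ionic.Zip;   // DotNetZip

static void AddTreeSkippingDenied(ZipFile zip, string root)
{
    var pending = new Stack<string>();
    pending.Push(root);

    while (pending.Count > 0)
    {
        string dir = pending.Pop();
        try
        {
            foreach (string sub in Directory.GetDirectories(dir))
                pending.Push(sub);

            // keep the folder structure relative to the root inside the archive
            string rel = dir.Length > root.Length ? dir.Substring(root.Length).TrimStart('\\') : "";
            foreach (string file in Directory.GetFiles(dir))
                zip.AddFile(file, rel);
        }
        catch (UnauthorizedAccessException)
        {
            Console.WriteLine("Skipped (access denied): " + dir);
        }
    }
}

// usage:
// using (var zip = new ZipFile())
// {
//     AddTreeSkippingDenied(zip, @"E:\");
//     zip.Save(@"C:\Users\vitorbento\Desktop\backup.zip");
// }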
This is similar to this question, but with one more requirement:
The deletion of files can fail for any number of reasons, so I want the operation to be "transacted", meaning the whole operation either succeeds in total, or fails and does not change anything at all.
In general this will be very, very hard, and I can't see any way to recover if the physical hard drive suddenly breaks. So a weakened requirement would be: if it succeeds, we are finished; if it fails, restore everything to the original state where possible.
Some kinds of errors I can think of are:
Access violation. You simply aren't allowed to delete some files or folders. This is the case I most want to handle.
The file/folder is in use by somebody else and so is "locked". On Linux this is not a problem, but on Windows it is. This should also be handled.
If it is a network folder, there could be network issues. Recovery can be hard or impossible; I would not expect this kind of error to be handled properly.
Hardware failure. I don't think any recovery is possible here.
Scenario
You have software that can export its internal data. The result is a folder with timestamped sub-folder names.
Now, if the user specifies a folder that is not empty (probably a previous output folder), the software will create new sub-folders on top of it, which is a mess. So you want to ensure the folder is empty before performing the export.
You can easily detect whether the folder is empty and alert the user if it is not. But if the user says "go ahead and do it", you have to act. Now, what if you deleted some of the files and failed on others?
Going ahead in this case just creates a worse mess. At the same time, the user would not expect a damaged folder with nothing working at all. So it is better to either give them a fully working output or not change the previous output at all.
As per the comments, here is pseudocode for the process you can follow when writing the code:
Clear contents of cache folder if any files exist (they shouldn't)
Copy contents of destination folder to cache folder
Try
    While files exist, iterate
        Delete file
    End While
Catch
    While files exist in cache, iterate
        If file does not exist in destination folder
            Move file from cache to destination
        Else
            Delete file from cache
        End If
    End While
End Try
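For what it's worth, a rough C# sketch of that pseudocode (files only, matching the steps above; cacheFolder is an assumed staging directory, ideally on the same volume as the destination, and the method name is made up):

using System.IO;

static bool TryEmptyFolder(string destinationFolder, string cacheFolder)
{
    // 1. Clear the cache folder if anything is left over from a previous run.
    if (Directory.Exists(cacheFolder))
        Directory.Delete(cacheFolder, true);
    Directory.CreateDirectory(cacheFolder);

    // 2. Copy the destination contents into the cache so they can be restored.
    foreach (var file in Directory.GetFiles(destinationFolder))
        File.Copy(file, Path.Combine(cacheFolder, Path.GetFileName(file)));

    try
    {
        // 3. Delete the originals.
        foreach (var file in Directory.GetFiles(destinationFolder))
            File.Delete(file);
        return true;
    }
    catch
    {
        // 4. Restore anything that is now missing from the destination.
        foreach (var file in Directory.GetFiles(cacheFolder))
        {
            var target = Path.Combine(destinationFolder, Path.GetFileName(file));
            if (!File.Exists(target))
                File.Move(file, target);
            else
                File.Delete(file);
        }
        return false;
    }
}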
By following the guidelines given in the comments, I came up with this solution.
The following code attempts to move everything into a temporary folder inside the given folder. If that succeeds, it returns true. If it fails, the catch block tries to move everything back and returns false. In either case, the finally block removes the temporary folder recursively.
using System.IO;
using System.Linq;

public static bool EmptyFolderTransactionaly(string folder)
{
    var directoryInfo = new DirectoryInfo(folder);
    // Path.GetRandomFileName() gives a unique name without creating a stray file in %TEMP%
    var tmpDir = Directory.CreateDirectory(Path.Combine(folder, Path.GetRandomFileName()));
    try
    {
        // move every file, then every sub-directory (except the temporary one), into tmpDir
        foreach (var e in directoryInfo.EnumerateFiles())
        {
            e.MoveTo(Path.Combine(tmpDir.FullName, e.Name));
        }
        foreach (var e in directoryInfo.EnumerateDirectories().Where(e => e.Name != tmpDir.Name))
        {
            e.MoveTo(Path.Combine(tmpDir.FullName, e.Name));
        }
        return true;
    }
    catch
    {
        // something could not be moved: put everything that did move back where it was
        foreach (var e in tmpDir.EnumerateDirectories())
        {
            e.MoveTo(Path.Combine(directoryInfo.FullName, e.Name));
        }
        foreach (var e in tmpDir.EnumerateFiles())
        {
            e.MoveTo(Path.Combine(directoryInfo.FullName, e.Name));
        }
        return false;
    }
    finally
    {
        // on success this deletes the moved contents; on failure it removes the (now empty) temp folder
        tmpDir.Delete(true);
    }
}
Let me know if you see any risks in the code.
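For context, a call site might look like this (the export path is made up):

if (EmptyFolderTransactionaly(@"C:\exports\latest"))
{
    // the folder is now empty; run the timestamped export here
}
else
{
    // nothing was changed; tell the user why the export was aborted
}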
Until now, all the cases I have dealt with were like this:
try to store the file
if the file is stored, that's fine
if the file cannot be stored, report it (no path, no permission, etc.)
Now I'm facing the problem of how to check whether a file can be stored without actually storing it. I don't even know what to ask Google, because all the results I get are about permissions in general, and on Linux, while I need this under C#/Windows.
How can such a check be done?
Since you don't want to attempt writing to the folder, you could consider this approach.
You can put a try catch around this:
System.Security.AccessControl.DirectorySecurity ds = Directory.GetAccessControl(folderPath);
i.e.
public bool CanWriteToPath(string folderPath)
{
try
{
var ds = Directory.GetAccessControl(folderPath);
return true;
} catch (UnauthorizedAccessException)
{
return false;
}
}
It will only succeed if you have permission to read the folder's security information.
Sorry that it still has to use a try/catch.
How about using a try/catch block? If an exception occurs, you can catch it and determine that the file cannot be saved. If no exception occurs, the file was saved successfully.
There are Directory.GetAccessControl and File.GetAccessControl methods in the System.IO namespace.
If you want to know the status before actually writing the file, check whether the specified directory grants write permission to the user. If all is well, write the file. You could also use a try/catch block as suggested in the other answers.
I'm a bit confused here. I wrote the following script to add files of a certain extension type to a List, and it DOES work, just not for the root of C:\. Here's the code first...
// Create an empty list
List<string> scanFiles = new List<string>();
// Split possible extension list into array
string[] scanExtensions = @"exe,com".Split(',');
try
{
    foreach (string extension in scanExtensions)
    {
        // Add collection for this filetype to the list of files
        scanFiles.AddRange(Directory.GetFiles("C:\\", "*." + extension, SearchOption.AllDirectories));
    }
}
catch (Exception ex)
{
    Console.WriteLine("ERROR: " + ex.Message);
}
// Display results
foreach (string sf in scanFiles)
{
    Console.WriteLine(sf);
}
So if I run the above code, I get an error - but not the error I expect. It displays the following...
ERROR: Access to the path 'C:\Documents and Settings\' is denied.
I'd understand this if I hadn't specified 'C:\' as the directory path! If I change this to any valid directory (such as C:\Program Files), the code works fine. Can anyone explain this?
Thanks,
Simon
SearchOption.AllDirectories means your code will drill down into (forbidden) territory.
Better be prepared to handle this kind of error. For a solution without catching exceptions you'll need DirectoryInfo.GetFiles() to get FileInfo objects instead of strings and verify your access rights ahead of time.
But you will still need to handle exceptions (File/Dir not found) because of concurrency so you might as well forget about the FileInfos.
Well, the cause of the error message called "Access denied" is ... that you don't have access to that folder!
Try clicking on it in Windows Explorer. You will notice that, in fact, you can't access it. What a surprise ;-) The message told you exactly that.
SearchOption.AllDirectories means that GetFiles will recursively enumerate all files. If it hits an error somewhere it will throw an exception. There is no way to change that.
You cannot make Directory.GetFiles ignore access denied errors. So you have to code your own file-system enumeration code. It will probably be a recursive function with some error-handling code in it.
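For example, here is a minimal sketch of such an enumerator, which simply skips any directory it is not allowed to read (EnumerateFilesSafe is a made-up name):

using System;
using System.Collections.Generic;
using System.IO;

static IEnumerable<string> EnumerateFilesSafe(string root, string pattern)
{
    var pending = new Queue<string>();
    pending.Enqueue(root);

    while (pending.Count > 0)
    {
        string dir = pending.Dequeue();
        string[] files = null;

        try
        {
            files = Directory.GetFiles(dir, pattern);
            foreach (string sub in Directory.GetDirectories(dir))
                pending.Enqueue(sub);
        }
        catch (UnauthorizedAccessException)
        {
            // no right to list this directory; skip it and carry on
        }

        if (files != null)
            foreach (string f in files)
                yield return f;
    }
}

// usage, matching the original example:
// foreach (string extension in scanExtensions)
//     scanFiles.AddRange(EnumerateFilesSafe(@"C:\", "*." + extension));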
You're specifying SearchOption.AllDirectories which according to the documentation means
AllDirectories Includes the current directory and all the subdirectories in a search operation. This option includes reparse points like mounted drives and symbolic links in the search.
In other words, your search is recursive and walks down into Documents and Settings where you have no read permission.
I have an asp.net mvc app with a route that allows users to request files that are stored outside of the web application directory.
I'll simplify the scenario by just telling you that it's going to ultimately confine them to a safe directory to which they have full access.
For example:
If the user (whose ID is 100) requests:
http://mysite.com/Read/Image/Cool.png
then my app is going to append "Cool.png" to "C:\ImageRepository\Users\100\" and write those bytes to the response. The worker process has access to this path, but the anonymous user does not. I already have this working.
But will some malicious user be able to request something like:
http://mysite.com/Read/Image/..\101\Cool.png
and have it resolve to
"C:\ImageRepository\Customers\101\Cool.png"
(some other user's image?!)
Or something like that? Is there a way to make sure the path is clean, such that the user is constrained to their own directory?
How about
var fileName = System.IO.Path.GetFileName(userFileName);
var targetPath = System.IO.Path.Combine(userDirectory, fileName);
That should ensure you get a simple file name only; for example, Path.GetFileName(@"..\101\Cool.png") returns just "Cool.png", so the traversal prefix is discarded.
Perhaps you should verify that the path starts with the user's directory path?
e.g. "C:\ImageRepository\Customers\100\"
You should also normalize the paths to uppercase letters when comparing them.
The safest way, if it is an option (you are using windows auth), is to make it a non-issue by using Active Directory rights on the folders so it doesn't matter if the user attempts to access a directory that is not valid.
Absent that, store the files so that the path is abstracted from the user. That is, use whatever name the user provides as a lookup in a table that has the REAL path to the file.
Canonicalization protection is tricky business, and it is dangerous to try to outthink a potential attacker.
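To illustrate the lookup-table idea: the request never contains a path at all, only an opaque key that the server resolves internally (the id value and the ResolveRequestedFile name here are invented):

using System.Collections.Generic;
using System.IO;

// populated from a database or per-user table in a real application
static readonly Dictionary<string, string> FileIdToPath = new Dictionary<string, string>
{
    { "a1b2c3", @"C:\ImageRepository\Users\100\Cool.png" }
};

static string ResolveRequestedFile(string fileId)
{
    string realPath;
    if (!FileIdToPath.TryGetValue(fileId, out realPath))
        throw new FileNotFoundException("Unknown file id: " + fileId);
    return realPath;
}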
Using the Request.MapPath overload is one way to check this:
try
{
    // with allowCrossAppMapping set to false, a path that maps outside the application throws an HttpException
    string mappedPath = Request.MapPath(inputPath.Text, Request.ApplicationPath, false);
}
catch (HttpException)
{
    // do exception handling
}
You could also split the string on slashes and check that the username part matches.
To also be able to include a subdirectory in the path you can use:
string SafeCombine(string basePath, string path)
{
    // basePath should end with a directory separator; otherwise "C:\data2" would pass a check against "C:\data"
    string testPath = Path.GetFullPath(Path.Combine(basePath, path));
    if (testPath.StartsWith(basePath, StringComparison.OrdinalIgnoreCase))
        return testPath;
    throw new InvalidOperationException();
}