This is similar to this question, but with one more requirement:
Deleting files can fail for all sorts of reasons, so I want the operation to be "transacted", meaning the whole operation either succeeds completely or fails without changing anything at all.
In general this is very hard, and I can't see any way to recover if the physical hard drive suddenly breaks. So a weakened requirement would be: if the operation succeeds, we are done; if it fails, restore everything to the original state where possible.
Some kinds of errors I can think of are:
Access violations. You are simply not allowed to delete some files or folders. This is the case I most want to handle.
A file or folder is in use by somebody else and so it is "locked". On Linux this is not a problem, but on Windows it is. This should also be handled.
If it is a network folder, there could be network issues. Recovery can be hard or impossible. I would not expect this kind of error to be handled properly.
Hardware failure. I don't think any recovery is possible here.
Scenario
You have software that can export its internal data. The result is a folder containing timestamped sub-folders.
Now, if the user specifies a folder that is not empty (probably a previous output folder), the software will create new sub-folders on top of it, which is a mess. So you want to ensure the folder is empty before performing the export.
You can easily detect whether the folder is empty and alert the user if it is not. But if the user says "go ahead and do it", you have to do something. Now, what if you deleted some of the files and then failed on others?
Going ahead in that case just creates a worse mess. At the same time, the user would not expect a damaged folder with nothing working in it. So it is better to either give them a fully working output or leave the previous output untouched.
As per the comments, here is pseudocode for the process you can follow when writing the code (a C# sketch of it follows the pseudocode):
Clear contents of cache folder if any files exist (they shouldn't)
Copy contents of destination folder to cache folder
Try
    While files exist, iterate
        Delete file
    End While
Catch
    While files exist in cache, iterate
        If file does not exist in destination folder
            Move file from cache to destination
        Else
            Delete file from cache
        End If
    End While
End Try
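For reference, here is a minimal C# sketch of that pseudocode. It only handles top-level files (sub-folders are omitted for brevity), and the cache folder path is an assumption supplied by the caller:

using System;
using System.IO;

static class FolderCleaner
{
    // Minimal sketch of the pseudocode above: back up, delete, restore on failure.
    public static bool TryEmptyFolder(string destinationFolder, string cacheFolder)
    {
        // Clear contents of the cache folder if any files exist (they shouldn't).
        if (Directory.Exists(cacheFolder))
            Directory.Delete(cacheFolder, true);
        Directory.CreateDirectory(cacheFolder);

        // Copy contents of the destination folder to the cache folder.
        foreach (var file in Directory.GetFiles(destinationFolder))
            File.Copy(file, Path.Combine(cacheFolder, Path.GetFileName(file)));

        try
        {
            // Delete every file in the destination folder.
            foreach (var file in Directory.GetFiles(destinationFolder))
                File.Delete(file);
            return true;
        }
        catch
        {
            // Restore anything that is missing from the destination; drop the rest of the cache.
            foreach (var cached in Directory.GetFiles(cacheFolder))
            {
                var target = Path.Combine(destinationFolder, Path.GetFileName(cached));
                if (!File.Exists(target))
                    File.Move(cached, target);
                else
                    File.Delete(cached);
            }
            return false;
        }
    }
}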
By following the guidelines given in the comments, I came up with this solution.
The following code attempts to move everything into a temporary folder inside the given folder. If that succeeds, it returns true. If it fails, the catch block tries to move everything back and returns false. In either case, the finally block removes the temporary folder recursively.
public static bool EmptyFolderTransactionaly(string folder)
{
    var directoryInfo = new DirectoryInfo(folder);
    // Use the name of a freshly created temp file as a (hopefully) unique sub-folder name.
    var tmpDir = Directory.CreateDirectory(Path.Combine(folder, Path.GetFileName(Path.GetTempFileName())));
    try
    {
        // Move all files, then all sub-directories (except the temp folder itself), into the temp folder.
        foreach (var e in directoryInfo.EnumerateFiles())
        {
            e.MoveTo(Path.Combine(tmpDir.FullName, e.Name));
        }
        foreach (var e in directoryInfo.EnumerateDirectories().Where(e => e.Name != tmpDir.Name))
        {
            e.MoveTo(Path.Combine(tmpDir.FullName, e.Name));
        }
        return true;
    }
    catch
    {
        // Something could not be moved: try to move everything back out of the temp folder.
        foreach (var e in tmpDir.EnumerateDirectories())
        {
            e.MoveTo(Path.Combine(directoryInfo.FullName, e.Name));
        }
        foreach (var e in tmpDir.EnumerateFiles())
        {
            e.MoveTo(Path.Combine(directoryInfo.FullName, e.Name));
        }
        return false;
    }
    finally
    {
        // Remove the temp folder (on success this is what actually deletes the contents).
        tmpDir.Delete(true);
    }
}
Let me know if you see any risks in the code.
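For context, this is roughly how I intend to call it from the export routine (RunExport, ExportTo and outputFolder are placeholders, not part of the code above):

static void RunExport(string outputFolder)
{
    // Only export if the previous output could be cleared; otherwise the old
    // output should still be intact, so tell the user and stop.
    if (!EmptyFolderTransactionaly(outputFolder))
    {
        Console.WriteLine("Could not clear the output folder; previous output left as-is.");
        return;
    }

    ExportTo(outputFolder); // placeholder for the actual export call
}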
Related
I believed this to be really simple, but somehow I am making a mistake. I am trying to copy one folder to another location:
Directory.Move(SourcePath, Destinationpath )
This expression is failing. The exception thrown is "cannot create file that already exists"
Well, you say you want to "copy one folder to another".
Directory.Move() doesn't copy: as its name implies, it moves a directory. Take a look at the documentation on how to copy files:
.Net 4.0 (see here for an asynchronous approach)
.Net 4.5 (see here for an asynchronous approach)
The Move call is failing because a directory or file in the move operation already exists at the specified location (as the exception says). To fix this, you need to ensure that no file or directory exists at the destination. The easiest way is to first delete that path:
Do not run this function unless you are OK with unconditionally deleting data at DestinationPath.
static void MyMove(string sourcePath, string destPath)
{
    try
    {
        Directory.Delete(destPath, recursive: true);
    }
    catch
    {
        // Don't care if this fails. If the directory didn't exist, great; if it
        // can't be deleted, we'll still get an error from Move. Just try
        // Move at this point.
    }

    Directory.Move(sourcePath, destPath);
}
I have a function that checks every file in a directory and writes a list to the console. The problem is, I don't want it to include files that are currently being copied into the directory; I only want it to show the files that are complete. How do I do that? Here is my code:
foreach (string file in Directory.EnumerateFiles(@"C:\folder"))
{
    Console.WriteLine(file);
}
There's really no way to tell "being copied" vs "locked for writing by something". Relevant: How to check for file lock? and Can I simply 'read' a file that is in use?
If you want to simply display a list of files that are not open for writing, you can do that by attempting to open them:
foreach (string file in Directory.EnumerateFiles(@"C:\folder"))
{
    try
    {
        // Throws if the file is locked for exclusive access by another process.
        using (var stream = File.Open(file, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
            Console.WriteLine(file);
        }
    }
    catch
    {
        // File is in use.
        continue;
    }
}
However, there are lots of caveats:
Immediately after displaying the filename (end of the using block) the file could be opened by something else
The process writing the file may have used FileShare.Read which means the call will succeed, despite it being written to.
I'm not sure what exactly you're up to here, but it sounds like two processes sharing a queue directory: one writing, one reading/processing. The biggest challenge is that writing a file takes time, and so your "reading" process ends up picking it up and trying to read it before the whole file is there, which will fail in some way depending on the sharing mode, how your apps are written, etc.
A common pattern to deal with this situation is to use an atomic file operation like Move:
Do the (slow) write/copy operation to a temporary directory that's on the same file system (very important) as the queue directory
Once complete, do a Move from the temporary directory to the queue directory.
Since move is atomic, the file will either not be there, or it will be 100% there -- there is no opportunity for the "reading" process to ever see the file while it's partially there.
Note that if you do the move across file systems, it will act the same as a copy.
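For illustration, a minimal sketch of that write-then-move pattern might look like this (the directory names are assumptions, both directories must be on the same volume, and the file name is assumed to be unique per item):

using System;
using System.IO;

static class QueueWriter
{
    public static void WriteToQueue(string queueDir, string tempDir, string fileName, byte[] payload)
    {
        Directory.CreateDirectory(tempDir);
        Directory.CreateDirectory(queueDir);

        // Do the slow write into the temporary directory first.
        string tempPath = Path.Combine(tempDir, fileName);
        File.WriteAllBytes(tempPath, payload);

        // The move is atomic on the same volume, so the reading process either
        // sees the complete file or no file at all.
        File.Move(tempPath, Path.Combine(queueDir, fileName));
    }
}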
There's no "current files being copied" list stored anywhere in Windows/.NET/whatever. Probably the most you could do is attempt to open each file for append and see if you get an exception. Depending on the size and location of your directory, and on the security setup, that may not be an option.
There isn't a clean way to do this, but this... works...
foreach (var file in new DirectoryInfo(@"C:\Folder").GetFiles())
{
    try
    {
        // Throws if the file cannot be opened for reading; dispose the stream right away.
        using (file.OpenRead()) { }
    }
    catch
    {
        continue;
    }
    Console.WriteLine(file.Name);
}
In my app, I have built a system where users can create picture galleries. Photos are held on disk in folders in the format category_name/gallery_name/{pictures}. Each uploaded photo is stored under the relevant directory structure as given above.
When trying to delete a category though, as well as deleting it from the database, I want to delete the relevant folders from the file system too. When I first received the error message "Directory is not empty", I searched and found this solution:
public static void DeleteDirectory(string target_dir)
{
    string[] files = Directory.GetFiles(target_dir);
    string[] dirs = Directory.GetDirectories(target_dir);

    foreach (string file in files)
    {
        File.SetAttributes(file, FileAttributes.Normal);
        File.Delete(file);
    }

    foreach (string dir in dirs)
    {
        DeleteDirectory(dir);
    }

    Directory.Delete(target_dir, false);
}
With this solution, the photos in the "gallery_name" folder get deleted just fine, then the gallery_name folder itself gets deleted fine, so we are now left with an empty category_name folder. Then the last bit of code in the above subroutine (Directory.Delete(target_dir, false);) is called to delete the category_name folder, and the error is raised again.
Does anyone know a solution to this?
Directory.Delete(target_dir, true); did not work, which is why I tried this alternative.
I have full control over the parent folder and the category_name and gallery_name folders are also created programmatically without any problem.
As I mentioned, the sub directories (gallery_name folders) and their contents (photos) are deleted with this code just fine. It is the category_name folder which causes the error, even though after this code, it is just an empty folder.
The exception message I get is:
System.IO.IOException was unhandled by user code
HResult=-2147024751
Message=The directory is not empty.
Source=mscorlib
StackTrace:
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.Directory.DeleteHelper(String fullPath, String userPath, Boolean recursive, Boolean throwOnTopLevelDirectoryNotFound)
at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive, Boolean checkHost)
at System.IO.Directory.Delete(String path)
at MyApp.PhotoGallery.Categories.deleteCategory(Int32 cID, String categoryname) in d:\Documents\My Dropbox\web projects\MyAppFolder\App_Code\BLL\PhotoGallery.vb:line 291
at _admhades_PhotoGallery.deleteCategory(Int32 categoryID, String categoryname) in d:\Documents\My Dropbox\web projects\HavadisPre\_admhades\PhotoGallery.aspx.vb:line 71
You can just use Directory.Delete(target_dir, true); to remove the directory and all files recursively. You don't have to write a custom function.
This works for me, even though I have File Explorer open:
public static void DeleteFilesAndFoldersRecursively(string target_dir)
{
    foreach (string file in Directory.GetFiles(target_dir))
    {
        File.Delete(file);
    }

    foreach (string subDir in Directory.GetDirectories(target_dir))
    {
        DeleteFilesAndFoldersRecursively(subDir);
    }

    Thread.Sleep(1); // This makes the difference between whether it works or not. Sleep(0) is not enough.
    Directory.Delete(target_dir);
}
This issue was driving me crazy. I was seeing the EXACT behavior of the poster when using Directory.Delete(dirPath, recursive: true); to delete a directory and its contents. And just like the poster, even though the call threw an exception saying "Directory is not empty.", it had actually deleted all the contents of the directory recursively BUT failed to delete the root directory of the path provided. Craziness.
In my case I found that the issue was brought on by whether the left nav tree in Windows Explorer showed the directory I was trying to delete as expanded. If so, some sort of queued or cached delete of the directory's contents seems to occur, causing the issue. This behavior can seem squirrely and unpredictable, because viewing the directory to be deleted in Windows Explorer does not cause the issue as long as the directory is not open in the left nav tree.
A few pictures are probably necessary to make this clear. Notice in the path below that the window is showing 1346. 1346 is a child directory of directory 1001. In this case a call to delete 1346 recursively will succeed, because it's not an issue that we are looking at 1346 in Windows Explorer per se.
But in the picture below, notice that we are looking at directory 1018, BUT in the left nav we have directory 1349 opened up (see arrow). THIS IS WHAT CAUSES THE ISSUE, at least for me. If in this situation we call Directory.Delete(dirPath, recursive: true); for the dirPath of directory 1349, it will throw a "Directory is not empty." exception. But if we check the directory after the exception occurs, we will find that it has deleted all the contents of the directory and it is in fact now empty.
So this seems very much like an edge-case scenario, but it's one we developers can run into, because when testing code we want to watch whether the folder gets deleted. And it's hard to understand what's triggering it, because it's about the left nav tree of Windows Explorer, not the main contents area of the window.
Anyway, as much as I dislike the code below, it does solve the problem for me in all cases:
// Delete the directory and its contents if the directory exists.
if (Directory.Exists(dirPath))
{
    try
    {
        Directory.Delete(dirPath, recursive: true); // throws if the directory doesn't exist
    }
    catch
    {
        // HACK: the recursive delete throws a "Directory is not empty." exception
        // after it has deleted all the contents of the directory if the directory
        // is open in the left nav of Windows Explorer's tree. This appears to be a
        // caching or queuing latency issue. Waiting 2 seconds for the recursive
        // delete of the directory's contents to take effect solved the issue for me.
        // Hate it I do, but it was the only way I found to work around the issue.
        Thread.Sleep(2000); // wait 2 seconds
        Directory.Delete(dirPath, recursive: true);
    }
}
I hope this helps others. It took quite a bit of time to track down and explain because it's really odd behavior.
This isn't as complete as I'd like, but these are things that have helped me in the past when I faced similar issues.
The file is in use by something else. This means you've created a folder/file and have not yet released it. Use .Close() (where applicable).
A lock related issue.
You have used the Directory.Delete(rootFolder, true) (where true means delete recursively) when there are no folders within the root folder specified.
It is open by another program. I have NO idea where to begin on this other than installing Process Monitor, which can help, but only really works in some situations.
Things like antivirus software, or even (on Windows) Defender, have caused this issue in the past.
When you call Directory.Delete and a file is open in such a way, Directory.Delete succeeds in deleting all files, but when Directory.Delete calls RemoveDirectory a "directory is not empty" exception is thrown, because there is a file marked for deletion but not actually deleted.
The obvious things include making sure you have permission to delete the folder (not just you, but the account the program runs under).
The file you are trying to delete is read-only. Clear the read-only attribute on all files in the folder first (see the sketch after this list).
The path is shared by other Windows components, so be careful where you are creating/deleting folders.
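To illustrate the read-only point above, a small sketch like this (names are my own) strips the ReadOnly attribute from every file before attempting the recursive delete:

using System.IO;

static class ForcedDelete
{
    public static void DeleteWritable(string targetDir)
    {
        // Clear read-only (and other) attributes so the recursive delete can succeed.
        foreach (var file in Directory.GetFiles(targetDir, "*", SearchOption.AllDirectories))
            File.SetAttributes(file, FileAttributes.Normal);

        Directory.Delete(targetDir, true);
    }
}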
In my case, I had created the directory with my program running as administrator. When I tried to delete the same directory without administrator rights, I got the "Directory is not empty" error.
Solution: Either run the application as administrator or delete it manually.
I'm a bit confused here. I wrote the following script to add files of a certain extension type to a List, and it DOES work, just not for the root of C:. Here's the code first...
// Create an empty list
List<string> scanFiles = new List<string>();

// Split possible extension list into an array
string[] scanExtensions = @"exe,com".Split(',');

try
{
    foreach (string extension in scanExtensions)
    {
        // Add collection for this file type to the list of files
        scanFiles.AddRange(Directory.GetFiles("C:\\", "*." + extension, SearchOption.AllDirectories));
    }
}
catch (Exception ex)
{
    Console.WriteLine("ERROR: " + ex.Message);
}

// Display results
foreach (string sf in scanFiles)
{
    Console.WriteLine(sf);
}
So if I run the above code, I get an error - but not the error I expect. It displays the following...
ERROR: Access to the path 'C:\Documents and Settings\' is denied.
I'd understand this if I hadn't specified 'C:\' as the directory path! If I change this to any valid directory (such as C:\Program Files), the code works fine. Can anyone explain this?
Thanks,
Simon
SearchOption.AllDirectories means your code will drill down into (forbidden) territory.
You'd better be prepared to handle this kind of error. For a solution without catching exceptions, you'll need DirectoryInfo.GetFiles() to get FileInfo objects instead of strings and verify your access rights ahead of time.
But you will still need to handle exceptions (file/directory not found) because of concurrency, so you might as well forget about the FileInfos.
Well, the cause of the error message called "Access denied" is ... that you don't have access to that folder!
Try clicking on it in Windows Explorer. You will notice that, in fact, you can't access it. What a surprise ;-) The message told you exactly that.
SearchOption.AllDirectories means that GetFiles will recursively enumerate all files. If it hits an error somewhere it will throw an exception. There is no way to change that.
You cannot make Directory.GetFiles ignore access-denied errors, so you have to write your own file-system enumeration. It will probably be a recursive function with some error-handling code in it; a rough sketch follows.
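Here is a rough sketch of such a hand-rolled enumeration (the method name and structure are my own): it recurses manually and silently skips any directory it cannot read.

using System;
using System.Collections.Generic;
using System.IO;

static class SafeFileEnumerator
{
    public static IEnumerable<string> SafeGetFiles(string root, string pattern)
    {
        string[] files = new string[0];
        string[] subDirs = new string[0];
        try
        {
            files = Directory.GetFiles(root, pattern);
            subDirs = Directory.GetDirectories(root);
        }
        catch (UnauthorizedAccessException) { /* no permission: skip this branch */ }
        catch (IOException) { /* vanished or otherwise unreadable: skip this branch */ }

        foreach (var file in files)
            yield return file;

        foreach (var dir in subDirs)
            foreach (var file in SafeGetFiles(dir, pattern))
                yield return file;
    }
}

// Usage (for the question's scenario):
//   foreach (string f in SafeFileEnumerator.SafeGetFiles("C:\\", "*.exe"))
//       Console.WriteLine(f);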
You're specifying SearchOption.AllDirectories which according to the documentation means
AllDirectories Includes the current directory and all the subdirectories in a search operation. This option includes reparse points like mounted drives and symbolic links in the search.
In other words, your search is recursive and walks down into Documents and Settings where you have no read permission.
My question is about handling temporary files in a small .NET program. This program/utility just does the following with two URLs:
Download each URL to a string (WebClient.DownloadString).
Save each string to a temporary file.
Call WinMerge (Process.Start) to diff the two files (in read-only mode for both files).
I currently have the Simplest Thing That Could Possibly Work:
save URL1 to windows/temp/leftFileToDiff.txt
save URL2 to windows/temp/rightFileToDiff.txt.
This works great: as WinMerge only needs to run in read-only mode, the files can be overwritten by running my program multiple times and nothing bad happens.
However, I would now like to change the temporary file names to something meaningful (related to the URL) so that I can see which is which in the WinMerge view. I also want to clean these files up when they are no longer needed. What are my options for this?
My next simplest idea is to have a designated folder where these are stored and just zap it every time my program exits. Is there a better/more elegant/standard way?
Thanks.
Create a Guid-based folder under the user's temp area and use that?
string path = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString("n"));
Directory.CreateDirectory(path);
try
{
    // work inside path
}
finally
{
    try { Directory.Delete(path, true); }
    catch (Exception ex) { Trace.WriteLine(ex); }
}