C# - Blocking a folder from being changed while processing

I have a single-threaded program that processes folders and files in a source directory. Is there a way to block a folder, with files in it, within my source directory from being modified by other processes while my program is working on it? I'm thinking something along the lines of placing some kind of exclusive lock on the folder itself, so only my program's process can use it.
NOTE:
I do not want to block the root source directory itself, just whatever folder(s), in the top level of that directory, I might be processing at any particular moment. I still want to be able to allow outside processes to add folders to the source directory, while I'm processing other folders.
UPDATE:
@Yuri - Yes, this is a Windows program, a Windows Service application to be exact.
Part of what makes this both challenging and necessary is that I need to recreate the structure of whatever folder(s) I'm processing in the source directory in a separate destination directory. So I can't have any other process modifying the folder(s) and file(s) while my program is working with them.

Ways to lock a folder:
1. Mark it readonly. (Included because people always try this.)
Con - This does not actually work. On folders, the readonly flag is used for other purposes.
2. Mark all of the contents readonly.
Con - Won't prevent certain types of actions (e.g., creating new items).
3. Use ACLs.
Con - Won't prevent administrators from messing with the folder. Requires certain types of permissions.
4. Use ACLs with a specially created user and enable folder encryption.
Con - This is really sort of horrifying. If anything goes wrong, your data might be lost.
5. Rename/move the folder.
Con - Can be bypassed by user stupidity.
6. (Edit) gjvdkamp's answer: lock individual files.
Con - As with #2, this still allows creating new files within the folders. That said, this is The Right Way™ to do it.

Edit: don't use this hack, it doesn't work:
http://itknowledgeexchange.techtarget.com/itanswers/folder-lock/
Otherwise I would loop over all files in the folder, create a List<FileStream> and then call Lock on each. This would give you finer-grained control over which files to lock.
GJ
Edit: Try something along these lines:
var locks = new List<FileStream>();
var di = new DirectoryInfo(@"C:\Test");
foreach (var file in di.GetFiles())
{
    // Open each file exclusively so no other process can read or write it.
    var fs = new FileStream(file.FullName, FileMode.Open, FileAccess.ReadWrite, FileShare.None);
    fs.Lock(0, fs.Length); // lock the whole file rather than zero bytes
    locks.Add(fs);
}
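When the folder has been processed, disposing the streams is what releases the locks again; roughly:

// Release everything once processing of the folder is finished.
foreach (var fs in locks)
{
    fs.Dispose(); // closing the stream releases the lock and the exclusive handle
}
locks.Clear();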

Why don't you take a new copy of the folder, work on that one, then replace the old one if necessary?
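For what it's worth, a minimal sketch of what the copy step might look like (CopyDirectory is a hypothetical helper and the paths are made up):

using System.IO;

static void CopyDirectory(string sourceDir, string destDir)
{
    Directory.CreateDirectory(destDir);

    // Copy the files at this level...
    foreach (string file in Directory.GetFiles(sourceDir))
    {
        File.Copy(file, Path.Combine(destDir, Path.GetFileName(file)), true);
    }

    // ...then recurse into each subdirectory.
    foreach (string dir in Directory.GetDirectories(sourceDir))
    {
        CopyDirectory(dir, Path.Combine(destDir, Path.GetFileName(dir)));
    }
}

// Usage: work on the copy, then swap it in if needed.
// CopyDirectory(@"C:\Source\Job1", @"C:\Work\Job1");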

I guess this might help you http://www.codeproject.com/KB/files/Unique_Folder_Protection.aspx
What the article basically does is change the extension of the folder to one of the following
".{2559a1f2-21d7-11d4-bdaf-00c04f60b9f0}"
".{21EC2020-3AEA-1069-A2DD-08002B30309D}"
".{2559a1f4-21d7-11d4-bdaf-00c04f60b9f0}"
".{645FF040-5081-101B-9F08-00AA002F954E}"
".{2559a1f1-21d7-11d4-bdaf-00c04f60b9f0}"
".{7007ACC7-3202-11D1-AAD2-00805FC1270E}"
And stores the password inside the folder
Hope this helps
Alex
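If all the article really does is rename the folder, the core of it might be as simple as something like this (the path is made up; note this only fools Explorer and does not stop other programs from opening the files):

using System.IO;

// Appending a shell CLSID to the folder name makes Explorer treat it as a
// special folder; the files remain fully accessible to other programs.
string folder = @"C:\Protected"; // made-up path
Directory.Move(folder, folder + ".{21EC2020-3AEA-1069-A2DD-08002B30309D}");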

Related

Is it considered good/acceptable practice to save a file in the temporary directory?

I am developing a WinForms application using C# 3.5. I have a requirement to save a file on a temporary basis. Let's just say, for argument's sake, that it's for a short duration of time while the user is viewing a particular tab in the app. After the user navigates away from the tab I am free to delete this file. Each time the user navigates to the tab (which is typically only done once), the file will be created (using a GUID name).
To get to my question - is it considered good practice to save a file to the temp directory? I'll be using the following logic:
Path.GetTempFileName();
My intention would be to create the file and leave it without deleting it. I'm going to assume here that the Windows OS cleans up the temp directory at some interval based on % of available space remaining.
Note: I had considered using the IsolatedStorage option to create the file and manually delete the file when I was finished using it, i.e. when the user navigates away from the tab. However, it's not going so well, as I have a requirement to get the absolute or relative path to the file and this does not appear to be a straightforward/safe chore when interacting with IsolatedStorage. My opinion is that it's just not designed to allow this.
I write temp files quite frequently. In my humble opinion the key is to clean up after oneself by deleting unneeded temp files.
In my opinion, it's a better practice to actually delete the temporary files when you don't need them. Consider the following remarks from the Path.GetTempFileName() documentation:
The GetTempFileName method will raise an IOException if it is used to create more than 65535 files without deleting previous temporary files.
The GetTempFileName method will raise an IOException if no unique temporary file name is available. To resolve this error, delete all unneeded temporary files.
Also, you should be aware of the following hotfix for Windows 7 and Windows Server 2008 R2.
Creating temp files in the temp directory is fine. It is considered good practice to clean up any temporary file when you are done using it.
Remember that temp files shouldn't persist any data you need on a long-term basis (defined as across user sessions). Examples of data needed "long term" are user settings or a saved data file.
Go ahead and save there, but clean up when you're done (closing the program). Keeping them until the end also allows re-use.
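A minimal sketch of that create-then-clean-up pattern, assuming the file only needs to live while the tab is open:

using System.IO;

// When the tab is opened: get a unique temp file and write to it.
string tempPath = Path.GetTempFileName();
File.WriteAllText(tempPath, "data shown on the tab"); // placeholder content

// When the user navigates away (or the app closes): delete it again.
if (File.Exists(tempPath))
{
    File.Delete(tempPath);
}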

How to debug an issue deleting a directory using File.Delete and Directory.Delete

This one is proving very difficult to debug. Let's start with my situation:
I have an ASP.NET MVC3 web app developed with C# using .NET 4. Part of the system allows a user to upload a zip file. This works fine, and the zip file is saved. There is also a Windows service which will periodically look for new zip files, extract them, do some work and then re-zip them back up. (I use System.IO.Compression for the zipping stuff.) This part all works fine; after the processing I end up with a structure something like:
Object1Folder
\_ Input.zip
\_ ExtractedFolder
\_ Output.zip
There is also a feature that allows the user to delete the item, and the requirement is to delete the object folder, in this case "Object1Folder". Because I need to delete all sub folders and files I have the following recursive function to do the work...
public static void DeleteDirectory(string directoryPath)
{
    string[] files = Directory.GetFiles(directoryPath);
    string[] directories = Directory.GetDirectories(directoryPath);

    foreach (string file in files)
    {
        File.SetAttributes(file, FileAttributes.Normal);
        File.Delete(file);
    }

    foreach (string directory in directories)
    {
        DeleteDirectory(directory);
    }

    Directory.Delete(directoryPath, true);
}
Which is initially called something like...
DeleteDirectory("C:\\Objects\\Object1Folder");
But it doesn't work! Firstly, there is no error thrown; the code appears to execute successfully, which is annoying. But the result is that only the "Input.zip" file is removed. The "ExtractedFolder" and "Output.zip" file remain.
As far as I can tell, the code is sound. I can't see that it doesn't attempt to delete the remaining file and folder. Unfortunately, we don't have VS installed on the target server so I cannot step through the code and check that it attempts to delete them. Please point it out if you can see this being a potential issue though.
My best guess so far is that it is a permissions issue of some sort. What is interesting (perhaps) is that when I go to manually clean up the problem (i.e. a Windows Explorer delete of "Object1Folder"), it says "Denied" and asks me to confirm with the admin-rights button thing that it does.
I know it's hard for you all to work the problem out, but I am looking for things that I should check to try and solve this problem. What kind of things should I ensure have correct permissions? What permissions do they need? Is there a good way for me to debug this issue? If something else has a hold of those files (maybe from the extraction process of the Windows service), how can I check if this is the problem?
Some information about that server that might help: It's running "Windows Server 2008 R2 Datacenter" with service pack 1 and is 64-bit. The web app is assigned a user that is a member of both "Users" and "IIS_IUSRS".
Let me know if any of you need some extra information.
Have a look at the event log on the server; you might find an exception/error message there.
You could consider using Directory.Delete(path, true) so you delete the folder and all its contents in one call (unless I do not understand your code correctly).
Have a look at files being in use. If a file is in use, the OS can't delete it. So make sure you are releasing all files correctly.
Finally, you can't force the files to be not in use, so you might end up writing a clean-up script that runs every night to delete unwanted files and folders.
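Until you can attach a debugger, something along these lines might at least surface the failure (the log path here is made up):

using System;
using System.IO;

string path = @"C:\Objects\Object1Folder";
try
{
    // One call removes the folder and everything underneath it.
    Directory.Delete(path, true);
}
catch (UnauthorizedAccessException ex)
{
    // Suggests an ACL/permissions problem for the app-pool or service account.
    File.AppendAllText(@"C:\Logs\delete-errors.log", ex + Environment.NewLine);
}
catch (IOException ex)
{
    // Usually means something underneath is still open, e.g. by the zip/extract code.
    File.AppendAllText(@"C:\Logs\delete-errors.log", ex + Environment.NewLine);
}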
You really should use a temp destination folder for your unpacked files.
When they have been used you can try deleting them and if that fails just leave them.
The permission to write to a non-temp folder should not really be given to a presentation app residing in IIS.

Copying .svn File?

I'm trying to automate some stuff in SVN, and part of that is copying a .svn file (the one that contains where a directory should update its files from) to a local drive.
I'm using the following code:
File.Copy(@"X:\SVN\.svn", @"C:\SVN\.svn", true);
And I get the following error:
Access to the path 'X:\SVN\.svn' is denied
Am I just not allowed to move these types of files around? I know by default they are hidden, so maybe that's what's going on. Or is there a way in C#, I can just create a new .svn file so I don't have to bother with permissions and the like?
Thanks!
Last time I checked, .svn was a directory. You probably can't copy a directory with File.Copy, or the directory is in use (sharing lock).
Have a look at Process Explorer's Find function (in the Sysinternals Suite) to find out which process uses it (perhaps TortoiseSVN?).
Perhaps schedule your replication using RichCopy, which is a RoboCopy clone from Microsoft that will copy permissions, do incremental updates, etc.
Typically this is because you have Tortoise running, and the Tortoise cache has the file locked. If you shut it down in Task Manager you can do whatever you'd like with it.
That said, what is the purpose of copying it? Are you cloning the project?
If you are on Windows and using TortoiseSVN, then the .svn is actually a directory so you can't use File.Copy on it.
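A quick way to check which case you are dealing with from code might look like this:

using System.IO;

string path = @"X:\SVN\.svn";

// .svn is a directory in this layout, so File.Copy will always fail on it.
if ((File.GetAttributes(path) & FileAttributes.Directory) == FileAttributes.Directory)
{
    // copy the directory's contents with a recursive copy instead
}
else
{
    File.Copy(path, @"C:\SVN\.svn", true);
}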

How can I delete a file that is in use by another process?

When I try to delete a file, the following exception occurs:
The process cannot access the file '' because it is being used by another process.
My code looks like:
string[] files = Directory.GetFiles(@"C:\SEDocumentConverter\SOURCE");
foreach (string file in files)
{
    File.Delete(file);
}
How can I solve this problem?
There is no way to delete a file that's currently being used by another process. You have to close whatever program has that file open first, before you can delete it.
If you don't already know which program that is, you can figure it out using Handle or Process Explorer.
You can P/Invoke the Windows MoveFileEx function, and use the MOVEFILE_DELAY_UNTIL_REBOOT flag, with a NULL destination name. This will delete the file when you reboot.
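A rough sketch of what that P/Invoke could look like (the wrapper name is mine; the delayed delete requires administrative or LocalSystem rights):

using System;
using System.Runtime.InteropServices;

static class PendingDelete
{
    private const int MOVEFILE_DELAY_UNTIL_REBOOT = 0x4;

    [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    private static extern bool MoveFileEx(string lpExistingFileName, string lpNewFileName, int dwFlags);

    // Passing a null destination with MOVEFILE_DELAY_UNTIL_REBOOT schedules the
    // file for deletion at the next reboot.
    public static bool DeleteOnReboot(string path)
    {
        return MoveFileEx(path, null, MOVEFILE_DELAY_UNTIL_REBOOT);
    }
}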
If the file is being used, you're out of luck in trying to delete it. I can't tell you based on your code what process might be using the file(s), but try looking at any of the other questions that show up as related to this one for guidance regarding this issue, and by all means follow the guidance from @Cody Gray about using Process Explorer.
Slightly off topic, but it seems from your code that you are trying to delete all the files in your folder.
Well, instead of deleting them one by one, there is another method, Directory.Delete(path, true), which will delete the directory at the path contained in the string named path. Then you may recreate the directory if you want. But your problem may persist here too.
Another way is to find all open handles to the file and close them forcibly.
Works nicely for you, badly for any apps which were using the file.
You could try that in the UI with Sysinternals Process Explorer.
Just rename the file. This will do the trick for whoever tries to write to that location.
Notes:
1) Of course the file is not deleted physically yet. It's nice to follow up with the MoveFileEx trick mentioned elsewhere here to complete the job.
2) If you want to delete a locked file to write something new in its place (e.g. during a build), just rename the file to a GUID name. If you need the folder to be clean, either use an ignored extension / hidden attribute, or rename the file to a path under %TEMP% (if on the same drive).
3) Not all locked files can be renamed, but it works for me in something like 90% of practical cases. You can move a file without affecting an open read/write/execute handle; it will continue working with the moved file just fine (if moved within the same NTFS volume, of course).
4) That's what Windows Installer basically does before it asks you to please reboot sometime soon: move the file away from your eyes and schedule it to be removed upon reboot. Usually the newly installed app can be used right away.
Practical use:
My favorite is with MSBuild. Overriding the <Copy/> task with this trick makes the whole build behave the Linux way. You don't care if a previous version is still running somewhere; you can still build & run. The old app keeps using the old version of the files. The new app loads the newly written version.
Moving to %TEMP% might work if it's on the same drive (not my case though). I'd just rename them to an extension which is ignored by the current source control client.
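For illustration, the rename step might look like this (the path is made up; as note 3 says, it only works when whatever holds the file open allows a rename, which loaded EXEs/DLLs do):

using System;
using System.IO;

string lockedFile = @"C:\App\bin\MyApp.dll"; // made-up locked file

// Park the locked file under a throwaway name so the original path is free;
// whatever has it open keeps using the renamed copy.
string parked = lockedFile + "." + Guid.NewGuid().ToString("N") + ".old";
File.Move(lockedFile, parked);

// The new version can now be written to lockedFile; the parked copy can be
// scheduled for deletion on reboot with the MoveFileEx trick above.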

C# FileSystemWatcher lock folder

I'm trying to monitor a folder using C# and FileSystemWatcher. Everything works well, except for the fact that I can delete the folder I'm actually watching.
I used to do this in C using ReadDirectoryChangesW, by creating a handle to the folder and locking it, which prevented the user from deleting or renaming that folder (I'm talking about the actual monitored folder, not its contents).
Is there any way to lock that folder so people don't delete it while it's being watched?
(Note that I don't want to change permissions on the folder because it might be on a FAT32 partition/USB drive/etc., which doesn't support permissions.)
Not sure if that's an option, but you could create a (temporary) file in said folder and keep it open for the duration of the 'watch'. You'll need to clean it up again afterwards, of course. (You might even give it the hidden attribute so it doesn't show up for 'normal' users.)
Not the nicest solution, and the file will remain littering around if your program crashes before removing the file...
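A sketch of that idea (the sentinel file name is mine); using FileOptions.DeleteOnClose also covers the "littering around after a crash" worry, since the file disappears as soon as the handle is gone:

using System.IO;

string watchedFolder = @"C:\Watched";                          // the folder being monitored
string sentinel = Path.Combine(watchedFolder, ".watch.lock");  // made-up sentinel name

File.Create(sentinel).Dispose();                               // create the sentinel
File.SetAttributes(sentinel, FileAttributes.Hidden);           // hide it from 'normal' users

// Hold it open for the whole watch: the folder can't be deleted while a file
// inside it is open, and DeleteOnClose removes the sentinel once the handle
// is released (including when the process exits).
var holder = new FileStream(sentinel, FileMode.OpenOrCreate, FileAccess.ReadWrite,
                            FileShare.None, 4096, FileOptions.DeleteOnClose);

// ... run the FileSystemWatcher ...
// holder.Dispose(); // when you stop watching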
