Removing file locks - C#

I need to recover from an error case where a file gets left in a locked state. How can I, in C#, tell this file to reset its locks? I should add that the file is opened by a third-party DLL, so I don't actually have access to the file handle.

Locking a file is the responsibility of the operating system (on behalf of the program that opens it). If a file is left in a locked state, it's really up to the OS to unlock it. This typically happens automatically when the process that opened the file exits.
There is, however, a really cool utility that I came across that will help. It's called Unlocker.

You could perhaps start a command-line process like net or psfile with something along the lines of:
System.Diagnostics.Process.Start("psfile", @"c:\myfile.txt -c");
You can get psfile here.
You could also use
net file ID /close
but that would require you to know the file ID, which would take a bit more work.
Untested but this should give you a starting point.

I would really consider finding another third-party DLL. Any system handling streams should properly respond to error conditions and not leave things like file locks in place.
Is it possible that the library does provide error-condition cleanup and you've just overlooked it? Try something like the following:
ThirdPartyObj thirdPartyObj = null;
try {
    thirdPartyObj = new ThirdPartyObj();
    // Some possible error-causing object actions
}
catch (Exception ex) {
    thirdPartyObj = null; // The object should close its resources
}

You have to close the file using .Close(). You need to make sure you still have a way of accessing the file object.
You can usually avoid this error by wrapping the code that does your file I/O in a try {} ... finally {} construct. In the finally {} block, call the Close method of your file object, preventing this condition. You can also use a using {} block when you create your files, which takes care of the problem as well.
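A minimal sketch of both patterns described above (the file name is illustrative):

```csharp
using System.IO;

class CloseExample
{
    static void Main()
    {
        // Pattern 1: try/finally guarantees Close() runs even if WriteLine throws.
        StreamWriter writer = new StreamWriter("log.txt");
        try
        {
            writer.WriteLine("first entry");
        }
        finally
        {
            writer.Close(); // releases the OS file lock
        }

        // Pattern 2: a using block compiles down to the same try/finally.
        using (StreamWriter w = new StreamWriter("log.txt", append: true))
        {
            w.WriteLine("second entry");
        } // w.Dispose() (and the lock release) happens here automatically
    }
}
```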


FileSystemWatcher and write completion

I am implementing an event handler that must open and process the content of a file created by a third-party application over which I have no control. I am warned by a note in "C# 4.0 in a Nutshell" (page 495) about the risk of opening a file before it is fully populated, so I am wondering how to manage this. To keep the load on the event handler to a minimum, I am considering having the handler simply insert the file names into a queue and then having a different thread manage the processing. But in any case, how can I make sure that the write is completed and the file is safe to read? The file size could be arbitrary.
Any ideas? Thanks.
A reliable way to achieve what you want might be to use FileSystemWatcher + NTFS USN journal.
Maybe more complicated than you expected, but FileSystemWatcher alone won't tell you for sure that the newly created file has been closed.
- First, use the FileSystemWatcher to know when a file is created. From there you have the complete file path, and are one or two P/Invokes away from getting the file's unique ID (which can help you track it during its whole lifetime).
- Then, read the USN journal, which tracks everything that occurs on your drive. Filter on entries corresponding to your new file's ID, and read the journal until reaching the entry with the 'Close' event.
From there, unless your file is manipulated in special ways (opened and closed multiple times by the application that generates it), you can assume it is safe to read it and do whatever you wanted to do with it.
A really great C# implementation of a USN journal parser is StCroixSkipper's work, available here:
http://mftscanner.codeplex.com/
If you are interested I can give you more help about USN journal, as I use it in my project.
Our workaround is to watch for a specific extension. While a file is being uploaded, its extension is ".tmp". When it's done uploading, it's renamed to have the proper extension.
Another alternative is to have the server try to move the file in a try/catch block. If the file isn't done being uploaded, the attempt to move it will throw an exception, so we wait and try again.
Realistically, you can't know. If the other application's "write" operation is to open the file denying write access to everyone else, write, and close the file when it's done, then when you get a notification you can simply open the file requesting write access; if that fails, you know the operation isn't complete. But if the "write" operation is to open the file, write, close the file, open the file again, write again, etc., then you're pretty much out of luck.
The best solution I've seen is to set a timer after the last notification. When the timer elapses, try to open the file for write--if you can, assume the "operation" is done and do what you need to do. If the open fails, assume the operation is still in progress and wait some more.
Of course, nothing is foolproof. Despite the above, another operation could start while you're doing what you want with the file and cause interaction problems.
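A minimal sketch of the open-for-write test described above, polling rather than using an actual timer (the interval, timeout, and method name are my own):

```csharp
using System;
using System.IO;
using System.Threading;

static class FileReady
{
    // Returns true once the file can be opened with an exclusive lock,
    // i.e. the writer has (probably) finished. Polls every 500 ms until
    // the timeout elapses.
    public static bool WaitForWriteAccess(string path, TimeSpan timeout)
    {
        DateTime deadline = DateTime.UtcNow + timeout;
        while (DateTime.UtcNow < deadline)
        {
            try
            {
                using (new FileStream(path, FileMode.Open,
                                      FileAccess.ReadWrite, FileShare.None))
                {
                    return true; // nobody else holds the file
                }
            }
            catch (IOException)
            {
                Thread.Sleep(500); // still locked; wait and retry
            }
        }
        return false; // gave up; the writer may still be busy
    }
}
```

As the answer notes, this is heuristic: another process can still grab the file right after the check succeeds.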

File Handling Issue

I am developing a tool in C#. At one point I start writing into an XML file continuously using my tool; when I suddenly restart my machine, that XML file gets corrupted. What is the reason, and how can I avoid it?
XmlDocument x = new XmlDocument();
x.Load(path);
// change the value of a node every time
x.Save(path);
x = null;
This is my code.
Use the "safe replace pattern". For example, to replace foo.txt
Write to foo.new
Move foo.txt to foo.old
Move foo.new to foo.txt
Delete foo.old
At any point, you have at least one complete, valid file.
(That helps if you want to write a new file periodically; for appending, I'd go with the answers suggesting that XML isn't the best way forward for you.)
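The four steps above map almost directly onto File.Replace, which performs the swap for you (the file names follow the example above):

```csharp
using System.IO;

class SafeReplace
{
    static void Main()
    {
        File.WriteAllText("foo.txt", "old contents");

        // Step 1: write the complete new version off to the side.
        File.WriteAllText("foo.new", "new contents");

        // Steps 2-4 in one call: foo.new becomes foo.txt, and the
        // previous foo.txt is kept as foo.old.
        File.Replace("foo.new", "foo.txt", "foo.old");

        // At every point, at least one complete, valid file existed.
    }
}
```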
Don't use XML.
XML has a syntax which doesn't lend itself well to continuous writing to the same file, as you always need a final end tag, which you can't write unless the file is complete (which it never is with log files, for example).
That means you will always get an invalid XML file when you cancel the writing prematurely (by killing the process or restarting the computer, etc.).
We had a similar situation a while ago and settled on YAML as a nice format which allows for simply appending to the file.
Check that your file is properly closed before the application shuts down.
Also, as someone has pointed out, an XML file must be properly ended with closing tags.
Additional details would also be useful, such as the code that you use to open, write and close the file.
The reason for your file getting corrupted is that due to a crash, you never closed it.
I remember solving an issue like that once with a file-overlapping flag, but that was in C++ using the CreateFile method.

Atomic modification of files across multiple networks

I have an application that is modifying 5 identical xml files, each located on a different network share. I am aware that this is needlessly redundant, but "it must be so."
Every time this application runs, exactly one element (no more, no less) will be added/removed/modified.
Initially, the application opens each xml file, adds/removes/modifies the element to the appropriate node and saves the file, or throws an error if it cannot (Unable to access the network share, timeout, etc...)
How do I make this atomic?
My initial assumption was to:
bool isAtomic = true;
foreach (var path in NetworkPaths)
{
    if (!File.Exists(path))
        isAtomic = false;
}
if (isAtomic)
{
    // Do things
}
But I can see that only going so far. Is there another way to do this, or a direction I can be pointed to?
Unfortunately, for it to be truly "atomic" isn't really possible. My best advice would be to wrap up your own form of transaction for this, so you can at least undo the changes.
I'd do something like check for each file - if one doesn't exist, throw.
Backup each file, save the state needed to undo, or save a copy in memory if they're not huge. If you can't, throw.
Make your edits, then save the files. If you get a failure here, try to restore from each of the backups. You'll need to do some error handling here so you don't throw until all of the backups were restored. After restoring, throw your exception.
At least this way, you'll be more likely to not make a change to just a single file. Hopefully, if you can modify one file, you'll be able to restore it from your backup/undo your modification.
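One way the check/backup/edit/restore sequence above might look as a sketch (the method and names are my own, and backups are kept in memory, so this assumes the files are not huge):

```csharp
using System;
using System.Collections.Generic;
using System.IO;

static class MultiFileEdit
{
    // Applies `edit` to every path; on any failure, restores all files
    // from in-memory backups. Not truly atomic, but it limits the chance
    // of leaving just one file modified.
    public static void EditAllOrRollBack(IList<string> paths,
                                         Func<string, string> edit)
    {
        var backups = new Dictionary<string, string>();
        foreach (string path in paths)
        {
            if (!File.Exists(path))
                throw new FileNotFoundException(path); // check first, then edit
            backups[path] = File.ReadAllText(path);    // undo state
        }

        try
        {
            foreach (string path in paths)
                File.WriteAllText(path, edit(backups[path]));
        }
        catch
        {
            // Restore every file before rethrowing; keep going even if
            // one restore fails, so the rest are still rolled back.
            foreach (var kv in backups)
            {
                try { File.WriteAllText(kv.Key, kv.Value); }
                catch { /* best effort */ }
            }
            throw;
        }
    }
}
```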
I suggest the following solution.
Try opening all files with a write lock.
If one or more fail, abort.
Modify and flush all files.
If one or more fail, roll the already modified ones back and flush them again.
Close all files.
If the rollback fails ... well ... try again, and try again, and try again ... and give up in an inconsistent state.
If you have control over all processes writing these files, you could implement a simple locking mechanism using a lock file. You could even perform write-ahead logging and record the planned change in the lock file. If your process crashes, the next one attempting to modify the files would detect the incomplete operation and could complete it before doing its own modification.
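A minimal sketch of such a lock file, relying on FileMode.CreateNew being exclusive (the file name and method are my own):

```csharp
using System.IO;

static class LockFile
{
    // Tries to take the lock by creating "<dir>/update.lock" exclusively.
    // FileMode.CreateNew fails if the file already exists, so only one
    // process at a time can win. Returns null if the lock is held.
    // DeleteOnClose removes the lock file when the stream is disposed.
    public static FileStream TryAcquire(string dir)
    {
        string path = Path.Combine(dir, "update.lock");
        try
        {
            return new FileStream(path, FileMode.CreateNew,
                                  FileAccess.Write, FileShare.None,
                                  4096, FileOptions.DeleteOnClose);
        }
        catch (IOException)
        {
            return null; // another process holds the lock
        }
    }
}
```

A write-ahead log would additionally write the planned change into this file before touching the real ones.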
I would introduce versioning of the files. You can do this easily by appending a suffix to the filename, e.g. a counter variable. The process for the writer is as follows:
- prepare the next version of the file
- write it to a temp file with a different name
- get the highest existing version number
- increment this version by one
- rename the temp file to the new file name
- delete old files (you can keep e.g. 2 of them)

As the reader you:
- find the file with the highest version
- read it
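The writer and reader steps above can be sketched like this (class, file naming scheme, and method names are my own; old-file cleanup is omitted for brevity):

```csharp
using System;
using System.IO;
using System.Linq;

static class VersionedFile
{
    // Writer: after "data.2.xml" exists, this publishes "data.3.xml"
    // via a temp file plus rename, so readers never see a partial file.
    public static void WriteNextVersion(string dir, string contents)
    {
        int next = HighestVersion(dir) + 1;
        string tmp = Path.Combine(dir, "data.tmp");
        File.WriteAllText(tmp, contents);                       // prepare off to the side
        File.Move(tmp, Path.Combine(dir, $"data.{next}.xml"));  // publish by rename
    }

    // Reader: picks the file with the highest version number.
    public static string ReadLatest(string dir)
    {
        int v = HighestVersion(dir);
        return File.ReadAllText(Path.Combine(dir, $"data.{v}.xml"));
    }

    static int HighestVersion(string dir)
    {
        // "data.3.xml" -> middle token "3"
        return Directory.GetFiles(dir, "data.*.xml")
            .Select(f => Path.GetFileName(f).Split('.')[1])
            .Select(int.Parse)
            .DefaultIfEmpty(0)
            .Max();
    }
}
```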

Waiting to get exclusive access of file before moving it in C#

I have a requirement to move certain files after they have been processed. Another process accesses the files and I am not sure when it releases them. Is there any way I can find out when the handle to a file has been released, so I can move it at that time?
I am using Microsoft C# and .Net framework 3.5.
Cheers,
Hamid
If you have control of both the producer of the file and the consumer, the old trick to use is create the file under a different name, and rename it once complete.
For example, say the producer is creating files always called file_.txt, and your consumer is scanning for all files beginning file_, then the producer can do this:
1. Create the file called tmpfile_.txt
2. When the file is written, the producer simply renames the file to file_.txt
The rename operation is atomic, so once your consumer sees it's available, it is safe to open.
Of course, this answer depends on if you are writing both programs.
HTH
Dermot.
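The producer side of the trick above might look like this (the names follow the answer's file_ convention; the 001 suffix is illustrative):

```csharp
using System.IO;

class Producer
{
    static void Main()
    {
        // 1. Write under a name the consumer's "file_*" filter won't match.
        File.WriteAllText("tmpfile_001.txt", "payload");

        // 2. Atomically publish it under the watched name. The consumer
        //    can never observe a half-written file_001.txt.
        File.Move("tmpfile_001.txt", "file_001.txt");
    }
}
```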
Just continually try to open the file for exclusive writing (e.g. pass FileShare.None to the FileStream constructor)? Once you have opened it, you know no one else is using it. However, this might not be the best way to do what you're doing.
If you're after two way communication, see if the other program can be talked to via a pipe.
If you have control of both of the sources, use a named mutex (which works across processes) to control access to the files rather than locking the file at the filesystem level. This way, you don't have to catch the exception raised by attempting to lock a locked file and loop on that, which is rather inelegant.
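A sketch of the named-mutex approach, assuming both processes call the same helper (the mutex name, method, and file are illustrative):

```csharp
using System;
using System.IO;
using System.Threading;

static class SharedFileWriter
{
    // Appends a line while holding a named mutex. Named mutexes are
    // visible across processes, so two programs using the same name
    // never touch the file at the same time.
    public static void AppendWithLock(string path, string line)
    {
        using (var mutex = new Mutex(false, "MyAppFileMutex"))
        {
            mutex.WaitOne();          // blocks until the other process releases
            try
            {
                File.AppendAllText(path, line + Environment.NewLine);
            }
            finally
            {
                mutex.ReleaseMutex(); // always release, even on error
            }
        }
    }
}
```

No exception-and-retry loop is needed: WaitOne simply blocks until the lock is free.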

Waiting for a file to be created in C#

Is there a built-in method for waiting for a file to be created in C#? How about waiting for a file to be completely written?
I've baked my own by repeatedly attempting File.OpenRead() on a file until it succeeds (and failing on a timeout), but spinning on a file doesn't seem like the right thing to do. I'm guessing there's a baked-in method in .NET to do this, but I can't find it.
What about using the FileSystemWatcher component?
This class 'watches' a given directory or file, and can raise events when something (you can define what) has happened.
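A sketch of wrapping FileSystemWatcher's Created event into a blocking call, which matches the asker's "wait for a file" shape (the helper and its name are my own):

```csharp
using System.IO;
using System.Threading;

static class Watcher
{
    // Blocks until a file appears in `dir` (or the timeout elapses).
    // Returns the new file's full path, or null on timeout.
    public static string WaitForNewFile(string dir, int timeoutMs)
    {
        using (var fsw = new FileSystemWatcher(dir, "*.*"))
        {
            string created = null;
            var signal = new ManualResetEventSlim(false);
            fsw.Created += (s, e) => { created = e.FullPath; signal.Set(); };
            fsw.EnableRaisingEvents = true; // start raising events
            signal.Wait(timeoutMs);         // block until Created fires
            return created;
        }
    }
}
```

Note this only signals creation; as the answers below discuss, the file may still be being written at that point.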
When creating a file with File.Create you can just call the Close method.
Like this:
File.Create(savePath).Close();
FileSystemWatcher can notify you when a file is created, deleted, updated, has its attributes changed, etc. It will solve your first issue of waiting for the file to be created.
As for waiting for it to be written: when a file is created, you can spin off and start tracking its size, wait for it to stop being updated, then add in a settling period. You can also try to get an exclusive lock, but be careful of locking the file if the other process is also trying to lock it... you could cause unexpected things to occur.
FileSystemWatcher cannot monitor network paths. In such instances, you have to manually "crawl" the files in a directory -- which can result in the error the above user describes.
Is there an alternative, so that we can be sure we don't open a file before it has been fully written to disk and closed?
