FileStream.Close() is not closing the file handle instantly - C#

I am reading a source file in chunks and passing each chunk to a WCF service, which writes it to a remote SMB share. I keep the FileStream open until all the data has been written, because opening and closing the file handle repeatedly hurts performance.
Once all the data is written, I call CloseHandle(). Then I may need to perform another operation on the same file by calling DoSomeOperation(). Even though I have already closed the file handle in CloseHandle(), DoSomeOperation() fails with the error "file is in use by another process". If I call DoSomeOperation() after a short delay, the problem does not occur.
Please help me close the file handle immediately when I call FileStream.Close().
This snippet is part of a larger program, so I can't post all of the code here.
//In WCF service
FileStream fs = null;

public void AppendBytes(string fileName, byte[] data, long position)
{
    try
    {
        if (fs == null) // On the first call, open the file handle
            fs = System.IO.File.Open(fileName, System.IO.FileMode.Append, System.IO.FileAccess.Write, System.IO.FileShare.None);
        fs.Write(data, 0, data.Length);
    }
    catch (Exception ex)
    {
        // Close the handle in case of error
        if (fs != null)
            fs.Close();
    }
}

public void CloseHandle()
{
    // Close the handle explicitly
    if (fs != null)
        fs.Close();
}

public void DoSomeOperation(string fileName)
{
    using (FileStream fsO = System.IO.File.Open(fileName, System.IO.FileMode.Append, System.IO.FileAccess.Write, System.IO.FileShare.None))
    {
        // Do something with the file here. This is an atomic operation,
        // so the FileStream is opened in a 'using' block to dispose it once the operation is over.
    }
}

//In client
public void CallerFunction()
{
    // Read data from sourceFile in chunks and copy it to the target file on another machine using WCF.AppendBytes
    WCF.AppendBytes(filename, data, pos);
    WCF.CloseHandle();
    WCF.DoSomeOperation(filename); // I get the error here that the file is in use by another process.
                                   // If I put a Thread.Sleep(1000) just before this statement, all works fine.
}
I have written a small test program to reproduce the same scenario in a console application: just call TestHandleClose() from Main(); it fails after some cycles.
static void TestHandleClose()
{
    int i = 0;
    try
    {
        if (File.Exists(@"d:\destination\file2.exe"))
            File.Delete(@"d:\destination\file2.exe");
        byte[] data = null;
        int blocksize = 10 * 1024 * 1024;
        for (i = 0; i < 100; i++)
        {
            using (FileStream fr = File.Open(@"d:\destination\File1.zip", FileMode.Open, FileAccess.Read, FileShare.None))
            {
                data = new byte[blocksize];
                fr.Read(data, 0, blocksize); // We read the source file each time but append the same data to the target file.
                using (FileStream f = File.Open(@"d:\destination\file2.exe", FileMode.Append, FileAccess.Write, FileShare.None))
                {
                    f.Write(data, 0, data.Length); // We write the same data multiple times.
                    f.Flush();
                    f.Close();
                }
            }
        }
        if (File.Exists(@"d:\destination\file2.exe"))
            File.Delete(@"d:\destination\file2.exe");
    }
    catch (Exception ex)
    {
        throw;
    }
}

The final sample code helped; I'm pretty sure I've found your problem.
The thing is, even the latest sample code fails to reproduce the issue for me. However, it shows another thing you probably missed: the file you're writing is an .exe. Why is that a problem? There are a couple of reasons, but one of them is that when you list the directory containing the .exe in Explorer, Explorer goes ahead and tries to read it (to get the icon). During that short window, the file cannot be opened with FileShare.None (and FileShare.Read probably will not help either, since whoever opened it quite likely didn't specify FileShare.ReadWrite).
So, yet again, the FileStream is closed just fine and works well (get rid of the Flush and Close calls, though; they waste performance and are useless inside a using block). The problem is that another process tries to read the file in the meantime. It might be a file manager, as in my case (Explorer, Total Commander, ...), a FileSystemWatcher you have somewhere, a virus scanner (though most virus scanners nowadays use shadow copies), something that automatically creates thumbnails for images, etc. The code you posted simply doesn't cause the problem by itself; it's someone else grabbing your file.
There are basically two options: either keep the file open the whole time you need it, or treat the IOException as temporary and retry a few times at a given interval. That's what you should be doing anyway, instead of relying on the happy path; most readers only allow concurrent reads, not writes.
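The retry option can be sketched like this (a minimal illustration; the RetryIo class and WithRetries method are made-up names for this example, not part of any library):

```csharp
using System;
using System.IO;
using System.Threading;

static class RetryIo
{
    // Runs an action, retrying when an IOException (e.g. a sharing violation
    // caused by a scanner or file manager) occurs, with a short delay between
    // attempts. The exception from the final attempt propagates to the caller.
    public static void WithRetries(Action action, int attempts = 5, int delayMs = 200)
    {
        for (int i = 1; ; i++)
        {
            try
            {
                action();
                return;
            }
            catch (IOException) when (i < attempts)
            {
                Thread.Sleep(delayMs);
            }
        }
    }
}
```

At the call site you would then write something like `RetryIo.WithRetries(() => WCF.DoSomeOperation(filename));`.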

In my case it was the FileShare mode that was missing. I was reading from and writing to the same file from multiple sources within a short period of time. Sometimes it worked and sometimes it blocked (another process was holding the file), even though the FileStream had been released and I was using a lock and a using block. Everything was solved when I added FileShare.ReadWrite:
using (FileStream fileStream = new FileStream(file, mode, access, FileShare.ReadWrite))
{
    ...

Related

Should I keep a file handler open between append writes?

I am working on a project involving data acquisition. One very important requirement is described like this:
At the beginning of the recording, a file must be created and its headers written;
As soon as data acquisition starts, the application should keep saving the collected data to the file periodically (typically once per second);
Writing consists of appending data blocks to the file, atomically if possible;
Should any error occur (program error, power failure), the file must remain valid up to the last write before the error.
So I plan to use a thread to watch for received data and write it to the file, but I don't know which practice is best (the code below is not real, it's just to give the idea):
First option: single open
using (var fs = new FileStream(filename, FileMode.CreateNew, FileAccess.Write))
    fs.Write(headers, 0, headers.Length);

using (var fs = new FileStream(filename, FileMode.Append, FileAccess.Write))
{
    while (acquisitionRunning)
    {
        Thread.Sleep(100);
        if (getNewData(out _someData))
        {
            fs.Write(_someData, 0, _someData.Length);
        }
    }
}
Second option: multiple opens
using (var fs = new FileStream(filename, FileMode.CreateNew, FileAccess.Write))
    fs.Write(headers, 0, headers.Length);

while (acquisitionRunning)
{
    Thread.Sleep(100);
    if (getNewData(out _someData))
    {
        using (var fs = new FileStream(filename, FileMode.Append, FileAccess.Write))
        {
            fs.Write(_someData, 0, _someData.Length);
        }
    }
}
The application is supposed to run on a client machine, and file access by other processes should not be a concern. What I am most concerned about is:
Does opening/closing the file multiple times impact performance (mind that the typical write interval is once per second)?
Which option is best for keeping the file intact in the event of a failure (explicitly including a power failure)?
Is either of these forms considered particularly good or bad practice, or can either be used depending on the specifics of the problem at hand?
A good way to preserve the file content in the event of a power outage is to flush the FileStream after each write. This makes sure the contents you just wrote to the stream are immediately written to disk.
As you've mentioned, other processes won't be accessing the file, so keeping it open won't complicate things, and it will also be faster. But keep in mind that if the app crashes, the lock will remain on the file, and you may need to handle that according to your scenario.
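A minimal sketch of such a durable append, assuming .NET 4.0 or later where the FileStream.Flush(bool flushToDisk) overload is available (the DurableAppend class name is made up for this example):

```csharp
using System.IO;

static class DurableAppend
{
    // Appends a data block and forces it to the physical disk.
    // Flush(true) asks the OS to write its buffers through to the device,
    // which is what matters for power-failure durability; a plain Flush()
    // only empties the .NET-level buffer into the OS cache.
    public static void AppendDurably(FileStream fs, byte[] block)
    {
        fs.Write(block, 0, block.Length);
        fs.Flush(flushToDisk: true);
    }
}
```

Note that flushing to disk on every one-second write costs some throughput, which is the trade-off the question is weighing.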

StreamWriter even with AutoFlush true leaves lines half written

I have a StreamWriter with AutoFlush = true. However, I still see the file only partially written when I open it at a random moment. I'm writing a file that needs to be valid (it's JSON) at any moment it is read.
var sw = new StreamWriter(@"C:\file.txt", true /* append */, Encoding.ASCII) { AutoFlush = true };
sw.WriteLine("....");
// long running (think of a logging application) -- thousands of seconds
sw.Close();
Between the sw.WriteLine() call and sw.Close(), I want to be able to open the file and always find it in the correct data format, i.e. every line complete.
Current idea:
Increase the internal buffer of the FileStream (and/or StreamWriter) to, say, 128 KB, and call Flush() on the FileStream every 128 KB - 1 bytes. This leads to my next question: right before calling Flush(), should I get the Stream.Position and lock that region with File.Lock(Position, 128 KB - 1)? Or does Flush() take care of that?
Basically, I don't want a reader to be able to read the contents between Flush() calls, because they may be partially broken.
using (StreamWriter sw = new StreamWriter("FILEPATH"))
{
    sw.WriteLine("contents");
    // If you open the file now, you may see partially written lines,
    // since the StreamWriter is still working on it.
}
// Access the file now: the StreamWriter has been properly closed and disposed.
If you need a "log-like" file that is never half-written, the way to go is not to keep it open.
Every time you want to write to the file, instantiate a new StreamWriter; disposing it flushes the contents as the file is released, like this:
private void LogLikeWrite(string filePath, string contents)
{
    using (StreamWriter streamWriter = new StreamWriter(filePath, true)) // 'true' appends to the file instead of overwriting its contents
    {
        streamWriter.Write(contents);
    }
}
This way your write operations will be flushed immediately.
If you are sharing the file between processes, you're going to have a race condition unless you introduce a locking mechanism of some kind. See https://stackoverflow.com/a/29127380/892327. This does require that you are able to modify both processes.
An alternative is to have process A wait for a file at a specified location. Process B writes to an intermediate file, and once B has flushed, the file is moved to the location where process A expects it, so that A can consume it.
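The write-then-move pattern can be sketched like this (AtomicWrite and WriteAtomically are illustrative names; the sketch assumes both paths are on the same volume, so the move is a rename and a reader at the final path sees either the old file or the complete new one, never a half-written one):

```csharp
using System.IO;

static class AtomicWrite
{
    // Writes the whole payload to a temporary file first, then moves it
    // into place. On NTFS, a move within the same volume is a rename,
    // so the file at finalPath is never observed half-written.
    public static void WriteAtomically(string finalPath, byte[] payload)
    {
        string tempPath = finalPath + ".tmp";
        File.WriteAllBytes(tempPath, payload);
        File.Delete(finalPath);          // no-op if the target does not exist
        File.Move(tempPath, finalPath);
    }
}
```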

How do I delete a file that was opened by xmlreader in c#?

I've researched several questions, but none of the answers I found has helped. The goal of this function is to modify an XML file: I read the original file and write both the old and the new content to a new file. All of this works perfectly. The problem arises when I'm done and need to delete the old file and move the new one into its place.
The error I receive is that jnv_config.xml is in use by another process (the reader's file).
Removing the Close and/or Dispose calls does not solve the problem.
using (XmlReader reader = XmlReader.Create("jnv_config.xml"))
using (XmlWriter writer = XmlWriter.Create("jnv_temp.xml"))
{
    writer.WriteStartDocument();
    while (reader.Read())
    {
        // Read the file, write to the other file - this part works perfectly.
        // No FileStreams nor anything else is created in here.
    }
    writer.WriteEndElement();
    writer.WriteEndDocument();
    reader.Close();
    writer.Close();
    reader.Dispose();
    writer.Dispose();
}
// Delete the old file and copy the new one
File.Delete("jnv_config.xml");
//File.Move("jnv_temp.xml", "jnv_config.xml");
I'm using VS2012 (NET 4.5), C#, Standard Windows Forms project.
Are you sure that it's this XmlReader that still has the file open? Have you tried using Process Explorer to confirm that there are no open file handles for the config file before this code executes?
Check whether the file is ready before you delete it. If you're working with large files, perhaps call the code in a loop for a couple of seconds.
private void IsFileOpen(FileInfo file)
{
    FileStream stream = null;
    try
    {
        stream = file.Open(FileMode.Open, FileAccess.ReadWrite, FileShare.None);
    }
    catch (Exception ex)
    {
        if (ex is IOException && IsFileLocked(ex))
        {
            // Do something here: either close the file if you have a handle, or as a last
            // resort terminate the owning process (which could cause corruption and data loss).
        }
    }
}

private static bool IsFileLocked(Exception exception)
{
    // Requires System.Runtime.InteropServices.
    // 32 = ERROR_SHARING_VIOLATION, 33 = ERROR_LOCK_VIOLATION
    int errorCode = Marshal.GetHRForException(exception) & ((1 << 16) - 1);
    return errorCode == 32 || errorCode == 33;
}
It has been my experience that many NTFS file-handling operations (especially deletes) are slightly asynchronous. Try adding a sleep or wait of at least 0.2 s before the rename.
Since that did not work, I would instead suggest putting the sleep/wait before it and slowly increasing the delay until it works. If you get to some unreasonably large time span (say, 10 seconds) and it still does not work, then I think you can fairly conclude that your XmlReader is not being released as long as you stay in this code.
In that case you may need to do something to ensure it is disposed completely, like forcing a GC run.

FileStream to save file then immediately unlock in .NET?

I have this code that saves a pdf file.
FileStream fs = new FileStream(SaveLocation, FileMode.Create);
fs.Write(result.DocumentBytes, 0, result.DocumentBytes.Length);
fs.Flush();
fs.Close();
It works fine. However, sometimes it does not release the lock right away, and that causes file-locking exceptions in functions that run after this one.
Is there an ideal way to release the file lock right after fs.Close()?
Here's the ideal:
using (var fs = new FileStream(SaveLocation, FileMode.Create))
{
    fs.Write(result.DocumentBytes, 0, result.DocumentBytes.Length);
}
which is roughly equivalent to:
FileStream fs = null;
try
{
    fs = new FileStream(SaveLocation, FileMode.Create);
    fs.Write(result.DocumentBytes, 0, result.DocumentBytes.Length);
}
finally
{
    if (fs != null)
    {
        ((IDisposable)fs).Dispose();
    }
}
the using version being more readable.
UPDATE:
@aron, now that I think about it:
File.WriteAllBytes(SaveLocation, result.DocumentBytes);
looks even prettier to the eye than the ideal :-)
We have seen this same issue in production even with a using() statement wrapping the write.
One of the top culprits is anti-virus software, which can sneak in after the file is closed and grab it to check for viruses before releasing it.
But even with all anti-virus software out of the mix, in very high-load systems with files stored on network shares we still saw the problem occasionally. A, cough, short Thread.Sleep(), cough, after the close seemed to cure it. If someone has a better solution, I'd love to hear it!
I can't imagine why the lock would be maintained after the file is closed, but you should consider wrapping this in a using statement to ensure that the file is closed even if an exception is raised:
using (FileStream fs = new FileStream(SaveLocation, FileMode.Create))
{
    fs.Write(result.DocumentBytes, 0, result.DocumentBytes.Length);
}
If the functions that run after this one are part of the same application, a better approach might be to open the file for read/write at the beginning of the entire process and pass the open file to each function without closing it until the end of the process. Then the application does not need to block waiting for the IO operation to complete.
This worked for me: when using .Flush(), I had to add a Close() inside the using statement.
using (var imageFile = new FileStream(filePath, FileMode.Create, FileAccess.ReadWrite, FileShare.ReadWrite))
{
    imageFile.Write(bytes, 0, bytes.Length);
    imageFile.Flush();
    imageFile.Close();
}
I just had the same issue when I closed a FileStream and opened the file immediately afterwards in another class. The using statement was not a solution, since the FileStream had been created in another place and stored in a list; clearing the list was not enough.
It looks like the stream needs to be freed by the garbage collector before the file can be reused. If the time between closing and opening is too short, you can call
GC.Collect();
right after you close the stream. This worked for me.
I guess Ian Mercer's solution of putting the thread to sleep might have the same effect, giving the GC time to free the resources.

how to best wait for a filelock to release

I have an application where I sometimes need to read from a file while it is being written to and therefore locked. As I understand from other questions, I should catch the IOException and retry until I can read.
But my question is: how do I know for certain that the file is locked, and not that some other IOException occurred?
When you open a file for reading in .NET, at some point it will try to create a file handle using the CreateFile API function, which sets the last error code; that code can be used to see why the open failed:
const int ERROR_SHARING_VIOLATION = 32;
try
{
    using (var stream = new FileStream("test.dat", FileMode.Open, FileAccess.Read, FileShare.Read))
    {
    }
}
catch (IOException ex)
{
    if (Marshal.GetLastWin32Error() == ERROR_SHARING_VIOLATION)
    {
        Console.WriteLine("The process cannot access the file because it is being used by another process.");
    }
}
There's a useful discussion on Google Groups that you really should read. One of the options is close to darin's; however, to guarantee you get the right Win32 error, you really should call the Win32 OpenFile() API yourself (otherwise you don't know which error you are retrieving).
Another is to parse the error message: that will fail if your application is run on another language version.
A third option is to hack inside the exception class with reflection to fish out the actual HRESULT.
None of the alternatives are really that attractive: the IOException hierarchy would benefit from a few more subclasses, IMHO.
To read data you can do:
using (FileStream fs = new FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite | FileShare.Delete))
{
    ....
}
and to save into a file:
using (FileStream fs = new FileStream(fileName, FileMode.Append, FileAccess.Write, FileShare.Read | FileShare.Delete))
{
    ...
}
The flags at the end of the constructors describe what other processes may do with the file. This works fine, of course, if you control both the writer and the reader...
You may open it (as described by bezieur) and then try to lock sections (or the whole file):
http://www.java2s.com/Code/VB/File-Directory/Lockandunlockafile.htm
Do you mean you are both reading and writing to the file, or that an external application is writing to it?
If you are doing both the reading and the writing, I assume you're doing it on different threads, in which case take a look at the ReaderWriterLock class, which will do the management for you and lets you specify timeouts.
http://msdn.microsoft.com/en-us/library/system.threading.readerwriterlock.aspx
Otherwise, all you need to do is open the file in read-only mode; then you shouldn't have any problems:
fileStream = new FileStream(fileName, FileMode.Open, FileAccess.Read);
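For the same-process reader/writer case, a minimal sketch using the newer ReaderWriterLockSlim (the SharedBuffer class is a made-up example that protects an in-memory string standing in for the shared resource):

```csharp
using System;
using System.Threading;

static class SharedBuffer
{
    static readonly ReaderWriterLockSlim Lock = new ReaderWriterLockSlim();
    static string _contents = "";

    // Writer thread: appends a line while holding the write lock.
    public static void Append(string line)
    {
        Lock.EnterWriteLock();
        try { _contents += line + "\n"; }
        finally { Lock.ExitWriteLock(); }
    }

    // Reader threads: take a snapshot under the read lock, with a timeout,
    // mirroring the timeout support the answer mentions.
    public static string ReadAll(int timeoutMs = 1000)
    {
        if (!Lock.TryEnterReadLock(timeoutMs))
            throw new TimeoutException("could not acquire read lock");
        try { return _contents; }
        finally { Lock.ExitReadLock(); }
    }
}
```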
You can check the concrete type of the IOException to see whether it is actually something else, such as:
if (ex is FileNotFoundException)
You might want to look up the help for System.IO; a lot of the exception types in that namespace inherit from IOException. Beyond checking for another exception type, you may have to look at the exception message, or make a Win32 API call into shell32.dll; there may be a function in there to check whether a file is locked.
Also, if you absolutely need to wait, you can use a loop; but if you want to do other actions while waiting, use an asynchronous thread.
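Such a wait loop might look like this (WaitForRead is a made-up helper; note that FileNotFoundException also derives from IOException, so a missing file will simply be retried until the timeout):

```csharp
using System;
using System.IO;
using System.Threading;

static class LockWait
{
    // Polls until the file can be opened for reading, up to a timeout.
    // Returns an open stream the caller must dispose, or null on timeout.
    public static FileStream WaitForRead(string path, TimeSpan timeout, int pollMs = 100)
    {
        DateTime deadline = DateTime.UtcNow + timeout;
        while (true)
        {
            try
            {
                return new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
            }
            catch (IOException)
            {
                if (DateTime.UtcNow >= deadline)
                    return null;           // still locked: give up
                Thread.Sleep(pollMs);      // wait a bit and retry
            }
        }
    }
}
```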
