If I have this:
StreamWriter cout = new StreamWriter("test.txt");
cout.WriteLine("XXX");
// here IOException...
StreamReader cin = new StreamReader("test.txt");
string text = cin.ReadLine();
the CLR throws an IOException because I haven't closed cout.
In fact, if I do this:
StreamWriter cout = new StreamWriter("test.txt");
cout.WriteLine("XXX");
cout.Close();
StreamReader cin = new StreamReader("test.txt");
string text = cin.ReadLine();
I get no exception.
But if I do this and then exit from the application:
StreamReader cin = new StreamReader("test.txt");
string text = cin.ReadLine();
without closing cin, the file can still be opened and written to from the OS.
However, reading the source code of StreamReader.cs I didn't find a destructor method (i.e. ~StreamReader(...)). So who frees that file if the garbage collector doesn't invoke Dispose and there is no finalization method?
Internally, the StreamReader uses a FileStream:
Stream stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read, DefaultFileStreamBufferSize, FileOptions.SequentialScan, Path.GetFileName(path), false, false, checkHost);
The FileStream class, being the class that ultimately accesses the file and therefore needs to guarantee cleanup, does have a finalizer, which closes the actual underlying stream. The Dispose method on the StreamReader just calls Close on the underlying FileStream.
StreamReader and StreamWriter use a FileStream to access the file. FileStream uses a SafeFileHandle to store the underlying OS file handle. As the SafeFileHandle class controls an unmanaged resource, it correctly has a finalizer (what you call a destructor) that closes the file handle.
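To make the ownership concrete, here is a minimal sketch (reusing test.txt from the question; assumes System.IO and Microsoft.Win32.SafeHandles). SafeFileHandle derives from SafeHandle, a CriticalFinalizerObject, so its finalizer releases the OS handle even if Dispose is never called:
using (var fs = new FileStream("test.txt", FileMode.Open, FileAccess.Read))
{
    // The FileStream exposes the SafeFileHandle that actually owns the OS handle.
    SafeFileHandle handle = fs.SafeFileHandle;
    Console.WriteLine(handle.IsInvalid); // False while the file is open
}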
But if I do this and then exit from the application: [...] without closing cin the file can still be opened and written to from the OS
When a process terminates, all resources used by that process are released to the operating system. It doesn't matter if your application forgot to close a file handle (even though SafeFileHandle will not "forget"). You will always observe the described behavior, no matter how badly written your application is.
I just want to point out that the best way to work with StreamReader, StreamWriter, and similar classes is a using statement:
using (StreamWriter cout = new StreamWriter("test.txt")) {
    cout.WriteLine("XXX");
}
using (StreamReader cin = new StreamReader("test.txt")) {
    string text = cin.ReadLine();
}
This deterministically closes the files when the using block ends even if an exception is thrown while processing the file.
The operating system keeps a list of which process (application) has which file open. If your process terminates without explicitly closing the file, the operating system still knows that it is no longer accessing the file and can allow other requests to access it.
The operating system frees handles and memory owned by an application if they were not freed by the application when it exits. Anyway, I'm sure the underlying FileStream has a finalizer.
This question stems from this other thread: How to lock a file with C#?
Basically let's say you want to lock a JSON file, read it, then write to it afterward, and finally unlock it. You can lock the file using the answers from the other question.
However, I'm running into trouble: this allows me to read the file, but not write to it afterward without first unlocking it. That is, the recommended method, which seems to be fairly well-respected, is locking the same thread out of its own resource.
Example:
using (var fs = new FileStream(GetJsonPath(), FileMode.Open, FileAccess.ReadWrite,
    FileShare.None))
{
    SomeDtoType dto;
    using (StreamReader reader = new StreamReader(fs))
    {
        dto = (SomeDtoType)new JsonSerializer().Deserialize(reader,
            typeof(SomeDtoType));
    }
    // Make changes to the DTO.....
    using (StreamWriter writer = new StreamWriter(fs))
    {
        new JsonSerializer().Serialize(writer, dto);
    }
}
The using line that creates the StreamWriter throws the following exception:
Stream was not writable.
Now one thing that comes to mind is the value of FileShare.None. The problem here is that that particular enum is evidently setting lock permissions for more than just external processes.
How can you lock external threads/processes out of changing/deleting the file, yet allow your own to make these two subsequent read/write accesses?
EDIT:
Evidently moving everything into the using block for the StreamReader, then setting fs.Position to 0 between the read and the write, kind of fixes the issue. The fs.Position part is fine, but having to move the write logic into the using block for the StreamReader, just so they can both use the same FileStream lock, seems a tad odd...
StreamReader closes the underlying stream unless you use the constructor overload that instructs it not to:
using (StreamReader reader = new StreamReader(fs, Encoding.UTF8, true, 4096, leaveOpen:true))
There is no finalizer in StreamReader, so you could move it out of the using block and leave it undisposed, but I'd recommend explicitly controlling its lifetime and behavior.
Another issue is that you'll append to the file. If you want to overwrite the content, you need to truncate it before you write:
fs.SetLength(0);
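Putting those pieces together, a corrected version of the original example might look like this (a sketch, assuming Json.NET's JsonSerializer as in the question):
using (var fs = new FileStream(GetJsonPath(), FileMode.Open, FileAccess.ReadWrite,
    FileShare.None))
{
    SomeDtoType dto;
    using (var reader = new StreamReader(fs, Encoding.UTF8, true, 4096, leaveOpen: true))
    {
        dto = (SomeDtoType)new JsonSerializer().Deserialize(reader, typeof(SomeDtoType));
    }
    // Make changes to the DTO.....
    fs.SetLength(0); // truncate the old content; this also moves Position back to 0 here
    using (var writer = new StreamWriter(fs, Encoding.UTF8, 4096, leaveOpen: true))
    {
        new JsonSerializer().Serialize(writer, dto);
    }
}
The file stays locked (FileShare.None) for the whole block, while the leaveOpen flags keep the reader and writer from closing the shared FileStream.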
I was searching StackOverflow for information about try-finally and using blocks and the best practices for using them.
I read in a comment here that if your application is terminated abruptly by killing the process, a finally block will not get executed.
I was wondering, does the same apply to using blocks? Will, for example, a stream get closed if an Environment.Exit() call occurs inside the using block?:
//....
using (FileStream fsSource1 = new FileStream(pathSource,
    FileMode.Open, FileAccess.Read))
{
    // Use the stream here
    Environment.Exit(0);
}
As a second thought, and knowing that the CLR garbage collector will probably take care of the stream objects if they are not closed properly in the program, is it considered necessary to close a stream in code if the program gets terminated for sure after the completion of the stream usage?
For example, is there any practical difference between:
//....
using (FileStream fsSource1 = new FileStream(pathSource,
    FileMode.Open, FileAccess.Read))
{
    // Use the stream here
}
Environment.Exit(0);
And:
//....
FileStream fsSource1 = new FileStream(pathSource, FileMode.Open, FileAccess.Read);
//Use the stream here
Environment.Exit(0);
Or even the example mentioned before?
It shouldn't make a difference in the specific case of FileStream, modulo a tricky corner case when you used its BeginWrite() method: its finalizer attempts to complete writing any unwritten data still present in its internal buffer. This is however not generally true; it will make a difference if you use StreamWriter, for example.
What you are leaving up to the .NET Framework to decide is whether you truly meant to jerk the floor mat and cease writing the file, or whether it should make a last-gasp attempt to flush any unwritten data. The outcome in the case of StreamWriter will tend to be an unpleasant one: non-zero odds that something is going to fall over when it tries to read a half-written file.
Always be explicit, if you want to make sure this does not happen then it is up to you to ensure that you properly called the Close() or Dispose() method. Or delete the file.
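A small sketch of the StreamWriter case (hypothetical file name) to illustrate what gets lost:
var sw = new StreamWriter("out.txt"); // AutoFlush is off by default
sw.Write("important data");           // sits in the writer's internal char buffer, not on disk
Environment.Exit(0);                  // abrupt exit: no Dispose, and StreamWriter has no finalizer
// After the process dies, out.txt is likely empty; the buffered text is gone.
With a using statement (or an explicit Close()), the buffer is flushed before the process exits.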
I have a StreamWriter with the AutoFlush = true property. However, I still see the file only partially written when I randomly open it. I'm writing a file (JSON) that, at any given point in time, needs to be either fully written or not there at all.
var sw = new StreamWriter(@"C:\file.txt", true /* append */, Encoding.ASCII) { AutoFlush = true };
sw.WriteLine("....");
// long running (think like a logging application) -- 1000s of seconds
sw.Close();
In between the sw.WriteLine() call and sw.Close() I want to open the file, and always have it be in the "correct data format", i.e. my line should be complete.
Current Idea:
Increase the internal buffer of the FileStream (and/or StreamWriter) to, say, 128 KB. Then, every 128 KB - 1, call .Flush() on the FileStream object. This leads me to my next question: when I call Flush(), should I get the Stream.Position right before calling it and lock that region (FileStream.Lock(position, length))? Or does Flush take care of that?
Basically I don't want the reader to be able to read the contents in between Flush() calls, because they may be partially broken.
using (StreamWriter sw = new StreamWriter("FILEPATH"))
{
    sw.WriteLine("contents");
    // if you open the file now, you may see partially written lines
    // since the sw is still working on it.
}
// access the file now, since the stream writer has been properly closed and disposed.
If you need a "log-like" file which is never half-written, the way to go is not keeping it open.
Every time you want to write to your file, you should instantiate a new StreamWriter, which will flush the file contents upon releasing the file, like this:
private void LogLikeWrite(string filePath, string contents)
{
    using (StreamWriter streamWriter = new StreamWriter(filePath, true)) // the true makes it append to the file instead of overwriting its contents
    {
        streamWriter.Write(contents);
    }
}
This way your write operations will be flushed immediately.
If you are sharing the file between processes, you're going to have a race condition unless you produce a locking mechanism of some kind. See https://stackoverflow.com/a/29127380/892327. This does require that you are able to modify both processes.
An alternative is to have process A wait for a file at a specified location. Process B writes to an intermediate file and, once B has flushed, the file is copied to the location where process A expects it, so that A can consume the file.
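A sketch of that hand-off (the paths and the contents variable are hypothetical); moving a file on the same volume is effectively a rename, so process A never observes a half-written file:
// Process B: write everything to a temp file, then move it into place.
string finalPath = @"C:\drop\data.json"; // where process A looks (assumed)
string tempPath = finalPath + ".tmp";
using (var writer = new StreamWriter(tempPath, false))
{
    writer.Write(contents); // fully written and flushed when the using block ends
}
File.Delete(finalPath);         // remove any previous version first
File.Move(tempPath, finalPath); // rename, atomic on the same volume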
I have two programs that work together. To coordinate their operations I use a small settings file. This settings file contains two words separated by a ';'. So in the one program I repeatedly read the words in the file using a while loop; by repeatedly I mean once every second. The loop only terminates when the program terminates, i.e. when the user turns off the PC.
But with each iteration of the loop the program's size in memory increases, until the program throws an OutOfMemoryException. I have tried two different methods of reading the file, but both cause the program to 'grow' in memory.
FileStream FS = new FileStream("br.stat", FileMode.Open);
StreamReader SR = new StreamReader(FS);
string s = SR.ReadToEnd();
FS.Dispose();
SR.Dispose();
and
string S = File.ReadAllText("br.stat");
Is there a way to read a file repeatedly without this happening?
Thanks.
The problem is the design, more so than the implementation. You only need to read from the file once when the app starts and again when the file changes. You can use FileSystemWatcher to detect changes to the file and reload the settings.
This uses drastically fewer resources than reading the file indefinitely.
Also, you'll want to take advantage of the using keyword to ensure you properly dispose of the file stream and reader. In fact, I would simplify and just use File.ReadAllText(filename).
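A minimal sketch of that approach (assuming the settings file br.stat sits in the application's current directory):
var watcher = new FileSystemWatcher(Directory.GetCurrentDirectory(), "br.stat");
watcher.NotifyFilter = NotifyFilters.LastWrite;
watcher.Changed += (sender, e) =>
{
    // Re-read only when the file actually changes, instead of polling every second.
    // Note: the writer may still hold the file when the event fires, so a short
    // retry on IOException may be needed in practice.
    string settings = File.ReadAllText(e.FullPath);
    // ... update the two control words from settings ...
};
watcher.EnableRaisingEvents = true;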
I think you're doing it wrong. Realistically there is no need to read the file every iteration unless it has changed.
Instead it would be better to use a FileSystemWatcher (http://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher.changed(v=vs.85).aspx) to read the values and change your control values.
You should also use the following to circumvent your memory leak issues
using (var FS = new FileStream("br.stat", FileMode.Open))
{
    using (var SR = new StreamReader(FS))
    {
        var s = SR.ReadToEnd();
    }
}
I would recommend you look at using a using statement:
File and Font are examples of managed types that access unmanaged resources (in this case file handles and device contexts). There are many other kinds of unmanaged resources and class library types that encapsulate them. All such types must implement the IDisposable interface.
As a rule, when you use an IDisposable object, you should declare and instantiate it in a using statement. The using statement calls the Dispose method on the object in the correct way, and (when you use it as shown earlier) it also causes the object itself to go out of scope as soon as Dispose is called. Within the using block, the object is read-only and cannot be modified or reassigned.
so something like
using (FileStream FS = new FileStream("br.stat", FileMode.Open))
using (StreamReader SR = new StreamReader(FS))
{
    string s = SR.ReadToEnd();
}
I have this code that saves a pdf file.
FileStream fs = new FileStream(SaveLocation, FileMode.Create);
fs.Write(result.DocumentBytes, 0, result.DocumentBytes.Length);
fs.Flush();
fs.Close();
It works fine. However, sometimes it does not release the lock right away, and that causes file-locking exceptions in functions that run after this one.
Is there an ideal way to release the file lock right after fs.Close()?
Here's the ideal:
using (var fs = new FileStream(SaveLocation, FileMode.Create))
{
    fs.Write(result.DocumentBytes, 0, result.DocumentBytes.Length);
}
which is roughly equivalent to:
FileStream fs = null;
try
{
    fs = new FileStream(SaveLocation, FileMode.Create);
    fs.Write(result.DocumentBytes, 0, result.DocumentBytes.Length);
}
finally
{
    if (fs != null)
    {
        ((IDisposable)fs).Dispose();
    }
}
the using being more readable.
UPDATE:
@aron, now that I think about it,
File.WriteAllBytes(SaveLocation, result.DocumentBytes);
looks even prettier to the eye than the ideal :-)
We have seen this same issue in production with a using() statement wrapping it.
One of the top culprits here is anti-virus software, which can sneak in after the file is closed and grab it to check it doesn't contain a virus before releasing it.
But even with all anti-virus software out of the mix, in very high-load systems with files stored on network shares we still saw the problem occasionally. A, cough, short Thread.Sleep(), cough, after the close seemed to cure it. If someone has a better solution I'd love to hear it!
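One variant of the same idea is to retry the follow-up open with a short backoff instead of sleeping a fixed amount; a sketch (the method name and timings are illustrative, not from the original answer):
FileStream OpenWithRetry(string path, int attempts = 5)
{
    for (int i = 0; ; i++)
    {
        try
        {
            return new FileStream(path, FileMode.Open, FileAccess.Read);
        }
        catch (IOException) when (i < attempts - 1)
        {
            Thread.Sleep(100 * (i + 1)); // back off; AV or a network share may still hold the file
        }
    }
}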
I can't imagine why the lock would be maintained after the file is closed, but you should consider wrapping this in a using statement to ensure that the file is closed even if an exception is raised:
using (FileStream fs = new FileStream(SaveLocation, FileMode.Create))
{
    fs.Write(result.DocumentBytes, 0, result.DocumentBytes.Length);
}
If the functions that run after this one are part of the same application, then a better approach might be to open the file for read/write at the beginning of the entire process, pass the file to each function without closing it, and only close it at the end of the process. Then the application won't need to block waiting for the IO operation to complete.
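A sketch of that shape (WritePdf and ValidatePdf are hypothetical stand-ins for the follow-up functions):
using (var fs = new FileStream(SaveLocation, FileMode.Create, FileAccess.ReadWrite))
{
    WritePdf(fs, result.DocumentBytes); // hypothetical: writes the bytes
    fs.Position = 0;                    // rewind so the next step can read
    ValidatePdf(fs);                    // hypothetical: reuses the same open handle
}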
This worked for me: when using .Flush(), I had to add a Close() inside the using statement.
using (var imageFile = new FileStream(filePath, FileMode.Create, FileAccess.ReadWrite, FileShare.ReadWrite))
{
    imageFile.Write(bytes, 0, bytes.Length);
    imageFile.Flush();
    imageFile.Close();
}
Just had the same issue when I closed a FileStream and opened the file immediately in another class. The using statement was not a solution since the FileStream had been created at another place and stored in a list. Clearing the list was not enough.
It looks like the stream needs to be freed by the garbage collector before the file can be reused. If the time between closing and opening is too short, you can use
GC.Collect();
right after you closed the stream. This worked for me.
I guess Ian Mercer's solution of putting the thread to sleep might have the same effect, giving the GC time to free the resources.
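For the collection to actually release the handle, the finalizer also has to run, so it may help to wait for the finalizer thread as well (a sketch extending the idea above, not from the original answer):
stream.Close();
stream = null;                 // drop the reference so the GC can reclaim the object
GC.Collect();
GC.WaitForPendingFinalizers(); // finalizers run on a separate thread; wait for them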