I want to create a log file to track some operations in my application. In my scenario, within one session I want to log at least 50 times per minute. Currently I am using StreamWriter to create the log file.
public static StreamWriter InitializeStream(string path)
{
    // Ensure the shared writer is initialized only once, in a thread-safe way
    lock (mylock)
    {
        if (null == _stream)
        {
            var fileStream = new FileStream(path, FileMode.OpenOrCreate, FileAccess.ReadWrite);
            _stream = new StreamWriter(fileStream);
        }
        return _stream;
    }
}
// logging operation
StreamHandler.Log(path);
I am still a bit confused about choosing between AppendAllText and StreamWriter.
Instead of the StreamWriter approach, would using AppendAllText directly give me any (performance) advantage?
Do not use AppendAllText() inside a loop, because it uses a StreamWriter internally, so the stream object is initialized and disposed on every iteration.
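For a logger that writes many times per minute, the usual advice is to open the stream once and reuse it. A minimal sketch of that idea (the SimpleLogger class and its members are made-up names, not the OP's StreamHandler):
using System;
using System.IO;

public static class SimpleLogger
{
    private static readonly object _sync = new object();
    private static StreamWriter _writer; // kept open for the whole session

    public static void Log(string path, string message)
    {
        lock (_sync)
        {
            // Open the file once and reuse the writer instead of re-opening
            // and disposing it (as File.AppendAllText would) on every call.
            if (_writer == null)
                _writer = new StreamWriter(path, append: true) { AutoFlush = true };

            _writer.WriteLine($"{DateTime.Now:O} {message}");
        }
    }
}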
Why does the fs2 object throw an error? I have already specified FileShare.ReadWrite on the fs object.
static void Main()
{
    FileStream fs = new FileStream("hello.txt", FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.ReadWrite);
    mama();
    Console.ReadKey();
}

static void mama()
{
    FileStream fs2 = new FileStream("hello.txt", FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.None);
    fs2.Read(new byte[3], 0, 3);
}
Can anyone tell me why this error occurs?
error = The process cannot access the file 'C:\Users\iP\documents\visual studio 2015\Projects\ConsoleApplication32\ConsoleApplication32\bin\Debug\hello.txt' because it is being used by another process.
You're getting that error because you're passing FileShare.None to the second call. If you change that to FileShare.ReadWrite to match the first call, you won't have that problem.
The reason is that the FileStream constructor calls CreateFileW underneath, and if you take a look at the documentation for that function, it states:
You cannot request a sharing mode that conflicts with the access mode
that is specified in an existing request that has an open handle.
CreateFile would fail and the GetLastError function would return
ERROR_SHARING_VIOLATION.
You already have an open handle from the first request using FileAccess.ReadWrite as the access mode, which conflicts with FileShare.None in the second call.
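A minimal sketch of that fix (matching the sharing mode of the handle that is already open):
static void mama()
{
    // FileShare.ReadWrite here no longer conflicts with the first, still-open handle.
    using (var fs2 = new FileStream("hello.txt", FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.ReadWrite))
    {
        fs2.Read(new byte[3], 0, 3);
    }
}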
Your code never closes the file, so it still holds an open handle to it.
If you can, always use the using statement; it will flush and close the file for you:
using (var fs = new FileStream(...))
{
    // do stuff here
} // this is where the file gets flushed and closed
If two methods work on the same file, pass the FileStream in:
static void mama(FileStream fs)
{
    fs.Read(new byte[3], 0, 3);
}
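For illustration, the caller could then look something like this (a sketch of the suggestion above, not the OP's exact code):
static void Main()
{
    using (var fs = new FileStream("hello.txt", FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.ReadWrite))
    {
        mama(fs); // reuse the same handle instead of opening the file a second time
        Console.ReadKey();
    }
}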
I need advice on whether or not to use a lock (ReaderWriterLockSlim).
A user interacts with the screen, and the data can be saved to a file:
XmlSerializer xmlserializer = new XmlSerializer(typeof(MyFile));
FileStream fs = new FileStream(fileName, FileMode.Create,FileAccess.ReadWrite);
xmlserializer.Serialize(fs, this);
fs.Close();
In parallel, I have a timer (thus same thread, System.Windows.Forms.Timer), which checks this same file size and sends it to a server if modified.
I'll use File.ReadAllBytes as this is a rather small file.
Should I use a lock, since writing to the FileStream takes some time?
I wonder if the timer can cause a problem (I don't have a clear understanding of whether it preempts).
Thanks for any advice.
In WinForms, an event never interrupts a method that is already running on the same thread (i.e. on the UI thread). Any timer_Tick (from System.Windows.Forms.Timer) will be delayed until the serializing code has finished.
(I assume that you are not using async calls.)
You can read the file size directly from the FileStream before closing it.
var xmlserializer = new XmlSerializer(typeof(MyFile));
using (var fs = new FileStream(fileName, FileMode.Create, FileAccess.ReadWrite)) {
    xmlserializer.Serialize(fs, this);
    Console.WriteLine(fs.Length); // <=========
} // The using-statement automatically closes fs
If you need to know whether the file changed in another routine, why don't you just use a flag?
public static bool FileHasChanged { get; set; }
...
var xmlserializer = new XmlSerializer(typeof(MyFile));
using (var fs = new FileStream(fileName, FileMode.Create, FileAccess.ReadWrite)) {
    xmlserializer.Serialize(fs, this);
}
FileHasChanged = true;
In the other routine (timer_Tick I think):
if (MyFile.FileHasChanged) {
    //TODO: Send file to server.
    MyFile.FileHasChanged = false;
}
Since everything is running in the same thread, no locking is required.
Another question is whether you really need a file, or whether you could just write to a MemoryStream and then use that memory stream to send the data to the server. If you still need the file, you could write it out from the same memory stream and serialize only once. The memory stream would replace the Boolean flag for the communication between the two routines. After the data has been sent to the server, you would call Dispose() on the memory stream and set it to null (instead of MyFile.FileHasChanged = false;).
That would be more in the sense of Eric Lippert's comments.
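A rough sketch of that MemoryStream variant (PendingUpload, Save, SendIfPending and the commented-out SendToServer call are all made-up names used only for illustration):
using System.IO;
using System.Xml.Serialization;

static class PendingUpload
{
    // Replaces the Boolean flag: a non-null stream means "data waiting to be sent".
    public static MemoryStream Data { get; set; }

    public static void Save(MyFile file, string fileName)
    {
        var serializer = new XmlSerializer(typeof(MyFile));
        var ms = new MemoryStream();
        serializer.Serialize(ms, file);

        File.WriteAllBytes(fileName, ms.ToArray()); // only if the file on disk is still needed
        Data = ms;
    }

    // Called from timer_Tick; everything runs on the UI thread, so no locking is needed.
    public static void SendIfPending()
    {
        if (Data == null) return;

        // SendToServer(Data.ToArray()); // hypothetical upload call provided by the application

        Data.Dispose();
        Data = null;
    }
}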
I have a class library that gets called from a Windows service, and the class library can be called many times at the same time.
The problem is that I have to read a file's contents in my class, so I get the error that the file is being used by another process when the class is called concurrently.
This is how i read the file content:
File.ReadAllBytes("path");
What is the best solution in this case ?
Thank you
The following code demonstrates how to access a file by setting its share permissions. The first using block creates and writes the file; the second and third using blocks open and read it.
var fileName = "test.txt";
using (var fsWrite = new FileStream(fileName, FileMode.OpenOrCreate, FileAccess.Write, FileShare.ReadWrite))
{
    var content = Encoding.UTF8.GetBytes("test");
    fsWrite.Write(content, 0, content.Length);
    fsWrite.Flush();

    using (var fsRead_1 = new FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
    {
        var bufRead_1 = new byte[fsRead_1.Length];
        fsRead_1.Read(bufRead_1, 0, bufRead_1.Length);
        Console.WriteLine("fsRead_1:" + Encoding.UTF8.GetString(bufRead_1));

        using (var fsRead_2 = new FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
            var bufRead_2 = new byte[fsRead_2.Length];
            fsRead_2.Read(bufRead_2, 0, bufRead_2.Length);
            Console.WriteLine("fsRead_2:" + Encoding.UTF8.GetString(bufRead_2));
        }
    }
}
You need to synchronize access to the whole file using standard thread synchronization approaches.
The simplest one is the Monitor class, used via the lock statement:
public class A
{
    private static readonly object _sync = new object();

    public void DoStuff()
    {
        // All threads trying to enter this critical section will
        // wait until the first to enter exits it
        lock (_sync)
        {
            byte[] buffer = File.ReadAllBytes(@"C:\file.jpg");
        }
    }
}
Note
At first I understood that the OP was accessing the file from different processes, but when I double-checked the statement:
I have a class library that gets called from a Windows service, and the
class library can be called many times at the same time.
...I realized the OP is calling a method that reads all bytes from some file within the same Windows service instance.
Use a Mutex for synchronizing among different processes. File.ReadAllBytes uses FileAccess.Read and FileShare.Read when it reads the file, so normally you don't need any locks here. That means you are getting this exception because the file is being written somewhere (or at least is locked for writing).
Solution 1 - if you are the one who writes this file
private static Mutex mutex;

public void WriteFile(string path)
{
    Mutex mutex = GetOrCreateMutex();
    try
    {
        mutex.WaitOne();
        // TODO: ... write file
    }
    finally
    {
        mutex.ReleaseMutex();
    }
}

public byte[] ReadFile(string path)
{
    // Note: If you just read the file, this lock is completely unnecessary
    // because ReadAllBytes uses Read access. This just protects the file from
    // being read and written at the same time.
    Mutex mutex = GetOrCreateMutex();
    try
    {
        mutex.WaitOne();
        return File.ReadAllBytes(path);
    }
    finally
    {
        mutex.ReleaseMutex();
    }
}

private static Mutex GetOrCreateMutex()
{
    try
    {
        mutex = Mutex.OpenExisting("MyMutex");
    }
    catch (WaitHandleCannotBeOpenedException)
    {
        mutex = new Mutex(false, "MyMutex");
    }
    return mutex;
}
Remark: a reader-writer lock would be better here, because a file can safely be read in parallel as long as it is not being written; however, there is no built-in inter-process read-write lock in .NET, so you would have to implement one yourself with the Mutex and Semaphore types.
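For what it's worth, here is one possible sketch of such an inter-process lock, built from a named Mutex (one writer at a time) and a named Semaphore (a fixed pool of reader slots); the object names and the reader limit are arbitrary assumptions, and a production version would need more care around abandoned handles:
using System;
using System.Threading;

public sealed class InterProcessReadWriteLock : IDisposable
{
    private const int MaxReaders = 10;
    private readonly Mutex _writerMutex = new Mutex(false, "Example.FileWriteMutex");
    private readonly Semaphore _readerSlots = new Semaphore(MaxReaders, MaxReaders, "Example.FileReadSemaphore");

    public void EnterRead() => _readerSlots.WaitOne();   // take one reader slot
    public void ExitRead() => _readerSlots.Release();    // give the slot back

    public void EnterWrite()
    {
        // Only one writer drains the slots at a time; holding every slot
        // guarantees that no reader is active while the file is written.
        _writerMutex.WaitOne();
        for (int i = 0; i < MaxReaders; i++)
            _readerSlots.WaitOne();
    }

    public void ExitWrite()
    {
        _readerSlots.Release(MaxReaders);
        _writerMutex.ReleaseMutex();
    }

    public void Dispose()
    {
        _writerMutex.Close();
        _readerSlots.Close();
    }
}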
Solution 2 - if you just read the file
You simply have to be prepared for the file being locked while it is written by a third process:
public byte[] TryReadFile(string path, int maxTry)
{
    Exception e = null;
    for (int i = 0; i < maxTry; i++)
    {
        try
        {
            return File.ReadAllBytes(path);
        }
        catch (IOException io)
        {
            e = io;
            Thread.Sleep(100);
        }
    }

    throw e; // or just return null
}
I am writing to a file through a function. On the first call I am able to write to the file but on the second call I get an exception:
The process cannot access a file because it is being used by some other process.
Basically, the function I am calling starts a System.Diagnostics.Process whose output I have to write to a file with the same name and location each time the function is called. But whenever the function is called for the second time, I get the exception.
I have tried
Byte[] info = new UTF8Encoding(true).GetBytes(contents);

if (!File.Exists(fileName))
{
    // Create the file.
    using (FileStream fs = File.Create(fileName))
    {
        fs.Write(info, 0, info.Length);
        fs.Close();
    }
}

using (FileStream file_write = File.Open(fileName, FileMode.Open, FileAccess.ReadWrite, FileShare.ReadWrite))
{
    file_write.Write(info, 0, info.Length);
    file_write.Close();
}
and
File.WriteAllText(fileName, contents)
and
using (StreamWriter file_write = new StreamWriter(File_path))
{
    file_write.WriteLine(File_data);
    file_write.Close();
}
Nothing worked.
Can anyone suggest any other way of doing this? I am really stuck.
You need to use a lock so that concurrent calls wait for each other to finish writing:
static object obj = new object();

static void WriteOnFile(byte[] data)
{
    lock (obj)
    {
        FileStream file_write = new FileStream(fileName, FileMode.Open, FileAccess.ReadWrite);
        file_write.Write(data, 0, data.Length);
        file_write.Close();
    }
}
Can I close a file stream without calling Flush (in C#)? I understood that Close and Dispose call the Flush method first.
MSDN is not 100% clear, but Jon Skeet is saying "Flush", so do it before close/dispose. It won't hurt, right?
From FileStream.Close Method:
Any data previously written to the buffer is copied to the file before
the file stream is closed, so it is not necessary to call Flush before
invoking Close. Following a call to Close, any operations on the file
stream might raise exceptions. After Close has been called once, it
does nothing if called again.
Dispose is not as clear:
This method disposes the stream, by writing any changes to the backing
store and closing the stream to release resources.
Remark: the commenters might be right; it's not 100% clear from the Flush documentation:
Override Flush on streams that implement a buffer. Use this method to
move any information from an underlying buffer to its destination,
clear the buffer, or both. Depending upon the state of the object, you
might have to modify the current position within the stream (for
example, if the underlying stream supports seeking). For additional
information see CanSeek.
When using the StreamWriter or BinaryWriter class, do not flush the
base Stream object. Instead, use the class's Flush or Close method,
which makes sure that the data is flushed to the underlying stream
first and then written to the file.
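As a small illustration of that last paragraph (the file name is made up):
using (var fs = new FileStream("log.txt", FileMode.Create))
using (var writer = new StreamWriter(fs))
{
    writer.WriteLine("hello");
    writer.Flush();  // flushes the writer's own buffer into fs and then on to the file
    // fs.Flush();   // flushing only the base stream could miss data still buffered in the writer
}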
TESTS:
var textBytes = Encoding.ASCII.GetBytes("Test123");

using (var fileTest = System.IO.File.Open(@"c:\temp\fileNoCloseNoFlush.txt", FileMode.CreateNew))
{
    fileTest.Write(textBytes, 0, textBytes.Length);
}

using (var fileTest = System.IO.File.Open(@"c:\temp\fileCloseNoFlush.txt", FileMode.CreateNew))
{
    fileTest.Write(textBytes, 0, textBytes.Length);
    fileTest.Close();
}

using (var fileTest = System.IO.File.Open(@"c:\temp\fileFlushNoClose.txt", FileMode.CreateNew))
{
    fileTest.Write(textBytes, 0, textBytes.Length);
    fileTest.Flush();
}

using (var fileTest = System.IO.File.Open(@"c:\temp\fileCloseAndFlush.txt", FileMode.CreateNew))
{
    fileTest.Write(textBytes, 0, textBytes.Length);
    fileTest.Flush();
    fileTest.Close();
}
What can I say ... all files got the text - maybe this is just too little data?
Test2
var rnd = new Random();
var size = 1024 * 1024 * 10;
var randomBytes = new byte[size];
rnd.NextBytes(randomBytes);

using (var fileTest = System.IO.File.Open(@"c:\temp\fileNoCloseNoFlush.bin", FileMode.CreateNew))
{
    fileTest.Write(randomBytes, 0, randomBytes.Length);
}

using (var fileTest = System.IO.File.Open(@"c:\temp\fileCloseNoFlush.bin", FileMode.CreateNew))
{
    fileTest.Write(randomBytes, 0, randomBytes.Length);
    fileTest.Close();
}

using (var fileTest = System.IO.File.Open(@"c:\temp\fileFlushNoClose.bin", FileMode.CreateNew))
{
    fileTest.Write(randomBytes, 0, randomBytes.Length);
    fileTest.Flush();
}

using (var fileTest = System.IO.File.Open(@"c:\temp\fileCloseAndFlush.bin", FileMode.CreateNew))
{
    fileTest.Write(randomBytes, 0, randomBytes.Length);
    fileTest.Flush();
    fileTest.Close();
}
And again, every file got its bytes ... to me it looks like it's doing what I read on MSDN: it doesn't matter whether you call Flush or Close before disposing ... any thoughts on that?
You don't have to call Flush() before Close()/Dispose(); FileStream will do it for you, as you can see from its source code:
http://referencesource.microsoft.com/#mscorlib/system/io/filestream.cs,e23a38af5d11ddd3
[System.Security.SecuritySafeCritical]  // auto-generated
protected override void Dispose(bool disposing)
{
    // Nothing will be done differently based on whether we are
    // disposing vs. finalizing. This is taking advantage of the
    // weak ordering between normal finalizable objects & critical
    // finalizable objects, which I included in the SafeHandle
    // design for FileStream, which would often "just work" when
    // finalized.
    try {
        if (_handle != null && !_handle.IsClosed) {
            // Flush data to disk iff we were writing. After
            // thinking about this, we also don't need to flush
            // our read position, regardless of whether the handle
            // was exposed to the user. They probably would NOT
            // want us to do this.
            if (_writePos > 0) {
                FlushWrite(!disposing); // <- Note this
            }
        }
    }
    finally {
        if (_handle != null && !_handle.IsClosed)
            _handle.Dispose();

        _canRead = false;
        _canWrite = false;
        _canSeek = false;
        // Don't set the buffer to null, to avoid a NullReferenceException
        // when users have a race condition in their code (ie, they call
        // Close when calling another method on Stream like Read).
        //_buffer = null;
        base.Dispose(disposing);
    }
}
I've been tracking a newly introduced bug that seems to indicate that .NET 4 does not reliably flush changes to disk when the stream is disposed (unlike .NET 2.0 and 3.5, which always did so reliably).
The FileStream class was heavily modified in .NET 4, and while the Flush*() methods were rewritten, similar attention seems to have been forgotten for Dispose().
This is resulting in incomplete files.
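If you want to guard against that, an explicit flush before leaving the using block should not hurt. A minimal sketch (the file name and payload are made up), where Flush(true) additionally asks the operating system to write its own cache to disk:
byte[] buffer = Encoding.UTF8.GetBytes("important data"); // placeholder payload
using (var fs = new FileStream("data.bin", FileMode.Create))
{
    fs.Write(buffer, 0, buffer.Length);
    fs.Flush(true); // flush the FileStream buffer and request an OS-level flush to disk
}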
Since you've stated that you understand that Close and Dispose call the Flush method if it was not called explicitly by user code, I believe that (by closing without flushing) you actually want the possibility to discard changes made to a FileStream, if necessary.
If that is correct, using a FileStream alone won't help. You will need to load the file into a MemoryStream (or an array, depending on how you modify its contents), and then decide whether you want to save the changes once you're done.
An obvious problem with this is file size. FileStream uses write buffers of limited size to speed up operations, but once they are exhausted, the changes need to be flushed. Due to .NET memory limits, you can only expect to load smaller files into memory if you need to hold them entirely.
An easier alternative would be to make a disk copy of your file and work on the copy using a plain FileStream. When finished, if you need to discard the changes, simply delete the temporary copy; otherwise, replace the original with the modified copy.
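A rough sketch of that copy-and-commit idea, assuming the caller knows at the end whether to keep the changes (originalPath and keepChanges are hypothetical names):
string workingCopy = Path.GetTempFileName();
File.Copy(originalPath, workingCopy, overwrite: true);

using (var fs = new FileStream(workingCopy, FileMode.Open, FileAccess.ReadWrite))
{
    // ... modify the working copy here ...
}

if (keepChanges)
    File.Copy(workingCopy, originalPath, overwrite: true); // commit the changes
File.Delete(workingCopy); // the temporary copy is no longer needed either way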
Wrap the FileStream in a BufferedStream and close the FileStream before the buffered stream:
var fs = new FileStream(...);
var bs = new BufferedStream(fs, buffersize);

bs.Write(datatosend, 0, length);
fs.Close();

try {
    bs.Close();
}
catch (IOException) {
}
Using Flush() is worthwhile inside big loops, for example when you have to read and write a big file inside one loop. Otherwise the buffer (or the computer) is big enough, and it doesn't matter if you call Close() without calling Flush() first.
Example: you have to read a big file (in some format) and write it out line by line as .txt:
StreamWriter sw = .... // using StreamWriter
// you read the file ...
// and now you want to write each line of this big file using WriteLine()
for ( ..... ) // this is a big loop because the file is big and has many lines
{
    sw.WriteLine( *whatever was read* ); // write the line somewhere, e.g. to a .txt file
    sw.Flush(); // each call to sw.Flush() forces the buffered WriteLine output to be written out
}
sw.Close();
Here it is very important to use Flush(), because otherwise each WriteLine is kept in the buffer and nothing is written out until the buffer is full or the program reaches sw.Close().
I hope this helps a little in understanding what Flush does.
I think it is safe to use a simple using statement, which closes the stream when GetBytes() returns:
public static byte[] GetBytes(string fileName)
{
    byte[] buffer = new byte[4096];

    using (FileStream fs = new FileStream(fileName, FileMode.Open, FileAccess.Read))
    using (MemoryStream ms = new MemoryStream())
    {
        int bytesRead;
        while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0)
            ms.Write(buffer, 0, bytesRead); // copy the file into memory in 4 KB blocks

        return ms.ToArray();
    }
}