In Linux, a lot of IPC is done by appending to a file in one process and reading the new content from another process.
I want to do the same in Windows/.NET (normal IPC such as pipes is too messy here). I'm appending to a file from a Python process, and I want to read the changes, and ONLY the changes, each time FileSystemWatcher reports an event. I do not want to read the entire file content into memory each time I look for changes (the file will be huge).
Each append operation appends a row of data that starts with a unique incrementing counter (timestamp+key) and ends with a newline.
using (FileStream fs = new FileStream(
    fileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    using (StreamReader sr = new StreamReader(fs))
    {
        while (someCondition)
        {
            // drain everything that is currently in the file
            while (!sr.EndOfStream)
                ProcessLine(sr.ReadLine());

            // wait until the writer appends something new
            while (sr.EndOfStream)
                Thread.Sleep(100);

            ProcessLine(sr.ReadLine());
        }
    }
}
This will help you read only the appended lines.
You can store the offset of the last read operation and seek the file to that offset when you get a changed file notification. An example follows:
Main method:
public static void Main(string[] args)
{
    File.WriteAllLines("test.txt", new string[] { });
    new Thread(() => ReadFromFile()).Start();
    WriteToFile();
}
Read from file method:
private static void ReadFromFile()
{
    long offset = 0;
    FileSystemWatcher fsw = new FileSystemWatcher
    {
        Path = Environment.CurrentDirectory,
        Filter = "test.txt"
    };
    FileStream file = File.Open(
        "test.txt",
        FileMode.Open,
        FileAccess.Read,
        FileShare.Write);
    StreamReader reader = new StreamReader(file);

    while (true)
    {
        fsw.WaitForChanged(WatcherChangeTypes.Changed);
        // jump back to where the previous read stopped
        file.Seek(offset, SeekOrigin.Begin);
        if (!reader.EndOfStream)
        {
            do
            {
                Console.WriteLine(reader.ReadLine());
            } while (!reader.EndOfStream);

            // remember how far we got for the next notification
            offset = file.Position;
        }
    }
}
Write to file method:
private static void WriteToFile()
{
    for (int i = 0; i < 100; i++)
    {
        FileStream writeFile = File.Open(
            "test.txt",
            FileMode.Append,
            FileAccess.Write,
            FileShare.Read);

        using (FileStream file = writeFile)
        {
            using (StreamWriter sw = new StreamWriter(file))
            {
                sw.WriteLine(i);
                Thread.Sleep(100);
            }
        }
    }
}
I want to read a binary file line by line (I'm writing continuously, of course, but I know that after every 457 bytes new data starts, and I know exactly the byte structure and where each piece of information is written) and change a specific entry in each line. I get a System.IO.IOException when I try to access the same file with both a BinaryReader and a BinaryWriter. I use locking to prevent the file from being accessed from somewhere else.
My code is:
using (FileStream fs2 = new FileStream(testfile, FileMode.Open, FileAccess.Read))
{
    using (BinaryReader r = new BinaryReader(fs2))
    {
        using (BinaryWriter bw = new BinaryWriter(new FileStream(testfile, FileMode.Open, FileAccess.Write), Encoding.UTF8))
        {
            for (int i = 0; i < 11000; i += 457)
            {
                long myint = r.ReadInt64();
                bw.Seek(i, SeekOrigin.Current);
                bw.Write(myint * 2);
            }
        }
    }
}
How can I do this?
Do not create the second FileStream, because the file is locked for the read operation by the first FileStream object.
If you are sure about the file structure, the exception can only come from the instantiation of the second FileStream. See the link below for more information:
Read and Write to File at the same time
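As a minimal sketch of the single-stream alternative (assuming the 457-byte records and the Int64 at the start of each record described in the question; testfile is the same path variable, and the leaveOpen constructors require .NET 4.5+): one handle opened with FileAccess.ReadWrite can serve both the BinaryReader and the BinaryWriter, so no second handle has to fight the lock.
using (var fs = new FileStream(testfile, FileMode.Open, FileAccess.ReadWrite, FileShare.None))
using (var r = new BinaryReader(fs, Encoding.UTF8, leaveOpen: true))
using (var bw = new BinaryWriter(fs, Encoding.UTF8, leaveOpen: true))
{
    for (long pos = 0; pos + 457 <= fs.Length; pos += 457)
    {
        fs.Seek(pos, SeekOrigin.Begin);   // jump to the start of the record
        long value = r.ReadInt64();       // read the 8-byte entry
        fs.Seek(pos, SeekOrigin.Begin);   // rewind so the write overwrites it in place
        bw.Write(value * 2);              // write the modified value back
    }
}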
It is working for me using the following code:
if (File.Exists(testfile))
{
    FileInfo fi = new FileInfo(testfile);
    using (FileStream fs2 = new FileStream(testfile, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
    {
        using (BinaryReader r = new BinaryReader(fs2))
        {
            r.BaseStream.Seek(0, SeekOrigin.Begin);
            using (BinaryWriter bw = new BinaryWriter(new FileStream(testfile, FileMode.Open, FileAccess.Write, FileShare.ReadWrite)))
            {
                for (int i = 0; i <= (fi.Length - 177); i += 177)
                {
                    // read/modify/write each record here
                }
            }
        }
    }
}
I want to read a file continuously, like GNU tail with the -f parameter. I need it to live-read a log file.
What is the right way to do it?
A more natural approach, using FileSystemWatcher:
var wh = new AutoResetEvent(false);
var fsw = new FileSystemWatcher(".");
fsw.Filter = "file-to-read";
fsw.EnableRaisingEvents = true;
fsw.Changed += (s, e) => wh.Set();

var fs = new FileStream("file-to-read", FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
using (var sr = new StreamReader(fs))
{
    var s = "";
    while (true)
    {
        s = sr.ReadLine();
        if (s != null)
            Console.WriteLine(s);
        else
            wh.WaitOne(1000);
    }
}
wh.Close();
Here the main reading loop stops to wait for incoming data, and the FileSystemWatcher is used just to wake the main reading loop.
You want to open a FileStream in binary mode. Periodically, seek to the end of the file minus 1024 bytes (or whatever), then read to the end and output. That's how tail -f works.
Answers to your questions:
Binary because it's difficult to randomly access the file if you're reading it as text. You have to do the binary-to-text conversion yourself, but it's not difficult. (See below)
1024 bytes because it's a nice convenient number, and should handle 10 or 15 lines of text. Usually.
Here's an example of opening the file, reading the last 1024 bytes, and converting it to text:
static void ReadTail(string filename)
{
    using (FileStream fs = File.Open(filename, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
    {
        // Seek 1024 bytes from the end of the file (or to the start if it's smaller)
        fs.Seek(-Math.Min(1024, fs.Length), SeekOrigin.End);
        // read up to 1024 bytes
        byte[] bytes = new byte[1024];
        int bytesRead = fs.Read(bytes, 0, 1024);
        // Convert only the bytes actually read to a string
        string s = Encoding.Default.GetString(bytes, 0, bytesRead);
        // or: string s = Encoding.UTF8.GetString(bytes, 0, bytesRead);
        // and output to console
        Console.WriteLine(s);
    }
}
Note that you must open with FileShare.ReadWrite, since you're trying to read a file that's currently open for writing by another process.
Also note that I used Encoding.Default, which in US/English and for most Western European languages will be an 8-bit character encoding. If the file is written in some other encoding (like UTF-8 or another Unicode encoding), it's possible that the bytes won't convert correctly to characters. You'll have to handle that by determining the encoding if you think this will be a problem. Search Stack Overflow for info about determining a file's text encoding.
If you want to do this periodically (every 15 seconds, for example), you can set up a timer that calls the ReadTail method as often as you want. You could optimize things a bit by opening the file only once at the start of the program. That's up to you.
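For example, a minimal sketch of the timer approach, assuming the ReadTail method above; the 15-second interval and the file name are arbitrary:
var timer = new System.Threading.Timer(
    _ => ReadTail("application.log"),   // callback runs on a thread-pool thread
    null,
    dueTime: TimeSpan.Zero,             // fire immediately...
    period: TimeSpan.FromSeconds(15));  // ...then every 15 seconds
Console.ReadLine();                     // keep the process alive
timer.Dispose();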
To continuously monitor the tail of the file, you just need to remember the previous length of the file.
public static void MonitorTailOfFile(string filePath)
{
    var initialFileSize = new FileInfo(filePath).Length;
    var lastReadLength = initialFileSize - 1024;
    if (lastReadLength < 0) lastReadLength = 0;

    while (true)
    {
        try
        {
            var fileSize = new FileInfo(filePath).Length;
            if (fileSize > lastReadLength)
            {
                using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
                {
                    fs.Seek(lastReadLength, SeekOrigin.Begin);
                    var buffer = new byte[1024];
                    while (true)
                    {
                        var bytesRead = fs.Read(buffer, 0, buffer.Length);
                        lastReadLength += bytesRead;
                        if (bytesRead == 0)
                            break;
                        var text = Encoding.ASCII.GetString(buffer, 0, bytesRead);
                        Console.Write(text);
                    }
                }
            }
        }
        catch { }

        Thread.Sleep(1000);
    }
}
I had to use ASCII encoding, because this code isn't smart enough to cater for the variable character lengths of UTF-8 on buffer boundaries.
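If UTF-8 matters, one option (not part of the original answer, just a sketch) is to keep a Decoder across reads; it remembers a partial multi-byte sequence at the end of one buffer and finishes it with the next. The inner read loop above (using the same fs and lastReadLength) could then look roughly like this:
var decoder = Encoding.UTF8.GetDecoder();
var buffer = new byte[1024];
var chars = new char[Encoding.UTF8.GetMaxCharCount(buffer.Length)];

int bytesRead;
while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0)
{
    lastReadLength += bytesRead;
    // the decoder carries state between calls, so a character split across
    // two buffers is decoded correctly instead of being garbled
    int charCount = decoder.GetChars(buffer, 0, bytesRead, chars, 0);
    Console.Write(chars, 0, charCount);
}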
Note: you can change the Thread.Sleep part to use different timings, and you can also link it with a FileSystemWatcher and a blocking pattern (Monitor.Enter/Wait/Pulse), as sketched below. For me the timer is enough; at most it only checks the file length once a second if the file hasn't changed.
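A rough sketch of that combination (names and the one-second timeout are my own, and filePath is assumed to be a full path): the watcher pulses a lock object and the polling loop waits on it instead of sleeping unconditionally.
var gate = new object();
var fsw = new FileSystemWatcher(Path.GetDirectoryName(filePath), Path.GetFileName(filePath));
fsw.Changed += (s, e) => { lock (gate) Monitor.Pulse(gate); };
fsw.EnableRaisingEvents = true;

while (true)
{
    // ... read from lastReadLength to the end of the file, as in MonitorTailOfFile ...
    lock (gate)
        Monitor.Wait(gate, TimeSpan.FromSeconds(1));   // wake on a change, or after 1s as a fallback
}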
This is my solution
static IEnumerable<string> TailFrom(string file)
{
    using (var reader = File.OpenText(file))
    {
        while (true)
        {
            string line = reader.ReadLine();

            // if the file was truncated, start over from the beginning
            if (reader.BaseStream.Length < reader.BaseStream.Position)
                reader.BaseStream.Seek(0, SeekOrigin.Begin);

            if (line != null) yield return line;
            else Thread.Sleep(500);
        }
    }
}
So, in your code you can do:
foreach (string line in TailFrom(file))
{
    Console.WriteLine($"line read= {line}");
}
You could use the FileSystemWatcher class, which can send notifications for different events happening on the file system, such as a file being changed.
private void button1_Click(object sender, EventArgs e)
{
    if (folderBrowserDialog.ShowDialog() == DialogResult.OK)
    {
        path = folderBrowserDialog.SelectedPath;
        fileSystemWatcher.Path = path;
        string[] str = Directory.GetFiles(path);
        string line;
        fs = new FileStream(str[0], FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
        tr = new StreamReader(fs);
        while ((line = tr.ReadLine()) != null)
        {
            listBox.Items.Add(line);
        }
    }
}

private void fileSystemWatcher_Changed(object sender, FileSystemEventArgs e)
{
    string line;
    line = tr.ReadLine();
    listBox.Items.Add(line);
}
If you are just looking for a tool to do this, then check out the free version of BareTail.
I'm writing a simple sample program that should write data to a file and read it in real time when there is some data. I wrote this code:
using System;
using System.IO;
using System.Reflection;
using System.Threading;

namespace Sample
{
    class Program
    {
        static void Main(string[] args)
        {
            const string file = "sample.txt";

            var thread = new Thread(() =>
            {
                var r = new Random();
                using (var sw = new StreamWriter(new FileStream(file, FileMode.Create, FileAccess.ReadWrite, FileShare.ReadWrite)) { AutoFlush = true })
                {
                    while (true)
                    {
                        Thread.Sleep(r.Next(100, 500));
                        sw.WriteLine(DateTime.Now);
                    }
                }
            });
            thread.Start();

            using (var watcher = new FileSystemWatcher
            {
                Path = Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location),
                Filter = file,
                NotifyFilter = NotifyFilters.LastWrite
            })
            {
                var mse = new ManualResetEventSlim(false);
                watcher.Changed += (sender, eventArgs) => mse.Set();
                watcher.EnableRaisingEvents = true;

                using (var sr = new StreamReader(new FileStream(file, FileMode.Open, FileAccess.ReadWrite, FileShare.ReadWrite)))
                {
                    while (true)
                    {
                        mse.Wait();
                        ProcessData(sr.ReadToEnd());
                        mse.Reset();
                    }
                }
            }
        }

        private static void ProcessData(string s)
        {
            Console.WriteLine(s);
        }
    }
}
But it seems that the watcher only works when the file is opened, not while it is being populated with data (even with the AutoFlush flag enabled on the StreamWriter). The data is physically on the disk, but the watcher doesn't raise a Changed event.
I just want to avoid an infinite polling loop and process data only when it is actually written.
If you don't close them explicitly, I think you should dispose of both the FileStream and the StreamWriter by putting each in a using statement:
string file = "sample.txt";
using (var fs = new FileStream(file, FileMode.Create,
                               FileAccess.ReadWrite, FileShare.ReadWrite))
using (var sw = new StreamWriter(fs) { AutoFlush = true })
{
    ..
    ..
}
Until the writer is closed, either directly or by disposing it, the file system is not informed that the changes are done; otherwise you would be flooded with Changed events.
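Following that reasoning, if you need a Changed event per write, one workaround (just a sketch, using the same file constant as in the sample program) is to open, write, and dispose the stream for each entry, the way the WriteToFile example earlier on this page does, so the file system is notified after every append:
while (true)
{
    using (var fs = new FileStream(file, FileMode.Append, FileAccess.Write, FileShare.ReadWrite))
    using (var sw = new StreamWriter(fs))
    {
        sw.WriteLine(DateTime.Now);
    }   // disposing flushes and closes, so the watcher sees the change
    Thread.Sleep(250);
}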
I simply want to merge all text files in a given directory, similar to the following command prompt command:
cd $directory
copy * result.txt
I've written the following code, which almost accomplishes what I want, but it's doing something strange. When the StreamWriter writes the first file (or when i = 0), it doesn't actually write any content - the file size remains 0 bytes, despite the first file being ~300 KB. However, the other file writes execute successfully.
If I compare the output from the command prompt to the output from the C# code in diff, you can see that a large block of text is missing. Additionally, the command prompt result is 1,044 KB, whereas the C# result is 700 KB.
string[] txtFiles = Directory.GetFiles(filepath);
using (StreamWriter writer = new StreamWriter(filepath + "result.txt"))
{
    for (int i = 0; i < txtFiles.Length; i++)
    {
        using (StreamReader reader = File.OpenText(txtFiles[i]))
        {
            writer.Write(reader.ReadToEnd());
        }
    }
}
Am I using the StreamWriter / StreamReader incorrectly?
A minimalistic implementation, reading the bytes and writing them instead of using a stream for reading. Please note that you should handle the IOException correctly to avoid misbehavior:
var newline = Encoding.ASCII.GetBytes(Environment.NewLine);
var files = Directory.GetFiles(filepath);
try
{
    using (var writer = File.Open(Path.Combine(filepath, "result.txt"), FileMode.Create))
    {
        foreach (var text in files.Select(File.ReadAllBytes))
        {
            writer.Write(text, 0, text.Length);
            writer.Write(newline, 0, newline.Length);
        }
    }
}
catch (IOException)
{
    // File might be used by a different process, or you have insufficient permissions
}
Here, hope it helps you. Note: by copying from one stream to another you save some RAM and greatly improve performance.
class Program
{
    static void Main(string[] args)
    {
        string filePath = @"C:\Users\FunkyName\Desktop";
        string[] txtFiles = Directory.GetFiles(filePath, "*.txt");

        using (Stream stream = File.Open(Path.Combine(filePath, "result.txt"), FileMode.OpenOrCreate))
        {
            for (int i = 0; i < txtFiles.Length; i++)
            {
                string fileName = txtFiles[i];
                try
                {
                    using (Stream fileStream = File.Open(fileName, FileMode.Open, FileAccess.Read))
                    {
                        fileStream.CopyTo(stream);
                    }
                }
                catch (IOException e)
                {
                    // Handle file open exception
                }
            }
        }
    }
}
I ran your code, and it works properly! Only change the line:
using (StreamWriter writer = new StreamWriter(filepath + "result.txt"))
to:
using (StreamWriter writer = new StreamWriter(filepath + "/result.txt"))
I guess you can't see the file because it is saved in another folder.
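A small sketch of a safer way to build the output path, so it does not depend on whether filepath ends with a separator (the same Path.Combine idea used in the other answers):
using (StreamWriter writer = new StreamWriter(Path.Combine(filepath, "result.txt")))
{
    // ... same merge loop as in the question ...
}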
I have 2 applications, one is writing to a file, and the other one reads the file. It's a log file, so the writer will be logging until the program stops, while the reader could be invoked any time to get the content of the file.
I thought that when the writer opens the file with FileShare.Read, the reader would be able to access the file, but it produces an error saying that the file is being used by another process.
Writer Application:
FileStream fs = new FileStream("file.log", FileMode.Open, FileAccess.Write, FileShare.Read);
BinaryWriter writer = new BinaryWriter(fs);
Reader Application:
BinaryReader reader = new BinaryReader(File.OpenRead("file.log"));
How do I prevent this error?
Can you try specifying FileShare.ReadWrite when reading the file as well? Instead of directly using File.OpenRead, use a FileStream with that share mode.
Also, for logging, you can use log4net or any other free logging framework, which manages logging efficiently so you don't have to manage writing to files yourself.
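A minimal sketch of that suggestion, applied to the reader application from the question (the file name is the one used there):
using (var fs = new FileStream("file.log", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (var reader = new BinaryReader(fs))
{
    // read the log content here instead of using File.OpenRead
}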
To read a locked file you are going to need to provide more flags for the FileStream.
Code such as the example below:
using (var reader = new FileStream("d:\\test.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    using (var binary = new BinaryReader(reader))
    {
        //todo: add your code
        binary.Close();
    }
    reader.Close();
}
This would open the file for reading only, with a share mode of read/write. This can be tested with a small app (using StreamReader/StreamWriter instead of binary).
static Thread writer, reader;
static bool abort = false;

static void Main(string[] args)
{
    var fs = File.Create("D:\\test.txt");
    fs.Dispose();

    writer = new Thread(new ThreadStart(testWriteLoop));
    reader = new Thread(new ThreadStart(testReadLoop));
    writer.Start();
    reader.Start();

    Console.ReadKey();
    abort = true;
}

static void testWriteLoop()
{
    using (FileStream fs = new FileStream("d:\\test.txt", FileMode.Open, FileAccess.Write, FileShare.Read))
    {
        using (var writer = new StreamWriter(fs))
        {
            while (!abort)
            {
                writer.WriteLine(DateTime.Now.ToString());
                writer.Flush();
                Thread.Sleep(1000);
            }
        }
    }
}

static void testReadLoop()
{
    while (!abort)
    {
        Thread.Sleep(1000);
        using (var reader = new FileStream("d:\\test.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
            using (var stream = new StreamReader(reader))
            {
                Console.WriteLine(stream.ReadToEnd());
                stream.Close();
            }
            reader.Close();
        }
    }
}
I realize the example above is pretty simple, but the fact remains that testWriteLoop never releases the lock.
Hope this helps