Get data from Console to create log entries for log4net - C#

I have some external libraries that I am using that are logging to the console. I want these to log via log4net.
NOTE: I am NOT wanting to log to the console from log4net; that should be straightforward.
What I have discovered thus far:
1) The Console.SetOut method allows redirecting output to a different writer.
1.1) Overriding MemoryStream seemed promising, but there is no way to raise an event to notify of changes.
2) Writing the console output to a file seems like a workaround, where I can read the file to update the UI textbox with new logs.
3) FileStreams can auto-flush, which means information is updated automatically. This concept is similar to what I am after?
What's the best way to get the console information into log4net so that it can publish console log items the same way log4net is configured? Currently my log4net puts logs into the event log, into a data-bound WPF textbox, and into a file.

Personally, I don't know any solution for this case other than the one you wrote:
redirect the output of the console, pointing it to a file
read the file and add the new lines to a logger
To be notified about changes you can try to use FileSystemWatcher: http://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher.aspx.
Or, if you don't want "real time" notification, you can open the file read-only and check with a timer whether there are any rows after the last saved reader position.
But I think the first option is much easier.
Hope this helps.
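For completeness, Console.SetOut accepts any TextWriter, so a thin wrapper can forward each completed line straight to log4net without going through a file at all. A minimal sketch (the logger name "ConsoleRedirect" is just an illustration, not anything log4net defines):

```csharp
using System;
using System.IO;
using System.Text;
using log4net;

// Forwards every line written to Console straight into log4net.
public class Log4netTextWriter : TextWriter
{
    private static readonly ILog Log = LogManager.GetLogger("ConsoleRedirect");

    public override Encoding Encoding => Encoding.UTF8;

    // Each complete console line becomes one log4net entry.
    public override void WriteLine(string value) => Log.Info(value);
}

// At startup:
// Console.SetOut(new Log4netTextWriter());
```

One caveat: libraries that write character by character would need a Write(char) override that buffers until a newline; the sketch above only covers WriteLine calls.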

Related

Serilog: Prevent Empty Log File Creation

EDIT: The accepted answer below works partially. It does defer file creation, but it does not respect a minimum LogEventLevel (the deferred creation happens as soon as any logging occurs, even if the event never ends up written to the file). It does, however, accomplish my overarching goal of "no blank file creation" (and I will just log errors in my application), so I'm keeping it as the accepted answer.
My final code looks like this:
new LoggerConfiguration().WriteTo.Map("", "", (_, writeTo) => writeTo.File("SampleName.log")).CreateLogger();
I have a simple console application in which I'm using Serilog and a File sink:
new LoggerConfiguration().WriteTo.File("SampleName.log", LogEventLevel.Error).CreateLogger();
Ideally, I would only want it to create a log file on disk when there is something to log (errors in my scenario). However, if the application runs and completes without errors, Serilog will still create an empty file. Is there a way to instruct Serilog to only create a file when it has something to log?
This is not possible as of this writing. The file is created on disk as soon as the FileSink instance is created and there's no configuration option to change that behavior.
You could create a sink that wraps the FileSink and lazily create the FileSink instance whenever the first log event is emitted.
Or, you could delete the empty file at the end, just before your app ends (but after you call Log.CloseAndFlush).
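A rough sketch of the wrapping approach, assuming Serilog's ILogEventSink interface and the Serilog.Sinks.File package; the minimum-level check also addresses the limitation mentioned in the edit above, since below-level events never touch the disk:

```csharp
using System;
using Serilog;
using Serilog.Core;
using Serilog.Events;

// Creates the inner file sink (and therefore the file) only when the
// first event at or above the minimum level is emitted.
public class LazyFileSink : ILogEventSink, IDisposable
{
    private readonly LogEventLevel _minLevel;
    private readonly Lazy<Logger> _inner;

    public LazyFileSink(string path, LogEventLevel minLevel)
    {
        _minLevel = minLevel;
        _inner = new Lazy<Logger>(() =>
            new LoggerConfiguration()
                .WriteTo.File(path)
                .CreateLogger());
    }

    public void Emit(LogEvent logEvent)
    {
        if (logEvent.Level < _minLevel) return; // no file is created
        _inner.Value.Write(logEvent);
    }

    public void Dispose()
    {
        if (_inner.IsValueCreated) _inner.Value.Dispose();
    }
}

// Usage:
// var log = new LoggerConfiguration()
//     .WriteTo.Sink(new LazyFileSink("SampleName.log", LogEventLevel.Error))
//     .CreateLogger();
```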

Read a log file created by NLog and write in it simultanously

I have an application that writes logs into a file created by NLog.
And I have another application, a log viewer, that can open and read the log file mentioned above. But there is a problem: while the log viewer is reading the log file and the first application is writing to it, some of the log lines cannot be seen in the log viewer.
For example, if the first application writes a log line every millisecond, the log viewer cannot track the new log lines and misses some of them. I need an online log viewer that can track any new log lines. I do not want to read the whole file on every call; I just need to read the new log lines in it.
The only way you can know that a log line is new is by knowing which position in the file you last read (e.g. long lastPosition = 0;).
You need to read from that position until the end of the file; the end-of-file position is the same as the file length. When that block has been read, you show what you read in the viewer and save the last position into the lastPosition variable, where the viewer needs to start reading next time.
But if the viewer can't open the file... that's another story.
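The steps above can be sketched as follows; FileShare.ReadWrite is the important part, since the writing application still has the file open:

```csharp
using System;
using System.Collections.Generic;
using System.IO;

public class LogTailReader
{
    private long _lastPosition; // byte offset reached on the previous read

    // Returns only the lines appended since the previous call.
    public IEnumerable<string> ReadNewLines(string path)
    {
        var lines = new List<string>();
        using (var fs = new FileStream(path, FileMode.Open,
                   FileAccess.Read, FileShare.ReadWrite))
        {
            if (fs.Length < _lastPosition)
                _lastPosition = 0; // file was truncated or rolled over
            fs.Seek(_lastPosition, SeekOrigin.Begin);
            using (var reader = new StreamReader(fs))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                    lines.Add(line);
            }
            _lastPosition = fs.Length; // resume here next time
        }
        return lines;
    }
}
```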
Having both applications share the same active log file is likely to be problematic. Probably the easiest solution is to have your viewer copy the original log file and view its own dedicated copy. You can periodically check whether the actual log file has updates and make new copies accordingly.
Having both access the same file will require locking, and risks causing issues in your application if the file is unavailable to write to (possibly blocking, losing log entries, or generating exceptions).
The best solution is to set up an NLog target for a database. Keeping track of the last updated row is easier and safer than tracking a file position.
I wouldn't recommend sharing an active log file for both reading and writing.
How to set up NLog Database target.

log4net disable question

I am currently coding a very large project. I am near the end of the development stage and have sent out an evaluation version to the consumer so he can check for bugs and whatnot.
He told me that a large log file is being created every time the application starts up.
After some investigation I found out that the log file is created using log4net via an external DLL which I use in my application.
I have searched a bit and found that you can disable log4net logging through an app.config file by adding several values there. I tried this, but with no success; the logs are created no matter which value I use in the XML config file.
I have no access to the external DLL source code, and I have searched for an XML configuration file that the DLL uses, but have not found anything.
I would like to disable log4net completely. The logs are pretty useless to me, since I use my own log engine, which I can configure any way I want.
Thanks for any help
When you initialize log4net you can specify the config file.
Check the log4net initialization in your app to see where it gets its instructions.
Another way is to configure log4net to use the Windows event log rather than a file, and to configure a limit on the log size.
You can specify the config file that log4Net uses; by default it is named log4net.config
If that's not the case, I would suggest firing up Process Monitor to check which file it is using.
[BTW, I would be slightly cautious about using a third-party DLL that I didn't have control over.]
Log4net can also be configured programmatically. I wonder if the DLL is doing that? If so, maybe you could use the log4net API to turn it off. I don't have a good example of how to do that right now because I am on my phone.
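As a sketch of that programmatic route (the exact calls depend on the log4net version; in recent versions GetRepository can also take an Assembly argument):

```csharp
using log4net;
using log4net.Core;

// Raises the repository threshold so every log event is dropped,
// regardless of how the external DLL configured its loggers.
public static class Log4netSilencer
{
    public static void SilenceAll()
    {
        var repository = LogManager.GetRepository();
        repository.Threshold = Level.Off;

        // Alternatively, shut the logging system down entirely:
        // LogManager.Shutdown();
    }
}
```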

rewriting the same file in rapid succession?

I am working on an app that will keep a running index of work accomplished.
I could write once at the end of a work session, but I don't want to risk losing data if something blows up. Therefore, I rewrite the file on disk (XML) every time a new entry or a correction is made by the user.
private void WriteIndexFile()
{
    XmlDocument indexDoc = new XmlDocument();
    // Build document here
    using (XmlTextWriter tw = new XmlTextWriter(_filePath, Encoding.UTF8))
    {
        tw.Formatting = Formatting.Indented;
        indexDoc.Save(tw);
    }
}
It is possible for the writes to be triggered in rapid succession. If this happens, it tries to open the file for writing before the prior write is complete. (While it would not be normal, I suppose it is possible that the file gets opened for use by another program.)
How can I check if the file can be re-written?
Edit for clarification: This is part of an automated lab data collection system. The users will click a button to capture data (saved in separate files) and identify the sub-task that the data package is for. Typically, it will be 3-10 minutes between clicks.
If they make an error, they need to be able to go back and correct it, so it's not an append-only usage.
Finally, the files will be read by other automated tools and manually by humans. (XML/XSLT)
The size will be limited as each work session (worker shift or less) will have a new index file generated.
Further question: As the overwhelming consensus is to avoid XML and write in an append-only mode, how would I meet the requirement of going back and correcting earlier entries?
I am considering having a "dirty" flag, and saving a few minutes after the flag is set as well as upon closing the work session. If multiple edits happen in that time, only one write will occur, so no more rapid writes. I would also have a retry/cancel dialog if the save fails. Thoughts?
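That "dirty flag" idea could be sketched like this (the two-minute interval and the injected write delegate are illustrative choices, not requirements):

```csharp
using System;
using System.Timers;

public class DebouncedIndexSaver
{
    private volatile bool _dirty;
    private readonly Timer _saveTimer = new Timer(120_000); // 2 minutes
    private readonly Action _writeIndexFile;

    public DebouncedIndexSaver(Action writeIndexFile)
    {
        _writeIndexFile = writeIndexFile;
        _saveTimer.Elapsed += (s, e) => Flush();
        _saveTimer.Start();
    }

    // Call on every user edit; cheap, no disk I/O.
    public void MarkDirty() => _dirty = true;

    // Called by the timer, and explicitly when the work session closes.
    public void Flush()
    {
        if (!_dirty) return;
        _writeIndexFile(); // many edits collapse into one write
        _dirty = false;
    }
}
```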
XML is a poor choice in your case because new content has to be inserted before the closing tag. Use text instead: simply open the file for append and write the new content at the end of the file; see How to: Open and Append to a Log File.
You can also look into a simple logging framework like log4net and use that instead of handling the low-level file work yourself.
If all you want is a simple log of all operations, XML may be the wrong choice here as it is difficult to append to an XML document without rewriting the whole file, which will become slower and slower as the file grows.
I'd suggest instead File.AppendText, or even better: keeping the file open for the duration of the application's lifetime and using WriteLine.
(Oh, and as others have pointed out, you need to lock to ensure that only one thread writes to the file at a time. This is still true even with this solution.)
There are also logging frameworks that already solve this problem, such as log4net. Have you considered using an existing logging framework instead of rolling your own?
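A minimal append-only sketch of the File.AppendText suggestion (the tab-separated, ISO-8601-timestamped format is just one choice):

```csharp
using System;
using System.IO;

public static class AppendLog
{
    // Appends one line per entry; nothing earlier in the file is rewritten,
    // so the cost per write stays constant as the log grows.
    public static void Write(string path, string entry)
    {
        using (StreamWriter sw = File.AppendText(path))
        {
            sw.WriteLine($"{DateTime.Now:O}\t{entry}");
        }
    }
}
```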
I have a logger that uses System.Collections.Queue. Basically, it waits until something is queued and then tries to write it. While items are being written, which could be slow, more items can be added to the queue.
This also helps by grouping messages rather than trying to keep up. It runs on a separate thread.
// Note: System.Collections.Queue is not thread-safe; wrap it with
// Queue.Synchronized or guard it with a lock.
private AutoResetEvent ResetEvent { get; set; }

private void LogMessage(string fullMessage)
{
    this.logQueue.Enqueue(fullMessage);
    // Signal the worker thread that a new message is available.
    this.ResetEvent.Set();
}

private void ProcessQueueMessages()
{
    while (this.Running)
    {
        // This will process all the items currently in the queue.
        while (this.logQueue.Count > 0)
        {
            // This method logs and dequeues the top item on the queue.
            this.LogQueueItem();
        }
        // Once the queue is empty, wait until another message is
        // queued before running again. Blocking on the event avoids
        // polling with something like System.Threading.Thread.Sleep(1000).
        this.ResetEvent.WaitOne();
    }
}
I handle write failures by not dequeueing an item until it has been written to the file with no errors; I just keep attempting until the write finally succeeds. This saved me once when somebody removed permissions from one of our apps while it was running. Permission was given back without shutting down our app, and we didn't lose a single log statement.
Consider using a flat text file. I have a process that I wrote that uses an XML log; it was a poor choice. You can't just write out the state as you run without constantly rewriting the file to keep the tags correct. If entries were written flat to a file, you would have an automatic timeline giving you details of what happened, without trying to figure out whether it was the XML writer/tag set that blew up, and you don't have to worry about your logs bloating as much.
I agree with others suggesting you avoid XML. Also, I would suggest you have one component (a "monitor") that is responsible for all access to the file. That component will have the job of handling multiple simultaneous requests and making the disk writes happen one after another.

C#: tail like program for text file

I have a log file that continually logs short lines. I need to develop a service that reacts (or polls, or listens) to new lines added to that file, a sort of Unix tail program, so that my service is always up to date regarding the file.
I don't think that opening a read stream and keeping it opened is a good idea. Maybe I should use the FileSystemWatcher class.
Long story short, I need to parse in real time every new line added to this file.
Any idea, help, or indication is really appreciated.
EDIT
Since I was not very clear: I do not need an existing program, I am writing a program for reading (then processing) every new line added to the file. What I'm looking for is a methodology (or: how to implement this?) for continually tailing a file that keeps being written to.
I have to develop a Windows service that "listens" to this file and performs operations on every new line.
So, if in a given moment the file is:
12.31.07 - jdoe [log on] 347
12.32.08 - ssmith [log on] 479
12.32.08 - mpeterson [log off] 532
12.32.09 - apacino [log on] 123
in the very moment that the line
12.32.11 - pchorr [log on] 127
is added to the log file by the logging program (which I have no access to), I need my Windows service to "react" to the addition, intercept the new line (12.32.11 - pchorr [log on] 127), and process it. And so on.
Now, I don't know how to do this. I could poll the file every n seconds, store the last read line in memory, and process only the newly added lines. The problem with this is that it is very slow; plus, I'd be reading a very large file every time.
Or maybe I could use FileSystemWatcher, but I haven't found any example of using it for similar purposes.
So, what would you suggest to get the work done? Thanks.
I would recommend using FileSystemWatcher to be notified of changes to the file or files you're concerned about. From there, I would cache information such as the size of the file between events and add some logic to only respond to full lines, etc. You can use the Seek() method of the FileStream class to jump to a particular point in the file and read only from there. Given these features, it shouldn't be too hard to hand-roll this functionality if that's what you need.
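A rough sketch of that combination, tracking the byte offset between change events; OnNewLine is a hypothetical callback standing in for your processing logic:

```csharp
using System;
using System.IO;

public class LogWatcher
{
    private long _lastPosition; // size of the file at the last read

    public void Start(string directory, string fileName)
    {
        var watcher = new FileSystemWatcher(directory, fileName)
        {
            NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.Size
        };
        watcher.Changed += (s, e) => ReadNewLines(e.FullPath);
        watcher.EnableRaisingEvents = true;
    }

    private void ReadNewLines(string path)
    {
        // FileShare.ReadWrite lets us read while the logger keeps writing.
        using (var fs = new FileStream(path, FileMode.Open,
                   FileAccess.Read, FileShare.ReadWrite))
        {
            if (fs.Length < _lastPosition)
                _lastPosition = 0; // log was rolled or truncated
            fs.Seek(_lastPosition, SeekOrigin.Begin);
            using (var reader = new StreamReader(fs))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                    OnNewLine(line); // hypothetical processing callback
            }
            _lastPosition = fs.Length;
        }
    }

    private void OnNewLine(string line) { /* process the line here */ }
}
```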
A simple solution would be to use the sample code provided in the http://www.codeproject.com/Articles/7568/Tail-NET article. It is just one function to copy/paste into your code.
It is important to note that Windows (since Vista/Server 2008) no longer promptly updates file metadata while a file is being written to (such as a log file being updated by a service).
For example, the file's modified date will not be updated until the file is closed by the service/program that is updating the log file.
Therefore FileSystemWatcher may NOT catch log file updates as you might expect.
https://blogs.technet.microsoft.com/asiasupp/2010/12/14/file-date-modified-property-are-not-updating-while-modifying-a-file-without-closing-it/
You haven't really explained whether you need a tail-like program for Windows (e.g. http://www.baremetalsoft.com/baretail/), a Windows version of tail (use Cygwin), or some sort of log-monitoring API...

Categories

Resources