I am logging data as shown below:
Log.Logger = new LoggerConfiguration()
    .WriteTo.File("log.txt", rollOnFileSizeLimit: true, retainedFileCountLimit: 1, fileSizeLimitBytes: 10)
    .CreateLogger();
This creates a file set like this:
log.txt => log_001.txt => log_002.txt
Since I set retainedFileCountLimit = 1, as soon as log_001.txt was created, log.txt was deleted, and so on.
Is there any way to keep writing to a single log.txt file, instead of rolling to files with incrementing numeric suffixes?
There is a forked sink, Serilog.Sinks.PersistentFile, that does this.
But it's a very debatable ask, which is why it's not supported by the standard Serilog.Sinks.File for the foreseeable future.
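If you go with the fork, usage mirrors the standard file sink. Here's a minimal sketch, assuming the fork's WriteTo.PersistentFile() extension and its preserveLogFilename parameter (names taken from my reading of the fork's README; verify against the version you install):

Log.Logger = new LoggerConfiguration()
    // Keeps writing to log.txt and rolls older content out to numbered files,
    // instead of renaming the file currently being written.
    .WriteTo.PersistentFile("log.txt",
        rollOnFileSizeLimit: true,
        retainedFileCountLimit: 1,
        fileSizeLimitBytes: 10,
        preserveLogFilename: true)
    .CreateLogger();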
I want to log everything to a file, and log only the errors to my Oracle database. I'm using the code below, but it's not working.
Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Verbose().WriteTo.File("logs/myLog.txt", rollingInterval: RollingInterval.Minute)
    .MinimumLevel.Error().WriteTo.Oracle(cfg =>
        cfg.WithSettings(logConnectionString, tableSpaceAndTableName: "MY_TABLE")
           .UseBurstBatch(true, 1000, true, 1)
           .CreateSink())
    .CreateLogger();
In this case, I log only error data, both to the file and to the Oracle database. How must I configure the code above to store everything in the file and only the errors in the database?
Almost all sinks have a restrictedToMinimumLevel parameter. There's no Serilog.Sinks.Oracle project in the Serilog GitHub organization, so I assume you're using the third-party Serilog.Sinks.Oracle package. Its Oracle() method accepts a restrictedToMinimumLevel parameter as well.
You can change your code to:
Log.Logger = new LoggerConfiguration()
    .WriteTo.File("logs/myLog.txt",
        rollingInterval: RollingInterval.Minute,
        restrictedToMinimumLevel: LogEventLevel.Verbose)
    .WriteTo.Oracle(cfg =>
        cfg.WithSettings(logConnectionString, tableSpaceAndTableName: "MY_TABLE")
           .UseBurstBatch(true, 1000, true, 1)
           .CreateSink(),
        restrictedToMinimumLevel: LogEventLevel.Error)
    .CreateLogger();
Configuring database logging typically takes more code than this, so it's a good idea to extract it into a separate method. You'll probably want additional columns to hold common attributes like categories, activity IDs etc., so you don't have to parse the JSON payload to find specific events:
ILogEventSink ConfigureOracle(BatchLoggerConfiguration cfg)
{
    const string column = "ADDITIONALDATACOLUMN";
    var columnOptions = new ColumnOptions
    {
        AdditionalDataColumns = new List<DataColumn>
        {
            new DataColumn(column, typeof(string))
        }
    };

    // Hand the column options to the sink so the extra column is actually used
    // (columnOptions is a parameter of Serilog.Sinks.Oracle's WithSettings).
    return cfg.WithSettings(logConnectionString, tableSpaceAndTableName: "MY_TABLE", columnOptions: columnOptions)
              .UseBurstBatch(true, 1000, true, 1)
              .CreateSink();
}
...
Log.Logger = new LoggerConfiguration()
    .WriteTo.File("logs/myLog.txt",
        rollingInterval: RollingInterval.Minute,
        restrictedToMinimumLevel: LogEventLevel.Verbose)
    .WriteTo.Oracle(ConfigureOracle, LogEventLevel.Error)
    .CreateLogger();
Organizing database logging code
I also log to a database. Putting everything into a single expression may be fashionable, but the resulting code can quickly become too hard to read and maintain. Fluent APIs aren't always a good idea.
In this case, one almost always needs to specify extra columns to hold common properties like categories, activity IDs, perhaps even customer IDs. No matter the database product, querying raw JSON payloads is more expensive than querying materialized and indexed columns.
It will take some experimentation until you get a table you can actually use for troubleshooting. The database logging configuration should be extracted into a separate method, if not a separate file, as sketched below. Otherwise Startup.cs or, in .NET 6, Program.cs, will become an unreadable mess.
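One way to do that extraction, reusing the configuration from above (LoggingSetup is a made-up name; only the pattern matters):

static class LoggingSetup
{
    // Everything database-specific lives here, away from Program.cs.
    public static Serilog.ILogger CreateLogger(string logConnectionString) =>
        new LoggerConfiguration()
            .WriteTo.File("logs/myLog.txt",
                rollingInterval: RollingInterval.Minute,
                restrictedToMinimumLevel: LogEventLevel.Verbose)
            .WriteTo.Oracle(cfg =>
                cfg.WithSettings(logConnectionString, tableSpaceAndTableName: "MY_TABLE")
                   .UseBurstBatch(true, 1000, true, 1)
                   .CreateSink(),
                restrictedToMinimumLevel: LogEventLevel.Error)
            .CreateLogger();
}

// Program.cs then shrinks to a single line:
// Log.Logger = LoggingSetup.CreateLogger(logConnectionString);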
Recently I needed Serilog to write to a templated path based on the date of each LogEvent.
After figuring out how to implement this, I ended up resolving the path on the fly from the LogEvent's Timestamp, using Serilog.Sinks.Map, as shown below:
return new LoggerConfiguration().WriteTo
    .Map(
        // Log key: one sink per calendar day
        (LogEvent le) => le.Timestamp.Date,
        // Log action: configure a file sink for that day's path
        (DateTime date, LoggerSinkConfiguration lc) =>
        {
            string path = GetFilePath(date, logName);
            lc.File(path);
        }
    );
public string GetFilePath(DateTime date, string logName) =>
Path.Combine("./Logs", $"{date:yyyy-MM-dd}", $"{logName}.log");
With this, I achieved my goal: writing logs into a subfolder based on the date.
The issue is that, since Serilog does not know the target path has changed, it does not close or dispose the file streams as expected. So my application leaves files open day after day, ad infinitum.
It'd be great if someone who has faced this could tell me how to manually close those streams, or whether the Serilog API exposes a way to close them automatically.
Btw, I am using
Serilog 2.9.0
Serilog.Sinks.File 4.1.0
Serilog.Sinks.Map 1.0.1
Edit 05/06/2020 for those reading this afterwards.
Keying every single log event by its timestamp is a bad idea. By doing so, we are in fact adding a map entry per log event (assuming, for simplicity, that no two events are emitted at the same time).
Even if we specify sinkMapCountLimit: 0, which in theory keeps no sinks in the map, sinks that are configured to write to a file (especially with the RollingFile sink) won't be disposed nor removed from memory.
So the chunk of code above is leaking memory (and pretty fast).
The Serilog.Sinks.Map documentation warns about this, indeed:
...but isn't suitable when the set of possible key values is open-ended.
Serilog.Sinks.Map accepts a sinkMapCountLimit parameter to control this:
return new LoggerConfiguration().WriteTo
    .Map(
        // Log key
        (LogEvent le) => le.Timestamp.Date,
        // Log action
        (DateTime date, LoggerSinkConfiguration lc) =>
        {
            string path = GetFilePath(date, logName);
            lc.File(path);
        },
        sinkMapCountLimit: 5
    );
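If the per-date subfolder layout is negotiable, the built-in rolling of Serilog.Sinks.File sidesteps the open-ended key problem entirely, because the sink manages and disposes its own streams. A sketch that trades the subfolder-per-day layout for a date-stamped file name:

// Produces ./Logs/myApp-20200605.log, ./Logs/myApp-20200606.log, ...
// The sink closes the previous day's stream when it rolls.
return new LoggerConfiguration()
    .WriteTo.File(Path.Combine("./Logs", $"{logName}-.log"),
        rollingInterval: RollingInterval.Day);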
I have written a logging framework that uses Log4Net, NLog and Serilog interchangeably. Every call to the logger fires an event before the log entry is written, which optionally pushes entries via SignalR to connected web clients.
Before the Serilog addition, I used string.Format to get the formatted text. Now, with great destructuring has come great responsibility: string.Format obviously doesn't like {@0} or {data} in the string.
// log the event before engaging with the logger
LogEventBus.Handle(LogLevels.Info, DateTime.Now, msg, args);

if (DiagnosticLevel < level)
    return;

_logger.Info(msg, args);
Is there any way to get the Serilog-generated output directly as a string?
I started writing a memory sink, but that moves away from my centralised event-based logging, and completely breaks away from the other libraries I have implemented.
Any suggestions?
You can convert Serilog's message template format to a standard .NET format string ({0} etc.) like this:
using System.Text;
using Serilog.Parsing;

var parser = new MessageTemplateParser();
var template = parser.Parse(templateMessage);
var format = new StringBuilder();
var index = 0;

foreach (var tok in template.Tokens)
{
    if (tok is TextToken)
        format.Append(tok);                  // literal text copies through unchanged
    else
        format.Append("{" + index++ + "}");  // each property becomes a positional hole
}

var netStyle = format.ToString();
Once you have a standard format string, you can pass it through or use string.Format() with it and the args.
It's not going to be super-efficient; hooking deeper into the Serilog pipeline (ILogEventEnricher) should be better. As another commenter suggested, it may be better just to embrace a single logging framework here.
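Wrapped up as a helper, the whole conversion plus rendering might look like this (the method name and the example values are mine, for illustration):

using System.Text;
using Serilog.Parsing;

static string ToNetFormatAndRender(string templateMessage, params object[] args)
{
    var format = new StringBuilder();
    var index = 0;
    foreach (var tok in new MessageTemplateParser().Parse(templateMessage).Tokens)
        format.Append(tok is TextToken ? tok.ToString() : "{" + index++ + "}");
    // The converted template is a plain .NET format string, so string.Format
    // accepts it together with the original positional args.
    return string.Format(format.ToString(), args);
}

// ToNetFormatAndRender("User {Name} logged in from {IP}", "alice", "10.0.0.1")
//   => "User alice logged in from 10.0.0.1"

Note this renders values the way string.Format does; Serilog itself would apply its own formatting rules (e.g. quoting string properties).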
Do your logging in two steps (see the sketch after this list):
1. Write the log message to a TextWriter and read the value back from the TextWriter (https://github.com/serilog/serilog/wiki/Provided-Sinks#textwriter)
2. Write that already-formatted value to the real logger
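A minimal sketch of step 1, using a StringWriter as the TextWriter target and Serilog's built-in TextWriter sink:

using System.IO;
using Serilog;

var buffer = new StringWriter();
var textLogger = new LoggerConfiguration()
    // Render only the message so the buffer holds exactly the formatted text.
    .WriteTo.TextWriter(buffer, outputTemplate: "{Message}")
    .CreateLogger();

textLogger.Information("Processed {Count} items", 42);
var rendered = buffer.ToString(); // "Processed 42 items" (plus a newline)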
Whilst this might work, I worry about your architecture here. It sounds like you are creating huge dependencies on Serilog whilst also using several other logging frameworks. Choose one logging framework OR use really generic features. C# has introduced string interpolation; it's not as fancy as Serilog's serialization, but it works. I'd go back to KISS.
I need to parse an IIS log file. Is there any alternative to LogParser, such as a simple class to query a log file?
I only need to know how many requests I received between two dates.
Here is an example of an IIS log file:
#Software: Microsoft Internet Information Services 7.5
#Version: 1.0
#Date: 2014-08-26 12:20:57
#Fields: date time s-sitename s-computername s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs-version cs(User-Agent) cs(Cookie) cs(Referer) cs-host sc-status sc-substatus sc-win32-status sc-bytes cs-bytes time-taken
2014-08-26 12:20:57 W3SVC1 QXXXSXXXX 172.25.161.53 POST /XXXX/XXX/XXXX/XXXXX/1.0/XXXX/XXXXXXXX/xxxxxx.svc - 443 - 999.99.999.999 HTTP/1.1 - - - xxxx.xxxx.xxx.xxx.xxxx.xxxx.xxx.com 200 0 0 4302 5562 1560
You can use Tx (LINQ to Logs and Traces); you can install it via NuGet and use it like this:
var iisLog = W3CEnumerable.FromFile(pathToLog);
int nbOfLogsForLastHour = iisLog.Where(x => x.dateTime > DateTime.Now.AddHours(-1)).Count();
If the log file is used by another process, you can use W3CEnumerable.FromStream
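To answer the "between two dates" part directly, a small sketch (the dateTime property name follows the snippet above; substitute your own boundary dates):

var from = new DateTime(2014, 8, 26, 12, 0, 0);
var to = new DateTime(2014, 8, 26, 13, 0, 0);

// Count the requests whose timestamp falls inside [from, to).
int requests = iisLog.Count(x => x.dateTime >= from && x.dateTime < to);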
It's 2017 and LogParser is still closed source. Moreover, all the instrumentation provided by cloud solutions appears to be making the need to parse IIS logs a thing of the past. But since I am also dealing with legacy apps, I wrote this simple parser using .NET Core.
using System;
using System.IO;
using W3CParser.Extensions;
using W3CParser.Instrumentation;
using W3CParser.Parser;

namespace W3CParser
{
    class Program
    {
        static void Main(string[] args)
        {
            var reader = new W3CReader(File.OpenText(args.Length > 0 ? args[0] : "Data/foobar.log"));

            using (new ConsoleAutoStopWatch())
            {
                foreach (var @event in reader.Read())
                {
                    Console.WriteLine("{0} ({1}):{2}/{3} {4} (bytes sent)",
                                      @event.Status.ToString().Red().Bold(),
                                      @event.ToLocalTime(),
                                      @event.UriStem.Green(),
                                      @event.UriQuery,
                                      @event.BytesSent);
                }
            }
        }
    }
}
Source code: https://github.com/alexnolasco/32120528
You can use IISLogParser, and install it via NuGet; it has support for large files (> 1 GB).
List<IISLogEvent> logs = new List<IISLogEvent>();

using (ParserEngine parser = new ParserEngine([filepath]))
{
    while (parser.MissingRecords)
    {
        logs = parser.ParseLog().ToList();
    }
}
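Filtering a date range on the parsed events might then look like this. I'm assuming the event type exposes its timestamp as DateTimeEvent, so verify the property name against the package's model:

var from = new DateTime(2019, 11, 1);
var to = new DateTime(2019, 11, 5);

// DateTimeEvent is an assumption; check IISLogEvent's actual members.
int requests = logs.Count(e => e.DateTimeEvent >= from && e.DateTimeEvent < to);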
If you're dealing with large volumes and/or dispersed locations of IIS log files, then SpectX is a handy tool for this, because you don't have to ingest the logs and can run queries directly on multiple raw files. Average processing speed per core: 350 MB/s.
It's not open source but the full-functionality 30-day trial is free.
Tutorials:
Parsing IIS logs.
Analyzing IIS logs - 20 sample queries.
To filter a time period, sort the logs by date and filter the period you need, e.g.:
| sort(date_time desc)
| filter(date_time > T('2019-11-01 08:48:20.000 +0200'))
| filter(date_time < T('2019-11-05 11:48:20.000 +0200'));
I use the filter feature of the CMTrace.exe tool to pull out the period I need.
I am using NHibernate 3.3 and I have set up the configuration so that it should log SQL, etc. In the past (NH 2.2+) I have set stdout to a StreamWriter like so:
string nhLogPath = "...path...";
Logger = new StreamWriter(nhLogPath, false, Encoding.UTF8);
Console.SetOut(Logger);
And everything was working just fine. Now with version 3.3, I get everything in my log file as before, EXCEPT the SQL that NHibernate is supposed to be logging. What has changed, or what do I need to do to get everything working again?
P.S. I am not using log4net (obviously) and I don't care to either.
Here is how I am setting up the logging....
...
db.ConnectionString = #"myConnectionString";
db.LogSqlInConsole = true;
db.LogFormattedSql = true;
...
How do you set up SQL logging? NH 2.x used log4net exclusively; NH 3.x has an internal logger implementation which defaults to log4net if present, or to no logging when it's not. You probably did not configure it to log to the console.