Logging multiple threads with NLog - C#

I use the NLog logger in my project.
My program generates XML files based on data I get from SQL Server, and I'm doing this with PLINQ. I also have to log tracing info so that I can investigate exceptional cases in the production environment.
The resulting log looks awful when entries come from multiple threads. For example:
Operation 1 started
Deserializing XXX
Operation 2 started
Deserializing XXX finished with status X
Filling XXX with data from Z
Deserializing YYY....
And that's just with a degree of parallelism of 2.
I'd like to see a result like this:
Operation 1 started
Deserializing XXX
Deserializing XXX finished with status X
Filling XXX with data from Z
Operation 1 finished
Operation 2 started
Deserializing YYY....
I see some possible solutions, but none of them looks good enough:
Save logging data to some buffer and flush it when the parallel task ends - I would have to pass the context to all inner methods (looks terrible!).
Add some kind of prefix to the logging messages to help recover the context of each message - I would have to pass the prefix to every inner method (also looks terrible).
Is there a clean solution to this problem?

In NLog config files, there is the ${threadid} syntax. Use it like this:
<target name="file" xsi:type="File"
layout="${longdate} [${threadid}] ${level:uppercase=true} ${message} ${exception:format=tostring}"
fileName="${basedir}/logs/log.txt"
archiveFileName="${basedir}/logs/log.{#####}.txt"
archiveAboveSize="10485760"
archiveNumbering="Sequence"
concurrentWrites="true"
keepFileOpen="false" />
More info:
https://github.com/NLog/NLog/wiki/ThreadId-Layout-Renderer
I used it in production and it generally works. It is not perfect, but all operations (which I logged) are in sequence, and the thread id shows which operation ran on which thread.
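If grouping by thread id is not enough (PLINQ reuses threads across operations), a possible alternative is to stamp every operation with its own id and render it with ${mdlc:item=OperationId}. A minimal sketch, assuming NLog 4.1+ where MappedDiagnosticsLogicalContext is available (items and _logger stand in for your own collection and logger):
using System;
using System.Linq;
using System.Threading.Tasks;
using NLog;

class Example
{
    private static readonly Logger _logger = LogManager.GetCurrentClassLogger();

    static void Main()
    {
        var items = Enumerable.Range(1, 10).ToList(); // stand-in for the real work items

        Parallel.ForEach(items, item =>
        {
            // Every log call on this logical flow now carries the same id,
            // so lines can later be grouped per operation, not per thread.
            MappedDiagnosticsLogicalContext.Set("OperationId", Guid.NewGuid().ToString("N"));
            _logger.Info("Operation {0} started", item);
            // ... deserialize, fill with data, etc.
            _logger.Info("Operation {0} finished", item);
        });
    }
}
Sorting or filtering the log file by that column then yields the sequential per-operation view from the question.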

WIX Custom Action modify file in INSTALLFOLDER after InstallFinalize

I have written a C# custom action for my WiX v3 installer which is supposed to modify my appsettings.json in the INSTALLFOLDER. The action's Execute attribute is set to immediate and Impersonate="no"; it is called after InstallFinalize, but it runs into a problem within the action: missing admin permission.
The action modifies appsettings.json in the INSTALLFOLDER, which is under Program Files (x86).
The custom action reads, deserializes, modifies, and serializes the data normally with no error.
The error happens while writing to appsettings.json in the INSTALLFOLDER.
Although the error appears, the rest of the application is installed and works fine.
I have tried combining Execute and Custom actions in all possible combinations. While I do get the privileges to write to the INSTALLFOLDER if I change the custom action to run before the installation is finished, at that point I can't find the appsettings.json file, because all the files are still temporary files (.tmp) with non-relevant names.
The error that appears: [screenshot of the error message]
Part of my Product.wxs code:
<Property Id="Password" Value="Error password" />
<Property Id="FilePath" Value="C:\Program Files (x86)\Company\Product\" />
<CustomAction Id="SetUserName" Property="Username" Value="[ACCOUNT]"/>
<CustomAction Id="SetPassword" Property="Password" Value="[PASSWORD]"/>
<CustomAction Id="SetFilePath" Property="FilePath" Value="[INSTALLFOLDER]"/>
<Binary Id="GetData" SourceFile="$(var.SetupExtensions.TargetDir)\$(var.SetupExtensions.TargetName).CA.dll" />
<CustomAction Id="ChangeJSON" BinaryKey="GetData" DllEntry="CustomAction1" Execute="immediate" Impersonate="no" Return="check"/>
<InstallExecuteSequence>
<Custom Action="SetUserName" After="CostFinalize" />
<Custom Action="SetPassword" After="CostFinalize" />
<Custom Action="SetFilePath" After="CostFinalize"/>
<Custom Action='ChangeJSON' After='InstallFinalize'></Custom>
</InstallExecuteSequence>
My Custom Action code:
public static ActionResult CustomAction1(Session session)
{
    try
    {
        session.Log("Begin CustomAction1");
        string user = session["Username"];
        string password = session["Password"];
        string loc = session["FilePath"];
        var json = System.IO.File.ReadAllText(loc + "appsettings.json");
        var root = JsonConvert.DeserializeObject<Root>(json);
        root.Default.UserName = user;
        root.Default.Password = password;
        json = JsonConvert.SerializeObject(root, Formatting.Indented);
        System.IO.File.WriteAllText(loc + "appsettings.json", json);
        // The MessageBox below shows (with the correct info) when I remove the
        // System.IO.File.WriteAllText call above.
        MessageBox.Show("Username: " + user + "\nPassword: " + password + "\nFilePath: " + loc);
        return ActionResult.Success;
    }
    catch (Exception ex)
    {
        session.Log("Error: " + ex.Message);
        MessageBox.Show(ex.Message);
        return ActionResult.Failure;
    }
}
How do I modify appsettings.json through my Custom Action?
Custom actions that change the system state should run between InstallInitialize and InstallFinalize. This means you should schedule it before InstallFinalize, not after.
It should also run in deferred execution with no impersonation. You will need another custom action scheduled prior to this one that creates custom action data and passes the data to the deferred custom action.
Ideally you should also have rollback and commit actions to support rollbacks and test using the WIXFAILWHENDEFERRED custom action.
Read:
http://www.installsite.org/pages/en/isnews/200108/index.htm
http://blog.iswix.com/2011/10/beam-me-up-using-json-to-serialize.html
Application Launch: I would consider updating the JSON via the application launch sequence if you can: 1) single-source solution, 2) familiar territory for developers, 3) easier and better debugging facilities, and 4) none of the impersonation, sequencing, and conditioning complexities you get with custom actions: Why is it a good idea to limit the use of custom actions in my WiX / MSI setups?
Deferred CA Sample: You can find a sample of a deferred mode custom action WiX solution here:
https://github.com/glytzhkof/WiXDeferredModeSample
Inline Sample: This older answer shows the key constructs inline in the answer:
Wix Custom Action - session empty and error on deferred action.
CustomActionData: Deferred mode custom actions do not have access to the session object's property values the way immediate mode custom actions do. As Painter has written, you need to "send" text or settings to deferred mode by writing the data into the execution script. The data is then available in deferred mode by reading the special property CustomActionData. The above sample should contain the required constructs to show how this works. See the MSI documentation as well, and also Robert Dickau's MSI Properties and Deferred Execution.
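To make that concrete, here is a minimal hedged sketch of both halves, reusing the ids from the question (the SetData action name and the semicolon-delimited Value format follow the standard DTF convention; they are not from the original post):
<CustomAction Id="ChangeJSON.SetData" Property="ChangeJSON"
              Value="Username=[ACCOUNT];Password=[PASSWORD];FilePath=[INSTALLFOLDER]" />
<CustomAction Id="ChangeJSON" BinaryKey="GetData" DllEntry="CustomAction1"
              Execute="deferred" Impersonate="no" Return="check" />
<InstallExecuteSequence>
  <Custom Action="ChangeJSON.SetData" Before="ChangeJSON" />
  <Custom Action="ChangeJSON" Before="InstallFinalize" />
</InstallExecuteSequence>
Note that the Property of the immediate action must match the Id of the deferred action. Inside the deferred custom action, the values are then read back via session.CustomActionData["Username"], session.CustomActionData["Password"], and session.CustomActionData["FilePath"] instead of session["Username"] etc.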
Transaction: Deferred mode custom actions can only exist between InstallInitialize and InstallFinalize. Actions in this range run with elevated rights and can write to per-machine locations (not writable for normal users). You can schedule yourself right before InstallFinalize for starters to test your current mechanism (I would try other approaches).
Rollback CAs: Rollback custom actions are intended to undo what your JSON custom action did and revert everything to the previous state (cleanup after a failed install). Writing these is quite involved and takes a lot of testing - many just skip them. It is best to find a library or framework that does the job for you. I am not aware of any except the one linked below, and I don't know its state. Rollback custom actions must precede the actual CA in the InstallExecuteSequence (when an MSI rolls back, it executes the sequence in reverse).
With all this complexity, custom actions become the leading source of deployment errors: https://robmensching.com/blog/posts/2007/8/17/zataoca-custom-actions-are-generally-an-admission-of-failure/
Links:
How to pass CustomActionData to a CustomAction using WiX?
https://github.com/NerdyDuck/WixJsonExtension (not sure of status of this)

C# EventViewer Logs Parsing

I am in charge of parsing forwarded Event Viewer (evt) logs (Windows 7?). To do this I am running a query using Log Parser 2.2 over the logs, pulling out specific EventIDs and writing them to a CSV file. However, I am considering using EventViewerReader to do this instead.
The query I run on these evt files includes a "strings" column which outputs a bunch of junk, usually of the format SID|Username|Usergroup|..., but it isn't consistent. What I want is a consistent way to get the username for these events and to filter out the useless data. The problem is that I don't understand the format of the output. I am wondering whether these events have a standard format or whether this is a custom format from my workplace. The method I have right now basically looks for known user groups and checks for potential usernames to the left of them (skipping "LOCAL SERVICE", "NETWORK SERVICE", "-", and some other keywords). My issue with this method is that I don't know all of the user groups, and I can get false positives on usernames.
Here are some of the EventID Codes I am looking at:
https://www.ultimatewindowssecurity.com/securitylog/quickref/downloads/quickref.zip
4624 An account was successfully logged on
4625 An account failed to log on
4647 User initiated logoff
4648 A logon was attempted using explicit credentials
4800 The workstation was locked
4801 The workstation was unlocked
4802 The screen saver was invoked
4803 The screen saver was dismissed
I ended up looking closer at the evt files and saw they have a set XML schema. "Strings" isn't really an element, but the concatenated values of the EventData child Data elements. All I did was look at the schema for each EventID and find the position of the username in each event type.
I think it was strings[5] (TargetUserName) for logon/logoff, strings[0] for a couple of others, and strings[1] for a couple more.
<EventData>
<Data Name="SubjectUserSid">S-1-5-18</Data>
<Data Name="SubjectUserName">XXX-PC$</Data>
<Data Name="SubjectDomainName">WORKGROUP</Data>
<Data Name="SubjectLogonId">0x3e7</Data>
<Data Name="TargetUserSid">S-1-5-18</Data>
<Data Name="TargetUserName">SYSTEM</Data>
<Data Name="TargetDomainName">NT AUTHORITY</Data>
<Data Name="TargetLogonId">0x3e7</Data>
...
</EventData>
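Given that schema, a more robust approach than counting positions in "strings" is to read the Data elements by name. A minimal sketch, assuming the forwarded events are available as an .evtx file at a hypothetical path (older .evt files may first need converting, e.g. with wevtutil); EventLogReader comes from System.Diagnostics.Eventing.Reader:
using System;
using System.Diagnostics.Eventing.Reader;
using System.Linq;
using System.Xml.Linq;

class EventUserDump
{
    static void Main()
    {
        // PathType.FilePath reads a saved log file directly.
        var query = new EventLogQuery(@"C:\logs\forwarded.evtx", PathType.FilePath,
            "*[System[(EventID=4624 or EventID=4625 or EventID=4647)]]");

        using (var reader = new EventLogReader(query))
        {
            XNamespace ns = "http://schemas.microsoft.com/win/2004/08/events/event";
            for (EventRecord record = reader.ReadEvent(); record != null; record = reader.ReadEvent())
            {
                // Pick the EventData value by its Name attribute instead of
                // relying on its position in the concatenated "strings" column.
                var xml = XDocument.Parse(record.ToXml());
                var user = xml.Descendants(ns + "Data")
                              .FirstOrDefault(d => (string)d.Attribute("Name") == "TargetUserName")
                              ?.Value;
                Console.WriteLine("{0}: {1}", record.Id, user);
            }
        }
    }
}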

NLog - Parameters not passed to query when writing to database

I'm pretty new to NLog, so please forgive my basic question.
I've inherited a WinForms application written by some contractors. I'm mainly a database developer, but I've managed to do some development on the application. However, I sometimes struggle to track down error messages encountered by users, so I'm attempting to retrofit logging into the application via NLog. I first encountered the issue in NLog 4.4.5 and have since updated to 4.4.12, which hasn't solved the problem. I've verified that NLog is catching the errors correctly, as it will output to a text file, but when I try to direct it to a database target I can't get it to work.
This is my database table: [screenshot of the table definition]
My problem is that I can only get errors written to the database if I don't pass any parameters to the insert statement (which is pretty useless). That is to say, the following in my NLog.config file works:
<target name="database" xsi:type="Database">
<commandText>INSERT INTO [tblException] (DbVersionID, ExceptionDateTime) SELECT MAX(DbVersionID), GETDATE() FROM tblDbVersion</commandText>
<dbProvider>System.Data.SqlServerCe.4.0</dbProvider>
<connectionString>Data Source=${basedir}\Database.sdf</connectionString>
</target>
But this doesn't:
<target name="database" xsi:type="Database">
<commandText>INSERT INTO [tblException] (DbVersionID, ExceptionDateTime, Message) SELECT MAX(DbVersionID), GETDATE(), @message FROM tblDbVersion</commandText>
<parameter name="@message" layout="${message}" />
<dbProvider>System.Data.SqlServerCe.4.0</dbProvider>
<connectionString>Data Source=${basedir}\Database.sdf</connectionString>
</target>
I've enabled internal logging and the following is what I get:
2017-11-28 11:26:45.8063 Trace Executing Text: INSERT INTO [tblException] (DbVersionID, ExceptionDateTime, Message) SELECT MAX(DbVersionID), GETDATE(), @Message FROM tblDbVersion
2017-11-28 11:26:45.8063 Trace Parameter: '@message' = 'Test Error Message' (String)
2017-11-28 11:26:45.8063 Error Error when writing to database. Exception: System.Data.SqlServerCe.SqlCeException (0x80004005): A parameter is not allowed in this location. Ensure that the '@' sign is in a valid location or that parameters are valid at all in this SQL statement.
at System.Data.SqlServerCe.SqlCeCommand.ProcessResults(Int32 hr)
at System.Data.SqlServerCe.SqlCeCommand.CompileQueryPlan()
at System.Data.SqlServerCe.SqlCeCommand.ExecuteCommand(CommandBehavior behavior, String method, ResultSetOptions options)
at System.Data.SqlServerCe.SqlCeCommand.ExecuteNonQuery()
at NLog.Targets.DatabaseTarget.WriteEventToDatabase(LogEventInfo logEvent)
at NLog.Targets.DatabaseTarget.Write(LogEventInfo logEvent)
It appears that the parameter isn't being substituted into the query before it's run. I tried adding single quotes around it in the command text in case that would help, but it just resulted in the literal string '@message' being inserted into the database field.
I can't see anything that I've done differently to the examples, so any help would be appreciated.
Regards,
Alex
As per the comment from @pmcilreavy:
Have you tried temporarily removing the MAX, replacing it with a literal, and changing to INSERT INTO .. (blah) VALUES (blah, @Message)? SqlCe has quite a few limitations and quirks compared to SQL Server, so it might be worth simplifying things where possible.
The issue appears to be that SQL CE doesn't support parameters in the select list of a SELECT statement.
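In other words, rewriting the command so the parameter only appears in a VALUES list should work. An untested sketch - whether SQL CE accepts the scalar subquery inside VALUES is an assumption; if it doesn't, look the DbVersionID up separately or use a literal:
<target name="database" xsi:type="Database">
<commandText>INSERT INTO [tblException] (DbVersionID, ExceptionDateTime, Message) VALUES ((SELECT MAX(DbVersionID) FROM tblDbVersion), GETDATE(), @message)</commandText>
<parameter name="@message" layout="${message}" />
<dbProvider>System.Data.SqlServerCe.4.0</dbProvider>
<connectionString>Data Source=${basedir}\Database.sdf</connectionString>
</target>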

NLog Async target timestamps

I'm curious about timestamping in NLog when using async targets.
I know that according to What's the meaning of the time stamp in nlog when async is on? the timestamp is generated when the log entry is queued, as you would expect.
However, I noticed something in one of my log files, so I decided to whip up a quick test.
private static readonly Logger _logger = LogManager.GetCurrentClassLogger();

static void Main(string[] args)
{
    for (int i = 0; i < 10000; i++)
    {
        _logger.Info("Timestamp: {0}, LogNumber: {1}", DateTime.Now.ToString("HH:mm:ss.fff"), i);
    }
    _logger.Factory.Flush();
}
My NLog.config looks like:
<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<targets async="true">
<target xsi:type="File" name="f" fileName="${basedir}/logs/${shortdate}.log"
layout="${longdate} ${uppercase:${level}} ${message}" />
</targets>
<rules>
<logger name="*" minlevel="Trace" writeTo="f" />
</rules>
</nlog>
Now if I look at the output, we see that all entries between 754 and 9962 have the same NLog timestamp, while DateTime.Now shows the milliseconds progressing:
2015-02-12 08:19:23.3814 INFO Timestamp: 08:19:23.376, LogNumber: 0
...
2015-02-12 08:19:23.3853 INFO Timestamp: 08:19:23.384, LogNumber: 754
...
2015-02-12 08:19:23.4033 INFO Timestamp: 08:19:23.399, LogNumber: 9963
...
I can understand that, with the overheads, a DateTime.Now stamp of .384 could be logged as .385; however, it doesn't make sense to me that .399 comes out as .385.
The way the NLog timestamp progresses almost looks as if it is generated during a logging cycle rather than at the log call, which would be contrary to the above article.
So, is this something to do with the time source NLog uses, or rather with when the timestamp is generated?
The behavior you see is determined by NLog's time source system.
NLog 4.x provides four different time sources, and you can plug in your own. The default time source of NLog is optimized for performance and does some caching of the values being returned. The caching mechanism uses Environment.TickCount intervals to return time stamps for new log entries. The resolution of the TickCount property is limited by the resolution of the system timer, which is not a fixed value but typically varies in the 10-16 ms range.
So, the effect you see is caused by the default time source of NLog. And, by the way, it's not specific to async targets - remove the async attribute from your example and you will see exactly the same behavior.
If you need a more precise resolution for time stamps in your log events, you can configure NLog to use another time source:
<nlog>
<time type="AccurateUTC" />
</nlog>
However, this will not guarantee that all your events have different time stamps; you might still get 2-3 events with the same time. But it will use the straight value of DateTime.UtcNow to fill in the date/time field of the log event, without any caching - and without any time zone conversions, either.
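If you prefer configuring this in code rather than XML, the equivalent should be (assuming NLog 4.x, where the time sources live in the NLog.Time namespace):
// Swap the cached default time source for the uncached UTC one.
NLog.Time.TimeSource.Current = new NLog.Time.AccurateUtcTimeSource();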
You can find more info about time sources in the NLog wiki.

How to log complex synchronization process?

I'm developing a complicated distributed service that performs an iterative synchronization process. Every 10 seconds it synchronizes business entities across different information systems. One iteration consists of a bunch of third-party service calls to retrieve the current state of the business objects (count of customers, goods, certain customer and goods details, etc.), queries against the local DB, and then computing the differences between the two and smoothing out, i.e. synchronizing, those differences.
There are different types of iterations: fast ones (only changes in the set of objects) and slow ones (a full review of the data). Fast iterations run every 10 seconds and slow ones once a day.
So, how can I log these processes using NLog? I'm using SQLite for storing data, but I'm stuck on the DB design for the logs.
So I want to log the flow of every iteration:
1. Request for current state of objects to 3d party service
2. Query the local database for current state of objects
3. Get differences list
4. Invoke external service to commit insufficient data
5. Update local database for insufficient data
But there are so many kinds of info to log that I can't just put everything into one TEXT field.
At the moment I'm using this structure for the logs:
CREATE TABLE [Log] (
[id] INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT,
[ts] TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
[iteration_id] varchar,
[request_response_pair] varchar,
[type] VARCHAR NOT NULL,
[level] TEXT NOT NULL,
[server_id] VARCHAR,
[server_alias] VARCHAR,
[description] TEXT,
[error] Text);
So every service request and response goes into description, and request_response_pair is a key linking each response to its request.
Here is my NLog config:
<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" internalLogFile="D:\nlog.txt" internalLogLevel="Trace">
<targets>
<target name="Database" xsi:type="Database" keepConnection="false"
useTransactions="false"
dbProvider="System.Data.SQLite.SQLiteConnection, System.Data.SQLite, Version=1.0.82.0, Culture=neutral, PublicKeyToken=db937bc2d44ff139"
connectionString="Data Source=${basedir}\SyncLog.db;Version=3;"
commandText="INSERT into Log(iteration_id, request_response_pair, type, level, server_id, server_alias, description, error) values(#Iteration_id, #Request_response_pair, #Type, #Loglevel, #server_id, #server_alias, #Description, #Error)">
<parameter name="#Type" layout="${message}"/>
<parameter name="#Loglevel" layout="${level:uppercase=true}"/>
<parameter name="#Request_response_pair" layout="${event-context:item=request_response_pair}"/>
<parameter name="#Iteration_id" layout="${event-context:item=iteration_id}"/>
<parameter name="#server_id" layout="${event-context:item=server_id}"/>
<parameter name="#server_alias" layout="${event-context:item=server_alias}"/>
<parameter name="#Description" layout="${event-context:item=description}"/>
<parameter name="#Error" layout="${event-context:item=error}"/>
</target>
</targets>
<rules>
<logger name="*" minlevel="Trace" writeTo="Database" />
</rules>
</nlog>
Here is how I log:
using System.Collections.Generic;

namespace NLog
{
    public static class LoggerExtensions
    {
        public static void InfoEx(this Logger l, string message, Dictionary<string, object> contextParams)
        {
            LogEventInfo eventInfo = new LogEventInfo(LogLevel.Info, "", message);
            foreach (KeyValuePair<string, object> kvp in contextParams)
            {
                eventInfo.Properties.Add(kvp.Key, kvp.Value);
            }
            l.Log(eventInfo);
        }

        public static void InfoEx(this Logger l, string message, string server_id, string server_alias, Dictionary<string, object> contextParams = null)
        {
            Dictionary<string, object> p = new Dictionary<string, object>();
            p.Add("server_id", server_id);
            p.Add("server_alias", server_alias);
            if (contextParams != null)
            {
                foreach (KeyValuePair<string, object> kvp in contextParams)
                {
                    p.Add(kvp.Key, kvp.Value);
                }
            }
            l.InfoEx(message, p);
        }
    }
}
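A request step of an iteration is then logged like this (an illustrative call; the variable names are assumptions):
_logger.InfoEx("request", "srv-01", "Main server", new Dictionary<string, object>
{
    { "iteration_id", iterationId },          // groups all rows of one iteration
    { "request_response_pair", pairKey },     // links this request to its response
    { "description", requestBody }            // raw payload for later investigation
});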
I know about logging levels, but I need all these verbose logs, so I log them as Info. I can't find any tutorial on how to log such complicated, structured processes - only plain dumb log messages.
I am assuming you are talking about "logs" in the typical sense: "log things so we have something to look at if we need to inspect our workflow (errors/performance)". I assume that you do NOT mean logs as in "we need accounting information, and the log is part of our domain data and is included in the workflow".
And from what I got from your post, you are worrying about the backend storage format of the logs, so that you can later process them and use them for said diagnostics?
Then I'd recommend that you make the logging code independent of the domain specifics.
Question: How will the logs you create be processed? Do you really need to access them all over the place, so that you need the database to provide a structured view? Is it in any way relevant how fast you can filter your logs? Or will they end up in one big log-analyzer application anyway, which only ever runs every second week when something bad has happened?
In my opinion, the biggest reasons you want to avoid any domain specifics in the log are that "logs should work if things go wrong" and "logs should work after things changed".
Logs should work if things go wrong
If you have columns in your log table for domain-specific values like "Request_response_pair", and there is no pair, then writing the log itself might fail (e.g. if it's an indexed field). Of course, you can make sure to have no non-null columns and no restrictions in your DB design, but take a step back and ask: why do you want the structure in your log database anyway? Logs are meant to work as reliably as possible, so any kind of template you press them into may restrict the use cases or keep you from logging crucial information.
Logs should work after things changed
Especially if you need logs for detecting and fixing bugs or for improving performance, you will regularly compare logs from "before the change" to logs from "after the change". If you need to change the structure of your log database because you changed your domain data, this is going to hurt you when you need to compare the logs.
True, if you make a data structure change, you probably still need to update some tools such as the log analyzer, but there is usually a large part of logging/analyzing code that is completely agnostic to the actual structure of the domain.
Many systems (including complex ones) can live with "just log one simple string" and later write tools to take the string apart again if they need to filter or process the logs.
Other systems write logs as simple string key/value pairs. The log function itself is not domain-specific but just accepts a string dictionary and writes it out (or, even easier, a params string[] that should have an even number of entries, where every second entry is used as a key - if you aren't scared by that proposition :-D). A sketch of this follows below.
Of course, you will probably end up writing another tooling layer on top of the base log functions that knows about the domain-specific data structures, composes the string dictionary, and passes it on. You certainly don't want to copy the decomposing code all over the place. But make the base functions available at all places where you might want to log something. It's really helpful if you indeed run into "strange" situations (exception handlers) where some information is missing.
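A minimal sketch of such a domain-agnostic base function - the params-array variant described above, with illustrative names:
using System;
using NLog;

public static class PlainLog
{
    private static readonly Logger Log = LogManager.GetCurrentClassLogger();

    // Accepts "key1", value1, "key2", value2, ... - every second entry is a key.
    public static void Info(string message, params object[] pairs)
    {
        var evt = new LogEventInfo(LogLevel.Info, Log.Name, message);
        for (int i = 0; i + 1 < pairs.Length; i += 2)
        {
            evt.Properties[pairs[i].ToString()] = pairs[i + 1];
        }
        Log.Log(evt);
    }
}

// Usage - the domain code composes the pairs, the logger stays generic:
// PlainLog.Info("request", "iteration_id", iterationId, "server_id", serverId);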
