I'm using NLog for logging and the C# Activity API for spans, with context such as traceparent / trace id.
I am adding an optional custom property to the activity, like this: activity.SetCustomProperty(customProperty.Key, customProperty.Value)
I would like to add the value to a log record.
However, I'm unsure on the correct configuration for NLog.
When configuring NLog layouts, you can set attribute layouts.
One of the layout renderers is the ActivityTraceLayoutRenderer.
Here is how I'm trying to configure the ActivityTraceLayoutRenderer, assuming the CustomProperty configured on the trace is "ReservationId":
"layout": {
  "type": "JsonLayout",
  "attributes": [
    {
      "name": "timestamp",
      "layout": "${date:format=yyyy-MM-ddTHH\\:mm\\:ss.ffffff}"
    },
    {
      "name": "traceId",
      "layout": "${activity:property=TraceId}"
    },
    {
      "name": "reservationId",
      "layout": "${activity:{property=CustomProperty,item=ReservationId}}"
    }
  ]
}
However, the resulting logs look... weird:
{ "timestamp": "2022-08-08T14:39:38.000840", "reservationId": "}" }
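An editorial aside on the likely cause: NLog layout-renderer options are normally colon-separated inside a single ${...} pair, so the extra pair of braces in ${activity:{property=...}} is probably being treated as literal text, which would explain the stray "}" in the output. Assuming the activity renderer accepts an item option for selecting a custom property (this is an assumption, not confirmed by the question), the attribute would look something like:

{
  "name": "reservationId",
  "layout": "${activity:property=CustomProperty:item=ReservationId}"
}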
I want to connect to Azure Event Hub using NLog.Extensions.AzureEventHub, since it is part of the NLog configuration. But I'm not sure how to configure the target correctly: I am getting an error even though I am passing the connection string. I checked the online NLog documentation, but it only provides the XML settings. Could you please give example app settings for the NLog Azure Event Hub target (C# .NET Core 6.0)?
These are my app settings:
"NLog": {
  "internalLogLevel": "Info",
  "internalLogFile": "c:\\temp\\internal-nlog.txt",
  "extensions": [
    { "assembly": "NLog.Extensions.Logging" },
    { "assembly": "NLog.Web.AspNetCore" },
    { "assembly": "NLog.Extensions.AzureEventHub" }
  ],
  "targets": {
    "allfile": {
      "type": "File",
      "fileName": "c:\\temp\\nlog-all-${shortdate}.log",
      "layout": "${longdate}|${event-properties:item=EventId_Id}|${uppercase:${level}}|${logger}|${message} ${exception:format=tostring}"
    },
    "ownFile-web": {
      "type": "File",
      "fileName": "c:\\temp\\nlog-own-${shortdate}.log",
      "layout": "${longdate}|${event-properties:item=EventId_Id}|${uppercase:${level}}|${logger}|${message} ${exception:format=tostring}|url: ${aspnet-request-url}|action: ${aspnet-mvc-action}"
    },
    "eh": {
      "type": "AzureEventHub",
      "credentials": {
        "name": "eventhub1",
        "connectionString": "Endpoint=sb://eventhubnp198007.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=4z/JbjHp7UqxCWPD343434343GZbJpfG5E="
      },
      "layout": "${longdate}|${event-properties:item=EventId_Id}|${uppercase:${level}}|${logger}|${message} ${exception:format=tostring}|url: ${aspnet-request-url}|action: ${aspnet-mvc-action}"
    }
  },
  "rules": [
    {
      "logger": "*",
      "minLevel": "Trace",
      "writeTo": "allfile"
    },
    {
      "logger": "Microsoft.*",
      "maxLevel": "Info",
      "final": "true"
    },
    {
      "logger": "*",
      "minLevel": "Trace",
      "writeTo": "ownFile-web"
    }
  ]
}
Error
> 2022-08-23 12:59:10.0781 Error AzureEventHub(Name=eh): Failed to create EventHubClient with connectionString= to EventHubName=.
> Exception: System.ArgumentException: Value cannot be an empty string. (Parameter 'connectionString')
>    at Azure.Core.Argument.AssertNotNullOrEmpty(String value, String name)
>    at Azure.Messaging.EventHubs.Producer.EventHubProducerClient..ctor(String connectionString, String eventHubName, EventHubProducerClientOptions clientOptions)
>    at NLog.Targets.EventHubTarget.EventHubService.Connect(String connectionString, String eventHubName, String serviceUri, String tenantIdentity, String resourceIdentity, String clientIdentity)
>    at NLog.Targets.EventHubTarget.InitializeTarget()
I'm not sure why you added the "credentials" section, but I think it will work if you just remove it, since it doesn't exist in the XML config either:
"eh": {
  "type": "AzureEventHub",
  "eventHubName": "eventhub1",
  "connectionString": "Endpoint=sb://eventhubnp198007.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=4z/JbjHp7UqxCWPD343434343GZbJpfG5E=",
  "layout": "${longdate}|${event-properties:item=EventId_Id}|${uppercase:${level}}|${logger}|${message} ${exception:format=tostring}|url: ${aspnet-request-url}|action: ${aspnet-mvc-action}"
}
The JSON configuration tries to map 1-to-1 with the XML configuration. See also: https://github.com/NLog/NLog.Extensions.Logging/wiki/NLog-configuration-with-appsettings.json
The only place where JSON and XML really diverge is in handling "lists of items": JSON must specify the actual collection name, while XML supports node items.
It is a good idea to use "throwConfigExceptions": true, as it will quickly reveal invalid configuration, especially when using NLog v5.
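For illustration, that option goes at the root of the "NLog" section, next to the other top-level settings (a sketch based on the settings already shown above):

"NLog": {
  "throwConfigExceptions": true,
  "internalLogLevel": "Info",
  "internalLogFile": "c:\\temp\\internal-nlog.txt"
}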
By the way, if you want the "eh" target to be used, you must include it in the "writeTo". For example:
{
  "logger": "*",
  "minLevel": "Trace",
  "writeTo": "eh, ownFile-web"
}
I have this schema extension:
{
  "id": "intnovaction_Docu2EventMetadata",
  "description": "Eventos de Docu2",
  "targetTypes": [
    "event"
  ],
  "status": "Available",
  "owner": "d1aaf0fa-549f-4692-8929-22eb90b33099",
  "properties": [
    {
      "name": "ActuacionId",
      "type": "String"
    },
    {
      "name": "ExpedienteId",
      "type": "String"
    }
  ]
}
I am able to extend event properties using this schema: I can set values for 'ActuacionId' and 'ExpedienteId' on an event, and I can retrieve those values through this request: https://graph.microsoft.com/v1.0/me/events?$select=id,intnovaction_Docu2EventMetadata
that returns
{
  "@odata.context": "https://graph.microsoft.com/v1.0/$metadata#users('6d418063-df8b-4f47-921b-1072baf4a949')/events(id,intnovaction_Docu2EventMetadata)",
  "value": [
    {
      "@odata.etag": "W/\"FwgXoe8hSUuEcCnxk8/heAAALdjYcQ==\"",
      "id": "AAMkAGE1MDUwMDZkLWRmZDctNGMxMi1hN2ZiLTUwNTBlYTc1NmRkYwBGAAAAAABIbknKwqd9SI8d_mLMOg2XBwAXCBeh7yFJS4RwKfGTz_F4AAAAAAENAAAXCBeh7yFJS4RwKfGTz_F4AAAtI8LHAAA=",
      "intnovaction_Docu2EventMetadata": {
        "ActuacionId": "1",
        "ExpedienteId": "2"
      }
    }
  ]
}
The problem comes when I try to filter by those properties:
https://graph.microsoft.com/v1.0/me/events?$select=id,intnovaction_Docu2EventMetadata&$filter=intnovaction_Docu2EventMetadata/ActuacionId eq '1'
Then I receive this error response:
{
  "error": {
    "code": "RequestBroker-ParseUri",
    "message": "Could not find a property named 'e2_3be22c6901b942889d07616b14e79402_intnovaction_Docu2EventMetadata' on type 'Microsoft.OutlookServices.Event'.",
    "innerError": {
      "request-id": "4137b6f4-1c8d-4c1e-84fd-02e8ccaab860",
      "date": "2017-10-02T19:25:28"
    }
  }
}
Is it not possible to filter events by schema properties?
It looks like we missed something in our documentation. It's not currently possible to filter on schema extensions defined on Outlook-based entity types (events, messages, and personal contacts). We could also improve our error messages to make this clearer. I'll file some items for this.
Hope this helps,
I'd like to produce a log file that contains only log output from a particular EventId (or EventIds). Is such functionality supported?
If you plug in Serilog as the provider, you can continue logging through Microsoft.Extensions.Logging but apply Serilog's filtering to limit what is sent to the log file.
To do that you'd use the following Serilog configuration:
Log.Logger = new LoggerConfiguration()
    .Filter.ByIncludingOnly("EventId.Id = 9")
    .WriteTo.RollingFile("logs/log-{Date}.txt")
    .CreateLogger();
(Where 9 is whatever event id you want to include.)
You can plug in Serilog with https://github.com/serilog/serilog-aspnetcore, and to compile this example you'll also need to install the Serilog.Sinks.RollingFile and Serilog.Filters.Expressions packages.
If you want to configure Serilog using the appsettings file, you can use something like this:
"Serilog": {
  "Using": [ "Serilog.Sinks.File", "Serilog.Settings.Configuration", "Serilog.Expressions" ],
  "MinimumLevel": {
    "Default": "Debug"
  },
  "WriteTo": [
    {
      "Name": "File",
      "Args": {
        "path": "log.txt",
        "rollingInterval": "Day",
        "retainedFileCountLimit": "120",
        "rollOnFileSizeLimit": "true"
      }
    }
  ],
  "Filter": [
    {
      "Name": "ByIncludingOnly",
      "Args": {
        "expression": "EventId.Id = 101"
      }
    }
  ]
}
In the "Using" field you have to list every Serilog package you are importing; then configure a sink for writing the logs, and finally define a filter.
I spent almost a whole morning trying to achieve this, so I hope it helps you.
I'm using Autofac JSON files to register two classes for the same interface in my project.
If I do something like this:
JSON Config file 1:
{
  "components": [
    {
      "type": "Services.FirstProvider, Services",
      "services": [
        {
          "type": "Services.IHotelProvider, Services"
        }
      ],
      "parameters": {
        "username": "<user>",
        "password": "<pwd>"
      }
    }
  ]
}
JSON Config file 2:
{
  "components": [
    {
      "type": "Services.SecondProvider, Services",
      "services": [
        {
          "type": "Services.IHotelProvider, Services"
        }
      ],
      "parameters": {
        "key": "<key>"
      }
    }
  ]
}
And register:
config.AddJsonFile("First/FirstProviderConfig.json");
config.AddJsonFile("Second/SecondProviderConfig.json");
I can see that only the SecondProvider has been registered. And switching the registration order:
config.AddJsonFile("Second/SecondProviderConfig.json");
config.AddJsonFile("First/FirstProviderConfig.json");
Only FirstProvider has been registered.
If I try to register them in the same file:
{
  "components": [
    {
      "type": "Services.FirstProvider, Services",
      "services": [
        {
          "type": "Services.IHotelProvider, Services"
        }
      ],
      "parameters": {
        "username": "<user>",
        "password": "<pwd>"
      }
    },
    {
      "type": "Services.SecondProvider, Services",
      "services": [
        {
          "type": "Services.IHotelProvider, Services"
        }
      ],
      "parameters": {
        "key": "<key>"
      }
    }
  ]
}
It works.
I need to have separate files to configure them. What am I missing?
The key point here is that you're using Microsoft.Extensions.Configuration as the basis for configuration files now, which means configuration is somewhat governed by the way Microsoft.Extensions.Configuration behaves.
When you have configuration, the way Microsoft.Extensions.Configuration wants to handle it is to override settings as you layer one configuration provider on top of another.
In the simple case, say you have two configurations:
{
"my-key": "a"
}
and
{
"my-key": "b"
}
It doesn't create an array of all possible values; it'll layer the second over the first based on the key (my-key) matching and override to have the value b.
When you parse JSON configuration, it flattens everything out into key/value pairs; it does the same with XML. It does this because configuration supports environment variables, INI files, and all sorts of other backing stores.
In the case of the above very simple files, you get
my-key == b
Nice and flat. Looking at something more complex:
{
  "top": {
    "simple-item": "simple-value",
    "array-item": ["first", "second"]
  }
}
It flattens out like:
top:simple-item == simple-value
top:array-item:0 == first
top:array-item:1 == second
Notice how the array (an "ordinal collection") gets flattened? Each item gets auto-assigned a fake "key" that has a 0-based index.
Now think about how two config files will layer. If I have the above more complex configuration and then put this...
{
  "top": {
    "array-item": ["third"]
  }
}
That one flattens out to
top:array-item:0 == third
See where I'm going here? You layer that override config over the first one and you get:
top:simple-item == simple-value
top:array-item:0 == third
top:array-item:1 == second
The arrays don't combine, the key/value settings override.
You see them in a JSON representation, but it's all just key/value pairs.
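The flatten-then-layer behavior described above can be sketched in a few lines of Python. This is a rough simulation for intuition only, not the actual Microsoft.Extensions.Configuration code: each JSON document is flattened into colon-separated keys, and a later provider simply overwrites matching keys from an earlier one.

```python
def flatten(node, prefix=""):
    # Flatten nested dicts/lists into colon-separated key/value pairs,
    # mimicking how configuration providers store settings internally.
    pairs = {}
    if isinstance(node, dict):
        for key, value in node.items():
            child = f"{prefix}:{key}" if prefix else key
            pairs.update(flatten(value, child))
    elif isinstance(node, list):
        # Array items get an auto-assigned 0-based index as their "key".
        for index, value in enumerate(node):
            pairs.update(flatten(value, f"{prefix}:{index}"))
    else:
        pairs[prefix] = node
    return pairs

base = {"top": {"simple-item": "simple-value", "array-item": ["first", "second"]}}
override = {"top": {"array-item": ["third"]}}

# Later providers win key-by-key; arrays do not concatenate.
merged = {**flatten(base), **flatten(override)}
# merged == {"top:simple-item": "simple-value",
#            "top:array-item:0": "third",
#            "top:array-item:1": "second"}
```

Note how top:array-item:0 is overridden while index 1 leaks through from the base file, exactly the half-merged result the question observed.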
You have two choices to try and fix this.
Option 1: Fudge the Array (Not Recommended)
Since your first configuration is (simplified):
{
  "components": [
    {
      "type": "Services.FirstProvider, Services",
      "services": [ ... ]
    }
  ]
}
You can potentially "fudge it" a little by putting a dummy empty element in the second "override" config:
{
  "components": [
    {},
    {
      "type": "Services.SecondProvider, Services",
      "services": [ ... ]
    }
  ]
}
Last I checked, the override thing was additive-only, so empty values don't erase previously set values. By shifting the array in the second configuration by 1, it'll change the flattened version of the key/value representation and the two arrays should "merge" the way you want.
But that's pretty ugly and I wouldn't do that. I just wanted to show you one way to make it work so you'd understand why what you're doing isn't working.
Option 2: Two Separate Configuration Modules (Recommended)
Instead of trying to combine the two JSON files, just create two separate IConfiguration objects by individually loading the JSON files. Register them separately in two different ConfigurationModule registrations. It shouldn't blow up if either of the configurations is empty.
var first = new ConfigurationBuilder();
first.AddJsonFile("autofac.json", optional: true);
var firstModule = new ConfigurationModule(first.Build());

var second = new ConfigurationBuilder();
second.AddJsonFile("autofac-overrides.json", optional: true);
var secondModule = new ConfigurationModule(second.Build());

var builder = new ContainerBuilder();
builder.RegisterModule(firstModule);
builder.RegisterModule(secondModule);
If the config is empty or missing it just won't register anything. If it's there, it will. In the case where you want to override things or add to your set of handlers for nice IEnumerable<T> resolution, this should work.
I have an Azure Data Factory project.
I read the docs https://learn.microsoft.com/en-us/azure/data-factory/data-factory-use-custom-activities on adding custom activities to one of my pipelines.
The documentation says you have to zip the DLLs of the class library that implements the custom activities and store this zip in an Azure blob.
The definition of the pipeline is:
{
  "name": "LoadFromOnerxSalesInvoicesRaw",
  "properties": {
    "description": "Test Deserialize Sales Invoices Raw",
    "activities": [
      {
        "type": "DotNetActivity",
        "typeProperties": {
          "assemblyName": "BICodeActivities.dll",
          "entryPoint": "BICodeActivities.Activities.OneRx.DeserializeSalesInvoiceToLines",
          "packageLinkedService": "biCABlobLS",
          "packageFile": "bi-activities-container/BICodeActivities.zip",
          "extendedProperties": {
            "SliceStart": "$$Text.Format('{0:yyyyMMddHH-mm}', Time.AddMinutes(SliceStart, 0))"
          }
        },
        "inputs": [
          {
            "name": "o-staging-onerx-salesInvoices"
          }
        ],
        "outputs": [
          {
            "name": "o-staging-onerx-salesInvoicesLines"
          }
        ],
        "policy": {
          "timeout": "00:30:00",
          "concurrency": 2,
          "retry": 3
        },
        "scheduler": {
          "frequency": "Day",
          "interval": 1
        },
        "name": "DeserializeSalesInvoiceToLines",
        "linkedServiceName": "biBatchLS"
      },
      {
        "type": "DotNetActivity",
        "typeProperties": {
          "assemblyName": "BICodeActivities.dll",
          "entryPoint": "BICodeActivities.Activities.OneRx.DeserializeSalesInvoiceToDiscounts",
          "packageLinkedService": "biCABlobLS",
          "packageFile": "bi-activities-container/BICodeActivities.zip",
          "extendedProperties": {
            "SliceStart": "$$Text.Format('{0:yyyyMMddHH-mm}', Time.AddMinutes(SliceStart, 0))"
          }
        },
        "inputs": [
          {
            "name": "o-staging-onerx-salesInvoices"
          }
        ],
        "outputs": [
          {
            "name": "o-staging-onerx-salesInvoicesDiscounts"
          }
        ],
        "policy": {
          "timeout": "00:30:00",
          "concurrency": 2,
          "retry": 3
        },
        "scheduler": {
          "frequency": "Day",
          "interval": 1
        },
        "name": "DeserializeSalesInvoiceToDiscounts",
        "linkedServiceName": "biBatchLS"
      }
    ],
    "start": "2017-04-26T09:20:00Z",
    "end": "2018-04-26T22:30:00Z"
  }
}
When I set up this pipeline in my Visual Studio project and build, I get the error "BICodeActivities.zip is not found in the solution".
Do I have to zip the DLLs and add them to the solution manually, or do I need to do something else?
I'm assuming you have the custom activities as a class library in the same solution as your data factory project.
If so, you simply need to reference the class library project from the data factory project: right click > Add > Reference, then select the library project.
Once done, when you build the solution, Visual Studio will handle zipping the DLLs for you and also add the ZIP file as a dependency, which will show in the publishing wizard to be deployed to the blob storage linked service.
For further support check out this blog post.
https://www.purplefrogsystems.com/paul/2016/11/creating-azure-data-factory-custom-activities/
Hope this helps.