AWS Logging not working - C#

I created a very simple console app that is supposed to send log messages to AWS CloudWatch Logs, but although the app runs I can't find any logs in AWS.
I don't think posting the app code makes sense: I presume it's OK, and it doesn't throw any exceptions.
I think the problem is in the AWS settings. This is what I did in AWS:
Created a role. I'm not sure why, but I did it roughly the way the poor and messy AWS documentation says. So the role is created, not exactly as the "documentation" describes, but it contains the required permissions for the logs. Why did I create it? I don't have a clue - my app doesn't use it!
Created the log group - OK, this is the parameter I put into my app's config.
Not sure I needed to create the log stream, but I created it anyway; when I click on it, it says "No events found." and "It appears you have not installed a CloudWatch Logs agent ..".
Why do I need some agent? What is it? How do I install it? Absolutely unclear, and pointing me at the poor AWS "documentation" is useless.
I guess these are the major things done on the AWS side, but still no result - nothing works, I can't see the logs.
I searched for an answer on Google, YouTube, etc. - no result.
I found some code similar to mine, but it's not enough - it seems some settings are required on the AWS side.
What's wrong?

You have two options:
Write log files to disk and use the CloudWatch agent to ship those files to CloudWatch: https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/QuickStartWindows2016.html
With this option you don't need to configure anything AWS-related in the program, but you do have to install and configure the agent.
Use the AWS.Logger NuGet package and configure it to send the logs to CloudWatch; in this case you don't need the agent: https://github.com/aws/aws-logging-dotnet/tree/master/samples/AspNetCore
With this option you need to create an AWS API (IAM) user with CloudWatch Logs write permission and put that user's credentials into the AWS.Logger configuration. Show the configuration code you used if you need advice on this; a minimal sketch follows below.
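For reference, a minimal console wiring for option 2 could look something like this. This is only a sketch, not your code: "MyLogGroup" and the region are placeholders, and it assumes the AddAWSProvider(AWSLoggerConfig) overload from the AWS.Logger.AspNetCore package.
using AWS.Logger;                    // AWSLoggerConfig
using Microsoft.Extensions.Logging;  // LoggerFactory and the AddAWSProvider extension

class Demo
{
    static void Main()
    {
        // Dispose the factory before the process exits so the provider can flush
        // its buffered events to CloudWatch.
        using var loggerFactory = LoggerFactory.Create(builder =>
            builder.AddAWSProvider(new AWSLoggerConfig("MyLogGroup") // existing log group
            {
                Region = "eu-west-1" // region where the log group lives
            }));

        var logger = loggerFactory.CreateLogger<Demo>();
        logger.LogInformation("Hello from AWS.Logger");

        // Credentials are resolved via the usual AWS SDK chain: environment variables
        // (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY), the shared credentials file,
        // or an attached role. That identity needs CloudWatch Logs write permissions.
    }
}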

I had a similar problem, which turned out to be more config-related.
Firstly, make sure that you have the AWS Toolkit for Visual Studio installed and set up with the appropriate user. I use an IAM user with the correct policy permissions to read and write CloudWatch logs.
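For reference, the IAM user only needs CloudWatch Logs write permissions. A minimal policy along these lines should be a reasonable starting point (a sketch only - tighten the Resource ARN for anything beyond testing):
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents",
                "logs:DescribeLogGroups",
                "logs:DescribeLogStreams"
            ],
            "Resource": "arn:aws:logs:*:*:*"
        }
    ]
}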
Here's a copy of my basic console test that works correctly:
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Configuration;
using System;
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;

namespace Runner
{
    class Program
    {
        public static async Task Main(string[] args)
        {
            var services = ConfigureServices(new ServiceCollection())
                .BuildServiceProvider();
            await services.GetService<App>().RunAsync();
        }

        private static IServiceCollection ConfigureServices(IServiceCollection services)
        {
            // Helper (not shown) that builds the IConfiguration from appsettings.json
            var configuration = ConfigurationFactory.GetConfiguration();
            services
                .AddSingleton(configuration)
                .AddLogging(builder =>
                {
                    builder
                        .AddConfiguration(configuration.GetSection("Logging"))
                        .AddConsole()
                        .AddDebug()
                        .AddAWSProvider(configuration.GetAWSLoggingConfigSection().Config);
                });

            // add app
            services.AddTransient<App>();
            return services;
        }
    }

    public class App
    {
        private readonly ILogger<App> Logger;

        public App(ILogger<App> logger)
        {
            Logger = logger;
        }

        public async Task RunAsync()
        {
            try
            {
                Logger.LogTrace("LogTrace", "{\"Test\":1}");
                Logger.LogInformation("LogInformation", "{\"Test\":2}");
                Logger.LogWarning("LogWarning", "{\"Test\":3}");
                Logger.LogDebug("LogDebug", "{\"Test\":4}");
                Logger.LogError("LogError", "{\"Test\":5}");
                Logger.LogCritical("LogCritical", "{\"Test\":6}");

                // Give the console logger time to catch up before breaking
                Thread.Sleep(3000);
                Debugger.Break();
            }
            catch (Exception)
            {
                throw;
            }
        }
    }
}
And my appsettings.json file is:
{
    "Logging": {
        "Region": "eu-west-1",
        "LogGroup": "/dev/runner",
        "IncludeLogLevel": true,
        "IncludeCategory": true,
        "IncludeNewline": true,
        "IncludeException": true,
        "IncludeEventId": false,
        "IncludeScopes": false,
        "LogLevel": {
            "Default": "Debug",
            "System": "Information",
            "Microsoft": "Information"
        },
        "Console": {
            "LogLevel": {
                "Default": "Error",
                "System": "Information",
                "Microsoft": "Information"
            }
        },
        "Debug": {
            "LogLevel": {
                "Default": "Trace",
                "System": "Information",
                "Microsoft": "Information"
            }
        }
    }
}
The Thread.Sleep is there to let the console logger catch up with itself - if you just break, you often don't see anything.
Similarly, if you quit the program while it's stopped at the breakpoint, the AWS logger won't flush its buffers to CloudWatch (it will just create the log stream and leave it empty), so let the program run to completion to populate the log stream.
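One way to guarantee that flush in the console example above is to dispose the service provider (and with it the logger providers) before Main returns. A minimal sketch of that change, assuming the AWS provider flushes its buffer when disposed:
public static async Task Main(string[] args)
{
    // Keeping the provider in a using block means that, when Main finishes, the
    // registered ILoggerProvider instances (including the AWS one) are disposed
    // and get a chance to flush any buffered events to CloudWatch.
    using (var services = ConfigureServices(new ServiceCollection()).BuildServiceProvider())
    {
        await services.GetRequiredService<App>().RunAsync();
    }
}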

Related

Logger injected into the constructor behaves differently from logger injected into the function

I created the following function:
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

namespace FunctionApp1
{
    public class DINotWorking
    {
        private readonly ILogger _log;

        public DINotWorking(ILogger<DINotWorking> log)
        {
            _log = log;
            _log.LogInformation("I can log in the constructor");
        }

        [FunctionName("HttpTrg1")]
        public IActionResult Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
            ILogger logger1)
        {
            _log.LogInformation("Constructor injected log works!");
            logger1.LogInformation("Function injected log works!");

            string name = req.Query["name"];
            string responseMessage = $"Hello, '{name}'. This HTTP triggered function executed successfully.";
            return new OkObjectResult(responseMessage);
        }
    }
}
host.json
{
    "version": "2.0",
    "logging": {
        "fileLoggingMode": "always",
        "applicationInsights": {
            "samplingSettings": {
                "isEnabled": true,
                "excludedTypes": "Request"
            }
        },
        "logLevel": {
            "FunctionApp1.DINotWorking": "Information",
            "FunctionApp1.DINotWorking.User": "Information"
        }
    }
}
The function has two loggers available in it:
_log is a dependency-injected class member of type ILogger<DINotWorking>.
logger1 is a dependency-injected function parameter of type ILogger.
I'll refer to them as #1 and #2 from here on.
They behave differently in different environments:
Local development - Azure Functions tools
In the local development environment (Azure Functions tools), #1 and #2 behave identically.
Deployed - Azure Function portal - Application Insights
In the Azure Function portal, both log to Application Insights, so #1 and #2 behave identically there as well.
Deployed - Azure Function portal - Filesystem Logs
In the Azure Function portal, #1 log entries do not show up in the Filesystem logs.
Question: Why do #1 log entries not show up in the Filesystem logs?
Azure Functions logging is a black box to me.
Obviously, there is a difference between the constructor-injected ILogger and the function-injected ILogger.
Does this have anything to do with the log categories explained here?
How can I see the difference in my code?
How can I make them behave the same in all environments via host.json?
Update #1
Following @Hari Krishna's suggestion, I changed the host.json file to the following.
{
    "version": "2.0",
    "logging": {
        "applicationInsights": {
            "samplingSettings": {
                "isEnabled": true,
                "excludedTypes": "Request"
            }
        },
        "fileLoggingMode": "always",
        "logLevel": {
            "FunctionApp1.HttpTrg1": "Information",
            "FunctionApp1.HttpTrg1.User": "Information"
        }
    }
}
The issue still persists.
How can I diagnose it? Is there any way the host can tell me why it is omitting constructor-injected log entries from the file system logs?
AFAIK,
In Azure Function portal, #1 log entries do not show in the Filesystem logs. Why?
By default, the File System Logs show the function execution logs (executing, executed, and errors).
We can configure host.json so that all the logs (the App Insights logs as well) also appear in the File System Logs, because whatever logs you want pushed to Application Insights have to be configured there anyway:
{
    "version": "2.0",
    "logging": {
        "applicationInsights": {
            "samplingSettings": {
                "isEnabled": true,
                "excludedTypes": "Request"
            }
        },
        "fileLoggingMode": "always",
        "logLevel": {
            "Function.HttpTrigger1": "Information",
            "default": "None"
        }
    }
}
I have configured fileLoggingMode in host.json to log both application and function logs. The output is:
File System Logs:
App Insights Logs:
For more information on the fileLoggingMode attribute, refer to this MS Doc and this open GitHub issue.
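For the code in the question specifically, my understanding (an assumption worth verifying) is that the parameter-injected logger writes under the category Function.HttpTrg1.User, while the constructor-injected ILogger<DINotWorking> writes under the full type name FunctionApp1.DINotWorking, so covering both in host.json would look roughly like this:
{
    "version": "2.0",
    "logging": {
        "fileLoggingMode": "always",
        "logLevel": {
            "Function.HttpTrg1.User": "Information",
            "FunctionApp1.DINotWorking": "Information"
        }
    }
}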
Updated Answer:
Please check my GitHub repository, where I have uploaded code similar to yours and shown how to configure host.json as well as the function code.
Results:
Locally, I can see both logs:
After deploying to the Azure Function App, the App Insights logs were:
And the File System Logs were:

StackExchange.Exceptional not logging exceptions from Blazor

I have a Blazor Server app running on .NET 5.0 and I'm trying to switch from ElmahCore to Exceptional.
But I can't get it to log Blazor exceptions.
When I throw an exception in an MVC controller it gets logged, but if I throw one in e.g. OnAfterRenderAsync, nothing gets logged.
What do I need to configure to get Blazor exceptions logged with Exceptional?
Also, in ElmahCore I could use ElmahExtensions.RiseError(exception); to log an exception I caught/handled in code but still wanted to show up in the error log. Is there something similar for Exceptional?
I configured Exceptional with the default configuration from HERE.
I downloaded the package first:
Install-Package StackExchange.Exceptional.AspNetCore -Version 2.2.17
Next, add the Exceptional configuration section to the appsettings.json file, which contains the connection string used to store errors in the database:
{
    "Logging": {
        "LogLevel": {
            "Default": "Information",
            "Microsoft": "Warning",
            "Microsoft.Hosting.Lifetime": "Information"
        }
    },
    "AllowedHosts": "*",
    "Exceptional": {
        "Store": {
            "ApplicationName": "Test",
            "Type": "SQL",
            "ConnectionString": "Your Db"
        }
    }
}
Register the AddExceptional service in the ConfigureServices method:
services.AddExceptional(Configuration.GetSection("Exceptional"), settings =>
{
    settings.UseExceptionalPageOnThrow = HostingEnvironment.IsDevelopment();
});
After ConfigureServices, add the UseExceptional middleware (app.UseExceptional();) in the Configure method to handle errors:
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    app.UseExceptional();
    // ...
}
Then I threw an exception: throw new NotImplementedException();
Then you can see the record:
According to the article you provided, it should work. Which part are you having a problem with?
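Regarding the ElmahExtensions.RiseError part of the question: as far as I know, Exceptional exposes Log extension methods on Exception for exactly this, so a handled exception can still be written to the configured error store. A rough sketch (treat the exact overloads as an assumption and check the Exceptional docs for your version):
using System;
using Microsoft.AspNetCore.Http;
using StackExchange.Exceptional; // Log(...) extension methods on Exception

public static class ManualErrorLogging
{
    // Hypothetical helper, roughly the counterpart of ElmahCore's ElmahExtensions.RiseError:
    // write a caught/handled exception to the Exceptional store without re-throwing it.
    public static void LogHandled(Exception ex, HttpContext context)
    {
        ex.Log(context);
    }
}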

ASP.NET Core 3 logging to output window

I have an ASP.NET Core 3 Web API project and I want to time the execution of some methods with a Stopwatch, sending the result to the Visual Studio 2019 output window using Debug.WriteLine(...).
The issue is that this output window is flooded with hundreds of lines of information per second from the .NET Core framework itself. They are not errors or warnings.
I've tried setting the log level to Warning or None in appsettings.json, but nothing changed.
"Logging": {
"LogLevel": {
"Default": "None",
"Microsoft": "None",
"Microsoft.Hosting.Lifetime": "None"
}
},
I've tried disabling it by going to Tools -> Options -> Projects and Solutions -> Web Projects and checking "Disable local Application Insights for Asp.Net Core web projects", and nothing changed.
Is there a way to disable all that trash sent to the output window?
*** UPDATE ***
I've also tried adding the following:
public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureLogging(config =>
        {
            config.ClearProviders();
        })
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder.UseStartup<Startup>();
        });
and nothing has changed.
****** UPDATE 2 ******
I also tried modifying appsettings.Development.json, setting everything to None and then setting what Fei Han suggested, but nothing changed again.
Is there a way to disable all that trash sent to the output window?
You can try adding the following code to the appsettings.Development.json file to set the LogLevel of Microsoft to None for the Debug provider:
{
    "Logging": {
        //...
        // other providers
        //...
        "Debug": { // Debug provider
            "IncludeScopes": true,
            "LogLevel": {
                "Microsoft": "None"
            }
        }
    }
}
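If the JSON overrides still don't take effect, the same filters can be applied in code on top of ClearProviders. A minimal sketch (note that Debug.WriteLine bypasses the ILogger pipeline entirely, so your own stopwatch output is not affected by these filters):
// Requires: using Microsoft.Extensions.Hosting; using Microsoft.Extensions.Logging;
public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureLogging(logging =>
        {
            // Drop the default providers (Console, Debug, EventSource, ...)
            logging.ClearProviders();
            // Add back only the Debug provider, with framework categories silenced
            logging.AddDebug();
            logging.AddFilter("Microsoft", LogLevel.None);
            logging.AddFilter("System", LogLevel.None);
        })
        .ConfigureWebHostDefaults(webBuilder => webBuilder.UseStartup<Startup>());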

Azure Function stops working when I use dependencies, works locally

I'm trying to set up an Azure Function: Linux-based, consumption plan, queue-triggered.
It works perfectly locally in debug (said every programmer ever), but when deployed nothing happens, and I can't find any logs.
I started over and uploaded an empty function - it worked, but as soon as I added my own libraries, it stopped working.
I then tried removing my libraries and re-uploading the 'hello world' function, but it still doesn't work.
This is the host.json:
{
    "logging": {
        "fileLoggingMode": "always",
        "logLevel": {
            "default": "Information",
            "Host.Results": "Information",
            "Function": "Information",
            "Host.Aggregator": "Information"
        },
        "applicationInsights": {
            "samplingExcludedTypes": "Request",
            "samplingSettings": {
                "isEnabled": true
            }
        },
        "console": {
            "isEnabled": "true"
        }
    },
    "Values": {
        "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=****;AccountKey=*****;BlobEndpoint=https://***.blob.core.windows.net/;TableEndpoint=https://*****.table.core.windows.net/;QueueEndpoint=https://****.queue.core.windows.net/;FileEndpoint=https://****.file.core.windows.net/",
        "FUNCTIONS_WORKER_RUNTIME": "dotnet"
    },
    "version": "2.0"
}
The function code (without my libraries) works on first upload only.
[FunctionName("EmailQueueWorker")]
//public static async Task Run(
public static async Task Run(
[QueueTrigger(queueName: "email", Connection = "AzureWebJobsStorage")] string queueItem,
ILogger log
)
{
log.LogWarning("Start run()");
}
What am I doing wrong (or where can I find logs? Application Insights is empty)? Thanks
I ran into the same problem a week or two ago; I'd bet good money the problem is your connection to the queue. For comparison, this is my full and complete host.json for my (working) queue trigger function:
{
    "version": "2.0",
    "logging": {
        "applicationInsights": {
            "samplingSettings": {
                "isEnabled": true
            }
        },
        "fileLoggingMode": "always",
        "logLevel": {
            "default": "Information",
            "Host.Results": "Error",
            "Function": "Trace",
            "Host.Aggregator": "Trace"
        }
    }
}
Connection String
Our logLevel sections are a bit different, and you'll note there aren't any connection strings in mine. I'm still fairly new to Azure, but from what I've learned, that isn't where they go.
In Visual Studio 2019, right click on the Project, then Publish. Under Actions, click Manage Azure App Service settings. There, you can add any needed connection string settings. If you need to specify a storage account, the setting name should be the name of the storage account plus "_STORAGE". For example, if your storage account was named MyVault then the name of the setting would be MyVault_STORAGE.
In VS Code, it's a bit different to get to: look under Azure > Functions, be sure you select your Azure subscription (not the local copy!), and drill down into the function's Application Settings, where you can add/edit them.
In the Azure portal, you can manage app settings this way.
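When running locally, those values normally live in local.settings.json (which is not deployed) rather than in host.json. A typical shape, with placeholders for the secrets, looks like this:
{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...",
        "FUNCTIONS_WORKER_RUNTIME": "dotnet"
    }
}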
Logs
In the Azure portal, start by going to Function App and clicking on your app's name. Then, under the Functions sub-menu for that app, click Functions. You'll see a list of all the individual functions that make up your queue trigger app; among them should be EmailQueueWorker - click on it. Now you should see the execution count, and you can click Monitor in the left-hand menu, then Logs in the middle area. You can Start/Stop/Clear as needed.
For whatever reason, I find that I see the actual log data a lot faster when I use either Visual Studio 2019 or VS Code to stream it rather than the web console. There seems to be a bit of a delay at times with the web console.

Microsoft Service Fabric and logging

I am investigating the best way to implement logging for my Service Fabric stateless API and have been somewhat overwhelmed by the variety of solutions for what appears to be a relatively simple requirement.
I have implemented logging using WebHostBuilder().ConfigureLogging and have successfully logged my trace messages to the Debug window, and via Serilog.Extensions.Logging.File I have also managed to dump this log to a file, all of this controlled via a #if DEBUG directive, and with this I was happy.
Then I needed to work out what would happen when deployed to a cluster in Azure, and this is when I became overwhelmed!
I thought that I could register a ServiceEventSource-type logger in the same manner as I did with AddDebug, but it was not that simple.
So I have managed to get my logs to appear within the Diagnostics window using ServiceEventSource.Current.Message, but these logs are not integrated with the ASP.NET logging framework. :/
My continued investigation has led me to understand that Service Fabric logging should be directed towards Application Insights, albeit via many, many articles with varying degrees of detail and applicability to the latest framework.
My current thinking is that I need to remove the ASP.NET logging and implement something such as EventFlow to allow my trace messages to be generated and subsequently piped through to Application Insights for interrogation at a later date. Is my thinking correct?
Or am I going off at a tangent?
UPDATE 15/05/2019
After deploying this to Azure Service Fabric, the log files were not populated. This appears to be an incompatibility between the Serilog.Sinks.AzureBlobStorage NuGet package and the .NET Core 2.2.0 version my project was targeting.
I have posted a ticket on the GitHub page and await a response; in the short term you can download the source code, migrate the project to a Microsoft.NETCore.App 2.2.0 project, reference it directly, and everything works perfectly.
ORIGINAL ANSWER
I seem to do this quite a lot, answering my own question, but here goes again. It has taken me a day or two to get to the bottom of this, so I thought I would share my findings and solution with the community in case they help somebody else in the future, and/or somebody has something to add or even contradict me - I'd welcome any input.
My development environment is as follows: -
Microsoft Visual Studio 15.9.11
Windows 10 Professional
SDK: Microsoft.NETCore.App 2.2.0
I created a new Service Fabric Stateless Service the purpose of this service is to provide RESTful endpoints to a Angular 7 front end web application.
My requirement was to provide logging information in both my development environment via the Debug window and to also provide similar logging information whilst my apps are being hosted within a Service Fabric Cluster on Azure.
NUGET Package Installations
Microsoft.Extensions.Logging (2.2.0)
Serilog.AspNetCore (2.1.1)
Serilog.Enrichers.Environment (2.1.3)
Serilog.Settings.Configuration (3.0.1)
Serilog.Sinks.Debug (1.0.1)
Serilog.Sinks.AzureBlobStorage (1.3.0)
Controlling Development & Production Environments
I control the development & production environments using the DEBUG pre-processor directive to include either the appsettings.json or appsettings.Development.json file.
My appsettings.Development.json file is like this:
{
    "AppSettings": {
        // My app settings, not applicable to this
    },
    "Serilog": {
        "Using": [ "Serilog.Sinks.Debug" ],
        "MinimumLevel": {
            "Default": "Verbose",
            "Override": {
                "Microsoft": "Warning",
                "System": "Warning"
            }
        },
        "WriteTo": [
            {
                "Name": "Debug",
                "Args": {
                    "outputTemplate": "[{Timestamp:HH:mm:ss} {MachineName} {Level:u3}] {Message:lj}{NewLine}{Exception}"
                }
            }
        ],
        "Enrich": [ "WithMachineName" ]
    }
}
My appsettings.json file is like this:
{
    "AppSettings": {
        // My app settings, not applicable to this
    },
    "Serilog": {
        "Using": [ "Serilog.Sinks.AzureBlobStorage" ],
        "MinimumLevel": {
            "Default": "Information",
            "Override": {
                "Microsoft": "Warning",
                "System": "Warning"
            }
        },
        "WriteTo": [
            {
                "Name": "AzureBlobStorage",
                "Args": {
                    "outputTemplate": "[{Timestamp:HH:mm:ss} {MachineName} {Level:u3}] {Message:lj}{NewLine}{Exception}",
                    "connectionString": "[Connection String]",
                    "storageContainerName": "app",
                    "storageFileName": "{yyyy}-{MM}-{dd}.log"
                }
            }
        ],
        "Enrich": [ "WithMachineName" ]
    }
}
As you can see from the above settings files, I output to the Debug window when in development and to Azure Blob Storage when deployed to a Service Fabric cluster in Azure.
To see how the Serilog logging is implemented, simply review my stateless service class implementation below, which shows how the two appsettings.json files are toggled depending on the environment and how the Serilog logger is inserted into the dependency injection system via the UseSerilog extension method.
using System.Collections.Generic;
using System.Fabric;
using System.IO;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.ServiceFabric.Services.Communication.AspNetCore;
using Microsoft.ServiceFabric.Services.Communication.Runtime;
using Microsoft.ServiceFabric.Services.Runtime;
using Serilog;

namespace Caboodal.Manatee.ServiceFabric.Api.Identity
{
    internal sealed class Identity : StatelessService
    {
        public Identity(StatelessServiceContext context)
            : base(context)
        {
        }

        private string AppSettingsFilename
        {
            get
            {
#if DEBUG
                return "appsettings.Development.json";
#else
                return "appsettings.json";
#endif
            }
        }

        protected override IEnumerable<ServiceInstanceListener> CreateServiceInstanceListeners()
        {
            var appSettings = GetAppSettings();
            Log.Logger = new LoggerConfiguration()
                .ReadFrom.Configuration(appSettings)
                .CreateLogger();

            return new[]
            {
                new ServiceInstanceListener(
                    serviceContext =>
                        new KestrelCommunicationListener(
                            serviceContext,
                            "ServiceEndpoint",
                            (url, listener) =>
                            {
                                ServiceEventSource.Current.ServiceMessage(serviceContext, $"Starting Kestrel on {url}");
                                return new WebHostBuilder()
                                    .UseKestrel()
                                    .ConfigureAppConfiguration(
                                        (builderContext, config) =>
                                        {
                                            config.AddJsonFile(AppSettingsFilename, false, true);
                                        })
                                    .ConfigureServices(
                                        services => services
                                            .AddSingleton(serviceContext))
                                    .UseContentRoot(Directory.GetCurrentDirectory())
                                    .UseSerilog()
                                    .UseStartup<Startup>()
                                    .UseServiceFabricIntegration(listener, ServiceFabricIntegrationOptions.None)
                                    .UseUrls(url)
                                    .Build();
                            }))
            };
        }

        private IConfigurationRoot GetAppSettings()
        {
            return new ConfigurationBuilder()
                .SetBasePath(Directory.GetCurrentDirectory())
                .AddJsonFile(AppSettingsFilename)
                .Build();
        }
    }
}
Using the Logger within a controller
Because the ILogger instance is registered with dependency injection, it can simply be accessed within your controller classes like any other dependency, e.g.:
[Authorize]
[ApiController]
[Route("[controller]")]
public class UserController : ApiController
{
    private readonly IUserService _userService;
    private readonly ILogger<UserController> _logger;

    public UserController(IUserService userService, ILogger<UserController> logger)
    {
        _userService = userService;
        _logger = logger;
    }

    [AllowAnonymous]
    [HttpPost("authenticate")]
    public IActionResult Authenticate([FromBody] DtoAuthenticateRequest request)
    {
        // Adding log entries
        _logger.Log(LogLevel.Debug, "Here is a log entry");

        // Some code in here
        return Ok(response);
    }
}
I got very sidetracked by the ServiceEventSource.cs class, but with Serilog in place I have now ignored that aspect of the project template.
If you wish to output your logs to other data consumers, or simply in different formats, review the Serilog website here for the complete list of available sinks, Application Insights being one of the many.
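For example, switching the production sink from Azure Blob Storage to Application Insights would mean adding the Serilog.Sinks.ApplicationInsights package and building the logger along these lines. Treat the exact overload as an assumption based on that package's documented API and verify it against its README:
// Sketch only: requires the Serilog.Sinks.ApplicationInsights package.
using Microsoft.ApplicationInsights.Extensibility;
using Serilog;

Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Information()
    .Enrich.WithMachineName()
    .WriteTo.ApplicationInsights(
        TelemetryConfiguration.CreateDefault(),  // picks up the configured instrumentation key
        TelemetryConverter.Traces)               // log events become App Insights trace telemetry
    .CreateLogger();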
