I have a simple Azure function with HttpTrigger like so:
[FunctionName("Heartbeat")]
public async Task<IActionResult> Heartbeat(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "heartbeat")] HttpRequest req,
ILogger log)
{
log.Log(LogLevel.Information, "Received heartbeat request");
var status = await _healthCheck.CheckHealthAsync();
return new OkObjectResult(Enum.GetName(typeof(HealthStatus), status.Status));
}
local.settings.json:
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=false",
"FUNCTIONS_WORKER_RUNTIME": "dotnet",
"APPINSIGHTS_INSTRUMENTATIONKEY": "*****"
},
"ConnectionStrings": {
"ServiceDb": "Server=.;Initial Catalog=Acquire;Integrated Security=True;"
},
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft": "Warning",
"Microsoft.Hosting.Lifetime": "Information"
}
},
When I run the function locally, I'm able to hit a breakpoint inside Heartbeat() and get 200 OK. However, when it's deployed to Azure, it does not work and I get 404 Not Found.
Local: (screenshot: request returns 200 OK)
Azure: (screenshot: request returns 404 Not Found)
What is the problem? Why can't I execute an HttpTrigger?
Also, is there a way to view the local.settings.json configuration in Azure? I thought these settings would somehow be populated from local.settings.json in Azure (like connection strings), yet they're blank. Is there a way to show these settings in Azure?
is there a way to view local.settings.json configuration in Azure?
There is no local.settings.json in Azure, hence the name "local".
You should add settings with the same names to your App Settings.
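For example, the portal's Configuration → Advanced edit view takes a JSON array of settings; a sketch mirroring the Values section from your local.settings.json (values redacted) would be:

```json
[
  { "name": "AzureWebJobsStorage", "value": "<storage connection string>", "slotSetting": false },
  { "name": "FUNCTIONS_WORKER_RUNTIME", "value": "dotnet", "slotSetting": false },
  { "name": "APPINSIGHTS_INSTRUMENTATIONKEY", "value": "<instrumentation key>", "slotSetting": false }
]
```

Connection strings (like ServiceDb) go in the separate Connection strings section of the same blade.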
There should be an option in Azure to give you the function URL; for this function it should look something like https://<your-app>.azurewebsites.net/api/heartbeat
I found the solution. My issue was that the runtime version was set to ~1 instead of ~3. Another change I had to make, for IOptions to bind, was to edit the project file and add these options:
<Content Include="local.settings.json">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
<CopyToPublishDirectory>PreserveNewest</CopyToPublishDirectory>
</Content>
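For reference, the runtime version shown in the portal maps to the FUNCTIONS_EXTENSION_VERSION application setting, so the ~1 → ~3 fix can also be expressed as an app setting (a sketch in the portal's Advanced edit format):

```json
[
  { "name": "FUNCTIONS_EXTENSION_VERSION", "value": "~3", "slotSetting": false }
]
```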
To make logging work:
https://nuggets.hammond-turner.org.uk/2019/10/tuesday-quickie-ilogger-in-azure.html
I added this to host.json and now I see all logs fine:
{
"version": "2.0",
"logging": {
"logLevel": {
"ServiceLogger": "Information"
}
}
}
All the answers from the other users got me on the right path, though.
Related
We work in .NET 6 and want to log all incoming requests. We plan to allow this primarily in DEV and TEST environments but potentially and temporarily also in PROD.
The new HttpLogger sounds like a perfect match, but it outputs to the standard ILogger, which in our case is Elastic. For GDPR reasons we want to avoid potentially sensitive data being exposed there in an uncontrolled way.
Does anyone have another solution, such as:
Redirect messages from HttpLogger to other storage
Create custom middleware to be able to sniff requests and output to custom storage
I found this interesting post on reading the body twice but was not able to make it work: https://codetalk.in/posts/2022/01/04/read-request-body-multiple-times-in-asp-dot-net-core
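For the custom-middleware option, here is a minimal sketch (assuming ASP.NET Core 6; the "RequestSniffer" category name is hypothetical) that buffers the request body, logs it under a dedicated category you can route to a separate sink, and rewinds the stream for the rest of the pipeline:

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;

public class RequestSniffingMiddleware
{
    private readonly RequestDelegate _next;
    private readonly ILogger _logger;

    public RequestSniffingMiddleware(RequestDelegate next, ILoggerFactory loggerFactory)
    {
        _next = next;
        // A dedicated category lets configuration route these entries to a
        // separate sink (e.g. a file logger) instead of Elastic.
        _logger = loggerFactory.CreateLogger("RequestSniffer");
    }

    public async Task InvokeAsync(HttpContext context)
    {
        context.Request.EnableBuffering();              // allow the body to be read again downstream
        using var reader = new StreamReader(context.Request.Body, leaveOpen: true);
        var body = await reader.ReadToEndAsync();
        context.Request.Body.Position = 0;              // rewind so model binding still works
        _logger.LogInformation("Request {Path} body: {Body}", context.Request.Path, body);
        await _next(context);
    }
}
```

Register it early in Program.cs with app.UseMiddleware<RequestSniffingMiddleware>(). EnableBuffering is what makes a second read of the body possible, which is the trick the linked post describes.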
You can do that just by configuration. For example, in the production app settings, disable HTTP logging for Elastic and enable it for a file logger, something like this (it really depends on your loggers and configuration):
{
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft.AspNetCore.HttpLogging.HttpLoggingMiddleware": "Information"
},
"Elastic": {
"IncludeScopes": true,
"LogLevel": {
"Default": "Information",
"Microsoft.AspNetCore.HttpLogging.HttpLoggingMiddleware": "None"
}
},
"File":
{
"IncludeScopes": true,
"LogLevel": {
"Default": "None",
"Microsoft.AspNetCore.HttpLogging.HttpLoggingMiddleware": "Information"
}
}
}
}
To be more restrictive, you can also configure HttpLoggingOptions for production so that certain fields are not logged at all.
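A minimal sketch of that restriction (assuming ASP.NET Core 6): limit HttpLoggingFields so sensitive headers and bodies are never captured by any sink in the first place.

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.HttpLogging;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddHttpLogging(options =>
{
    // Log only the request path and response status code;
    // skip headers, query strings, and bodies entirely.
    options.LoggingFields = HttpLoggingFields.RequestPath
                          | HttpLoggingFields.ResponseStatusCode;
});

var app = builder.Build();
app.UseHttpLogging();
app.Run();
```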
I created the following function:
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
namespace FunctionApp1
{
public class DINotWorking
{
private readonly ILogger _log;
public DINotWorking(ILogger<DINotWorking> log)
{
_log = log;
_log.LogInformation("I can log in the constructor");
}
[FunctionName("HttpTrg1")]
public IActionResult Run(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,ILogger logger1)
{
_log.LogInformation("Constructor-injected log works!");
logger1.LogInformation("Function-injected log works!");
string name = req.Query["name"];
string responseMessage = $"Hello, '{name}'. This HTTP triggered function executed successfully.";
return new OkObjectResult(responseMessage);
}
}
}
host.json
{
"version": "2.0",
"logging": {
"fileLoggingMode": "always",
"applicationInsights": {
"samplingSettings": {
"isEnabled": true,
"excludedTypes": "Request"
}
},
"logLevel": {
"FunctionApp1.DINotWorking": "Information",
"FunctionApp1.DINotWorking.User": "Information"
}
}
}
The function has two loggers available in it:
_log is a dependency-injected class member of type ILogger<DINotWorking>.
logger1 is a dependency-injected function parameter of type ILogger.
I am going to reference them as #1 and #2 going forward.
They behave differently in different environments:
Local development - Azure Functions Core Tools
In the local development environment (Azure Functions Core Tools), #1 and #2 behave identically:
Deployed - Azure Function portal - Application Insights
In the Azure Function portal, both log to Application Insights, i.e. #1 and #2 behave identically:
Deployed - Azure Function portal - Filesystem Logs
In the Azure Function portal, #1 log entries do not show in the Filesystem logs. Why?
Question: why do #1 log entries not show up in the Filesystem logs?
Azure function logging is a black box to me.
Obviously, there is a difference between a constructor-injected ILogger and a function-injected ILogger.
Does this have anything to do with the log categories explained here?
How can I see the difference in my code?
How can I make them behave the same in all environments via host.json?
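One way to see the difference in code (a sketch; the category names are my assumption about what the runtime assigns): the constructor-injected ILogger<DINotWorking> logs under the category "FunctionApp1.DINotWorking", while the parameter-injected logger logs under "Function.HttpTrg1.User". Both can be reproduced explicitly from an ILoggerFactory:

```csharp
using Microsoft.Extensions.Logging;

public class CategoryDemo
{
    private readonly ILogger _typed;
    private readonly ILogger _functionStyle;

    public CategoryDemo(ILoggerFactory factory)
    {
        // Same category ILogger<DINotWorking> would use (namespace + type name):
        _typed = factory.CreateLogger("FunctionApp1.DINotWorking");
        // Category the Functions runtime uses for the parameter-injected logger:
        _functionStyle = factory.CreateLogger("Function.HttpTrg1.User");
    }
}
```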
Update #1
Per @Hari Krishna's suggestion, I've changed the host.json file to the following.
{
"version": "2.0",
"logging": {
"applicationInsights": {
"samplingSettings": {
"isEnabled": true,
"excludedTypes": "Request"
}
},
"fileLoggingMode": "always",
"logLevel": {
"FunctionApp1.HttpTrg1": "Information",
"FunctionApp1.HttpTrg1.User": "Information"
}
}
}
The issue still persists (outstanding).
How can I diagnose this issue? Is there any way the host can tell me why it is ignoring constructor-injected log entries in the file system logs?
AFAIK,
In Azure Function portal, #1 log entries do not show in the Filesystem logs. Why?
By default, the File System Logs show only function execution logs such as executing, executed, and errors.
We can configure the host.json file to get all the logs (App Insights logs + File System Logs) into the File System Logs, because whatever logs you want pushed to Application Insights must be configured there as well:
{
"version": "2.0",
"logging": {
"applicationInsights": {
"samplingSettings": {
"isEnabled": true,
"excludedTypes": "Request"
}
},
"fileLoggingMode": "always",
"logLevel": {
"Function.HttpTrigger1": "Information",
"default": "None"
}
}
}
I have configured the fileLoggingMode in host.json to log both (application & function). Then the output is:
File System Logs:
App Insights Logs:
For more information on the fileLoggingMode attribute, refer to this MS Doc and this open GitHub issue.
Updated Answer:
Please check my GitHub repository, where I have uploaded code similar to yours and shown how to configure host.json as well as the function code.
Results:
Locally I can see both the logs:
After deploying to Azure Function App, App Insights Logs were:
Also, the File System Logs were:
In WebForms we had a web.config file where we could call ConfigurationManager.AppSettings["SomeKey"]; to retrieve the value from a key-value pair.
I've just started a project in .NET 5.0 and there doesn't seem to be a simple way to do something so trivial.
I've looked online and have been unsuccessful in following tutorials on how to access these keys in appsettings.json from a .cshtml file using the @ notation.
appsettings.json:
{
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft": "Warning",
"Microsoft.Hosting.Lifetime": "Information"
}
},
"AllowedHosts": "*",
"MyKey": "wwwwwwwwwww"
}
Index.cshtml:
<h1>
Title - @ConfigurationManager.AppSettings["MyKey"];
</h1>
The above illustrates what I am trying to achieve, is there a simple way to do this rather than creating classes etc as I don't seem to be able to follow their examples.
To access configuration settings in a view in a .NET project, you should be able to use the @ notation. This is done by injecting the configuration into the page:
@page
@model Test5Model
@using Microsoft.Extensions.Configuration
@inject IConfiguration Configuration

Configuration value for 'MyKey': @Configuration["MyKey"]
Take this appsettings.json for example:
{
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft": "Warning",
"Microsoft.Hosting.Lifetime": "Information"
}
},
"AllowedHosts": "*",
"Test": "Hello World",
"ConnectionStrings": {
"SomeContext": "SomeConnectionStringProperties"
},
"SectionOne": {
"SectionOneTestVal": "SomeTestValue"
}
}
In order to access the Test key, it would simply be @Configuration["Test"], while to access the SectionOneTestVal "key" in the SectionOne section, you would do something like @Configuration.GetSection("SectionOne")["SectionOneTestVal"]:
Thus adding this to a view:
<p>@Configuration["Test"]</p>
<p>@Configuration.GetSection("SectionOne")["SectionOneTestVal"]</p>
...would yield:
For more information and examples, also check out dependency injection into views.
I'm trying to setup an Azure Function, Linux based in consumption mode, Queue triggered.
It works perfectly locally in debug (said every programmer ever), but when deployed, nothing happens, and I can't find any logs.
I started over, uploaded an empty function - it worked, but as soon as I add my own libraries, it stopped working.
I then tried to remove my libraries and re-upload the 'hello world' function but it still doesn't work.
This is the host.json:
{
"logging": {
"fileLoggingMode": "always",
"logLevel": {
"default": "Information",
"Host.Results": "Information",
"Function": "Information",
"Host.Aggregator": "Information"
},
"applicationInsights": {
"samplingExcludedTypes": "Request",
"samplingSettings": {
"isEnabled": true
}
},
"console": {
"isEnabled": "true"
}
},
"Values": {
"AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=****;AccountKey=*****;BlobEndpoint=https://***.blob.core.windows.net/;TableEndpoint=https://*****.table.core.windows.net/;QueueEndpoint=https://****.queue.core.windows.net/;FileEndpoint=https://****.file.core.windows.net/",
"FUNCTIONS_WORKER_RUNTIME": "dotnet"
},
"version": "2.0"
}
The function code (without my libraries) works on the first upload only.
[FunctionName("EmailQueueWorker")]
public static async Task Run(
[QueueTrigger(queueName: "email", Connection = "AzureWebJobsStorage")] string queueItem,
ILogger log
)
{
log.LogWarning("Start run()");
}
What am I doing wrong? (Or where can I find logs? Application Insights is empty.) Thanks
I ran into the same problem a week or two ago; I'd bet good money the problem is your connection to the queue. For comparison, this is my full and complete host.json for my (working) queue trigger function:
{
"version": "2.0",
"logging": {
"applicationInsights": {
"samplingSettings": {
"isEnabled": true
}
},
"fileLoggingMode": "always",
"logLevel": {
"default": "Information",
"Host.Results": "Error",
"Function": "Trace",
"Host.Aggregator": "Trace"
}
}
}
Connection String
Our logLevel section is a bit different, and you'll note there aren't any connection strings there. I'm still fairly new to Azure, but from what I've learned, that isn't where they go.
In Visual Studio 2019, right click on the Project, then Publish. Under Actions, click Manage Azure App Service settings. There, you can add any needed connection string settings. If you need to specify a storage account, the setting name should be the name of the storage account plus "_STORAGE". For example, if your storage account was named MyVault then the name of the setting would be MyVault_STORAGE.
In VS Code, it's a bit different to get to. You have to look under Azure, Functions, and then be sure you select your Azure subscription (not the local copy!) and drill down into the function, Application Settings, where you can add/edit.
In the Azure portal, you can manage app settings this way.
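As a sketch (the account name and key are placeholders), the app setting that the QueueTrigger's Connection = "AzureWebJobsStorage" resolves to would be a single entry:

```json
[
  { "name": "AzureWebJobsStorage", "value": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net", "slotSetting": false }
]
```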
Logs
In Azure portal, start by going to Function App. Click on your primary function name. Now, under the new menu for that function, under Functions sub-menu, click on Functions. Now you'll see a list of all the different functions that comprise your queue trigger function. Among them should be EmailQueueWorker - click on it. Now, you should see the execution count, and you can click on Monitor in the left hand menu, then Logs in the middle area. You can Start/Stop/Clear as needed.
For whatever reason, I find that I see the actual log data a lot faster when I use either Visual Studio 2019 or VS Code to stream it rather than the web console. There seems to be a bit of a delay at times with the web console.
I created a very simple console app which is supposed to write log messages to AWS CloudWatch Logs, but although the app runs, I can't find any logs on AWS.
I don't think publishing the app code makes sense: I presume it's OK, and it does not throw any exceptions.
I think the problem is located in the AWS settings. This is what I did in AWS:
Created some role; not sure why, but I did it following, more or less, what the (poor and messy) AWS documentation says. So the role is created, not exactly as the documentation describes, but it contains the required permissions for the logs. Why did I create it? I don't have a clue; my app does not use it!
Created the Log Group - ok, this parameter is what I put into the config of my app
Not sure I needed to create the log stream, but OK, I created it; when I click on it, it says "No events found." and "It appears you have not installed a CloudWatch Logs agent .."
Why do I need some agent? What is it? How do I install it? It's absolutely not clear, and being pointed to the AWS documentation is useless.
I guess these are the major things done in AWS, but still no result: nothing works, and I can't see the logs.
Searched for the answer in google, youtube, etc - no result.
Found some code which is similar to mine, but it's not enough; it seems some settings are required on the AWS side.
What's wrong?
You have two options:
Write log files to disk and use CloudWatch Agent to submit these logs to CloudWatch: https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/QuickStartWindows2016.html
With this option you don't need to configure anything related to AWS in the program, but you have to install and configure the Agent.
Use AWS.Logger NuGet package and configure it to send the logs to CloudWatch, in this case you don't need to use the Agent: https://github.com/aws/aws-logging-dotnet/tree/master/samples/AspNetCore
With this option you must create an AWS API user with CloudWatch Logs write permission and put that user's credentials into the AWS.Logger configuration. Show the configuration code you used if you need advice on it.
I had a similar problem, which turned out to be more config-related.
Firstly, make sure that you have AWS Toolkit for Visual Studio installed and set up with the appropriate user. I use an IAM User with the correct policy permissions to read and write Cloudwatch logs.
Here's a copy of my basic console test that works correctly:
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Configuration;
using System;
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;
namespace Runner
{
class Program
{
public static async Task Main(string[] args)
{
var services = ConfigureServices(new ServiceCollection())
.BuildServiceProvider();
await services.GetService<App>().RunAsync();
}
private static IServiceCollection ConfigureServices(IServiceCollection services)
{
var configuration = ConfigurationFactory.GetConfiguration();
services
.AddSingleton(configuration)
.AddLogging(builder =>
{
var config = configuration.GetSection("Logging");
builder
.AddConfiguration(config)
.AddConsole()
.AddDebug()
.AddAWSProvider(configuration.GetAWSLoggingConfigSection().Config);
});
// add app
services.AddTransient<App>();
return services;
}
}
public class App
{
private ILogger<App> Logger;
public App(ILogger<App> logger)
{
Logger = logger;
}
public async Task RunAsync()
{
try
{
Logger.LogTrace("LogTrace", "{\"Test\":1}");
Logger.LogInformation("LogInformation", "{\"Test\":2}");
Logger.LogWarning("LogWarning", "{\"Test\":3}");
Logger.LogDebug("LogDebug", "{\"Test\":4}");
Logger.LogError("LogError", "{\"Test\":5}");
Logger.LogCritical("LogCritical", "{\"Test\":6}");
Thread.Sleep(3000);
Debugger.Break();
}
catch (Exception)
{
throw;
}
}
}
}
And my appsettings.json file is:
{
"Logging": {
"Region": "eu-west-1",
"LogGroup": "/dev/runner",
"IncludeLogLevel": true,
"IncludeCategory": true,
"IncludeNewline": true,
"IncludeException": true,
"IncludeEventId": false,
"IncludeScopes": false,
"LogLevel": {
"Default": "Debug",
"System": "Information",
"Microsoft": "Information"
},
"Console": {
"LogLevel": {
"Default": "Error",
"System": "Information",
"Microsoft": "Information"
}
},
"Debug": {
"LogLevel": {
"Default": "Trace",
"System": "Information",
"Microsoft": "Information"
}
}
}
}
The Thread.Sleep is there to allow the console logger to catch up with itself; if you just break, you often don't see anything.
Similarly, if you quit the program at the breakpoint, the AWS logger won't flush its buffers to CloudWatch (it will just create the log stream and leave it empty), so let the program run to completion to populate the log stream itself.
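If you want the flush to happen deterministically even when the app is cut short, one option (a sketch against the DI setup above; ServiceProvider implements IAsyncDisposable in recent Microsoft.Extensions.DependencyInjection versions) is to dispose the service provider on exit, which disposes the registered logger providers and flushes their buffers:

```csharp
public static async Task Main(string[] args)
{
    // Disposing the provider on exit disposes registered logger providers,
    // flushing the AWS logger's buffered events to CloudWatch.
    await using var services = ConfigureServices(new ServiceCollection())
        .BuildServiceProvider();
    await services.GetService<App>().RunAsync();
}
```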