I am using Seq, file, and JSON as Serilog sinks:
Log.Logger = new LoggerConfiguration()
    .Enrich.With(new ThreadIdEnricher())
    //.Enrich.FromLogContext()
    .WriteTo.RollingFile(@"C:\QRT\Logs\QRT-LOG.txt", LogEventLevel.Information)
    .WriteTo.Seq("http://localhost:5341")
    .WriteTo.Console(restrictedToMinimumLevel: LogEventLevel.Information)
    .WriteTo.File(new CompactJsonFormatter(), "C:/QRT/Logs/log.clef")
    .CreateLogger();
Seq is for me, because it looks like it would be really useful.
JSON I may do away with... I was attempting to write a file that I could import into Access. The point is that I need my non-developer friend to be able to see the logs, and Access is a tool I believe he can use to easily filter on items such as Customer ID. I have not been able to find much documentation on the Serilog sinks other than their names. Can someone suggest either a mechanism to sink to something that can be imported into Access, or another sink that a user-friendly tool can read?
I am currently using NLog and GamutLogViewer, which is awesome because it can color entries based on regular expressions!
Any suggestions would be most welcome. The idea is my friend is not looking at the logs to debug. He will be looking at the "Information" contained in the logs.
This is using C# on a console app in Windows.
Thanks
-Ed
Serilog has a sink called Serilog.Sinks.NLog, which adapts Serilog to write events through your existing NLog infrastructure. That means you can effectively use Serilog throughout your app but output log files in the NLog format, which would be readable by GamutLogViewer (or YALV!, as an alternative).
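A minimal sketch of that setup (assuming the Serilog.Sinks.NLog package; its WriteTo.NLog() extension forwards events to whatever targets your NLog.config defines):

using Serilog;

// Serilog pipeline that hands every event to the existing NLog targets;
// the NLog configuration (not shown) controls the actual file layout.
Log.Logger = new LoggerConfiguration()
    .Enrich.With(new ThreadIdEnricher())
    .WriteTo.NLog()
    .CreateLogger();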
Another approach I can think of is to use the sink Serilog.Sinks.MSSqlServer, where you write your logs to a SQL Server table (it could even be a SQL Server Express instance on the user's machine, if you don't want or don't have a shared SQL Server) and then use Microsoft Access to query those logs via linked tables in Access.
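A hedged sketch, assuming an older Serilog.Sinks.MSSqlServer version whose extension takes the connection string and table name directly (newer versions move these into a sink-options object):

using Serilog;

// Writes each event as a row in the Logs table; autoCreateSqlTable
// saves you from creating the schema by hand. The connection string
// is a placeholder for the user's SQL Server Express instance.
Log.Logger = new LoggerConfiguration()
    .WriteTo.MSSqlServer(
        connectionString: @"Server=.\SQLEXPRESS;Database=LogDb;Trusted_Connection=True;",
        tableName: "Logs",
        autoCreateSqlTable: true)
    .CreateLogger();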
Ultimately, you could develop your own sink that writes directly to a .csv file, or even directly to an Access .accdb file. Developing sinks for Serilog is super easy, and there are tons of examples you can use as a base for your custom sink.
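As a rough illustration of how small a sink can be, here is a sketch of a CSV sink (the file path and column choices are mine, not from any package):

using System;
using System.IO;
using Serilog.Core;
using Serilog.Events;

// Appends one CSV row per event: timestamp, level, rendered message.
class CsvSink : ILogEventSink
{
    readonly string _path;
    public CsvSink(string path) { _path = path; }

    public void Emit(LogEvent logEvent)
    {
        // Double any quotes so embedded commas/quotes don't break columns.
        var line = string.Format("{0:O},{1},\"{2}\"",
            logEvent.Timestamp, logEvent.Level,
            logEvent.RenderMessage().Replace("\"", "\"\""));
        File.AppendAllText(_path, line + Environment.NewLine);
    }
}

Register it with .WriteTo.Sink(new CsvSink(@"C:\QRT\Logs\log.csv")), and Access can import the resulting file directly.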
Related
I have an ASP.NET Core Web API written in C#, with docker-compose, Elasticsearch, Serilog, and Kibana. I plan on removing Kibana from the docker-compose.yml file. Serilog generates the log files, and after configuring a sink to Elasticsearch it writes the logs where Elasticsearch can read them. How do I go about reading those logs, now in Elasticsearch, without having to go to Kibana to view and read them?
Are there any recommendations for documentation and/or a package for this, or is this something that needs to be programmed from scratch?
Suggestion attempt:
I went to download Kafka, then opened PowerShell as an admin and did a wget (url). After it downloaded, I ran tar -xzf kafka_2.13-2.8.0.tgz and cd kafka_2.13-2.8.0. I then followed what you advised to activate the ZooKeeper broker and Kafka, and then created the topic. However, for each step you said to do, nothing happened. When I tried to activate ZooKeeper, it would ask me how I want to open the file, so I would just hit ESC and then run the other commands, but the same thing would come up. Should it be doing that?
You can use one of the two official .NET clients for Elasticsearch.
There is a low-level and a high-level client; you can read more about the difference, and how to use each one, in the official documentation.
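A minimal sketch with the high-level client, NEST (the index name mirrors the Logstash configuration further down, and the LogEntry shape is an assumption; match it to what your sink actually writes):

using System;
using Nest;

// Connect to the local cluster and pull the latest documents.
var settings = new ConnectionSettings(new Uri("http://localhost:9200"))
    .DefaultIndex("yourindexname");
var client = new ElasticClient(settings);

var response = client.Search<LogEntry>(s => s
    .Query(q => q.MatchAll())
    .Size(50));

foreach (var hit in response.Documents)
    Console.WriteLine($"{hit.Timestamp} [{hit.Level}] {hit.Message}");

// POCO shape depends on what your sink actually writes.
public class LogEntry
{
    public DateTime Timestamp { get; set; }
    public string Level { get; set; }
    public string Message { get; set; }
}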
Make use of log4net as the log provider, with its Kafka appender. The appender will produce your operation logs, at every level, to a topic; Logstash then consumes and ingests these logs into your Elasticsearch index as output.
There are many advantages to this roadmap: you get a super powerful stream processor in Apache Kafka, whose queue-based messaging helps you reliably trace every log it produces, and Logstash, where you can add more stream processing and filters (like grok), have multiple outputs, and even store your logs as CSV or on the file system.
First, activate ZooKeeper and the Kafka broker, and create a consumer with some topic name, from the bin directory of the downloaded Kafka files (on Windows, use the .bat equivalents in bin\windows):
Activate ZooKeeper:
./zookeeper-server-start.sh ../config/zookeeper.properties
Activate the Kafka broker:
./kafka-server-start.sh ../config/server.properties
Create the topic:
./kafka-topics.sh --create --topic test-topic --zookeeper localhost:2181 --replication-factor 1 --partitions 4
Activate a console consumer on the created topic:
./kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test-topic
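Before wiring up the appender, you can sanity-check that messages reach the topic with a minimal producer; this sketch uses the Confluent.Kafka NuGet client (my choice for illustration, not part of the log4net setup):

using System.Threading.Tasks;
using Confluent.Kafka;

class ProducerSmokeTest
{
    static async Task Main()
    {
        var config = new ProducerConfig { BootstrapServers = "localhost:9092" };
        using var producer = new ProducerBuilder<Null, string>(config).Build();

        // One JSON message, shaped like what a log appender would send.
        await producer.ProduceAsync("test-topic",
            new Message<Null, string> { Value = "{\"level\":\"INFO\",\"message\":\"hello\"}" });
    }
}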
Then add the log appender that produces to the created topic (this part is up to you), and after that create a Logstash pipeline such as the configuration below:
input {
  kafka {
    group_id => "35834"
    topics => ["yourtopicname"]
    bootstrap_servers => "localhost:9092"
    codec => json
  }
}
filter {
}
output {
  file {
    # Use forward slashes on Windows, and point at a file, not a bare directory.
    path => "C:/somedirectory/logstash-output.log"
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    document_type => "_doc"
    index => "yourindexname"
  }
  stdout { codec => rubydebug }
}
And run it with the usual command from the bin directory of Logstash:
./logstash -f yourconfigurationfile.conf
Please note that you should create the index before activating Logstash. Moreover, you do not need to design a mapping for your output index: as soon as the first document is inserted, Elasticsearch will create a mapping for all relevant fields in your index.
Currently the application's Logger writes to the console by using the AddConsole() method.
How can it be set to write to a file instead?
For example, to the path "c:\workspace\TestProject\Log.txt":
logger = new LoggerFactory()
    .AddConsole()
    .CreateLogger("Msg");
I guess you are using Microsoft.Framework.Logging.
Out-of-the-box implementations are provided for basic console logging and a few other targets; you'll need a logging back-end like Serilog or NLog to gain the kind of functionality you're requesting.
I would recommend you use NLog (just personal preference):
Install-Package NLog
then add to your code
loggerFactory.AddNLog(new global::NLog.LogFactory());
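To get the file output you asked about, NLog also needs a file target; a minimal programmatic sketch (on recent NLog versions that have LoggingConfiguration.AddRule; the target name and level bounds are my choices):

using NLog.Config;
using NLog.Targets;

// Route everything from Info up to Fatal into the requested file.
var config = new LoggingConfiguration();
var fileTarget = new FileTarget("file")
{
    FileName = @"c:\workspace\TestProject\Log.txt"
};
config.AddRule(NLog.LogLevel.Info, NLog.LogLevel.Fatal, fileTarget);
NLog.LogManager.Configuration = config;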
https://github.com/aspnet/Logging/tree/dev/samples/SampleApp
http://nlog-project.org/
Can anyone let me know how to implement logging with Enterprise Library 6.0 in C#? I want to log to the database if it is available; otherwise, log the exceptions, information, and messages into a LOG file.
I will have both the database and file logging configuration in App.config/Web.config.
So please help me with how to implement logging dynamically based on a runtime value:
If the DB is available and accessible, log there; otherwise, if the DB is not accessible, log to the log file or Event Viewer.
The new version 6 makes comprehensive use of the factory pattern, hence you need to set the logger up differently in version 6:
Try the following:
// Load the logging configuration from App.config/Web.config.
IConfigurationSource configsrc = ConfigurationSourceFactory.Create();
LogWriterFactory logWriterFactory = new LogWriterFactory(configsrc);
// Set the static log writer once at startup, then write to a category.
Logger.SetLogWriter(logWriterFactory.Create());
Logger.Write("logtest", "General");
Your description of your database logging requirements isn't quite clear, but I think these code examples and links should be what you are looking for.
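For the runtime fallback itself, a hedged sketch (the "Database" and "File" category names are hypothetical; they must match categories you route to the corresponding trace listeners in App.config):

using System.Data.SqlClient;

// Probe the database once, then route entries to whichever
// category (and therefore listener) is actually usable.
string connectionString = "..."; // your DB connection string from config
bool dbAvailable;
try
{
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();
        dbAvailable = true;
    }
}
catch (SqlException)
{
    dbAvailable = false;
}
Logger.Write("logtest", dbAvailable ? "Database" : "File");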
I am developing a C# application which runs as a Windows service.
Whatever transactions we do in the application, I write to a log file.
A log directory is added in the app.config file as below:
<add key ="LogDir" value="log" />
<add key ="LogLevel" value="2" />
And in the C# code the above is accessed as below:
int logLevel = Convert.ToInt32(ConfigurationManager.AppSettings["logLevel"]);
if (logLevel > 0)
{
logger = new Logger();
logger.TraceLevel = logLevel - 1;
logger.logDir = ConfigurationManager.AppSettings["logDir"];
logger.logFileBaseName = "touchserver";
}
And then, when any process happens, I write the data to the log as below:
TouchServer.Log(Logger.MessageType.Trace, 1, "Item successfully deleted");
When I run my application in debug mode (I mean as a console application), the log file is created in the application's debug folder and the data is written into it.
But my problem is that when I install my application as a service, the log file is not created in the debug folder, and I am unable to see the actions performed in case anything went wrong.
Please help me find a solution for this.
I am installing the service using the Installutil command.
Thanks in advance
sangita
While you could dig into why this is not working and fix it, overall there is no need to implement a logging component yourself.
There are excellent free libraries available that do this very well; log4net is very popular. It is easy to use, feature-rich, and efficient. Take a look at it.
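A hedged starter sketch with log4net (the path is a placeholder, and programmatic configuration is just one option; most setups use an XML config instead):

using log4net;
using log4net.Appender;
using log4net.Config;
using log4net.Layout;

// Configure a file appender in code and write one test entry.
var layout = new PatternLayout("%date [%thread] %-5level %logger - %message%newline");
layout.ActivateOptions();
var appender = new FileAppender
{
    File = @"C:\Logs\touchserver.log", // placeholder path
    AppendToFile = true,
    Layout = layout
};
appender.ActivateOptions();
BasicConfigurator.Configure(appender);

ILog log = LogManager.GetLogger(typeof(Program));
log.Info("Service started");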
But my problem is that when I install my application as a service, the log file is not created in the debug folder, and I am unable to see the actions performed in case anything went wrong.
Check the results of the I/O operations by using Process Monitor. I suspect you'll find the identity being used to run the service process does not have write permission where it is trying to write the log file.
But the better option is to use an existing logging library, as Hemal suggests.
I'm trying to debug a webpart installed on a client's SharePoint instance. I wanted a quick and easy logging feature, so I thought of writing messages to a text file in the temp directory. SharePoint doesn't seem to like it, so what are my options?
If you are writing to the temp directory, you will need to grant rights on the file (if it exists) or the directory to the IIS application pool that the SharePoint IIS application is running under.
There are a few ways of doing custom logging in SharePoint:
Use SPDiagnosticsService - you may write to the ULS via the SPDiagnosticsService class (see the sketch after the links below).
Utilize the diagnostics.asmx web service -
SharePointDiagnostics SharePointDiagnosticsObject = new SharePointDiagnostics();
SharePointDiagnosticsObject.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
string Response = SharePointDiagnosticsObject.SendClientScriptErrorReport(message, file, line, client, stack, team, originalFile);
For more details on the usage of diagnostics.asmx, refer to the following link:
https://vivekkumar11432.wordpress.com/2016/09/23/how-to-do-logging-in-uls-from-csom-in-c/
For more details on logging in general, refer to the following link:
http://www.codeproject.com/Articles/620996/Five-suggestions-to-implement-a-better-logging-in
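As referenced above, a hedged sketch of the SPDiagnosticsService option (the category name and message are my own, not a SharePoint default):

using Microsoft.SharePoint.Administration;

// Write one trace line to the ULS logs under a custom category.
SPDiagnosticsService diagnostics = SPDiagnosticsService.Local;
diagnostics.WriteTrace(
    0,
    new SPDiagnosticsCategory("MyWebPart", TraceSeverity.Medium, EventSeverity.Information),
    TraceSeverity.Medium,
    "Customer lookup failed: {0}",
    new object[] { "details here" });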
Don't use
Microsoft.Office.Server.Diagnostics.PortalLog.LogString("Message");
According to the Microsoft documentation, LogString is reserved for internal use and is not intended to be used directly from your code.
I would guess that this is a permissions issue that SharePoint is blocking you on (and probably not telling you about). When you try to write to a text file on the server, you need elevated permissions to do it. You can accomplish this using SPSecurity.RunWithElevatedPrivileges. Something like the following, if you want just a simple, small-code solution:
SPSecurity.RunWithElevatedPrivileges(delegate()
{
    using (StreamWriter sw = new StreamWriter(@"C:\log.txt"))
    {
        // log information here
    }
});
Try a logging framework like log4net, or write a small logging framework that writes into an external database. You could also use lists to log if you want to stay inside SharePoint; a sketch of that follows below.
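A hedged sketch of the list idea, assuming a custom list named "ApplicationLog" with a "Message" column (both names are mine):

using Microsoft.SharePoint;

// Append one item per log entry to a list in the current site.
SPSecurity.RunWithElevatedPrivileges(delegate()
{
    using (SPSite site = new SPSite(SPContext.Current.Site.ID))
    using (SPWeb web = site.OpenWeb())
    {
        web.AllowUnsafeUpdates = true;
        SPListItem entry = web.Lists["ApplicationLog"].Items.Add();
        entry["Title"] = "WebPart error";
        entry["Message"] = "details here";
        entry.Update();
        web.AllowUnsafeUpdates = false;
    }
});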