The situation is: we've got a number of working application instances, developed in C#. We want them to log to one place (that could be a file). As far as I know, log4net and NLog can send logs via TCP. The problem is: how do we listen for these logs and store them?
Is there any working solution to collect these logs?
In log4net you should configure an appender; see the RemotingAppender in the official documentation here.
To listen over TCP you can use a TcpListener, as in this resource; there are also existing clients for log4net, like this one.
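If you end up rolling your own receiver, a minimal sketch of a TcpListener that accepts connections and appends incoming lines to a file could look like the following (the port, file path, and one-message-per-line format are assumptions; the actual wire format depends on the appender/client you use):

using System.IO;
using System.Net;
using System.Net.Sockets;

class LogReceiver
{
    static void Main()
    {
        // Placeholder port - adjust to whatever your appender sends to.
        var listener = new TcpListener(IPAddress.Any, 4505);
        listener.Start();

        while (true)
        {
            // Blocks until a logging client connects.
            using (TcpClient client = listener.AcceptTcpClient())
            using (var reader = new StreamReader(client.GetStream()))
            using (var writer = File.AppendText("central.log"))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    // Assumes the client sends one log message per line.
                    writer.WriteLine(line);
                }
            }
        }
    }
}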
In NLog you might consider the Database target. NLog has some other targets you might consider, including the LogReceiverService, which sends logging messages to a WCF Service or Web Service, where they can be logged to any of the NLog targets, including to a file.
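A minimal sketch of setting up the Database target programmatically (NLog 4.x-style; the connection string and the Logs table schema are placeholders):

using NLog;
using NLog.Config;
using NLog.Targets;

public static class LoggingSetup
{
    public static void Configure()
    {
        var config = new LoggingConfiguration();

        var dbTarget = new DatabaseTarget
        {
            Name = "db",
            // Placeholder connection string and table/column names.
            ConnectionString = "Server=.;Database=CentralLogs;Trusted_Connection=True;",
            CommandText = "INSERT INTO Logs (Logged, Level, Logger, Message) " +
                          "VALUES (@logged, @level, @logger, @message)"
        };
        dbTarget.Parameters.Add(new DatabaseParameterInfo("@logged", "${date}"));
        dbTarget.Parameters.Add(new DatabaseParameterInfo("@level", "${level}"));
        dbTarget.Parameters.Add(new DatabaseParameterInfo("@logger", "${logger}"));
        dbTarget.Parameters.Add(new DatabaseParameterInfo("@message", "${message}"));

        config.AddTarget("db", dbTarget);
        config.LoggingRules.Add(new LoggingRule("*", LogLevel.Info, dbTarget));
        LogManager.Configuration = config;
    }
}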
In log4net you might consider the AdoNetAppender. Configuration examples are here.
I will note that, in the past, I implemented a WCF-based LoggingService (which is ultimately similar to NLog's LogReceiverService), which worked well for me.
It is better to use the Application Insights feature provided by Microsoft. It can be used from any language, not just Microsoft languages. Application Insights is divided into two parts: an SDK for instrumenting telemetry (log data), and visualization of those logs in Azure dashboards. The SDK is open source, but you have to pay for the Azure visualization tools. If you don't want to pay, use Application Insights in your code and send the logs to the Elastic Stack, which is open source.
The best architecture:
MS Application Insights -- for instrumenting logs
Apache Kafka -- acts as a pipeline and as temporary storage; send your logs to Kafka
Logstash -- a filter with which you can filter the logs
Elasticsearch -- a NoSQL store where the filtered data is stored
Kibana -- dashboards which pull data from Elasticsearch and give visualizations
You can also link Spark to the Kafka output to trigger alerts in the form of emails or text messages.
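For the instrumentation part, a minimal sketch using the Application Insights TelemetryClient (the instrumentation key is a placeholder taken from your Application Insights resource):

using System;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Extensibility;

class Telemetry
{
    static void Main()
    {
        // Placeholder instrumentation key from the Azure portal.
        TelemetryConfiguration.Active.InstrumentationKey = "00000000-0000-0000-0000-000000000000";
        var client = new TelemetryClient();

        client.TrackTrace("Application started");
        try
        {
            // ... application work ...
        }
        catch (Exception ex)
        {
            client.TrackException(ex);
        }
        client.Flush();
    }
}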
I'm working on a timed recurring process that in some cases will be deployed OnPrem and in other cases deployed in the cloud (Azure). I'm researching a Windows Service and an Azure WebJob. Given that I only need the recurring process to be the timed piece, I'm thinking about having the bulk of the logic in a library with just the entry points being different between the Windows Service for the local deployment or a WebJob when deploying to Azure. Each csproj (service and WebJob) would handle only the timed loop and configuration settings then call into the library for the bulk of the work.
My question is: Is there another design combination available to me that would fulfill these requirements potentially in a better way? I've read about wrapping an existing windows service in a WebJob, but I don't think that would be necessary in this case given I'm starting from scratch.
When it comes to keeping your common code up to date, and knowing which versions are used by which applications, the best solution is to create a class library project, following an appropriate design pattern, and turn it into a NuGet package.
You can host your own private NuGet repository, create your own packages, and host them internally within your own network.
Here is a very nice article on "How to create a NuGet package out of your class library project". You can use it and share the library across all your code.
And finally you can just call it from your Windows Service/WebJob.
Let me know if you need any help related to designing the solution.
Hope it helps.
After using Azure Mobile Services a year ago, I decided to get back to mobile development but Microsoft changed a lot in their offer and I'm actually struggling to set my project up.
My goal is to create a service with these features:
.NET backend preferred over the Javascript one (I don't like callbacks :))
SSO (Facebook, Twitter, Google+, Windows Live)
SQL Database (I really need relations, and I already have a T-SQL schema)
Push Notifications (just to Windows and Android for now and with unlimited custom channels so that I can have one channel for each user and avoid dealing with notifications' logic)
Monthly scheduled jobs to update database from an external JSON API and to remove old entries
Mobile Client (with a shared Xamarin library to handle all the data-related stuff and UWP + Android support)
Web Client (I don't have a Mac so I can't build and publish the iOS version, so a web app may be needed as a temporary replacement)
What I did was to:
Open Azure Preview Portal link
Click on New => Web + Mobile => Mobile App
Set the Resource Group with all the needed plans
Added a Data Connection to a newly created SQL Database
Added a Notification Hub with settings for GCM and WNS
Added Mobile Authentication with settings for Microsoft Account, Facebook, Twitter, Google
Created the schema for my SQL Database
Before going on, I'm not sure that this was the correct workflow, but the documentation is pretty confusing and the Get Started sections just discuss code, not how to properly set up the service and have it running, so I just did the same basic things that I would've done with the old Mobile Service, plus dealing with the SQL Database instead of the NoSQL one.
Now comes the issue: I have no idea how to move next, and even the Quickstart projects (both server and client) are not helpful (they're the old TodoItem sample working with the Mobile Service).
The first thing that I wanted to do was to create the Scheduled Job because I actually need to fill the database with the external data before moving forward.
The only thing close to what I need is the WebJob, but I can't schedule it yet, and it requires me to upload an exe file, while I'd like to be able to write my C# code directly on the server (and be able to remotely debug it).
An alternative may be to create a Compute Instance and write an endless loop doing what I need, but this would force me to manually deal with the SQL Database inside the Mobile App Service.
Another issue is related to the SQL Database. As I already wrote, the Quickstart seems to work with the NoSQL database included in the old Mobile Service, meaning that I don't have a direct connection to my SQL Database, while I'd like to be able to do something like
App.MobileService.GetTable<MyTable>()
Plus, having 10 tables, I'd also like to have a way to map them automatically (like NetBeans does for JavaEE projects).
So the question is: what's a good (or the best) workflow to get everything working as I need it or, at least, close to how I need it?
(I know that answers may be opinion-based, but they still may be useful since Microsoft's documentation is not complete)
If you have an existing database, you could use Entity Framework Code First to Existing Database. That will generate the C# classes for you.
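As a rough sketch of what the generated code ends up looking like (the entity and connection string name here are hypothetical; the real ones come from your schema and project settings):

using System.Data.Entity;

// Hypothetical entity mapped to an existing table.
public class TodoItem
{
    public int Id { get; set; }
    public string Text { get; set; }
    public bool Complete { get; set; }
}

public class AppDbContext : DbContext
{
    // "Name=MS_TableConnectionString" is an assumption about the connection
    // string name; adjust it to whatever your project defines.
    public AppDbContext() : base("Name=MS_TableConnectionString") { }

    public DbSet<TodoItem> TodoItems { get; set; }
}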
The database that you create when you add a Data Connection to your Mobile App is an Azure SQL Database by default--I'm not sure why you thought it was NoSQL?
Once you have done this, you can query your tables from the Azure Mobile Apps client SDK. For instance, in Xamarin, the quickstart project does queries as follows (see https://github.com/Azure/azure-mobile-services-quickstarts/blob/MobileApp/client/xamarin.android/ZUMOAPPNAME/ToDoActivity.cs#L126).
var list = await toDoTable.Where (item => item.Complete == false).ToListAsync ();
Finally, regarding your question on WebJobs, you can actually schedule it. See https://azure.microsoft.com/en-us/documentation/articles/web-sites-create-web-jobs/#CreateScheduledCRON for more information. Even though you Web Deploy your webjob target, you can still remote debug it. See this blog post for information: http://www.bursteg.com/remote-debugging-azure-webjobs-attach-a-debugger-from-server-explorer/
I have several running .net service processes and now I want to write a monitor application where the user can see all services and what they are currently doing.
The services write their output to a log file/event log.
The monitor application should present the services (this is easy) and then the user can choose one of the services and see what it is doing.
My idea was to poll the logfile (or set a FileSystemWatcher on it) and reload the logfile all the times.
Is there a better way / what are the alternatives?
e.g.
Is it possible to connect to the service process and have the service process raise events (with the log text) to the monitor application process (in Win32 I did that long ago, but I don't know how to do that in .NET)?
Or write to shared memory?
Or is it possible to pass messages through the C# ServiceController?
Ideally, your service can write its messages into any kind of logger interface. Let's call it IServiceLogger, which provides a method like Log(string message). Imagine that your service has a list of loggers implementing this interface and writes to each of them.
With this, you can plug in any implementation of that interface to retrieve the logs from the service. This could be a ServiceFileLogger implementation that writes to the file system, as well as a NetworkLogger which sends those log messages to all connected clients.
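A minimal sketch of that idea (ServiceFileLogger is shown; the NetworkLogger would implement the same interface and push the message to its connected clients instead):

using System;
using System.Collections.Generic;
using System.IO;

public interface IServiceLogger
{
    void Log(string message);
}

// Writes each message to a log file on disk.
public class ServiceFileLogger : IServiceLogger
{
    private readonly string _path;

    public ServiceFileLogger(string path) { _path = path; }

    public void Log(string message)
    {
        File.AppendAllText(_path, DateTime.Now + " " + message + Environment.NewLine);
    }
}

// Inside the service: fan each message out to every registered logger.
public class ServiceLog
{
    private readonly List<IServiceLogger> _loggers = new List<IServiceLogger>();

    public void AddLogger(IServiceLogger logger) { _loggers.Add(logger); }

    public void Log(string message)
    {
        foreach (var logger in _loggers)
            logger.Log(message);
    }
}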
Regarding the NetworkLogger, you still have to do some work to handle the connection between the monitoring clients and the server (the process which receives the logs and sends them to the clients) but you could build really smart monitoring tools and deploy them in your company network or even build a website showing live logs.
Tip:
If you want a more standardized solution, I'd highly recommend taking a look at log4net, which basically does the same thing internally but has a huge set of features on top. With log4net you can use (or create) a bunch of different log appenders which receive these messages and can process them further, like sending emails, inserting into a database, writing into files, etc. So I think you can find a matching appender or create your own for your monitoring clients.
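For the monitoring scenario, a custom appender is only a small class (a sketch; what you do inside Append, e.g. pushing the rendered message to connected monitors, is up to you):

using log4net.Appender;
using log4net.Core;

// Hypothetical appender that forwards each log event to the monitoring side.
public class MonitorAppender : AppenderSkeleton
{
    protected override void Append(LoggingEvent loggingEvent)
    {
        string message = RenderLoggingEvent(loggingEvent);
        // Forward 'message' to the monitor application here,
        // e.g. via the NetworkLogger idea described above.
    }
}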
To achieve that you can use Windows Communication Foundation. You can expose an event in your service, and your monitor can bind to this event.
This method depends on the implementation, but you can learn about WCF here: https://msdn.microsoft.com/en-us/library/ms734712(v=vs.110).aspx
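One way to model this in WCF is a duplex contract, where the monitor subscribes with a callback that the service invokes for each log message. A minimal sketch (all names are hypothetical, and you would still need a duplex-capable binding such as NetTcpBinding):

using System.ServiceModel;

// Implemented by the monitor application; the service calls back into it.
public interface ILogCallback
{
    [OperationContract(IsOneWay = true)]
    void OnLogMessage(string message);
}

[ServiceContract(CallbackContract = typeof(ILogCallback))]
public interface ILogMonitorService
{
    [OperationContract]
    void Subscribe();
}

[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single)]
public class LogMonitorService : ILogMonitorService
{
    private ILogCallback _subscriber;

    public void Subscribe()
    {
        // Capture the caller's callback channel so log messages can be pushed later.
        _subscriber = OperationContext.Current.GetCallbackChannel<ILogCallback>();
    }

    // Called by the service whenever it writes a log line.
    public void Publish(string message)
    {
        if (_subscriber != null)
            _subscriber.OnLogMessage(message);
    }
}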
I recently found a .NET tutorial that showed me how to make a simple chat application in Visual Studio using the .NET library SignalR.
I have created the application and made a few modifications, including some AES encryption. I have then hosted it on Windows Azure as a website.
The chat application works and has been tested, but I don't know how to view the messages sent from one user to the other.
Can someone tell me where I can find these message streams?
Thanks
All messages will go through the SignalR Hub, so you could either use breakpoints/debugging to see what is sent (if you are running in your development environment), or you could use tracing in the Hub to see the messages.
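For example, the hub method that relays chat messages could also write each one to a trace listener (a sketch; the hub and method names depend on the tutorial you followed):

using System.Diagnostics;
using Microsoft.AspNet.SignalR;

public class ChatHub : Hub
{
    public void Send(string name, string message)
    {
        // Log the message before broadcasting it to all connected clients.
        Trace.WriteLine(string.Format("{0}: {1}", name, message));
        Clients.All.addMessage(name, message);
    }
}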
There is an extension to Glimpse that you might use (I haven't tested it though): http://www.nuget.org/packages/Glimpse.SignalR
You can get some performance counters using the tools provided by Microsoft.
As far as I know, there is no way to get all messages sent or received from the server. When you need to analyze the communication of specific clients, try using Wireshark or Microsoft Network Monitor.
You could also write your own message logger that writes them to file.
You can use common methods such as Trace.WriteLine to write info to the default listener, then use a listener to write to a file / database.
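For example, a minimal sketch (the file name is a placeholder):

using System.Diagnostics;

class TraceSetup
{
    static void Main()
    {
        // Send everything written via Trace to a text file as well.
        Trace.Listeners.Add(new TextWriterTraceListener("messages.log"));
        Trace.AutoFlush = true;

        Trace.WriteLine("Hub received a message");
    }
}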
This source may be useful when learning to analyze messages from a webserver.
Fiddler is the best tool for monitoring web traffic. It's stupid easy to use and has a lot of extended features to help debug server/client applications. Take a look at this page which includes a brief description of using Fiddler with signalr.
Good luck!
I have a project consisting of a Windows client (approx. 150 users), a webservice and some Windows services, all working together in an intranet and built using C# .NET 3.5. Now I want to log exceptions in a central database and manage them (watch top 10, ticket system, etc.) via a web application.
I thought about using and extending ELMAH, because it already has a web application for management. Maybe create a webservice for the clients to log their exceptions.
Is that a good idea, given that ELMAH is obviously intended for ASP.NET web sites only?
I am aware of the Exception Management Application Block, but as far as I know it has no management application like ELMAH, plus my last visit at the Enterprise Library was no fun.
What are your opinions? Are there other ideas?
Enterprise Library is cumbersome and overkill. Look at open source logging components: NLog or log4net. They both have the capability to log to various "sinks" including a flat file, UDP, database, etc.
I would set something up where your logging component writes to the event log on the server. Then use something like Microsoft Operations Manager (MOM) or another systems management software that can scan the event log and raise alerts via paging, command-center console, etc. At the same time, you could also log to a database for querying, etc.
If you are looking for management of exceptions, reporting, alerting, etc., there are tons of solutions like MS MOM, Tivoli, CA Unicenter, HP OpenView, and even Nagios that you could use for this.
The client side is a bit more tricky. Since it is an intranet, you could use UDP and run a service on the server that listens for those UDP packets and stores them in the event log and/or a database. Or you could add some methods to your web service to capture logging events.
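A rough sketch of that server-side listener (the port and event log source are placeholders, and the event source has to be registered once with administrative rights):

using System.Diagnostics;
using System.Net;
using System.Net.Sockets;
using System.Text;

class UdpLogListener
{
    static void Main()
    {
        // Placeholder port - must match the UDP appender/target used by the clients.
        using (var udp = new UdpClient(7071))
        {
            var remote = new IPEndPoint(IPAddress.Any, 0);
            while (true)
            {
                byte[] datagram = udp.Receive(ref remote);
                string message = Encoding.UTF8.GetString(datagram);

                // Write to the event log (source must already exist) and/or a database.
                EventLog.WriteEntry("MyAppLogging", message, EventLogEntryType.Information);
            }
        }
    }
}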
I don't think that your idea to expand ELMAH is a bad one at all. Having done many similar projects I've always had to roll my own management apps and it is always a pain. Not sure how much you will be able to use from ELMAH but it sounds like it might be a great starting place.