I have a project consisting of a Windows client (approx. 150 users), a web service and some Windows services, all working together in an intranet and built using C# .NET 3.5. Now I want to log exceptions to a central database and manage them (watch the top 10, ticket system, etc.) via a web application.
I thought about using and extending ELMAH, because it already has a web application for management. Maybe I'd create a web service for the clients to log their exceptions through.
Is that a good idea, given that ELMAH is obviously intended for ASP.NET web sites only?
I am aware of the Exception Management Application Block, but as far as I know it has no management application like ELMAH's, plus my last visit to the Enterprise Library was no fun.
What are your opinions? Are there other ideas?
Enterprise Library is cumbersome and overkill. Look at the open source logging components NLog or log4net. They both have the capability to log to various "sinks", including a flat file, UDP, a database, etc.
I would set something up where your logging component writes to the event log on the server. Then use something like Microsoft Operations Manager (MOM) or other systems-management software that can scan the event log and raise alerts via paging, a command-center console, etc. At the same time, you could also log to a database for querying.
If you are looking for management of exceptions, reporting, alerting, etc., there are plenty of solutions like MS MOM, Tivoli, CA Unicenter, HP OpenView, and even Nagios that you could use for this.
The client side is a bit trickier. Since it is an intranet, you could use UDP and run a service on the server that listens for those UDP packets and stores them in the event log and/or a database. Or you could add some methods to your web service to capture logging events.
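To make the UDP idea concrete, here is a minimal sketch of the server-side listening service; the one-entry-per-datagram text format and the `Store` hook are my own assumptions, not part of any library:

```csharp
using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

public class UdpLogListener
{
    private readonly UdpClient client;

    public UdpLogListener(int port)
    {
        client = new UdpClient(port);
    }

    // Normalizes one datagram into a storable line (assumed format).
    public static string FormatEntry(IPAddress source, string entry)
    {
        return source + "\t" + entry.TrimEnd('\r', '\n');
    }

    // Blocks forever; run this on a background thread inside the Windows service.
    public void Listen()
    {
        var remote = new IPEndPoint(IPAddress.Any, 0);
        while (true)
        {
            byte[] datagram = client.Receive(ref remote);
            string entry = Encoding.UTF8.GetString(datagram);
            Store(FormatEntry(remote.Address, entry));
        }
    }

    // Placeholder: write to the event log and/or INSERT into the central database.
    protected virtual void Store(string line)
    {
        Console.WriteLine(line);
    }
}
```

Since UDP is lossy, treat this as best-effort telemetry; anything that must not be dropped belongs on the web service path instead.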
I don't think that your idea to expand ELMAH is a bad one at all. Having done many similar projects I've always had to roll my own management apps and it is always a pain. Not sure how much you will be able to use from ELMAH but it sounds like it might be a great starting place.
The situation is: we've got a number of working application instances, developed in C#. We want them to log to one place (that could be a file). As far as I know, log4net and NLog can send logs via TCP. The problem is: how do we listen for these logs and store them?
Is there any working solution for collecting these logs?
In log4net you should configure an appender; see the RemotingAppender in the official documentation.
To listen on TCP you can use a TcpListener; there are also existing log-receiver clients for log4net.
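A rough sketch of the listening side, assuming a simple newline-delimited text protocol rather than log4net's actual remoting wire format:

```csharp
using System;
using System.IO;
using System.Net;
using System.Net.Sockets;

public class TcpLogReceiver
{
    private readonly TcpListener listener;
    private readonly string logFile;

    public TcpLogReceiver(int port, string logFile)
    {
        listener = new TcpListener(IPAddress.Any, port);
        this.logFile = logFile;
    }

    // Blocks forever; handles one sender at a time for simplicity.
    public void Run()
    {
        listener.Start();
        while (true)
        {
            using (TcpClient client = listener.AcceptTcpClient())
            using (var reader = new StreamReader(client.GetStream()))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    // Append to the shared log file (or INSERT into a database).
                    File.AppendAllText(logFile, line + Environment.NewLine);
                }
            }
        }
    }
}
```

A production receiver would accept clients on worker threads and serialize the file writes, but this shows the shape of the solution.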
In NLog you might consider the Database target. NLog has some other targets you might consider, including the LogReceiverService, which sends logging messages to a WCF Service or Web Service, where they can be logged to any of the NLog targets, including to a file.
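For reference, an NLog Database target configuration might look roughly like this (the `LogEntries` table and its columns are assumptions; adjust to your own schema and connection string):

```xml
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <targets>
    <target name="db" xsi:type="Database"
            connectionString="Data Source=.;Initial Catalog=Logs;Integrated Security=True"
            commandText="INSERT INTO LogEntries (Logged, Level, Logger, Message)
                         VALUES (@logged, @level, @logger, @message)">
      <parameter name="@logged" layout="${date}" />
      <parameter name="@level" layout="${level}" />
      <parameter name="@logger" layout="${logger}" />
      <parameter name="@message" layout="${message}" />
    </target>
  </targets>
  <rules>
    <logger name="*" minlevel="Info" writeTo="db" />
  </rules>
</nlog>
```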
In log4net you might consider the AdoNetAppender; the official documentation includes configuration examples.
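For log4net, a trimmed AdoNetAppender configuration might look like this (the `LogEntries` table and its columns are assumptions):

```xml
<log4net>
  <appender name="AdoNetAppender" type="log4net.Appender.AdoNetAppender">
    <bufferSize value="10" />
    <connectionType value="System.Data.SqlClient.SqlConnection, System.Data" />
    <connectionString value="Data Source=.;Initial Catalog=Logs;Integrated Security=True" />
    <commandText value="INSERT INTO LogEntries (Logged, Message) VALUES (@logged, @message)" />
    <parameter>
      <parameterName value="@logged" />
      <dbType value="DateTime" />
      <layout type="log4net.Layout.RawTimeStampLayout" />
    </parameter>
    <parameter>
      <parameterName value="@message" />
      <dbType value="String" />
      <size value="4000" />
      <layout type="log4net.Layout.PatternLayout">
        <conversionPattern value="%message" />
      </layout>
    </parameter>
  </appender>
  <root>
    <level value="INFO" />
    <appender-ref ref="AdoNetAppender" />
  </root>
</log4net>
```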
I will note that, in the past, I implemented a WCF-based LoggingService (which is ultimately similar to NLog's LogReceiverService), which worked well for me.
It is better to use the Application Insights feature provided by MS. It can be used from any language, not just Microsoft languages. Application Insights is divided into two parts: an SDK for instrumenting telemetry (log data), and visualization of those logs in Azure dashboards. The SDK is open source, but you pay for the Azure visualization tools. If you don't want to pay, use Application Insights in your code and send the logs to the Elastic Stack, which is open source.
A possible architecture:
MS Application Insights -- for instrumenting logs
Apache Kafka -- acts as a pipeline and temporary storage; send your logs to Kafka
Logstash -- a filter with which you can filter the logs
Elasticsearch -- a NoSQL database where the filtered data is stored
Kibana -- dashboards which pull data from Elasticsearch and give visualizations
You can also link Spark to the Kafka output to trigger alerts in the form of emails or text messages.
I've got a chat application (web service) running on a website hosted by a web farm, and I don't know how to temporarily store the chat messages. I'm using long polling to save resources, and I have specified a shared machine key.
Because it's running on a web farm, HttpApplicationState won't work, and saving each message to my database would cause a lot of overhead; I doubt that would be a good idea.
So is there any other approach to saving the messages in server "memory", note: within a web farm?
The classic solution to this is to use a distributed cache; it's not as popular in the .NET world as it might be, but there's an article on MSDN about it. Microsoft has a product, or you can use the open source Memcached, for which you can also get .NET client libraries and Windows versions.
Please note that while distributed caching is very cool when it works, it does introduce a lot of additional complexity, and exciting new ways for bugs to creep into your app. I'd only go down this route if I really, really needed to.
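Whichever cache you choose, it helps to code against a small abstraction so every farm node talks only to the shared store. A minimal sketch (the interface and the in-memory stand-in below are illustrative only; on the farm you would back `IChatStore` with the distributed cache):

```csharp
using System;
using System.Collections.Generic;

// The app codes against this; a Memcached-backed implementation
// would replace the in-memory stand-in across the farm.
public interface IChatStore
{
    void Append(string room, string message);
    IList<string> MessagesSince(string room, int index);
}

public class InMemoryChatStore : IChatStore
{
    private readonly Dictionary<string, List<string>> rooms =
        new Dictionary<string, List<string>>();
    private readonly object sync = new object();

    public void Append(string room, string message)
    {
        lock (sync)
        {
            List<string> list;
            if (!rooms.TryGetValue(room, out list))
            {
                list = new List<string>();
                rooms[room] = list;
            }
            list.Add(message);
        }
    }

    // Returns the messages a long-polling client has not seen yet.
    public IList<string> MessagesSince(string room, int index)
    {
        lock (sync)
        {
            List<string> list;
            if (!rooms.TryGetValue(room, out list) || index >= list.Count)
                return new List<string>();
            return list.GetRange(index, list.Count - index);
        }
    }
}
```

The long-polling handler would call `MessagesSince` with the last index the client acknowledged; only the backing store changes when you move from one box to the farm.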
I found some more help on the topic in an article that introduces different caching techniques, without the use of third-party software.
I have a .NET (C#) WPF application which runs on different clients.
I would like to track the usage, metrics, errors, etc. of the application (with the clients' permission, of course) and have this information sent back for further analysis.
I'm talking about something like Google Analytics, but for a client application rather than a web site.
I'm currently looking for very basic stuff like errors and crashes of the application, application start, application exit, and, because my application is built with navigation (not SDI or MDI), when a screen is navigated to and navigated away from.
Because this is a client application, and some clients are not always connected to the internet, I think I'll have to cache the data and send it once a connection exists.
Has anyone seen something like this (that costs less than $100)?
Is anyone else interested in such a capability?
Thank you very much,
Ido.
I developed something that basically did what you want on a project once. I just used an AOP library (I think I used PostSharp but there are quite a few libraries out there now that are free) that tracked when a form was opened/closed, when errors were thrown, etc. We just stored the info in a text file which was uploaded to an FTP server when the application was started the next time. We mostly used it for error reports (it only sent if the application crashed) but you could do something like that for metrics gathering purposes.
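To make the buffer-and-upload idea concrete, here is a minimal sketch of the local text-file buffer; the file format is an assumption, and the actual upload (FTP or a web service call) is left out:

```csharp
using System;
using System.IO;

// Append usage/error events to a local text file; on the next application
// start, ship the previous file's contents and start a fresh buffer.
public class UsageLog
{
    private readonly string path;

    public UsageLog(string path)
    {
        this.path = path;
    }

    public void Record(string eventName, string detail)
    {
        string line = string.Format("{0:o}\t{1}\t{2}",
            DateTime.UtcNow, eventName, detail);
        File.AppendAllText(path, line + Environment.NewLine);
    }

    // Call at startup: returns the pending contents for upload and
    // truncates the buffer. The upload itself (FTP, web service) is out of scope.
    public string DrainPending()
    {
        if (!File.Exists(path)) return string.Empty;
        string pending = File.ReadAllText(path);
        File.Delete(path);
        return pending;
    }
}
```

An AOP library (PostSharp or similar) would then call `Record` from navigation and exception aspects, so the tracking code stays out of the screens themselves.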
I have a question that is perhaps slightly towards architecture and design rather than a syntax problem...
I have a C# winforms app on my desktop, which I have built, that is similar to a CRM. I make updates to customers' records as they submit orders/invoices etc., and what I would like to do is build (or buy) a module that will update a remote database that sits behind a website onto which registered clients log in. I want the clients to be able to see the status of their invoice/purchase etc. as I have updated it in the winforms app.
I can think of a couple of options off the top of my head, but would like to know more if you have done something similar.
Things I am considering are:
>Replication - I think this is overkill as the updates are not huge, are one-way only, and it's not critical that they happen in real time; also, I am running SQL Express on the winforms app. This can be changed, but I'd rather not.
>Create a text/XML file that gets created and uploaded to the web server, to a location that is monitored every 5 minutes and then updates the web database. I am not hosting the website myself, so I do not have complete control over what I can install, but I suspect I can install a .NET 'FileSystemWatcher'.
Anyway, I would appreciate your thoughts on my 'problem'.
thanks
I think your best bet is to create a web service of some kind (I like using ServiceStack.net to create simple REST ones, much cleaner imo than WCF). This will sit on the server and be responsible for the server-side sync piece.
On the client, you could either have the winforms app fire off the call to the web service based on some threshold of activity, or you could have a windows service that you install with the winforms app which does it in a scheduled job or on a timer.
You'll want to be sure that you're doing all your calls over SSL of course, and make sure you're authenticating the clients, but that's the basic architectural approach I'd take.
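As a sketch of the client-side piece (the URL, JSON shape and `X-Api-Key` header are all hypothetical; a real ServiceStack service would define typed request DTOs instead of hand-rolled JSON):

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;

// POST an invoice-status update to a REST endpoint over SSL.
public static class InvoiceSync
{
    public static string BuildPayload(int invoiceId, string status)
    {
        // Hand-rolled JSON keeps the sketch dependency-free (.NET 3.5 era).
        return string.Format("{{\"invoiceId\":{0},\"status\":\"{1}\"}}",
            invoiceId, status);
    }

    public static void Push(string url, string apiKey, string payload)
    {
        var request = (HttpWebRequest)WebRequest.Create(url); // use an https:// URL
        request.Method = "POST";
        request.ContentType = "application/json";
        request.Headers["X-Api-Key"] = apiKey; // hypothetical auth scheme
        byte[] body = Encoding.UTF8.GetBytes(payload);
        request.ContentLength = body.Length;
        using (Stream s = request.GetRequestStream())
            s.Write(body, 0, body.Length);
        using (request.GetResponse()) { }
    }
}
```

The winforms app (or a companion windows service on a timer) would call `Push` whenever the activity threshold is reached.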
I'm going to develop a POS system for a medium-scale company, and the requirement is to keep the data current across all of their branches.
In my mind, moving the server from local to the web would solve this problem, but I have never set up an online server for a Windows application.
May I know what the best option is for a secure database? Can SQL Server handle this well? I tried to Google it, but none of the results were what I wanted. What would you do when facing this problem?
My coding knowledge is just VB and C#, plus SQL for the database. I would like to learn something new if there is a better option.
I want the database to be impossible to access anonymously and stored securely at the back end only.
What you probably want to do is create a series of services exposed on the internet and accessed by your application. All database access would be mediated by these services. For security you would probably want to build them in WCF and expose them through IIS. Then your Windows application would just call these services for most of its processing.
If you design it properly you could also have it work with a local database as well so that it could work in a disconnected manner if, for example, your servers go down.
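The disconnected-mode idea can be sketched as a simple failover wrapper around two implementations of the same interface (all type names here are illustrative, not a real API):

```csharp
using System;

// Try the remote service first; fall back to the local database when it
// is unreachable, queueing the record for later synchronization.
public interface ISalesChannel
{
    void RecordSale(int productId, int quantity);
}

public class FailoverSales : ISalesChannel
{
    private readonly ISalesChannel remote;
    private readonly ISalesChannel local;

    public FailoverSales(ISalesChannel remote, ISalesChannel local)
    {
        this.remote = remote;
        this.local = local;
    }

    public void RecordSale(int productId, int quantity)
    {
        try { remote.RecordSale(productId, quantity); }
        catch (Exception) // e.g. server down, connection lost
        {
            local.RecordSale(productId, quantity); // sync to the server later
        }
    }
}

// Simple stand-in, useful for testing the failover logic.
public class CountingChannel : ISalesChannel
{
    public int Calls;
    public bool Fail;

    public void RecordSale(int productId, int quantity)
    {
        if (Fail) throw new InvalidOperationException("unreachable");
        Calls++;
    }
}
```

The real `remote` would wrap the WCF proxy and the real `local` a SQL Express database, with a background job replaying the queued rows once the connection returns.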
Typically you don't move the server off of the site premises.
The problem is that they will go completely down in the event your remote server is inaccessible. Things that can cause this are internet service interruptions (pretty common) and the remote server being overloaded (common enough); basically anything that can stop the traffic between the store location and your remote server will bring them to their knees. The first time this happens they'll scream. The second time, they'll want your head due to the lost sales.
Instead, leave a sql server at each location. Set up a master sql server somewhere. Then set up a VPN connection between the stores and this central office. Finally, have the store sql boxes do merge replication with the central office. Incidentally, don't use the built in replication, but an off the shelf product which specializes in replicating sql server. The built in one can be difficult to learn.
In the event their internet connection goes dark the individual stores will still be able to function. It will also remain performant as all of the desktop app traffic is purely to the local sql box.
Solving replication errors is much easier than dealing with a flaky ISP.
I would recommend that you check out the Viravis Platform.
It is an application platform that can also be used simply as an online database for any .NET client via the provided SDK. It has its own generic Windows and web clients, and some custom web solutions for specific applications.
You can use it as a complete solution or as a secure online database backend.