C# code that runs constantly - service or separate thread?

I have a .NET 4 web application that has 3 separate projects associated – DAL, BAL, and UI. I am using Entity Framework for database interaction.
I have code that cycles through a bunch of database data, calls methods depending on what it finds, and then updates a database. I want this code to run all the time. At the same time I want users to be able to log in and run reports, etc., all while the code in the background is constantly running.
What is a good approach to this? Do I create a service for the code that constantly runs, a separate thread, an entirely separate project for the code that runs constantly, or a different approach?
Also, depending on the answers given, how would I kick-start the code that runs constantly? i.e. just through a form load method, or is there a better approach? I currently kick-start the code by clicking a start button; this is fine for testing, but it's not going to work in production.

You would be best served by a Windows Service for always-running tasks.
Running code on a separate thread under IIS is not a reliable mechanism since IIS can terminate threads at will to conserve server resources.

Given your question and clarifications on other answers that:
Your solution runs in a hosted environment where you cannot install a service;
Calling it from a third server (i.e. Azure or such) is not an option for you;
You might be best off starting a thread in your Application_Start event to manage the database work. You'd probably want to give this thread some periodic idle time so it doesn't take up too much of the hosted environment's resources, and make sure it is shut down when your application ends or is restarted.
A service would really be optimal, but if you're in a hosted environment and can't/won't use another server, then that's not possible.
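For illustration, a minimal sketch of that Application_Start approach might look like the following; the DoDatabaseWork method and the 60-second idle period are placeholders for your own logic and timing, not something from the original question:

using System;
using System.Threading;
using System.Web;

public class Global : HttpApplication
{
    private static Thread _workerThread;
    private static volatile bool _stopRequested;

    protected void Application_Start(object sender, EventArgs e)
    {
        _workerThread = new Thread(() =>
        {
            while (!_stopRequested)
            {
                try
                {
                    DoDatabaseWork(); // placeholder: cycle through the data and update it
                }
                catch (Exception)
                {
                    // log and continue so one failure doesn't kill the loop
                }
                Thread.Sleep(TimeSpan.FromSeconds(60)); // the periodic idle time mentioned above
            }
        });
        _workerThread.IsBackground = true; // so IIS can tear it down on recycle
        _workerThread.Start();
    }

    protected void Application_End(object sender, EventArgs e)
    {
        _stopRequested = true; // ask the loop to finish when the application shuts down
    }

    private static void DoDatabaseWork() { /* Entity Framework work goes here */ }
}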

Use a Windows Service. You should also look into using stored procedures for the database interactions you mentioned. As for kicking the Windows Service off, you can set it to start automatically (when the OS starts), which means it will run until terminated.

I would only recommend a Windows Service if it will literally always be running. However, "always" usually means every x seconds / minutes / hours / days.
If x is greater than a few minutes, I would make it a Console Application and run it through the Windows Task Scheduler. This way you don't have to worry about memory leaks and a slew of other issues.
However, if it is only working with the database, I would recommend a stored procedure and a Sql Job.

Related

What to do with long running processes on a ASP.NET MVC site?

This program takes data that has been imported and exports it line by line to normalized tables. It can sometimes take hours.
How it works:
User presses a button to begin processing
A call is made from jQuery to a method on the MVC controller.
That controller calls a DLL which then begins the long processing
Only one person uses this program at a time.
We are worried about it tying up an ASP.NET IIS thread. Would it be more efficient if, instead of the web site running the code, we made a scheduled task that runs an EXE every 30 minutes to check for and process the data?
EDIT 1: After talking to a coworker, the workaround we will use for now is to simply remove the processing button from the web site and instead refactor that processing into a scheduled task that runs every 5 minutes... if there are any comments about this, let me know.
The question is really about the differences between the web site running the code vs. a completely separate EXE... IIS threads vs. processes... Does it help any?
If the processing takes hours, it should definitely be in a separate process, not just a separate thread. Keeping it in the web process complicates thread locking and management, garbage collection and other things; dropping it into a separate process avoids all of that. For example, if your web server needs to be rebooted, your separate process can continue running without being affected. With a little work, you could even spin up this process on a separate server if you want (of course you would need to change the process start mechanism to do this).
When the task can run for hours, having it block an ASP.NET thread is definitely the wrong thing to do. A web call should complete in a reasonable amount of time (seconds ideally, minutes at worst). Tasks which take considerably longer than that can be initiated from a web request, but the request definitely shouldn't block on the entire execution.
I think there are a couple of possible paths forward:
If this is a task that does need to be executed on a semi-regular basis then factor it into an EXE and schedule the task to run at the correct interval
If this task should run on demand, then factor it out into an EXE and have the web request kick off the EXE but not wait for its completion (a sketch of this follows after the list)
Another possibility is to factor it out into a long running server process. Then use remoting or WCF to communicate between asp.net and the process
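A minimal sketch of the kick-off-without-waiting option mentioned above, assuming an MVC controller action and a hypothetical ProcessingJob.exe path (both names are illustrative, not from the question):

using System.Diagnostics;
using System.Web.Mvc;

public class ProcessingController : Controller
{
    [HttpPost]
    public ActionResult StartProcessing()
    {
        var startInfo = new ProcessStartInfo
        {
            FileName = @"C:\Jobs\ProcessingJob.exe", // placeholder path to the refactored EXE
            UseShellExecute = false,
            CreateNoWindow = true
        };
        Process.Start(startInfo); // note: no WaitForExit(), so the request returns immediately
        return Json(new { started = true });
    }
}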

How can I set up a task to run in the background?

I have a web application written in C# .NET 4.0. It connects to an Oracle database to perform CRUD operations.
What I need to do is write a background task that polls the database at predefined intervals to check whether it is up. If it's down, a property in memory is set so that no further requests to the database are made. The task continues to run and alters the property once the database is available again.
I'm wondering what the best way is to structure this in my application and how I would set up such a background task to run, or any advice on the implementation. I don't want this to hog resources on the server, so it needs to run in the background and not be resource intensive.
A Windows Service is the obvious answer for background tasks. The basics are pretty easy to do - just create a new project by selecting Windows Service as the project type and go from there.
Inside the Windows Service, create an instance of a class and within it create a timer; when the timer fires, run your periodic tasks.
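A rough sketch of that shape, assuming a PollingService/DatabasePoller split and a 30-second interval (both are illustrative choices, not from the answer):

using System.ServiceProcess;
using System.Timers;

public class DatabasePoller
{
    private readonly Timer _timer = new Timer(30000); // poll every 30 seconds

    public void Start()
    {
        _timer.Elapsed += (sender, e) => CheckDatabase();
        _timer.Start();
    }

    public void Stop()
    {
        _timer.Stop();
    }

    private void CheckDatabase()
    {
        // Placeholder: try to open a connection and record whether it succeeded,
        // e.g. by setting a flag the rest of the application can read.
    }
}

public class PollingService : ServiceBase
{
    private readonly DatabasePoller _poller = new DatabasePoller();

    protected override void OnStart(string[] args)
    {
        _poller.Start();
    }

    protected override void OnStop()
    {
        _poller.Stop();
    }
}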
If you must do this I would suggest writing a Windows Service, but have it modify a config file setting that your web application checks. I think having both applications look at the same config file is the simplest solution in this case.
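For what it's worth, a tiny sketch of that shared-setting idea; the file path and the true/false format are purely illustrative assumptions:

using System.IO;

public static class DatabaseStatusFlag
{
    private const string FlagPath = @"C:\SharedConfig\database-available.txt"; // placeholder location

    // Written by the Windows Service after each poll.
    public static void Write(bool available)
    {
        File.WriteAllText(FlagPath, available ? "true" : "false");
    }

    // Read by the web application before it talks to the database.
    public static bool Read()
    {
        return File.Exists(FlagPath) && File.ReadAllText(FlagPath).Trim() == "true";
    }
}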

C# automation application

I would like to create an automatic email distributor application in C# for my company.
Usually, I would create a console application and set up a schedule to run this program from server task scheduler.
Are there any ways to build this kind of application other than a console application? And what would the benefit be compared to a console application?
It can be done with any application type: ASP.NET, Silverlight, WCF, WPF, WinForms...
A console application is the easiest way because:
It doesn't have a UI layer (or almost doesn't have one).
No configuration is required.
If you have a robust scheduling application (such as task scheduler), there is little point to rebuilding this inside your application. If the scheduling application is not sufficient for your needs, you'll need to embed that logic in your app.
Either way, a console application is fine. The latter scenario will just require more configuration (in the form of a configuration file probably). Still no need for a UI unless the config is complex.
You could write a service. Writing a service in .NET is quite easy and straightforward; you then use a timer to invoke the operation on a schedule.
A service will run in the background and can be configured to start as soon as the computer turns on.
Because services have no user interface, you need to feed them requests using a database or a messaging system.
Windows scheduling of a console application is OK.
If your schedule is continuous (every minute) you might be better off with a Windows service.
See this Stack Overflow question for the comparison.

Windows service to do a job every 6 hours

I've got a windows service with only two methods: a private method DoWork(), and an exposed method which calls DoWork(). I want to achieve the following:
Windows service runs DoWork() method every 6 hours
An external program can also invoke the exposed method, which calls DoWork(). If DoWork() is already running (because the service invoked it on its schedule), it should be invoked again after the current run ends.
What's the best approach to this problem? Thanks!
An alternative approach would be to make use of a console application which can be scheduled by the Windows Task Scheduler to run every 6 hours. In that case you don't waste resources keeping a Windows service running the entire time; you only consume resources when needed.
For your second question: when you take the console app approach you can have it called by making use of Process.Start for example.
If the purpose of your application is only to run a specific task every six hours, you might be better off creating a command line application and creating a scheduled task that Windows runs automatically. Obviously, you could then manually start this application.
If you're still convinced you need a service (and honestly, from what I've seen so far, it sounds like you don't), you should look into using a Timer, but choose your timer carefully and read this article to get a better understanding of the timers built into .NET (Hint: Pay close attention to System.Timers.Timer).
To prevent reentry if another method tries to call DoWork() while the process is in the middle of performing its operation, look into using either a Mutex or a Semaphore.
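To make that concrete, here is a minimal sketch combining a System.Timers.Timer with a Semaphore so a second call to DoWork() waits for the current run to finish before running again; the class and member names are illustrative, not from the question:

using System;
using System.Threading;

public class Worker
{
    private readonly System.Timers.Timer _timer =
        new System.Timers.Timer(TimeSpan.FromHours(6).TotalMilliseconds);
    private readonly Semaphore _gate = new Semaphore(1, 1); // allow one DoWork() at a time

    public void Start()
    {
        _timer.Elapsed += delegate { RunDoWork(); };
        _timer.Start();
    }

    // Both the timer and the external caller come through this exposed method.
    // If DoWork() is already running, the caller waits and then runs it again,
    // which matches the "invoke it again after the current run ends" requirement.
    public void RunDoWork()
    {
        _gate.WaitOne();
        try
        {
            DoWork();
        }
        finally
        {
            _gate.Release();
        }
    }

    private void DoWork()
    {
        // the actual six-hourly work goes here
    }
}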
There are benefits and drawbacks either way. My inclination with those options is to choose the Windows service, because it makes your deployment easier. Scheduling things with the Windows Task Scheduler is scriptable and can be automated for deployment to a new machine/environment, but it's still a little more nonstandard than just deploying and installing a Windows service. With Task Scheduler you also need to make sure the task runs under an account that can make the web service call, and that you aren't going to have problems with passwords expiring and your scheduled tasks suddenly not running. With a Windows service, though, you need some sort of checking in place to make sure it is always running, and that if it restarts you don't lose the state that lets it know when it should run next.
Another option you could consider is NServiceBus sagas. Sagas are really intended for more than just scheduling tasks (they persist state for workflow-type processes that last longer than a single request/message), but they have a nice way of handling periodic or time-based processes (which are a big part of long-running workflows), in that a saga can ask the timeout manager to send it a message back at a time it requests. Using NServiceBus is a bigger architectural question and probably well beyond the scope of what you are asking here, but sagas have become how I think about periodic processes, and they come with the added benefit of being able to manage some persistent state for your process (which may or may not be a concern) and give you a reason to think about some architectural questions that perhaps you haven't considered before.
You can create a console application for your purpose and schedule it to run every 6 hours. The console application's Main method is called on startup; you can call your routine from there. Hope this helps!!

Programming a long-running time-based process

I was wondering what the best way to write an application would be. Basically, I have a sports simulation project that is multi-threaded and can execute different game simulations concurrently.
I store my matches in a SQLite database; each match has a DateTime attached to it.
I want to write an application that checks every hour or so to see if any new matches need to be played and spawns those threads off.
I can't rely on the task scheduler to execute this every hour because there are objects that the different instances of that process would share (specifically a tournament object), which I suspect would be overwritten by a newer process when saved back into the DB. So ideally I need to write some sort of long-running process that sleeps between hours.
I've written my object model so that each object is only loaded into memory once, so as long as all simulation threads are spawned from this one application, they shouldn't be overwriting data.
EDIT: More detail on requirements
Basically, multiple matches need to be able to run concurrently. These matches can be of arbitrary length, so it's not necessary that one finishes before the other begins (in fact, in most cases there will be multiple matches executing at the same time).
What I'm envisioning is a program that runs in the background (a service, I guess?) that sleeps for 60 minutes and then checks the database to see if any games should be started. If there are any to be started, it fires off threads to simulate those games and then goes back to sleep. Hence, the simulation threads are running but the "scheduling" thread is sleeping for another 60 minutes.
The reason I can't (I think) use the default OS task-scheduling interface is that it requires the task to be spawned as a new process. I have developed my database object model such that objects are cached by each object class on first load (the in-memory reference), meaning that each object is only loaded from the database once and that reference is used on all saves. So when each simulation thread is done and saves off its state, the same reference is used (with updated state) to save that state. If a different executable is launched every time, presumably a different memory reference will be created by each process, and hence one process could save into the DB and overwrite the state written by the other process.
A service looks like the way to go. Is there a way to make a service just sleep for 60 minutes and then wake up and execute a function? I feel like making this a standard console application would waste memory, but maybe there is an efficient way to do that which I'm not aware of.
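For reference, here is roughly what the scheduling loop described above could look like; GetMatchesDueToStart, SimulateMatch and the Match class are placeholders for the poster's own data access and simulation code:

using System;
using System.Collections.Generic;
using System.Threading;

public class MatchScheduler
{
    private volatile bool _stopRequested;

    public void Run()
    {
        while (!_stopRequested)
        {
            // Check the database for matches whose start time has passed.
            foreach (Match match in GetMatchesDueToStart(DateTime.Now))
            {
                Match m = match; // copy for the closure
                Thread simulation = new Thread(() => SimulateMatch(m));
                simulation.IsBackground = true;
                simulation.Start(); // simulations keep running while the scheduler sleeps
            }
            Thread.Sleep(TimeSpan.FromMinutes(60)); // the scheduling thread sleeps for another hour
        }
    }

    public void Stop()
    {
        _stopRequested = true;
    }

    private IEnumerable<Match> GetMatchesDueToStart(DateTime now)
    {
        // placeholder: query the SQLite database for unplayed matches due by 'now'
        return new List<Match>();
    }

    private void SimulateMatch(Match match)
    {
        // placeholder: run one game simulation and save its state
    }
}

public class Match { /* id, scheduled DateTime, etc. */ }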
If you want to make it really reliable, make it a Service.
But I don't see any problems in making it a normal (Console, WinForms, WPF) application.
Maybe you could expand on the requirements a little.
The reason I can't (I think) use the default OS task-scheduling interface is that it requires the task to be spawned as a new process. I have developed my database object model such that objects are cached by each object class on first load (the in-memory reference), meaning that each object is only loaded from the database once and that reference is used on all saves.
If you want everything to remain cached forever, then you do need to have an application that simply runs forever. You can make this a windows service, or a normal windows application.
A windows service is just a normal exe that conforms to the service manager API. If you want to make one, Visual Studio has a wizard which auto-generates some skeleton code for you. Basically, instead of putting your logic in Main, you have a service class with OnStart/OnStop methods, and everything else is the same.
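Roughly what that generated skeleton looks like (class names shortened here; treat this as a sketch rather than the wizard's exact output):

using System.ServiceProcess;

public class SimulationService : ServiceBase
{
    protected override void OnStart(string[] args)
    {
        // start your long-running work here, e.g. kick off the scheduling thread
    }

    protected override void OnStop()
    {
        // signal the work to stop and clean up
    }
}

internal static class Program
{
    private static void Main()
    {
        // hands control of the process over to the service control manager
        ServiceBase.Run(new SimulationService());
    }
}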
You could, if you wanted to, use the windows task scheduler to schedule your actions. The way you'd do this is to have your long-running windows service in the background that does nothing. Have it open a TCP socket or named pipe or something and just sit there. Then write a small "stub" exe which just connects to this socket or named pipe and tells the background app to wake up.
This is, of course, a lot harder than just doing a sleep in your background application, but it does let you have a lot more control - you can change the sleep time without restarting the background service, run it on-demand, etc.
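A small sketch of that wake-up arrangement using a named pipe; the pipe name and the idea that simply connecting counts as the wake-up signal are illustrative assumptions:

using System.IO.Pipes;

// Inside the long-running background service: block until the stub connects, then do the work.
public static class WakeListener
{
    public static void Listen()
    {
        while (true)
        {
            using (var server = new NamedPipeServerStream("simulation-wakeup"))
            {
                server.WaitForConnection(); // blocks until the stub exe connects
                RunScheduledWork();         // placeholder for the actual periodic work
            }
        }
    }

    private static void RunScheduledWork() { }
}

// The small "stub" exe that the task scheduler launches on whatever schedule you like.
public static class WakeStub
{
    public static void Main()
    {
        using (var client = new NamedPipeClientStream(".", "simulation-wakeup", PipeDirection.Out))
        {
            client.Connect(5000); // connecting is the signal; nothing needs to be written
        }
    }
}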
I would, however, consider your design. The fact that you rely on a long-running service is a large point of failure. If your app needs to run for days, and you have a single bug which crashes it, then you have to start again. A much better architecture is to follow the Unix model, where you have small processes which start, do one thing, then finish (in this case, process each game simulation as its own process so if one dies it doesn't take the master process or the other simulations down).
It seems like the main reason you're trying to have it long-running is to cache your database queries. Do you actually need to do this at all? A lot of the time databases are plenty fast enough (they have their own caches, which are plenty smart). A common mistake I've seen programmers make is to just assume that something like a database is slow and waste a pile of time optimizing, when in actual fact it would be fine.
