I need your advice on scheduling tasks in an MVC3 web app.
My task is to create a generic scheduler for different services in the web app that can be reused later in development. For example, we have a set of available tasks that a user can schedule whenever he wants.
I didn't want to reinvent the wheel and found the Quartz.NET library, which can be used to create the scheduler.
I know that it's not a good idea to host scheduling inside the web app because the web server can recycle the application pools, etc., so I decided to host it inside a Windows Service and use the Quartz.NET remoting feature to trigger tasks inside my web app.
But I've found some issues with that. Correct me if I'm wrong, but when I tried to use Quartz.NET remoting, the job runs inside the Windows Service process. That means the service needs to know about all the job types in my web app, so it has to reference all of the web app's assemblies, and I need a separate config file for the database, etc. So whenever I write a new job class, I can't simply schedule it; I have to stop the service and redeploy the assemblies, which is not a very generic approach.
I can't find any information on whether Quartz.NET can run jobs based only on their interface.
So I came up with the idea of writing my own scheduler that will be hosted in the Windows Service and will use an IJob interface implemented in the web app. I will also use .NET Remoting over an IPC channel.
So the web app will act as the .NET Remoting server, and when I want to add and schedule a new job, I will just have to write a new class that implements the IJob interface.
I will register it like this:
IpcChannel channel = new IpcChannel("CurrentIPC");
ChannelServices.RegisterChannel(channel);
RemotingConfiguration.RegisterWellKnownServiceType(
typeof(SimpleJob), "SimpleJob", WellKnownObjectMode.SingleCall);
RemotingConfiguration.RegisterWellKnownServiceType(
typeof(ComplexObject), "ComplexObject", WellKnownObjectMode.SingleCall);
In this case I will have two job types registered. Then, when scheduling a job, I will pass the name of the class, and on the Windows Service side (which acts as the client, executing objects on the web app side) I will simply bind the passed class name to an IJob like this:
Dictionary<string, IJob> jobs = new Dictionary<string, IJob>();

void AddJob(string name)
{
    // Obtain a transparent proxy to the job object exposed by the web app over IPC
    IJob obj = (IJob)Activator.GetObject(typeof(IJob), string.Format("ipc://CurrentIPC/{0}", name));
    jobs.Add(name, obj);
}
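For completeness, here is a minimal sketch of what the shared contract and one of the remoted jobs might look like; the bodies are my assumptions (the question only names the types), and any type exposed over .NET Remoting has to derive from MarshalByRefObject:

// Shared assembly referenced by both the Windows Service and the web app
public interface IJob
{
    void Execute();
}

// Lives in the web app; derives from MarshalByRefObject so it can be remoted over the IPC channel
public class SimpleJob : MarshalByRefObject, IJob
{
    public void Execute()
    {
        // the actual work runs here, inside the web app process
    }
}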
So now I don't need to worry about references to my app and other details; the scheduler does its job without knowing anything except the IJob interface, and the tasks execute on the web app side.
If I'm wrong, or this is too complex and there are simpler ways of doing it, or there are pitfalls I'm not aware of, can you help me with that?
Thank you.
P.S.
There was also an idea to have a separate scheduler that runs the web app methods directly by requesting a link to a specified service in the web app, for example "http://localhost:3030/Request/12", and that's all. But in my web app you have to be authorized to execute such a request, so again there are issues to resolve, and with thousands of scheduled tasks these requests would put additional load on the web server.
I think you are on the right track; I would create the scheduler using Quartz.NET and host it in a Windows Service because of the app pool recycling issue.
It will trigger tasks/services in your web app using a specific URL for each task/service, either in your web app or in a separate web service instance.
With this separation the scheduler only needs to know about the URLs and the schedule, and does not need to reference your app directly. It is also reusable in future projects.
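As a rough illustration, a generic Quartz.NET job that simply requests a task URL could look something like this (a sketch only, assuming Quartz.NET 2.x; the job name and the use of WebClient are my assumptions, not part of the answer):

using System.Net;
using Quartz;

// Generic job that triggers a task in the web app by requesting its URL
public class HttpCallbackJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // The target URL is stored in the job's data map when the job is scheduled
        var url = context.JobDetail.JobDataMap.GetString("Url");
        using (var client = new WebClient())
        {
            client.DownloadString(url); // fires the task in the web app
        }
    }
}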
I know that it's not a good idea to host scheduling inside the web app because the web server can recycle the application pools, etc., so I decided to host it inside a Windows Service...
Why complicate things? Why not keep it simple and use an external service to run webhooks at set intervals?
I use this service and I've been happy with it, though it's so easy that I could create my own service based on this simple procedure:
http://momentapp.com/
to simplify the call:
private void Run() {
    try {
        var work = RequestNewMessage(); // get work
        ProcessWork(work);              // process work
        // Log work
    }
    catch (Exception ex) {
        // Log error
    }
    finally {
        // schedule the job again for now plus 1 minute
        SetRecurringJob(DateTime.UtcNow.AddMinutes(1));
    }
}
private void SetRecurringJob(DateTime dt) {
    PostJob("https://momentapp.com/jobs/new?job[at]={0:s}&job[method]=POST&job[uri]=http://yourapp.com/", dt);
}
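PostJob is left undefined above; a minimal sketch of what it might do (my assumption, using WebClient) is:

// Hypothetical helper: formats the run time into the URL and POSTs it to the scheduling service
private void PostJob(string urlFormat, DateTime runAt) {
    var url = string.Format(urlFormat, runAt);
    using (var client = new System.Net.WebClient()) {
        client.UploadString(url, "POST", string.Empty);
    }
}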
A bit late I imagine, but could still be useful for others.
Here's a project I've spent quite a bit of time with that seems to fulfil most of the requirements stated; in fact, I use it almost exclusively for callback jobs as described in the question. It's also what we use internally, so I do try to keep it updated.
http://backgroundworker.codeplex.com/
It's pretty similar to Quartz, but more focused on managing jobs and their data.
If your issue is that you don't want to reference the entire web app from the Windows Service, and I agree with that, why don't you simply create a utility DLL implementing a generic job that can call your web app via a plain HTTP URL, and implement the "actions" as a WCF REST service?
I think this will clean up the architecture a bit and keep the scheduler service insulated (and reusable) within your organization.
I might be missing a requirement, but as you seem to be able to install a custom service, it sounds like you have full control over that machine. So you might as well schedule a Windows Task for your recurring job, which should give you a pretty solid basis for recurring tasks.
You can then schedule a VBScript to load a URL on a regular basis, as shown here: http://4rapiddev.com/internet/call-or-open-a-web-page-url-by-using-windows-task-scheduler-or-cronjob/
Anything goes of course, from a batch file to PowerShell to a custom executable.
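For example, a tiny console app that hits the task URL could be registered as the scheduled task (a sketch; the default URL is a placeholder):

using System;
using System.Net;

// Minimal console app to be registered in Windows Task Scheduler
class CallTaskUrl
{
    static int Main(string[] args)
    {
        try
        {
            using (var client = new WebClient())
            {
                // Pass the real task URL as the first argument; the fallback here is a placeholder
                client.DownloadString(args.Length > 0 ? args[0] : "http://localhost/runtask");
            }
            return 0;
        }
        catch (WebException)
        {
            return 1; // a non-zero exit code lets Task Scheduler record the failure
        }
    }
}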
Upsides to this approach:
You can rely on default OS behaviour for what the OS is good at: scheduling :)
There's already a UI for the admin (albeit not a web UI)
You have no dependency but the OS
Possible downsides:
You have to build your own web management around it if you want to expose it to the user (e.g. use one of these as a basis: http://www.codeproject.com/Articles/2407/A-New-Task-Scheduler-Class-Library-for-NET or http://taskscheduler.codeplex.com/).
Security might be an issue if end-users are able to manipulate tasks
Less fine-grained control over what happens on failures or timeouts (not sure how big an issue that is, as you can also configure how the Scheduler deals with those)
Edit: I did seem to have missed the requirement on authentication, but since you're on Windows you could allow Windows authentication within a certain area of your app. And because Scheduled Tasks run under a Windows account, you have this covered. I've never used Windows authentication heavily within an ASP.NET app, but you might even get away with configuration alone and no extra programming.
I have an ASP.NET MVC application which resides on a server. From this application, I want to start a process which is a long-running and resource-intensive operation.
What I want is to have, say, 3 user agents installed on 3 machines, where each agent uses the resources of its own machine only.
Like in Hadoop, where tasks run on the individual cluster nodes and one master node keeps track of all of them.
In Azure, we have virtual machines on which tasks run, and if required Azure can automatically scale horizontally by spinning up new instances to speed up the task.
So I want to create infrastructure like this, where I can submit my task to the 3 user agents from the MVC application, and my application will keep track of these agents: which agent is free, which is occupied, which is not working, and so on.
I would also like to receive progress from each of these user agents and show it in my MVC application.
Is there any framework in .NET with which I can manage these background operations (tracking, start, stop, etc.), or what should the approach be?
Update: I don't want to put load on the server for these long-running operations, and moreover I want to keep track of the long-running processes too: what they are doing, where an error occurred, etc.
The following are the approaches I am considering, and I don't know which makes more sense:
1) Install a Windows Service as an agent on 2-3 on-premises computers to take advantage of their respective resources, and keep a TCP/IP connection open to these agents until the long-running process is complete.
2) Use Hangfire to run the long-running process outside the IIS threads, but I guess this will put load on the server.
I would like to know the possible problems with the above approaches, and whether there are better ones.
Hangfire is really a great solution for processing background tasks, and we have used it extensively in our projects.
We have set up our MVC application on a separate IIS server; it is also a Hangfire client and just enqueues the jobs that need to be executed by the Hangfire server. Then we have two Hangfire server instances, which are Windows Service applications. So effectively there is no load on the MVC app server to process the background jobs, as they are processed by the separate Hangfire servers.
One extremely helpful feature of Hangfire is its out-of-the-box dashboard, which allows you to monitor and control every aspect of background job processing, including statistics, background job history, etc.
Configure Hangfire in the application as well as in the Hangfire servers:
public void Configuration(IAppBuilder app)
{
    GlobalConfiguration.Configuration.UseSqlServerStorage("<connection string or its name>");
    app.UseHangfireDashboard();
    app.UseHangfireServer();
}
Please note that you should use the same connection string across all instances. Use app.UseHangfireServer() only if you want the instance to act as a Hangfire server, so in your case you would omit this line from the application server configuration and use it only in the Hangfire servers.
Also, use app.UseHangfireDashboard() in the instance that will serve your Hangfire dashboard, which would probably be your MVC application.
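Enqueuing work from the MVC app then looks roughly like this (a sketch; ReportGenerator.Generate is a placeholder for whatever public method you want executed):

using Hangfire;

// Fire-and-forget job: picked up by whichever Hangfire server polls the shared SQL storage
BackgroundJob.Enqueue(() => ReportGenerator.Generate("monthly"));

// Recurring job: runs once a day on the Hangfire servers
RecurringJob.AddOrUpdate("daily-report", () => ReportGenerator.Generate("daily"), Cron.Daily());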
At the time we did it using a Windows Service, but if I had to do it now, I would go with an Azure worker role, or better still Azure WebJobs, to host my Hangfire server and manage things like auto-scaling easily.
Do refer to the Hangfire overview and documentation for more details.
Push messages to MSMQ from your MVC app and have your Windows Services listen (or loop) for new messages entering the queue.
In your MVC app, create an ID per queued message, then make RESTful API calls from your Windows Services back to the MVC app as you make progress on the job.
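A minimal sketch of the send and receive sides using System.Messaging (the queue path and message body here are assumptions):

using System.Messaging;

static class JobQueue
{
    // Queue path is a placeholder
    const string QueuePath = @".\Private$\LongRunningJobs";

    // Called from the MVC app: enqueue a work item (e.g. the job ID)
    public static void Enqueue(string jobId)
    {
        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
            queue.Send(jobId);
        }
    }

    // Called from the Windows Service agent: block until a message arrives
    public static string Dequeue()
    {
        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
            using (var message = queue.Receive()) // blocking call
            {
                return (string)message.Body;
            }
        }
    }
}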
Have a look at Hangfire; it can manage background tasks and works across VMs without conflict. We have replaced Windows Services with it and it works well.
https://www.hangfire.io
Give http://easynetq.com/ a try.
EasyNetQ is a simple to use, opinionated, .NET API for RabbitMQ.
EasyNetQ is a collection of components that provide services on top of the RabbitMQ.Client library. These do things like serialization, error handling, thread marshalling, connection management, etc.
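The snippets below assume a bus created along these lines (the connection string is a placeholder for your broker):

// Connect to the RabbitMQ broker; "host=localhost" is an assumption
var bus = RabbitHutch.CreateBus("host=localhost");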
To publish with EasyNetQ
var message = new MyMessage { Text = "Hello Rabbit" };
bus.Publish(message);
To subscribe to a message we need to give EasyNetQ an action to perform whenever a message arrives. We do this by passing subscribe a delegate:
bus.Subscribe<MyMessage>("my_subscription_id", msg => Console.WriteLine(msg.Text));
Now every time that an instance of MyMessage is published, EasyNetQ will call our delegate and print the message’s Text property to the console.
The performance of EasyNetQ is directly related to the performance of the RabbitMQ broker, which can vary with network and server performance. In tests on a developer machine with a local instance of RabbitMQ, sustained overnight performance of around 5,000 2K messages per second was achieved. Memory use for all the EasyNetQ endpoints was stable for the overnight run.
Once a day, I want my ASP.NET MVC4 website, which may be running on multiple servers, to email a report to me. This seems like a pretty common thing to want to do, but I'm having a tough time coming up with a good way to do it.
Trying to run a timer on a server to do this work is problematic for a couple of reasons. If I have multiple servers then I'd have the timer running on all of them (in case a server goes down); I'd need to coordinate between them, which gets complicated. Also, trying to access the database via Entity Framework from a background thread adds the complication that I must employ a locking strategy to serialize construction/disposal of the DbContext object between the periodic background thread and the "foreground" Controller thread.
Another option would be to run a timer on the servers, but to have the timer thread perform a GET to a magic page that generates and emails the report. This solves the DbContext problem because the database accesses happen in a normal Controller action, serialized with all of the other Controller accesses to the database. But I'm still stuck with the problem of having potentially more than one timer running, so I'd need some smarts in the Controller action to ignore redundant report requests.
Any suggestions? How is this sort of thing normally done?
You should not be doing this task from your web application, as Phil Haack nicely explains in his blog post.
How is this sort of thing normally done?
You could perform this task from a Windows Service or even a console application that is scheduled to run at regular intervals using the Windows Scheduler.
The proper solution is to create a background service that runs independently of your website. However, if that is not an option there is a hack where you can use the cache as explained in Easy Background Tasks in ASP.NET by Jeff Atwood.
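The core of that cache hack looks roughly like this (a sketch of the technique rather than a copy of the article; the key name, interval, and SendDailyReport are my placeholders):

using System;
using System.Web;
using System.Web.Caching;

public static class CacheBackgroundTask
{
    const string Key = "DailyReportTask";                        // placeholder
    static readonly TimeSpan Interval = TimeSpan.FromHours(24);  // placeholder

    // Call once from Application_Start
    public static void Start()
    {
        HttpRuntime.Cache.Insert(Key, DateTime.UtcNow, null,
            DateTime.UtcNow.Add(Interval), Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable, OnCacheRemoved);
    }

    // When the item expires, do the work and re-insert the item to schedule the next run
    static void OnCacheRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        // SendDailyReport();  // placeholder for the real work
        Start();
    }
}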
A few options:
If you are hosting on Azure as a Website, check out WebJobs which was released recently in preview (http://azure.microsoft.com/en-us/documentation/articles/web-sites-create-web-jobs/)
If you don't want the pain of extracting your email logic out of the website, expose that functionality at a URL (with a handler, MVC action, etc.) and then run a Windows scheduled task that hits that URL on a schedule.
Write a simple console app that is executed similarly via a Windows Scheduled task.
Or write a simple Windows Service that loops internally, checks the time, and when the time is reached hits that URL, runs that exe, or uses its own code to send you the email.
I would recommend running Quartz.NET as a Windows Service:
Quartz.NET - Enterprise Job Scheduler for .NET Platform
There's boilerplate code for a Windows Service in the download.
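For reference, a minimal Quartz.NET 2.x setup inside the service might look like this (a sketch under that assumption; DailyReportJob and the 24-hour interval are placeholders):

using Quartz;
using Quartz.Impl;

// Placeholder job: generate and email the report here
public class DailyReportJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // build and send the report
    }
}

// Typically run from the Windows Service's OnStart
public static class SchedulerBootstrap
{
    public static IScheduler Start()
    {
        var scheduler = new StdSchedulerFactory().GetScheduler();
        scheduler.Start();

        var job = JobBuilder.Create<DailyReportJob>()
            .WithIdentity("dailyReport")
            .Build();

        // Run once a day, starting now
        var trigger = TriggerBuilder.Create()
            .StartNow()
            .WithSimpleSchedule(x => x.WithIntervalInHours(24).RepeatForever())
            .Build();

        scheduler.ScheduleJob(job, trigger);
        return scheduler;
    }
}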
I want to do scheduling in ASP.NET.
I have the following options to implement this:
1) Write a SQL Server job (I don't want to do this; I don't want to go outside the .NET environment).
2) Write a Windows Service that calls an ASP.NET web service, which in turn calls the ASP.NET method (I also don't want to do this because my hosting provider might not allow me to install a Windows Service).
3) Call my scheduling method in the Application_Start event in the global class (drawback: the web server can kill the thread at any time).
4) Call the scheduling code in the Page_Load event of the home page (maybe nobody visits my website for hours; also, page execution might be slow because of the scheduling code).
I also found some online services that call your page at a given interval; some are listed below:
http://www.cronservice.co.uk/new/
http://scheduler.codeeffects.com
Can anybody give me a better solution to this, and explain why it is better?
Thanks in Advance
The ASP.NET application isn't the right place to implement scheduling. I would suggest creating a service or a scheduled task that runs at short intervals.
You don't have many options in a shared hosting environment. My host (WinHost) allows remote access to their database, so I was able to create an executable that ran on a local server with Task Scheduler.
The performance isn't great since the database is accessed over the internet, but it's still better than attempting to run pseudo-scheduled tasks with ASP.NET.
Some hosts also offer a service that will request a URL within your site on a scheduled basis. However, this didn't work for me because the task I had to run took longer than the request timeout.
There is no one-size-fits-all solution. SQL jobs and Windows jobs (scheduled through Windows Task Scheduler) are very widely used. In one of my previous workplaces they had jobs that ran on multiple platforms (mainframe, Windows, SQL Server). Failure of some of these jobs could cost thousands per day, so they employed something called ESP. This software monitored jobs on all platforms and sent a message to the control room in case of a failure.
If you throw some more light on the requirement, we might be able to help you better.
ASP.NET is not the right place to house your scheduled tasks. I use Quartz.NET when I have to create scheduled tasks.
Create a page that launches your task and place it at the URL http://www.mydomain.com/runtask.
Create a scheduled task on your home PC that sends a request to http://www.mydomain.com/runtask.
You'll need to keep your home PC on all the time.
Ideally I would go with number 1, as you get full control/history/error reporting etc. You can write an SSIS job in .NET and have SQL Server schedule it.
However, I have had a similar problem with shared hosting that is very restrictive. What I did was create a page which runs the process on page load (using validation in the query string for security). I then used a machine I have which is always on and scheduled a task with Windows Task Scheduler (part of Windows as standard) to call a bit of VBScript that opens the browser and then closes it.
I am developing a solution in .Net utilising the VMWare Web Service API to create new isolated virtualised development environments.
The front end will be web based. Requestors input details for the specific environment which will be persisted to SQL. So I need to write an engine of some sort to pull the data from SQL and work on the long running task of creating resource pools, switch port groups and cloning existing VM templates etc. As work progresses, events will be raised to write logs and update info back to SQL. This allows requestors to pull data back into a webpage to see how it's progressing or if it's completed.
The thing I am struggling with is how to engineer the engine that will execute the long-running task of creating the environment. I cannot write a Windows Service (which I would like to) as we work in a very secure environment and it's not possible (group policy, etc.). I could write a web service to execute the tasks, extending the httpRuntime executionTimeout to allow the task to complete, but I'm keen to hear what you think may be a better solution (based on .NET 3.5). The solution needs to be service-oriented as we may be using it on other projects within our organisation. What about WF or WCF? I have not used any of the newer technologies available since .NET 2.0, as we've only just been approved to move up from .NET 2.0.
First of all, a Windows Service isn't insecure. One of the devils of software development is discarding a solution out of ignorance, or for lack of investigation, requirements analysis, and collaborative decision-making.
Anyway, if you want to do it in a SOA way, WCF is going to be your best friend.
But maybe you can combine a WCF service hosted in Internet Information Services (IIS) with Microsoft Message Queuing (MSMQ).
A client calls a WCF service operation, which enqueues a message to some MSMQ queue. Later, a Windows scheduled task that runs periodically checks whether the queue has at least one incoming message, and the task processes it and any others until all messages have been dequeued.
Done this way, the IIS-hosted WCF service never needs an increased execution timeout and can serve more requests, while the heavy work of executing these long tasks is performed by a Windows scheduled task, for example a console application or a PowerShell cmdlet (or just a PowerShell script, why not?).
Check this old but useful MSDN article about MSMQ: http://msdn.microsoft.com/en-us/library/ms978430.aspx
I don't understand your comment regarding services being insecure; I think that is false.
However, I would look into creating something that uses a message bus/daemon.
RabbitMQ: http://www.rabbitmq.com/
MassTransit: http://masstransit-project.com/
NServiceBus: http://nservicebus.com/
Then you can basically use something like a WCF service, console app, or Windows Service as your daemon.
I'm building a web application that will need to import data from other database servers when it starts.
I would like to have this import done automatically at regular intervals. I would also like to be able to start and stop the import process from my web application.
What would be the best implementation for the import agent - a Windows Service? Something else?
If your web application needs to have this data in memory, you can use the Cache class.
Set it to expire every X hours, as needed, and when it expires, re-fetch the data.
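A rough sketch of that idea (the cache key, the four-hour window, and LoadFromRemoteDatabases are placeholders):

using System;
using System.Web;
using System.Web.Caching;

public static class ImportCache
{
    // Returns the imported data, re-importing it whenever the cached copy has expired
    public static object GetImportedData()
    {
        var data = HttpRuntime.Cache["ImportedData"];
        if (data == null)
        {
            data = LoadFromRemoteDatabases(); // placeholder for the actual import
            HttpRuntime.Cache.Insert("ImportedData", data, null,
                DateTime.UtcNow.AddHours(4), Cache.NoSlidingExpiration); // expire every X hours
        }
        return data;
    }

    static object LoadFromRemoteDatabases()
    {
        // placeholder: pull the data from the other database servers here
        return new object();
    }
}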
You could create a Windows Service that uses Quartz.Net to run the scheduled tasks.
You should not run scheduled tasks from your web app, since you have no guarantee that your web app is running; you're at the mercy of IIS app pool management.
You might want to look at Best way to run scheduled tasks.
From what I've heard, this sounds like a description of the Microsoft Sync Framework. I only have a little information about it myself, but I'd be happy to point you in that direction.
I'm not sure about your question because you are talking about hourly syncing. In a web application there is no nice way to do such a task; you have to create a console app, or better, a Windows Service process (which is easier than it sounds).
Sync Framework Intro
http://msdn.microsoft.com/en-us/sync/bb821992
Sync Framework Tutorial
http://alexduggleby.com/2007/12/16/sync-framework-tutorial-part-1-introduction/
Sync Framework Samples
http://archive.msdn.microsoft.com/sync
And, while I'm editing the answer to add links:
A nice guide to creating a Windows Service (and its setup):
http://www.codeproject.com/KB/dotnet/simplewindowsservice.aspx
(If it's your first time, try it on a test project before the production project.)
This might be an oversimplification, but can you create a class that does all of this work using a Timer, and then in the Application_Start of Global.asax, create a BackgroundWorker that kicks off this process?
Your web application could then control the BackgroundWorker object, starting/stopping as necessary.
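A minimal sketch of that idea using System.Threading.Timer rather than the BackgroundWorker the answer mentions (the class name, one-hour interval, and ImportData are my assumptions, and remember the app pool can still recycle and kill the timer):

using System;
using System.Threading;

public class ImportAgent : IDisposable
{
    private Timer _timer;

    // Kick off the recurring import, e.g. from Application_Start in Global.asax
    public void Start()
    {
        _timer = new Timer(_ => ImportData(), null,
            TimeSpan.Zero, TimeSpan.FromHours(1)); // interval is an assumption
    }

    // Lets the web application pause the import process
    public void Stop()
    {
        if (_timer != null)
            _timer.Change(Timeout.Infinite, Timeout.Infinite);
    }

    public void Dispose()
    {
        if (_timer != null)
            _timer.Dispose();
    }

    private void ImportData()
    {
        // placeholder: pull data from the other database servers here
    }
}

A Global.asax could hold a single ImportAgent field, call Start() in Application_Start, and call Dispose() in Application_End.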