Is Task.Delay safe to use in ASP.NET? - c#

I have an action that returns only a JSON string with information.
Since users can edit the JS code that requests updated information every 2 seconds, I also need a server-side delay to prevent high CPU load.
How safe is it to use Task.Delay(2000) if there are (for example) 2000-5000 users making the same request at the same time? The information returned is different for each user.

Why do you think adding Task.Delay(2000) will reduce the CPU load? If you have a high CPU load at T, adding Task.Delay(2000) only postpones the high CPU load to T+2, which doesn't help at all.
A quick solution is to check the submit frequency on the UI side: on a web page, for example, disable the submit button and re-enable it after a few seconds. But this can be cheated, since front-end scripts can be modified.
A safer solution is to check the submit frequency on the server side: you record the last submit time somewhere (e.g. in a static variable, the simplest option) and reject requests that arrive too soon.
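A minimal sketch of that server-side check, assuming an ASP.NET MVC action and in-memory state (all names here are illustrative, and a static dictionary only works within one worker process; a web farm would need a shared store):

```csharp
using System;
using System.Collections.Concurrent;
using System.Web.Mvc;

public class InfoController : Controller
{
    // Last accepted request time per user. Static state survives across
    // requests within one worker process, but not across app-pool
    // recycles or multiple servers.
    private static readonly ConcurrentDictionary<string, DateTime> LastRequest =
        new ConcurrentDictionary<string, DateTime>();

    private static readonly TimeSpan MinInterval = TimeSpan.FromSeconds(2);

    public ActionResult GetInfo()
    {
        string userKey = User.Identity.Name;      // or session ID, IP, etc.
        DateTime now = DateTime.UtcNow;
        DateTime last = LastRequest.GetOrAdd(userKey, DateTime.MinValue);

        if (now - last < MinInterval)
            return new HttpStatusCodeResult(429); // Too Many Requests

        LastRequest[userKey] = now;
        return Json(BuildInfoForUser(userKey), JsonRequestBehavior.AllowGet);
    }

    // Placeholder for whatever produces the per-user information.
    private object BuildInfoForUser(string userKey) { return new { }; }
}
```

Returning 429 lets a well-behaved client back off. The check-then-set here is not perfectly atomic, which is usually acceptable for throttling purposes.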

Task.Delay is totally safe to use, since it doesn't involve creating or blocking threads and it doesn't stall the CPU.
On the other hand, it is not going to help you, since it is still possible to make multiple requests from one machine. Delaying the execution without further checks is a useless way to throttle requests.

Besides the other answers, which are correct: in ASP.NET, if the user uses the ASP.NET session, there is an issue you must be aware of, because session state locks all requests from the same session until the call returns.
So if you use that delay with session state enabled, you block all of that user's other requests... Please read about:
Does ASP.NET Web Forms prevent a double click submission?
Web app blocked while processing another web app on sharing same session
What perfmon counters are useful for identifying ASP.NET bottlenecks?
Replacing ASP.Net's session entirely
Trying to make Web Method Asynchronous

If the question is about Task.Delay, then yes, this is fine. Task.Delay creates a task backed by a timer that completes the task when it fires (in the timer's callback). Given the way it works, it doesn't block your thread and doesn't execute on another thread, so it seems fine. The number of requests you posted doesn't sound big either.
It is true, however, that my answer is more about using Task.Delay in ASP.NET MVC than about your particular scenario, which you would need to describe in detail if you need a more specific answer.
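For illustration, this is the typical shape of Task.Delay in an async action (this requires .NET 4.5 / MVC 4 or later; the controller and payload are made up):

```csharp
using System.Threading.Tasks;
using System.Web.Mvc;

public class InfoController : Controller
{
    public async Task<ActionResult> GetInfo()
    {
        // No thread is blocked here: the request thread returns to the
        // pool, and a timer schedules the continuation after 2 seconds.
        await Task.Delay(2000);
        return Json(new { value = 42 }, JsonRequestBehavior.AllowGet);
    }
}
```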

Related

Can/Should I Use an Asynchronous Controller Here? (ASP.NET MVC 3)

I have this [HttpPost] action method:
[HttpPost]
public ActionResult AddReview(Review review)
{
    repository.Add(review);
    repository.Save();
    repository.UpdateSystemScoring(review.Id); // call SPROC with new Review ID.
    return View("Success", review);
}
So, basically a user clicks a button, I add the review to my database (via Entity Framework 4.0), save changes, and then call a stored procedure with the identity field, which is the second-to-last line of code.
This needs to be done after the review is saved (as the identity field is only created once Save is called and EF persists the changes), and it is a system-wide calculation.
From the user point of view, he/she doesn't/shouldn't care that this calculation is happening.
This procedure can take anywhere from 0-20 seconds. It does not return anything.
Is this a candidate for an asynchronous controller?
Is there a way I can add the Review and let another asynchronous controller handle the long-running SPROC call, so the user can be taken to the Success page immediately?
I must admit (and I'm partially ashamed of this): this is a rewrite of an existing system, and in the original system (ASP.NET Web Forms) I fired off another thread to achieve the above task, which is why I was wondering whether the same principle can be applied to ASP.NET MVC 3.
I always try to avoid multi-threading in ASP.NET, but user experience is the #1 priority and I do not want the page timing out.
So: is this possible? I'm also happy to hear any other ideas. Also, I can't use triggers here; I don't really want to go into too much detail about why, but I can't.
I would fire a new thread (not from the thread pool) to perform this task and return immediately, especially since you don't care about the results. Asynchronous controllers are useful in situations where most of the time is spent waiting for some other system to complete the task; once that system completes the task, your application is signaled to process the result, and no threads are consumed from your application during the wait. So in your scenario this task could be performed by SQL Server using the async Begin* methods in ADO.NET (e.g. BeginExecuteNonQuery). You could use this if you need the results back. If you don't, firing a new thread will work just fine, as before.
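A sketch of the new-thread option applied to the action above (Log is a placeholder for whatever logging you use; note that work started this way is lost if the app pool recycles before it finishes):

```csharp
using System;
using System.Threading;
using System.Web.Mvc;

[HttpPost]
public ActionResult AddReview(Review review)
{
    repository.Add(review);
    repository.Save();

    int reviewId = review.Id;   // captured after EF assigns the identity
    var worker = new Thread(() =>
    {
        // Runs outside the request; any exception here must be caught
        // and logged, or it can tear down the process.
        try { repository.UpdateSystemScoring(reviewId); }
        catch (Exception ex) { Log(ex); }
    });
    worker.IsBackground = true; // don't keep the process alive for it
    worker.Start();

    return View("Success", review); // user sees Success immediately
}
```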
I think asynchronous controllers are more for things where the request may take a long time to return a response, but the main thread would spend most of that time waiting for another thread/process. This is mostly useful for ajax calls rather than main page load, when it is acceptable to just show a progress indicator until the response is returned.
I use a separate queueing system for this type of task, which is more robust and easier to work with but does take a bit more work to set up. If you really need to do it within the ASP.net process, a separate request is probably the best option, though there is some potential for the task not to run - for example I'm not sure what happens if the connection drops or the app pool recycles while an async task is running.
Since the scoring system takes so long to run, I would recommend using a scheduled task in SQL Server or Windows to update the scores every x minutes. Since the user doesn't know about the request, it doesn't matter whether it runs immediately.
You could add the IDs to a queue and process the queue every 30 minutes.
Otherwise, if there is a reason this needs to run immediately, you could make an async call or see if you can trim some fat off the stored proc.
I have a very similar system that I wrote. Instead of doing things synchronously, we do everything asynchronously using queues.
Action -> causes javascript request to web server
|
Web server puts notification on queue
|
Worker picks up message from queue and does point calculation
|
At some point in future user sees points adjusted
This allows us to handle large amounts of user load without worrying about it having an adverse effect on our calculation engine. It also means that we can add more workers when load is high and remove workers when it is not.
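The "web server puts notification on queue" step could look like this with MSMQ, as one concrete choice of queue (the queue path, label, and class names are made up, and the queue is assumed to already exist):

```csharp
using System.Messaging;

public class ScoringNotifier
{
    private const string QueuePath = @".\private$\scoring";

    public void NotifyReviewAdded(int reviewId)
    {
        using (var queue = new MessageQueue(QueuePath))
        {
            // The web request returns as soon as the message is stored;
            // a separate worker process reads it and does the point
            // calculation at its own pace.
            queue.Send(reviewId, "review-added");
        }
    }
}
```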

How can I send the HTTP response back to the user but still do more things on the server after that?

Sometimes there is a lot that needs to be done when a given Action is called. Often, more needs to be done than just generating the next HTML for the user. In order to give the user a faster experience, I want to do only what I need to do to get them their next view and send it off, but still do more things afterwards. How can I do this, multi-threading? Would I then need to worry about making sure different threads don't step on each other's feet? Is there any built-in functionality for this type of thing in ASP.NET MVC?
As others have mentioned, you can use a spawned thread to do this. I would take care to consider the 'criticality' of several edge cases:
If your background task encounters an error and fails to do what the user expected to be done, do you have a mechanism to report this failure to the user?
Depending on how business-critical the various tasks are, using a robust/resilient message queue to store the background tasks to be processed will help protect against a scenario where the user requests some action, the responsible server crashes or is taken offline or the IIS service is restarted, and the background thread never completes.
Just food for thought on other issues you might need to address.
How can I do this, multi-threading?
Yes!
Would I then need to worry about making sure different threads don't step on each other's feet?
This is something you need to take care of anyway, since two different ASP.NET requests could arrive at the same time (from different clients) and be handled in two different worker threads simultaneously. So, any code accessing shared data needs to be written in a thread-safe way anyway, even without your new feature.
Is there any built in functionality for this type of thing in ASP.NET MVC?
The standard .NET multi-threading techniques should work just fine here (manually starting threads, using the Task features, or using the Async CTP, ...).
It depends on what you want to do and how reliable you need it to be. If the operations pending after the response was sent are OK to be lost, then .NET async calls, the ThreadPool, or a new Thread will all work just fine. If the process crashes, the pending work is lost, but you have already accepted that this can happen.
If the work requires any reliability guarantee, for instance if it updates the site database, then you cannot rely on in-process threading; you need to persist the request to do the work and then process that work even after a process restart (an app-pool recycle, as IIS so politely calls it).
One way to do this is to use MSMQ. Another way is to use a database table as a queue. The most reliable way is to use the database activation mechanisms, as described in Asynchronous procedure execution.
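The database-table-as-queue option is just an INSERT on the web side and a polling loop in the worker; a minimal sketch with made-up table and column names:

```csharp
using System.Data.SqlClient;

public static class WorkQueue
{
    // Web side: persist the request and return immediately.
    public static void Enqueue(string connectionString, string payload)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "INSERT INTO dbo.WorkItems (Payload, CreatedUtc) " +
            "VALUES (@p, GETUTCDATE())", conn))
        {
            cmd.Parameters.AddWithValue("@p", payload);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }

    // Worker side: atomically claim one item. READPAST skips rows
    // already locked by other workers, so several workers can run.
    public static string TryDequeue(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "DELETE TOP (1) FROM dbo.WorkItems WITH (ROWLOCK, READPAST) " +
            "OUTPUT DELETED.Payload;", conn))
        {
            conn.Open();
            return cmd.ExecuteScalar() as string; // null when queue is empty
        }
    }
}
```

The work then survives app-pool recycles, since a pending row stays in the table until a worker deletes it.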
You can start a background task and then return from the action. This example uses the Task Parallel Library, introduced in .NET 4.0:
public ActionResult DoSomething()
{
    Task t = new Task(() => DoSomethingAsynchronously());
    t.Start();
    return View();
}
I would use MSMQ for this kind of work. Rather than spawning threads in an ASP.NET application, I'd use an asynchronous, out-of-process way to do this. It's very simple and very clean.
In fact I've been using MSMQ in ASP.NET applications for a very long time and have never had any issues with this approach. Further, having a different process (that is, an executable in a different app domain) do the long-running work is an ideal way to handle it, since your web application is not being used to do this work. IIS, the thread pool, and your web application can then continue to do what they need to while other processes handle the long-running tasks.
Maybe you should give it a try: Using an Asynchronous Controller in ASP.NET MVC

Background process in asp.net

Thanks in advance for reading and answering this question.
I have a button in ASP.NET 2.0 that processes something BIG. It will take some time to finish (more than 30,000 comparisons), and I want to know: if the browser loses its connection with the server, will the server still finish the process?
You probably want to modify your architecture so that the HTTP response is not dependent on the processing finishing within the timeout period. Judging from the question, it sounds as if you are not going to tell the user anything based on the results of the calculation anyway. There are different methods you could use, but most involve writing a message to a queue and then having a separate process, like a Windows service, monitor that queue and do the long-running work separately.
You should not execute this work within the button's request itself but instead spawn a thread server-side.
You could use AJAX to tell the service to start the comparison and listen for the answer later on.

Is it possible to return an ASP page before continuing slow server side work (eg logging)

Is it possible to return the page response to the user, before you've finished all your server side work?
I.e., I've got a cheap hosting account with no database, but I'd like to log a certain event by calling a web service on my other, more expensive hosting account (i.e., a very slow logging operation).
I don't really want the user to have to wait for this slow logging operation to complete before their page is rendered.
Would I need to spin up a new thread, or make an asynchronous call? Or is it possible to return the page, and then continue working happily in the same thread/code?
Using ASP.Net (webforms) C# .Net 2.0 etc.
You would probably need a second thread. An easy option would be to use the ThreadPool, but in a more sophisticated setup a producer/consumer queue would work well.
At the simplest level:
ThreadPool.QueueUserWorkItem(delegate
{
    DoLogging(details); // 'details' is captured from the enclosing scope
});
You sure can - try Response.Flush.
That being said, creating an asynchronous call may be the best way to do what you want. Response.Flush simply flushes the output buffer to the client; an asynchronous call would allow you to fire off a logging call without it impacting the client's load time.
Keep in mind that an asynchronous call made during the page's life cycle in ASP.NET may not return in time for you to do anything with the response.

What's the best way to handle long running process in an ASP.Net application?

In my web application there is a process that queries data from all over the web, filters it, and saves it to the database. As you can imagine, this process takes some time. My current solution is to increase the page timeout and give an AJAX progress bar to the user while it loads. This is a problem for two reasons: 1) it still takes too long and the user must wait, and 2) it sometimes still times out.
I've dabbled in threading the process and have read that I should post it asynchronously to a web service ("fire and forget").
Some references I've read:
- MSDN
- Fire and Forget
So my question is - what is the best method?
UPDATE: After the user inputs their data I would like to redirect them to the results page that incrementally updates as the process is running in the background.
To avoid excessive architecture astronomy, I often use a hidden iframe to call the long running process and stream back progress information. Coupled with something like jsProgressBarHandler, you can pretty easily create great out-of-band progress indication for longer tasks where a generic progress animation doesn't cut it.
In your specific situation, you may want to use one LongRunningProcess.aspx call per task, to avoid those page timeouts.
For example, call LongRunningProcess.aspx?taskID=1 to kick it off and then at the end of that task, emit a
document.location = "LongRunningProcess.aspx?taskID=2".
Ad nauseum.
We had a similar issue and solved it by starting the work via an asynchronous web service call (which meant that the user did not have to wait for the work to finish). The web service then started a SQL job which performed the work and periodically updated a table with the status of the work. We provided a UI which allowed the user to query the table.
I ran into this exact problem at my last job. The best way I found was to fire off an asynchronous process and notify the user when it's done (by email or something else). Making them wait that long is going to be problematic because of timeouts and wasted productivity for them. Having them wait for a progress bar can give them a false sense of security that they can cancel the process by closing the browser, which may not be the case depending on how you set up the system.
How are you querying the remote data?
How often does it change?
Are the results something that could be cached for a period of time?
How long a period of time are we actually talking about here?
The 'best method' is likely to depend in some way on the answers to these questions...
You can create another thread and store a reference to it in session or application state, depending on whether the thread should run only once per website or once per user session.
You can then redirect the user to a page where they can monitor the thread's progress. You can set the page to refresh automatically, or display a refresh button to the user.
Upon completion of the thread, you can send the user an email.
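A sketch of that pattern (names are illustrative; keeping the reference in Session requires in-process session state, and the work is lost on an app-pool recycle):

```csharp
using System.Threading;
using System.Web.SessionState;

public class ReportRunner
{
    public volatile int PercentComplete;
    public volatile bool Done;

    public static ReportRunner StartForSession(HttpSessionState session)
    {
        var runner = new ReportRunner();
        session["reportRunner"] = runner;   // monitor page reads progress here
        var t = new Thread(runner.Run) { IsBackground = true };
        t.Start();
        return runner;
    }

    private void Run()
    {
        for (int step = 1; step <= 10; step++)
        {
            // ... do one slice of the long-running work ...
            PercentComplete = step * 10;
        }
        Done = true;
    }
}
```

The monitor page then casts Session["reportRunner"] back to ReportRunner and displays PercentComplete, refreshing via a meta-refresh tag or a button.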
My solution to this has been an out-of-band service that runs these queries and caches the results in the db.
When someone asks for something the first time, they get a bit of a wait and then it shows up; if they refresh, it's immediate. And because it's in the db, it becomes part of the hourly update for the next 24 hours from the last request.
Add the job, with its relevant parameters, to a job-queue table. Then write a Windows service that picks up these jobs, processes them, saves the results to an appropriate location, and emails the requester with a link to the results. It is also a nice touch to provide some sort of UI so the user can check the status of their job(s).
This way is much better than launching a separate thread or increasing the timeout, especially if your application is larger and needs to scale, as you can simply add more servers to process jobs if necessary.
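The Windows-service side of that design is essentially a polling loop; a minimal sketch (service plumbing omitted; Dequeue, Process, NotifyByEmail, and MarkFailed are placeholders for your own code):

```csharp
using System;
using System.Threading;

public class Job { public int Id; }

public class JobWorker
{
    private volatile bool _stopping;

    public void Run()
    {
        while (!_stopping)
        {
            Job job = Dequeue();          // oldest pending row, if any
            if (job == null)
            {
                Thread.Sleep(TimeSpan.FromSeconds(5));
                continue;
            }
            try
            {
                Process(job);             // the long-running work
                NotifyByEmail(job);       // send a link to the results
            }
            catch (Exception ex)
            {
                MarkFailed(job, ex);      // surface the status in the UI
            }
        }
    }

    public void Stop() { _stopping = true; }

    // Placeholders: wire these to your job table / SMTP / logging.
    private Job Dequeue() { return null; }
    private void Process(Job job) { }
    private void NotifyByEmail(Job job) { }
    private void MarkFailed(Job job, Exception ex) { }
}
```

Scaling out is then just running more instances of the service, as long as Dequeue claims rows atomically.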
