I searched threads here and couldn't really find what I wanted. I know ASP.NET Web Forms is an old technology, but I need to work with it for now. Let's say I have a method that does some heavy processing, for example a function that creates 300 PDF invoices, zips them, and downloads the archive to the user's computer.
Sample Code:
for (int i = 1; i <= 300; i++)
{
    PrintPDF(i); // each call takes roughly 30 seconds (see below)
}
Now let's say PrintPDF takes about 30 seconds per record, so it will take around 150 minutes to print all 300 PDFs. From the user's point of view, I may choose to quit partway through if I change my mind. If the user closes the browser:
Does the request that is printing the PDFs get aborted instantly when the user closes the session?
If it doesn't, what can we do to ensure that the request is aborted immediately as soon as the user closes the browser?
HTTP is stateless. That means you can never rely on getting a notification when the user closes the browser. However, you can always implement a dead man's switch, i.e. a piece of JavaScript that pings your server every ten seconds or so, so that any user who hasn't sent a ping for more than twenty seconds is treated as logged off.
As for heavy processing on the server side, that's really an unfortunate way to go. ASP.NET has a maximum time it may spend serving a request; check the executionTimeout attribute of the httpRuntime element in web.config (110 seconds by default). You can increase this value, of course, but the application pool can be recycled anyway, and if there are many of these "heavy processing" requests you can run out of available worker threads. If the site is accessible over the internet, it is also a great target for a DoS attack.
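A minimal sketch of the server side of such a switch, assuming Web Forms page methods (which require EnablePageMethods on the ScriptManager); the job id and all names here are hypothetical:

using System;
using System.Collections.Concurrent;
using System.Web.Services;

public partial class InvoicePage : System.Web.UI.Page
{
    // Last time each job's browser pinged us (jobId -> UTC timestamp).
    static readonly ConcurrentDictionary<string, DateTime> LastSeen =
        new ConcurrentDictionary<string, DateTime>();

    // Client-side JavaScript calls this page method every ~10 seconds.
    [WebMethod]
    public static void Ping(string jobId)
    {
        LastSeen[jobId] = DateTime.UtcNow;
    }

    // The long-running job checks the switch between units of work.
    static void GenerateInvoices(string jobId)
    {
        LastSeen[jobId] = DateTime.UtcNow;
        for (int i = 1; i <= 300; i++)
        {
            DateTime seen;
            if (LastSeen.TryGetValue(jobId, out seen) &&
                DateTime.UtcNow - seen > TimeSpan.FromSeconds(20))
            {
                return; // no ping for 20 seconds: treat the user as gone and stop
            }
            PrintPDF(i); // the heavy step from the question
        }
    }

    static void PrintPDF(int i) { /* heavy PDF generation elided */ }
}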
A better way is to create a queue (in a database or in the cloud) and a Windows service that processes the queue asynchronously. You can still implement the "force kill" mechanism by storing a "cancel" flag on the queue item, which the service checks periodically, stopping the work if it is set; see the sketch below.
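A sketch of what the service's processing loop might look like; the Job type and the data-access helpers are hypothetical stand-ins for your db/cloud queue:

using System;
using System.Threading;

class Job { public int Id; public int NextInvoice; public int Total; }

class QueueWorker
{
    volatile bool stopping;

    public void Run()
    {
        while (!stopping)
        {
            Job job = FetchNextJob();                 // hypothetical: next pending queue item
            if (job == null) { Thread.Sleep(5000); continue; }

            for (int i = job.NextInvoice; i <= job.Total; i++)
            {
                if (CancelRequested(job.Id))          // the "cancel" flag set by the web app
                    break;
                PrintPDF(i);
                SaveProgress(job.Id, i);              // resume point survives restarts/recycles
            }
        }
    }

    // Data-access stubs; a real service would hit the db/cloud queue here.
    Job FetchNextJob() { return null; }
    bool CancelRequested(int jobId) { return false; }
    void SaveProgress(int jobId, int lastDone) { }
    void PrintPDF(int i) { }
}

Because progress is persisted per item, an app-pool recycle or service restart loses at most one invoice's worth of work.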
Another workaround is to use WebSockets (SignalR).
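SignalR is attractive here because the server is notified when the connection drops, which maps directly onto the "abort when the browser closes" requirement. A sketch against the SignalR 2 API; the JobRegistry is a hypothetical cancellation store:

using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;

public class ProgressHub : Hub
{
    public override Task OnDisconnected(bool stopCalled)
    {
        // Browser closed or connection dropped: flag the job for cancellation.
        JobRegistry.Cancel(Context.ConnectionId);
        return base.OnDisconnected(stopCalled);
    }
}

static class JobRegistry // hypothetical cancellation registry
{
    static readonly System.Collections.Concurrent.ConcurrentDictionary<string, bool> Cancelled =
        new System.Collections.Concurrent.ConcurrentDictionary<string, bool>();
    public static void Cancel(string id) { Cancelled[id] = true; }
    public static bool IsCancelled(string id) { bool c; return Cancelled.TryGetValue(id, out c) && c; }
}

public class InvoiceJob
{
    public void Run(string connectionId, int total)
    {
        var hub = GlobalHost.ConnectionManager.GetHubContext<ProgressHub>();
        for (int i = 1; i <= total; i++)
        {
            if (JobRegistry.IsCancelled(connectionId)) return; // user's browser is gone
            PrintPDF(i);                                       // hypothetical heavy step
            hub.Clients.Client(connectionId).progress(i, total); // push progress to the browser
        }
    }

    void PrintPDF(int i) { }
}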
Related
I have a website where I need to take a bit of data from the user, make an AJAX call to a .NET web service, and then the web service does some work for about 5-10 minutes.
I naturally don't want the user to have to sit there that whole time, so I have made it an asynchronous AJAX call to the web service, and after the call has been sent, I redirect the user to a "you are done!" page.
What I want is for the web service to keep running to completion, and not abort, after it receives the information from the user.
From my testing, this is more or less what happens, but now I'm finding that this might be limited by time, i.e. if the web service runs past a certain amount of time, it aborts if the user isn't still connected.
I might be off in this assessment, but this is what I THINK is going on from my testing.
So my question is: with .NET web services, is this indeed what happens? Does the call get aborted after some time if the user isn't still on the other end? Is there any way to disable this abort?
Thanks in advance!
When you invoke a web service, it will always finish its work, even if the user leaves the page that invoked it.
Of course, web services have their own configuration, and one of the settings is a timeout.
If you're creating a WCF (SOAP) service you can set it in the contract (by changing the binding properties); if you're creating a service with Web API or MVC (REST/HTTP) then you can either add it to the config file or set it programmatically in the controller, as follows:
HttpContext.Server.ScriptTimeout = 3600; //Number of seconds
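The config-file route mentioned above would be the httpRuntime element in web.config, sketched here; note that ASP.NET only enforces executionTimeout when debug is off:

<system.web>
  <compilation debug="false" />
  <httpRuntime executionTimeout="3600" /> <!-- seconds; the default is 110 -->
</system.web>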
That timeout can be a reason for the web service to interrupt its work, but it is not related to what happens on the client side.
Have a nice day,
Alberto
Whilst I agree that the answer here is technically correct, I just wanted to post a more robust alternative approach that avoids some of the pitfalls possible with your current approach, such as:
The web server being bounced during the long-running processing of a request
The web server app pool being recycled during processing
The web server running out of threads due to too many long-running requests and not being able to process any more
I would recommend you take a thoroughly asynchronous approach and use message queues (MSMQ, for example) with a trigger on the queue that will execute the work.
The process would be:
Your page makes an AJAX call to the web service
The web service writes a message into the queue and returns right away; the message contains the details of the work to be carried out
The user continues on your site as usual, or goes home, etc.
A trigger on the queue watches for messages, and when a message arrives it activates a process which:
Reads the message
Performs the necessary work
Updates any back-end storage, etc., with the results of the work
This is much more robust because it totally decouples the web service from the long-running work and means that if the user makes a request and the web server goes down a moment later (for whatever reason), the work will still be queued up when the server comes back online.
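For illustration, a minimal sketch of the enqueue side using System.Messaging; the queue path and message body here are assumptions, not from the post:

using System.Messaging; // requires a reference to System.Messaging.dll

public static class WorkQueue
{
    const string Path = @".\private$\invoiceWork"; // hypothetical local private queue

    public static void Enqueue(string workDetails)
    {
        // Create the queue on first use; in production this is usually done at deploy time.
        if (!MessageQueue.Exists(Path))
            MessageQueue.Create(Path);

        using (var queue = new MessageQueue(Path))
        {
            queue.Send(workDetails, "invoice-batch"); // the trigger/service picks this up
        }
    }
}

The receiving side is symmetrical: a service (or an MSMQ trigger) calls Receive on the same path and carries out the work.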
You can read more about it here (MSMQ is the MS Message Queue tech; there are many others!)
Just my 2c
I am playing with the Windows Azure emulator, running an MVC website with a single controller method that calls Thread.Sleep(5000) before it returns.
On the client I run a loop that sends a POST request to the controller every 1000 ms, receives a reply from the server containing RoleEnvironment.CurrentRoleInstance.Id, and prints it on the screen.
I have 4 instances of my MVC worker role running.
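For reference, a minimal sketch of the setup being described; the controller and action names are assumptions:

using System.Threading;
using System.Web.Mvc;
using Microsoft.WindowsAzure.ServiceRuntime;

public class PingController : Controller
{
    [HttpPost]
    public ActionResult Index()
    {
        Thread.Sleep(5000); // simulate the slow controller method
        // Report which role instance served this request.
        return Content(RoleEnvironment.CurrentRoleInstance.Id);
    }
}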
I understand that the Connection: keep-alive HTTP header can keep the browser from making a request to a different instance, because an existing connection is open.
But still, even when loading my site in multiple browser windows, it keeps hanging while waiting for the Thread.Sleep(), and then (most of the time) continues to get replies from the same instance.
Why doesn't Azure's load balancer send subsequent requests to a non-busy worker role instance? Do I need to manually mark it as busy?
You mentioned using the emulator, which doesn't handle load balancing the same way as Azure's real load balancer. See this post for details about the differences. I don't know exactly what is going on in your case, but I'd suggest trying this out in Azure itself to see if you get the behavior you're expecting.
Is it possible to send an XHR request to a .NET server while it is busy with another task?
Basically, I am working on an e-commerce application that generates invoices for purchases in one batch, which may run weekly, monthly, etc. While generating the invoices the server performs a lot of calculations and database reads and writes, but the user can only wait for the whole process to finish, without knowing the progress the server has made. Per the project's requirements, the application may be generating thousands of invoices at a time, so I expect that to take a long time.
So I was wondering: is it possible, in ASP.NET with C# and jQuery, to send XHR requests to the server while it is busy generating invoices, so the client can know the progress made by the server?
The process may be like:
The user selects the criteria for invoice generation on the screen and clicks the Generate Invoice button.
The server receives the request and performs an initial read on the database to determine the number of invoices to be generated, while simultaneously starting to generate them.
The output of that read is sent to the client, where a modal popup shows a progress bar with the number of records to be processed as well as the number completed.
Since a server cannot send a response by itself without the client initiating a request, I guess the client would send an XHR every 10-20 seconds to learn the progress the server has made on the invoice generation.
But here comes the actual problem: the server may not respond to these progress requests before completing the earlier invoice-generation request, or responding might break that earlier process.
Can it be done using multiple threads? Or maybe some other .NET mechanism?
My application is ASP.NET with C#, and answers with code examples or references will be appreciated.
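One common shape for this, sketched below with hypothetical names: start the batch on a background thread keyed by a job id, record progress in shared state, and let the browser poll a lightweight endpoint via XHR. (With the caveat, raised elsewhere on this page, that work running inside the web process can be lost to an app-pool recycle, so a queue plus a Windows service is more robust.)

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public static class InvoiceJobs
{
    // jobId -> number of invoices completed so far.
    static readonly ConcurrentDictionary<Guid, int> Progress =
        new ConcurrentDictionary<Guid, int>();

    // Called by the "Generate Invoice" request; returns immediately with a job id.
    public static Guid Start(int totalInvoices)
    {
        var jobId = Guid.NewGuid();
        Progress[jobId] = 0;
        Task.Run(() =>
        {
            for (int i = 1; i <= totalInvoices; i++)
            {
                GenerateInvoice(i);  // hypothetical heavy step (reads/writes elided)
                Progress[jobId] = i; // visible to the polling endpoint
            }
        });
        return jobId;
    }

    // Called by the XHR the client sends every few seconds.
    public static int GetProgress(Guid jobId)
    {
        int done;
        return Progress.TryGetValue(jobId, out done) ? done : -1;
    }

    static void GenerateInvoice(int i) { }
}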
I have a Windows Forms application that I've recently been handed to upgrade. It makes calls to two web services (using the .NET Web References functionality). One is over SSL, the other is not.
The first web service call after you open the client takes about 12 seconds; every subsequent request takes about 0.5 seconds, regardless of which web service you call first, and every future request stays fast until you close the client.
After you open the client again, the first hit takes 12 seconds again.
I'm having a hard time searching for this because of the huge number of forum posts about the first-load delay that occurs on the server with IIS metadata. I'm familiar with that issue, and it is not what is occurring here.
Also, the database calls that the application performs have no such delay, so I'm not leaning towards a network issue.
Any thoughts?
Thanks.
A delay that long is probably I/O related, either disk (generating XML serializers) or network (DNS resolution, certificates, strong name validation, etc.). Check the resource monitor: is the CPU, disk, or network loaded? If not, it's probably a network call stuck on a timeout.
Try capturing data with Process Monitor, which will include all disk and network traffic.
If the problem looks to be network-related, then Wireshark or Fiddler might give a clearer picture.
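If the traces point at serializer generation or first-connection setup, one common mitigation is to warm things up on a background thread at startup; a sketch, where the proxy type and the cheap method called are assumptions:

using System.Threading;

static void WarmUpServices()
{
    ThreadPool.QueueUserWorkItem(_ =>
    {
        try
        {
            // Hypothetical Web Reference proxy; any cheap call forces the one-time
            // costs (XmlSerializer assembly generation, DNS lookup, SSL handshake).
            using (var svc = new MyWebReference.MyService())
            {
                svc.Ping();
            }
        }
        catch
        {
            // Warm-up only; failures here are ignored.
        }
    });
}

Alternatively, pre-generating the serializer assembly with sgen.exe at build time avoids the runtime generation cost entirely.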
I have been struggling with a situation on a site where users' clicks generate AJAX requests, e.g. with a visual effect when the response arrives. It sometimes happens that the user doesn't wait until a request is done (although he sees a loading indicator) and starts clicking other elements like crazy. Because in principle I cannot, and don't want to, disable that possibility, I have tried (more or less successfully) to ensure that any time a new request is fired, the previous one is aborted client-side and its handlers are no longer called: whichever request comes last wins.
Although somewhat successful on the client side, I'm now wondering whether there is some way to simulate similar behavior on the server. As I cannot really kill the previous request, only disconnect from it, it will still run to the end and consume valuable resources. If the user clicks on 20 elements, he will only see the result of the very last request, but there will still be 20 requests on the server wasting CPU on useless work.
Is there some way to implement a last-one-wins strategy for multiple requests to the same resource in ASP.NET / IIS? I guess the requests are queued internally anyway; what I would need is for IIS, when it dequeues the next one, to check whether there are others from the same session and only serve the very last.
I know that in classic ASP on IIS you could test Response.IsClientConnected and abort if the client had disconnected.
I believe something similar exists on most platforms.
But I do not know how it works with AJAX.
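In ASP.NET the equivalent check is Response.IsClientConnected; a sketch of using it from a long-running handler (the work loop is illustrative):

using System.Web;

public class ReportHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        for (int i = 1; i <= 300; i++)
        {
            // If the browser aborted the XHR (or was closed), stop wasting CPU.
            if (!context.Response.IsClientConnected)
                return;
            DoUnitOfWork(i); // hypothetical expensive step
        }
    }

    public bool IsReusable { get { return false; } }

    static void DoUnitOfWork(int i) { }
}

Aborting the XHR on the client (as in the last-one-wins scheme above) closes the underlying connection, which is what this check can detect.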