How can I delay an HTTP response for x seconds? - c#

I won't go into the boring details of why I need this; it's part of an internal analytics package, but my goal is to create an ASP.NET page that returns a redirect after 2 seconds.
The problem I'm seeing is that using Thread.Sleep(2000); is going to hold up one of my ASP.NET ThreadPool threads. As I understand it, this is pretty wasteful, as thread creation isn't cheap and I need this server to handle as many simultaneous connections as possible.
So, what's the best way to have HTTP GETs to my page return after at least 2 seconds (over 2 seconds is no problem, it just can't be under)?
protected void Page_Load(object sender, EventArgs e)
{
    Thread.Sleep(2000);
    Response.Redirect(RedirectUri);
}
EDIT
I should clarify: the page is actually requested as an image, so returning HTML isn't possible. It'll be used like so:
<img src="http://hostname/record.aspx"/>
The redirect to an actual image should take 2 seconds.

You can do this in the markup itself; put something like this in the page:
<head>
    <meta http-equiv="refresh" content="3; URL=otherpage.aspx">
</head>

You could implement IHttpAsyncHandler. See MSDN.
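For illustration, here is a minimal sketch of that idea (not from the original answer): on .NET 4.5+ you can derive from HttpTaskAsyncHandler, which implements IHttpAsyncHandler for you, and use Task.Delay so no request thread is blocked while the 2 seconds elapse. The handler name, the redirect target, and the handler mapping are assumptions.

// Hypothetical handler: waits ~2 seconds without tying up a ThreadPool thread,
// then redirects to the real image. Map it to a URL (e.g. /record.axd) in
// web.config so the <img> tag can point at it.
using System;
using System.Threading.Tasks;
using System.Web;

public class DelayedImageHandler : HttpTaskAsyncHandler
{
    public override async Task ProcessRequestAsync(HttpContext context)
    {
        // Task.Delay schedules a timer; the request thread returns to the pool meanwhile.
        await Task.Delay(TimeSpan.FromSeconds(2));

        // Redirect to the actual image (placeholder URL). Passing 'false' avoids
        // the ThreadAbortException that Response.Redirect(url) would raise.
        context.Response.Redirect("/images/pixel.gif", false);
    }
}

Compared with Thread.Sleep in Page_Load, the only thread cost here is the brief timer callback when the delay completes.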

Do it in JS:
setTimeout(function () {
    $.ajax({ url: './script.aspx' });
}, 2000);

There is no simple way to delay program execution without holding up a thread. You could in theory set up a delay at the other server the redirect points to, but if you just want a delay or timeout before the redirect, you'll have to pay the penalty of a waiting thread.

Related

Multiple WebClient calls from worker threads get blocked in loopback requests back to self

I have a .NET 4.5.2 ASP.NET webapp in which a chunk of code makes async webclient calls back into web pages inside the same webapp. (Yes, my webapp makes async calls back into itself.) It does this in order to screen scrape, grab the html, and hand it to a PDF generator.
I had this all working...except that it was very slow because there are about 15 labor-intensive reports that take roughly 3 seconds each, or 45 seconds in total. Since that is so slow I attempted to generate all these concurrently in parallel, and that's when things hit the fan.
What is happening is that my aspx reports (the ones hit by WebClient) never make it past the class constructor until they time out; Page_Load and the other page events aren't reached until then. The report generation (and the WebClient calls) are triggered when the user clicks Save in the webapp, and a bunch of stuff happens, including this async page-generation activity. The webapp requires Windows authentication, which I'm handling fine.
So when the multithreaded stuff kicks off, a bunch of WebClient requests are made, and they all get stuck in the reports' class constructor for a few minutes and then time out. During/after the timeout, session data is cleared, and when that happens, the reports cannot get their data.
Here is the multithreaded code:
Parallel.ForEach(folders, (folderPath) =>
{
    ...
    string html = getReportHTML(fullReportURL, aspNetSessionID);
    // hand html to the PDF generator here...
    ...
});

private string getReportHTML(string url, string aspNetSessionID)
{
    using (WebClient webClient = new WebClient())
    {
        webClient.UseDefaultCredentials = true;
        webClient.Headers.Add(HttpRequestHeader.Cookie, "ASP.NET_SessionId=" + aspNetSessionID);
        string fullReportURL = url;
        byte[] reportBytes = webClient.DownloadData(fullReportURL);
        if (reportBytes != null && reportBytes.Length > 0)
        {
            string html = Encoding.ASCII.GetString(reportBytes);
            return html;
        }
    }
    return string.Empty;
}
Important points:
Notice I have to include the ASP.NET session cookie, or the web call doesn't work.
webClient.UseDefaultCredentials = true is required for the winauth.
The fragile session state and architecture is not changeable in the short term - it's an old and massive webapp and I am stuck with it. The reports are complex and rely heavily on session state (and before session state is populated, many DB lookups and calculations occur).
Even though I'm calling reports from my webapp back into the same webapp, I must use an absolute URL - relative URLs throw errors.
When I extract the code samples above into a separate .net console app, it works well, and doesn't get stuck in the constructor. Because of this, the issue must lie (at least in part) in the fact that my web app is making async calls back to itself. I don't know how to avoid doing this. I even flirted with Server.Execute() which really blows up inside worker threads.
The reports cannot be generated in a windows service or some other process - it must be linked to the webapp's save event.
There's a lot going on here, but I think the most fundamental question/problem is that these concurrent webclient calls hit the ASPX pages and get stuck in the constructor, going no further into page events. And after about 2 minutes, all those threads flood down into the page events, where failures occur because the main webapp's session state is no longer active.
Chicken or egg: I don't know whether the threads unblock and eventually hit page events because the session state was cleared, or the other way around. Or maybe there is no connection.
Any ideas?

Problems getting newly created thread to send outputs to asp:panel in ASP.NET C#

I'm creating a file processor for use in an intranet.
I described it in another question - ERR_EMPTY_RESPONSE when processing a large number of files in ASP.Net using C#
Now, as suggested in the answer to the question above, I'm trying to use threads to execute the file-processing task.
But there is a problem. I need the newly created thread to write feedback to a component on the page (an asp:panel, a div, or whatever). That feedback would be the results of several database operations.
The application reads those txt files, interprets each line, and inserts the data into the database. Each line inserted into the database must produce a feedback message, like "registry 'regname' inserted successfully" or "I got problems inserting registry 'regname' in file 'filename', skipping to next registry".
I did test with something very simple:
protected void DoImport()
{
    try
    {
        MainBody.Style.Add(HtmlTextWriterStyle.Cursor, "wait");

        int x = 0;
        while (x < 10000)
        {
            ReturnMessage(String.Format("Number {0}<hr />", x), ref pnlConfirms);
            x++;
        }
    }
    catch (Exception ex)
    {
        ReturnMessage(String.Format("<font style='color:red;'><b>FATAL ERROR DURING DATA IMPORT</b></font><br /><br /><font style='color:black;'><b>Message:</b></font><font style='color:orange;'> {0}</font><br />{1}", ex.Message, ex.StackTrace), ref pnlErrors);
    }
    finally
    {
        MainBody.Style.Add(HtmlTextWriterStyle.Cursor, "default");
    }
}
This function is called from Page_Load, and fills an asp:panel called "pnlConfirms" with a row of numbers, but all at once, on load.
I changed it to:
protected void DoImport()
{
    try
    {
        MainBody.Style.Add(HtmlTextWriterStyle.Cursor, "wait");
        ThreadPool.QueueUserWorkItem(new WaitCallback(DoWork));
    }
    catch (Exception ex)
    {
        ReturnMessage(String.Format("<font style='color:red;'><b>FATAL ERROR DURING DATA IMPORT</b></font><br /><br /><font style='color:black;'><b>Message:</b></font><font style='color:orange;'> {0}</font><br />{1}", ex.Message, ex.StackTrace), ref pnlErrors);
    }
    finally
    {
        MainBody.Style.Add(HtmlTextWriterStyle.Cursor, "default");
    }
}

private void DoWork(Object stateInfo)
{
    int x = 0;
    while (x < 10000)
    {
        ReturnMessage(String.Format("Number {0}<hr />", x), ref pnlConfirms);
        x++;
    }
}
And both use this function:
public void ReturnMessage(string message, ref Panel panel, bool reset = false)
{
    if (reset)
    {
        panel.Controls.Clear();
    }

    Label msg = new Label();
    msg.Attributes.Add("width", "100%");
    msg.Text = message;
    panel.Controls.Add(msg);
}
I need ThreadPool.QueueUserWorkItem(new WaitCallback(DoWork)); to fill those asp:panels with feedback - like insertion errors and warnings.
My code already produces that feedback inside try...catch statements, but it doesn't get output to any asp:panel from the ThreadPool (it works when invoked directly from the DoImport() function, as in the first example I posted).
I'm doing something very wrong, but I can't find out what (and I've been researching this for almost 2 weeks). Please help!
In ASP.NET, when a browser requests a page, that page is rendered and sent to the browser as soon as its processing finishes, so the browser will show the page as it's finally rendered.
According to your code, you're trying to render a page, show a wait cursor, and expect it to be shown in the browser before the cursor is changed back to the default one. As explained above, regardless of whether you use additional threads, the page won't be sent to the browser until it's completely rendered, so you'll never see the wait cursor on the client side.
The easiest way to get what you're trying to do is to use web services (traditional .asmx or WCF) and AJAX (jQuery or ASP.NET AJAX):
1) create a web service that does the processing
2) create a page which is sent to the browser and, using JavaScript (jQuery or ASP.NET AJAX), makes a call to the web service and shows something to let the user know the request is being processed (a wait cursor, or better yet an animated gif)
3) when the process finishes, your JavaScript gets the response from the web service and you can update the page to let the user know the process has finished
If you don't have much JavaScript experience, you can do most of this using:
ScriptManager, which can be used to create a JavaScript web service proxy for the client side and is required for the rest of the controls
some JavaScript (or jQuery) to update the "process running / process finished" hints on the client side, i.e. when the call to the web service ends, you can use JavaScript to update the page via the DOM, or load a new page, or the same page with a special parameter to show the result of the process
In this way you can do what you want:
1) show a page in a state that shows the process is running
2) show the same, or other page, in a state that shows the end of the process
The trick is communication between the browser and the server, and this can only be done using one of the available AJAX techniques.
Another typical technique is using jQuery.ajax, as explained on encosia.com
According to the OP's message, processing all the files at once would be so slow that it would time out the web service call. If this is the case, you can use this solution:
1) Create a web service that processes one (or a batch) of the pending files, and returns at least the number of pending files when it finishes processing the current file (or batch).
2) From the client side (JavaScript), call the web service. When it finishes, update the page showing the number of pending files, and, if this number is greater than zero, call the web service again.
3) When the call to the web service returns 0 pending files, update the page to show the work is finished and stop calling it.
If you process all the files at once, there will be no feedback on the client side, and there will also be a timeout. Besides, IIS can decide to stop the worker thread that is doing the work; IIS does this for several reasons.
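As a rough illustration of that batch-polling idea (this is not the answerer's code; the service name, folder path and ProcessSingleFile helper are placeholders), the server side could look something like this:

// Hypothetical .asmx service: each call processes one pending file and returns
// how many files are still waiting, so the client keeps polling until it gets 0.
using System.IO;
using System.Linq;
using System.Web.Services;

[System.Web.Script.Services.ScriptService]   // lets the client-side proxy call it
public class ImportService : WebService
{
    private const string PendingFolder = @"C:\Import\Pending";  // assumed location of the txt files

    [WebMethod]
    public int ProcessNextFile()
    {
        string next = Directory.EnumerateFiles(PendingFolder).FirstOrDefault();
        if (next != null)
        {
            ProcessSingleFile(next);   // placeholder for the existing per-line import logic
            File.Delete(next);         // or move it to a "processed" folder
        }
        return Directory.EnumerateFiles(PendingFolder).Count();
    }

    private void ProcessSingleFile(string path)
    {
        // read the file, insert each line into the database, collect feedback...
    }
}

The page-side script then calls ProcessNextFile repeatedly and updates the panel with the returned count until it reaches zero.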
A more reliable solution, but harder to implement, is:
1) implement a Windows Service, that does the file processing
2) implement a web service that returns the number of pending files (you can communicate the Windows Service and Web App indirectly using the file system, a database table or something like that)
3) use a timer (ajax timer, or javascript setInterval) from your web page to poll the server every N seconds using the web service, until the number of pending files is 0.
An even harder way to do this is hosting a WCF service in your Windows Service, instead of the indirect communication between your web app and Windows Service. This case is much more complicated because you need to use threads to do the work and also serve the calls to the WCF service. If you can use indirect communication it's much easier to implement. The database table is a simple and effective solution: your worker process updates a row in a table whenever it processes a file, and the web service reads the progress state from this table.
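To make the indirect communication concrete, a small sketch of step 2 of that Windows Service approach might look like this (the table, column and connection-string names are invented; the Windows Service would be the only writer of the progress row):

// Hypothetical web method: it only reads the progress row that the Windows
// Service keeps up to date, so the web app never does the heavy work itself.
using System.Configuration;
using System.Data.SqlClient;
using System.Web.Services;

[System.Web.Script.Services.ScriptService]
public class ImportProgressService : WebService
{
    [WebMethod]
    public int GetPendingFileCount()
    {
        string cs = ConfigurationManager.ConnectionStrings["Main"].ConnectionString; // assumed name
        using (var conn = new SqlConnection(cs))
        using (var cmd = new SqlCommand("SELECT PendingFiles FROM ImportProgress", conn))
        {
            conn.Open();
            return (int)cmd.ExecuteScalar();
        }
    }
}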
There are many different solutions for a not-so-simple problem.
You are starting a new thread (or more precisely, running your code on one of the free threads in the thread pool) and not waiting for the result on the main thread. Something like Thread.Join (if you were creating threads manually) or another synchronization mechanism such as events needs to be used if you want to go this route.
The question you've linked to suggests using asynchronous pages, which you are not doing. You would start processing the request, kick off the task and release the thread; when the task is finished, you complete the request.
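For reference, a minimal sketch of that asynchronous-page pattern (assuming Async="true" in the @Page directive; ImportFilesAsync is a placeholder for the actual work, not something from the question):

// Hypothetical: register the long-running work as a PageAsyncTask so the request
// thread is released while it runs, and the response completes when the task does.
protected void Page_Load(object sender, EventArgs e)
{
    RegisterAsyncTask(new PageAsyncTask(async cancellationToken =>
    {
        await ImportFilesAsync(cancellationToken);   // placeholder for the file-processing work
    }));
}

Note that this keeps the request open until the work finishes; it saves a thread but does not by itself stream incremental feedback into the panel.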
Side note: consider simply doing all the conversion on the main thread that handles the request. Unless slow I/O dominates the task, moving CPU work from one thread to another may not produce significant gains. Please measure the performance of your current solution and confirm that it does not meet the performance goals you have set for your application (this does not apply if you're doing it for fun/educational purposes).

Measure page load time?

I saw another method of measuring that uses trace, but I wonder whether this method measures correctly...
I overrode each of the following events:
PreInit
Init
InitComplete
PreLoad
Load
LoadComplete
PreRender
PreRenderComplete
SaveStateComplete
Unload
Then I store the time at which each handler executes using DateTime.Now.Ticks...
At the end of Unload I print out each of their execution times....
So the total time above should be the time the server spent generating the page?
I'm asking because I noticed a page took about 879 ms in total by this measure, but it takes a few more seconds before my browser actually shows the page.
Are those few extra seconds the time it takes to download the page from the server?
Thanks in advance.
In global.asax:

using System;
using System.Diagnostics;
using System.Web;

namespace aaaaa
{
    public class Global : System.Web.HttpApplication
    {
        // Each HttpApplication instance handles one request at a time,
        // so an instance field is safe here.
        private Stopwatch sw = null;

        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            sw = Stopwatch.StartNew();
        }

        protected void Application_EndRequest(object sender, EventArgs e)
        {
            if (sw != null)
            {
                sw.Stop();
                Response.Write("took " + sw.Elapsed.TotalSeconds.ToString("0.#######") + " seconds to generate this page");
            }
        }
    }
}
Yes, there's time for the code to run and the time for the browser to get the response from the server and output it to the screen. You can measure the front-end work using a variety of measuring sites:
Pingdom Tools
WebWait
Web page Test
To determine the timing for the processing of the code, I would use the Stopwatch class instead of DateTime. DateTime has more decimal places of precision but is less accurate for short intervals; Stopwatch is designed exactly for this and is the better choice for timing. Avoid calling it a lot though, as that itself will add overhead to the page processing. I would create a new Stopwatch, call Start at the very beginning, call Stop at the very end, and then output the elapsed time.
Stopwatch class
Precise Run Time Measurements with Stopwatch
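As an illustration (not from the answer itself; the class name and choice of events are just one option), the Stopwatch could bracket the page lifecycle like this:

// Minimal sketch: start the Stopwatch as early as possible in the page lifecycle,
// stop it as late as possible, and write the elapsed time to the trace output once.
using System;
using System.Diagnostics;

public partial class TimedPage : System.Web.UI.Page
{
    private readonly Stopwatch pageTimer = new Stopwatch();

    protected override void OnPreInit(EventArgs e)
    {
        pageTimer.Start();      // earliest page event we control
        base.OnPreInit(e);
    }

    protected override void OnUnload(EventArgs e)
    {
        base.OnUnload(e);
        pageTimer.Stop();       // latest page event we control
        Trace.Write(string.Format("Page processing took {0} ms", pageTimer.ElapsedMilliseconds));
    }
}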
If you are just looking for overall time, why not just look at the time-taken value in your IIS logs?
The extra time in the browser could be a lot of things: fetching images, CSS and JavaScript files, JavaScript running in the page, and the client rendering the HTML itself. If you want to get a better feel for what is actually happening and when, fire up Fiddler, reload your page, and look at what happened in Fiddler.

What is the best approach to handle session timeouts in asp.net

There are various ways to handle session timeouts, like "meta refreshes", JavaScript onload functions, etc.
I would like something neat like: 5 minutes before timeout, warn the user...
I am also contemplating keeping the session open for as long as the browser is open (still need to figure out how to do that though... probably some iframe with refreshing).
How do you handle session timeouts, and what direction do you think I should go in?
The best approach to handling session timeouts depends on the case.
I'd say there are two basic cases.
One is when the user enters little or no data, and mostly just reads reports or clicks around. In this case there is no easy way to warn him that the session is about to expire: if you check the time left in the session by calling code-behind, you automatically renew the session; and if you count down the session with a JavaScript timer, the user may have opened a new tab of your site, so the session will expire at a different time than the one you noted and the user gets the wrong message.
So for me, when the user enters little or no data, just let the session expire; if he loses one click, he'll do it again later.
The second case is when the user needs to enter a lot of data, which can take time (a long text, for example, to write and revise). In this case I use the technique below and don't let the session expire.
How to keep the session open as long as the browser is open:
Here is a simple technique: I use an image that I reload with JavaScript before the session times out.
<img id="keepAliveIMG" width="1" height="1" src="/img/ui/spacer.gif?" />
<script type="text/javascript">
    var myImg = document.getElementById("keepAliveIMG");
    if (myImg) {
        // Reload the 1x1 image every 6 seconds with a fresh query string,
        // which keeps the ASP.NET session alive for as long as the page is open.
        window.setInterval(function () {
            myImg.src = myImg.src.replace(/\?.*$/, '?' + Math.random());
        }, 6000);
    }
</script>
There is a third case: you only care whether the session has expired on postback. The user has entered some data, and on postback the application redirects him to the login page and the post is lost.
In this third case you can capture the post data and save it until the user logs in again. You capture the post data in global.asax, in:
protected void Application_AuthenticateRequest(Object sender, EventArgs e)
This is the function that gets called before the redirect to the login page. There you check whether you have post data and the user is required to log in; if so, you save that post data, either on a new redirect page or on the server (maybe in session, maybe in a temporary database table).
After the user logs in again, you redirect him back to the last page with the saved post data, and he continues where he left off.
The only trick here is to make an intermediate page that renders a form with the last posted data and submits it automatically with a JavaScript call.
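A rough sketch of that capture step (not the answerer's code; the SavePostData helper and the exact checks are assumptions) could look like this in global.asax:

// Hypothetical: before forms authentication redirects an expired user to the
// login page, stash any posted form data so it can be replayed after re-login.
protected void Application_AuthenticateRequest(object sender, EventArgs e)
{
    HttpApplication app = (HttpApplication)sender;
    HttpRequest request = app.Context.Request;

    if (!request.IsAuthenticated &&
        request.HttpMethod == "POST" &&
        request.Form.Count > 0)
    {
        // Key the saved data to something stable (e.g. a cookie value) so it can
        // be found again after the user logs back in.
        SavePostData(request.RawUrl, request.Form);   // placeholder helper
    }
}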
The only thing I can think of is to generate some script on the page that creates a client-side timer, so that once the page is received and rendered, it can show an alert X minutes later (that is, 5 minutes before expiry).
If you'd rather have the session just keep itself alive, you can do this with a generic handler (ASHX) that you periodically call via AJAX. This will help refresh the session and it should stay alive for as long as the AJAX calls continue.
Example "keepalive.ASHX":
<%@ WebHandler Language="C#" Class="keepalive" %>

using System;

public class keepalive : System.Web.IHttpHandler
{
    public void ProcessRequest(System.Web.HttpContext context)
    {
        context.Response.ContentType = "text/json";
        var thisUser = System.Web.Security.Membership.GetUser();
        if (thisUser != null)
            context.Response.Write("[{\"User\": \"" + thisUser.UserName + "\"}]");
    }

    public bool IsReusable
    {
        get { return false; }
    }
}
And here's the script on the page to call it (with jQuery for simplicity):
<script type='text/javascript'>
    function keepAliveInterval()
    {
        $.ajax(
        {
            url: "keepalive.ashx",
            context: document.body,
            error: function () {
                alert("AJAX keepalive.ashx error :(");
            }
        });
    }

    $(document).ready(function () {
        window.setInterval(keepAliveInterval, 60000);
    });
</script>
Use some jQuery that keys off your session timeout value in web.config. You can use a jQuery delay trick so that when a specific time is reached (x minutes after the page loads), it pops up a div stating the session will time out in x minutes. Nice, clean and pretty simple.
Regarding session timeout, Codesleuth's ajax call would be perfect.

Cannot redirect after HTTP headers have been sent

When I try to redirect to another page through Response.Redirect(URL) I am getting the following error: System.Web.HttpException: Cannot redirect after HTTP headers have been sent.
I call Response.Write("Sometext"); and Response.Flush() before calling the Redirect method.
In this case how do we use Response.Redirect(URL)?
I'm executing a stored procedure through an async call. The SP takes almost 3 minutes to execute, and by that time I get a load-balancer timeout error from the server because this application runs on a cloud host. To avoid the load-balancer timeout I'm writing some text to the browser (Response.Write() and Response.Flush()).
You need to ensure that you do not write/flush anything before trying to send an HTTP header.
After sending headers there is no proper way to do a redirect: the only things you can do are outputting JavaScript to do the redirect (bad) or sending a 'meta refresh/location' tag, which will most likely not be at the correct position (inside HEAD) and thus result in invalid HTML.
I had the same error and the same approach. You might want to try using JavaScript instead of calling Response.Redirect directly:
Response.Write("<script type='text/javascript'>");
Response.Write("window.location = '" + url + "';</script>");
Response.Flush();
It worked fine for me; however, I still need to check it in different browsers.
if (!Response.IsRequestBeingRedirected)
    Response.Redirect("~/RMSPlusErrorPage.aspx?ErrorID=" + 100, false);
You can't use Response.Redirect because you've already gone past the headers and written out "Sometext". You have to check the redirect condition before you start writing data to the client, or make a META redirect.
If you want one of those pages that shows some text and redirects after 5 seconds, META is your option.
You won't get this error if you redirect before the rendering of your page begins (for example, when you redirect from the Load or PreRender events of the page).
I see now in your comments, that you would like to redirect after a long-running stored procedure completes. You might have to use a different approach in this case.
You could, for example, put an AJAX UpdatePanel with a Timer on your page; the Timer could check every few seconds whether the stored procedure has completed, and then do the redirection.
This approach also has the advantage, that you can put some "in progress" message on the page, while the procedure is running, so your user would know, that things are still happening.
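As a sketch of that Timer idea (the control names and the completion check are placeholders, not from the answer):

// Hypothetical Tick handler inside an UpdatePanel: each partial postback checks
// whether the long-running stored procedure has finished and only then redirects,
// before any headers for that partial response have been flushed.
protected void ProgressTimer_Tick(object sender, EventArgs e)
{
    if (IsStoredProcedureDone())              // placeholder status check
    {
        ProgressTimer.Enabled = false;        // stop polling
        Response.Redirect("~/Results.aspx", false);
    }
}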
Try to do the following:
try
{
    // ... the code that eventually calls Response.Redirect ...
}
catch (System.Threading.ThreadAbortException)
{
    // To handle the HTTP exception "Cannot redirect after HTTP headers have been sent".
}
catch (Exception e)
{
    // Here you can put your context.Response.Redirect("page.aspx");
}
In my case, the cause of the problem was that loading data into a scrolling GridView took a long time, and I pressed the redirect button before the GridView data had finished loading; that's when I got this error.
You can either fetch less data,
or
prevent the redirect button from being pressed before loading completes.
