ASP.NET Framework and blocked threads - C#

Can anyone explain the following behaviour of an ASP.NET Web Forms page on IIS? When I have a page and request it 4 times, the pages are served sequentially. In other words, if the page takes about 10 seconds to build, I see the first page appear at around 10 seconds, the 2nd at 20 seconds, the 3rd at 30 seconds, and the 4th at 40 seconds. When I display the thread number I usually see only two distinct thread numbers, which makes me think IIS/ASP.NET only has two threads per application pool?
Small example:
protected void Page_Load(object sender, EventArgs e)
{
    System.Threading.Thread.Sleep(10000);
    this.Label1.Text = System.Threading.Thread.CurrentThread.ManagedThreadId.ToString();
}
When I look for suggestions on the internet, I find that I should program asynchronously and, for example, use await Task.Delay so as not to block the thread, but I still get the same result when I use this Page_Load:
protected async void Page_Load(object sender, EventArgs e)
{
    // Note: async page code generally requires Async="true" in the @ Page directive.
    await System.Threading.Tasks.Task.Delay(10000);
    this.Label1.Text = System.Threading.Thread.CurrentThread.ManagedThreadId.ToString();
}
Can anyone explain this behaviour to me? First of all, if there are two threads I would expect two pages to be served at 10 seconds and the 3rd and 4th at 20 seconds. But even when I run the async version it doesn't make much of a difference, if any. How can I avoid this blocking of threads with an easy solution?
Best regards,
Rémy Samulski
EDIT: I was calling the website from the same browser with the same URL. This caused the issue. Thanks to Richard for helping me figure out my mistake.
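For reference, the pattern generally recommended for asynchronous work in Web Forms is RegisterAsyncTask rather than an async void Page_Load. A minimal sketch, assuming .NET 4.5 and Async="true" in the @ Page directive (it does not address the browser-side serialization described in the edit above):
protected void Page_Load(object sender, EventArgs e)
{
    // Requires <%@ Page Async="true" ... %>; the awaited task runs as part of the page
    // lifecycle and releases the request thread while it waits.
    RegisterAsyncTask(new System.Web.UI.PageAsyncTask(async () =>
    {
        await System.Threading.Tasks.Task.Delay(10000); // stand-in for real asynchronous I/O
        this.Label1.Text = System.Threading.Thread.CurrentThread.ManagedThreadId.ToString();
    }));
}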

Related

Speech Synthesizer "SpeakAsyncCancelAll" runs in user interface thread

I have a Windows Forms application that I am trying to add accessibility to, and I have run into an issue with the speech synthesizer where it appears that SpeakAsyncCancelAll runs on the user interface thread. Performance is totally dependent on the power of the PC.
This can be reproduced with a very simple Windows Forms application.
Create a form, add a NumericUpDown control, and use this code:
using System;
using System.Speech;
using System.Speech.Synthesis;
using System.Windows.Forms;

namespace WindowsFormsApp8
{
    public partial class Form1 : Form
    {
        SpeechSynthesizer _speech = new SpeechSynthesizer();

        public Form1()
        {
            InitializeComponent();
        }

        private void numericUpDown1_ValueChanged(object sender, EventArgs e)
        {
            // Cancel whatever is still queued, then speak the new value.
            _speech.SpeakAsyncCancelAll();
            _speech.SpeakAsync(numericUpDown1.Value.ToString());
        }
    }
}
On my development machine, which is very powerful, it runs quickly and without a problem when you hold down the up arrow. Each value is cancelled, so you hear nothing while the control increments, and when you release the up arrow it announces the last value properly.
However, the minute this runs on a lesser PC, even a six-core Core i9 machine, the increment repeat rate slows to a crawl.
It looks to me as if this is running on the user interface thread.
Any suggestions?
Thanks
Don't be misled by the "Async" in the name of the SpeakAsyncCancelAll() method. As you can see in the source code of the SpeechSynthesizer and VoiceSynthesis classes, quite a bit of synchronous code is involved in communicating with the background thread that does the actual voice synthesis. That code is actually quite heavy in that it takes multiple locks.
A best-practice solution for this situation (a burst of successive user interactions would each trigger a reaction, but in the end we only want the last one) is not to start the reaction directly, but to start a timer and only perform the reaction if there has been no further user interaction in the meantime.
public partial class Form1 : Form
{
    private SpeechSynthesizer _speech = new SpeechSynthesizer();

    public Form1()
    {
        InitializeComponent();
        // timer1 is assumed to be a System.Windows.Forms.Timer added in the designer,
        // with its Tick event wired to timer1_Tick.
        timer1.Interval = 500;
    }

    private void numericUpDown1_ValueChanged(object sender, EventArgs e)
    {
        // Reset the timer on every change so only the last value is spoken.
        timer1.Stop();
        timer1.Start();
    }

    private void timer1_Tick(object sender, EventArgs e)
    {
        timer1.Stop();
        _speech.SpeakAsyncCancelAll();
        _speech.SpeakAsync(numericUpDown1.Value.ToString());
    }
}
You should allow the user to configure the timer interval so they can choose a good compromise based on their system performance and their individual usage patterns. People who need audio assistance often consider, for good reason, too long a delay between user activity and the audio response to be a waste of their time. So it is important that users can configure this delay to best fit their individual needs.
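One possible sketch of that, assuming a user-scoped setting named SpeechDelayMs has been added to Properties.Settings (the setting name and the options dialog are illustrative, not part of the original code):
public Form1()
{
    InitializeComponent();
    // Load the debounce delay from a user setting so each person can tune it.
    timer1.Interval = Math.Max(100, Properties.Settings.Default.SpeechDelayMs);
}

// Called from an options dialog (assumed) when the user picks a new delay.
private void SaveSpeechDelay(int milliseconds)
{
    Properties.Settings.Default.SpeechDelayMs = milliseconds;
    Properties.Settings.Default.Save();
    timer1.Interval = Math.Max(100, milliseconds);
}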
Let's assume you have taken Neil's excellent comment into consideration, and checked the repeat rate of the NumericUpDown control on the other PCs "without" calling the speech engine. Good.
Your code looks right. SpeakAsyncCancelAll and SpeakAsync do not block and are "expected" to run on a background thread. When I attempted to reproduce the problem, your code (not a shocker) worked fine on my PC under the test conditions you describe. That being the case, maybe you could try a couple of variations, on the slim chance that something makes a difference or yields some kind of clue by ruling out some unlikely issues.
Variation 1
Capture the "text to say" and post the work using BeginInvoke. This ensures that nothing can interfere with the ValueChanged or MouseDown messages pumping through the message queue.
private void numericUpDown1_ValueChanged(object sender, EventArgs e)
{
    // Make 100% sure that the up-down ctrl is decoupled from speak call.
    var say = $"{numericUpDown1.Value}";
    // Ruling out an unlikely problem
    BeginInvoke((MethodInvoker)delegate
    {
        _speech.SpeakAsyncCancelAll();
        _speech.SpeakAsync(say);
    });
}
Variation 2
Since you have a suspicion that something is running on the UI thread that shouldn't be, go ahead and give explicit instructions to post it on a background Task. At least we can rule that out.
private void numericUpDown2_ValueChanged(object sender, EventArgs e)
{
    // Make 100% sure that the up-down ctrl is decoupled from speak call.
    var say = $"{numericUpDown2.Value}";
    // Ruling out an unlikely problem
    Task.Run(() =>
    {
        _speech.SpeakAsyncCancelAll();
        _speech.SpeakAsync(say);
    });
}
Variation 3 - Inspired by NineBerry's answer (added to test code project repo)
/// <summary>
/// Watchdog timer inspired by NineBerry.
/// https://stackoverflow.com/a/74975629/5438626
/// Please accept THAT answer if this solves your issue.
/// </summary>
int _changeCount = 0;

private void numericUpDown3_ValueChanged(object sender, EventArgs e)
{
    var captureCount = ++_changeCount;
    var say = $"{numericUpDown3.Value}";
    Task
        .Delay(TimeSpan.FromMilliseconds(250))
        .GetAwaiter()
        .OnCompleted(() =>
        {
            if (captureCount.Equals(_changeCount))
            {
                Debug.WriteLine(say);
                _speech.SpeakAsyncCancelAll();
                _speech.SpeakAsync(say);
            }
        });
}
Well, the above answers did not solve the issue. However, all the tested computers were Dell computers. By default, when the OS is installed, Dell includes a sound utility called MaxWaves that allows various audio enhancements. Although every option in this utility was turned off, it appears to buffer the sound, and when a SpeakAsyncCancelAll() call arrives it blocks until the sound has finished playing. Therefore everything appears to slow to a crawl.
Uninstalling this utility as well as disabling it as a service corrects the problem.
Everything now works correctly. Thank you for your answers.

ASP.NET global.asax Timers randomly stop working

Given the code:
protected void Application_Start(object sender, EventArgs e)
{
    // System.Threading.Timer: due immediately, then fires every second.
    var testTimer = new Timer(
        LogTimer,
        null,
        new TimeSpan(0, 0, 0, 0),
        new TimeSpan(0, 0, 0, 1)
    );
}

public static void LogTimer(object sender)
{
    "Hello".Log();
}
On seemingly random occasions the timer stops firing, and won't start again unless I restart the website.
It doesn't throw any exceptions, but looking in the Windows error log there are some entries:
The Open Procedure for service "Lsa" in DLL "C:\Windows\System32\Secur32.dll" failed. Performance data for this service will not be available. The first four bytes (DWORD) of the Data section contains the error code.
Unable to open the Server service performance object. The first four bytes (DWORD) of the Data section contains the status code.
The site is active (the start mode of the app pool is AlwaysRunning).
I understand that using timers this way is not a recommended approach for critical tasks, for exactly this reason, but I am failing to come up with an explanation as to why it silently, and apparently randomly, just gives up.
From your code, I would expect the garbage collector to collect your timer, since no reference to it is kept. Have you tried something like:
static Timer testTimer;

protected void Application_Start(object sender, EventArgs e)
{
    testTimer = new Timer(...);
}
ASP.NET isn't suited to running timers due to the way AppDomains get unloaded, the threading model and many other factors.
I suggest you read this blog post from Scott Hanselman that discusses various ways to successfully run timer-based code in ASP.NET web applications.
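One commonly suggested in-process mitigation is to register the background work with ASP.NET so it is signalled before the AppDomain unloads. A minimal sketch, assuming .NET 4.5.2+ for HostingEnvironment.QueueBackgroundWorkItem; it still will not survive application pool recycles, and "Hello".Log() is the same extension method as in the question:
protected void Application_Start(object sender, EventArgs e)
{
    System.Web.Hosting.HostingEnvironment.QueueBackgroundWorkItem(async cancellationToken =>
    {
        // ASP.NET cancels the token before unloading the AppDomain, so the loop can stop cleanly.
        while (!cancellationToken.IsCancellationRequested)
        {
            "Hello".Log();
            try
            {
                await System.Threading.Tasks.Task.Delay(TimeSpan.FromSeconds(1), cancellationToken);
            }
            catch (System.OperationCanceledException)
            {
                break;
            }
        }
    });
}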

Custom WebPart Timing out after 5 minutes exactly

I have a SharePoint 2010 web part that times out after exactly 5 minutes on the QA servers. The QA environment is 1 web server, 2 app servers, and a SQL server. I've tried EVERYTHING to fix this: changed the IIS timeouts and worker process failure settings, disabled ping checks, and nothing works. It's the simplest code in the world!!
Could someone please try this on one of your environments? Preferably one with multiple servers making up the farm. This code works perfectly on our dev server, which is just one box. All I am doing is calling SPLongOperation, sleeping for 6 minutes, and ending the long operation.
Anyone have a clue what I'm missing?
Code:
protected void btnTest_Click(object sender, EventArgs e)
{
    string currentUrl = SPContext.Current.Web.Url;
    using (SPLongOperation operation = new SPLongOperation(this.Page))
    {
        operation.LeadingHTML = "Updating Reports on Selected Project Sites";
        operation.TrailingHTML = "Gathering the worker threads, this could take a second. Please stand by...";
        operation.Begin();
        Thread.Sleep(360000);
        operation.End(currentUrl, SPRedirectFlags.Default, this.Context, "");
    }
    SetStatus("Slept for a long time!", false);
}
Interestingly, if I run the operation below, which doesn't navigate to the long-operation page, it makes me re-authenticate and then redirects me to the "Internet Explorer cannot display the webpage" screen. Very odd.
protected void btnTest2_Click(object sender, EventArgs e)
{
    SetStatus("Before Sleep", false);
    Thread.Sleep(360000);
    SetStatus("Sleep Successful!", false);
}

How can I delay an HTTP response for x seconds?

I won't go into the boring details of why I need this (it's part of an internal analytics package), but my goal is to create an ASP.NET page that returns a redirect after 2 seconds.
The problem I'm seeing is that using Thread.Sleep(2000); holds up one of my ASP.NET thread pool threads. As I understand it, this is pretty wasteful, since thread creation isn't cheap and I need this server to handle as many simultaneous connections as possible.
So, what's the best way to have HTTP GETs to my page return after at least 2 seconds (over 2 seconds is no problem, it just can't be under)?
protected void Page_Load(object sender, EventArgs e)
{
    Thread.Sleep(2000);
    Response.Redirect(RedirectUri);
}
EDIT
I should clarify: the page is actually requested as an image, so returning HTML isn't possible. It will be used like so:
<img src="http://hostname/record.aspx"/>
The redirect to an actual image should take 2 seconds.
You can do this in the markup itself; you can put something like:
<head>
    <meta http-equiv="refresh" content="3; URL=otherpage.aspx">
</head>
You could implement IHttpAsyncHandler. See MSDN.
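A rough sketch of that idea: on .NET 4.5+ the HttpTaskAsyncHandler base class supplies the IHttpAsyncHandler plumbing for you. The class name and the pixel path below are placeholders, not something from the question:
using System.Threading.Tasks;
using System.Web;

public class RecordHandler : HttpTaskAsyncHandler
{
    public override async Task ProcessRequestAsync(HttpContext context)
    {
        // Waits without holding a thread pool thread, then redirects to the real image.
        await Task.Delay(2000);
        context.Response.Redirect("~/images/pixel.gif", endResponse: false);
    }
}
The handler would still need to be mapped to a URL (for example record.ashx) in the handlers section of web.config.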
Do it in JS:
setTimeout(function () {
    $.ajax({ url: './script.aspx' });
}, 2000);
There is no simple way to delay program execution without holding up a thread. You could in theory set up a delay at the other server the redirect points to, but if you are just trying to cause a delay or timeout prior to the redirect, you'll have to pay the penalty of a waiting thread.

Measure page load time?

I saw another method of measuring that uses tracing, but I wonder whether my method measures correctly...
I overrode each of the following page lifecycle events:
PreInit
Init
InitComplete
PreLoad
Load
LoadComplete
PreRender
PreRenderComplete
SaveStateComplete
Unload
Then I store the time at which each handler executes using DateTime.Now.Ticks...
At the end of Unload I print out each of their execution times.
So the total above should be the time the server spent generating the page?
I am asking because I noticed one page took about 879 ms in total by that measure, but it takes a few more seconds before my browser actually shows the page.
Are those few extra seconds the time it takes to download the page from the server?
Thanks in advance.
In global.asax:
namespace aaaaa
{
    public class Global : System.Web.HttpApplication
    {
        private Stopwatch sw = null;

        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            sw = Stopwatch.StartNew();
        }

        protected void Application_EndRequest(object sender, EventArgs e)
        {
            if (sw != null)
            {
                sw.Stop();
                Response.Write("took " + sw.Elapsed.TotalSeconds.ToString("0.#######") + " seconds to generate this page");
            }
        }
    }
}
Yes, there's the time for the code to run, and then the time for the browser to get the response from the server and render it to the screen. You can measure the front-end work using a variety of measurement sites:
Pingdom Tools
WebWait
Web page Test
To time the server-side processing, I would use the Stopwatch class instead of DateTime. DateTime has more decimal places of precision but is less accurate; Stopwatch is designed exactly for this and is better suited to measuring elapsed time. Avoid calling it a lot, though, as that itself adds overhead to page processing. I would create a new Stopwatch, call Start at the very beginning, call Stop at the very end, and then output the elapsed time.
Stopwatch class
Precise Run Time Measurements with Stopwatch
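A minimal page-level sketch of that advice (the class and field names here are illustrative):
using System;
using System.Diagnostics;

public partial class TimedPage : System.Web.UI.Page
{
    private readonly Stopwatch _sw = new Stopwatch();

    protected override void OnPreInit(EventArgs e)
    {
        _sw.Start(); // earliest of the lifecycle stages listed in the question
        base.OnPreInit(e);
    }

    protected override void OnUnload(EventArgs e)
    {
        _sw.Stop(); // last lifecycle event before the page is torn down
        Debug.WriteLine("Page generated in " + _sw.ElapsedMilliseconds + " ms");
        base.OnUnload(e);
    }
}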
If you are just looking for overall time, why not just look at the time-taken value in your IIS logs?
The extra time in the browser could be a lot of things: fetching images, CSS, and JavaScript files, JavaScript running in the page, and the client rendering the HTML itself. If you want to get a better feel for what is actually happening and when, fire up Fiddler, reload your page, and look at what happened.
