Background
I have an application that simply does the following tasks:
Take input from user
Run a SQL Stored Procedure passing the user input as a parameter
Do something
However, some tables used in the stored procedure get locked by other scheduled jobs which I don't have control over.
What I have
So, I basically have code that looks like this, so that users are redirected to the "server is busy" page:
Dictionary<int, string> result = new Dictionary<int, string>();
try
{
    result = new myRepository().GetUserInfo(userInput);
}
catch (Exception ex)
{
    // in the error page, tell users to try again in a few minutes because the server is busy
    Response.Redirect("ServerIsBusy.aspx");
}
What I want to do
With my current code, it takes a while to display the "server is busy" message, so I'm trying to find out whether there is a way to detect as quickly as possible that the stored procedure (GetUserInfo(userInput)) is running slow, so that I can redirect users to the "server is busy" page.
What I've tried
I tried setting httpRuntime executionTimeout to a few seconds, but as explained on many sites, this setting doesn't work well when the number is small: it took about 20 seconds to time out when I set it to 5 seconds.
Is there any configurations or a trick that I could use to solve this?
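One low-level trick, for what it's worth, is to fail fast at the ADO.NET layer: a short CommandTimeout makes a blocked call throw quickly, so the catch block can redirect sooner. A minimal sketch, assuming the repository wraps a SqlCommand (the procedure and parameter names here are illustrative):

// using System.Data; using System.Data.SqlClient;
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.GetUserInfo", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@UserInput", userInput);
    cmd.CommandTimeout = 5; // seconds; a SqlException is thrown when exceeded
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        // read the results into the dictionary...
    }
}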
If you can use Ajax, I would make an asynchronous request (with jQuery) to your web server and show a busy icon/label/message on your website, like this:
function doSomething(url) {
    $.ajax({
        type: "GET",
        url: url,
        async: true
    });
}
When you work with databases, make sure you set the correct indexes; they often give you a lot of speed improvement. But be sure you set the right ones: too many, or the wrong ones, can hurt too.
Related
I have a .NET 4.5.2 ASP.NET webapp in which a chunk of code makes async webclient calls back into web pages inside the same webapp. (Yes, my webapp makes async calls back into itself.) It does this in order to screen scrape, grab the html, and hand it to a PDF generator.
I had this all working...except that it was very slow because there are about 15 labor-intensive reports that take roughly 3 seconds each, or 45 seconds in total. Since that is so slow I attempted to generate all these concurrently in parallel, and that's when things hit the fan.
What is happening is that my aspx reports (that get hit by webclient) never make it past the class constructor until timeout. Page_Load doesn't get hit until timeout, or any other page events. The report generation (and webclient calls) are triggered when the user clicks Save in the webapp, and a bunch of stuff happens, including this async page generation activity. The webapp requires windows authentication which I'm handling fine.
So when the multithreaded stuff kicks off, a bunch of WebClient requests are made, and they all get stuck in the reports' class constructor for a few minutes, and then time out. During/after the timeout, session data is cleared, and when that happens, the reports cannot get their data.
Here is the multithreaded code:
Parallel.ForEach(folders, (folderPath) =>
{
    ...
    string html = getReportHTML(fullReportURL, aspNetSessionID);
    // hand html to the PDF generator here...
    ...
});
private string getReportHTML(string url, string aspNetSessionID) {
    using (WebClient webClient = new WebClient()) {
        webClient.UseDefaultCredentials = true;
        webClient.Headers.Add(HttpRequestHeader.Cookie, "ASP.NET_SessionId=" + aspNetSessionID);
        string fullReportURL = url;
        byte[] reportBytes = webClient.DownloadData(fullReportURL);
        if (reportBytes != null && reportBytes.Length > 0) {
            string html = Encoding.ASCII.GetString(reportBytes);
            return html;
        }
    }
    return string.Empty;
}
Important points:
Notice I have to include the ASP.NET session cookie, or the web call doesn't work.
webClient.UseDefaultCredentials = true is required for the winauth.
The fragile session state and architecture are not changeable in the short term; it's an old and massive webapp and I am stuck with it. The reports are complex and rely heavily on session state (and prior to session state, many DB lookups and calcs are occurring).
Even though I'm calling reports from my webapp back into the same webapp, I must use an absolute URL; a relative URL throws errors.
When I extract the code samples above into a separate .NET console app, it works well and doesn't get stuck in the constructor. Because of this, the issue must lie (at least in part) in the fact that my web app is making async calls back to itself. I don't know how to avoid doing this. I even flirted with Server.Execute(), which really blows up inside worker threads.
The reports cannot be generated in a windows service or some other process - it must be linked to the webapp's save event.
There's a lot going on here, but I think the most fundamental question/problem is that these concurrent webclient calls hit the ASPX pages and get stuck in the constructor, going no further into page events. And after about 2 minutes, all those threads flood down into the page events, where failures occur because the main webapp's session state is no longer active.
Chicken or egg: I don't know whether the threads unblock and eventually hit page events because the session state was cleared, or the other way around. Or maybe there is no connection.
Any ideas?
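A note on one well-known cause of exactly this symptom: ASP.NET serializes concurrent requests that use read-write session state. Each such request holds an exclusive per-session lock, so parallel requests carrying the same ASP.NET_SessionId are created (the handler constructor runs) but then block in the session-acquisition stage, before Page_Load and the other page events; the roughly two-minute wall also matches the default executionTimeout of 110 seconds. If the report pages only read session data, marking them read-only releases that lock. For an ASPX page this is EnableSessionState="ReadOnly" in the @Page directive; for a handler it looks like this sketch (the handler name and session key are illustrative):

using System.Web;
using System.Web.SessionState;

// Implementing IReadOnlySessionState instead of IRequiresSessionState tells
// ASP.NET this request will not write session data, so the exclusive session
// lock is not taken and requests for the same session can run concurrently.
public class ReportHandler : IHttpHandler, IReadOnlySessionState
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        object reportInput = context.Session["ReportInput"]; // hypothetical key
        context.Response.Write(reportInput ?? string.Empty);
    }
}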
So I am trying to query an API that's accessible via HTTP (no authorization). To speed things up, I tried to use a Parallel.ForEach loop, but it seems that the longer it runs, the more errors pop up.
More and more requests fail to be retrieved. I know the API provider isn't limiting me, because I can request the very same blocked URLs in my browser. Also, the failed URLs are different each time, so it doesn't seem to be a case of malformed requests.
The errors don't occur when I use a single-threaded foreach loop.
My malfunctioning loop is below:
Parallel.ForEach(this.urlArray, singleUrl => {
    this.apiResponseBlob = new System.Net.WebClient().DownloadString(singleUrl);
    this.responsesDictionary.Add(singleUrl, apiResponseBlob);
});
Normal foreach loop works fine but is very slow:
foreach (string singleUrl in this.urlArray) {
    this.apiResponseBlob = new System.Net.WebClient().DownloadString(singleUrl);
    this.responsesDictionary.Add(singleUrl, apiResponseBlob);
}
Also: I had a solution in PHP where I spawned several "fetchers" simultaneously, and it never hung up. It seems strange to me that PHP would handle multithreaded retrieval better than C#, so I must obviously be missing something.
What is the fastest way to query the API, without these strange failures?
Hi, did you try to speed up your code with async downloads, as in this question (see the marked answer):
DownloadStringAsync wait for request completion
You could loop through your URIs and get a callback for each successful download.
EDIT: I have seen that you use
this.apiResponseBlob = DL
When you use multithreading, every thread tries to write to that variable. This could be one reason for your bug. Try using a local instance of that object type, or use a
lock {}
statement so that only one thread can write to this variable at a time.
http://msdn.microsoft.com/de-de/library/c5kehkcz.aspx
Like this:
object sync = new object();
Parallel.ForEach(this.urlArray, singleUrl => {
    var apiResponseBlob = new System.Net.WebClient().DownloadString(singleUrl);
    // lock on one shared object: locking on singleUrl.ToString() would take
    // a different lock per URL and would not protect the shared dictionary
    lock (sync) {
        this.responsesDictionary.Add(singleUrl, apiResponseBlob);
    }
});
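As an aside, a lock-free alternative (a sketch swapping in a concurrent collection, not what the answer above proposes): ConcurrentDictionary is safe for concurrent writers, and each iteration gets its own WebClient, since WebClient instances must not be shared across threads.

// using System.Collections.Concurrent; using System.Threading.Tasks;
var responses = new ConcurrentDictionary<string, string>();
Parallel.ForEach(this.urlArray, singleUrl =>
{
    using (var client = new System.Net.WebClient())
    {
        // TryAdd is thread-safe, so no explicit lock is needed
        responses.TryAdd(singleUrl, client.DownloadString(singleUrl));
    }
});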
I have a website used by hundreds of viewers every day. One of the pages has a timer that ticks every 10 seconds; when it does so, it gets the latest data from the database and updates the screen.
The problem I have is that a high number of users and a high number of database connections take their toll on the server.
Is there a better way of doing this? Could I update server-side, so that all users benefit from the latest data but only the server carries out the calls every 10 seconds, rather than every user?
SignalR is the way you'll want to go. Right now, your application is probably loading a page. Then you have some jQuery that probably sets a timer for 10 seconds. Then your timer kicks off and you're probably doing an Ajax call to get refreshed data, then putting that refreshed data into a <div> or something.
So essentially, every 10 seconds your back end is calling your SQL Server and doing some kind of SELECT statement; the data from SQL Server is then transmitted to your application server, where you transform it into displayable data.
SignalR, on the other hand, works differently. It uses push technology, which works like this: let's say you have 5 people visiting your page right now. One person (person A) does something that saves data to the database. No one else sees this data yet, but SignalR sends a signal out to everyone else (or just the people the database save affects) that says "Hey! There is newer data available. You should update now." The other connected people do an Ajax call and get the refreshed data. And voilà! The other 4 people now have updated data on their screen!
I hope I explained this clearly enough for you to understand! Scott Hanselman wrote a good introduction to SignalR.
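For reference, the server-side half of that push can be very small in SignalR 2.x; a minimal sketch, where the hub and client method names are illustrative:

using Microsoft.AspNet.SignalR;

public class RefreshHub : Hub { }

// In the code path that saves to the database:
var hub = GlobalHost.ConnectionManager.GetHubContext<RefreshHub>();
hub.Clients.All.refreshData(); // clients handle "refreshData" by re-fetching their data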
I would use setInterval() to fire an Ajax function every 10 seconds:
window.setInterval(GetLatest, 10000); // function reference, interval in milliseconds
The JavaScript function would be similar to:
function GetLatest() {
    $.ajax({
        type: 'POST',
        url: '../Services/UpdateService.asmx/GetLatest',
        data: '',
        contentType: "application/json; charset=utf-8",
        success: function (data) {
            // use data.d to get the info passed back from the web service,
            // then add your logic to update your html
        },
        error: function () {
            // catch any bad data
        }
    });
}
Your back-end method should look like this. Object is whatever your object is:
[WebMethod]
[ScriptMethod(ResponseFormat = ResponseFormat.Json)]
public Object GetLatest()
{
    // "Object" is a stand-in: use your own type with the properties you need
    Object obj = new Object();
    obj.FirstName = "Dave";
    obj.LastName = "Ward";
    return obj;
}
Have you considered having your server-side code cache the data for those 10 seconds? Cache it and set its expiration to 10 seconds. That way, everyone gets the same information for 10 seconds, and the first request after those 10 seconds retrieves a new data set and caches it again. Only the first person to refresh causes a DB query; the rest get data that is up to 10 seconds old. Something like this:
Cache.Insert(key, data, null, DateTime.Now.AddSeconds(10), TimeSpan.Zero);
I guess this assumes that all the users are getting the same data at each poll. If they are all getting unique data sets, this won't cut it for you.
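In C#, the read-through version of that pattern looks roughly like this sketch (the cache key and loader helper are illustrative):

var data = Cache["latestData"] as DataSet; // any type works here
if (data == null)
{
    data = LoadLatestFromDatabase(); // hypothetical helper doing the SELECT
    Cache.Insert("latestData", data, null,
        DateTime.Now.AddSeconds(10), TimeSpan.Zero);
}
// serve "data" to the caller; at most one DB query per 10 seconds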
I'm a newbie with SignalR and want to learn as much as I can. I have already read the beginner documents, but in this case I'm stuck. What I want to do is fire a script when a user gets a new message, like an alert or showing a div saying "you have new mail", to notify the receiving user. My question is: how can I do that? Does anyone know how to achieve this, or a good step-by-step document? I really want to work with SignalR.
PS: I'm using Visual Studio 2012 and MS SQL Server.
Edit: I forgot to mention that the notification must be fired when the message is written to the DB.
Thank you
In your scripts, use the following. Naturally this is not all the code, but based off the tutorials it should be enough to get you going. Your userId will be generated server-side, and your script can get it off an element of the page, or however you want. It runs when the connection is started and then every 10 seconds, pinging our server-side method CheckMessage(). This JS would need refactoring but should give you the general idea.
...
var messageHub = $.connection.messageHub;
var userId = 4;

$.connection.hub.start().done(function () {
    StartCheck();
});

// Runs every 10 seconds..
function StartCheck() {
    // pass a function to setInterval; calling checkMessage directly would
    // invoke it once immediately instead of on every tick
    setInterval(function () {
        messageHub.server.checkMessage(userId, $.connection.hub.id);
    }, 10000);
}
This method takes in a userId (assuming your DB is set up that way) and looks the user up in your database; naturally the method used here is probably not appropriate for your system, so change it as you need to. It also checks whether the user has any messages, and if so sends a message down to our SignalR scripts.
public class MessageHub : Hub
{
    public void CheckMessage(int userId, string connectionId)
    {
        var user = userRepo.RetrieveAllUsers.FirstOrDefault(u => u.id == userId);
        if (user.HasMessages)
        {
            // target the caller's connection id (a string), not a group
            Clients.Client(connectionId).DisplayMailPopUp();
        }
    }
}
Finally, this method, upon being called, would run your code to do the "You have mail" alert, be it a popup, a div being faded in, or whatever.
...
messageHub.client.displayMailPopUp = function () {
alert("You have Mail!");
};
...
Hopefully this helps. I recommend the following links for reading up and building your first SignalR app:
http://www.asp.net/signalr/overview/signalr-20/getting-started-with-signalr-20/tutorial-getting-started-with-signalr-20-and-mvc-5
And a smaller sample: http://code.msdn.microsoft.com/SignalR-Getting-Started-b9d18aa9
I'm creating a file processor for use in an intranet.
I described it in another question - ERR_EMPTY_RESPONSE when processing a large number of files in ASP.Net using C#
Now, as suggested in the answer to the above question, I'm trying to use threads to execute the file-processing task.
But there is a problem. I need the newly created thread to write feedback to a component on the page (asp:Panel, div, or whatever). That feedback would be the results of several database operations.
The application reads those .txt files, interprets each line, and inserts data into the database. Each line inserted into the database must return feedback, like "registry 'regname' inserted successfully", or "i got problems inserting registry 'regname' in file 'filename', skipping to next registry".
I tested with something very simple:
protected void DoImport()
{
    try
    {
        MainBody.Style.Add(HtmlTextWriterStyle.Cursor, "wait");
        int x = 0;
        while (x < 10000)
        {
            ReturnMessage(String.Format("Number {0}<hr />", x), ref pnlConfirms);
            x++;
        }
    }
    catch (Exception ex)
    {
        ReturnMessage(String.Format("<font style='color:red;'><b>FATAL ERROR DURING DATA IMPORT</b></font><br /><br /><font style='color:black;'><b>Message:</b></font><font style='color:orange;'> {0}</font><br />{1}", ex.Message, ex.StackTrace), ref pnlErrors);
    }
    finally
    {
        MainBody.Style.Add(HtmlTextWriterStyle.Cursor, "default");
    }
}
This function is called from Page_Load and fills an asp:Panel called "pnlConfirms" with a series of numbers, but all at once, on load.
I changed it to:
protected void DoImport()
{
    try
    {
        MainBody.Style.Add(HtmlTextWriterStyle.Cursor, "wait");
        ThreadPool.QueueUserWorkItem(new WaitCallback(DoWork));
    }
    catch (Exception ex)
    {
        ReturnMessage(String.Format("<font style='color:red;'><b>FATAL ERROR DURING DATA IMPORT</b></font><br /><br /><font style='color:black;'><b>Message:</b></font><font style='color:orange;'> {0}</font><br />{1}", ex.Message, ex.StackTrace), ref pnlErrors);
    }
    finally
    {
        MainBody.Style.Add(HtmlTextWriterStyle.Cursor, "default");
    }
}

private void DoWork(Object stateInfo)
{
    int x = 0;
    while (x < 10000)
    {
        ReturnMessage(String.Format("Number {0}<hr />", x), ref pnlConfirms);
        x++;
    }
}
And both use this function:
public void ReturnMessage(string message, ref Panel panel, bool reset = false)
{
    if (reset)
    {
        panel.Controls.Clear();
    }
    Label msg = new Label();
    msg.Attributes.Add("width", "100%");
    msg.Text = message;
    panel.Controls.Add(msg);
}
I need ThreadPool.QueueUserWorkItem(new WaitCallback(DoWork)); to fill those asp:Panels with feedback, like insertion errors and warnings.
My code already produces that feedback inside try...catch statements, but none of it is output to any asp:Panel from the thread pool (it works when invoked directly from the DoImport() function, as in the first example I posted).
I'm doing something very wrong, but I can't find out what (and I've been researching this for almost 2 weeks). Please, help!
In ASP.NET, when a browser requests a page, that page is rendered and sent to the browser as soon as its processing finishes, so the browser shows the page as it's finally rendered.
According to your code, you're trying to render a page that shows a wait cursor, expecting the cursor to appear in the browser and later be changed back to the default cursor. As I explained, regardless of whether you use additional threads, the page won't be sent to the browser until it's completely rendered, so you'll never see the wait cursor on the client side.
The easiest way to get what you're trying to do is to use web services (traditional .asmx or WCF) and Ajax (jQuery or ASP.NET AJAX):
1) create a web service that does the processing
2) create a page which is sent to the browser and, using JavaScript (jQuery or ASP.NET AJAX), makes a call to the web service and shows something to let the user know that the request is being processed (a wait cursor, or even better, an animated gif)
3) when the process finishes, your JavaScript will get the response from the web service, and you can update the page to let the user know the process has finished
If you don't have experience with JavaScript, you can do most of this task using:
ScriptManager, which can be used to create a JavaScript web service proxy for your client side and is required for the rest of the controls
some JavaScript (or jQuery), which can be used to update the "process running / process finished" hints on the client side. I.e., when the call to the web service ends, you can use JavaScript to update the page using the DOM, or load a new page, or the same page with a special parameter to show the result of the process
In this way you can do what you want:
1) show a page in a state that shows the process is running
2) show the same, or another, page in a state that shows the end of the process
The trick is communicating between the browser and the server, and this can only be done using some of the available Ajax techniques.
Another typical technique is using jQuery.ajax, as explained on encosia.com.
According to the OP's message, processing all the files would be so slow that it would time out the web service call. If this is the case, you can use this solution:
1) Create a web service that processes one (or a batch) of the pending files and returns at least the number of pending files when it finishes processing the current file (or batch).
2) From the client side (JavaScript), call the web service. When it finishes, update the page showing the number of pending files and, if this number is greater than zero, call the web service again.
3) When the call to the web service returns 0 pending files, you can update the page to show the work is finished and stop calling it.
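A sketch of the web-service side of that loop (the queue and processor helpers here are hypothetical):

[WebMethod]
public int ProcessNextBatch()
{
    // Process a small batch per call so each HTTP request stays short
    foreach (string file in FileQueue.TakePending(10)) // hypothetical helper
        FileProcessor.Process(file);                   // hypothetical helper

    // The client keeps calling this method until it returns zero
    return FileQueue.CountPending();                   // hypothetical helper
}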
If you process all the files at once, there will be no feedback on the client side, and there will also be a timeout. Besides, IIS can decide to stop the worker thread which is doing the work; IIS does this for several reasons.
A more reliable solution, but harder to implement, is:
1) implement a Windows Service that does the file processing
2) implement a web service that returns the number of pending files (you can communicate between the Windows Service and the web app indirectly, using the file system, a database table, or something like that)
3) use a timer (Ajax timer, or JavaScript setInterval) on your web page to poll the server every N seconds using the web service, until the number of pending files is 0
An even harder way to do this is to host a WCF service in your Windows Service, instead of the indirect communication between your web app and Windows Service. This case is much more complicated because you need to use threads to do the work and also attend to the calls to the WCF service. If you can use indirect communication, it's much easier to implement. The database table is a simple and effective solution: your worker process updates a row in a table whenever it processes a file, and the web service reads the progress state from this table.
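For the indirect communication through a table, the web service's read side could be as simple as this sketch (the table and column names are illustrative):

// using System.Data.SqlClient;
[WebMethod]
public int GetPendingFileCount()
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "SELECT COUNT(*) FROM dbo.FileQueue WHERE Processed = 0", conn))
    {
        conn.Open();
        return (int)cmd.ExecuteScalar();
    }
}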
There are many different solutions for a not-so-simple problem.
You are starting a new thread (or, more precisely, running your code on one of the free threads in the thread pool) and not waiting for results in the main thread. Something like Thread.Join (if you were using manual thread creation) or another synchronization mechanism, such as events, needs to be used if you want to go this route.
The question you've linked to suggests using asynchronous pages, which you are not doing. You would start processing the request, kick off the task, and release the thread; when the task is finished, you complete the request.
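On .NET 4.5, the asynchronous-page pattern looks roughly like this sketch (it assumes Async="true" in the @Page directive; the worker method is hypothetical):

protected void Page_Load(object sender, EventArgs e)
{
    // Register a task that releases the request thread while the work runs;
    // ASP.NET completes the request when the task finishes.
    RegisterAsyncTask(new PageAsyncTask(async () =>
    {
        await Task.Run(() => ProcessAllFiles()); // hypothetical worker
    }));
}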
Side note: consider simply doing all the conversion on the main thread that handles the request. Unless you expect slow I/O to complete the task, moving CPU work from one thread to another may not produce significant gains. Please measure the performance of your current solution and confirm that it does not meet the performance goals you have set for your application. (This does not apply if you're doing it for fun or educational purposes.)