Handling long page execution times in ASP.NET - c#

I'm working on a web application import program. Currently an admin user can upload a formatted CSV file that my page will parse. I'm running into an execution-duration issue, since each line refers to a file that has to be saved to Scribd and S3, as well as go through some internal processing.
What would you guys recommend for improving execution time? Since this is an admin-only page, I doubt it would get run more than once a week, so my hope is to get it out the door asap.
I've looked some at the Async="true" flag, but I wasn't sure if that was the direction I wanted to go, or if I should look more at a Windows service.

Two options come to mind:
Threads: In your code, set up a collection of threads, have each one process a single file, and then join them. Once all the threads complete, you'll be able to return the page. This will improve your turnaround time, but could still leave something to be desired on page returns.
Queue: Have the user submit the CSV file and hand back a GUID/hash/whatever ID. The admin can then go to a "status" page, enter their ID and check the details of their job. This solution gives the user quick feedback and lets them keep track of the results without having to wait around (a minimal sketch of this option follows).
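A minimal sketch of the queue option, assuming a hypothetical ImportJobStore class that persists jobs and their progress (a database table would do); the control and method names are illustrative, not from the original answer:

// requires: using System.IO; using System.Threading;
// Upload handler: save the CSV, hand back a job ID, and queue the work in the background.
protected void UploadButton_Click(object sender, EventArgs e)
{
    string jobId = Guid.NewGuid().ToString("N");
    string savedPath = Server.MapPath("~/App_Data/imports/" + jobId + ".csv");
    CsvUpload.SaveAs(savedPath);                      // CsvUpload is a FileUpload control

    ImportJobStore.Enqueue(jobId, savedPath);         // hypothetical store, e.g. a DB table
    ThreadPool.QueueUserWorkItem(_ => ProcessJob(jobId, savedPath));

    ResultLabel.Text = "Import queued. Your job ID is " + jobId;
}

private static void ProcessJob(string jobId, string path)
{
    foreach (string line in File.ReadAllLines(path))
    {
        // push to Scribd/S3 and do the internal processing for this line
        ImportJobStore.ReportProgress(jobId, line);   // the "status" page reads this back
    }
    ImportJobStore.MarkComplete(jobId);
}

The status page then just looks the ID up in ImportJobStore and displays whatever progress has been recorded so far.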

A quick and dirty option might be to set Page.Server.ScriptTimeout to a really high value on that page (I think it maxes out at int.MaxValue).
It's probably advisable to disable the submit button after it's been clicked, and inform the user that they may want to go and make a coffee.
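A minimal sketch of that tweak, assuming it goes in the admin page's Page_Load; the value is in seconds:

protected void Page_Load(object sender, EventArgs e)
{
    // Give this admin-only import page effectively unlimited execution time.
    Server.ScriptTimeout = int.MaxValue;
}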

I'd suggest using AJAX to trigger an internal postback that handles the processing asynchronously. You can poll the state periodically, which also keeps your master page from showing the "processing" wheel constantly churning for the length of the process.

I have a web page that takes a long time to process a mailing list, so I kick it off in its own thread. When the process is done, a report can be seen from another link on the result page. I have a runnable MailSender class. The ASPX script has a bit in it that looks like this:
// prep the MailSender
MailSender ms = new MailSender(people, Subject, FileName....);
if (SendAsync) {
    ThreadStart ts = new ThreadStart(ms.run);
    Thread WorkerThread = new Thread(ts);
    WorkerThread.Start();
} else {
    ms.run();
}
If you want to speed your code up, try to break it into parallelizable pieces if you can and write a class for each piece. You could then kick off a new thread for each bit and track the status somewhere so the user can be told when to come back for the results. You said that each line of your input generates its own output file, which sounds like a great candidate for multi-threading. It won't speed things up much if you don't have multiple cores available on the server, though.
One problem with this whole scheme is that server restarts or application pool recycling will kill your long-running process. This can be a problem if your threads are going to run for an hour or two.

Since external factors are involved in the processing time, you need to consider whether performance improvements on your end would affect the "actual" performance. If most of the time is spent processing the file and sending it to the third parties (i.e. Scribd and S3), then making improvements on your end might not have a huge effect and might just add complexity to a simple task.
What I would do is have the ASPX page only do what ASPX does best, i.e. handle the user-interface part (the upload). Once the upload is complete, as far as the user is concerned their part is done. You could implement a progress indicator using AJAX to make it nicer, but as it's an admin section I wouldn't bother with the niceties.
Then have a simple console application scheduled to fire at specific intervals, or a Windows service watching a directory (depending on how time-critical the updates are). Once the app runs in the background and doesn't require user interaction, time is no longer a critical factor (i.e. you don't have a user waiting for a response); a rough sketch follows below.
It will appear to the user that things are very snappy (i.e. only the time it takes to upload the file), and you keep needless complexity out of your solution.
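A rough sketch of the console-application side, assuming the upload page drops files into a known folder; the folder path, ProcessCsv method and logging approach are illustrative assumptions:

using System;
using System.IO;

class ImportWorker
{
    // Run this from Task Scheduler every few minutes, or adapt it into a Windows service.
    static void Main()
    {
        string dropFolder = @"C:\Uploads\PendingImports";    // hypothetical drop folder
        foreach (string csvPath in Directory.GetFiles(dropFolder, "*.csv"))
        {
            try
            {
                ProcessCsv(csvPath);                          // per-line Scribd/S3 work goes here
                File.Move(csvPath, csvPath + ".done");        // mark the file as handled
            }
            catch (Exception ex)
            {
                File.AppendAllText(Path.Combine(dropFolder, "errors.log"),
                    DateTime.Now + " " + csvPath + ": " + ex + Environment.NewLine);
            }
        }
    }

    static void ProcessCsv(string path)
    {
        foreach (string line in File.ReadLines(path))
        {
            // upload to Scribd, push to S3, do internal processing for each line
        }
    }
}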

I think the simplest solution to what you want is to use asynchronous pages in ASP.NET. Is there any particular reason why you don't want to go that route?
I can think of an alternative, which is to have some background process (like a process triggered by a scheduled task in Windows, or a Windows service) that looks at a queue of waiting jobs (say, a database table) and processes them. This way you would upload that CSV somewhere and insert a DB record, so that the background process sees the CSV and picks it up when it comes around. But to me it seems like more work, so I'd rather use asynchronous pages :)
Here's a nice tutorial on ASP.NET asynchronous pages
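For reference, a minimal sketch of an asynchronous page, assuming the @Page directive has Async="true" set; ProcessCsvRows is a hypothetical method holding the per-line work:

// code-behind members; requires using System; using System.Web.UI;
private Action _processCsv;

protected void Page_Load(object sender, EventArgs e)
{
    // Requires Async="true" in the @Page directive.
    RegisterAsyncTask(new PageAsyncTask(BeginProcess, EndProcess, TimeoutProcess, null));
}

private IAsyncResult BeginProcess(object sender, EventArgs e, AsyncCallback cb, object state)
{
    _processCsv = ProcessCsvRows;          // hypothetical method with the per-line Scribd/S3 work
    return _processCsv.BeginInvoke(cb, state);
}

private void EndProcess(IAsyncResult ar)
{
    _processCsv.EndInvoke(ar);
}

private void TimeoutProcess(IAsyncResult ar)
{
    // handle the timeout, e.g. show a message on the page
}

private void ProcessCsvRows()
{
    // parse the CSV and do the per-line work here
}

The page still doesn't return to the browser until the task completes; it just avoids tying up an ASP.NET request thread while the work runs.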

Related

How to handle system-wide key events in an ASP.NET application

I am working on an ASP.NET web application where tasks are assigned to users. We set a standard time for every task, and the user has to finish the task within that standard time. There are two buttons on the page, Proceed and Save. When a user clicks the Proceed button, the time is saved in the database as the start time, and when the user clicks the Save button, the time is saved in the database as the end time. This way we capture the time period within which the user completes the task.
The standard time is set on an average time-study basis; the task doesn't take the same amount of time every time.
Users can often complete the task in much less time than the standard time. In that case, users proceed with the task and, even after completing it, instead of saving it they lock the system and go for tea breaks, then save the task after coming back from the break.
I want to save some information from the web page when they lock the PC, even when the browser is minimized.
I tried implementing an applet using the JIntellitype library, but it does not capture the key combinations used by the Windows OS.
I also tried using Silverlight, but there is no support for this like there is in a WinForms application. In Silverlight I would have to create a COM component or something that interacts with system32 or some native API. That doesn't seem easy; I would like to know if there is such a library for Silverlight.
It should be browser independent. I haven't tried ActiveX, and while I think it could be done with ActiveX, I don't want to use it because it only runs in IE.
I want to know all the possible solutions to achieve this.
Thanks in advance.
Why don't you set up a kind of timer check to see whether the recorded time is too far from the expected time for the job? If a task is expected to take, for instance, 1 to 5 minutes, then 21 minutes is too far.
Why don't you create a timer to time out the user? If users know they will be timed out after some time, they probably won't leave for a coffee break during the task (some kind of penalty should go along with this, like starting over from the initial point on timeout).
Why don't you automatically save the record after the job finishes, instead of obliging the user to press a button?
As far as I know, you can put the machine into suspend mode, but you can't detect a lock or suspend if it was started by another app.

How to wait for the post back in Watin?

In my C# code I am using WatiN to automate the web. To log in to a page I need to click the log-in button, but right after that I want to log out, so I have the click on the log-out button right after it; however, the log-out part doesn't work. I even tried closing the browser (using the Close method) after logging in, but it didn't work. It feels like as soon as the page changes (i.e. after logging in), no more commands from the C# code work.
Does anyone know what's wrong?
As mentioned in another answer, Thread.Sleep(milliseconds) is a way to wait for a period of time for something to load. It is very, very easy to implement, but far from optimal due to varying load times, and if you make it long enough that it will always wait long enough, you'll end up with a lot of wasted time. On one test this is not a big deal, but if you have to wait 5 seconds and you have 1000 tests... etc.
The route I've gone is:
Put in Thread.Sleep()s to determine whether it is a "wait" issue.
If the code with the Sleep() is going to be used more than once, figure out what is causing the need for the sleep.
Refactor out the Sleep() using the various Wait...() methods: WaitTilExists, WaitForAttributeEqualsWhatever, WaitForAsyncToFinish <- not all real methods, but WatiN has a bunch built in.
The big cause of waits for me now is jQuery asynchronous calls in ASP.NET, and I made a static helper class that waits for the async calls to finish; it works well for me (a sketch follows below). These tend to be very specific to the framework(s) the sites you're testing are written in.
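A minimal sketch of such a helper, assuming WatiN 2.x (where Browser exposes Eval) and a site that uses jQuery; the class and method names are my own, not from the original answer:

using System;
using System.Threading;
using WatiN.Core;

public static class AjaxWait
{
    // Polls jQuery.active (the number of outstanding AJAX requests) until it drops to zero.
    public static void WaitForJQuery(Browser browser, int timeoutSeconds)
    {
        DateTime deadline = DateTime.Now.AddSeconds(timeoutSeconds);
        while (DateTime.Now < deadline)
        {
            string active = browser.Eval("window.jQuery ? jQuery.active.toString() : '0'");
            if (active == "0")
                return;
            Thread.Sleep(100);   // short poll interval instead of one long sleep
        }
        throw new TimeoutException("jQuery AJAX calls did not finish within " + timeoutSeconds + " seconds.");
    }
}

Usage would be something like AjaxWait.WaitForJQuery(browser, 30); right after the click that triggers the async call.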
The WatiN Click command waits until the browser has loaded, so in practice it waits for the postback.
If you use the ClickNoWait() command, it will not wait.
So if your code looks like this, it should work:
browser.GoTo("www.your-site.com");
// fill user/pass
browser.Button(Find.ByClass("login-class")).Click();
browser.Button(Find.ByClass("logout-class")).Click();
If it's still not working, you can add browser.WaitForComplete(); after the login click.
In WatiN you will encounter many situations where the code is non-blocking (you'll execute a line of code and it will immediately keep going), so in those cases you'll need to find a different way to know that the next page (action, etc.) is already there. For example, on a login page you could check whether the page has a text field called userName:
<code>
TextField uName = browser.TextField(Find.ByName("userName"));
if (uName.Exists)
{
    // Then do the login code....
}
</code>
In the same way, you should check that the page after the login is there before you keep executing your code. So, for example, if you are logging in to a page that you know will contain the text "Your Account Details", you might do something like this:
<code>
browser.GoTo("http://www.yourdomain.com/login.aspx");
//do your login code
browser.WaitUntilContainsText("Your Account Details", 240); // the second parameter indicates the seconds it will wait before it times out.
// your code to deal with the page after the login.
</code>
Using Thread.Sleep is a recipe for confusion, and that's a problem for sure: you will never get the timing right with a web page (even if you think it will take 10 seconds, it might never come back, at which point the server will terminate the open connection).
Hope it helps.
Use Thread.Sleep in your scripts to sync the login and logout...
or
instead of logging out, directly close the browser and use a new IE instance to log back in to the application.

How to dynamically run a process in ASP.NET

I am designing a website and it uses Windows Forms (in Visual Studio 2010); in it I have, for example, five or six URLs. I am displaying them on the home page of my website, xyz.com.
What I want is to calculate the total number of tweets for each link and display the links ordered by the number of times they have been tweeted/retweeted.
For a URL we can get the number of tweets using the Twitter API: http://urls.api.twitter.com/1/urls/count.json?url=YourURL
I know all the rest, like receiving the JSON values in a string, parsing the JSON to retrieve the tweet counts, and then comparing and displaying the links based on priority, etc.
What I have been doing until now is initiating the whole process from a button click.
But I want to know how I can automate all of this to run every 10 minutes, so that an end user sees the URL priorities just by refreshing the page.
One way to do this is to run a scheduled task every 10 minutes which interacts with the DB. The web application also interacts with the DB, and thus the two systems stay distinct.
Side note: it is strongly recommended to use only console applications as scheduled tasks. If you make a Windows Forms application you will have some issues.
As Kieren Johnstone has pointed out in another answer, the best way to do this would be to write a Windows service.
I still recommend the solution described above as a first step, since it is easy to debug and test.
Additionally, give some serious consideration to logging and error reporting; with background tasks you can never know too much about what the heck it was doing when it broke.
If the timing itself is not important (it doesn't have to be exactly 10 minutes), I would suggest binding to an event that fires when users use your application. No point in calculating anything if no one is using it :-)
So you could use a login, a page load, or whatever happens at roughly the interval you wish to achieve.
You can always store a DateTime variable somewhere that you can check to see when the calculation was last made. Something like:
public void MyEventHasFired()
{
    DateTime dateLastProcessed = ... //Database? Session data? Anything goes.
    if (dateLastProcessed < DateTime.Now.AddMinutes(-10))
    {
        //calculate
        ...
        dateLastProcessed = DateTime.Now;
    }
}
The best solution is definitely a Windows service. It can be started, stopped and managed well, and it's easy to log and maintain.
Scheduled tasks are very prone to problems. With a Windows service you can at least configure it to start automatically, restart if there's a problem, control the timing yourself in the code, and catch/handle exceptions as you wish (a sketch follows below).
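A minimal sketch of such a service, assuming a plain System.Timers.Timer drives the interval; the service name and UpdateTweetCounts method are illustrative, and installation (installer project or sc.exe) is a separate step:

using System;
using System.Diagnostics;
using System.ServiceProcess;
using System.Timers;

public class TweetCountService : ServiceBase
{
    private Timer _timer;

    protected override void OnStart(string[] args)
    {
        _timer = new Timer(TimeSpan.FromMinutes(10).TotalMilliseconds);
        _timer.Elapsed += (s, e) => UpdateTweetCounts();
        _timer.Start();
    }

    protected override void OnStop()
    {
        _timer.Stop();
        _timer.Dispose();
    }

    private void UpdateTweetCounts()
    {
        try
        {
            // call the Twitter count endpoint for each URL and persist the results to the DB
        }
        catch (Exception ex)
        {
            // a service has no UI, so logging is your only window into failures
            EventLog.WriteEntry(ex.ToString(), EventLogEntryType.Error);
        }
    }

    public static void Main()
    {
        ServiceBase.Run(new TweetCountService());
    }
}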
The best scheduler I know is Quartz.NET.
It's not simple to use, but it works great.
You can find an example with ASP.NET here: http://blogs.planetcloud.co.uk/mygreatdiscovery/post/ASPNET-Scheduled-Tasks-with-QuartzNET.aspx
Anyway, I agree with Kieren Johnstone: you should use a Windows service.
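A minimal sketch with the Quartz.NET 2.x fluent API, firing every 10 minutes; the job class and what it does inside Execute are illustrative:

using Quartz;
using Quartz.Impl;

public class TweetCountJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // call the Twitter count endpoint for each URL and store the results
    }
}

public static class SchedulerSetup
{
    public static void Start()
    {
        IScheduler scheduler = StdSchedulerFactory.GetDefaultScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<TweetCountJob>().Build();
        ITrigger trigger = TriggerBuilder.Create()
            .StartNow()
            .WithSimpleSchedule(x => x.WithIntervalInMinutes(10).RepeatForever())
            .Build();

        scheduler.ScheduleJob(job, trigger);
    }
}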

Background File Transfer problems

I have this problem when using the Background File Transfer in WP7. It works perfectly when my application is running, but as soon as I press the Windows button it stops (and resumes when I activate the application again). Isn't the purpose of Background File Transfer to run in the background, even when your application is deactivated? Does it have to be in a separate class (some sort of background agent class, separate from the main project)? It's really frustrating, since I am doing everything the tutorial here says: http://msdn.microsoft.com/en-us/library/hh202959(v=vs.92).aspx.
Are there some "special" things I need to do to make sure it runs in the background, or are there methods, maybe the ones I created myself (to get the URL etc.), that cannot be accessed while deactivated? Can I not add to the queue while deactivated, maybe?
Thanks a lot for your time :)
EDIT: A little debugging tells me that the file in the queue is actually downloading. It finishes, but it doesn't fetch the next one until I reactivate the app. Can I not use my own methods, variables etc. when doing this? Say I have an internal queue of 20 items; how can I populate the download queue (max 5) when it gets to zero?
EDIT 2: In the sample from Microsoft, they say that you can add to the queue at a later time:
// Check to see if the maximum number of requests per app has been exceeded.
if (BackgroundTransferService.Requests.Count() >= 5)
{
    // Note: Instead of showing a message to the user, you could store the
    // requested file URI in isolated storage and add it to the queue later.
    MessageBox.Show("The maximum number of background file transfer requests for this application has been exceeded. ");
    return;
}
But it does not say whether we can do this while in the background or not. Since it is about background file transfers, they should have mentioned it; otherwise we would assume it can be done in the background, which seems not to be the case. But we can't know that for sure. Can anyone confirm this 100%?
I have looked into this as well, and it isn't possible (based on my research) to populate the queue after the maximum of 5 queued downloads has finished. I thought about using a background agent, but BackgroundTransferRequest.Add is unavailable from background agents, meaning the only way to queue more downloads is while your app is running (see Unsupported APIs for Background Agents for Windows Phone).
The only thing I can think of is using a background agent to send a toast notification letting the user know that the downloads have finished and that they should start the app to queue the next five downloads (a rough sketch follows below). This is less than ideal.
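A rough sketch of that idea, inside a periodic task's scheduled agent; whether BackgroundTransferService.Requests can be enumerated from an agent is an assumption to verify, and the agent class name is illustrative:

using System.Linq;
using Microsoft.Phone.BackgroundTransfer;
using Microsoft.Phone.Scheduler;
using Microsoft.Phone.Shell;

public class TransferWatcherAgent : ScheduledTaskAgent
{
    protected override void OnInvoke(ScheduledTask task)
    {
        // If nothing is still transferring, nudge the user to reopen the app and queue the next batch.
        bool anyActive = BackgroundTransferService.Requests
            .Any(r => r.TransferStatus != TransferStatus.Completed);

        if (!anyActive)
        {
            var toast = new ShellToast
            {
                Title = "Downloads finished",
                Content = "Open the app to queue the next files."
            };
            toast.Show();   // toasts can only be shown from a background agent
        }

        NotifyComplete();
    }
}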

Opinion on User Experience - C# Winforms

I’ve got a process which will take a little under 5 seconds to complete. The user will most likely notice the program flicker for a few seconds after pushing the “go” button.
My question is:
Is this something that would normally be dumped onto a background worker, or is there another .NET method for handling small tasks, or is this something that shouldn’t be a concern?
FYI:
The process opens a user-specified Excel file, processes an unknown number of lines (max 1.5 million due to Excel, I believe), and queries a database (a very quick query). So the worst-case scenario is the user uploading a 1.5-million-row Excel file over a very slow internet connection.
If you don't want the user to be able to do anything while the file is being uploaded, then you don't need to put it on a different thread.
If you want the user to be able to go on to other tasks while the file is uploading, put it on a different thread.
As a general rule of thumb, if I have a situation where I absolutely don't want the user to do anything while a long-running process is going, I disable the controls on the form until the task is complete, and usually use a status indicator to show that progress is happening.
My personal guideline for whether or not to allow user interaction is if the results of a process could be altered by a user action in mid-stream.
For example, one program that we have parses a bunch of queries on a highly normalized database (normalized to the point where reporting is sloooow) into "reportable" tables, and I don't want the user altering data in one of the source tables while the query is running, because it will give goofy results.
If there is no harm in allowing user interaction while the process is occurring, then put it in another thread.
Edit
Actually, on reading @UrbanEsc's and @archer's comments, I agree with them. Still put it on a different thread and freeze the controls (and include a progress indicator where possible).
I would push this to a background worker. Doing so will keep the UI responsive. If the process ever lags for more than a few seconds, users start getting nervous... especially when the lagging process causes the UI to freeze.
From a user experience point of view it might be best to hand the job over to a different thread or an asynchronous worker and tell the user that his request is being processed in the background. Once the worker finishes, a success/failure message can be handled and shown to the user as required.
The cheapest way to handle the problem is to turn the cursor into an hourglass during the processing. That tells the user: please wait, I'm busy.
Depending on the budget (time and/or effort) you're willing to throw at it, using a BackgroundWorker and some reporting GUI is certainly a plus. But it's up to you and your app.
For example, I'm currently modifying an in-house app that has 3 users. In that case the hourglass is OK: all 3 of them will quickly learn they just have to wait. Don't get me wrong: this app is damn important; without it, the small company that uses it would just die. But if I ask them for 2 hours of extra budget for a nice and tested little GUI, background thread and so on versus an hourglass, what do you think they'll say?
On the other hand, if it's an important operation in your flagship product, of course be nice to your users! Don't hesitate: background thread. Especially if the operation may actually take much longer than those 5 seconds.
Conclusion: Be pragmatic!
I would put it into a background worker, or fire off a task if you are on .NET 4.0, for example:
void OnButtonClick(...)
{
    Task.Factory.StartNew(() => { /* your excel and query code */ });
}
I'll vote for the background worker, since a frozen UI looks like a frozen application, and most users will think your application isn't doing anything at all.
UI thread for a progress bar or some animation, plus info text explaining what's going on + background worker thread = win.
I think every process not related to the UI itself should be started as a separate thread or, in this case, as a background worker. This will help keep the app healthy and easy to improve/fix in the future.
Also, as a user or tester, I really hate flickering and freezing windows...
Regards.
A general rule of thumb is that any operation that takes a second or longer to complete requires some form of feedback to the user. This can be a progress bar, a message, etc. Anything longer than that and the user becomes frustrated (not sure if they did something wrong, hates waiting, etc.).
Operations like this, which can take longer depending on the environment (number of apps, available memory, data size, hard drive speed, etc.), should ALWAYS be put on a background thread that pipes messages back to the UI. I love the BackgroundWorker for this (a sketch follows below).
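A minimal BackgroundWorker sketch of that pattern, written as WinForms code-behind; the control names and the ProcessExcelFile method are illustrative assumptions:

// requires: using System.ComponentModel;
private BackgroundWorker _worker;

private void GoButton_Click(object sender, EventArgs e)
{
    GoButton.Enabled = false;   // block re-clicks while the work runs

    _worker = new BackgroundWorker { WorkerReportsProgress = true };
    _worker.DoWork += (s, args) => ProcessExcelFile((string)args.Argument, _worker);
    _worker.ProgressChanged += (s, args) => StatusProgressBar.Value = args.ProgressPercentage;
    _worker.RunWorkerCompleted += (s, args) =>
    {
        GoButton.Enabled = true;
        StatusLabel.Text = args.Error == null ? "Done." : "Failed: " + args.Error.Message;
    };

    _worker.RunWorkerAsync(FilePathTextBox.Text);
}

private void ProcessExcelFile(string path, BackgroundWorker worker)
{
    // open the Excel file, process the rows, run the database query...
    for (int percent = 0; percent <= 100; percent += 10)
    {
        // ReportProgress marshals back to the UI thread via the ProgressChanged event
        worker.ReportProgress(percent);
    }
}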
