asp.net page not responding after 30 minutes [closed] - c#

I have an ASP.NET web page whose button click event runs a process for about 35 minutes; on the front end I use AJAX and show a progress-bar image. If the process (the button click event) completes in under 30 minutes, the page reloads successfully. Otherwise, the "in progress" image keeps showing even after the process has completed, until the AsyncPostBackTimeout (which is set to 60 minutes) elapses and a server timeout error appears.
Please let me know if there is something I am doing wrong.

Without seeing your code, I can't tell you what's going wrong. However, I can recommend a couple of options:
Break the task into multiple steps (instead of one long chained task). It may be a little more work for the user, but at least they're not left hanging on a page for half an hour or more (ouch!).
Use a profiler to see what's actually taking so long, and see if you can't optimize the code to cut down the processing. For example, if it's a database call, it may make sense to write a stored procedure instead of multiple selects/updates (with data going back and forth): keep the processing on the database server until the final result is needed.
For long tasks, it may make sense to break the process out into a service or separate entity (and just have the service report back its progress). For example, MSMQ is a great way to have a dedicated service running and pass tasks off to it when needed. Just keep in mind that this creates another layer, which is one more thing to maintain.

If a process takes 30 minutes today, tomorrow it could take 60 minutes or more simply because your servers are busy doing other things. The approach is then fundamentally wrong.
My advice would be to move such long tasks to another layer: a system service. The service runs, picks tasks from a queue, and executes them one by one. The front layer just polls every few seconds or minutes to see whether the operation is complete. Or, even better, users do not wait at all; they go and do other things and are eventually informed that the long-running task is complete.
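A minimal in-process sketch of that queue-plus-polling shape (names like JobQueue are illustrative, not from any framework; a production system would persist the queue, e.g. with MSMQ, so jobs survive restarts):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

public class JobQueue
{
    private readonly BlockingCollection<Action> _jobs = new BlockingCollection<Action>();
    private readonly ConcurrentDictionary<Guid, string> _status = new ConcurrentDictionary<Guid, string>();

    public JobQueue()
    {
        // A dedicated worker thread picks tasks off the queue one by one.
        var worker = new Thread(() =>
        {
            foreach (var job in _jobs.GetConsumingEnumerable())
                job();
        });
        worker.IsBackground = true;
        worker.Start();
    }

    public Guid Enqueue(Action<Guid> work)
    {
        var id = Guid.NewGuid();
        _status[id] = "Queued";
        _jobs.Add(() =>
        {
            _status[id] = "Running";
            work(id);
            _status[id] = "Done";
        });
        return id;
    }

    // The front end polls this every few seconds instead of blocking.
    public string GetStatus(Guid id) => _status.TryGetValue(id, out var s) ? s : "Unknown";
}
```

The page that started the job keeps only the returned Guid and asks for the status on each poll, so no request ever waits longer than a status lookup.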


making c# as accurate as possible (maybe with hacks)? [closed]

From what I know, it is fairly well known that C# cannot be accurate when timing is critical. I can certainly understand that, but I was hoping there were known game hacks to help with my issue.
tech:
I'm using a USB API that sends data over a control transfer. The API raises an event when an interrupt transfer occurs (one every 8 ms); I then fire off my control transfer at that exact time. What I have noticed, though not often, is that it takes more than 8 ms to fire. Most of the time it fires in a timely manner (< 1 ms after the interrupt event). The issue is that control transfers cannot happen at the same time as an interrupt transfer, so the control transfer must be started within 5 ms of the interrupt transfer so that it completes before the next interrupt transfer takes place.
So, USB details aside, my issue is getting an event to fire < 5 ms after another event. I'm hoping there is a solution, since gaming would also suffer from this sort of thing. For example, some games can be put into a high-priority mode; I wonder if that can be done in code? I may also try a profiler to back up my suspicions; it may be something I can turn off.
For those that want to journey down the technical road, the api is https://github.com/signal11/hidapi
If someone has a trick or idea that may work, here are the considerations in my case:
1) USB interrupt polls happen every 8 ms and are only a few hundred µs long
2) the control transfer should happen once every 8-32 ms (the faster the better)
3) the control transfer can take up to 5 ms to complete
4) skipping oscillations is OK for the control transfer
5) this is USB 1.1
This is not even a C# problem: you are on a multitasking, non-realtime OS, so you don't know when your program will be scheduled; the OS can give priority to other tasks.
That said, you can raise the priority of the program's thread, but I doubt it will solve anything:
System.Threading.Thread.CurrentThread.Priority = ThreadPriority.Highest;
When such restrictive timings must be met, you have to work at kernel level, for example as a driver.
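Before relying on thread priority, it may help to measure how bad the scheduling jitter actually is, as the question's profiling idea suggests. A minimal probe (illustrative only: it times Thread.Sleep(1), whose resolution depends on the system timer, often ~15.6 ms on Windows by default, which is exactly why the 5 ms budget is hard to meet):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class JitterProbe
{
    // Ask for a 1 ms sleep many times and record the worst overshoot.
    // On a non-realtime OS this can occasionally exceed the 5 ms budget
    // described above, regardless of thread priority.
    public static double WorstOvershootMs(int samples)
    {
        double worst = 0;
        var sw = new Stopwatch();
        for (int i = 0; i < samples; i++)
        {
            sw.Restart();
            Thread.Sleep(1);
            sw.Stop();
            double overshoot = sw.Elapsed.TotalMilliseconds - 1.0;
            if (overshoot > worst) worst = overshoot;
        }
        return worst;
    }

    public static void Main()
    {
        Console.WriteLine($"Worst overshoot over 200 samples: {WorstOvershootMs(200):F2} ms");
    }
}
```

If the worst overshoot stays well under 4 ms with priority raised, the managed approach might be workable; if not, that supports the kernel-level recommendation above.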

Async rewrite of sync code performs 20 times slower in loops [closed]

I'm trying not to panic here, so please, bear with me! :S
I've spent a considerable amount of time rewriting a big chunk of code from sync (i.e. thread-blocking) to async (i.e. C# 5+ async/await). The code in question runs inside an ASP.NET application and spans everything from low-level ADO.NET DB access to a higher-level unit-of-work pattern, and finally a custom async HTTP handler for public API access: the full server-side stack, so to speak. The primary purpose of the rewrite wasn't optimization, but untangling, general clean-up, and bringing the code up to something that resembles a modern and deliberate design. Naturally, an optimization gain was implicitly assumed.
Everything in general is great and I'm very satisfied with the overall quality of the new code, as well as the improved scalability it's shown so far in the last couple of weeks of real-world tests. The CPU and memory loads on the server have fallen drastically!
So what's the problem, you might ask?
Well, I've recently been tasked with optimizing a simple data import that is still utilizing the old sync code. Naturally, it didn't take me long before I tried changing it to the new async code-base to see what would happen.
Given everything, the import code is quite simple. It's basically a loop that reads items from a list that's been previously read into memory, adds each of them individually to a unit of work, and saves it to a SQL database by means of an INSERT statement. After the loop is done, the unit of work is committed, which makes the changes permanent (i.e. the DB transaction is committed as well).
The problem is that the new code takes about 20 times as long as the old one, when the expectation was quite the opposite! I've checked and double-checked and there is no obvious overhead in the new code that would warrant such sluggishness.
To be specific: the old code steadily imports 1100 items/sec, while the new one manages 40 items/sec AT BEST (on average it's even less, because the rate falls slightly over time)! If I run the same test over a VPN, so that the network cost outweighs everything else, the throughput is somewhere around 25 items/sec for sync and 20 for async.
I've read about multiple cases here on SO which report a 10-20% slowdown when switching from sync to async in similar situations and I was prepared to deal with that for tight loops such as mine. But a 20-fold penalty in a non-networked scenario?! That's completely unacceptable!
What is my best course of action here? How do I tackle this unexpected problem?
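For reference, the shape of the loop described above as a minimal compilable sketch (all names are hypothetical, and a fake in-memory store stands in for the real ADO.NET unit of work):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

class ImportSketch
{
    public static readonly List<string> Db = new List<string>();

    static Task InsertAsync(string item)
    {
        Db.Add(item);
        return Task.CompletedTask; // real code would await the DB driver here
    }

    // One awaited INSERT per item: every iteration pays the async
    // machinery cost, and any sync-over-async blocking inside the
    // driver gets multiplied by the item count.
    public static async Task ImportAsync(IEnumerable<string> items)
    {
        foreach (var item in items)
            await InsertAsync(item);
        // the unit of work / DB transaction is committed after the loop
    }

    public static async Task Main()
    {
        await ImportAsync(new[] { "a", "b", "c" });
        Console.WriteLine(Db.Count);
    }
}
```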
UPDATE
I've run the import under a profiler, as suggested.
I'm not sure what to make of the results, though. It would seem that the process spends more than 80% of its time just... waiting. See for yourselves:
The 14% spent inside the custom HTTP handler corresponds to the IDataReader.Read which is a consequence of a tiny remainder of the old sync API. This is still subject to optimization and is likely to be reduced in the near future. Regardless, it's dwarfed by the WaitAny cost, which definitely isn't there in the all-sync version!
What's curious is that the report isn't showing any direct calls from my code to WaitAny, which makes me think this is probably part of the async/await infrastructure. Am I wrong in this conclusion? I kind of hope I am!
What worries me is that I might be reading this all wrong. I know that async costs are much harder to reason about than single-threaded costs. In the end, the WaitAny might be nothing more than the equivalent of the "System Idle Process" on Windows: an artificial entry that reflects the free percentage of the CPU resource rather than real work.
Can anyone shed some light here for me, please?

Performance issue with WPF [closed]

I am facing a strange issue that throws the following exception.
The CLR has been unable to transition from COM context 0x22f3090 to COM context 0x22f32e0 for 60 seconds. The thread that owns the destination context/apartment is most likely either doing a non pumping wait or processing a very long running operation without pumping Windows messages. This situation generally has a negative performance impact and may even lead to the application becoming non responsive or memory usage accumulating continually over time. To avoid this problem, all single threaded apartment (STA) threads should use pumping wait primitives (such as CoWaitForMultipleHandles) and routinely pump messages during long running operations.
So, I am keen to know its possible causes in WPF. I am currently performing an operation that triggers this. I put a stopwatch around my code and checked the timings, but my code is not what takes the time; it is spent inside the runtime framework. I know I am probably doing something wrong in my code, so I would like to know the possible reasons for this kind of bug. Currently, invoking that operation takes more than 5 minutes, even though the operation is very simple.
After a lot of effort, I found I was making a silly mistake: I had put a DataGrid inside a ScrollViewer, which disables the default UI and data virtualization, so the grid was trying to load all of its objects regardless of whether they were needed.
So, a good note: never place a DataGrid inside a ScrollViewer.
<ScrollViewer>
    <DataGrid>
        ....
    </DataGrid>
</ScrollViewer>
Never do this.

estimate execution time before running a function

I have a function that runs for about 0.7 seconds on my not-so-new development machine. (it runs for about 3 seconds on another machine I tested)
I want to show the user some pre-message about half a second before the above function is done.
I don't want to show the message too long before the function is done as it will be annoying to just look at it and wait. On the other hand, I would rather not wait until the function is done because the whole thing starts from a user action and I don't want to waste time - it's better if I can show that message while the other function is doing its job.
I've already added a loop with a short Thread.Sleep() to let the pre-message linger if the function was "too fast", but I'm afraid that usually won't be the case... And so, I want to see if I can roughly estimate the execution time based on the current machine's specifications, and even the current CPU usage, before running the function. Also, since we are talking about seconds and milliseconds, if getting this information takes more than a few milliseconds then it's not worth it. In that case, I might calculate it only once, when the application is loaded.
Does anybody have an idea how to do that?
Estimating the time is pointless, and most likely impossible to do accurately (though there might be some scenarios where it could be done).
Look at Windows file copying in the past: "10 seconds left... 2 minutes left... 5 seconds left". It kept changing its estimate based on whatever metrics it used. Better to just show a spinning image, or a message, to let the user know something is going on.
If you are processing a list of items, then it will be much easier for you, as the message could be:
Processing item 4 of 100.
At least then the user can know, roughly, what the code is doing. If you have nothing like this to inform the user, then I would cut your losses and show a simple "Processing...." message, or some icon, whatever takes your fancy for your solution.
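That per-item message can be pushed out through a simple callback; in a UI application you would typically wrap it in Progress<T> so reports are marshalled back to the UI thread. A minimal sketch with hypothetical names (a plain Action<string> keeps it self-contained):

```csharp
using System;
using System.Threading.Tasks;

class ProgressDemo
{
    // Reports "Processing item i of total." after each unit of work.
    public static async Task ProcessAsync(int total, Action<string> report)
    {
        for (int i = 1; i <= total; i++)
        {
            await Task.Yield(); // stand-in for the real per-item work
            report($"Processing item {i} of {total}.");
        }
    }

    public static async Task Main()
    {
        await ProcessAsync(3, Console.WriteLine);
    }
}
```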

Advice on a time-consuming procedure in C#

I've developed a program using Delphi that, among some features, does a lot of database reading of float values and many calculations on these values. At the end of these calculations, it shows a screen with some results. These calculations take some time to finish: today, something like 5 to 10 minutes before finally showing up the results screen.
Now, my customers are requesting a .Net version of this program, as almost all of my other programs have already gone to .Net. But I'm afraid that this time-consuming calculation procedure wouldn't fit the web scenario and the program would take the user to some kind of timeout error.
So, I'd like some tips or advices on how to do this kind of procedure. Initially I thought about calling a local executable (that could be even my initial Delphi program, in a console way) and after some time show the result screen in a web page. But, again, I'm afraid this wouldn't be the best approach.
An external process is a reasonable way to go about it. You could fire off a thread inside the ASP.NET process (i.e. just with new Thread()), which could also work, but there are issues around process recycling and pooling that might make that a little harder. Simply firing off an external process and then using some Ajax polling to check its status from the browser seems like a good solution to me.
FWIW, another pattern that some existing online services use (for instance, ones that do file conversion that may take a few minutes) is having the person put in an email address and just send the results via email once it's done - that way if they accidentally kill their browser or it takes a little longer than expected or whatever, it's no big deal.
Another approach I've taken in the past is basically what Dean suggested - kick it off and have a status page that auto-refreshes, and once it's complete, the status includes a link to results.
How about:
Create a Web Service that does the fetching/calculation.
Set the timeout so it won't expire.
YourService.HeavyDutyCalculator svc = new YourService.HeavyDutyCalculator();
svc.Timeout = 10 * 60 * 1000; // 10 minutes: 10 min x 60 s x 1000 ms
YourService.CalculateResult result = svc.Calculate();
Note that you can set the Timeout property to -1 if you want the request to run indefinitely.
MSDN:
Setting the Timeout property to Timeout.Infinite indicates that the request does not time out. Even though an XML Web service client can set the Timeout property to not time out, the Web server can still cause the request to time out on the server side.
Call that web method from your web page.
Place a waiting/in-progress image.
Register for the web method's OnComplete event, and show the results upon completion.
You can also update the timeout in your web.config:
<httpRuntime useFullyQualifiedRedirectUrl="true|false"
maxRequestLength="size in kbytes"
executionTimeout="seconds"
minFreeThreads="number of threads"
minFreeLocalRequestFreeThreads="number of threads"
appRequestQueueLimit="number of requests"
versionHeader="version string"/>
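For instance, a hypothetical fragment raising the limit to 15 minutes (executionTimeout is given in seconds, and is only enforced when compilation debug="false"):

```xml
<configuration>
  <system.web>
    <httpRuntime executionTimeout="900" />
  </system.web>
</configuration>
```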
Regardless of what else you do, you need a progress bar or some other status indication for the user. Users are used to web pages that load in seconds; they simply won't realise (even if you tell them in advance) that they have to wait a full 10 minutes for their results.
