Web Page Hangs after Procedure execution completes - c#

I have a web page that accepts an Excel file, reads data from it, and stores it in a buffer table. A stored procedure is then called that reads from this buffer table and passes each record through a number of validations. After the procedure completes, I delete the buffer table contents from the code-behind. With about 100 records this executes in a few seconds. However, when the record count is increased (to around 2000), the procedure takes over 5 minutes and the web page hangs. I have checked the tables: the record insertion and buffer table deletion take about 6-7 minutes, but the web page does not return a result even after 30 minutes. I have tried to optimize the procedure, but with a large number of records the page still hangs.
Please give me some direction as to how to avoid this page hanging situation. Any help would be great. Thanks in advance

I think that the first thing you should do is wrap your inserts into a transaction.
If there are too many records for a single transaction, you could perform a commit every n records (say 500).
As far as the web page not returning goes, you could be reaching a timeout of some sort where IIS or the client abandons the request; or, if you update the page with data, you could have invalid data that is causing errors in the page.
For this, you should check the Windows event log to see if IIS or ASP.NET are reporting any exceptions. You can also run Fiddler to see what is happening to the request.
Finally, I would strongly suggest a redesign that does not require the user to wait with a submitted form on the screen until processing is complete.
The standard pattern that we use for this type of functionality is to record incoming requests in the database with a GUID, kick off a background worker to perform the task, and return the GUID to the client.
When the background worker has finished (or encounters an error), it updates the request table in the database with the new status (i.e. success or fail) and the error message if any.
The client can use the GUID to issue AJAX requests to the web server on a regular basis (using window.setTimeout so as not to block the user and to allow animations to be displayed) to determine whether or not the process is complete. Once the process is complete, the UI can be updated as needed.
Update
To record incoming requests in the database with a GUID, create a table that contains a GUID column as the primary key, a status column (3 values: in progress, success, failure), and an error message column.
When the request is received, create a new GUID in your code, then write a record to this new table with this GUID and a status of in progress prior to launching the background worker.
You will pass the GUID to the background worker so that it can update the table on completion (it just updates the status to success or failure and records the error message, if any).
You will also pass the GUID back to the client through JavaScript so that the client can periodically ask the web server to query the table using the GUID and determine when the request is no longer in progress.
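A minimal sketch of that flow (the Requests table, its columns, and the ProcessUpload / UpdateStatus helpers are illustrative assumptions, not a prescribed design):

// using System; using System.Data.SqlClient; using System.Threading;
Guid StartProcessing(string uploadedFilePath, string connectionString)
{
    var requestId = Guid.NewGuid();

    // 1. Record the request as "in progress"
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "INSERT INTO Requests (RequestId, Status) VALUES (@id, 'InProgress')", conn))
    {
        cmd.Parameters.AddWithValue("@id", requestId);
        conn.Open();
        cmd.ExecuteNonQuery();
    }

    // 2. Kick off the background worker, passing the GUID
    ThreadPool.QueueUserWorkItem(_ =>
    {
        try
        {
            ProcessUpload(uploadedFilePath);              // the long-running validation work
            UpdateStatus(requestId, "Success", null);     // hypothetical helper: updates Requests
        }
        catch (Exception ex)
        {
            UpdateStatus(requestId, "Failure", ex.Message);
        }
    });

    // 3. Return the GUID to the client, which polls for the status via AJAX
    return requestId;
}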

Related

Receive multipart response and treat each part as soon as received

Current situation: an existing SQL Server stored procedure I have no control upon returns 10 large strings in separate resultsets in about 30 seconds (~3 seconds per dataset). The existing ASP.NET Web API controller method that collects these strings only returns a response once all strings are obtained from the stored procedure. When the client receives the response, it takes another 30 seconds to process the strings and display the results, for a total of 1 minute from request initiation to operation completion.
Contemplated improvement: somehow transmit the strings to the client as soon as each is obtained from the SqlDataReader, so the client can work on interpreting each string while receiving the subsequent ones. The total time from request initiation to completion would thus roughly be halved.
I have considered the WebClient events at my disposal, such as DownloadStringCompleted and DownloadProgressChanged, but feel none is viable and generally think I am on the wrong track, hence this question. I have all kinds of ideas, such as saving strings to temporary files on the server and sending each file name to the client through a parallel SignalR channel for the client to request in parallel, etc., but feel I would be wasting both my time and your opportunity to enlighten me.
I would not resort to inverting the standard client / server relationship using a "server push" approach. All you need is some kind of intermediary dataset. It could be a singleton object (or multiple objects, one per client) on your server, or another table in an actual database (perhaps NoSql).
The point is that the client will not directly access the slow data flow you're dealing with. Instead the client will only access the intermediary dataset. On the first request, you will start off the process of migrating data from the slow dataset to the intermediary database and the client will have to wait until the first batch is ready.
The client will then make additional requests as he processes each result on his end. If more intermediary results are already available he will get them immediately, otherwise he will have to wait like he did on the first request.
But the server is continuously waiting on the slow data set and adding more data to the intermediate data set. You will have to have a way of marking the intermediate data as having already been sent to the client or not. You will probably want to spawn a separate thread for the code that moves data from the slow data source to the intermediate one.
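For illustration, a rough in-memory sketch of such an intermediary on the Web API side (the ConcurrentQueue standing in for the intermediary dataset, the route, and the hypothetical PumpFromStoredProcedure method that reads each resultset from the SqlDataReader and enqueues it are all assumptions):

// using System.Collections.Concurrent; using System.Net; using System.Threading;
// using System.Threading.Tasks; using System.Web.Http;
public class ReportController : ApiController
{
    // Intermediary dataset shared between the slow reader and the polling client
    private static readonly ConcurrentQueue<string> Intermediary = new ConcurrentQueue<string>();
    private static int _started;

    // GET api/report/next -- the client calls this repeatedly as it finishes each string
    [HttpGet]
    public IHttpActionResult Next()
    {
        // The first request starts the background pump exactly once
        if (Interlocked.Exchange(ref _started, 1) == 0)
            Task.Run(() => PumpFromStoredProcedure(Intermediary));

        string item;
        if (Intermediary.TryDequeue(out item))
            return Ok(item);                              // a string is ready: return it now

        return StatusCode(HttpStatusCode.NoContent);      // nothing yet: the client retries shortly
    }
}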

Sql connection timeout for a stored procedure execution from c#

I have one stored procedure that generates a report. It is very complex, so it sometimes takes up to 7-8 minutes to produce its output.
When I try to access it from the web page (C#), I get a connection timeout error.
I have already set the remote connection timeout to 0 (unlimited), and I have also tried supplying a connection timeout in the connection string.
I have around 600,000 (6 lakh) bill records, and I perform around six sums over different groups and different dates. Is there any solution to make this faster,
or any connection timeout workaround?
I think the problem is the page request timeout. I suppose you start the stored procedure execution right in the Page_Load event, and after some time IIS closes the request because of the timeout.
I suggest you remove the long-running work from the Page_Load event. After the page has loaded, send an AJAX request to the server (or to another page) to start the stored procedure execution, and check the execution result from time to time. When the result is ready, you can fetch it by AJAX and display it to the user.
Is it really connection timeout that you should be worrying about?
Since you have a long-running command, please ensure that you set CommandTimeout on your SqlCommand to 0.
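For example, a minimal sketch (the GenerateBillReport procedure name and the connection string are placeholders):

// using System.Data; using System.Data.SqlClient;
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("GenerateBillReport", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.CommandTimeout = 0;   // 0 = wait indefinitely for the command to finish
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        // read the report rows...
    }
}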

How to log changes to database every 5 minutes in a high-transaction application with C# and SQL?

Imagine this scenario: you have a WCF web service that gets hit up to a million times a day. Each hit contains an "Account ID" identifier. The WCF service is hosted in a distributed ASP.NET cluster and you don't have Remote Desktop access to the server.
Your goal is to save "number of hits per hour" for each Account ID into a SQL database. The results should look like this:
[Time], [AccountID], [NumberOfHits]
1 PM, Account ID (Bob), 10 hits
2 PM, Account ID (Bob), 10 hits
1 PM, Account ID (Jane), 5 hits
The question is: How can you do this without connecting to a SQL server database on every hit?
Here's one solution I thought of: store the temporary results in a System.Web.Cache object, listen for its expiration, and write all the accumulated data to the database when the cache expires.
Any thoughts on a better approach?
Deferred update is the key, indeed, and you are on the right path with your local cache approach. As long as you don't have a requirement to display the latest count on each visit, the solution is simple: update a local cache of account_id -> count and periodically sweep through this cache, replacing each count with 0 and adding it to the total in the database. You may lose some visit counts if your ASP.NET process is lost, and your displayed hit count will not be exact (Node 1 in the ASP.NET farm returns its last count, Node 2 returns its own local one, different from Node 1's).
If you must have an accurate display of counts in each returned result (whether this is a page response or a service response matters little), then it gets hairy quite fast. A centralized cache like memcached can help to create a solution, but it is not trivial.
Here is how I would keep the local cache:
// using System.Collections.Generic; using System.Data.SqlClient; using System.Linq;
class HitCountCache
{
    class Counter
    {
        public int Count { get; set; }
        public string AccountId { get; set; }
    }

    private readonly Dictionary<string, Counter> _counts = new Dictionary<string, Counter>();
    private readonly object _lock = new object();

    // Invoke this on every call
    public void IncrementAccountId(string accountId)
    {
        lock (_lock)
        {
            Counter counter;
            if (_counts.TryGetValue(accountId, out counter))
            {
                ++counter.Count;
            }
            else
            {
                _counts.Add(accountId,
                    new Counter { AccountId = accountId, Count = 1 });
            }
        }
    }

    // Schedule this to be invoked every X minutes
    public void Save(SqlConnection conn)
    {
        Counter[] counts;
        // Snap the counts, under lock
        lock (_lock)
        {
            counts = _counts.Values.ToArray();
            _counts.Clear();
        }
        // Lock is released, can do DB work
        foreach (Counter c in counts)
        {
            SqlCommand cmd = new SqlCommand(
                @"UPDATE [table] SET [count] += @count WHERE accountId = @accountId",
                conn);
            cmd.Parameters.AddWithValue("@count", c.Count);
            cmd.Parameters.AddWithValue("@accountId", c.AccountId);
            cmd.ExecuteNonQuery();
        }
    }
}
This is a skeleton; it can be improved, and it can also be made to return the current total count if needed, at least the total count as known by the local node.
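To schedule the periodic Save, one option is a System.Threading.Timer (a sketch; the 5-minute interval and connectionString are assumptions):

// Call cache.IncrementAccountId(accountId) on every service hit
var cache = new HitCountCache();

// Keep a reference to the timer so it is not garbage collected
var flushTimer = new System.Threading.Timer(_ =>
{
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();
        cache.Save(conn);
    }
}, null, TimeSpan.FromMinutes(5), TimeSpan.FromMinutes(5));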
One option is to dump the relevant information into your server logs (logging APIs are already optimised to deal with high transaction volumes) and reap them with a separate process.
You asked: "How can you do this without connecting to a SQL server database on every hit?"
Use connection pooling. With connection pooling, several connections to SQL Server are opened ONCE and then reused for subsequent calls. So on each database hit you do not need to connect to SQL Server, because you are already connected and can reuse an existing connection for your database access.
Note that connection pooling is used by default with the SQL Server ADO.NET provider, so you might already be using it without even knowing it.
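For example, pooling behavior can also be tuned through connection string keywords (the server, database, and pool sizes below are placeholders):

// Pooling is on by default; these keywords only tune it
var connectionString =
    "Data Source=myServer;Initial Catalog=myDb;Integrated Security=true;" +
    "Pooling=true;Min Pool Size=5;Max Pool Size=100;";

using (var conn = new SqlConnection(connectionString))
{
    conn.Open();    // taken from the pool if a connection is available
    // ... execute commands ...
}   // Dispose/Close returns the connection to the pool instead of closing it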
An in-memory object as proposed is fastest but risks data loss in the event of an app or server crash. To reduce data loss you can lazy-write the cached data to disk. Then periodically read back from the cache file and write the aggregated information to your SQL server.
Any reason why they aren't using app fabric or the like?
Can you get into the service implementation? If so, the way to hit this is to have the service implementation fire a "fire and forget" style logging call to whatever other service you've set up to log this puppy. It shouldn't hold up execution, should survive app crashes and the like, and won't require digging into the SQL angle.
I honestly wouldn't take the job if I couldn't get into the front end of things, most other approaches are doomed to fail here.
If your goal is performance on the website then, like another poster said, just use fire and forget. This could be a web service that you post the data to, or you can create a service running in the background listening on an MSMQ queue. I can give you more examples of this if interested. If you need to keep the website or admin tool in sync with the database, you can store the values in a high-performance cache like memcached at the same time you update the database.
If you want to run a batch of 100 queries on the DB in one go, make a separate service, again with MSMQ, which polls the queue and waits for more than 100 messages. Once it detects 100 messages, it opens a transaction with MSDTC, reads all the messages into memory, and batches them up to run in one query. MSMQ is durable, meaning that if the server shuts off or the service is down when a message is sent, it will still get delivered when the service comes online. Messages are only removed from the queue once the query has completed. If the query errors out or something happens to the service, the messages will still be in the queue for processing; you don't lose anything. MSDTC just helps you keep everything in one transaction, so if one part of the process fails everything gets rolled back.
If you can't make a Windows service to do this, then just make a web service that you call. You still send the MSMQ message each time a page loads, and, say, once every 10 page loads you fire the web service to process all the messages in the queue. The only problem you might have is getting the MSMQ service installed; however, many hosting providers will install something like this for you if you request it.
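A minimal sketch of the fire-and-forget sending side with System.Messaging (the private queue name hitlog is an assumption; MSMQ must be installed as a Windows feature):

// using System.Messaging;
const string QueuePath = @".\private$\hitlog";

// Create the queue once if it does not already exist
if (!MessageQueue.Exists(QueuePath))
    MessageQueue.Create(QueuePath);

// Fire and forget: enqueue the account id and return immediately
using (var queue = new MessageQueue(QueuePath))
{
    queue.Send(accountId, "hit");   // message body plus a label
}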

How to prevent browser timeouts in ASP.NET?

The scenario:
user uploads a small file of about 100 kB to the ASP.NET 4.0 application, containing perhaps 1000 units of work.
The server then gets to work on the units of work, one at a time.
Each unit takes a few seconds to complete due to requesting information from an external service.
The results are only saved to the database if all units were completed successfully, using a transaction.
Once completed, the user may get a list of what was done.
The problem is that the user does not get the confirmation; instead, I think his browser gives up because of a timeout.
Future files are expected to be a few hundred times larger, which will make the problem worse.
I want to prevent this timeout.
Here are some ideas I had:
Optimize the code to run faster. This is done and is no longer the problem.
Run the requests to the external service in parallel.
Increase the server timeouts a little.
Let the user upload the file and then send him an email with the results later when the file has been processed.
Somehow make the page refresh and show some progress information to the user while waiting, e.g., 5% complete - done in 10 minutes.
How could I implement this last step, showing progress information and preventing browser timeout?
Other suggestions are welcome.
Thanks,
You need to decouple processing the file from the browser response. This is achieved by:
Creating a persistent item (e.g., in a database table or a file) in a queue, so that it is fault tolerant
Returning a success result to the browser
Creating a queue worker to asynchronously process your queue
You put the uploaded data into a queue in your database. Then you have an asynchronous process (for example, a Windows service) pull items from your queue and process them. You can update your DB with the progress of each operation and, when fully completed, remove the item from the queue and update your other tables.
For progress, the user can then query the queue table for the status of his upload.
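For instance, the progress check can be a single query against that queue table (a sketch; the UploadQueue table and its columns are assumptions):

// using System; using System.Data.SqlClient;
// Called from an AJAX polling request with the upload's id
int GetProgressPercent(Guid uploadId, string connectionString)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "SELECT ProcessedUnits * 100 / TotalUnits FROM UploadQueue WHERE UploadId = @id", conn))
    {
        cmd.Parameters.AddWithValue("@id", uploadId);
        conn.Open();
        return (int)cmd.ExecuteScalar();
    }
}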
An easy trick is to write a piece of JavaScript to repeatedly request something from the current page. If you use a request action of HEAD then the server will only respond with a minimal amount of information each time.
Something like:
<script src="/js/jquery-1.3.2.min.js" type="text/javascript"></script>
<script type="text/javascript">
    $(document).ready(function () {
        setTimeout("callserver()", 6000);
    });

    function callserver() {
        var remoteURL = '/yourpage.aspx';
        $.get(remoteURL, function (data) { setTimeout("callserver()", 6000); });
    }
</script>

Is a timer a good solution for this?

I have an application that uses an MSSQL database.
The application has a module that is used for sending messages between application users.
When one user sends a message to another, I insert the message into the database and set the message status to 1 (after the user reads the message, I update the database and set the message status to 0).
Right now I am using System.Timers.Timer to check the message status, and if the message status is 1 the user gets an alert that he has one message in his inbox.
The problem is that this application can be used by many users, and if the timer runs every 5 minutes it slows down the application and the database.
Is there any other solution to do this, without running a timer?
Thanks!
I don't think the solution using a timer which does polling is that bad, and 50 users is relatively few.
Does each user run a client app that directly connects to the database? Or is this an ASP.NET app? Or a service that connects to the DB and notifies the client apps?
If you have client apps connecting directly to the DB, I'd stay with the timer and probably reduce the timeout (the number of queries seems to be extremely low in your case).
Other options
Use SqlDependency / query notifications (see MSDN); a sketch follows below.
Only if your message-processing logic gets more complex, take a look at Service Broker, especially if you need queuing behavior. But as things stand, that would be far too complex.
I wouldn't use a trigger.
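A minimal SqlDependency sketch (the dbo.Messages table and its columns are assumptions based on the question; query notifications also require Service Broker to be enabled on the database):

// using System.Data.SqlClient;
// Call SqlDependency.Start(connectionString) once at application start-up
// (and SqlDependency.Stop on shutdown).
void SubscribeToNewMessages(int userId, string connectionString)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        // Query notifications need a specific query form: two-part table names, no SELECT *
        "SELECT MessageId FROM dbo.Messages WHERE RecipientId = @userId AND Status = 1", conn))
    {
        cmd.Parameters.AddWithValue("@userId", userId);

        var dependency = new SqlDependency(cmd);
        dependency.OnChange += (sender, e) =>
        {
            // Fires once when the result set changes: alert the user, then re-subscribe
            SubscribeToNewMessages(userId, connectionString);
        };

        conn.Open();
        cmd.ExecuteReader().Dispose();   // executing the command registers the subscription
    }
}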
Maybe you should look into having a "monitor" service, which is the only one looking at changes in the database and then sending a message to the other applications (a delegate) that data has updated, and they themselves should fetch their own data only when they get that message.
If you are always checking against the message table, you can add a column to your user table named HasNewMessage, which is updated by a trigger on the message table.
To illustrate it:
User 1 gets a new message
The message-table trigger sets HasNewMessage to 1 for user 1
You then check every 5 minutes whether user 1 HasNewMessage (this should be faster thanks to the indexed user table)
When user 1 looks into his mailbox, you set HasNewMessage back to 0
Hope this helps
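The periodic check then becomes a single indexed lookup; for example (a sketch using the user table and HasNewMessage column described above, with otherwise assumed names):

// using System; using System.Data.SqlClient;
bool HasNewMessage(int userId, string connectionString)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "SELECT HasNewMessage FROM dbo.Users WHERE UserId = @userId", conn))
    {
        cmd.Parameters.AddWithValue("@userId", userId);
        conn.Open();
        return Convert.ToBoolean(cmd.ExecuteScalar());
    }
}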
