When I enter a username and password on my site and they are correct, a C# method is called in Page_Load that works on the database (it deletes the records that are no longer required).
Whether there is one record or 100, I still have to wait for that process to finish before the page loads :(
I am using this string to load all the files, which will then be used to compare files:
HttpContext.Current.Request.PhysicalApplicationPath;
However, if I use a static path, i.e. c:/images, then things go bad :(
So what could be a possible solution?
You can start the record removal asynchronously:
Asynchronous Operations (ADO.NET)
Then your Page Load will occur before the removal operation is finished.
EDIT: Since you mention that you are using an Access DB, I guess that you are not losing the time deleting the records but in some other operation (I suspect closing the DB; see my comment on Amir's answer). The thing you should do now is benchmark, either by using a tool (see this question) or "manually", using the Stopwatch class. Either way, before you try to optimize, use one of these methods to find out what is really causing the delay.
Use AJAX and make the operation async as a web service.
Edit 1: What I mean is to move the code from Page_Load into a web service method, then call that web service from JavaScript after the page loads, sending it the information it needs to perform your operation properly; that way the client side appears more responsive. I am assuming that the actions taken are not required to render your client-side code properly; if they are, you might consider updating the page after the web service returns. This could be done manually, through the built-in AJAX toolkit, or via a library such as jQuery.
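A minimal sketch of that idea. The endpoint name `/CleanupService.asmx/DeleteOldRecords` is hypothetical, and `post` stands in for whatever AJAX transport you use (e.g. jQuery's $.post):

```javascript
// Sketch: move the Page_Load cleanup into a web service and call it from
// JavaScript once the page has rendered, so the user never waits on it.
// The endpoint URL below is an assumption, not from the question.
function callCleanupService(post, onDone) {
  // `post` abstracts the AJAX transport; the page renders immediately
  // while the slow delete runs server-side.
  post('/CleanupService.asmx/DeleteOldRecords', {}, function (result) {
    onDone(result); // e.g. update a status label on the page
  });
}

// With jQuery you would wire it up roughly like:
// $(function () { callCleanupService($.post, function (r) { /* update UI */ }); });
```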
This doesn't sound like an async problem to me. Deleting 100 or even 1000 records in a database shouldn't take more than a few milliseconds. If I were to guess, I would think you have not set up your indexes correctly, so instead of deleting those records via a quick index lookup, the database has to look through every record to see if it's a match.
I have an ASP.NET application and a C# application hosting a WebBrowser. The ASP.NET application is run through the C# application.
I need to notify the C# application when, e.g., a button is clicked. The best approach will probably be through JavaScript. This doesn't seem complicated, as I can expose functions to JavaScript via window.external, but the only URL my C# application sees is /Default.aspx. All the JavaScript calls (window.external.myfunc(..)) have to come from this page.
Any ideas? I'm drawing a blank. I'm also a bit unsure how to call the JavaScript functions from code-behind. ScriptManager.RegisterClientScriptBlock seems to be used a lot, but can it be called several times on one page?
Thanks!
By the way, for the C# client I'm setting WebBrowser.ObjectForScripting to a custom object which receives these window.external function calls from JavaScript. This works.
The part with ObjectForScripting and window.external will work fine.
And in order to call JS functions from your ASP.NET code, you should use AJAX calls from the JS side. There are two approaches:
1. Create a web service and invoke it repeatedly to check whether anything has happened (i.e. polling). This is the simplest way and can easily be implemented, for example using jQuery:
$.ajax(...);
Then call your window.external.func() from the callback of that AJAX call (if it returned the intended value, of course). Note that with this method you will always have a delay, because the client has to poll the server.
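A rough sketch of the polling approach. The `checkStatus` and `schedule` callbacks stand in for the real $.ajax call and setTimeout; the names are mine, not from the answer:

```javascript
// Poll the server until it reports an event, then notify the host app.
// `checkStatus` stands in for the $.ajax call and `schedule` for
// setTimeout; both names are placeholders for illustration.
function pollUntilDone(checkStatus, onDone, schedule) {
  schedule(function tick() {
    checkStatus(function (status) {
      if (status === 'done') {
        onDone();        // e.g. window.external.func()
      } else {
        schedule(tick);  // not ready yet: keep polling
      }
    });
  });
}

// In the browser you would wire it up roughly like this:
// pollUntilDone(
//   function (cb) { $.ajax('/Events.asmx/Check').done(cb); },
//   function () { window.external.func(); },
//   function (fn) { setTimeout(fn, 30000); });
```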
2. Open a connection to the server and keep it open, then send the data from the server when it's ready. For this you can make use of something like SignalR. This solves the delay issue, but keeps an open connection, which in turn brings other issues.
Actually, the main factor in deciding between these two methods is whether you are okay with a delay. If waiting an average of 15 seconds is not a problem, use the first method with a polling interval of 30 seconds. If you want it to be real-time, use the second method, but be cautious about its own issues, such as long-running connections, the lack of session state in SignalR, and so on.
I just wanted to show you the choices. Please tell me if anything is unclear.
I am currently working on a project in ASP.NET MVC 4 and came across a module where a progress bar is needed. The question I have right now is: "What is the best way to implement an async progress bar?"
After some lookup I came across the following method:
Create a startEvent() and getProgress() in C# code.
Use javascript setTimeout() to call the getProgress() method asynchronously.
(Example: https://www.devexpress.com/Support/Center/Example/Details/E4244)
My remark about this method is that it makes the code dependent on the timeout you choose. So it would take some fiddling to find the best and most performant timeout.
Now the method that I would most likely have used before I researched the matter is the following:
In code behind, create a method handleItem(int index) which takes an index and does everything you want to do with the item at that index.
Determine the number of items you want to handle and pass that to your javascript.
In JavaScript, initiate a for loop that runs from 0 to amount - 1 and, for each index, initiates an AJAX call to handleItem(i).
In that AJAX call's complete callback, you can update the progress bar with the new amount.
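The loop described above might be sketched like this. The `callHandleItem` and `updateBar` callbacks stand in for the real AJAX call to handleItem(i) and the progress-bar update; the names are illustrative:

```javascript
// Per-item approach: fire one call per item and bump the progress bar in
// each completion callback. `callHandleItem(i, done)` stands in for the
// ajax call to handleItem(i); `updateBar(pct)` would set the bar's width.
function processWithProgress(count, callHandleItem, updateBar) {
  var done = 0;
  for (var i = 0; i < count; i++) {
    callHandleItem(i, function () {
      done++; // one more item finished
      updateBar(Math.round((done / count) * 100)); // percent complete
    });
  }
}
```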
My questions here are the following:
Does this expose too much of the program logic?
Does this create too much overhead seeing as every call goes to the server and back?
Are there any other reasons why I should refrain from using this method?
Thanks in advance
Koen Morren
This is not a recommended strategy, because the client drives the process. If there is any loss of connectivity, or the user closes the browser, the process will stop.
Generally, if you use straight HTTP you will need to poll (a.k.a. pull) from JavaScript. The pseudocode is pretty much this:
1. A call creates a task ID and sends it to the client.
2. The client queries the status of the task with the given ID.
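That pseudocode could be sketched as follows. `startTask` and `getStatus` are placeholders for the two server calls, and a synchronous loop is used only to keep the sketch short; in a browser you would poll on a timer instead:

```javascript
// Task-id flow: start the job, get an id back, then query the status for
// that id until it reaches 100%. `startTask` and `getStatus` stand in for
// the two server round-trips described in the pseudocode above.
function trackTask(startTask, getStatus, onProgress) {
  var id = startTask();      // server creates the task and returns its id
  var status;
  do {
    status = getStatus(id);  // client polls the status with that id
    onProgress(status.percent);
  } while (status.percent < 100);
  return id;
}
```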
Another possibility are WebSockets, which allow your client to listen for changes that are pushed by the server.
There are many options for storing the progress of a given task. You can index the progress by the HttpContext, a task id, or some user id, or even store it in a database and use SqlDependency to get notified when the status changes.
In summary, polling has more lag than push mechanisms. Clients should not drive an asynchronous process; they should either be notified of its status or given a mechanism to query it.
Unlike classic ASP.NET, there are few ways to push data from the server to the client in MVC; WebSockets or SignalR-like APIs can work for you.
The AJAX approach is good and gives you a reliable mechanism to update the data: no matter whether the user goes to another page or closes the browser, every time the AJAX call fires it will update the UI. So there is nothing wrong there; just pick a sensible interval in the JavaScript.
1. Does this expose too much of the program logic?
Code will be written only in the class file to calculate the current percentage.
2. Does this create too much overhead, seeing as every call goes to the server and back?
No, AJAX calls are lightweight.
3. Are there any other reasons why I should refrain from using this method?
This method will allow the user to freely navigate to other resources, as the AJAX calls work independently.
I have 2 different classes that I am testing for sending files to the browser.
The first one, at http://pastebin.org/1187259, uses Range-specific headers in order to support resuming.
The second one, at http://pastebin.org/1187454, uses chunked reading to send large files.
Both work fine, with one difference: the first one is much slower than the second in terms of download speed. With the first one I cannot get past 80 KB/s; with the second one I can go as fast as possible.
I have done a few tests and the result was the same. Is this an illusion, or is there something in the first one that slows the download speed?
I also noticed that the first one seems to block other requests. For example, if I request a file from the server with the first one, the server will not respond to my other requests until the download finishes, even if I request a different page. It doesn't do that if I open separate sessions from different browsers.
Thanks.
At last! I managed to fix the issue by adding EnableSessionState="ReadOnly" to the download page.
See http://www.guidanceshare.com/wiki/ASP.NET_2.0_Performance_Guidelines_-_Session_State
"Use the ReadOnly Attribute When You Can
For pages that only need read access to session data, consider setting EnableSessionState to ReadOnly.
Why
Page requests that use session state internally use a ReaderWriterLock object to manage session data. This allows multiple reads to occur at the same time when no lock is held. When the writer acquires the lock to update session state, all read requests are blocked. Normally two calls are made to the database for each request. The first call connects to the database, marks the session as locked, and executes the page. The second call writes any changes and unlocks the session. By setting EnableSessionState to ReadOnly, you avoid blocking, and you send fewer calls to the database thus improving the performance.
"
I have code in my ASP.NET page where I insert some data into the database while uploading a file to the server. The problem is that the application seems to wait for the file to be uploaded before it inserts into the database. Below is code similar to mine.
public partial class _Default : System.Web.UI.Page
{
    protected HtmlInputFile XLSFileInput;
    ...

    protected void ImportButton_Click(object sender, EventArgs e)
    {
        InsertToDatabase(); // method to insert to database
        XLSFileInput.PostedFile.SaveAs(filePath + fileName);
    }
    ...
}
The InsertToDatabase() method seems to execute only after the file has been uploaded to the server. Any help is appreciated.
This has NOTHING to do with IIS (web server) threading; it's more of an HTTP "problem".
The file is selected on the client, and then all of the request data (including the file) is posted to the server before any server code is run.
So the file is uploaded, but not yet saved, before InsertToDatabase(); is executed.
This behaviour can only be worked around by doing several posts (e.g. with AJAX), and that is probably not a good solution for you.
Tell us more about what you are trying to accomplish and we might come up with some better suggestions :).
I would not recommend trying to manually control threading in ASP.NET, that is a job best left purely to IIS.
IMO the better way to handle this in ASP.NET is to invoke these requests either through multiple AJAX operations from the browser, or by setting up a WCF service that supports one-way operations: when you call "InsertToDatabase", it executes a fire-and-forget call to the WCF service sitting on top of your database, which starts immediately while execution continues to the next line of code. IIS then runs the code that the service method calls on its own thread.
IMO, using WCF is one of the most appropriate ways of handling threading in ASP.NET, since you can easily set any service method to be synchronous or asynchronous.
Edit:
Introduction to Building Windows Communication Foundation Services
What You Need To Know About One-Way Calls, Callbacks, And Events
The file upload is a part of the HTML form post. Since the upload is a part of the form post process, it's always going to happen before any of your server-side code executes. The PostedFile.SaveAs() call doesn't cause the upload to happen, it just saves what was already uploaded as part of the request.
If you absolutely need the database insert to happen before the upload begins, you could do as #Chris Marisic suggests and run the insert as an AJAX call prior to submitting the form.
That's generally how single threaded applications work.
From your code I am guessing you're using ASP.NET Web Forms.
You would have to consider sending the InsertToDatabase() operation off elsewhere to free up the program to do your file upload. Perhaps, depending on your version, consider UpdatePanels as a quick-and-dirty way to achieve this?
You suggest these operations are separate; provide more details to help figure out what each task does and whether JavaScript is an option.
But your code sample indicates that InsertToDatabase() should run before the file save.
If you are unsure of AJAX/JavaScript, you could use the thread pool to run your InsertToDatabase() method. However, it all depends on what that method does, so please provide more code/details. I personally use this for a database insert that happens in the background (a logging action filter in ASP.NET MVC), so other users may disagree on the validity of this usage. However, it might save you learning another language.
ThreadPool.QueueUserWorkItem(delegate
{
    // Code here
});
You could start another thread for the database insert. For Example:
ThreadStart job = new ThreadStart(InsertToDatabase);
Thread thread = new Thread(job);
thread.Start();
I am writing a simple web page with a bunch of textboxes and a button control. When the user has finished editing the values in these textboxes, the user clicks the button, and the button invokes a heavily process-intensive algorithm in server-side code based on the data received from the client (the textboxes).
It can sometimes take up to 30 to 45 minutes to complete the whole operation, so the current thread stays inside the button click event handler for all that time.
The background task provides only one event, and the web page subscribes to it to get some text data after each stage of processing.
I was wondering if there is any way I can keep the user up to date with the current progress of that background task. I have a div element to print the current status information in.
So I am looking for some sort of reverse mechanism to "get" and "post".
I have read some articles on Comet (programming), but I can't find an easy or definitive answer.
Thanks in advance
Perhaps the simplest way is to submit the job, and get an id back via a straightforward page POST. That id corresponds to the long running job on the server side.
The returned page can then automatically refresh via the HTTP meta refresh mechanism, and the server can return status etc. in the refreshed page. The page will continually refresh with the id every (say) 30s until the job is complete.
It's a brute-force mechanism, but it's straightforward. Plus it has the advantage of allowing the user to bookmark the page and go away/come back. Given that your job is running for 30/45 mins, that could be important.
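A sketch of the refresh tag the status page would emit (the URL and id are made up for illustration):

```html
<meta http-equiv="refresh" content="30;url=/JobStatus.aspx?id=12345">
```

The server renders this tag into the status page with the job's real id, so the browser re-requests the status every 30 seconds until the job completes.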
You are on the right track with 'Comet' (a.k.a. Ajax Push or Reverse Ajax). Comet is essentially an umbrella term for the server's ability to 'push' something back to the client without the client triggering the event. Some AJAX frameworks are starting to build this in, and it is not something I would suggest trying to implement yourself close to the metal, because there are a lot of different things to consider (different web app servers, threading models, etc.). You can see that Stack Overflow has some of this functionality: it notifies you of a new answer coming in while you are writing one yourself for a given question.
Now, all that being said, in your case, given how long the server-side processing runs, I agree with Brian's answer that you should give the running job an id and use a simpler refresh mechanism to check it. If you wanted to add the look and feel of a server-side push, you could query the server via standard AJAX on an interval to check whether the job is done, rather than doing a simple refresh, and change the page through that AJAX call when it finishes. If your job can report progress, then standard AJAX could refresh your div with that progress, and there is no need for a server-side push.
I agree completely with Brian Agnew's suggestion, with one possible improvement. Since you're essentially doing server-push operations, it might be worth considering a Comet server. Now, I say that with a caveat: if you're really only managing jobs that complete every 30-45 minutes, then it may well be overkill. However, it has the advantage that you can push the results to the user when they've completed, and if the user is no longer connected, you can do something different (such as send them a notification email).
It depends:
If you need to be instantly notified when the process ends, Comet (long polling) is for you.
If some delay is acceptable (for instance, the system notifies you one minute after the process finishes), then AJAX timers (polling) are for you.
Take a look at three examples of both techniques based on the ItsNat framework:
Comet
AJAX Timers
Asynchronous Tasks (similar to Comet, one shot)
Sorry, it's Java.