I have a .NET application in C# that needs some asynchronous processing. The user will upload a file of PEOPLE records from the aspx page, and the system should upload the file, parse it, load the people into the database, and then move the file out of the directory it was initially uploaded to.
I don't want the user to have to wait for this, as some files may have thousands of records, and the system could take a while to go through all of them. I basically want to return a message that says, "Thank you for uploading your file. The system will process it and send you an email when it has been completed".
I have read about the new async/await keywords, but I'm not sure if that is the route that I should take for this. I basically just need a button event handler to upload the file and then kick off a "batch" process to finish dealing with the file while returning control to the user on the UI to do whatever else he/she wants to do on the site.
I guess the secondary question would be: Does await return control to the user when used within a button event handler method? If so, then this seems to be a perfect solution for me as it won't block the thread or the user.
Is there a better method or pattern that I should use for this, or is async/await sufficient?
I guess the secondary question would be: Does await return control to the user when used within a button event handler method? If so, then this seems to be a perfect solution for me as it won't block the thread or the user.
await does yield to the message loop when used within a button event handler method in a GUI application.
Is there a better method or pattern that I should use for this, or is async/await sufficient?
async/await will not do this, because async doesn't change the HTTP protocol - your await will just yield to the ASP.NET thread pool (not to the user's browser).
The best way to solve this is to have your ASP.NET page write the file to disk (asynchronously, if possible), and then return the response "we'll email you when it's done".
Then you can have a Win32 service or something that monitors that directory, processes the files, and sends emails. Note that you should use Azure Blobs/Queues instead of the file system if you plan to deploy this to the cloud.
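For the watcher side of that suggestion, here is a minimal sketch of a service body using FileSystemWatcher. The folder paths, the file pattern, and the processing/email helpers are placeholders, not part of the original answer:

```csharp
using System.IO;

public class UploadWatcher
{
    private readonly FileSystemWatcher _watcher;

    public UploadWatcher(string uploadFolder)
    {
        // Raise an event whenever a new file lands in the upload folder
        _watcher = new FileSystemWatcher(uploadFolder, "*.csv");
        _watcher.Created += (s, e) => Process(e.FullPath);
        _watcher.EnableRaisingEvents = true;
    }

    private void Process(string path)
    {
        // Parse the file, insert the PEOPLE records, then move the file
        // out of the upload directory and notify the user - these helpers
        // are hypothetical:
        // ParseAndLoad(path);
        // File.Move(path, Path.Combine(@"C:\processed", Path.GetFileName(path)));
        // SendCompletionEmail(path);
    }
}
```

A class like this would typically be hosted inside the Windows service's OnStart method.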
You can use an asynchronous page - based on <%@ Page Async="true" ... %>
Link : http://msdn.microsoft.com/en-us/magazine/cc163725.aspx
If you're comfortable with using ASP.NET/ThreadPool to process the file parsing/database work/email notification you can implement this using an HttpModule/Timer. Use ASP.NET to upload the file and the HttpModule to either monitor file system for new uploads and process the files, or add a database record when file is uploaded and periodically poll the database for new records.
This article will get you going and also outlines the issues you may face using this method.
If you need something more complicated or your site has a lot of traffic you'd better use a separate process (windows service or scheduled task) for the batch process.
If you require any sample code I can post some, but I'm currently using .NET 3.5 rather than 4.0.
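In the meantime, a rough sketch of the HttpModule/Timer idea described above; the polling interval and the body of the processing method are assumptions:

```csharp
using System;
using System.Threading;
using System.Web;

public class UploadProcessingModule : IHttpModule
{
    private static Timer _timer;
    private static readonly object _sync = new object();

    public void Init(HttpApplication context)
    {
        // Init can run once per pooled HttpApplication instance,
        // so guard against creating more than one timer
        lock (_sync)
        {
            if (_timer == null)
                _timer = new Timer(_ => ProcessNewUploads(), null,
                                   TimeSpan.Zero, TimeSpan.FromMinutes(1));
        }
    }

    private static void ProcessNewUploads()
    {
        // Scan the upload folder (or poll the database) for pending files,
        // parse them, load the records, and send notification emails
    }

    public void Dispose() { }
}
```

Note the caveat from the linked article still applies: ASP.NET may recycle the app domain at any time, taking the timer with it.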
I have an MVC application that has an action that serves file. I have a requirement that if a user has initiated a download, the user cannot initiate a new download until the current one finishes.
To put simply the user can only download one file at a time. However, I don't want to simply block the user's request but want to be able to let them know that they cannot download a new one because they are downloading an existing one.
So my question is: how do you keep track of when a download has been started but has not yet finished? I initially turned to dynamic IP restriction in IIS, but the problem is the end user could be sitting behind layers upon layers of proxies, which might make it unreliable. Is there a way to do this through code?
I also tried using FileWebRequest / FileWebResponse which creates an ongoing stream. But when a new request comes in it enters as a new action and I don't know how to check if the previous stream is still in progress.
public async Task<ActionResult> Download()
{
    Session["IsDownloading"] = true;
    await ServeFiles();
    return View("Index");
}

public ActionResult CheckStatus()
{
    var message = Session["IsDownloading"] ?? "Null";
    return Content(message.ToString());
}

private async Task ServeFiles()
{
    try
    {
        // Download asynchronously
    }
    finally
    {
        Session["IsDownloading"] = false;
    }
}
***********************Update*********************
I've updated the sample code. Download can be invoked any number of times to download a file in c:\temp. While that's downloading, I was trying to invoke CheckStatus, but it doesn't respond while the download is in progress - only when the download completes.
I've added a project solution here.
https://github.com/janmchan/DownloadThrottling.git
***********************Update 2016-04-04 **********
After a bit of searching I found out that session state (I'm using SQL Server) is saved at the end of the request, so even if the user makes two concurrent requests, the session variable won't be updated until the end of the first request, making the information in the session unreliable. So now I'm looking for a way to manually force the session state to save right after I update the value.
I think you need three things: identify the user, check whether they have an existing download, and block if they do.
The easiest way would be to set Session["isDownloading"] = true; and then check this. Set it to false when the download has finished.
From the client, when the user wants to initiate a download, you can use JavaScript to call a CheckIsDownloading action and, if OK, call Serve to get the file.
In Serve, check again and block if they are already downloading, to avoid scenarios where someone hacks the JS and makes download requests directly.
Note: the user identity part is hard to do if the user isn't logged in and you want to make sure it's only one download. They could delete cookies at any time (-> new session) and start a new download.
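A sketch of the server side of this idea; the controller and action names are illustrative, and the 409 status code is just one reasonable choice:

```csharp
using System.Web.Mvc;

public class DownloadController : Controller
{
    // Called from JavaScript before the real download is started
    public ActionResult CheckIsDownloading()
    {
        bool busy = Session["isDownloading"] as bool? ?? false;
        return Json(new { busy }, JsonRequestBehavior.AllowGet);
    }

    public ActionResult Serve()
    {
        // Check again server-side in case the client-side check was bypassed
        if (Session["isDownloading"] as bool? ?? false)
            return new HttpStatusCodeResult(409, "Download already in progress");

        Session["isDownloading"] = true;
        // Stream the file; resetting the flag when streaming has actually
        // finished is the tricky part (see the question's update about when
        // session state is persisted)
        return File(@"C:\temp\file.zip", "application/zip", "file.zip");
    }
}
```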
The final solution that I ended up using is similar to #eugen's answer, but instead of session state it uses a data cache (MemoryCache), which takes effect immediately and does not wait for the request to finish. Although this is in-process only, that is fine here: I found out that even though the application servers are load balanced, once a session is established the client infrastructure guarantees that the same server is used for the rest of the session.
I agree with Erik Funkenbusch's comment that the user can easily get around this; this solution is only for this specific requirement.
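A sketch of what that MemoryCache flag can look like; the key scheme and the expiration window are assumptions:

```csharp
using System;
using System.Runtime.Caching;

public static class DownloadTracker
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    public static bool TryBeginDownload(string sessionId)
    {
        // AddOrGetExisting returns null when the key was newly added,
        // so a non-null result means a download is already in progress.
        // The expiration acts as a safety net if EndDownload is never called.
        var existing = Cache.AddOrGetExisting(
            "downloading:" + sessionId, true,
            DateTimeOffset.Now.AddMinutes(30));
        return existing == null;
    }

    public static void EndDownload(string sessionId)
    {
        Cache.Remove("downloading:" + sessionId);
    }
}
```

Unlike session state, the cache entry is visible to a concurrent request on the same server the moment it is written.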
I have a web page index.aspx and corresponding server-side code index.aspx.cs. This C# code has a method which cannot be executed in parallel if multiple clients connect to my website. How can I restrict this?
Here is what the method does. It creates a folder, zips it, and makes it available for the user to download. My requirement is that while one user is executing this method, another user should not be able to, because it would create the same folder again, which leads to corruption of data.
I tried using Session objects, but I came to know that Session objects are stored on a per-client basis.
Can anyone suggest a solution?
My immediate advice would be: create a random folder name per request, which would allow you to run them concurrently. However, if that isn't an option then you will need to synchronize using something like lock or Mutex. However, this would only work well if you are returning the result from the current request, rather than zipping it in one request, and letting them download it the next.
Frankly, though, I think that you should do the zip in the request for the zip. Indeed, unless the file will be huge you don't even need to touch the file-system - you can create a zip in-memory using MemoryStream and any of the zip encoders (System.IO.Packaging.ZipPackage for example) - then just hand the client the data from the MemoryStream.
If you are using MVC, this is just return File(contents, contentType). With vanilla ASP.NET you need a few more steps.
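As an illustration of the in-memory approach, here is a sketch using System.IO.Compression.ZipArchive (available from .NET 4.5, with a reference to System.IO.Compression.FileSystem) rather than ZipPackage; the folder and file names are made up:

```csharp
using System.IO;
using System.IO.Compression;
using System.Web.Mvc;

public class ExportController : Controller
{
    public ActionResult DownloadZip()
    {
        using (var ms = new MemoryStream())
        {
            // leaveOpen: true so the MemoryStream survives disposing the archive
            using (var zip = new ZipArchive(ms, ZipArchiveMode.Create, leaveOpen: true))
            {
                foreach (var file in Directory.GetFiles(@"C:\data\export"))
                    zip.CreateEntryFromFile(file, Path.GetFileName(file));
            }
            // No shared folder on disk, so concurrent requests cannot collide
            return File(ms.ToArray(), "application/zip", "export.zip");
        }
    }
}
```

Because each request builds its own stream, no locking is needed at all.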
The Application context or a static class is application-wide, so you can store a flag which indicates that the process has already started. After the process has ended, you can delete the flag.
http://msdn.microsoft.com/en-us/library/94xkskdf(v=vs.100).aspx
And always use Application.Lock when you write to the application state and lock(mutex) when you use a static class.
In your case a static class would be a better solution, because it seems that the application context exists only for compatibility purposes with classic ASP: Using static variables instead of Application state in ASP.NET
static readonly object mutex = new object();

lock (mutex)
{
    // Do the work
}
If you use the classic ASP.NET session you do not need to do anything, because the session already locks page execution across concurrent requests.
If not, then you can follow what Marc suggests and use a Mutex.
About the session lock:
Web app blocked while processing another web app on sharing same session
jQuery Ajax calls to web service seem to be synchronous
ASP.NET Server does not process pages asynchronously
Replacing ASP.Net's session entirely
I have a long-running operation that you can read about in a couple of my other questions (for your reference, here are the first and second).
At the beginning of the whole deal, the project exposes a form in which the user specifies all necessary information about an XML file and uploads the XML file itself. In that method, all the user input data is captured and sent to a WCF service that handles such kinds of files. The controller gets back only the task id of the processing.
Then the user is redirected to a progress bar page and periodically retrieves the status of the task's completeness, refreshing the progress bar.
So here is my issue. When processing of the XML file is over, how can I get the results back and show them to the user?
I know that HTTP is a stateless protocol, but there is the cookie mechanism that could help in this situation. Of course, I could just save the processing results to some temporary place, like a static class on the WCF server, but there is a high load on the service, so it would eat all of the supplied memory.
In other words, I would like to pass a task to the WCF service (using netNamedPipeBinding) and receive the results back as fast as really possible. I want to avoid temporarily saving the result to some buffer and waiting until the client gathers it back.
The furthest I've got is using a temporary buffer, not on the service side but on the client's:
using (XmlProcessingServiceClient client = new XmlProcessingServiceClient())
{
    // Subscribe before starting the call so the completion event isn't missed
    client.AnalyzeXmlCompleted += (sender, e) =>
    {
        System.Web.HttpContext.Current.Application.Lock();
        // Here I just use a single slot for all clients. I know it is not
        // right; it is just for illustration purposes.
        System.Web.HttpContext.Current.Application["Result"] = e;
        System.Web.HttpContext.Current.Application.UnLock();
    };
    client.AnalyzeXmlAsync(new Task { fileName = filePath, id = tid });
}
I suggest you use a SignalR hub to address your problem. You get a way to call a method on the client directly to notify that the operation has completed, and this happens without having to deal with the actual infrastructure trouble involved in implementing such strategies. Plus, SignalR plugs easily into an ASP.NET MVC application.
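As a sketch of how that might look with SignalR 2 (hub, group, and client-side method names are all illustrative):

```csharp
using Microsoft.AspNet.SignalR;

public class ProgressHub : Hub
{
    // The progress page calls this after connecting, joining a group
    // keyed by the task id it was given
    public System.Threading.Tasks.Task Subscribe(string taskId)
    {
        return Groups.Add(Context.ConnectionId, taskId);
    }
}

public static class ProgressNotifier
{
    // Called by whatever code observes that the WCF task has finished
    public static void NotifyCompleted(string taskId, object result)
    {
        var hub = GlobalHost.ConnectionManager.GetHubContext<ProgressHub>();
        // Invokes a JavaScript handler named operationCompleted on the
        // clients subscribed to this task
        hub.Clients.Group(taskId).operationCompleted(result);
    }
}
```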
To be honest I didn't really get the part about the WCF server and such, but I think I can give you a more abstract answer. To be sure, this is what I understand:
You have a form with some fields + file upload
The user fills in the form and supplies an XML file
You send the XML file to a WCF service which processes it
In the meantime, show a progress bar which updates
After completion show the results
If this is not what you want, or this is not what your question is about, you can skip my answer; otherwise read on.
Before we begin: step 3 is a bit ambiguous. It could mean that we send the data to the service and wait for it to return the result, or that we send the data to the service and don't wait for it to return the result.
Situation 1:
Create in a view the form with all the required fields
Create an action in your controller which handles the postback.
The action will send the data to the service and when the service returns the result, your action will render a view with the result.
On the submit button you add a JavaScript onclick event. This will trigger an AJAX call to some server-side code which will return the progress.
The JavaScript shows some sort of status bar with the correct progress and repeats itself every x seconds.
When the controller finishes it will show the result
Situation 2:
Same as step 1 in situation 1
Same as step 2 in situation 1
After sending the data to the service, the controller shows a view with the progress bar.
We add a JavaScript event on document ready which checks the status of the XML file and updates a progress bar (the same as the onclick event in step 4 of situation 1).
When the progress bar reaches 100% it will redirect to a different page which shows the results
Does this answer your question?
Best regards,
BHD
netNamedPipeBinding will not work for cross-machine communication if this is what you have in mind.
If you want to host your service on IIS then you will need one of the bindings that use HTTP as their transport protocol. Have a look at duplex services, which allow both endpoints to send messages. This way the server can send messages to the client any time it wishes to. You could create a callback interface for progress reporting. If the task is going to take a considerable amount of time to complete, then the overhead of progress reporting through HTTP might be acceptable.
Also have a look at Building and Accessing Duplex Services if you want to use a duplex communication over HTTP with Silverlight (PollingDuplexHttpBinding).
Finally, you could look for a Comet implementation for ASP.NET. On CodeProject you will find at least a couple (CometAsync and PokeIn).
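As an illustration of the duplex idea, a contract pair might look like this; the interface and method names are made up for the example:

```csharp
using System.ServiceModel;

// The service contract declares which interface the client must
// implement to receive callbacks
[ServiceContract(CallbackContract = typeof(IProgressCallback))]
public interface IXmlProcessingService
{
    [OperationContract(IsOneWay = true)]
    void AnalyzeXml(string fileName, int taskId);
}

// Implemented on the client; the service calls these as work proceeds
public interface IProgressCallback
{
    [OperationContract(IsOneWay = true)]
    void ReportProgress(int taskId, int percentComplete);

    [OperationContract(IsOneWay = true)]
    void Completed(int taskId, string result);
}
```

Inside the service, OperationContext.Current.GetCallbackChannel<IProgressCallback>() retrieves the channel back to the caller.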
I'm not sure if this is the best solution but I was able to do something similar. This was the general setup:
Controller A initialized a new class with the parameters for the action to be performed and passed the user's session object
The new class called a method in a background thread which updated the user's session as it progressed
Controller B had JSON methods that, when called by client-side JavaScript, checked the user's session data and returned the latest progress.
This thread states that using the session object in such a way is bad, but I'm sure you can do something similar with a thread-safe storage method like SQL or a temp file.
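A rough sketch of that shape, substituting a thread-safe in-memory dictionary for the session object as suggested; all names are illustrative:

```csharp
using System.Collections.Concurrent;
using System.Threading.Tasks;
using System.Web.Mvc;

public static class ProgressStore
{
    // Thread-safe shared store: background workers write, JSON actions read
    public static readonly ConcurrentDictionary<string, int> Progress =
        new ConcurrentDictionary<string, int>();
}

public class WorkController : Controller
{
    public ActionResult Start(string jobId)
    {
        Task.Run(() =>
        {
            for (int pct = 0; pct <= 100; pct += 10)
            {
                ProgressStore.Progress[jobId] = pct;
                // ... do a slice of the real work here ...
            }
        });
        return Json(new { started = true }, JsonRequestBehavior.AllowGet);
    }

    // Polled by client-side JavaScript
    public ActionResult Status(string jobId)
    {
        int pct;
        ProgressStore.Progress.TryGetValue(jobId, out pct);
        return Json(new { progress = pct }, JsonRequestBehavior.AllowGet);
    }
}
```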
I have a page that downloads a large HTML file from another domain and then serves it to the user. The file is around 100k - 10MB and usually takes about 5 min. I was thinking about doing something like this to make the user experience better:
download the file
if the file has not downloaded within 10 seconds, display a page that tells the user the file is being downloaded
if the server completes the download in 1 second, serve the downloaded HTML directly
Can this be done? Do I need to use the async feature?
Updated question: the downloaded file is an HTML file
In order to provide an 'asynchronous' file download, try a trick that Google uses: create a hidden iframe and set its source to the file you want to download. You can then still run JavaScript on your original page while the file is being downloaded through the iframe.
I think you should:
Return an HTML page to the user straight away, to tell them the transfer has started.
Start the download from the other domain in a separate process on your server.
Have the HTML from step 1 repeatedly reload, so you can check if the download has completed already, and possibly give an ETA or update to the user.
Return a link to the user when the initial transfer is complete.
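Steps 3 and 4 might look roughly like this on the server side; the paths, view names, and refresh interval are assumptions:

```csharp
using System.IO;
using System.Web.Mvc;

public class TransferController : Controller
{
    public ActionResult Status(string id)
    {
        string done = Path.Combine(@"C:\downloads", id + ".html");
        if (System.IO.File.Exists(done))
            return View("Ready", model: id);  // page containing the final link

        Response.AddHeader("Refresh", "5");   // have the browser reload in 5s
        return View("Waiting");               // "your download is in progress"
    }
}
```

The Refresh header is the simplest way to make the waiting page poll; AJAX polling would give a smoother experience.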
It sounds like you need to use a waiting page that refreshes itself every so often and displays the status of your download. The download can be run on a separate thread using a System.Threading.Task, for instance.
I'm having an issue within my application Pelotonics. When a user downloads a file, the system seems to block all incoming requests until that file is done downloading. What is the proper technique to open a download dialog box (standard from the browser), let the user start downloading the file, and then, while the file is downloading, let the user continue throughout the application?
The way we're getting the file from the server is this: we have a separate ASPX page that gets passed a value through the query string, then retrieves the file's stream from the server. I add the "content-disposition" header to the Response, then loop through the file's stream, reading 2KB chunks out to Response.OutputStream. Once that's done I call Response.End.
Watch this for a quick screencast on the issue:
http://www.screencast.com/users/PeloCast/folders/Jing/media/8bb4b1dd-ac66-4f84-a1a3-7fc64cd650c0
by the way, we're in ASP.NET and C#...
Thanks!!!
Daniel
I think ASP.NET allows one simultaneous page execution per session and I'm not aware of any way to configure this otherwise.
This is not a very pretty workaround, but it might help if you rewrote the ASP.NET_SESSIONID value in the request cookie in Application_BeginRequest (in global.asax). Of course, you would then need to do the authentication some other way. I haven't tried this, though.
Another way would be launching a separate thread for the download process, but you would need to find a way to do this without the worker thread closing its resources.
May I ask, is there a reason why don't you just use HttpResponse.TransmitFile?
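For reference, the manual 2KB chunk loop described in the question can usually be replaced by a single TransmitFile call, which streams the file to the client without buffering it all in memory; the file name here is illustrative:

```csharp
// Inside the download ASPX page's code-behind
Response.ContentType = "application/octet-stream";
Response.AddHeader("Content-Disposition", "attachment; filename=report.pdf");
Response.TransmitFile(@"C:\files\report.pdf");
Response.End();
```

This doesn't change the session-locking behavior by itself, but it removes the hand-written streaming loop as a variable.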