Detect an existing download in an ASP.NET MVC (C#) application

I have an MVC application with an action that serves files. I have a requirement that if a user has initiated a download, that user cannot initiate a new download until the current one finishes.
Simply put, the user can only download one file at a time. However, I don't want to just silently block the request; I want to be able to tell them that they cannot start a new download because an existing one is still in progress.
So my question is: how do you keep track of when a download has started but has not yet finished? I initially turned to dynamic IP restriction in IIS, but the end user could be sitting behind layers upon layers of proxies, which makes that unreliable. Is there a way to do this through code?
I also tried using FileWebRequest / FileWebResponse, which creates an ongoing stream. But when a new request comes in it enters as a new action, and I don't know how to check whether the previous stream is still in progress.
public async Task<ActionResult> Download()
{
    Session["IsDownloading"] = true;
    await ServeFiles();
    return View("Index");
}

public ActionResult CheckStatus()
{
    var message = Session["IsDownloading"] ?? "Null";
    return Content(message.ToString());
}

private async Task ServeFiles()
{
    try
    {
        // Download async
    }
    finally
    {
        Session["IsDownloading"] = false;
    }
}
***********************Update*********************
I've updated the sample code. Download can be invoked any number of times to download a file from c:\temp. While that is downloading, I tried to invoke CheckStatus, but it doesn't respond while the download is in progress, only once the download completes.
I've added a project solution here.
https://github.com/janmchan/DownloadThrottling.git
***********************Update 2016-04-04 **********
After a bit of searching I found out that session state (I'm using SQL Server mode) is saved at the end of the request, so even if the user manages two concurrent requests, the session variable won't be updated until the first request ends, making the information in the session unreliable. So now I'm looking for a way to manually force session state to be saved right after I update the value.

I think you need three things: identify the user, check whether they already have a download in progress, and block the new request if they do.
The easiest way would be to set Session["IsDownloading"] = true; and then check that flag, setting it back to false when the download finishes.
From the client, when the user wants to initiate a download, you can use JavaScript to call a CheckIsDownloading action and, if it's OK, call Serve to get the file.
In Serve, check again and block if a download is already running, to cover the case where the user bypasses the JavaScript and makes download requests directly.
Note: the user-identity part is hard to do if the user isn't logged in and you want to enforce a single download. They could delete cookies at any time (getting a new session) and start a new download.
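A minimal sketch of that approach in MVC; the action names (CheckIsDownloading, Serve) and the 409 response are assumptions, and note, per the update above, that with out-of-process session state the flag may not be visible to other requests until this request ends:

public ActionResult CheckIsDownloading()
{
    bool busy = Equals(Session["IsDownloading"], true);
    return Json(new { busy }, JsonRequestBehavior.AllowGet);
}

public ActionResult Serve()
{
    // Server-side guard in case the client skipped the JavaScript check.
    if (Equals(Session["IsDownloading"], true))
        return new HttpStatusCodeResult(409, "A download is already in progress.");

    Session["IsDownloading"] = true;
    try
    {
        // ... stream the file to the response here ...
        return new EmptyResult();
    }
    finally
    {
        Session["IsDownloading"] = false;
    }
}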

The final solution that I ended up using is similar to eugen's answer, but instead of session state it uses an in-memory data cache (MemoryCache), which takes effect immediately and does not wait for the request to finish. Although this cache is in-process, that is fine here: I found out that even though the application servers are load balanced, once a session is established the client infrastructure guarantees that the same server is used for the rest of the session.
I agree with Erik Funkenbusch's comment that the user can easily get around this; this solution is only for this specific requirement.
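Roughly, the shape of that MemoryCache approach (this is a sketch, not the code from the linked repository; the key scheme, expiry, and file path are assumptions, and keying by the authenticated user name assumes logged-in users):

using System;
using System.Runtime.Caching;
using System.Web.Mvc;

public class DownloadController : Controller
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    private string CacheKey
    {
        get { return "IsDownloading_" + User.Identity.Name; } // any stable per-user key works
    }

    public ActionResult CheckStatus()
    {
        return Content(Cache.Contains(CacheKey) ? "Downloading" : "Idle");
    }

    public ActionResult Download()
    {
        // AddOrGetExisting returns null if the key was not present, so a non-null
        // result means another download for this user is already in flight.
        object existing = Cache.AddOrGetExisting(CacheKey, true, DateTimeOffset.Now.AddMinutes(30));
        if (existing != null)
            return Content("You are already downloading a file.");

        try
        {
            Response.ContentType = "application/octet-stream";
            Response.AppendHeader("Content-Disposition", "attachment; filename=sample.zip");
            Response.TransmitFile(@"c:\temp\sample.zip"); // hypothetical file
            Response.Flush();
            return new EmptyResult();
        }
        finally
        {
            Cache.Remove(CacheKey);
        }
    }
}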

Related

Trying to use asp.net async page

I have an ASP.NET (WebForms) page that needs to call a stored procedure which may take up to a minute to return. I realise that this is not ideal, but the database side of this project is out of my hands, so I must live with this problem.
Basically I am looking for some method which will allow the page to render without the stored procedure hanging it, with the results from the database call being displayed when they become available.
So, I am looking at an async page. I have added Async="true" to the page directive and, so far, I have the following:
private async void GetCampaignCounts(int CampaignID)
{
    Task t = new Task(() =>
    {
        CampaignService cs = new CampaignService();
        FilterSet.TargetCounts f = cs.GetCampaignDetails(CampaignID); // LONG RUNNING DB CALL
        if (f.Total > 0)
        {
            panelStatsLeft.Visible = true;
            // DO STUFF IN HERE
        }
        else
        {
            panelStatsLeft.Visible = false;
        }
    });
    t.Start();
    await t;
}
However, this still hangs the page whilst the database query is running!
Am I doing something totally wrong?!
Asynchronous requests in web applications are not intended to stop the page from hanging; the request still takes the same time as the synchronous version. The benefit is scalability: if your page could handle 100 simultaneous users before, it might handle 1000 as async on the same hardware, assuming the bottleneck is the request pipeline and not the database.
If you want the page to load and then update itself when the operation completes, I'm afraid you will need a somewhat more complex architecture. Your best bet is to load the page, then issue an AJAX request that runs the query and updates the page when the request returns. It is still good practice to use async/await for that AJAX request (again, for scalability).
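For example, a sketch of the AJAX route for this WebForms page: a static page method that the client calls after the page has rendered. The method name is an assumption; CampaignService and FilterSet.TargetCounts come from the question. Expose it via a ScriptManager with EnablePageMethods="true" or call it with a plain jQuery POST, and let the client-side script update the panel from the returned value.

// In the code-behind of the page:
[System.Web.Services.WebMethod]
public static int GetCampaignTotal(int campaignId)
{
    // The page has already been rendered; only this AJAX call waits for the query.
    CampaignService cs = new CampaignService();
    FilterSet.TargetCounts f = cs.GetCampaignDetails(campaignId);
    return f.Total;
}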
Personally I wouldn't bother tying the two together. A better approach is to offload the query elsewhere and come back later for the results.
So use a signalling framework such as SignalR. Submit your report parameters and pass them to MSMQ, where they can be handled on a different server, or use a one-way WCF request. When the request has been processed, optionally store the result in the database and use signalling to notify the client (either passing them the actual result or telling them the URL where they can download the report from, picking it up from the database at that other URL). Push that into the user's current browser page, which presumably has a spinner saying "hey, we are generating your report".
Consider signalling too.
As I describe in my MSDN article on the topic, async on ASP.NET is not a silver bullet:
When some developers learn about async and await, they believe it’s a way for the server code to “yield” to the client (for example, the browser). However, async and await on ASP.NET only “yield” to the ASP.NET runtime; the HTTP protocol remains unchanged, and you still have only one response per request. If you needed SignalR or AJAX or UpdatePanel before async/await, you’ll still need SignalR or AJAX or UpdatePanel after async/await.
This makes sense if you think about the HTTP protocol: there can be only one response for each request, and async doesn't change the HTTP protocol (more detail on my blog). So, if you return a response to the client (allowing it to render the page), then that request is done. You can't send a followup response later to change the page you already sent.
The proper solution is fairly complex, because ASP.NET wasn't designed to track "background" operations without a request. ASP.NET will recycle your application periodically, and if there are no active requests, it assumes that it's a good time to do so. So, "background" operations are in danger of being terminated without notice.
For this reason, the best solution is to have an independent worker process that actually executes the background operation using a basic distributed architecture (requests are placed into a reliable queue by the ASP.NET handler; requests will re-enter the queue automatically if the worker process fails). This also means your requests should be idempotent.
If you don't want this level of complexity, you can trade-off reliability for complexity. An intermediate step is Hangfire, which requires a database backend at least (which it uses for reliability instead of a queue). If you want the least reliability (and least complexity), you can just use HostingEnvironment.QueueBackgroundWorkItem or my ASP.NET Background Tasks library. I have a longer overview of these options on my blog.
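For reference, the least-complex option (HostingEnvironment.QueueBackgroundWorkItem) looks roughly like this; it needs .NET 4.5.2+, and RunLongQueryAsync is a hypothetical placeholder for the actual work:

using System.Threading;
using System.Threading.Tasks;
using System.Web.Hosting;

public static class BackgroundWork
{
    public static void Queue()
    {
        // ASP.NET tries to delay app-domain shutdown while queued items run,
        // but completion is still not guaranteed across recycles.
        HostingEnvironment.QueueBackgroundWorkItem(RunLongQueryAsync);
    }

    private static async Task RunLongQueryAsync(CancellationToken ct)
    {
        await Task.Delay(1000, ct); // stand-in for the long database call
    }
}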

How to control multiple connections in ASP.NET web page

I have a web page, index.aspx, and the corresponding server-side code, index.aspx.cs. The C# code has a method which must not be executed in parallel when multiple clients connect to my website. How can I restrict this?
Here is what the method does: it creates a folder, zips it, and makes it available for the user to download. My requirement is that while one user is executing this method, no other user should be able to, because it would create the same folder again, which corrupts the data.
I tried using Session objects, but I learned that session objects are stored per client.
Can anyone suggest a solution?
My immediate advice would be: create a random folder name per request, which would allow the requests to run concurrently. However, if that isn't an option, then you will need to synchronize using something like lock or Mutex. That would only work well if you return the result from the current request, rather than zipping it in one request and letting them download it in the next.
Frankly, though, I think you should do the zipping in the request that serves the zip. Indeed, unless the file will be huge you don't even need to touch the file system: you can create the zip in memory using a MemoryStream and any of the zip encoders (System.IO.Packaging.ZipPackage, for example), then just hand the client the data from the MemoryStream.
If you are using MVC, this is just return File(contents, contentType). With vanilla ASP.NET you need a few more steps.
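A small sketch of the in-memory idea, using System.IO.Compression.ZipArchive (an alternative to ZipPackage, available from .NET 4.5) together with the MVC File helper; the entry name and content are placeholders:

using System.IO;
using System.IO.Compression;
using System.Web.Mvc;

public class ExportController : Controller
{
    public ActionResult DownloadZip()
    {
        var ms = new MemoryStream();
        using (var archive = new ZipArchive(ms, ZipArchiveMode.Create, leaveOpen: true))
        {
            ZipArchiveEntry entry = archive.CreateEntry("report.txt");
            using (var writer = new StreamWriter(entry.Open()))
            {
                writer.Write("generated for this request only");
            }
        }
        // Nothing touched the file system, so concurrent requests cannot collide.
        return File(ms.ToArray(), "application/zip", "report.zip");
    }
}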
The Application context or a static class is application-wide, so you can store a flag there which indicates that the process has already started, and delete the flag after the process has ended.
http://msdn.microsoft.com/en-us/library/94xkskdf(v=vs.100).aspx
And always use Application.Lock when you write to application state, and lock(mutex) when you use a static class.
In your case a static class would be the better solution, because it seems that the Application context exists only for compatibility with classic ASP: Using static variables instead of Application state in ASP.NET
static object mutex = new object();

lock (mutex)
{
    // Do the work
}
If you use the classic ASP.NET session you do not need to do anything, because session state already serializes the execution of pages that share the same session.
If not, then you can follow what Marc suggests and use a Mutex.
About the session lock:
Web app blocked while processing another web app on sharing same session
jQuery Ajax calls to web service seem to be synchronous
ASP.NET Server does not process pages asynchronously
Replacing ASP.Net's session entirely

Run asynchronous operation in one controller method and get that operation's result in another

I have a long running operation, which you can read about in a couple of my other questions (for reference, here is the first and the second).
At the start of the whole process, the project exposes a form in which the user specifies all the necessary information about an XML file and uploads the XML file itself. In that method all the user input is captured and sent to a WCF service that handles this kind of file. The controller receives only the task id of that processing job.
The user is then redirected to a progress bar page, which periodically retrieves the completion status of the task and refreshes the progress bar.
So here is my issue: when the processing of the XML file is over, how can I get the results back and show them to the user?
I know that HTTP is a stateless protocol, but there is the cookie mechanism that could help in this situation. Of course, I could just save the processing results to some temporary place, like a static class on the WCF server, but the service is under high load, so that would eat all the available memory.
In other words, I would like to pass the task to the WCF service (using netNamedPipeBinding) and receive the results back as quickly as realistically possible. I want to avoid temporarily saving the result to some buffer and waiting until the client gathers it.
The furthest I have gotten is using a temporary buffer, not on the service side but on the client's:
using (XmlProcessingServiceClient client = new XmlProcessingServiceClient())
{
    client.AnalyzeXmlAsync(new Task { fileName = filePath, id = tid });
    client.AnalyzeXmlCompleted += (sender, e) =>
    {
        System.Web.HttpContext.Current.Application.Lock();
        // Here I just use a single slot for all clients. I know it isn't right; it's just for illustration purposes.
        System.Web.HttpContext.Current.Application["Result"] = e;
        System.Web.HttpContext.Current.Application.UnLock();
    };
}
I suggest you use a SignalR hub to address your problem. That gives you a way to call a method on the client directly to notify it that the operation has completed, and it happens without having to deal with the infrastructure trouble involved in implementing such strategies yourself. Plus, SignalR plugs easily into an ASP.NET MVC application.
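A rough sketch of that idea with SignalR 2; the hub name, the client-side method name, and how you obtain the caller's connection id are all assumptions:

using Microsoft.AspNet.SignalR;

public class ProgressHub : Hub
{
}

public static class ProgressNotifier
{
    // Call this from the AnalyzeXmlCompleted handler instead of stashing the
    // result in Application state; the client listens for "operationCompleted".
    public static void NotifyCompleted(string connectionId, object result)
    {
        IHubContext hub = GlobalHost.ConnectionManager.GetHubContext<ProgressHub>();
        hub.Clients.Client(connectionId).operationCompleted(result);
    }
}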
To be honest I didn't really get the part about the WCF server and so on, but I think I can give you a more abstract answer. Just to be sure:
You have a form with some fields + file upload
The user fills in the form and supplies an XML file
You send the XML file to a WCF service which processes it
In the meantime, show a progress bar which updates
After completion show the results
If this is not what you want, or not what your question is about, you can skip my answer; otherwise read on.
Before we begin: step 3 is a bit ambiguous. It could mean that we send the data to the service and wait for it to return the result, or that we send the data to the service and don't wait for the result.
Situation 1:
Create in a view the form with all the required fields
Create an action in your controller which handles the postback.
The action will send the data to the service and when the service returns the result, your action will render a view with the result.
On the submit button you add a JavaScript onclick event. This will trigger an AJAX call to some server-side code which returns the progress.
The JavaScript shows some sort of progress bar with the correct progress and repeats itself every x seconds
When the controller finishes, it shows the result
Situation 2:
Steps 1 and 2 are the same as in situation 1.
After sending the data to the service, the controller shows a view with the progress bar.
We add a JavaScript event on document ready which checks the status of the XML file and updates a progress bar (same as the onclick event in step 4 of situation 1).
When the progress bar reaches 100% it redirects to a different page which shows the results
Does this answer your question?
Best regards,
BHD
netNamedPipeBinding will not work for cross-machine communication, if that is what you have in mind.
If you want to host your service in IIS then you will need one of the bindings that use HTTP as their transport protocol. Have a look at duplex services, which allow both endpoints to send messages: that way the server can send messages to the client any time it wishes. You could create a callback interface for progress reporting. If the task is going to take a considerable amount of time to complete, the overhead of progress reporting over HTTP might be acceptable.
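A sketch of what such a duplex contract could look like; the interface, operation names, and parameters are assumptions:

using System.ServiceModel;

[ServiceContract(CallbackContract = typeof(IXmlProcessingCallback))]
public interface IXmlProcessingService
{
    [OperationContract(IsOneWay = true)]
    void AnalyzeXml(string fileName, int taskId);
}

public interface IXmlProcessingCallback
{
    // The service calls back on these operations as work progresses.
    [OperationContract(IsOneWay = true)]
    void ReportProgress(int taskId, int percentComplete);

    [OperationContract(IsOneWay = true)]
    void Completed(int taskId, string resultUrl);
}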
Also have a look at Building and Accessing Duplex Services if you want to use duplex communication over HTTP with Silverlight (PollingDuplexHttpBinding).
Finally, you could look for a Comet implementation for ASP.NET; on CodeProject you will find at least a couple (CometAsync and PokeIn).
I'm not sure if this is the best solution but I was able to do something similar. This was the general setup:
Controller A initialized a new class with the parameters for the action to be performed and passed the user's session object
The new class called a method in a background thread which updated the user's session as it progressed
Controller B had JSON methods that, when called by client-side JavaScript, checked the user's session data and returned the latest progress.
This thread states that using the session object in such a way is bad, but I'm sure you can do something similar with a thread-safe storage mechanism such as SQL or a temp file.
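If you go that route, a thread-safe store plus a small JSON status action might look roughly like this; the names, and keying by task id, are assumptions:

using System.Collections.Concurrent;
using System.Web.Mvc;

public static class ProgressStore
{
    // The background thread writes here; the status action only reads.
    public static readonly ConcurrentDictionary<string, int> PercentByTask =
        new ConcurrentDictionary<string, int>();
}

public class StatusController : Controller // plays the role of "Controller B"
{
    public ActionResult Progress(string taskId)
    {
        int percent;
        ProgressStore.PercentByTask.TryGetValue(taskId, out percent);
        return Json(new { percent }, JsonRequestBehavior.AllowGet);
    }
}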

Asynchronous or batch processing with C#

I have a .NET application in C# that needs some async processing. The user will upload a file of PEOPLE records from the aspx page, and the system should upload the file, parse it, load the people into the database, and then move the file out of the initial directory it was uploaded to.
I don't want the user to have to wait for this, as some files may have thousands of records, and the system could take a while to go through all of them. I basically want to return a message that says, "Thank you for uploading your file. The system will process it and send you an email when it has been completed".
I have read about the new async/await keywords, but I'm not sure that is the route I should take here. I basically just need a button event handler to upload the file and then kick off a "batch" process to finish dealing with it, while returning control to the user on the UI to do whatever else he/she wants to do on the site.
I guess the secondary question would be: Does await return control to the user when used within a button event handler method? If so, then this seems to be a perfect solution for me as it won't block the thread or the user.
Is there a better method or pattern that I should use for this, or is async/await sufficient?
I guess the secondary question would be: Does await return control to the user when used within a button event handler method? If so, then this seems to be a perfect solution for me as it won't block the thread or the user.
await does yield to the message loop when used within a button event handler method in a GUI application.
Is there a better method or pattern that I should use for this, or is async/await sufficient?
async/await will not do this, because async doesn't change the HTTP protocol - your await will just yield to the ASP.NET thread pool (not to the user's browser).
The best way to solve this is to have your ASP.NET page write the file to disk (asynchronously, if possible), and then return the response "we'll email you when it's done".
Then you can have a Win32 service or something that monitors that directory, processes the files, and sends emails. Note that you should use Azure Blobs/Queues instead of the file system if you plan to deploy this to the cloud.
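A minimal sketch of the watcher inside such a service; the directories, the file pattern, and ProcessPeopleFile are assumptions:

using System.IO;

public class UploadWatcher
{
    private FileSystemWatcher _watcher;

    // Call from the Windows service's OnStart.
    public void Start()
    {
        _watcher = new FileSystemWatcher(@"C:\uploads", "*.csv");
        _watcher.Created += (sender, e) =>
        {
            ProcessPeopleFile(e.FullPath); // parse records, load the database, send the email
            File.Move(e.FullPath, Path.Combine(@"C:\processed", e.Name));
        };
        _watcher.EnableRaisingEvents = true;
    }

    private void ProcessPeopleFile(string path)
    {
        // hypothetical parsing/loading logic
    }
}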
You can use an asynchronous page, based on <%@ Page Async="true" ... %>
Link : http://msdn.microsoft.com/en-us/magazine/cc163725.aspx
If you're comfortable with using ASP.NET and the ThreadPool to handle the file parsing, database work and email notification, you can implement this using an HttpModule and a Timer. Use ASP.NET to upload the file, then have the HttpModule either monitor the file system for new uploads and process the files, or add a database record when a file is uploaded and periodically poll the database for new records.
This article will get you going and also outlines the issues you may face using this method.
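Roughly, that HttpModule/Timer idea looks like the following (register the module in web.config; the polling interval and ProcessPendingUploads are assumptions, and the caveats from the linked article still apply):

using System;
using System.Threading;
using System.Web;

public class BatchProcessingModule : IHttpModule
{
    private static readonly object Sync = new object();
    private static Timer _timer;

    public void Init(HttpApplication context)
    {
        // Init runs once per HttpApplication instance, so guard the shared timer.
        lock (Sync)
        {
            if (_timer == null)
            {
                _timer = new Timer(_ => ProcessPendingUploads(), null,
                                   TimeSpan.Zero, TimeSpan.FromMinutes(1));
            }
        }
    }

    public void Dispose() { }

    private static void ProcessPendingUploads()
    {
        // hypothetical: poll the database (or upload folder) for new records and handle them
    }
}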
If you need something more complicated, or your site has a lot of traffic, you'd be better off using a separate process (Windows service or scheduled task) for the batch work.
If you require any sample code I can post some, but I'm currently using .NET 3.5 rather than 4.0.

While downloading a file, all requests are blocked

I'm having an issue within my application, Pelotonics. When a user downloads a file, the system seems to block all incoming requests until that file has finished downloading. What is the proper technique to open a download dialog box (the standard one from the browser), let the user start downloading the file, and then, while the file is downloading, let the user continue using the application?
The way we get the file from the server: we have a separate ASPX page that gets passed a value through the query string, retrieves the file's stream from the server, adds the "content-disposition" header to the Response, and then loops through the file's stream, writing 2KB chunks to Response.OutputStream. Once that's done, I call Response.End.
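For reference, that download page looks roughly like the following; the query-string-to-path lookup is a hypothetical stand-in:

using System;
using System.IO;

public partial class DownloadFile : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        string path = ResolvePath(Request.QueryString["id"]); // hypothetical lookup

        Response.ContentType = "application/octet-stream";
        Response.AddHeader("Content-Disposition",
            "attachment; filename=" + Path.GetFileName(path));

        using (var stream = File.OpenRead(path))
        {
            var buffer = new byte[2048];
            int read;
            while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                Response.OutputStream.Write(buffer, 0, read);
                Response.Flush();
            }
        }
        Response.End();
    }

    private string ResolvePath(string id)
    {
        // hypothetical: map the query-string value to a file on disk
        return Server.MapPath("~/App_Data/" + id);
    }
}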
Watch this for a quick screencast on the issue:
http://www.screencast.com/users/PeloCast/folders/Jing/media/8bb4b1dd-ac66-4f84-a1a3-7fc64cd650c0
by the way, we're in ASP.NET and C#...
Thanks!!!
Daniel
I think ASP.NET allows one simultaneous page execution per session and I'm not aware of any way to configure this otherwise.
This is not a very pretty workaround, but it might help if you rewrote the ASP.NET_SessionId value in the request cookie in Application_BeginRequest (in global.asax). Of course, you would then need to handle authentication some other way. I haven't tried this, though.
Another way would be to launch a separate thread for the download process, but you would need to find a way to do this without the worker thread closing its resources.
May I ask, is there a reason why you don't just use HttpResponse.TransmitFile?
