I've been trying to create a controller in my project for delivering what could turn out to be quite complex reports. As a result they can take a relatively long time and a progress bar would certainly help users to know that things are progressing. The report will be kicked off via an AJAX request, with the idea being that periodic JSON requests will get the status and update the progress bar.
I've been experimenting with the AsyncController as that seems to be a nice way of running long processes without tying up resources, but it doesn't appear to give me any way of checking on the progress (and it also seems to block further JSON requests, though I haven't discovered why yet). After that I resorted to storing progress in a static variable on the controller and reading the status from that - but to be honest that all seems a bit hacky!
All suggestions gratefully accepted!
Here's a sample I wrote that you could try:
Controller:
// Requires using System.Threading; and using System.Threading.Tasks;
public class HomeController : AsyncController
{
    public ActionResult Index()
    {
        return View();
    }

    public void SomeTaskAsync(int id)
    {
        AsyncManager.OutstandingOperations.Increment();
        Task.Factory.StartNew(taskId =>
        {
            // Simulate a long-running job and record its progress in
            // application state under a key unique to this task.
            for (int i = 0; i < 100; i++)
            {
                Thread.Sleep(200);
                HttpContext.Application["task" + taskId] = i;
            }
            var result = "result";
            // Set the result parameter before decrementing, so it is already
            // available when SomeTaskCompleted is invoked.
            AsyncManager.Parameters["result"] = result;
            AsyncManager.OutstandingOperations.Decrement();
            return result;
        }, id);
    }

    public ActionResult SomeTaskCompleted(string result)
    {
        return Content(result, "text/plain");
    }

    public ActionResult SomeTaskProgress(int id)
    {
        return Json(new
        {
            Progress = HttpContext.Application["task" + id]
        }, JsonRequestBehavior.AllowGet);
    }
}
Index() View:
<script type="text/javascript">
    $(function () {
        var taskId = 543;
        $.get('/home/sometask', { id: taskId }, function (result) {
            window.clearInterval(intervalId);
            $('#result').html(result);
        });
        var intervalId = window.setInterval(function () {
            $.getJSON('/home/sometaskprogress', { id: taskId }, function (json) {
                $('#progress').html(json.Progress + '%');
            });
        }, 5000);
    });
</script>
<div id="progress"></div>
<div id="result"></div>
The idea is to start an asynchronous operation that reports its progress through HttpContext.Application, which means each task must have a unique id. On the client side we start the task and then send an AJAX request every 5 seconds to update the progress. You can tweak the parameters to suit your scenario. A further improvement would be to add exception handling.
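One possible shape for that exception-handling improvement is sketched below; it is an assumption layered on top of the answer's code, not part of it. Failures are surfaced through the same progress entry so the polling request can show them, and the outstanding-operation count is always decremented:

Task.Factory.StartNew(taskId =>
{
    try
    {
        // ... the progress loop shown above, followed by
        // AsyncManager.Parameters["result"] = result; ...
    }
    catch (Exception ex)
    {
        // Surface the failure through the same progress entry
        // so the polling request can display it to the user.
        HttpContext.Application["task" + taskId] = "Error: " + ex.Message;
        AsyncManager.Parameters["result"] = "Report failed";
    }
    finally
    {
        AsyncManager.OutstandingOperations.Decrement();
    }
}, id);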
4.5 years after this question was answered, we have a library that makes this task much easier: SignalR. There is no need to use shared state (which is bad because it can lead to unexpected results); just use the HubContext class to connect to a Hub that sends messages to the client.
First, we set up a SignalR connection as usual (see e.g. here), except that we don't need any server-side method on our Hub. Then we make an AJAX call to our endpoint/controller/whatever and pass the connection ID, which we get as usual: var connectionId = $.connection.hub.id;. On the server side you can start your process on a different thread and return 200 OK to the client. The process knows the connectionId, so it can send messages back to the client like this:
GlobalHost.ConnectionManager.GetHubContext<LogHub>()
    .Clients.Client(connectionId)
    .log(message);
Here, log is a client-side method that you want to call from the server, hence it should be defined like you usually do with SignalR:
$.connection.logHub.client.log = function(message){...};
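Putting those pieces together, here is a minimal sketch of what the server side could look like. The ReportController name, the fake work loop and the empty LogHub are assumptions for illustration; the key points are that the action returns immediately and that progress is pushed through the hub context to the caller's connection:

// Minimal sketch (assumed names): an MVC action that kicks off the work
// and pushes progress messages to the caller's SignalR connection.
public class ReportController : Controller
{
    [HttpPost]
    public ActionResult Start(string connectionId)
    {
        Task.Run(() =>
        {
            var hub = GlobalHost.ConnectionManager.GetHubContext<LogHub>();
            for (int i = 0; i <= 100; i += 10)
            {
                Thread.Sleep(500);                        // stand-in for real work
                hub.Clients.Client(connectionId).log(i + "% complete");
            }
        });
        return new HttpStatusCodeResult(200);             // return immediately
    }
}

// The hub itself needs no server-side methods for this scenario.
public class LogHub : Hub { }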
More details in my blog post here
If you have access to your web server machine, another option is to create a Windows service or a simple console application to perform the long-running work. Your web app adds a record to the database to indicate that the operation should start, and your Windows service, which periodically checks for new records in the database, picks up the task and reports progress back to the database.
Your web app can then use AJAX requests to check the progress and show it to users.
I used this method to implement Excel reports for an ASP.NET MVC application.
The report was created by a Windows service that runs on the machine and constantly checks for new records in a reports table; when it finds a new record it starts creating the report and indicates progress by updating a field on the record. The ASP.NET MVC application simply added the new report record and tracked progress in the database until the work finished, then provided a link to the finished file for download.
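As a rough illustration of that pattern, here is a hedged sketch of the service's polling loop (requires System.Data.SqlClient and System.Threading). The Reports table, its Status/Progress/FilePath columns and the connection string are assumptions; the point is that the service owns the work while the web app only reads the Progress column:

// Hedged sketch (assumed schema): the Windows service polls a Reports table,
// does the work and writes progress back so the web app can display it.
while (true)
{
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();

        // Pick up one report that has not been started yet (Status 0 = queued).
        var pick = new SqlCommand(
            "SELECT TOP 1 Id FROM Reports WHERE Status = 0", connection);
        object id = pick.ExecuteScalar();

        if (id != null)
        {
            for (int progress = 0; progress <= 100; progress += 10)
            {
                // ... generate the next chunk of the report file here ...

                var update = new SqlCommand(
                    "UPDATE Reports SET Progress = @progress WHERE Id = @id", connection);
                update.Parameters.AddWithValue("@progress", progress);
                update.Parameters.AddWithValue("@id", id);
                update.ExecuteNonQuery();
            }

            // Mark the record as done so the web app can show the download link.
            var finish = new SqlCommand(
                "UPDATE Reports SET Status = 1, FilePath = @path WHERE Id = @id", connection);
            finish.Parameters.AddWithValue("@path", "reports\\" + id + ".xlsx");
            finish.Parameters.AddWithValue("@id", id);
            finish.ExecuteNonQuery();
        }
    }

    Thread.Sleep(5000); // wait before checking for new records again
}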
Related
I'm building an MVC 4 application that handles file uploads. When I get the HttpPostedFileBase items, I take their streams and pass them to my business logic layer to save them and bind them to database records. This works without any problems. But when it comes to displaying the current upload progress, I'm a bit confused. I make a second request while the upload runs, but it waits for the first request to complete.
I know that within the same browser client instance (actually the same session) my requests are synchronized. But there is a solution I read about: asynchronous actions.
To try asynchronous actions, I used Stream.CopyToAsync(..) instead of Stream.CopyTo(..). I'm also using Task.Delay(10000) to simulate upload progress. Then, while the asynchronous Upload action runs, I invoke the synchronous UploadProgress action in the same browser instance. The result still waits for the first request to complete. Below is the code I use. Where am I wrong?
Here is async action to upload files;
[HttpPost]
public async Task<ActionResult> Upload(PFileUploadModel files)
{
    if (!Session.GetIsLoggedIn())
        return RedirectToAction("Login", "User");

    var fileRequest = Session.CreateRequest<PAddFileRequest, bool>(); // Creates a business logic request
    fileRequest.Files.Add(...);
    var result = await Session.HandleRequestAsync(fileRequest); // Handles and executes a business logic request by checking authority
    if (result)
        return RedirectToAction("List");
    return RedirectToError();
}
And the upload progress action is as simple as this for now :) :
public ActionResult UploadProgress()
{
    //var saveProgressRequest = Session.CreateRequest<PSaveFileProgressInfoRequest, PSaveFileProgressInfoResponse>();
    //saveProgressRequest.ProgressHandle = progressHandle;
    //var response = Session.HandleRequest(saveProgressRequest);
    return Content("Test!");
}
Thanks for helping.
async doesn't change the HTTP protocol. The Upload request is in progress until you return the result.
You're probably running into the ASP.NET session lock, which ensures that multiple request handlers don't interfere with each other when reading/writing session state. I believe that MVC always takes a write lock on the session state by default, preventing any other actions from executing simultaneously (in the same session).
You can override this behavior by specifying the SessionState attribute on your controller.
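For example, marking the controller's session access as read-only releases the exclusive session lock, so the progress request is no longer queued behind the upload. A minimal sketch (apply it to whichever controller serves the progress action; the controller name here is an assumption):

using System.Web.Mvc;
using System.Web.SessionState;

// Read-only session access: concurrent requests from the same session
// are no longer serialized by the session-state write lock.
[SessionState(SessionStateBehavior.ReadOnly)]
public class UploadController : Controller
{
    public ActionResult UploadProgress()
    {
        return Content("Test!");
    }
}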
I'm a newbie with SignalR and want to learn as much as I can. I've already read the beginner documents, but in this case I'm stuck. What I want to do is: when a user gets a new message, I want to fire a script, like an alert or showing a div saying "you have new mail", to notify the receiving user. My question is: how can I do that? Does anyone know how to achieve this, or have a good step-by-step document? I really want to work with SignalR.
PS: I'm using Visual Studio 2012 and MS SQL Server.
Edit: I forgot to mention that the notification must be fired when the message is written to the DB.
Thank you
In your scripts use the following; naturally this is not all the code, but based off the tutorials it's enough to get you going. Your userId will be generated server side, and your script can get it off an element of the page, or by whatever method you want. It runs when the connection is started and then every 10 seconds, pinging our server-side method CheckMessage().
This js would need refactoring but should give you the general idea.
...
var messageHub = $.connection.messageHub;
var userId = 4;
$.connection.hub.start().done(function () {
    StartCheck();
});

// Runs every 10 seconds..
function StartCheck() {
    setInterval(function () {
        messageHub.server.checkMessage(userId, $.connection.hub.id);
    }, 10000);
}
This method takes in a userId (assuming your DB is set up that way) and grabs all the users from your database to find the matching one; naturally the method used is probably not appropriate for your system, so change it as you need to. It also checks whether the user has any messages and, if so, sends another message down to our SignalR script.
public class MessageHub : Hub
{
    public void CheckMessage(int userId, string connectionId)
    {
        var user = userRepo.RetrieveAllUsers.FirstOrDefault(u => u.id == userId);
        if (user.HasMessages)
        {
            // The connection id identifies a single client, so address it directly.
            Clients.Client(connectionId).DisplayMailPopUp();
        }
    }
}
Finally this method, upon being called, would run your code to do the "You have mail" alert - be it a popup, a div being faded in, or whatever.
...
messageHub.client.displayMailPopUp = function () {
    alert("You have Mail!");
};
...
Hopefully this helps - I recommend the following links for reading up and building your first SignalR app:
http://www.asp.net/signalr/overview/signalr-20/getting-started-with-signalr-20/tutorial-getting-started-with-signalr-20-and-mvc-5
And a smaller sample: http://code.msdn.microsoft.com/SignalR-Getting-Started-b9d18aa9
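Since your edit says the notification must fire when the message is written to the database, you could also push the alert from the server at that moment instead of polling every 10 seconds, using the same hub-context technique shown in an earlier answer. A hedged sketch, assuming you save messages through some service method and keep your own userId-to-connectionId mapping (both are assumptions, not part of the code above):

// Hedged sketch: push the alert at the moment the message row is created,
// addressing the hub from outside the hub class.
public void SaveMessage(Message message)
{
    messageRepo.Add(message);          // assumed persistence call

    // connectionIdFor(...) is an assumed lookup you maintain yourself,
    // e.g. a dictionary populated in the hub's OnConnected.
    var connectionId = connectionIdFor(message.RecipientUserId);

    GlobalHost.ConnectionManager.GetHubContext<MessageHub>()
        .Clients.Client(connectionId)
        .displayMailPopUp();
}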
I'm developing a web application with MVC3, Razor, C# 4 and jQuery.
Within one screen (page) I do AJAX calls to a controller action to get some screen updates. Using JavaScript's setTimeout() I do polling.
To optimize this polling for the case where the server does not have screen updates, I'd like to delay the HTTP response and wait a little until I either a) get screen updates or b) hit some timeout (e.g. 10 seconds or so).
I tried to do something like this
[OutputCache(Duration = 0)]
[AcceptVerbs(HttpVerbs.Get)]
public ActionResult CheckForUpdates()
{
    var startTime = DateTime.Now;
    while (_haveUpdates || DateTime.Now > startTime.AddSeconds(10))
    {
        Thread.Sleep(3);
    }
    return _haveUpdates ? Json(new { updates = ... }, JsonRequestBehavior.AllowGet) : null;
}
In the view I use Javascript / jQuery like this:
<script>
    $(function () {
        scheduleNextCheck();
    });

    function scheduleNextCheck() {
        setTimeout(function () {
            $.ajax({
                url: '@Url.Action("CheckForUpdates")',
                dataType: 'json',
                data: null,
                timeout: 10000,
                cache: false,
                success: function (data) {
                    if (data != null && data.updates != null) {
                        // Apply Updates
                    } else {
                        scheduleNextCheck();
                    }
                },
                error: function () {
                    scheduleNextCheck();
                }
            });
        }, 2000);
    }
</script>
When using this code the IIS7 worker process hangs/freezes totally, so only killing the worker process can unlock the entire IIS7 server. So Thread.Sleep() seems not to be a good idea, at least not in my application's environment.
Did anybody do something like this with ASP.NET MVC yet, or have any idea?
Thx, Marc
==== Update: ====
Found one of the problems: the while criteria was wrong. It should be like this:
while (!_haveUpdates && DateTime.Now < startTime.AddSeconds(10))
The problem now is that other AJAX requests are canceled while this sleep-delay loop is running.
About SignalR: I have to take a closer look, but the problem is that it needs additional server- and client-side libraries which I'd have to get approval for, so I was hoping for an "easy" small solution with less impact and less need for training.
But SignalR is still an option for one of the next releases to replace the polling stuff - I just need to get some training and experience with it first.
==== Update 2: ====
I am looking for a solution that works without additional libraries / frameworks, so I can apply it to the nearly finished web application without a huge impact.
You may take a look at SignalR which is designed exactly for such situations. Instead of having the client poll the server with multiple HTTP requests, it is the server that notifies the client when updates are available.
I'm also looking into SignalR for my site; I'm reading this article, which is pretty good: http://www.dotnetcurry.com/ShowArticle.aspx?ID=780.
From my understanding, the only real constraint is the browser version, but SignalR degrades to accommodate older versions.
There's some info here too: http://www.entechsolutions.com/browser-alerts-with-asp-net-4-5-and-signalr
What I have is an AJAX form on a view that makes a call to the server. This call performs n tasks, where n is decided by records in a database (typically no more than 10 records). Each record corresponds to a Build Definition in TFS, so what I am trying to do is get all of these build definitions, queue them in TFS, and as each build completes update the UI so that the user knows which builds have completed.
Unfortunately I am not sure how best to do this. I was thinking something along these lines:
foreach (var dep in builds)
{
    TFS tfsServer = new TFS(TFS_SERVER_ADDRESS);
    IBuildServer buildServer;
    int id = tfsServer.QueuBuild(dep.TeamProject, dep.BuildDefinition);
    string teamProject = dep.TeamProject;
    Task.Factory.StartNew(() => GetBuildStatus(teamProject, id, tfsServer));
}
The task that is called is:
private void GetBuildStatus(string TeamProject, int BuildID, TFS Server)
{
    Server.GetBuildStatus(TeamProject, BuildID);
    AsyncManager.OutstandingOperations.Decrement();
}
The problem here is that my Completed method isn't going to get called until all of the builds have completed. How would I go about feeding data back up to the UI a piece at a time?
It is also worth mentioning that the GetBuildStatus method looks like this:
do
{
    var build = buildsView.QueuedBuilds.FirstOrDefault(x => x.Id == BuildID);
    if (build != null)
    {
        status = build.Status;
        detail = build.Build;
    }
} while (status != QueueStatus.Completed);
return detail.Status.ToString();
Given that the duration of a build will be longer than the timeout for an HTTP request, you cannot leave the browser waiting while this happens. You need to return a page and then poll for updates from that page using AJAX. Typically you'd have a timer in JavaScript that triggers a regular callback to the server to get the updated status information.
But since you are using .NET you could also consider trying SignalR, which lets you use long polling, server-sent events or WebSockets to wait for updates from the server, and it wraps it all up in some easy-to-implement .NET classes and JavaScript.
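If you go the polling route, a hedged sketch is below. It drops the AsyncController completion callback and instead records each build's status in a shared store that a plain JSON action exposes, exactly like the progress action in the first answer. The ConcurrentDictionary, the BuildStatus action name and the assumption that your GetBuildStatus wrapper returns the final status string (as in the do/while snippet above) are all illustrative:

// Hedged sketch: each queued build updates a shared status store that the
// UI polls with $.getJSON, instead of waiting for a single Completed callback.
// Requires using System.Collections.Concurrent;
private static readonly ConcurrentDictionary<int, string> buildStatuses =
    new ConcurrentDictionary<int, string>();

private void GetBuildStatus(string teamProject, int buildId, TFS server)
{
    buildStatuses[buildId] = "InProgress";
    // Assumes the wrapper blocks until the build finishes and returns
    // the final status string, as in the do/while snippet above.
    buildStatuses[buildId] = server.GetBuildStatus(teamProject, buildId);
}

public ActionResult BuildStatus(int id)
{
    string status;
    buildStatuses.TryGetValue(id, out status);
    return Json(new { Status = status ?? "Unknown" }, JsonRequestBehavior.AllowGet);
}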
I'm doing a high-level spec on an ASP.Net page which may have some delayed data presented.
When the page loads, the initial data presented will originate from a local database (which will be fast to present). What I want is a separate process to go out and look for updated data (from whatever other services I have). This is more time consuming, but the idea is to present the data first, then, if newer data is found, append it on the fly to the top of the existing page.
I would like some recommendations on how to accomplish this.
The tech scope for this is ASP.Net 4.0, C# MVC3 and HTML5.
Thanks.
AJAX with jQuery is a good way to achieve this. For example you could put a content placeholder div in your markup:
<div id="result" data-remote-url="@Url.Action("Load", "SomeController")"></div>
and then once the DOM is loaded:
$(function () {
    $.ajax({
        url: $('#result').data('remote-url'),
        type: 'POST',
        beforeSend: function () {
            // TODO: you could show an AJAX loading spinner
            // to indicate to the user that there is an ongoing
            // operation so that he doesn't run out of patience
        },
        complete: function () {
            // this will be executed no matter whether the AJAX request
            // succeeds or fails => you could hide the spinner here
        },
        success: function (result) {
            // In case of success update the corresponding div with
            // the results returned by the controller action
            $('#result').html(result);
        },
        error: function () {
            // something went wrong => inform the user
            // in the gentlest possible manner and remember
            // that he spent some of his precious time waiting
            // for those results
        }
    });
});
where the Load controller action will take care of communicating with the remote services and return a partial view containing the data:
public ActionResult Load()
{
    var model = ... // go ahead and fetch the model from the remote service
    return PartialView(model);
}
Now if this fetching of data is I/O intensive you could take advantage of asynchronous controllers and I/O completion ports, which will avoid tying up worker threads during the lengthy operation of fetching data from the remote source.
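As a rough illustration of that last point, here is a hedged sketch of the same Load action written against the MVC3 AsyncController pattern; RemoteDataService, FetchModelAsync and SomeModel are assumed names, not an existing API:

// Hedged sketch: the Load action in AsyncController form, so the worker
// thread is not blocked while the remote call runs.
public class SomeController : AsyncController
{
    public void LoadAsync()
    {
        AsyncManager.OutstandingOperations.Increment();

        // Assumed service exposing a task-returning call; any async I/O works here.
        new RemoteDataService().FetchModelAsync().ContinueWith(task =>
        {
            // For brevity no fault handling: a real version should check task.Exception.
            AsyncManager.Parameters["model"] = task.Result;
            AsyncManager.OutstandingOperations.Decrement();
        });
    }

    public ActionResult LoadCompleted(SomeModel model)
    {
        // Invoked once the outstanding operation count reaches zero.
        return PartialView(model);
    }
}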