I'm writing a high-level spec for an ASP.NET page that may present some data with a delay.
When the page loads, the initial data presented will come from a local database (which is fast to present). What I want is a separate process that goes out and looks for updated data (from whatever other services I have). This is more time-consuming, but the idea is to present data immediately and then, if newer data is found, append it on the fly to the top of the existing page.
I would like some recommendations on how to accomplish this.
The tech scope for this is ASP.NET 4.0, C#, MVC 3, and HTML5.
Thanks.
AJAX with jQuery is a good way to achieve this. For example, you could put a content placeholder div in your markup:
<div id="result" data-remote-url="@Url.Action("Load", "SomeController")"></div>
and then once the DOM is loaded:
$(function() {
    $.ajax({
        url: $('#result').data('remote-url'),
        type: 'POST',
        beforeSend: function() {
            // TODO: you could show an AJAX loading spinner to indicate
            // to the user that there is an ongoing operation, so that
            // they don't run out of patience
        },
        complete: function() {
            // this executes whether the AJAX request succeeds or fails
            // => you could hide the spinner here
        },
        success: function(result) {
            // on success, update the corresponding div with
            // the result returned by the controller action
            $('#result').html(result);
        },
        error: function() {
            // something went wrong => inform the user in the
            // gentlest possible manner, remembering that they spent
            // some of their precious time waiting for those results
        }
    });
});
where the Load controller action takes care of communicating with the remote services and returns a partial view containing the data:
public ActionResult Load()
{
    // fetch the model from the remote service
    // (FetchFromRemoteService is a placeholder for your own call)
    var model = FetchFromRemoteService();
    return PartialView(model);
}
Now, if this fetching of data is I/O intensive, you could take advantage of asynchronous controllers and I/O completion ports, which keep worker threads from being tied up during the lengthy operation of fetching data from a remote source.
Related
I have a website used by hundreds of viewers every day. One of the pages has a timer that ticks every 10 seconds; when it does, it gets the latest data from the database and updates the screen.
The problem I have is that the high number of users, each making their own database connections, takes its toll on the server.
Is there a better way of doing this? Could the update happen server-side, so that all users benefit from the latest data but only the server carries out the call every 10 seconds, rather than every user?
SignalR is the way you'll want to go. Right now, your application is probably loading a page, and then some jQuery sets a timer for 10 seconds. When the timer fires, you're probably doing an AJAX call to get refreshed data and putting that refreshed data into a <div> or something.
So essentially, every 10 seconds your back end is calling your SQL Server, doing some kind of SELECT statement; the data from SQL Server is transmitted to your application server, where you transform it into displayable data.
SignalR, on the other hand, works differently: it uses push technology. Push technology works like this. Let's say you have five people visiting your page right now. One person (person A) does something that saves data to the database. No one else sees this data yet, but SignalR sends a signal out to everyone else (or just the people the save affects) that says, "Hey! There is newer data available. You should update now." The other connected clients do an AJAX call and get the refreshed data. And voila! The other four people now have updated data on their screens!
I hope I explained this clearly enough for you to understand! Scott Hanselman wrote a good introduction to SignalR.
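The flow just described can be sketched as a tiny in-memory hub in plain JavaScript. The names here are illustrative only; the real SignalR library manages the connections over the wire for you, falling back from WebSockets to older transports as needed:

```javascript
// Minimal in-memory sketch of the push pattern SignalR implements.
// (Hypothetical names; real SignalR handles the network transport.)
class Hub {
  constructor() { this.clients = new Set(); }
  connect(client) { this.clients.add(client); }
  // notify everyone except the sender that fresh data exists
  broadcast(sender, message) {
    for (const client of this.clients) {
      if (client !== sender) client.onUpdate(message);
    }
  }
}

const hub = new Hub();
const received = [];
const personA = { onUpdate: function (m) { received.push('A got: ' + m); } };
const personB = { onUpdate: function (m) { received.push('B got: ' + m); } };
hub.connect(personA);
hub.connect(personB);

// Person A saves something; only the others are told to refresh.
hub.broadcast(personA, 'newer data available');
```

The key point is that no one polls: nothing happens until the save actually occurs, and then only the affected clients go fetch fresh data.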
I would use setInterval() to fire an AJAX function every 10 seconds. Pass a function reference rather than a string to evaluate:
window.setInterval(GetLatest, 10000);
where the GetLatest function would be similar to:
function GetLatest() {
    $.ajax({
        type: 'POST',
        url: '../Services/UpdateService.asmx/GetLatest',
        data: '',
        contentType: "application/json; charset=utf-8",
        success: function (data) {
            // use data.d to get the info passed back from the web service,
            // then add your logic to update your HTML
        },
        error: function () {
            // catch any bad data
        }
    });
}
Your back-end method should look like this, where Person stands in for whatever your object is:
[WebMethod]
[ScriptMethod(ResponseFormat = ResponseFormat.Json)]
public Person GetLatest()
{
    Person person = new Person();
    person.FirstName = "Dave";
    person.LastName = "Ward";
    return person;
}
Have you considered having your server-side code cache the data for those 10 seconds? Cache it and set its expiration to 10 seconds. That way, everyone gets the same information for 10 seconds, and the first request after those 10 seconds retrieves a new data set and caches it again. Only the first person to refresh causes a DB query; the rest get data that is at most 10 seconds old. Something like this:
Cache.Insert(key, data, null, DateTime.Now.AddSeconds(10), TimeSpan.Zero);
I guess this assumes that all users get the same data at each poll. If they each get unique datasets, this won't cut it for you.
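The caching idea can be sketched in plain JavaScript: keep a query result for 10 seconds so that only the first request after expiry touches the database. This is a sketch of the concept, not the actual ASP.NET Cache API; fetchFromDb is a stand-in for the real query:

```javascript
// Sketch of time-based caching: only the first request after expiry
// hits the "database"; everyone else gets the cached copy.
let dbCalls = 0;
function fetchFromDb() {
  dbCalls++; // count real queries for illustration
  return { value: 'latest data' };
}

const cache = new Map();
function getCached(key, ttlMs, fetcher) {
  const entry = cache.get(key);
  const now = Date.now();
  if (entry && now < entry.expires) {
    return entry.data; // still fresh: no database hit
  }
  const data = fetcher(); // expired or missing: refresh and re-cache
  cache.set(key, { data: data, expires: now + ttlMs });
  return data;
}

getCached('latest', 10000, fetchFromDb);
getCached('latest', 10000, fetchFromDb); // within 10s: served from cache
```

With a 10-second TTL, N users polling every 10 seconds cost roughly one query per interval instead of N.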
While creating a chat system, I use a long-lived request to get messages and a jQuery request to send messages, like this:
Send:
$("#btn").click(function () {
    $.ajax({
        type: "POST",
        url: "Chat.aspx/Insert",
        data: "{ 'Str' :'" + $("#txtStr").val() + "' }",
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        success: function (data) {
        },
        error: function () {
        }
    });
});
Receive:
function Refresh() {
    $.ajax({
        type: "POST",
        url: "Chat.aspx/GetRecords",
        data: "{ 'Id' : " + $("#hdnV1").val() + "}",
        success: function (data) {
            $.each($(data.d), function () {
                // Show it to the user
            });
        },
        complete: function () {
            return Refresh();
        },
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        traditional: true,
        async: true
    });
}
and this is my server-side code to get messages:
[WebMethod]
public static object GetRecords(int Id)
{
    TestEntities db = new TestEntities();
    int count = 0;
    while (true)
    {
        if (count++ >= 300)
            return null;
        System.Threading.Thread.Sleep(1000);
        var list = db.Commets.Where(rec => rec.Id > Id)
                             .Select(rec => new { Id = rec.Id, Str = rec.Str })
                             .ToList();
        if (list.Count > 0)
            return list;
    }
}
When a user writes something and clicks the Send button, the request goes into a pending state, and I think it's because the long-lived request is still executing.
I checked them in Firebug; is there anybody out there who can help me?
For more details, please leave a comment. Sorry about my English; I'm still learning.
Thanks
This is not the best method to build a chat in ASP.NET; as the number of users increases, it won't scale.
Take a look at SignalR, which is a good option for building a chat application.
However, if you do want to do it yourself for some reason, the most efficient method to build a chat in ASP.NET is to use an IHttpAsyncHandler and AJAX requests.
ASP.NET has a limited number of worker threads, and putting one of them to sleep will kill your application even for a relatively small number of users.
An async request allows you to delay the response to a request until an external event occurs.
A user makes a call to this handler and waits until someone sends them a message.
Messages are delivered as soon as they are sent.
On receiving a message, the client makes another request and waits for the next message.
This is how you keep a persistent connection open and allow the server to push data to the client.
It's a lot more efficient than polling the site to check whether messages have arrived.
Using the async handler also ensures that no ASP.NET threads are tied up while a user waits for messages; polling with a timer, you will quickly run out of threads to process requests.
This ensures that your chat can scale well even as the number of users on the site goes up.
Here is a completely working project that implements this, along with ajax.
Avoid using a database as the method of communication between users on your site. If you want to log the chats, make all the writes asynchronous calls.
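The long-polling dance can be modeled with a toy in-memory "server" (all names here are illustrative; a real implementation would use IHttpAsyncHandler on the server and $.ajax on the client). A request is parked until a message exists, and the client re-opens a request as soon as one completes:

```javascript
// Toy in-memory model of long polling.
const pending = []; // parked client callbacks (open long-poll requests)
const queue = [];   // messages with no waiting request yet

// client opens a long-poll request
function request(onMessage) {
  if (queue.length > 0) {
    onMessage(queue.shift()); // a message was already waiting
  } else {
    pending.push(onMessage);  // server holds the request open
  }
}

// someone sends a chat message
function send(message) {
  if (pending.length > 0) {
    pending.shift()(message); // deliver to the parked request immediately
  } else {
    queue.push(message);      // no one waiting; buffer it
  }
}

const received = [];
function listen() {
  // re-open the connection as soon as a message arrives
  request(function (m) { received.push(m); listen(); });
}

listen();       // no message yet: the request is parked
send('hi');     // delivered instantly to the parked request
send('there');  // the re-opened request picks this one up too
```

Notice there is no Thread.Sleep anywhere: delivery is driven by the send event, which is exactly what the async handler buys you on the server.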
The problem is that GetRecords may never return, which, unless you are using something like WebSockets, is an issue: the connection will never complete (until it times out) and you will more than likely not receive any data from it.
What you should do is let GetRecords return straight away, and then poll from your jQuery for new records every X seconds.
You could also add a Sleep inside the while loop (as well as a maximum loop limit) to allow other threads in the system to work and to prevent an infinite loop, i.e.:
int maxLoops = 50;
int loopNum = 0;
while (loopNum < maxLoops)
{
    var list = db.Commets.Where(rec => rec.Id > Id)
                         .Select(rec => new { Id = rec.Id, Str = rec.Str })
                         .ToList();
    if (list.Count > 0)
        return list;
    Thread.Sleep(250); // wait a quarter of a second
    loopNum++;
}
return null; // no new records within ~12.5 seconds
Since you wrote in a comment that you are using Session state in the application, it's likely correct that the long-running request is blocking the update.
The reason is that ASP.NET blocks concurrent access to the Session. See this link on MSDN for details. Specifically (close to the bottom of the page):
However, if two concurrent requests are made for the same session (by using the same SessionID value), the first request gets exclusive access to the session information. The second request executes only after the first request is finished.
One way to fix it might be to set EnableSessionState="ReadOnly" or EnableSessionState="False" on the page in question, assuming that you don't need write access to the Session in the particular page with the chat system.
You should be sending the Insert asynchronously as well, just like you do with GetRecords; that should solve your issue. Add async: true to the send request.
I'm developing a web application with MVC 3, Razor, C# 4, and jQuery.
Within one screen (page) I make AJAX calls to a controller action to get screen updates, polling via JavaScript's setTimeout().
To optimize this polling for the case where the server has no screen updates, I'd like to delay the HTTP response and wait a little until I either (a) get screen updates or (b) hit some timeout (e.g. 10 seconds or so).
I tried to do something like this:
[OutputCache(Duration = 0)]
[AcceptVerbs(HttpVerbs.Get)]
public ActionResult CheckForUpdates()
{
    var startTime = DateTime.Now;
    while (_haveUpdates || DateTime.Now > startTime.AddSeconds(10))
    {
        Thread.Sleep(3);
    }
    return _haveUpdates ? Json(new { updates = ... }, JsonRequestBehavior.AllowGet) : null;
}
In the view I use JavaScript/jQuery like this:
<script>
    $(function () {
        scheduleNextCheck();
    });

    function scheduleNextCheck() {
        setTimeout(function () {
            $.ajax({
                url: '@Url.Action("CheckForUpdates")',
                dataType: 'json',
                data: null,
                timeout: 10000,
                cache: false,
                success: function (data) {
                    if (data != null && data.updates != null) {
                        // Apply updates
                    } else {
                        scheduleNextCheck();
                    }
                },
                error: function () {
                    scheduleNextCheck();
                }
            });
        }, 2000);
    }
</script>
When using this code, the IIS7 worker process hangs/freezes totally, so only killing the worker process can unlock the entire IIS7 server. So Thread.Sleep() seems not to be a good idea, at least not in my application's environment.
Has anybody done something like this with ASP.NET MVC yet, or have any ideas?
Thx, Marc
==== Update: ====
Found one of the problems: the while condition was wrong. It should be like this:
while (noUpdates && DateTime.Now < startTime.AddSeconds(10))
The problem now is that other AJAX requests are canceled while this sleep-delay loop is running.
About SignalR: I'll have to take a closer look, but the problem is that it needs additional server- and client-side libraries which I'd have to get approval for, so I was hoping for an "easy", small solution with less impact and less need for training.
But SignalR is still an option for one of the next releases, to replace the polling stuff; I just need to get some training and experience with it first.
==== Update 2: ====
I am looking for a solution that works without additional libraries/frameworks, so I can apply it to the nearly finished web application without a huge impact.
You may take a look at SignalR, which is designed exactly for such situations: instead of having the client poll the server with multiple HTTP requests, the server notifies the client when updates are available.
I'm also looking into SignalR for my site. I'm reading this article, which is pretty good: http://www.dotnetcurry.com/ShowArticle.aspx?ID=780.
From my understanding, the only real constraint is the browser version, but SignalR degrades to accommodate older versions.
There's some info here too: http://www.entechsolutions.com/browser-alerts-with-asp-net-4-5-and-signalr
I want to track user activity (log data). I have several links on the page which submit a form in a specific way. The problem is:
I can't find a suitable way to handle the click event and insert the log into the database in a simple way.
My code:
HtmlGenericControl a = new HtmlGenericControl("a");
a.Attributes["onclick"] = "$('#" + frm.ClientID + "').submit();";
a.InnerText = "site " + dt_list.ElementAtOrDefault(0).Field<string>("pro_name").TrimEnd();
inner_li_1.Controls.Add(a);
Now I want to handle the click event of the link which performs the submission.
This is heavy-handed, but it works, and I use it in some cases where I cannot do it any other way.
In the JavaScript call, I call the log first, then continue with the submit. I also show a wait message on the click; however, the events happen so fast that the user doesn't realize they're waiting for anything. I also take care to avoid double-logging the action, with simple JavaScript. I call this function in the onclick event.
var confirmSubmited = false;
function SubmitWithLog(me)
{
    // to avoid many clicks...
    if (confirmSubmited)
        return false;
    confirmSubmited = true;
    jQuery.ajax({
        url: "/LogAction.ashx",
        type: "GET",
        timeout: 3000,
        async: true, // you can try async: false - maybe that is better for you
        data: { action: 4 }, // here you send the log information
        cache: false,
        success: function (html) {
            jQuery("#FormID").submit();
        },
        error: function (responseText, textStatus, XMLHttpRequest) {
            jQuery("#FormID").submit();
        }
    });
    return false;
}
The handler is:
public class LogAction : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // log here whatever you wish
        // end with no content
        context.Response.TrySkipIisCustomErrors = true;
        context.Response.Status = "204 No Content";
        context.Response.StatusCode = 204;
    }

    public bool IsReusable
    {
        get { return true; }
    }
}
I want to be clear that this is not a good way to log user actions, but it is a way when you cannot do otherwise. I use it only once, when the user submits, leaves my site, and goes to another one, and I use it to record that action.
In all other cases I use either a reference in the URL, or logging in code-behind, or an empty image in the HTML page that does the logging, as in: ASP server stats for html pages.
I tried calling AJAX and making the submit right away, but that fails; the AJAX call is stopped by the submit. The next approach was to call AJAX and set a 500 ms timeout before making the submit, but I avoided the timeout because the AJAX call takes less than 100 ms the way I have made it.
@Aristos -- why would you negatively impact the user experience by waiting for the AJAX call that does the logging to return before moving on with the event (click, submit, whatever)? Logging is a non-critical part of the application; the user experience shouldn't change based on its results. It's a "send and forget" type of routine: fire off the AJAX request, leave the success and failure handlers empty, and execute the submit on the line immediately following the $.ajax() call.
So, some pseudo code for you...
function handleClick(e) {
    handleLoggingOnClick("something happened here");
    myForm.submit();
}

function handleLoggingOnClick(message) {
    $.ajax({ url: "logging.do", data: "message=" + encodeURIComponent(message) });
}
I've been trying to create a controller in my project for delivering what could turn out to be quite complex reports. As a result they can take a relatively long time, and a progress bar would certainly help users know that things are progressing. The report will be kicked off via an AJAX request, with the idea being that periodic JSON requests will get the status and update the progress bar.
I've been experimenting with the AsyncController, as that seems to be a nice way of running long processes without tying up resources, but it doesn't appear to give me any way of checking on the progress (and it seems to block further JSON requests; I haven't discovered why yet). After that I resorted to storing progress in a static variable on the controller and reading the status from that, but to be honest that all seems a bit hacky!
All suggestions gratefully accepted!
Here's a sample I wrote that you could try:
Controller:
public class HomeController : AsyncController
{
    public ActionResult Index()
    {
        return View();
    }

    public void SomeTaskAsync(int id)
    {
        AsyncManager.OutstandingOperations.Increment();
        Task.Factory.StartNew(taskId =>
        {
            for (int i = 0; i < 100; i++)
            {
                Thread.Sleep(200);
                HttpContext.Application["task" + taskId] = i;
            }
            var result = "result";
            AsyncManager.OutstandingOperations.Decrement();
            AsyncManager.Parameters["result"] = result;
            return result;
        }, id);
    }

    public ActionResult SomeTaskCompleted(string result)
    {
        return Content(result, "text/plain");
    }

    public ActionResult SomeTaskProgress(int id)
    {
        return Json(new
        {
            Progress = HttpContext.Application["task" + id]
        }, JsonRequestBehavior.AllowGet);
    }
}
Index() View:
<script type="text/javascript">
    $(function () {
        var taskId = 543;
        $.get('/home/sometask', { id: taskId }, function (result) {
            window.clearInterval(intervalId);
            $('#result').html(result);
        });
        var intervalId = window.setInterval(function () {
            $.getJSON('/home/sometaskprogress', { id: taskId }, function (json) {
                $('#progress').html(json.Progress + '%');
            });
        }, 5000);
    });
</script>
<div id="progress"></div>
<div id="result"></div>
The idea is to start an asynchronous operation that reports its progress via HttpContext.Application, meaning each task must have a unique id. On the client side, we start the task and then send AJAX requests every 5 seconds to update the progress. You may tweak the parameters to suit your scenario; a further improvement would be to add exception handling.
4.5 years after this question was answered, we have a library that can make this task much easier: SignalR. No need to use shared state (which is bad because it can lead to unexpected results); just use the HubContext class to connect to a Hub that sends messages to the client.
First, we set up a SignalR connection as usual (see e.g. here), except that we don't need any server-side methods on our Hub. Then we make an AJAX call to our endpoint/controller/whatever and pass the connection ID, which we get as usual: var connectionId = $.connection.hub.id;. On the server side, you can start your process on a different thread and return 200 OK to the client. The process knows the connectionId, so it can send messages back to the client like this:
GlobalHost.ConnectionManager.GetHubContext<LogHub>()
.Clients.Client(connectionId)
.log(message);
Here, log is a client-side method that you want to call from the server, hence it should be defined like you usually do with SignalR:
$.connection.logHub.client.log = function(message){...};
More details in my blog post here
If you have access to your web server machine, one option is to create a Windows service or a simple console application to perform the long-running work. Your web app adds a record to the database to indicate that the operation should start, and your Windows service, which periodically checks for new records in the database, starts performing the task and reports progress back to the database.
Your web app can then use AJAX requests to check the progress and show it to users.
I used this method to implement Excel reports for an ASP.NET MVC application.
The reports were created by a Windows service that runs on the machine and constantly checks for new records in a reports table; when it finds a new record, it starts creating the report and indicates progress by updating a record field. The ASP.NET MVC application just added the new report record and tracked progress in the database until the report finished, then provided a link to the finished file for download.
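The browser side of this pattern might look like the sketch below. Here getStatus stands in for the AJAX call to a status endpoint, and the field names (progress, finished, downloadUrl) are assumptions, not part of the original answer:

```javascript
// Poll the report's status until it is marked finished, then surface
// the download link. getStatus is a stand-in for an AJAX status call.
function pollReportStatus(getStatus, onProgress, onDone) {
  const status = getStatus();
  onProgress(status.progress);
  if (status.finished) {
    onDone(status.downloadUrl);
  } else {
    // check again in a couple of seconds
    setTimeout(function () {
      pollReportStatus(getStatus, onProgress, onDone);
    }, 2000);
  }
}

// Example with a report that is already complete:
const events = [];
pollReportStatus(
  function () {
    return { progress: 100, finished: true, downloadUrl: '/reports/42.xlsx' };
  },
  function (p) { events.push('progress: ' + p); },
  function (url) { events.push('done: ' + url); }
);
```

Because the heavy work lives in the Windows service, this polling endpoint only reads one database row per request, which stays cheap even with many users watching.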