Long polling stops other requests for 1 or 2 minutes - C#

While creating a chat system, I use a long-lived request to get messages and a jQuery request to send messages, like this:
Send:
$("#btn").click(function () {
$.ajax({
type: "POST",
url: "Chat.aspx/Insert",
data: "{ 'Str' :'" + $("#txtStr").val() + "' }",
contentType: "application/json; charset=utf-8",
dataType: "json",
success: function (data) {
},
error: function () {
}
});
});
Receive:
function Refresh() {
    $.ajax({
        type: "POST",
        url: "Chat.aspx/GetRecords",
        data: "{ 'Id' : " + $("#hdnV1").val() + "}",
        success: function (data) {
            $.each($(data.d), function () {
                // Show it to the user
            });
        },
        complete: function () {
            return Refresh();
        },
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        traditional: true,
        async: true
    });
}
And this is my server-side code to get messages:
[WebMethod]
public static object GetRecords(int Id)
{
    TestEntities db = new TestEntities();
    int count = 0;
    while (true)
    {
        if (count++ >= 300)
            return null;
        System.Threading.Thread.Sleep(1000);
        var list = db.Commets.Where(rec => rec.Id > Id)
                             .Select(rec => new { Id = rec.Id, Str = rec.Str })
                             .ToList();
        if (list.Count > 0)
            return list;
    }
}
When the user writes something and clicks the Send button, the request goes into a pending state, and I think it is because the long-lived request is still executing.
I checked the requests in Firebug. Is there anybody out there who can help me?
If you need more details, please leave a comment. Sorry about my English, I am still learning.
Thanks

This is not the best way to build a chat in ASP.NET; as the number of users increases, this method won't scale.
Take a look at SignalR; it is a good option for building a chat application.
However, if you do want to do it yourself for some reason, the most efficient way to build a chat in ASP.NET is to use an IHttpAsyncHandler and AJAX requests.
ASP.NET has a limited number of worker threads. Putting one of them to sleep will kill your application even for a relatively small number of users.
An async request allows you to delay the response of a request till an external event occurs.
A user makes a call to this handler and waits till someone sends him a message.
Messages are delivered as soon as you send them to a user.
On receiving a message the client makes another request and waits for the next message.
This is how you keep a persistent connection open and allows the server to push data to the client.
This is a lot more efficient than polling the site to check if messages have arrived.
Using the async handler also ensures that no ASP.NET threads are wasted while a user waits for messages to arrive. If you poll with a timer, you will quickly run out of threads to process requests in ASP.NET.
This ensures that your chat can scale well even as the number of users of the site goes up.
Here is a completely working project that implements this, along with AJAX.
Avoid using a database as the method of communication between users on your site. If you want to log the chats, make all the writes asynchronous.
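To illustrate the idea, here is a minimal, hedged sketch of a long-polling IHttpAsyncHandler; the class, queue and PublishMessage names are illustrative and not taken from the linked project:

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;
using System.Web;

public class ChatLongPollHandler : IHttpAsyncHandler
{
    // Clients currently waiting for a message, each represented by a TaskCompletionSource.
    private static readonly ConcurrentQueue<TaskCompletionSource<string>> Waiting =
        new ConcurrentQueue<TaskCompletionSource<string>>();

    public bool IsReusable { get { return true; } }

    public IAsyncResult BeginProcessRequest(HttpContext context, AsyncCallback cb, object extraData)
    {
        // Park the request without blocking an ASP.NET worker thread.
        var tcs = new TaskCompletionSource<string>(context);
        Waiting.Enqueue(tcs);
        tcs.Task.ContinueWith(t => { if (cb != null) cb(t); });
        return tcs.Task;
    }

    public void EndProcessRequest(IAsyncResult result)
    {
        var task = (Task<string>)result;
        var context = (HttpContext)task.AsyncState;
        context.Response.ContentType = "application/json";
        context.Response.Write(task.Result);
    }

    // Called from the "send message" code path to push the message to every waiting client.
    public static void PublishMessage(string json)
    {
        TaskCompletionSource<string> tcs;
        while (Waiting.TryDequeue(out tcs))
            tcs.TrySetResult(json);
    }

    public void ProcessRequest(HttpContext context)
    {
        throw new NotSupportedException(); // the async path above is always used
    }
}

The client keeps the same loop as the Refresh() function above: when a response arrives it renders the message and immediately issues the next request.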

The problem is that GetRecords may never return, which, unless you are using something like WebSockets, is an issue. It means the connection will never complete (until it times out) and you will most likely not receive any data from it.
What you should do is let GetRecords return straight away, and then poll from your jQuery for new records every X seconds.
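For that first suggestion, a minimal sketch of an immediately-returning GetRecords (keeping the TestEntities/Commets names from the question) could look like this:

[WebMethod]
public static object GetRecords(int Id)
{
    // Return whatever is newer than Id straight away; the client polls again later.
    using (var db = new TestEntities())
    {
        return db.Commets
                 .Where(rec => rec.Id > Id)
                 .Select(rec => new { rec.Id, rec.Str })
                 .ToList();
    }
}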
You could also try adding a Sleep in the while loop (as well as a maximum loop limit) to allow other threads in the system to do work and to prevent an infinite loop, i.e.:
int maxLoops = 50;
int loopNum = 0;
while (loopNum < maxLoops)
{
    var list = db.Commets.Where(rec => rec.Id > Id)
                         .Select(rec => new { Id = rec.Id, Str = rec.Str })
                         .ToList();
    if (list.Count > 0)
        return list;
    Thread.Sleep(250); // wait 1/4 of a second
    loopNum++;
}

Since you wrote in a comment that you are using Session state in the application, it is likely correct that the long-running request is blocking the update.
The reason is that ASP.NET blocks concurrent access to the Session. See this link on MSDN for details. Specifically (close to the bottom of the page):
However, if two concurrent requests are made for the same session (by using the same SessionID value), the first request gets exclusive access to the session information. The second request executes only after the first request is finished.
One way to fix it might be to set EnableSessionState="ReadOnly" or EnableSessionState="False" on the page in question. This assumes that you don't need access to the Session in the particular page with the chat system.
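For example, assuming the chat page itself does not otherwise need the Session, its page directive could be changed to something like:

<%@ Page Language="C#" EnableSessionState="ReadOnly" %>

With "ReadOnly", concurrent requests can still read the Session but no longer take the exclusive lock that serializes them.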

You should be sending the Insert asynchronously as well, just like you do with GetRecords. That should solve your issue. Add async: true to the send request.

Related

.NET web forms update site content server side

I have a website used by hundreds of viewers every day. One of the pages has a timer that ticks every 10 seconds; when it does, it gets the latest data from the database and updates the screen.
The problem I have is that a high number of users and a high number of database connections take their toll on the server.
Is there a better way of doing this? For example, updating server side so that all users benefit from the latest data, but with only the server carrying out the calls every 10 seconds rather than every user?
SignalR is the way you'll want to go. Right now, your application is probably loading a page. Then you have some jQuery that probably sets a timer for 10 seconds. Then your timer kicks off and you're probably doing an AJAX call to get refreshed data, then putting that refreshed data into a <div> or something.
So essentially, every 10 seconds your back end is calling your SQL Server, doing some kind of SELECT statement; then the data from the SQL Server is transmitted to your application server, where you take that data and transform it into displayable data.
SignalR, on the other hand, works differently. It uses push technology, which works like this. Let's say you have 5 people visiting your page right now. One person (person A) does something that saves data to the database. No one else sees this data yet. But SignalR will send a signal out to everyone else (or just the people this database save affects) that says "Hey! There is newer data available. You should update now". The other connected people do an AJAX call and get the refreshed data. And voilà! The other 4 people now have updated data on their screen!
I hope I explained this clearly enough for you to understand! Scott Hanselman wrote a good introduction to SignalR.
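As a rough sketch of the server side of that push, assuming SignalR 2.x and illustrative hub/method names (not taken from Hanselman's article):

using Microsoft.AspNet.SignalR;

// Browsers connect to this hub; the code path that saves new data calls
// NotifyDataChanged(), and every connected client is told to re-fetch.
public class UpdatesHub : Hub
{
    public static void NotifyDataChanged()
    {
        var hub = GlobalHost.ConnectionManager.GetHubContext<UpdatesHub>();
        hub.Clients.All.refreshData(); // the client-side handler name is up to you
    }
}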
I would use setInterval() to fire an AJAX function every 10 seconds.
window.setInterval("javascript function", milliseconds);
The JavaScript function would be similar to:
function GetLatest() {
    $.ajax({
        type: 'POST',
        url: '../Services/UpdateService.asmx/GetLatest',
        data: '',
        contentType: "application/json; charset=utf-8",
        success: function (data) {
            // use data.d to get the info passed back from the web service,
            // then add your logic to update your HTML
        },
        error: function () {
            // catch any bad data
        }
    });
}
Your back-end method should look like this (Object stands in for whatever your own object is):
[WebMethod]
[ScriptMethod(ResponseFormat = ResponseFormat.Json)]
public Object GetLatest()
{
    Object obj = new Object();
    obj.FirstName = "Dave";
    obj.LastName = "Ward";
    return obj;
}
Have you considered having your server-side code cache the data for those 10 seconds? Cache it and set its expiration to 10 seconds. That way, everyone gets the same information for 10 seconds, and the first request after those 10 seconds retrieves a new data set and caches it again. Only the first person to refresh causes a DB query; the rest get data up to 10 seconds old. Something like this:
Cache.Insert(key, data, null, DateTime.Now.AddSeconds(10), TimeSpan.Zero);
I guess this assumes that the users are all getting the same data at each poll. If they each get unique data sets, this won't cut it for you.
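A hedged sketch of that pattern, with the Record class and LoadFromDatabase standing in for the real data access code:

using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public class Record
{
    public int Id { get; set; }
    public string Text { get; set; }
}

public static class LatestDataCache
{
    private static readonly object SyncRoot = new object();
    private const string CacheKey = "latest-data";

    public static List<Record> GetLatest()
    {
        var cached = HttpRuntime.Cache[CacheKey] as List<Record>;
        if (cached != null)
            return cached;

        lock (SyncRoot) // only one request rebuilds the cache after it expires
        {
            cached = HttpRuntime.Cache[CacheKey] as List<Record>;
            if (cached != null)
                return cached;

            var data = LoadFromDatabase(); // the database is hit at most once per 10 seconds
            HttpRuntime.Cache.Insert(CacheKey, data, null,
                DateTime.Now.AddSeconds(10), Cache.NoSlidingExpiration);
            return data;
        }
    }

    // Placeholder for whatever data access code the site already uses.
    private static List<Record> LoadFromDatabase()
    {
        return new List<Record>();
    }
}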

How to delay a controller action result in ASP.Net MVC3 C#4

I'm developing a web application with MVC3, Razor, C# 4 and jQuery.
Within one screen (page) I make AJAX calls to a controller action to get some screen updates. I poll using JavaScript's setTimeout().
To optimize this polling for the case where the server has no screen updates, I would like to delay the HTTP response and wait a little until I either a) get screen updates or b) hit some timeout (e.g. 10 seconds or so).
I tried to do something like this:
[OutputCache(Duration = 0)]
[AcceptVerbs(HttpVerbs.Get)]
public ActionResult CheckForUpdates()
{
    var startTime = DateTime.Now;
    while (_haveUpdates || DateTime.Now > startTime.AddSeconds(10))
    {
        Thread.Sleep(3);
    }
    return _haveUpdates ? Json(new { updates = ... }, JsonRequestBehavior.AllowGet) : null;
}
In the view I use JavaScript / jQuery like this:
<script>
    $(function () {
        scheduleNextCheck();
    });

    function scheduleNextCheck() {
        setTimeout(function () {
            $.ajax({
                url: '@Url.Action("CheckForUpdates")',
                dataType: 'json',
                data: null,
                timeout: 10000,
                cache: false,
                success: function (data) {
                    if (data != null && data.updates != null) {
                        // Apply updates
                    } else {
                        scheduleNextCheck();
                    }
                },
                error: function () {
                    scheduleNextCheck();
                }
            });
        }, 2000);
    }
</script>
When using this code, the IIS7 worker process hangs/freezes completely, so only killing the worker process can unlock the entire IIS7 server. So Thread.Sleep() does not seem to be a good idea, at least not in my application's environment.
Has anybody done something like this with ASP.NET MVC yet, or does anyone have an idea?
Thx, Marc
==== Update: ====
Found one of the problems: the while condition was wrong. It should be like this:
while (!_haveUpdates && DateTime.Now < startTime.AddSeconds(10))
The problem now is that other AJAX requests are canceled while this sleep/delay loop is running.
About SignalR: I have to take a closer look, but the problem is that it needs additional server- and client-side libraries which I would have to get approval for, so I was hoping for an "easy", small solution with less impact and less need for training.
SignalR is still an option for one of the next releases to replace the polling, but I need to get some training and experience with it first.
==== Update 2: ====
I am looking for a solution that works without additional libraries / frameworks, so I can apply it to the nearly finished web application without a huge impact.
You may take a look at SignalR, which is designed exactly for such situations. Instead of having the client poll the server with multiple HTTP requests, it is the server that notifies the client when updates are available.
I'm also looking into SignalR for my site. I'm reading this article, which is pretty good: http://www.dotnetcurry.com/ShowArticle.aspx?ID=780.
From my understanding, the only real constraint is the browser version, but SignalR degrades to accommodate older versions.
There's some info here too: http://www.entechsolutions.com/browser-alerts-with-asp-net-4-5-and-signalr

How to log the link click that makes a form submission?

I want to track user activity (log data). I have several links on the page which submit the form in a specific way. The problem is:
I can't find a suitable way to handle the click event and insert the log into the database in a simple way.
My code:
HtmlGenericControl a = new HtmlGenericControl("a");
a.Attributes["onclick"] = "$('#" + frm.ClientID + "').submit();";
a.InnerText = "site " + dt_list.ElementAtOrDefault(0).Field<string>("pro_name").TrimEnd();
inner_li_1.Controls.Add(a);
Now I want to handle the click event of the link that makes the submission.
This is a bit much, however it works, and I use it in some cases where I cannot do it another way.
In the JavaScript call, I call the log first, then continue with the submit. I also show a wait message on the click; however, the events happen so fast that the user does not notice waiting for anything. I also take care to avoid logging the action twice with some simple JavaScript. I call this function on the onclick event.
var confirmSubmited = false;

function SubmitWithLog(me) {
    // to avoid many clicks...
    if (confirmSubmited)
        return false;
    confirmSubmited = true;

    jQuery.ajax({
        url: "/LogAction.ashx",
        type: "GET",
        timeout: 3000,
        async: true, // you can also try async: false - maybe that works better for you
        data: "action=4", // here you send the log information
        cache: false,
        success: function (html) {
            jQuery("#FormID").submit();
        },
        error: function (responseText, textStatus, XMLHttpRequest) {
            jQuery("#FormID").submit();
        }
    });
    return false;
}
The handler is as follows:
public class LogAction : IHttpHandler
{
    public bool IsReusable
    {
        get { return false; }
    }

    public void ProcessRequest(HttpContext context)
    {
        // log here whatever you wish

        // end with no content
        context.Response.TrySkipIisCustomErrors = true;
        context.Response.Status = "204 No Content";
        context.Response.StatusCode = 204;
    }
}
I would like to be clear that this is not a good way to log user actions, but it is a way when you cannot do otherwise. I use it only once, when the user submits and leaves my site for another site, and I use it to record that action.
In all other cases I use either a reference in the URL, or logging in code-behind, or an empty image included in the HTML page that does the logging, like this: ASP server stats for html pages
I have tried calling ajax and making the submit right away, but it fails; the ajax call is stopped by the submit. The next approach was to call ajax and set a 500ms timeout before making the submit, but I avoid the timeout because the ajax call takes less than 100ms the way I have made it.
@Aristos: Why in the world would you negatively impact the user experience by waiting for the Ajax call that does the logging to return before moving on with the event (click, submit, whatever...)?!? Logging is a non-critical part of the application, where the user experience does not change based on its results. It's a "send and forget" type routine. Fire off the Ajax request, leave the success and failure routines empty, and execute the submit on the line immediately following the $.ajax(); call.
So, some pseudo code for you...
function handleClick(e) {
    handleLoggingOnClick("something happened here");
    myForm.submit();
}

function handleLoggingOnClick(message) {
    $.ajax({ url: "logging.do", data: "message=" + message });
}

sending emails without delay

Brief idea: I am developing a small social-networking kind of site.
Now there's a user "A" who has 100 followers. What I want is: whenever user "A" submits an article on the website, all his followers should get an email with the article link. That part is fine, I can do it.
Problem: there's a submit button on the page which stores the article in the DB and sends emails to the followers. As there are many followers, it takes a lot of time to send the emails, so the page keeps showing a loading message until all the emails are sent. How can I send all the emails asynchronously?
I mean, after the article has been submitted, the emails should go out automatically to the followers without putting the email-sending function in the click event of the button. Hope I am not confusing you folks.
Can I do something like: store the article in the DB, redirect to the article page, and start sending emails automatically in batches of 10 every 10 minutes? This process should start as soon as an article has been submitted by a user.
I had a similar issue with batch emails, and various other long-running tasks.
I developed a Windows service which contained a job manager. When a job needs to run from the main MVC application, the web application communicates with the service over HTTP (actually, using JSON), and the service performs the meat of the work: actually sending emails, or performing other long-running tasks.
This means the web application request returns immediately.
The web application can also poll the service to determine the status of any particular job that is running (each job is given a unique identifier).
I would create a database table containing information about all pending email notifications.
When hitting submit, you can quickly add rows to this table.
Then, a background thread can check the table and send the mails (and of course remove the successfully sent ones from the table).
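A hedged sketch of that approach, where PendingEmail, LoadPendingEmails and DeletePendingEmail stand in for whatever table and data-access layer the site actually has:

using System;
using System.Collections.Generic;
using System.Net.Mail;
using System.Threading;

public class PendingEmail
{
    public int Id { get; set; }
    public string To { get; set; }
    public string Subject { get; set; }
    public string Body { get; set; }
}

public static class EmailDispatcher
{
    public static void Start()
    {
        var worker = new Thread(ProcessLoop) { IsBackground = true };
        worker.Start();
    }

    private static void ProcessLoop()
    {
        while (true)
        {
            foreach (PendingEmail mail in LoadPendingEmails())
            {
                try
                {
                    using (var client = new SmtpClient())
                    {
                        client.Send("noreply@example.com", mail.To, mail.Subject, mail.Body);
                    }
                    DeletePendingEmail(mail.Id); // remove only after a successful send
                }
                catch (SmtpException)
                {
                    // Leave the row in place so it is retried on the next pass.
                }
            }
            Thread.Sleep(TimeSpan.FromSeconds(30)); // poll the table every 30 seconds
        }
    }

    // Placeholders for the real data access code.
    private static IEnumerable<PendingEmail> LoadPendingEmails() { return new List<PendingEmail>(); }
    private static void DeletePendingEmail(int id) { }
}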
Have you thought about implementing it using AJAX?
When the user presses the submit button, instead of posting back to the server, make 2 AJAX calls:
The first one saves the article to the repository (database?).
After receiving a successful answer from the server (which can include the article id), invoke the 2nd AJAX call to send the mails. The server can start a thread to send the mails so the answer to the client is immediate.
My preferred way of invoking AJAX calls is using jQuery:
$.ajax({
    type: "POST",
    url: "services.aspx/SubmitArticle",
    data: "{articlecontent: '[put here the content you want to send]'}",
    contentType: "application/json; charset=utf-8",
    dataType: "json",
    error: function (response) {
        // Handle the error here. The response object contains the error details.
    },
    success: function (response) {
        // Check here if the article has been saved:
        // the response.d property contains the server answer. It can be a boolean,
        // integer, string or any other complex object.
        // If the article was saved, invoke the send-mail AJAX call here. Assuming
        // response.d contains the article id:
        sendMails(response.d);
        // sendMails invokes another ajax call similar to this code snippet
    }
});
On the server side, the async email send method can look like:
[WebMethod]
public static bool SendMails(int articleId)
{
    // Queue the method that actually sends the mail onto the thread pool
    ThreadPool.QueueUserWorkItem(new WaitCallback(DoSendMail), articleId);
    return true;
}

private static void DoSendMail(object a)
{
    int articleId = (int)a;
    // Your code that sends the mails goes here
}
You could use a queueing system like MassTransit, ZMQ or MSMQ.
Or... If you really wanted to create a cool app, you could pass the emailing task to a node.js app!?
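For the MSMQ option, here is a minimal hedged sketch of handing the work off to a queue; the queue path is illustrative, and a separate service or background worker would read the queue and do the actual SMTP sends:

using System.Messaging;

public static class EmailQueue
{
    private const string QueuePath = @".\private$\article-emails";

    public static void EnqueueArticleNotification(int articleId)
    {
        if (!MessageQueue.Exists(QueuePath))
            MessageQueue.Create(QueuePath);

        using (var queue = new MessageQueue(QueuePath))
        {
            // The web request returns as soon as the message is queued.
            queue.Send(articleId, "article-published");
        }
    }
}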

Loading new data onto a page without reload

I'm writing a high-level spec for an ASP.NET page which may present some delayed data.
When the page loads, the initial data presented will originate from a local database (which will be fast to present). What I want is a separate process to go out and look for updated data (from whatever other services I have). This is more time-consuming, but the idea is to present the data first, and then, if newer data is found, append it on the fly to the top of the existing page.
I would like some recommendations on how to accomplish this.
The tech scope for this is ASP.NET 4.0, C# MVC3 and HTML5.
Thanks.
AJAX with jQuery is a good way to achieve this. For example, you could put a content placeholder div in your markup:
<div id="result" data-remote-url="#Url.Action("Load", "SomeController")"></div>
and then once the DOM is loaded:
$(function () {
    $.ajax({
        url: $('#result').data('remote-url'),
        type: 'POST',
        beforeSend: function () {
            // TODO: you could show an AJAX loading spinner
            // to indicate to the user that there is an ongoing
            // operation so that he doesn't run out of patience
        },
        complete: function () {
            // this will be executed no matter whether the AJAX request
            // succeeds or fails => you could hide the spinner here
        },
        success: function (result) {
            // In case of success update the corresponding div with
            // the results returned by the controller action
            $('#result').html(result);
        },
        error: function () {
            // something went wrong => inform the user
            // in the gentlest possible manner and remember
            // that he spent some of his precious time waiting
            // for those results
        }
    });
});
where the Load controller action will take care of communicating with the remote services and return a partial view containing the data:
public ActionResult Load()
{
    var model = ... // go ahead and fetch the model from the remote service
    return PartialView(model);
}
Now, if this fetching of data is I/O intensive, you could take advantage of asynchronous controllers and I/O completion ports, which will avoid jeopardizing worker threads during the lengthy operation of fetching data from a remote source.
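As a hedged sketch of what that could look like with an MVC 3 AsyncController (the remote URL is a placeholder, and the model is simplified to the raw string returned by the remote service):

using System;
using System.Net;
using System.Web.Mvc;

public class SomeController : AsyncController
{
    public void LoadAsync()
    {
        AsyncManager.OutstandingOperations.Increment();

        var client = new WebClient();
        client.DownloadStringCompleted += (sender, e) =>
        {
            // Hand the result to LoadCompleted through the AsyncManager parameter bag.
            AsyncManager.Parameters["data"] = e.Error == null ? e.Result : null;
            AsyncManager.OutstandingOperations.Decrement();
        };
        client.DownloadStringAsync(new Uri("http://remote-service.example/data"));
    }

    // The parameter name must match the key used in AsyncManager.Parameters.
    public ActionResult LoadCompleted(string data)
    {
        return PartialView("Load", (object)data);
    }
}

No worker thread is held while the remote call is in flight; the request is resumed once the download completes, which is what keeps the lengthy fetch from starving the thread pool.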
