Rapid number updates on a website - c#

I am wondering how to display rapidly updating numbers on a website.
I have a machine that generates a lot of output, and I need to show it online. My problem is that the update frequency is high, and I am not sure how to handle it.
It would be nice to show the last N numbers, say ten. The numbers are updated at 30 Hz. That might be too fast for the human eye, but the display is only for monitoring here.
A full page reload would keep the browser continuously loading the page, and a real page would need to show more than just these numbers.
I could write a raw web server that pushes the numbers to a page on a specific IP address and port, but even then I wonder whether reloading the page would be too slow and give users a strange experience.
How should I deal with such a high update rate of data on a website? Most websites don't work like that.
In the tags for this question I named the languages that I understand; in the end I will probably write this in C#.

a) WebSockets, in conjunction with AJAX to update only parts of the page, would work. The disadvantage is that the client's infrastructure (proxies) must support them, which is currently not the case 99% of the time.
b) With existing infrastructure the approach is long polling. You make an XmlHttpRequest from JavaScript. If no data is present, the server holds the request open for, say, 5 to 10 seconds; as soon as data is available, it answers the request immediately. The client then immediately sends a new request. I managed to get more than 500 updates per second with a Java client connecting through a proxy over HTTP to a web server (displaying real-time stock data).
You need to bundle several updates into each response in order to get enough throughput; see the sketch below.
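A minimal sketch of such a long-polling endpoint, assuming the C# mentioned in the question and ASP.NET MVC on the server; the Updates buffer, the controller name, and the 10-second hold are illustrative assumptions, not part of the original answer:

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;
using System.Web.Mvc;

public static class Updates
{
    private static readonly BlockingCollection<double> Buffer = new BlockingCollection<double>();

    // Called by the producer (the machine's 30 Hz data feed).
    public static void Publish(double value)
    {
        Buffer.Add(value);
    }

    // Blocks up to 'timeout' for the first value, then drains everything queued,
    // so each response carries a bundle of updates rather than a single number.
    public static List<double> WaitForBatch(TimeSpan timeout)
    {
        var batch = new List<double>();
        double first;
        if (Buffer.TryTake(out first, timeout))
        {
            batch.Add(first);
            double next;
            while (Buffer.TryTake(out next))
            {
                batch.Add(next);
            }
        }
        return batch;   // an empty list just means the client should poll again
    }
}

public class NumbersController : Controller
{
    [HttpGet]
    public async Task<ActionResult> Poll()
    {
        // Wait off the request thread so held connections don't starve the pool.
        var batch = await Task.Run(() => Updates.WaitForBatch(TimeSpan.FromSeconds(10)));
        return Json(batch, JsonRequestBehavior.AllowGet);
    }
}

The JavaScript side simply issues the next request as soon as each response arrives, exactly as described above.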

You don't have to use a page reload. You can use WebSockets to establish an open two-way communication between a browser (via JavaScript) and your server.
Python Tornado has support for this built-in. Additionally, there are a couple of other Python servers that support it. Socket.IO is a great JavaScript library, with fallback, to facilitate the client side.
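The question mentions C#; if you stay on the .NET side instead, the framework has had built-in WebSocket support since .NET 4.5 (HttpListener-based, Windows 8 / Server 2012 or later). A rough sketch, with a placeholder address and random numbers standing in for the machine's output:

using System;
using System.Net;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

class WebSocketPushServer
{
    static async Task Main()
    {
        var listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:8080/numbers/");   // placeholder address
        listener.Start();

        var context = await listener.GetContextAsync();
        if (!context.Request.IsWebSocketRequest)
        {
            context.Response.StatusCode = 400;
            context.Response.Close();
            return;
        }

        var socket = (await context.AcceptWebSocketAsync(subProtocol: null)).WebSocket;
        var random = new Random();

        while (socket.State == WebSocketState.Open)
        {
            // Stand-in for the machine's output; push roughly 30 values per second.
            var payload = Encoding.UTF8.GetBytes(random.NextDouble().ToString("F4"));
            await socket.SendAsync(new ArraySegment<byte>(payload),
                                   WebSocketMessageType.Text, true, CancellationToken.None);
            await Task.Delay(33);
        }
    }
}

In the browser, a few lines of JavaScript with new WebSocket(...) and an onmessage handler are enough to write each value into the page.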

On the backend you can use Redis or a NewSQL database like VoltDB for fast in-memory database updates. Caching helps a lot with high-latency components (especially in a write-heavy application).
On the front end you can look into WebSockets and the Comet web application model: http://en.wikipedia.org/wiki/Comet_%28programming%29
Many gaming companies have to deal with fast counter updates and displays, so it might be worth looking into how they do it. Zynga uses a protocol called AMF: http://highscalability.com/blog/2010/3/10/how-farmville-scales-the-follow-up.html

Related

Read real time data from other websites

I need a function on my website that can update sports data in real time, for example the result of a game. I have seen some websites do this, but I don't know how to monitor that data. Any suggestions or help?
There are 4 possible approaches for displaying real-time data on a website:
1. Refresh the page at periodic intervals
Obsolete method. Not recommended for modern apps.
2. AJAX calls from the browser to pull data at periodic intervals
This is the most popular method, currently used by many websites.
Can be done with the least development effort.
3. WebSocket
Modern method. Used extensively in the financial services domain.
Good for bi-directional communication between client and server.
Adds unnecessary overhead for simple one-way updates from the server (example: a match score).
4. SSE (Server-Sent Events)
Most modern of all the methods. Quickly gaining adoption.
Least overhead.
Most preferred for near real-time updates from server to client; see the sketch below the list.
More information on SSE:
https://developers.facebook.com/docs/graph-api/server-sent-events/
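As a rough C# illustration of the SSE option, an ASP.NET MVC action can write the text/event-stream format directly; the controller name, the one-second interval, and the timestamp payload are made up for the sketch, and the browser would consume it with the standard EventSource API:

using System;
using System.Threading;
using System.Web.Mvc;

public class LiveController : Controller
{
    // Subscribed from the page with: new EventSource("/Live/Stream")
    public void Stream()
    {
        Response.ContentType = "text/event-stream";
        Response.CacheControl = "no-cache";

        while (Response.IsClientConnected)
        {
            // Replace with the real score or number to push.
            Response.Write("data: " + DateTime.UtcNow.ToString("o") + "\n\n");
            Response.Flush();
            Thread.Sleep(1000);   // roughly one event per second
        }
    }
}

Note that this holds a worker thread per connected client, so it is a demonstration of the wire format rather than a production setup.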

Pass Info to Client without Page Refresh in Asp.net MVC Application

Today we see sites that push data or notifications to the client without a page refresh; these are called real-time or interactive applications.
Some well-known examples are:
Stack Overflow: notifications
Freelancer: updates project and professional counts asynchronously as numbers
Google Mail: shows the total mail storage used by users
and so on.
I have tried and searched for some tools like SignalR. Basically, SignalR is designed for creating chat applications. But is there a direct way, without any extension, in Microsoft technologies to meet our purpose? For example, suppose we want a simple counter like Freelancer's; is there no way other than using extensions like SignalR?
You can look at a technique called polling (which SignalR falls back to when support for other methods is not present). Basically, the concept is that every x seconds you send a request to the server to check for an update, for example (using jQuery):
setInterval(function () {
    $.get("/Messages/GetCount", function (data) {
        // do something with the data ...
    });
}, 30000);
Every 30 seconds, check the Messages count - and perform an action accordingly. Here is a good article on polling and long polling (it mentions a SignalR alternative called Socket.IO).
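For completeness, the server-side counterpart of that call could be a trivial MVC action like the one below; the MessagesController, its GetCount action, and the in-memory MessageStore are assumptions made to match the URL used in the snippet above:

using System.Threading;
using System.Web.Mvc;

public static class MessageStore
{
    private static int _count;
    public static void Increment() { Interlocked.Increment(ref _count); }
    public static int GetUnreadCount() { return Interlocked.CompareExchange(ref _count, 0, 0); }
}

public class MessagesController : Controller
{
    [HttpGet]
    public JsonResult GetCount()
    {
        // Stand-in for whatever store actually holds the count.
        return Json(MessageStore.GetUnreadCount(), JsonRequestBehavior.AllowGet);
    }
}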
Having said all that, I'd seriously just go with SignalR; those guys have tested all kinds of corner cases, performance, etc.
Use a JavaScript timer on the client side to make periodic asynchronous requests for updated information. This updated information can then be used to update the client side, or can be used to prompt further requests for more details.
This solution can work for situations where you do not need to receive immediate updates whenever there are updates available on the server side (but instead can wait for the timer interval). It also may present some scaling issues and can lead to wasting bandwidth and client/server time while making unnecessary calls.
To overcome either of these, it would be best to use a library like SignalR (which can do much more than just chat applications - check out this blog post for a real world implementation that has nothing to do with chat).
Use Microsoft's ASP.NET AJAX implementation or jQuery:
Microsoft Ajax Overview

Pushing data from an ASP.NET MVC Controller to a View

I'm building the back end to a site which will have multiple "widgets" on the front end which need to update in real time.
Right now I simply have a load method which populates all the widgets with data, on page load obviously. My question is how to handle the real time aspect of further updates.
I thought of having just multiple ajax calls, which could query a service every second or so, and return the latest data, but that seems inefficient.
Is there a way to "push" data to a View from a Controller?
Maybe you can have a look at this project: https://github.com/SignalR/SignalR
ASP.NET SignalR is a new library for ASP.NET developers that makes it
incredibly simple to add real-time web functionality to your
applications. What is "real-time web" functionality? It's the ability
to have your server-side code push content to the connected clients as
it happens, in real-time.
SignalR also provides a very simple, high-level API for doing server
to client RPC (call JavaScript functions in your clients' browsers
from server-side .NET code) in your ASP.NET application, as well as
adding useful hooks for connection management, e.g. connect/disconnect
events, grouping connections, authorization.
(Excerpt from http://signalr.net/)
Hope it helps.
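A minimal sketch of what that looks like in practice, assuming ASP.NET SignalR 2.x; the hub name, the client-side method name, and where the new value comes from are all placeholders:

using Microsoft.AspNet.SignalR;

// An empty hub is enough if the server only pushes to clients.
public class WidgetsHub : Hub
{
}

public static class WidgetNotifier
{
    // Call this from wherever new data arrives on the server.
    public static void PushValue(string widgetId, decimal value)
    {
        var context = GlobalHost.ConnectionManager.GetHubContext<WidgetsHub>();
        // Invokes the JavaScript function "updateWidget" on every connected client.
        context.Clients.All.updateWidget(widgetId, value);
    }
}

On the page, the generated JavaScript proxy registers an updateWidget handler on the hub before starting the connection, and SignalR picks the best available transport (WebSockets, server-sent events, or long polling) automatically.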
I think your best bet is to periodically poll the server:
$(document).ready(function () {
    function getUpdate()
    {
        // Make an ajax call here
    }
    // Check for an update every 30 seconds.
    setInterval(getUpdate, 30000);
});
This will ask for an update every 30 seconds.
It depends on how often the data on the front end needs to be updated. Most pages aren't going to need constant updating. I don't know that there is a "best practice" threshold, but I think a good starting point would be 15-20 second updates using Ajax. Make your Ajax calls fast and lean - maybe just return blank if there are no updates. If you need faster updates than that, look into something called long polling. Long polling is basically where you trigger an Ajax call to the server, and the connection sits open until there is data to be sent. Long polling will take more server resources, because you will have open connections and threads running while they are waiting for data to be ready. With ASP.NET you'll also have to worry about killing long-polling threads, because by default those threads wouldn't be killed when the browser closes the connection (for example, if someone navigates away from the page).
You can also use WebSockets, if the page is running in a browser that supports HTML5.

Building a scalable ASP.NET MVC Web Application

I'm currently in the process of building an ASP.NET MVC web application in c#.
I want to make sure that this application is built so that it can scale out in the future without the need for major re-factoring.
I'm quite keen on using some sort of queue to post any database writes to, and having a process which polls that queue asynchronously to perform the update. Once this data has been posted back to the database, the client then needs to be updated with the new information. The implication here is that the process of writing the data back to the database could take a short while, based on business rules executing on the server.
My question is what would be the best way to handle the update from the client/browser perspective.
I'm thinking along the lines of posting the data back to the server, adding it to the queue, immediately sending a response to the client, and then polling at some frequency to get the updated data. Any best practices or patterns on this would be appreciated.
Also, in terms of reading data from the database, would you suggest any particular techniques, or would reading straight from the db be sufficient given my scenario?
Update
Thought I'd post an update on this, as it's been a while. We've actually ended up using Windows Azure, but the solution is applicable to other platforms.
What we've ended up doing is using a Windows Azure Queue to post messages/commands to. This is a very quick operation and returns immediately. We then have a worker role which processes these messages on another thread. This minimizes any db writes/updates on the web role and, in theory, allows us to scale more easily.
We inform the user via email, or even silently, depending on the type of data we are dealing with.
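A rough sketch of that pattern with the classic Azure Storage Queue API; the queue name, the message shape, the connection-string source, and the back-off interval are illustrative, not what we actually used:

using System;
using System.Threading;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public static class CommandQueue
{
    private static CloudQueue GetQueue()
    {
        var account = CloudStorageAccount.Parse(
            Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING"));
        var queue = account.CreateCloudQueueClient().GetQueueReference("commands");
        queue.CreateIfNotExists();
        return queue;
    }

    // Web role: drop the command on the queue and return to the client immediately.
    public static void Enqueue(string commandJson)
    {
        GetQueue().AddMessage(new CloudQueueMessage(commandJson));
    }

    // Worker role: call this in a loop from Run() to do the slow work off the web tier.
    public static void ProcessNext()
    {
        var queue = GetQueue();
        var message = queue.GetMessage();   // hidden from other workers while being processed
        if (message == null)
        {
            Thread.Sleep(500);              // nothing queued; back off briefly
            return;
        }

        // ... apply business rules and write to the database here ...

        queue.DeleteMessage(message);       // remove only after successful processing
    }
}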
Not sure if this helps, but why don't you have an auto-refresh on the page every 30 seconds, for example? This is sometimes how news feeds work on sports websites, saying the page will be updated every x minutes.
<meta http-equiv="refresh" content="120;url=index.aspx">
Why not let the user manually poll the status of the request? This is how your typical e-commerce app is implemented. When you purchase something online, the order is submitted to a queue for fulfillment. After it's submitted, the user is presented with a "Thank you for your order" page and a link where they can check the status of the order. The user can visit the link anytime to check the status, with no need for an auto-poll mechanism.
Is your scenario so different from this?
Sorry, in my previous answer I might have misunderstood. I was talking about a "queue" as something stored in a SQL DB, but on reading your post again it seems you may be talking about a separate message queueing component like MSMQ or JMS?
I would never put a message queue in the front end, between a user and backend SQL DB. Queues are good for scaling across time, which is suitable between backend components, where variances in processing times are acceptable (e.g. order fulfillment)... when dealing with users, this variance is usually not acceptable.
While I don't know if I agree with the logic of why, I do know that something like jQuery is going to make your life a LOT easier. I would suggest making a RESTful web API that your client-side code consumes. For example, you want to post a new order to the system and keep the client responsive? Make a POST to www.mystore.com/order/create and have that return the URI to access the new order (i.e. the order number) as a URI (www.mystore.com/order/1234). That response is then stored in the client code, and a jQuery call is set up to poll for a response or stop polling on an error (a rough sketch of this flow follows below).
For further reading check out this Wikipedia article on the concept of REST.
Additionally, you might consider the Reactive Extensions for .NET, and within that check out the RxJS sub-project, which has some pretty slick ways of handling the polling problem without making you write the polling code yourself. Fun things to play with!
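A rough sketch of that create-then-poll flow from a .NET client's point of view; the URLs and JSON body are the hypothetical ones from the answer above, and a real site would normally do this polling in browser JavaScript rather than in C#:

using System;
using System.Net.Http;
using System.Threading.Tasks;

class OrderPollingDemo
{
    static async Task Main()
    {
        using (var http = new HttpClient())
        {
            // Create the order; the API is assumed to return the new order's URI.
            var created = await http.PostAsync("http://www.mystore.com/order/create",
                                               new StringContent("{\"sku\":\"123\"}"));
            var orderUri = created.Headers.Location;   // e.g. http://www.mystore.com/order/1234

            // Poll the order resource, backing off between requests, until it settles.
            for (var attempt = 0; attempt < 10; attempt++)
            {
                var status = await http.GetStringAsync(orderUri);
                Console.WriteLine(status);
                if (!status.Contains("Pending")) break;   // finished (or failed); stop polling
                await Task.Delay(TimeSpan.FromSeconds(2));
            }
        }
    }
}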
Maybe you can add a "pending transactions" area to the UI. When you queue a transaction, add it to the user's "pending transactions" list.
When it completes, show that in the user's "pending transactions" list the next time they request a new page.
You can make a completed transaction stay listed until the user clicks on it, or for a predetermined length of time.

What is the best way to scale out work to multiple machines?

We're developing a .NET app that must make up to tens of thousands of small web service calls to a 3rd party web service. We would prefer a more 'chunky' call, but the 3rd party does not support it. We've designed the client to use a configurable number of worker threads, and through testing have code that is fairly well optimized for one multicore machine. However, we still want to improve the speed, and are looking at spreading the work across multiple machines. We're well versed in typical client/server/database apps, but new to designing for multiple machines. So, a few questions related to that:
Is there any other client-side optimization, besides multithreading, that we should look at that could improve the speed of an HTTP request/response? (I should note this is a non-standard web service, so it is implemented using WebClient, not a WCF or SOAP client.)
Our current thinking is to use WCF to publish chunks of work to MSMQ, and run clients on one or more machines to pull work off of the queue. We have experience with WCF + MSMQ, but want to be sure we're not missing better options. Are there other, better ways to do this today?
I've seen some 3rd party tools like DigiPede and Microsoft's HPC offerings, but these seem like overkill. Any experience with those products or reasons we should consider them over roll-our-own?
Sounds like your goal is to execute all these web service calls as quickly as you can, and get the results tabulated. Given that, your greatest efficiency control is going to be through scaling the number of concurrent requests you can make.
Be sure to look at your client-side connection limits. I believe the system default is 2 connections per host. I haven't tried this myself, but by upping the number of connections with this property, you should theoretically see a multiplier effect: more connections from a single machine means more requests in flight. There's more info on the MS forums.
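The property being referred to is most likely ServicePointManager.DefaultConnectionLimit (an assumption on my part, since the original link isn't preserved). Raising it once at startup, before issuing requests, looks like this:

using System.Net;

static class HttpTuning
{
    public static void Apply()
    {
        // Allow many simultaneous connections per host to the 3rd-party service
        // (the classic default for client apps is 2).
        ServicePointManager.DefaultConnectionLimit = 50;

        // Optional: skip the 100-continue handshake to save a round trip per POST.
        ServicePointManager.Expect100Continue = false;
    }
}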
The MSMQ option works well. I'm running that configuration myself. ActiveMQ is also a fine solution, but MSMQ is already on the server.
You have a good starting point. Get that in operation, then move on to performance and throughput.
At CodeMash this year, Wesley Faler did an interesting presentation on this sort of problem. His solution was to store "jobs" in a DB, then use clients to pull down work and mark status when complete.
He then pushed the whole infrastructure up to Amazon's EC2.
Here are his slides from the presentation - they should give you the basic idea:
I've done something similar with multiple PCs locally - the basics of managing the workload were similar to Faler's approach.
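A bare-bones version of that jobs-in-a-DB idea has each client machine claim work atomically, so two machines never grab the same row; this sketch assumes SQL Server and an invented Jobs(Id, Status) table, not anything from the presentation itself:

using System.Data.SqlClient;

static class JobStore
{
    // UPDATE ... OUTPUT claims one pending row and returns its Id in a single atomic statement.
    private const string ClaimSql = @"
        UPDATE TOP (1) Jobs
        SET Status = 'InProgress'
        OUTPUT inserted.Id
        WHERE Status = 'Pending';";

    public static int? ClaimNext(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(ClaimSql, connection))
        {
            connection.Open();
            var id = command.ExecuteScalar();    // null when nothing is pending
            return id == null ? (int?)null : (int)id;
        }
    }
}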
If you have optimized the code, you could look into optimizing the network side to minimize the number of packets sent (a short sketch follows the list):
reuse HTTP sessions (i.e. keep the connection open and send multiple transactions over it, which reduces TCP overhead)
reduce the number of HTTP headers in the request to the minimum to save bandwidth
if supported by the server, use gzip to compress the body of the request (balance the CPU cost of compression against the bandwidth you save)
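An illustrative sketch of the first and third points using HttpWebRequest; the URL, the payload, and the assumption that the server accepts gzip-compressed request bodies are all placeholders:

using System.IO.Compression;
using System.Net;
using System.Text;

static class CompressedPost
{
    public static HttpWebResponse Send(string url, string body)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "POST";
        request.KeepAlive = true;                        // reuse the TCP connection between calls
        request.Headers["Content-Encoding"] = "gzip";    // only valid if the server accepts gzipped bodies

        using (var requestStream = request.GetRequestStream())
        using (var gzip = new GZipStream(requestStream, CompressionMode.Compress))
        {
            var bytes = Encoding.UTF8.GetBytes(body);
            gzip.Write(bytes, 0, bytes.Length);
        }

        return (HttpWebResponse)request.GetResponse();
    }
}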
You might want to consider Rhino Service Bus instead of MSMQ. The source is available here.
