Postback via web service - C#

I work for a radio station and they want a "now playing" and "coming next" display on their radio player.
I currently have XML that I will be saving to my live web server via a web service. Is there a way to get the radio player to post back only when the XML changes, rather than every second via a timer control? We have had performance issues with frequent postbacks in the past, so I would like to avoid crashing our server. I am using C# ASP.NET 3.5.

Personally I would recommend doing this with AJAX. You can then cache the response on the server side so you don't need to do any heavy lifting for the bulk of the requests.
What you want is for your AJAX code to query the server every 10 seconds or so to get the current and next song as a JSON response, then put the response into your page. A sketch of the cached server side follows.
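A minimal sketch of that cached server side, assuming ASP.NET 3.5 with a plain .ashx handler; the handler name and the LoadNowPlayingJson helper are illustrative stand-ins for however the saved XML is actually read:

using System;
using System.Web;
using System.Web.Caching;

// NowPlaying.ashx code-behind: serves the current/next track as JSON and
// keeps the rendered response in the ASP.NET cache for 10 seconds, so
// polling clients are served from memory instead of re-reading the XML.
public class NowPlayingHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string json = (string)HttpRuntime.Cache["nowPlaying"];
        if (json == null)
        {
            json = LoadNowPlayingJson(); // hypothetical helper, see below
            HttpRuntime.Cache.Insert("nowPlaying", json, null,
                DateTime.UtcNow.AddSeconds(10), Cache.NoSlidingExpiration);
        }

        context.Response.ContentType = "application/json";
        context.Response.Write(json);
    }

    // Stand-in for reading the saved XML and projecting it to JSON.
    private static string LoadNowPlayingJson()
    {
        return "{\"current\":\"Song A\",\"next\":\"Song B\"}";
    }

    public bool IsReusable
    {
        get { return true; }
    }
}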
In ASP.NET you could also do this with an UpdatePanel, but that would cause significantly more load on both the server and the client.

A Comet (server-push) solution or ETags may help: with an ETag, a poll for unchanged data can be answered with a cheap 304 Not Modified.
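A minimal sketch of the ETag variant; the handler name and the App_Data path are assumptions, not part of the question:

using System;
using System.Web;

// Derives an ETag from the XML file's last-write time and answers
// 304 Not Modified when the client already has the current version,
// so polls for unchanged data cost almost nothing.
public class NowPlayingETagHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string path = context.Server.MapPath("~/App_Data/nowplaying.xml"); // assumed location
        string etag = "\"" + System.IO.File.GetLastWriteTimeUtc(path).Ticks + "\"";

        if (context.Request.Headers["If-None-Match"] == etag)
        {
            context.Response.StatusCode = 304; // client copy is still current
            return;
        }

        context.Response.ContentType = "text/xml";
        context.Response.AppendHeader("ETag", etag);
        context.Response.WriteFile(path);
    }

    public bool IsReusable
    {
        get { return true; }
    }
}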

WebBrowser control: do not download images

I am automating the process of downloading my bank statement. I do this with a WinForms WebBrowser control: I navigate to https://www.bankofamerica.com/, find the username and password textboxes in the DOM, fill them in from C#, send a click event to the submit button, and so on. Eventually I get to the statement I want; when it is ready I just parse the page source.
The process works, but it is very slow. In summary, I would like to improve the performance of this process. These are the things I am considering:
1. Use Fiddler to see the requests and responses, hoping I could automate the same process directly. (The problem with this approach is that the connection is encrypted, I also have to set cookies, and I believe it would be too complicated to do it this way.)
2. Prevent the WebBrowser control from downloading images and CSS. That way the DocumentCompleted event will fire earlier and the process could be faster.
I would rather go with option number 2, because I know very little about Fiddler and only the basics of HTTP. How can I speed up this process?
It's trivial to capture encrypted traffic with Fiddler; simply enable the "Decrypt HTTPS connections" option.
It's also easy to disable downloading of images in the WebBrowser control using the ambient DLCONTROL flags. See http://www.tech-archive.net/Archive/InetSDK/microsoft.public.inetsdk.programming.webbrowser_ctl/2009-01/msg00035.html for an example, and the sketch below.
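A hedged C# sketch of that technique, following the pattern in the linked post (the class names are mine): the browser's site answers MSHTML's ambient DISPID_AMBIENT_DLCONTROL query, and the returned flag value deliberately omits the DLCTL_DLIMAGES/DLCTL_VIDEOS/DLCTL_BGSOUNDS bits so those resources are never fetched, while scripts stay enabled for the login flow.

using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

public class TextOnlyWebBrowser : WebBrowser
{
    protected override WebBrowserSiteBase CreateWebBrowserSiteBase()
    {
        return new DlControlSite(this);
    }

    // Declaring an IUnknown-style interface with IDispatch's IID lets the
    // site intercept MSHTML's QueryInterface for IDispatch.
    [ComImport, Guid("00020400-0000-0000-C000-000000000046"),
     InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
    private interface IDispatch
    {
        [PreserveSig] int GetTypeInfoCount(out int count);
        [PreserveSig] int GetTypeInfo(int iTInfo, int lcid, out IntPtr typeInfo);
        [PreserveSig] int GetIDsOfNames(ref Guid riid,
            [MarshalAs(UnmanagedType.LPArray, ArraySubType = UnmanagedType.LPWStr)] string[] names,
            int cNames, int lcid,
            [Out, MarshalAs(UnmanagedType.LPArray)] int[] dispIds);
        [PreserveSig] int Invoke(int dispId, ref Guid riid, int lcid, int flags,
            ref System.Runtime.InteropServices.ComTypes.DISPPARAMS dispParams,
            IntPtr varResult, IntPtr excepInfo, IntPtr argErr);
    }

    private class DlControlSite : WebBrowserSite, IDispatch
    {
        private const int DISPID_AMBIENT_DLCONTROL = -5512;
        private const int E_NOTIMPL = unchecked((int)0x80004001);
        private const int DISP_E_MEMBERNOTFOUND = unchecked((int)0x80020003);

        public DlControlSite(WebBrowser host) : base(host) { }

        int IDispatch.Invoke(int dispId, ref Guid riid, int lcid, int flags,
            ref System.Runtime.InteropServices.ComTypes.DISPPARAMS dispParams,
            IntPtr varResult, IntPtr excepInfo, IntPtr argErr)
        {
            if (dispId == DISPID_AMBIENT_DLCONTROL && varResult != IntPtr.Zero)
            {
                // 0 = no DLCTL_DLIMAGES (0x10), DLCTL_VIDEOS (0x20) or
                // DLCTL_BGSOUNDS (0x40) bits, so those are skipped; scripts
                // still run because DLCTL_NO_SCRIPTS (0x80) is not set.
                Marshal.GetNativeVariantForObject(0, varResult);
                return 0; // S_OK
            }
            return DISP_E_MEMBERNOTFOUND;
        }

        int IDispatch.GetTypeInfoCount(out int count) { count = 0; return E_NOTIMPL; }
        int IDispatch.GetTypeInfo(int iTInfo, int lcid, out IntPtr typeInfo)
        { typeInfo = IntPtr.Zero; return E_NOTIMPL; }
        int IDispatch.GetIDsOfNames(ref Guid riid, string[] names, int cNames,
            int lcid, int[] dispIds) { return E_NOTIMPL; }
    }
}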

Ajax or SignalR for order screen

Currently I have an application in which a user stares at a dashboard; the dashboard displays new orders coming in for that user. I have rolled the application out for testing and most users are complaining of time delays and crashing.
Currently I am using jQuery and Ajax: a setInterval() timer fires an Ajax call that gets the orders and updates the screen every 30 seconds. However, when there are a lot of orders, the Ajax calls become overlapped.
I have stumbled across a technology that is new to me, SignalR, which seems like the solution, but I have looked at the examples and have not seen any comments on performance.
Question: what is the performance like, and would it be a better solution than the above? Also, is it possible to configure it to target only a specific user, using the currently logged-in user's ID? I am using MVC4.
Any comments would be appreciated,
Thanks
You can use SignalR to send messages to a specific client/user (see the sketch below). Take a look at the documentation at https://github.com/SignalR/SignalR/wiki to find out more. SignalR lets you make remote procedure calls (RPCs) from the client and the server, and you should be able to push any notifications/new orders to the client almost immediately.
There aren't any published performance metrics for SignalR yet, but you can test it out to see how efficiently your traffic is handled.
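A sketch of the user-targeting idea against the SignalR 1.x API (the MVC4-era releases); OrdersHub, OrderNotifier and the client-side newOrder handler are illustrative names, not part of the question. Each connection joins a group named after the logged-in user, so the server can push to exactly that user:

using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;

public class OrdersHub : Hub
{
    public override Task OnConnected()
    {
        // Group-per-user: all of this user's tabs/devices share one group.
        Groups.Add(Context.ConnectionId, Context.User.Identity.Name);
        return base.OnConnected();
    }
}

public static class OrderNotifier
{
    // Call this from the code that records a new order.
    public static void NotifyNewOrder(string userName, object order)
    {
        var context = GlobalHost.ConnectionManager.GetHubContext<OrdersHub>();
        context.Clients.Group(userName).newOrder(order); // invokes the client-side handler
    }
}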

Memory Usage in Browsers Keeps Growing - IIS 7.5 ASP.NET Apps

I have an ASP.NET/C# web application hosted under IIS 7.5 on Server 2008 64-bit; my application is built as 32-bit. The viewstate of my pages is very large (1 MB to 4 MB).
The problem I am seeing is that when I keep using the website for a period of time, the browser's memory usage keeps growing, up to 50 MB. I am not sure where I should start looking for the problem. I have a ScriptManager in my master page and a ScriptManagerProxy on my child pages; I am not sure if that contributes to this.
Where could the problem be? Any help would be appreciated.
Your issue is that you're storing too much on the client side (in viewstate or cookies). For each request the user makes, their browser uploads all of that to your web server, so even though the server can process the request quickly, it takes a while for a 4 MB request to be uploaded. This isn't a memory leak; it is a flaw in how much you store in the client's browser. Perhaps on each page load you are adding the same large data to the client's viewstate or cookies, so after a while the requests get so large that the wait becomes noticeable.
To pin down the problem, monitor the viewstate and cookies for each page request from one client and see exactly what grows; then you should be able to identify the cause.
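One cheap way to do that monitoring, assuming WebForms (the page class name is illustrative; __VIEWSTATE is the standard hidden-field name):

using System;
using System.Web.UI;

public partial class OrdersPage : Page // hypothetical page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Log how big the posted viewstate is on every postback; if this
        // number climbs request after request, something is accumulating.
        string vs = Request.Form["__VIEWSTATE"];
        if (vs != null)
            System.Diagnostics.Debug.WriteLine(
                "ViewState size: " + vs.Length + " chars");
    }
}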
The viewstate is just a hidden field in the HTML that is POSTed to the server when the form on the page is submitted.
The browser saves the posted data for the session so that it can replay/repost it on request (i.e. if the user goes back and then forward, it will prompt the user to resubmit the form/post data).
You might be able to trick the browser into not keeping that data by setting max-age to 0 and Cache-Control to no-cache in the response headers; a snippet follows.
HOWEVER, as pointed out in the comments, a viewstate over 1 MB is not a desirable situation to be in. You would be better off storing that data in the server-side Session.
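A sketch of those headers from a page's code-behind (the page class name is illustrative); whether the browser honours them for back/forward navigation varies:

using System;
using System.Web;
using System.Web.UI;

public partial class HeavyPage : Page // hypothetical page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        Response.Cache.SetCacheability(HttpCacheability.NoCache); // Cache-Control: no-cache
        Response.Cache.SetMaxAge(TimeSpan.Zero);                  // max-age=0
        Response.Cache.SetNoStore();                              // ask the browser not to keep the body
    }
}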
There are a couple of things you could do as a quick fix.
1) Disable the viewstate on controls that don't need it:
http://msdn.microsoft.com/en-us/library/system.web.ui.page.enableviewstate.aspx
or
<asp:AnyControl EnableViewState="false"/>
2) Enable tracing for the application to see where viewstate is being used, then disable it accordingly/find the leak:
http://msdn.microsoft.com/en-us/library/wwh16c6c.aspx

Consuming WCF Service in ASP.NET, how to cache?

I have two situations in this case:
I want to query a WCF service and hold the data somewhere, because one of the web pages renders based on the data retrieved from the service. I don't want the page itself querying the service; I'd rather have some sort of scheduled worker that runs every couple of minutes, retrieves the data, and holds it somewhere.
Where should I cache the service response, and what is the correct way to create the task that queries the service every couple of minutes?
I think I could achieve this by saving the response in a static variable, alongside the last query time, and then checking on page load whether enough time has passed: if so, I call the service and refresh the data; otherwise I use the static cache.
This would also cover the case where no users access the page for a long time, so the site isn't futilely querying the service.
But it seems kind of rough. Are there other, better ways to accomplish this kind of task?
You could indeed take another approach, like having a scheduled program query the information and put it in an in-memory cache available to all the web servers in your farm. Whether that is better for your scenario depends on the size of your app and how much time/effort you want to spend on it.
An in-memory cache is harder to implement/support than a static variable, but it's sometimes better, since static variables can be cleared whenever the application recycles (e.g. after X minutes of inactivity).
Depending on the size of your system, I would start with the static variable (a sketch follows), test-drive the approach for a while, and then decide if you need something more sophisticated.
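A minimal sketch of that static-variable approach, assuming a generated WCF client called MyServiceClient with a GetData() operation returning MyData (all three names are placeholders from the service reference, not real APIs):

using System;

public static class ServiceCache
{
    private static readonly object Sync = new object();
    private static DateTime _lastFetchUtc = DateTime.MinValue;
    private static MyData _data;

    public static MyData Get()
    {
        lock (Sync) // serialize refreshes
        {
            // Refresh at most every five minutes, and only when someone asks,
            // so an idle site never queries the service.
            if (DateTime.UtcNow - _lastFetchUtc > TimeSpan.FromMinutes(5))
            {
                using (var client = new MyServiceClient()) // placeholder WCF client
                {
                    _data = client.GetData();               // placeholder operation
                }
                _lastFetchUtc = DateTime.UtcNow;
            }
            return _data;
        }
    }
}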
Have you taken a look at Velocity (Microsoft's distributed in-memory cache, later released as AppFabric Caching)?
Nico: Why don't you write a simple console daemon that gets the data and stores it in a database on your end, and then have your web app read from your local copy? You can schedule that console app to run at whatever interval you need. Inserting the data should not be a problem if you are using SQL Server 2008: you can pass a DataTable to a stored procedure as a table-valued parameter and insert a whole table in one call (see the sketch below). If you don't use SQL Server 2008, serialize the whole collection returned by the web service, store it in one big blob column along with a timestamp of when you got the data, then read that column back, deserialize the collection, and reconstruct the native objects for display on your page.
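A sketch of the table-valued-parameter call; dbo.InsertOrders and dbo.OrdersType are assumed to exist on the SQL Server 2008 side (CREATE TYPE dbo.OrdersType AS TABLE (...)):

using System.Data;
using System.Data.SqlClient;

public static class OrderStore
{
    // Inserts a whole DataTable in one round trip via a table-valued parameter.
    public static void SaveAll(DataTable rows, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.InsertOrders", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            SqlParameter p = cmd.Parameters.AddWithValue("@Rows", rows);
            p.SqlDbType = SqlDbType.Structured; // table-valued parameter
            p.TypeName = "dbo.OrdersType";      // the user-defined table type
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}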
I've never seen (and I don't think it's possible) a web app that queries the web service on its own every so often. Imagine the site is idle for hours, with no interaction from anybody: no events fire, so nothing gets queried.
Alternatively, you could create a dummy page that executes a JavaScript function at set intervals and have it make an Ajax request to the server, which then fetches the data from the web service and caches it. The problem is that the minute you navigate away from that page, nothing happens and you stop querying the web service. I think this is silly.

C# observer pattern message filtering

I'm writing a video cms and want all my users to have new assets displayed immediately as they come in.
If I'm right, Facebook updates its wall page in real time. So when I post something to a friend, it immediately shows up on their wall. The real-time web, as they say.
I wonder how you do that? Not the technology of client-server-communication, but what goes on on the server.
I understand the principles of the observer-pattern.
But a wall is in fact a query on a table of messages.
How does the observer know what query a user is interested in?
Does it hold all the queries of all connected users and rerun them when something new comes in?
I believe Google real-time search works that way too.
Thank you for helping me out.
When you open Facebook, open the script timeline in your browser's developer tools to see what is executing on the page. You'll notice a polling script running several times a second: the page checks the server-side cache that often to see whether there is any new information to display.
http://www.ajaxwith.com/Poll-vs-Push-Technology.html - this should give you background on the subject.
Facebook uses AJAX and a JavaScript timer that polls in the background looking for anything that's changed. Other sites use the same kind of functionality to update stock quotes embedded in a page, and so on. It isn't truly immediate; it updates as frequently as the JavaScript timer hits the server. That is because browsers speak HTTP, a request/response protocol: a browser won't display anything that isn't a direct response to a request it initiated, so there's no way to push content straight from your web server to the browser.
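On the server-side filtering question above, a minimal C# sketch of one way an observer can know which "query" each user cares about: every subscriber registers a predicate (its query) plus a callback, and the publisher reruns each predicate against every new asset, notifying only the matches. All names here are illustrative.

using System;
using System.Collections.Generic;

public class Asset
{
    public string Owner { get; set; }
    public string Title { get; set; }
}

public class AssetFeed
{
    // Each subscription pairs a filter (the user's "query") with a callback.
    private readonly List<KeyValuePair<Predicate<Asset>, Action<Asset>>> _subscribers =
        new List<KeyValuePair<Predicate<Asset>, Action<Asset>>>();

    public void Subscribe(Predicate<Asset> filter, Action<Asset> onMatch)
    {
        _subscribers.Add(
            new KeyValuePair<Predicate<Asset>, Action<Asset>>(filter, onMatch));
    }

    public void Publish(Asset asset)
    {
        // Rerun every stored query against the new item; notify matches only.
        foreach (KeyValuePair<Predicate<Asset>, Action<Asset>> s in _subscribers)
            if (s.Key(asset))
                s.Value(asset);
    }
}

// Usage: a user "subscribes" to a friend's wall.
// var feed = new AssetFeed();
// feed.Subscribe(a => a.Owner == "alice",
//                a => Console.WriteLine("New on Alice's wall: " + a.Title));
// feed.Publish(new Asset { Owner = "alice", Title = "Holiday video" });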
