Here's an API server that can give me real-time news: every minute there will be something new to retrieve. There's also my web page, with a piece of JavaScript that asks the API for news once every minute.
And this is not fine... unless my web page is made for a single user and will only ever be open on one machine at a time (which is not how the internet works). The API, in fact, restricts the number of calls I can make per minute: let's suppose the API will ban me if I make more than 1 call per minute. If 100 users load my web page, the API will receive 100 calls per minute (!!!).
Since the flow is my web page >> calls >> the API, I think there is no solution without inserting another node that lazy-loads from the API server.
my web page >> calls >> my server >> calls every minute >> the API
Since there may be many instances of my web page while my server is just one, I think this is the solution.
However, I have no idea if:
a) Is this the correct solution? Or could I somehow get my web page to behave correctly without the need for an intermediary server?
b) How can I implement this in ASP.NET MVC 4? Is there any support for server-side timers?
c) Even if I can get IIS to retrieve the data every minute, should I then store it in a database in order to serve it to my web page?
d) The API server I'm talking about is The Times Newswire API. If anyone has ever used a similar API, did you really create a domain model, a database table, a service and a routine just to retrieve data from it, or did you just write some JavaScript code in your web page? What then if you have millions of users?
You can use SignalR for this purpose. It is a push service that works over sockets and can therefore be configured to send one message out to 1,000,000 listeners (or more, obviously).
I've used this to great effect when creating a little prototype game last year and found it to be very reliable. You can use NuGet to grab the package in VS2010 and VS2012.
See ASP.NET SignalR and its examples, or simply google SignalR and you'll find a host of examples.
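As a rough illustration of what this looks like on the server, here is a minimal sketch assuming the Microsoft.AspNet.SignalR package; the hub name, the receiveNews callback and the NewsBroadcaster helper are made up for the example:

```csharp
using Microsoft.AspNet.SignalR;

// Clients connect to this hub; the server pushes news out to all of them.
public class NewsHub : Hub
{
}

// Called from whatever server-side code fetches the external API once per minute.
public static class NewsBroadcaster
{
    public static void Push(string headline)
    {
        var context = GlobalHost.ConnectionManager.GetHubContext<NewsHub>();
        // "receiveNews" is simply the client-side callback name chosen for this sketch.
        context.Clients.All.receiveNews(headline);
    }
}
```

The one-call-per-minute fetch still has to happen somewhere on the server; SignalR only solves the fan-out to however many browsers are listening.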
the API will ban me if I do more than 1 call per minute
Then you need a solution that calls the API for you every minute and stores the result on your server. There are tons of ways of doing this, depending on your requirements. You could even go as far as writing out a static HTML file which you then serve to the client.
You should call the API from your server side. Fetch it once every minute into your database, then serve it to the users from your database via, for example, a RESTful service. ASP.NET MVC 4 can do all of this.
Also, yes, it does have a timer; check out the Timer class.
With this solution you don't have to struggle with the API restrictions.
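For example, a minimal sketch of the fetch side using System.Threading.Timer in Global.asax; the URL, the API key and the NewsStore helper are placeholders:

```csharp
using System;
using System.Net;
using System.Threading;
using System.Web;

// Global.asax.cs: poll the external API once per minute on the server.
public class MvcApplication : HttpApplication
{
    // Keep a reference so the timer is not garbage collected.
    private static Timer _newsTimer;

    protected void Application_Start()
    {
        _newsTimer = new Timer(FetchNews, null, TimeSpan.Zero, TimeSpan.FromMinutes(1));
    }

    private static void FetchNews(object state)
    {
        try
        {
            using (var client = new WebClient())
            {
                string json = client.DownloadString(
                    "http://api.example.com/newswire?api-key=YOUR_KEY"); // placeholder URL
                NewsStore.Save(json); // hypothetical helper that writes to your database
            }
        }
        catch (Exception)
        {
            // Log and ignore so one failed poll doesn't stop future polls.
        }
    }
}
```

A controller action can then read the latest row from the database and return it as JSON to the browsers.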
A) The best solution is probably to use your web server as an intermediary to the API.
B) There are a lot of possibilities to choose from. The easiest would be to start a new thread when the web application starts (i.e. in the Global.asax Application_Start event) and have that thread poll the external API and store the result for your clients to fetch (see the sketch after this answer). Another option, if you want full control over the lifecycle of this background process, would be to create a Windows service and host, for instance, a Web API in it that your clients can connect to.
Of course, these are just suggestions.
C) It depends on whether you want your clients to be able to access the latest fetched data even if the background process has failed and the data is old. Otherwise, you can just store it in a static field somewhere; as long as the application isn't terminated, the static field keeps its value.
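Combining B and C, here is a minimal sketch of a polling thread that refreshes a static field; the class name, the URL and the one-minute interval are illustrative only:

```csharp
using System;
using System.Net;
using System.Threading;

// Started once from Global.asax Application_Start: NewsCache.StartPolling();
public static class NewsCache
{
    // Latest payload; a static field lives as long as the application domain does.
    public static volatile string Latest = "";

    public static void StartPolling()
    {
        var worker = new Thread(() =>
        {
            while (true)
            {
                try
                {
                    using (var client = new WebClient())
                    {
                        Latest = client.DownloadString(
                            "http://api.example.com/newswire?api-key=YOUR_KEY"); // placeholder
                    }
                }
                catch (Exception)
                {
                    // Keep the previous value if a poll fails.
                }
                Thread.Sleep(TimeSpan.FromMinutes(1));
            }
        });
        worker.IsBackground = true; // don't keep the app domain alive on shutdown
        worker.Start();
    }
}
```

A controller action can then simply return NewsCache.Latest, so a thousand page loads cost zero extra calls to the external API.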
Related
I am writing an MVC webAPI that will be used to return values that will be bound to dropdown boxes or used as type-ahead textbox results on a website, and I want to cache values in memory so that I do not need to perform database requests every time the API is hit.
I am going to use the MemoryCache class and I know I can populate the cache when the first request comes in but I don't want the first request to the API to be slower than others. My question is: Is there a way for me to automatically populate the cache when the WebAPI first starts? I see there is an "App_Start" folder, maybe I just throw something in here?
After the initial population, I will probably run an hourly/daily request to update the cache as required.
MemoryCache:
http://msdn.microsoft.com/en-us/library/system.runtime.caching.memorycache.aspx
UPDATE
Ela's answer below did the trick, basically I just needed to look at the abilities of Global.asax.
Thanks for the quick help here, this has spun up a separate question for me about the pros/cons of different caching types.
Pros/Cons of different ASP.NET Caching Options
You can use the Global.asax application start method to initialize resources.
Basically, resources which will be used application-wide.
The following link should help you to find more information:
http://www.asp.net/web-forms/tutorials/data-access/caching-data/caching-data-at-application-startup-cs
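As a rough sketch of what that looks like with MemoryCache (the LookupRepository helper and the cache key are made up for the example):

```csharp
using System;
using System.Runtime.Caching;
using System.Web;

// Global.asax.cs: warm the cache before the first API request arrives.
public class WebApiApplication : HttpApplication
{
    protected void Application_Start()
    {
        var cache = MemoryCache.Default;
        var policy = new CacheItemPolicy
        {
            // The hourly/daily refresh job overwrites the entry; this is just a safety net.
            AbsoluteExpiration = DateTimeOffset.Now.AddHours(24)
        };
        cache.Set("dropdown-values", LookupRepository.GetDropdownValues(), policy);
    }
}
```

Controllers can then read MemoryCache.Default["dropdown-values"] instead of hitting the database on every request.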
Hint:
If you use in process caching (which is usually the case if you cache something within the web context / thread), keep in mind that your web application is controlled by IIS.
The standard IIS configuration will shut down your web application after 20 minutes if no user requests have to be served.
This means that any resources you have in memory will be freed.
After this happens, the next time a user accesses your web application, the Global.asax application start code will be executed again, because IIS reinitializes your web application.
If you want to prevent this behaviour, you can either configure the application pool idle timeout so it does not time out after 20 minutes, or use a different cache strategy (persistent cache, distributed cache...).
To configure IIS for this, here you can find more information:
http://brad.kingsleyblog.com/IIS7-Application-Pool-Idle-Time-out-Settings/
I have a web application with private/protected methods and private/protected variables.
First, I would like to know: when a web server already has a connection established for a certain web application and then receives a new connection, does it run a new instance of the web application for this new connection, thus re-initializing all the variables in that web application, just like a program launched again on a computer?
I have googled the Internet and I am terribly confused!
Second, I am using the Visual Studio development server and I have learned that it doesn't accept connections from other computers; I have gotten around this by using port-forwarding software. So the question is: by doing this, does the VS2010 web server see the different requests as new requests or as the same request, since I am forwarding them from an app on the local computer?
Finally, if I have the web application open in one browser and then decide to open it in another browser while keeping the first one open, is this treated as a new request or as a post-back?
The app domain is constant (though it can be recycled) and is created only on the first request (it can also be set up before that).
That is to say, all the static variables are initialized only once,
but all the non-static classes on which your request depends are initialized on every request.
So basically all your pages in classic ASP.NET and all the controllers in ASP.NET MVC are initialized on every request.
Read more about it here: http://www.codeproject.com/Articles/73728/ASP-NET-Application-and-Page-Life-Cycle
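To make the distinction concrete, here is a small illustration (the controller and field names are made up, and the counters are not thread-safe; illustration only):

```csharp
using System.Web.Mvc;

// The static counter is initialized once per application domain; the instance
// counter starts at zero on every request, because MVC creates a new
// controller instance for each request.
public class DemoController : Controller
{
    private static int _appDomainCounter; // shared across all requests
    private int _requestCounter;          // fresh for every request

    public ActionResult Index()
    {
        _appDomainCounter++;
        _requestCounter++;
        return Content(string.Format(
            "app domain: {0}, this request: {1}", _appDomainCounter, _requestCounter));
    }
}
```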
It's a little more complicated than that. The process is optimised for multiple connections and is stateless; however, caching can be used to improve scalability: that which does not need to be reprocessed can simply be reused. http://www.dotnetfunda.com/articles/article821-beginners-guide-how-iis-process-aspnet-request.aspx is a good place to start understanding what can go on, http://msdn.microsoft.com/en-us/library/bb470252%28v=vs.100%29.aspx is a somewhat dryer MS version, and "iis asp page life cycle" is a good Google search.
The web application instance handles many, many requests, and shared state (cache, etc.) is used very effectively across those requests, whether for a single session or for multiple concurrent sessions.
When a request is made, the request object (and any "page" / "controller" object) is created for that request. The state of this object is fresh, but systems like session state, view state, cookies, and request values can be used to repopulate it - sometimes largely automatically.
A single user making separate requests is not a post-back. They are separate sessions, but even a single session that opens the same page twice (tabs, etc.) is not a post-back. Whether something is a post-back mainly depends on the HTTP verb and other evidence.
You've got to read this great article: https://lowleveldesign.org/2011/07/20/global-asax-in-asp-net/ for your question. Though it's a little late, it may help others out.
I am currently developing an IRCX AJAX Chat based system and have a few questions regarding the Server and Client implementation; Any suggestions are welcome:
Server
Should this be implemented as a Web Service, or a Windows Form application? I have experience in developing Windows Forms based servers, however I am wondering if this would be better implemented as a Web Service and if so, why?
Client
How are Web Based Clients implemented today and what is the preferred method to implement a Web Based Client?
My solutions so far are:
ASP.NET Web Forms with an AJAX Update Panel (This seems the most viable)
Using jQuery connecting to the web service with a JavaScript timeout
Polling
How frequently should the server be polled for new messages? 0.5 seconds seems a bit excessive, and anything between 2 and 3 seconds feels sluggish.
Thanks for your input.
Have a pool of connections and maintain a sort of proxy between the server and the clients that sends the data to the right client based on a session ID. This would mean your chat server is protected against packet attacks, and you would not have to deal with WebSockets, which an attacker could hijack and do whatever they want with.
I know the question is old, but there's an even better approach now.
SignalR is designed for things like this (real time web functionality)
SignalR can be used to add any sort of "real-time" web functionality to your ASP.NET application. While chat is often used as an example, you can do a whole lot more. Any time a user refreshes a web page to see new data, or the page implements Ajax long polling to retrieve new data, it is a candidate for using SignalR.
Here's a tutorial for a basic chat application HERE.
For more information, visit the SignalR website.
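Roughly, the hub at the heart of such a chat application looks like the sketch below (assuming the Microsoft.AspNet.SignalR package; the method and callback names are illustrative):

```csharp
using Microsoft.AspNet.SignalR;

public class ChatHub : Hub
{
    // A client calls Send(...); the hub pushes the message to every connected client.
    public void Send(string name, string message)
    {
        // "broadcastMessage" is the JavaScript callback name the clients register.
        Clients.All.broadcastMessage(name, message);
    }
}
```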
I believe using ASP.NET (sockets and an UpdatePanel) seems to be the best approach. Using jQuery in this context now seems a bit invalid because it would not maintain a persistent connection with the chat server, which is required for real-time communication.
An alternative way I found would be to use WebSockets and Backbone.js to deal with the data returned from the server.
http://blog.fogcreek.com/the-trello-tech-stack/
I have a web part in MOSS 2007 that iterates through a given list using the SharePoint API and sets a field value to some predetermined value. Basically, it bulk-edits a list.
For small lists, the web part works great. But when I am dealing with lists that have a large number of items, I get an internal error or a request timeout.
Is there a way that I can click the button and have the actions performed asynchronously in the background, so that the submitter doesn't have to wait for the operation to complete, or can come back for the result later?
How would I do this?
Thanks in advance; I need all the help I can get.
PS. There is no .aspx page in the project, just a straight-up class that has CreateChildControls and other functions.
PPS. I have very little control over the web.config or anything like that on the server itself.
You could build a custom timer job that performs the bulk editing, and have the web part just add items for the timer job to process (a rough sketch follows).
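A minimal sketch of such a job, assuming the standard SPJobDefinition base class; the site URL, list name, field and value are placeholders:

```csharp
using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

// Runs on the SharePoint timer service instead of inside the page request,
// so long-running bulk edits no longer hit the request timeout.
public class BulkEditJob : SPJobDefinition
{
    public BulkEditJob() : base() { }

    public BulkEditJob(string jobName, SPWebApplication webApp)
        : base(jobName, webApp, null, SPJobLockType.Job) { }

    public override void Execute(Guid targetInstanceId)
    {
        using (SPSite site = new SPSite("http://server/sites/demo")) // placeholder URL
        using (SPWeb web = site.OpenWeb())
        {
            SPList list = web.Lists["BulkEditQueue"]; // placeholder list name
            foreach (SPListItem item in list.Items)
            {
                item["Status"] = "Processed"; // the predetermined value
                item.Update();
            }
        }
    }
}
```

The web part then only writes the requested edits to the queue list; the timer job picks them up on its next run.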
If you can deploy a web service to the farm, you could rewrite your web part to use AJAX to call into the web service to perform the work.
Another option would be to have your web part spawn background threads, but I would be very wary of this option. Not only is managing threads difficult, but they will be running in SharePoint's IIS worker process thus increasing the opportunity for your web part to bring down the site.
I would like to set up a timer that fetches the source of some page and compares it with yesterday's version to see whether anything has changed.
What logic should I use to call a page once a day? My program is just a web page, and a web page can't send requests every 24 hours on its own. How can I send the request and check whether anything has changed?
You don't want to use a web application to do this, since a web application typically responds to requests and doesn't wake up and make requests of its own (which is what you need).
What you need is a regular .NET application. It could be a console application that makes the call out to this other website. You could use the WebClient class or similar to do the job.
Once you have it all working, you can use Windows Scheduler to schedule the task at whatever interval you need.
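A minimal sketch of such a console application; the URL and the snapshot path are placeholders:

```csharp
using System;
using System.IO;
using System.Net;

// Download the page, compare it with yesterday's copy on disk, then save
// today's copy. Schedule this exe once a day with Windows Scheduler.
class PageChecker
{
    static void Main()
    {
        const string url = "http://www.example.com/page-to-watch";   // placeholder
        const string snapshotPath = @"C:\PageChecker\yesterday.html"; // placeholder

        string today;
        using (var client = new WebClient())
        {
            today = client.DownloadString(url);
        }

        string yesterday = File.Exists(snapshotPath)
            ? File.ReadAllText(snapshotPath)
            : null;

        if (yesterday != null && yesterday != today)
        {
            Console.WriteLine("The page has changed since yesterday.");
            // e.g. call a page or handler on your own web application here
        }

        Directory.CreateDirectory(Path.GetDirectoryName(snapshotPath));
        File.WriteAllText(snapshotPath, today);
    }
}
```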
If you then need to communicate with your web application from the console app, you can do the same thing: make a request to a specific page or handler from your console app (just like you called a page on the other website). When your web application receives the request, you can act on it and do whatever you need to.
Keep in mind that if all you need to do is update some database tables, you might as well do that from the console application.
Think of your console app (which uses WebClient) as a custom browser. You can call any URL you need to, and the web application on the other end sees that call as if it were made by a browser. So, using your specialized "browser", you can call other web sites as well as your own.