My goal is for my website's overall response time to feel instantaneous.
The problem is that I do not have IIS access; my website is hosted by an external service and I have no access to the IIS panel.
My current approach is a piece of scheduled code that keeps my website alive. The problem with this approach alone is that the hosting service shuts down all of its hosted websites every few hours.
This is why I need to implement another approach: warming up / pre-loading the website each time it runs.
How can I do this when there is no access to the IIS panel?
The solution requires no third-party sites, robots, or apps; you merely write a very simple app yourself that periodically performs a trivial web function, perhaps a REST GET. By performing this call every few minutes, you not only guarantee that the IIS app pool won't time out and be cold for a client, but you also ensure your website is up and running in a warm condition (JIT-compiled and running), ready for real, non-heartbeat requests.
e.g.
In your website, expose a REST API, say www.misspiggy.com/api/hiyaaaa, that does nothing other than return HTTP 200 OK (a minimal sketch follows the list below).
By implementing this in your ASP.NET app, any request to the above URL will cause your stopped or cold ASP.NET website to be JIT'd during:
first deployment (and even then only when a request is made to it)
after the IIS AppPool has timed out and needs to restart on demand
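One way to sketch such a heartbeat endpoint is with ASP.NET Web API; the controller and route names below are illustrative, not from the original post:

using System.Web.Http;

// Hypothetical heartbeat controller; any request to it forces a cold site
// to spin up and JIT before returning 200 OK.
public class HiyaaaaController : ApiController
{
    // GET api/hiyaaaa
    [HttpGet]
    public IHttpActionResult Get()
    {
        return Ok();
    }
}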
The client code that makes the REST request can be anything:
a console app
a Windows service
WinForms/WPF app
The console app can be triggered via Windows Task Scheduler, say every 5 minutes, saving you the hassle of building in a scheduler.
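For example, a minimal console pinger might look like the following sketch (the URL is illustrative; schedule the compiled exe with Task Scheduler):

using System;
using System.Net;

// Minimal heartbeat client; the URL below is illustrative.
// Schedule this executable with Windows Task Scheduler, e.g. every 5 minutes.
class Heartbeat
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            // Any trivial GET is enough to keep the app pool warm.
            client.DownloadString("http://www.misspiggy.com/api/hiyaaaa");
            Console.WriteLine(DateTime.Now + ": heartbeat OK");
        }
    }
}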
My current approach is a piece of scheduled code that keeps my website alive. The problem with this approach alone is that the hosting service shuts down all of its hosted websites every few hours
I suggest you set your ping period to be a matter of minutes rather than hours.
No admin access to server required
The problem is that I do not have IIS access; my website is hosted by an external service and I have no access to the IIS panel
It should be pointed out that this solution does not require you to install anything new on the server, nor make any changes to it.
Azure
It is interesting to note that Azure Application Insights has Availability Tests that, though designed for testing website availability, can be used for this exact purpose of keeping your website alive and warm, ready for web clients. In fact this is what I do for my web apps.
Doing so keeps response times and latency as low as possible.
There are a number of things you can do, but a really simple solution is to use a website monitoring service such as StatusCake or Uptime Robot; there are a large number of them out there. You set them up to call a page or pages on your website at set intervals to ensure it is still up, which has the added bonus of keeping the site warm.
I would also precompile your MVC app if you aren't already doing that.
HTH
Related
Currently working on a .NET solution for an application server. I'm using .NET 4.0 running on Windows Server 2008 R2 with IIS 7.5.
My requirements are:
The application server can run multiple Console applications at once on a schedule - Quartz.net looks like a really good solution to this problem - and is working well for me so far
The application server will also host a web application that will report on jobs (what time they ran, what they did, how long they took etc)
I would like to be able to restart the "service" that is running my jobs and trigger ad hoc jobs from the web interface.
The Service that is running my jobs needs to run all the time
Once this is live I will not have direct access to the machine to restart a Windows Service, but I could potentially set up IIS to be able to do this for me.
WCF services look quite promising to me, but I'm not sure where to host one. My current project uses a WCF service to run console applications using the Quartz.net plugin. Configuration for what to run and when to run it is stored in an Oracle database, and my WCF service connects directly to the database to retrieve this information (not sure if that is the intended use of WCF).
If I host the WCF service in IIS / WAS, then running the console applications might be a security concern from what I've read. I can keep the WCF service running all the time using AppFabric at least. Alternatively I could host it in a Windows Service and allow my web app to consume the WCF service to report on the jobs. I'm concerned about using a Windows Service though, as I won't have direct maintenance access to this machine and if it breaks I'm in trouble. I would really like to be able to do the maintenance from a web application. A Windows Service also feels a little unnecessary given it can be hosted from IIS.
So my question is - is a WCF Service the right approach to this problem or can anyone think of a better approach?
If a WCF service is a good approach - where should I host it so that I can perform maintenance via a web interface given I will not have direct access to the machine itself?
Should the WCF service be the one to start and schedule the jobs?
I think you're overengineering it, possibly.
The Problem: You have a web site which needs to start up jobs on an ad-hoc basis. Other jobs will be run to a fixed schedule. The web site will report on all/any of these jobs.
For running the scheduled jobs, a Windows Service using Quartz is indeed an ideal solution for the fixed-schedule part. However, to report on those jobs the data must be collected by the Service and made available. A service can be set up to restart on failure, so you can guarantee that it will always be running (barring a minute or two while it restarts after a failure, and why should it fail?). However, any history will be lost unless the Service stores it somewhere it can be retrieved after a restart.
The simpler solution for the web site to get the history is for the Service to write its data to a database. Then a restart doesn't matter: all the history has already been saved, and the data can be read by the web site at any time.
Similarly, if the web site talks directly to the Service (as a WCF Service or otherwise), then what happens if the service is not currently running? The request fails until the restart is complete. Frustrating for the user. Alternatively, the web site puts the request into the database. The service monitors the database for requests and starts jobs appropriately when it sees a new request. If a request is written while the service is not running, the service will see the request(s) in the DB when it restarts and execute them.
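To make the idea concrete, here is a minimal sketch of the polling side under stated assumptions: the table and column names (JobRequests, Id, JobName, Status) are hypothetical, SqlClient is used for brevity even though the original question mentions Oracle, and the real service would hand the work to Quartz.NET and mark the row as processed.

using System;
using System.Data.SqlClient;

// Hypothetical polling step run by the Windows Service on a timer.
class RequestPoller
{
    public static void PollOnce(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            var cmd = new SqlCommand(
                "SELECT Id, JobName FROM JobRequests WHERE Status = 'Pending'", conn);
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    int id = reader.GetInt32(0);
                    string jobName = reader.GetString(1);
                    Console.WriteLine("Triggering ad-hoc job " + id + ": " + jobName);
                    // Hand off to the scheduler here, then mark the row as processed.
                }
            }
        }
    }
}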
So I think using a WCF service is overkill, and actually introduces some problems: persistence of history, and what to do about requests made while the service is down. These problems don't arise if you go the way I've described.
Cheers -
How can I host a Windows Service in IIS and keep that service running as if it were running on Windows?
Could I use some feature of WCF services?
I have no access to Windows itself, only to IIS. Inside that service I'll create a thread which, at a scheduled time, will process some data.
In short, you can't.
A more detailed answer is that there are 2 problems:
IIS worker processes are launched only when an HTTP request comes in. This means you can't start your service with the system.
IIS worker processes are recycled (i.e. restarted) under several conditions. For example, a worker process is restarted if no HTTP request comes in for a long time. This means you can't control when your service is shut down, unless you have access to the application pool recycling configuration. Keep in mind that the recycling logic only ensures that all pending HTTP requests are complete; it does not wait for background threads to complete.
You can come up with a partial solution this way:
Create a WCF service method that checks whether your long-running thread is alive and, if not, starts it (a sketch follows after this list).
Create a very simple Windows service that periodically (say, every 5 seconds) calls that method. Deploy the service somewhere, e.g. on your own machine.
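Under stated assumptions, the check-and-restart method might look like this; WorkerHost, EnsureRunning and DoWork are hypothetical names, and EnsureRunning would be exposed through your WCF service contract:

using System;
using System.Threading;

// A minimal sketch of the "check and restart the worker thread" idea.
public static class WorkerHost
{
    private static Thread _worker;
    private static readonly object _sync = new object();

    // Call this from the WCF keep-alive method.
    public static void EnsureRunning()
    {
        lock (_sync)
        {
            if (_worker == null || !_worker.IsAlive)
            {
                _worker = new Thread(DoWork) { IsBackground = true };
                _worker.Start();
            }
        }
    }

    private static void DoWork()
    {
        while (true)
        {
            // The long-running, scheduled processing goes here.
            Thread.Sleep(TimeSpan.FromMinutes(1));
        }
    }
}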
The only question that remains is: do you really need to avoid Windows services? Could you find a place to host the service? There are some use cases where a Windows service is the best or even the only way.
In a nutshell, you can't.
However, you can make use of the health monitoring API, specifically the heartbeat functionality. See:
http://msdn.microsoft.com/en-us/library/system.web.management.webheartbeatevent.aspx
for details on the class you will need to implement in order to be called when there is work to do.
This answer on SO might also help:
Understanding heartbeat in ASP.NET health monitoring
Once you have implemented a WebHeartbeatEvent-derived class, you can check your DB or whatever else you need to see if there is work to do.
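As a rough sketch of consuming the heartbeat, this example assumes the custom-provider route (a WebEventProvider registered under <healthMonitoring> in web.config with a rule mapping the heartbeat event to it; that configuration is not shown, and the work-checking logic is a placeholder):

using System.Web.Management;

// Hypothetical provider that reacts to ASP.NET's periodic heartbeat event.
public class HeartbeatWorkProvider : WebEventProvider
{
    public override void ProcessEvent(WebBaseEvent raisedEvent)
    {
        if (raisedEvent is WebHeartbeatEvent)
        {
            // Check your DB (or whatever else) for pending work here.
        }
    }

    public override void Flush() { }
    public override void Shutdown() { }
}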
A better solution, IMHO, is to scrap the service entirely and redesign the system to be 100% web based, as services become a deployment and maintenance nightmare... as I assume you are now finding out.
I am right now in the middle of creating a new web application for Azure.
I've noticed that if I do not visit the site for a while (30+ minutes), it takes a while to load (20+ seconds) on my next visit. I presume this is because Azure has to go through and compile the application. Is there a way to either prevent the application from having to be compiled after idling for an extended amount of time, or to somehow pre-compile the web application locally and then deploy it to Azure so it does not need to be compiled on the server?
I am using VS 2012, Web Application (Web Forms) and Web Deploy
You can access my Web Site Here.
Unfortunately there is no way around this with Azure Websites. As you said, it is because IIS is a demand-driven web server and so only does things when it is asked to. An IIS worker process only spins up when a request arrives for the site hosted in that worker process.
If you're using VS2012 and Web Deploy, then you are most probably already compiling the code. In .NET this compile step only takes it part of the way, into IL (intermediate language), which is CPU-independent; the worker process then needs to take this and convert it into native code that can run on that machine. That is why your site takes a while to load.
They did start shipping a warm-up module (Application Initialization) with IIS 7.5, and it was included in IIS 8, to solve this problem for initialisation-heavy sites. Unfortunately it's not available with Azure Websites because it's a native module. If you want to use it, you would have to switch to an Azure cloud service or a virtual machine to run your site.
The other alternative, which I've known people to use, is a cloud monitoring service such as Pingdom, which continuously makes requests to a page on your site and so keeps the worker process alive. One last alternative, which is far from ideal, is to have a simple script somewhere that makes a request to the page to keep it alive.
If your website becomes popular, however, there is no need for any of these steps as the mere fact that people are visiting your website will keep the worker process alive.
I just ran your site through Google Page Speed:
https://developers.google.com/speed/pagespeed/insights#url=http_3A_2F_2Fffinfo.azurewebsites.net_2F&mobile=false
If you are concerned about speed/performance on a "shared" WebSite instance, you should fix some of the items listed there. Having a huge 375 kB background image is probably not the best idea... and it's not even compressed.
If you can, move to an "extra small" instance of a cloud service and you can optimize a lot of additional things (turn off ASP.NET modules, remove headers, control compression and client caching). Your goal is to have a popular site, correct? ...start it off right :)
It's not out of the box, but one easy way to keep your cloud service warm with a scheduled ping is to use Windows Azure Mobile Services, as described here.
It's basically a small script that is scheduled to hit your website every 15 minutes.
function KeepAlive()
{
    KeepSiteAlive("http://www.yousayhello.co.uk");
}

function KeepSiteAlive(siteurl)
{
    console.info("Warming up " + siteurl);
    var httpRequest = require('request');
    httpRequest(siteurl, function (err, response, body)
    {
        if (err)
        {
            console.warn("Couldn't retrieve site");
        }
        else if (response.statusCode !== 200)
        {
            console.warn("Bad response");
        }
        else
        {
            console.info("Site Warmed Up!");
        }
    });
}
I'm creating an ASP.NET .NET 4.0 website, and part of this site requires an "always running" application. Normally I would create a Windows Service for this, but the site will be hosted in a shared hosting environment, and unless I get a virtual server, that isn't a possibility.
My first thought was to have a thread running in the background that would do this; it would be created in Application_Start and destroyed in Application_End. I've looked around and this seems like it could be an option, but I would of course have to hit the site in order to cause Application_Start to be called, and if the associated AppPool is recycled, then this process would have to be repeated (so I believe?!?).
Within a normal ASP.NET website, does this seem possible?
In the end I used a separate thread that sits and waits for a signal to be set. Once set, it does its work. To make sure the thread is always active, I make an HTTP request for a "dummy" page to ensure that, if the AppPool was recycled, the Application_Start event is triggered and the thread restarted.
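A minimal sketch of that arrangement, under stated assumptions (the class and member names are illustrative, and the actual work is a placeholder):

using System;
using System.Threading;
using System.Web;

// Hypothetical global.asax code-behind: start a background worker thread
// whenever the application (re)starts, and let it wait for a signal.
public class Global : HttpApplication
{
    private static Thread _worker;
    private static readonly AutoResetEvent _signal = new AutoResetEvent(false);

    protected void Application_Start(object sender, EventArgs e)
    {
        // Runs whenever the app pool spins the site up, e.g. after the
        // keep-alive request hits the "dummy" page.
        _worker = new Thread(() =>
        {
            while (_signal.WaitOne())
            {
                // Do the actual work here once the signal is set.
            }
        }) { IsBackground = true };
        _worker.Start();
    }

    // Call this from wherever the work should be triggered.
    public static void TriggerWork()
    {
        _signal.Set();
    }
}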
It depends on what you mean by 'always running application':
If it's a real-time service, it still makes sense to run it as a separate process, even if it has a web front-end. That's because the ASP.NET server was designed by Microsoft for specific tasks (running web apps, rendering pages, etc.) in many respects such as memory usage and multithreading. I'd prefer to use at least a VDS in such a case.
Another case is a periodically triggered (say, every hour) application which does some uncomplicated work; perhaps your shared host has a mechanism to trigger a specific page to do some work (as my host does). For example, I have an ASP.NET page that monitors the tour date list of my favourite band and sends an email notification when they are going to play a gig in my town; it's triggered by the host every 4 hours.
Question: When a web application gets started, it executes Application_Start in global.asax.
Now, a web application gets started as soon as the first request for a page in that application reaches the server.
But my question is: how long will the application keep running before it is stopped?
I mean, when after that first page request there is no further traffic on the server.
I need to know because I intend to start a server that listens on a TCP port in global.asax.
And when the application stops, the server ceases to listen to its port.
It depends on your IIS settings. Your application will run in an application pool, which takes a bunch of settings defining the behaviour of this pool.
The things you're looking for are the recycling settings. In IIS 7, you can access these easily from the management console. Go to Application Pools, right-click on the application pool your app runs in (if you don't know which one that is, it's probably the DefaultAppPool) and select Recycling.
Here you'll find the options you have to control the recycling behaviour of your app pool, which in turn controls when your app 'resets'.
In a word (well, two): shared hosting.
On shared hosting, beware (GoDaddy/webhost4life etc.): this timeout could well be shorter, and you don't have the option to configure it in these hosting environments. I've had cases where the app pool is recycled after 5 minutes at certain peak times, so you might have to investigate 'wakeup' routines to poke your app and keep it in memory. I do this for a few shared hosting apps to great effect using pingalive.com.
Hope this helps, even if in an abstract way.
Jim