Caching & Maintaining Data in a Console Application Strategy [C#]

Is there a strategy for caching data in a console application so that it persists after the application terminates and is available again at the next run?
For example, when my console application starts up, 4 calls are made to my database, which return a fair amount of data. The rest of the application runs and uses these lists. When the console application starts up again at the next scheduled interval, it has to retrieve these four lists again. Is there a way to have those lists cached for a certain amount of time, to reduce the number of times I have to call the database?
My current setup is a PowerShell script that just pings a URL on my website, which obviously can cache these 4 lists and maintain them. However, I think I need to move this function into console applications to remove the load from the IIS process, as I've had some high CPU spikes on my server and I'm assuming it's related to this code.
One idea I had was to expose an API endpoint for these four lists on my website (so they can be cached) and call that from my console application. Is that the best way to handle this, or is there a proper way of caching data and maintaining it after a console application has ended and started up again?

You could use a local file to store the values, perhaps in conjunction with a database or endpoint, adding an expiry date to a tag in the file.
Local file access will be much faster than a database query or any remote call. A remote call, say to a database or an IIS endpoint, could be reserved for the first load.
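A minimal sketch of that idea, assuming a JSON-serializable payload and a runtime with System.Text.Json (swap in your serializer of choice on older frameworks); the file name, expiry window, and LoadListsFromDatabase are illustrative placeholders:

using System;
using System.IO;
using System.Text.Json;

public class CachedLists
{
    public DateTime ExpiresUtc { get; set; }
    public string[] List1 { get; set; }
    // ...the other three lists would go here
}

public static class FileCache
{
    private const string CachePath = "lists-cache.json"; // hypothetical location

    public static CachedLists Load(TimeSpan maxAge)
    {
        if (File.Exists(CachePath))
        {
            var cached = JsonSerializer.Deserialize<CachedLists>(File.ReadAllText(CachePath));
            if (cached != null && cached.ExpiresUtc > DateTime.UtcNow)
                return cached; // still fresh: skip the database entirely
        }

        // Expired or missing: hit the database (or endpoint) and rewrite the file.
        var fresh = LoadListsFromDatabase(); // placeholder for your four queries
        fresh.ExpiresUtc = DateTime.UtcNow.Add(maxAge);
        File.WriteAllText(CachePath, JsonSerializer.Serialize(fresh));
        return fresh;
    }

    private static CachedLists LoadListsFromDatabase()
    {
        return new CachedLists { List1 = new[] { "example" } }; // stand-in data
    }
}

Each scheduled run then calls FileCache.Load(TimeSpan.FromHours(1)) (or whatever staleness you can tolerate) and only hits the database when the file has expired.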

Related

EF6 slow first query implications on deployment

Knowing that Entity Framework is slow on a cold query (the first query after model compilation), I am using some of the standard workarounds to speed it up: mainly pre-compiled views, as well as making a dummy HTTP request from the client side as soon as the application loads to trigger a query and start the model process.
My question here is specifically around how this works for a deployed application. For example, if I deploy this on Azure, is it the first cold query for the entire application that will trigger the model compilation, or will this slow cold query happen for each individual user that uses the application? In simple terms, does it happen once and only once, or every time a user hits the site for a new session?
The EF slow start is triggered by the first request(s) coming into the web server that require database services.
A couple of points to note:
If you deploy to an Azure web app, ensure that the 'Always On' application setting is enabled. If not, after a given idle period the web app will be suspended and the next request will trigger another cold start.
Similarly, if you deploy to a VM with IIS, you'll need to check the application pool recycling settings.
When you deploy a new version of the application code, the process will need to be restarted, which will cause another slow start.
A good approach to mitigating such slow starts is to use deployment slots and pre-warm a slot before sending actual user traffic to it. This is straightforward to achieve with Azure Web App deployment slots.
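As a complement to pre-warming, a minimal sketch of triggering the cold-start cost at process start rather than on the first user request; MyDbContext and SomeEntity are assumed names standing in for your own model:

using System.Linq;

public static class EfWarmup
{
    // Call this from Application_Start (Global.asax) or a slot warm-up hook.
    public static void Run()
    {
        using (var context = new MyDbContext())
        {
            // Force database initialization and view generation up front.
            context.Database.Initialize(force: false);

            // Any trivial query compiles the model and opens the connection pool.
            var _ = context.Set<SomeEntity>().FirstOrDefault();
        }
    }
}

This runs once per process, so after an app pool recycle or slot swap the first real user no longer pays the compilation cost.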

Calling external services which randomly time out kills the ASP.NET MVC application

So I have this web application (ASP.NET MVC 4 web site) which has at least 2,000 online users at any time. One of the most popular pages in my application contains data about the user, and this data is not located in my repository; it comes from an external vendor integrated into my system. So whenever this page is rendered I have to make a call to those services (currently there are 17) and then render the page according to the data they return. The data is subject to change at any given moment, so I cannot cache it.

Everything works fine most of the time and the CPU utilization is 5% - 30% (depending on the number of online users, of course). For each service call I have a timeout of 5000 milliseconds (for service references I set the SendTimeout, and for raw HttpWebRequests I set the Timeout property to 5000 milliseconds).

Now suppose one service is down. The CPU utilization of my server drops unexpectedly low, around 3% - 8%, and the application lags: it takes noticeably longer to load any page. For instance, where a response from my application would normally take 150-250 ms, it now takes 1-3 seconds.

I'm out of ideas about what to do. I cannot decrease the timeout, because some services sometimes take 3-4 seconds, so the 5-second timeout is the least I can give. What can I do to prevent the slow responses? I know it's a bit of a general question. Any suggestion would be appreciated. Thanks in advance.
It looks like you have a threading problem. Too many threads are blocked waiting for a response from the external service, so they cannot process other requests.
What I recommend is to use an async controller: http://www.asp.net/mvc/overview/performance/using-asynchronous-methods-in-aspnet-mvc-4
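A minimal sketch of that approach on .NET 4.5 / MVC 4, firing the external calls concurrently so request threads are released while waiting; the vendor URLs and the way the view consumes the responses are illustrative assumptions:

using System;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Mvc;

public class UserDataController : Controller
{
    // One shared client with the same 5-second budget described above.
    private static readonly HttpClient Client = new HttpClient
    {
        Timeout = TimeSpan.FromSeconds(5)
    };

    public async Task<ActionResult> Index()
    {
        // Start all external calls at once instead of sequentially; the
        // request thread goes back to the pool while they are in flight.
        var tasks = new[]
        {
            Client.GetStringAsync("https://vendor.example.com/service1"),
            Client.GetStringAsync("https://vendor.example.com/service2"),
            // ...one task per external service (17 in total here)
        };
        string[] responses = await Task.WhenAll(tasks);
        return View(model: responses);
    }
}

With blocking calls, a dead service pins a thread for the full 5 seconds on every request; with async the thread is freed in the meantime, which addresses the "low CPU but slow pages" symptom.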
Suggestion 1
What if you replicate this data to your server?
Meaning you could have another service that runs separately and synchronizes the external service's data with your server, so your web pages always read the local data. Granted, this is a kind of caching, and pages may show slightly stale data, but you can set the replication service to refresh as often as you need; a sketch follows below.
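A minimal sketch of such a replication process; the interval, FetchFromVendor, and SaveLocally are all placeholders for your own pieces:

using System;
using System.Threading;

public static class ReplicationService
{
    public static void Run()
    {
        while (true)
        {
            // The slow/unreliable external call happens here, off the web path.
            string snapshot = FetchFromVendor();

            // Pages read this local copy instead of calling the vendor.
            SaveLocally(snapshot);

            // The acceptable staleness window for the pages.
            Thread.Sleep(TimeSpan.FromSeconds(30));
        }
    }

    private static string FetchFromVendor()
    {
        return "vendor data"; // placeholder for the real service calls
    }

    private static void SaveLocally(string data)
    {
        // Placeholder: write to your database or local cache.
    }
}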
Suggestion 2
Another suggestion that comes to mind: can you use push notifications instead? All the open pages simply wait while the server checks the data and notifies all clients when fresh data arrives. In this case only one thread is busy with the external data, and every connected user gets fresh data as soon as it is available. As a starting point, check SignalR; a sketch follows below.
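A minimal sketch of that pattern with SignalR 2.x; DataHub, the updateData client method, FetchVendorDataAsync, and the 30-second interval are all illustrative assumptions:

using System;
using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;

public class DataHub : Hub { }

public static class VendorDataBroadcaster
{
    public static async Task RunAsync()
    {
        var hub = GlobalHost.ConnectionManager.GetHubContext<DataHub>();
        while (true)
        {
            // One call to the external services on behalf of everyone.
            string data = await FetchVendorDataAsync();

            // Push the result to every open page at once.
            hub.Clients.All.updateData(data);

            await Task.Delay(TimeSpan.FromSeconds(30));
        }
    }

    private static Task<string> FetchVendorDataAsync()
    {
        return Task.FromResult("fresh data"); // placeholder for the vendor calls
    }
}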

How to share a variable between all worker process instances in ASP.NET

I had to move a pre-built ASP.NET website from a single worker process environment to a multiple worker process environment on cloud servers.
I had a class with a static ArrayList variable which held the last 2 minutes of all session information for tracking purposes. The admin used to query this ArrayList to view live reports. But after moving to cloud infrastructure this has broken down and the results are no longer correct: it depends on which server behind the load balancer serves the page, so we have one instance of the static variable per app pool. I tried moving to MySQL, but we needed to flush the data regularly and it also had performance issues. The ArrayList is processed heavily to churn out useful data, so I need something in-memory.
Please note that even before, using the static variable without a lock was a weakness, but that only led to a difference of 1 or 2 records and was blazing fast.
You can consider backing your session with SQL Server based session storage.
Alternatively you can use an application caching server to back it. That will let you share the data across multiple web servers.
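A minimal sketch of the caching-server route, using Redis via StackExchange.Redis purely as one example of such a server; the key name, connection string, and two-minute window are assumptions:

using System;
using StackExchange.Redis;

public static class SessionTracker
{
    private static readonly ConnectionMultiplexer Redis =
        ConnectionMultiplexer.Connect("localhost"); // your cache server address

    public static void RecordHit(string sessionId, string pageInfo)
    {
        IDatabase db = Redis.GetDatabase();

        // Every worker process appends to the same shared list.
        db.ListRightPush("live-hits", sessionId + "|" + pageInfo);

        // Expire the whole list two minutes after the last write; a rough
        // approximation of the old rolling two-minute ArrayList window.
        db.KeyExpire("live-hits", TimeSpan.FromMinutes(2));
    }

    public static RedisValue[] ReadRecentHits()
    {
        // The admin report reads one consistent list regardless of which
        // server behind the load balancer handled each page.
        return Redis.GetDatabase().ListRange("live-hits");
    }
}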

Consuming WCF Service in ASP.NET, how to cache?

I have two situations in this case:
I want to query a WCF service and hold the data somewhere, because one of the web pages renders based on the data retrieved from the service. I don't want the page itself querying the service; I'd rather have some sort of scheduled worker that runs every couple of minutes, retrieves the data, and holds it somewhere.
Where should I cache the service response, and what is the correct way to set up a task that queries the service every couple of minutes?
I think I could achieve this by saving the response to a static variable alongside the last query date; then on page load I check whether enough time has passed, and if so I call the service and refresh the data, otherwise I use the static cache.
This would also cover the case where no users access the page for a long time, so the site doesn't futilely query the service.
But it seems kind of rough. Are there other, better ways to accomplish this kind of task?
You could indeed take another approach like having a scheduled program query the information and put it in an in-memory cache available to all the web servers in your farm. However, whether that would be better for your scenario depends on the size of your app and how much time/effort you want to spend on it.
An in-memory cache is harder to implement/support than a static variable, but it's sometimes better, since static variables can be cleared every time the server process resets (e.g. after X minutes of inactivity).
Depending on the size of your system I would start with the static variable, test drive the approach for a while and then decide if you need something more sophisticated.
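A minimal sketch of the static-variable approach described in the question, with a lock so concurrent page loads don't all refresh at once; CallWcfService and the 5-minute window are placeholders:

using System;

public static class ServiceCache
{
    private static readonly object Sync = new object();
    private static string _data;
    private static DateTime _fetchedUtc = DateTime.MinValue;

    public static string GetData()
    {
        lock (Sync) // only one request refreshes stale data at a time
        {
            if (DateTime.UtcNow - _fetchedUtc > TimeSpan.FromMinutes(5))
            {
                _data = CallWcfService();
                _fetchedUtc = DateTime.UtcNow;
            }
            return _data;
        }
    }

    private static string CallWcfService()
    {
        return "service response"; // stand-in for the real WCF proxy call
    }
}

Note the refresh only happens on demand, which is exactly the "no users, no queries" behaviour the question asks for.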
Have you taken a look at Velocity?
Nico: Why don't you write a simple console daemon that gets the data and stores it on your end in a database, and then have your web app read the data from your local copy? You can make that console app run at whatever interval you like. Inserting the data should not be a problem if you are using SQL Server 2008: you can pass DataTable parameters to a stored procedure and insert a whole table in one call. If you don't use SQL Server 2008, then serialize the whole collection returned by the web service, store it in a table in one big blob column, and record the timestamp when you got the data. You can then read the content of that column, deserialize your collection, and reconstruct it into native objects for displaying on your page.
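A minimal sketch of the table-valued parameter insert mentioned there (SQL Server 2008+); the stored procedure name, table type name, and connection string are assumptions that must match a user-defined table type on your server:

using System.Data;
using System.Data.SqlClient;

public static class BulkInserter
{
    public static void InsertAll(DataTable rows, string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.InsertServiceData", connection))
        {
            command.CommandType = CommandType.StoredProcedure;

            // Pass the whole DataTable as a single table-valued parameter.
            SqlParameter parameter = command.Parameters.AddWithValue("@Rows", rows);
            parameter.SqlDbType = SqlDbType.Structured;
            parameter.TypeName = "dbo.ServiceDataTableType"; // user-defined table type

            connection.Open();
            command.ExecuteNonQuery(); // one round trip for the entire table
        }
    }
}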
I've never seen (and I don't think it's possible) a web app that queries a web service on its own at a fixed interval. Imagine the web site is idle for hours with no interaction from anybody: no events will fire and nothing will be queried.
Alternatively, you could create a dummy page that executes a JavaScript function at set intervals and have that function make an AJAX request to the server to fetch the data from the web service and cache it. The problem is that the minute you leave that page, nothing happens and you stop querying the web service. I think this is silly.

Best approach to fire a Thread/Process under IIS/WCF in shared hosting

Scenario: A WCF service receives an XDocument from clients, processes it and inserts a row in an MS SQL Table.
Multiple clients could be calling the WCF service simultaneously. The call usually doesn't take long (a few secs).
Now I need something to poll the SQL table and run another set of processes asynchronously.
The second process doesn't have to call anything back, nor is it related to the WCF service in any way. It just needs to read the table and perform a series of methods, and maybe a web service call (if there are records, of course), but that's all.
The WCF service clients consuming the above mentioned service have no idea of this and don't care about it.
I've read about this question on Stack Overflow, and I also know that a Windows Service would be ideal, but this WCF service will be hosted on shared hosting (DiscountASP or similar) and therefore installing a Windows Service will not be an option (as far as I know).
Given that the architecture is fixed (i.e. I cannot change the table, which comes from a legacy format, nor the mechanism of the WCF service), what would be your suggestion to poll/process this table?
I'd say I need it to check every 10 minutes or so. It doesn't need to be instant.
Thanks.
Cheat. Expose this process as another WCF service and fire a 'go' command from a box under your control at a scheduled time.
Whilst you can fire up background threads in WCF, or use cache expiry as a poor man's scheduler (sketched below), those will stop when your app pool recycles, until the next hit on your web site spins the app pool up again. At least firing the request from a machine you control means you know the app pool will come back up every 10 minutes or so, because you've sent a request in its direction.
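For completeness, a minimal sketch of the cache-expiry trick mentioned above, with the caveat that it dies on app pool recycles just as described; the key name, 10-minute interval, and PollTable are placeholders:

using System;
using System.Web;
using System.Web.Caching;

public static class CacheScheduler
{
    private const string Key = "poll-task";

    // Call once, e.g. from Application_Start.
    public static void Start()
    {
        if (HttpRuntime.Cache[Key] != null) return;
        HttpRuntime.Cache.Insert(
            Key, DateTime.UtcNow, null,
            DateTime.UtcNow.AddMinutes(10), Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable, OnRemoved);
    }

    private static void OnRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        PollTable(); // read the SQL table and run the follow-up processing
        Start();     // re-arm the cache item for the next 10-minute tick
    }

    private static void PollTable()
    {
        // Placeholder for the actual polling/processing work.
    }
}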
A web application is not suited at all to running something at a fixed interval. If no requests are coming in, no code runs in the application, and if the application is inactive for a while, IIS can decide to shut it down completely until the next request arrives.
For some applications it isn't important that something runs at a specific interval, only that it has run recently. If that is the case for your application, you could simply keep track of when the table was last polled and, for every request, check whether enough time has passed to poll it again; a sketch follows below.
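A minimal sketch of that request-driven check, where only one request thread wins the right to poll; the 10-minute interval and PollTable are placeholders:

using System;
using System.Threading;

public static class RequestDrivenPoller
{
    private static long _lastPollTicks = DateTime.MinValue.Ticks;

    // Call this cheaply at the start of every WCF operation.
    public static void MaybePoll()
    {
        long last = Interlocked.Read(ref _lastPollTicks);
        if (DateTime.UtcNow.Ticks - last < TimeSpan.FromMinutes(10).Ticks)
            return;

        // Only the first thread to swap the timestamp actually polls;
        // everyone else sees the updated value and returns immediately.
        if (Interlocked.CompareExchange(ref _lastPollTicks, DateTime.UtcNow.Ticks, last) == last)
            PollTable();
    }

    private static void PollTable()
    {
        // Placeholder for reading the table and processing the rows.
    }
}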
If you have access to administer the database, there is a scheduler in SQL Server (SQL Server Agent). It can run queries and stored procedures, and can even start processes if you have permission (which is very unlikely on shared hosting, though).
If you need the code on a specific interval, and you can't access the server to schedule it or run it as a service, or can't use the SQL Server scheduler, it's simply not doable.
Make your application pool "always active" and do whatever you want with your threads.
