C# HttpRuntime.Cache.Insert() Not holding cached value - c#

I'm trying to cache a price value using HttpRuntime.Cache.Insert(), but it only appears to hold the value for a couple of hours before clearing it out. What am I doing wrong? I want the value to stay in the cache for 3 days.
HttpRuntime.Cache.Insert(CacheName, Price, null, DateTime.Now.AddDays(3), TimeSpan.Zero);

Short answer
Your application pool or website is being shut down too soon. Extend the idle timeout on the site, extend the application pool lifetime for the pool running the site, and raise the memory allocation and request limits.
Full answer
If you want to know when and why something is being removed from the cache, log the removal by passing a CacheItemRemovedCallback when you insert the item. The callback receives a CacheItemRemovedReason argument, so you can log which of the four possible reasons applied (a logging sketch follows the list):
Removed: The item is removed from the cache by a Remove method call or by an Insert method call that specified the same key.
Expired: The item is removed from the cache because it expired.
Underused: The item is removed from the cache because the system removed it to free memory.
DependencyChanged: The item is removed from the cache because the cache dependency associated with it changed.
Typically, you will find Expired and Underused being the reasons for items that don't have explicit Remove calls made against the cache and don't have dependencies.
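For example, a minimal sketch of wiring up that logging (the key and price names are placeholders from the question; Cache.Insert's seven-argument overload and CacheItemRemovedReason are the standard ASP.NET APIs):

private void CachePrice(string cacheKey, decimal price)
{
    HttpRuntime.Cache.Insert(
        cacheKey,
        price,
        null,                              // no dependencies
        DateTime.Now.AddDays(3),           // absolute expiration
        Cache.NoSlidingExpiration,         // no sliding expiration
        CacheItemPriority.Default,
        OnCacheItemRemoved);               // fires when the item leaves the cache
}

private void OnCacheItemRemoved(string key, object value, CacheItemRemovedReason reason)
{
    // Log somewhere durable (file, event log) so the entry survives an AppDomain unload.
    System.Diagnostics.Trace.WriteLine(
        string.Format("Cache item '{0}' removed: {1}", key, reason));
}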
You will likely find out, while tracing through this fun stuff, that your items are not being expired or underused. Rather, I suspect you'll find that the AppDomain is getting unloaded.
One way this can happen is when the web.config (or the bin directory, .aspx files, etc.) gets changed. For more information about when this occurs, see the Application Restarts section of this page. When that happens, the currently pending requests are drained, the cache is emptied, and the AppDomain is unloaded. You can detect this situation by checking AppDomain.IsFinalizingForUnload and logging that during the callback.
Another reason for the AppDomain to recycle is when IIS decides to recycle the AppPool for any of the reasons it has been configured with. Examples are: xxx memory allocated over the pool's lifetime, yyy seconds of runtime for the AppPool, a scheduled recycle at time ttt, or iiii idle time (no incoming requests). For further details, check this article for IIS6 or this article for IIS7.

The Cache object doesn't guarantee that it will hold onto cached objects at all, much less for the full amount of time that you suggest.
If you want to more strongly encourage it to do so, you can set CacheItemPriority.High or CacheItemPriority.NotRemovable when you insert an item into the Cache. With the default Normal priority, the runtime has a fairly aggressive policy of letting go of objects when memory pressure increases.
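For instance, a rough sketch of an insert that discourages eviction under memory pressure, reusing the key and value names from the question:

HttpRuntime.Cache.Insert(
    CacheName,
    Price,
    null,                               // no dependencies
    DateTime.Now.AddDays(3),            // absolute expiration
    Cache.NoSlidingExpiration,
    CacheItemPriority.NotRemovable,     // ask the runtime not to evict this item
    null);                              // no removal callback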
On top of that, by default the IIS AppPool recycles roughly once a day (every 1,740 minutes), which will clear everything in the Cache.

The docs (http://msdn.microsoft.com/en-us/library/4y13wyk9.aspx) say that Cache.NoSlidingExpiration must be used when using an absolute expiration:
HttpRuntime.Cache.Insert(CacheName, Price, null, DateTime.Now.AddDays(3), Cache.NoSlidingExpiration);
This may not be your problem, though; I just found that Cache.NoSlidingExpiration should be the same as TimeSpan.Zero.
Next I would check that your app pool isn't being recycled, and check how much cache you are using. If it's a high-traffic site using a lot of memory (i.e. an in-memory cache), then it will expire cache items as the memory is needed for other things.
Also check the last comment here: http://bytes.com/topic/net/answers/717129-c-asp-net-page-cache-getting-removed-too-soon ; someone there seems to have found a solution to your problem.

Check the recycle time on your App Pool.

By default, items added to the cache have no set expiration, so this is definitely something outside the cache. I agree with Josh, you should check the recycle time on your App Pool.
Check out this page to see an example of how you can add a delegate to let you know exactly when your item is being removed from the cache. This might help you in troubleshooting if it's not your App Pool:
http://msdn.microsoft.com/en-us/library/system.web.caching.cache.add.aspx

Related

MemoryCache - prevent expiration of items

In my application I use MemoryCache, but I don't expect items to expire. Items are therefore inserted into the cache with the default policy, without AbsoluteExpiration or SlidingExpiration being set.
Recently, under high server load, I experienced problems with the cache: it returned null values in place of values previously inserted into it. It turned out that not only items eligible to expire (those with an expiration date explicitly set) are removed from the cache; under memory pressure, when the CacheMemoryLimit and/or PhysicalMemoryLimit values are exceeded, MemoryCache removes other elements as well.
How can I prevent this? How can I be sure that once an element is set in the cache, it can safely be fetched from it again?
I considered setting the PollingInterval to some huge value, but this only delays the potential problem (and the polling interval is described in the documentation as a maximum, not an exact or minimum, time). Setting PhysicalMemoryLimitPercentage to 100% also does not solve the problem, since it refers to the physically installed memory and not to the whole available virtual memory. Or am I wrong, and it would indeed help?
CacheItemPolicy has a Priority property which can be set to NotRemovable.
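A minimal sketch with System.Runtime.Caching (the key and the ComputeExpensiveResult call are made up for illustration):

var cache = MemoryCache.Default;
var policy = new CacheItemPolicy
{
    Priority = CacheItemPriority.NotRemovable   // exempt this entry from eviction under memory pressure
};
cache.Set("expensiveResult", ComputeExpensiveResult(), policy);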
You do need to be aware of how much data you are adding to the cache with this setting, though. Continuously adding data to the cache and never removing it will eventually cause memory or overflow issues.
A cache is typically used where it's acceptable for an item to no longer exist in the cache, in which case the value is retrieved again from persistent storage (a database or file, for example).
In your case, it sounds like your code requires the item to exist, which may suggest looking for another approach (a static ConcurrentDictionary as mentioned in the comments, for example).

Using InProc and Azure AppFabric Cache together

Just a bit of background first. I currently have a site hosted with Windows Azure, with multiple instances and also AppFabric as my sole caching provider.
Everything was going great until my traffic spiked earlier this morning. The instances became overloaded and stopped responding, but everything settled down again once the new instances started.
However, I started getting messages from AppFabric saying that I was being throttled because there were too many requests in a given hour. Which is fair enough; it certainly was giving it hell.
In order to avoid these messages in the future, I was planning on implementing an InProc cache with a very short lifespan, so it checks InProc first; if the item isn't there, it goes to AppFabric; and if it still isn't there, it goes to the DB.
ObjectCache cache = MemoryCache.Default;
CacheItemPolicy policy = new CacheItemPolicy();
policy.AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(5);
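Roughly, the layered lookup would look something like this (the helper and delegate names here are only illustrative; the AppFabric and database calls are stand-ins passed in as delegates rather than real API calls):

private static readonly ObjectCache LocalCache = MemoryCache.Default;

// Check the in-process cache first, then the distributed cache, then the database.
public static T Get<T>(string key, Func<T> fetchFromAppFabric, Func<T> fetchFromDatabase) where T : class
{
    var value = LocalCache.Get(key) as T;
    if (value != null)
        return value;

    value = fetchFromAppFabric() ?? fetchFromDatabase();
    if (value != null)
    {
        // Keep it locally for a very short lifespan so AppFabric isn't hit on every request.
        LocalCache.Set(key, value, new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(5)
        });
    }
    return value;
}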
The questions I have are:
Is this the best way to handle the situation?
Is this going to interfere with AppFabric Caching?
Any issues I am overlooking?
Update
I just wanted to say I chose the above method and it works well. I was using it only for general data storage and not session state. MemoryCache with session state would not work too well on Azure due to no server affinity (as mentioned by David below).
Update 16-03-2012
After realizing the obvious, I also disabled SessionState on most pages. Most of my pages don't need it, and this rapidly decreases my calls to the cache under heavy load. I also disabled ViewState for most pages, just for that slightly quicker page load time.
Are you using cache to provide SessionState storage, or general data storage by your application, or both? It's not totally clear, because InProc usually refers to SessionState, but your sample code does not look like SessionState.
Assuming that you're storing data which can be safely cached locally, then I would recommend looking into AppFabric Local Caching. It does basically what you want, and doesn't require writing any separate code (I think...).
Otherwise, using MemoryCache as you outlined is a workable scheme. I've done this in my apps, you just need to be careful to avoid cache incoherence issues.
Depending on your application, you may also want to implement a per-request cache by storing data in the HttpContext.Items collection. This is helpful when different parts of your code might request the same data during a single request.
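For example, a small per-request memoization helper built on HttpContext.Items (the helper name is invented; Items is the real per-request dictionary):

public static T GetPerRequest<T>(string key, Func<T> load) where T : class
{
    var items = HttpContext.Current.Items;
    var value = items[key] as T;
    if (value == null)
    {
        value = load();
        items[key] = value;   // lives only for the duration of the current request
    }
    return value;
}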
Try this: http://msdn.microsoft.com/en-us/magazine/hh708748.aspx
One thing I have done is use HttpContext.Items. This is only a per request cache but depending on the nature of your system can be useful.
I wouldn't suggest inproc, due to the fact there's no server affinity.
One option, with Windows Azure Cache, to avoid the hourly quota throttling is to bump up the cache size. Fortunately the price doesn't scale linearly; for instance, $45 for 128MB, $55 for 256MB. So one option is to bump your cache up to the next size. You'll need to monitor compute performance, though, via perf counters, as there's no way to monitor cache usage in real time.
Another option is to move session state to SQL Azure, which is now an officially-supported session state provider as of Azure 1.4 (Aug. 2011 - see this article for more info). With the latest SQL Azure pricing updates, if the db stays below 100MB, it's a $4.99 monthly rate instead of the original $9.99 baseline. It's amortized daily, so even if you have transient spikes and go into 1+GB range, you still have quite an affordable cache repository.
Another possible solution would be to use Sticky Sessions like this example:
http://dunnry.com/blog/2010/10/14/StickyHTTPSessionRoutingInWindowsAzure.aspx

When is it appropriate to use CacheItemRemovedCallback?

I have a large data set that is updated once a day. I am caching the results of an expensive query on that data but I want to update that cache each day. I am considering using CacheItemRemovedCallback to reload my cache on a daily interval, but I had the following concerns:
Isn't it possible that the CacheItemRemovedCallback could be called before my expiration (for example, if the server runs out of memory)? If so, reloading immediately doesn't seem like a good idea.
Does the CacheItemRemovedCallback get called before or after the item is actually removed? If it is after, doesn't this theoretically leave a period of time where the cache would be unavailable?
Are these concerns relevant and if using CacheItemRemovedCallback to reload your cache is a bad idea, then when is it useful?
If you're going to reload, be sure to check the CacheItemRemovedReason. I recently had to debug an issue where a developer decided they should immediately re-populate the cache in this method, and under low memory conditions, it basically sat chewing up CPU while it got stuck in a loop of building the cache objects, adding them to the cache, expiring, repeat.
The callback is fired after the item is removed.
From everyone's responses and from further reading I have come to the following conclusion:
My concerns are valid. Using CacheItemRemovedCallback to refresh cached items is not a good idea. The only practical use for this callback seems to be logging information about when your cache is removed.
It seems that CacheItemUpdateCallback is the more appropriate way of refreshing your cache on a regular interval.
Ultimately, I have decided not to use either of these calls. Instead I will write a service action so the database import job can notify my application when it needs to refresh its data. This avoids using a timed refresh altogether.
Yes, there is a chance that the method could be fired off for a lot of different reasons. However, whether to reload the cache immediately or wait depends on what is best for the typical use case in your application.
CacheItemRemovedCallback does indeed fire after the item is removed from the cache. Right before the item is to be removed, you can use the CacheItemUpdateCallback method to determine whether or not you want to flush the cache at that time. There may be good reasons to wait to flush the cache, such as when you currently have users in your application and it takes a long time to build the cache again.
Generally speaking, the best practice is to test that your cached item actually exists in the cache before using its data. If the data doesn't exist, you can rebuild the cache at that time (causing a slightly longer response for the user) or choose to do something else.
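A minimal sketch of that check-and-rebuild pattern, assuming a hypothetical RunExpensiveQuery helper and ResultRow type:

var results = HttpRuntime.Cache["DailyQueryResults"] as List<ResultRow>;
if (results == null)
{
    // Cache miss: rebuild (slower for this one request), then re-insert for the next day.
    results = RunExpensiveQuery();
    HttpRuntime.Cache.Insert("DailyQueryResults", results, null,
        DateTime.Now.AddDays(1), Cache.NoSlidingExpiration);
}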
This really isn't so much a cache of individual values as it is a snapshot of an entire dataset. As such, you don't benefit from using the Cache class here.
I'd recommend loading a static collection on startup and replacing it every 24 hours by setting a timer. The idea would be to create a new collection and atomically assign it, as the old one may still be in use and we want it to remain self-consistent.
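A rough sketch of that approach (the ResultRow type and LoadSnapshot loader are made up):

private static volatile List<ResultRow> _snapshot = LoadSnapshot();

// Rebuild the collection every 24 hours and swap the reference atomically;
// readers that grabbed the old reference keep seeing a consistent snapshot.
private static readonly System.Threading.Timer _refreshTimer = new System.Threading.Timer(
    _ => _snapshot = LoadSnapshot(),
    null,
    TimeSpan.FromHours(24),
    TimeSpan.FromHours(24));

public static List<ResultRow> Current
{
    get { return _snapshot; }
}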

Is it OK to use static variables to cache information in ASP.net?

At the moment I am working on a project admin application in C# 3.5 on ASP.net. In order to reduce hits to the database, I'm caching a lot of information using static variables. For example, a list of users is kept in memory in a static class. The class reads in all the information from the database on startup, and will update the database whenever changes are made, but it never needs to read from the database again.
The class pings other webservers (if they exist) with updated information at the same time as a write to the database. The pinging mechanism is a Windows service to which the cache object registers using a random available port. It is used for other things as well.
The amount of data isn't all that great. At the moment I'm using it just to cache the users (password hashes, permissions, name, email etc.) It just saves a pile of calls being made to the database.
I was wondering if there are any pitfalls to this method and/or if there are better ways to cache the data?
A pitfall: A static field is scoped per app domain, and increased load will make the server generate more app domains in the pool. This is not necessarily a problem if you only read from the statics, but you will get duplicate data in memory, and you will get a hit every time an app domain is created or recycled.
Better to use the Cache object - it's intended for things like this.
Edit: Turns out I was wrong about AppDomains (as pointed out in comments) - more instances of the Application will be generated under load, but they will all run in the same AppDomain. (But you should still use the Cache object!)
As long as you can expect that the cache will never grow to a size greater than the amount of available memory, it's fine. Also, be sure that there will only be one instance of this application per database, or the caches in the different instances of the app could "fall out of sync."
Where I work, we have a homegrown O/RM, and we do something similar to what you're doing with certain tables which are not expected to grow or change much. So, what you're doing is not unprecedented, and in fact in our system, is tried and true.
Another pitfall you must consider is thread safety. All of your application requests run in the same AppDomain, but they may come in on different threads. Access to a static variable must account for it being used from multiple threads at once, which is probably a bit more overhead than you are looking for. The Cache object is better for this purpose.
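For illustration, a sketch of the extra ceremony a thread-safe static lookup needs under .NET 3.5 (the User type and LoadUserFromDatabase helper are hypothetical):

private static readonly object SyncRoot = new object();
private static readonly Dictionary<int, User> UsersById = new Dictionary<int, User>();

public static User GetUser(int id)
{
    lock (SyncRoot)   // serialize access so concurrent request threads don't corrupt the dictionary
    {
        User user;
        if (!UsersById.TryGetValue(id, out user))
        {
            user = LoadUserFromDatabase(id);
            UsersById[id] = user;
        }
        return user;
    }
}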
Hmmm... The "classic" method would be the application cache, but provided you never update the static variables, or understand the locking issues if you do, and you understand that they can disappear at anytime with an appdomain restart then I don't really see the harm in using a static.
I suggest you look into ways of having a distributed cache for your app. You can take a look at NCache or indeXus.Net
The reason I suggested that is because you rolled your own ad-hoc way of updating the information that you're caching. Static variables/references are fine, but they don't update or refresh themselves (so you'll have to handle aging on your own), and you seem to have a distributed setup.

How to purge expired items from cache?

I've got a nice little class built that acts as a cache. Each item has an expiration TimeSpan or DateTime. Each time an attempt to access an item in the cache is made, the item's expiration is checked, and if it's expired, the item is removed from the cache and nothing is returned.
That's great for objects that are accessed frequently, but if an item is put in the cache and never accessed again, it's never removed, even though it's expired.
What's a good methodology for expiring such items from the cache?
Should I have a background thread infinitely enumerating every item in the cache to check if it's expired?
The best code is no code. Use the ASP.NET cache instead. You can reference it as System.Web.HttpRuntime.Cache in any application, not just web applications.
In my experience, maintaining a custom caching mechanism became more trouble than it was worth. There are several libraries out there that have already solved these problems. I would suggest using one of them. A popular one in .Net is the Enterprise Library, although I have limited experience with its caching abilities.
If you must use a custom caching mechanism, then I see no problem with the watchful-thread idea you suggested. That is, if your application is a server-based application and not a web app. If it's a web app, you already have built-in sliding expiration; you can then just wrap it in a strongly typed wrapper to avoid referencing cache items by key each time.
You can implement an LRU (Least Recently Used) strategy: keep your items sorted by access time, and when a new item is inserted into a full cache, evict the item that is last in that list (see the sketch below). See Cache algorithms at Wikipedia.
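A very small, non-thread-safe sketch of that idea (a capacity-bound LRU rather than a time-based one; type names are illustrative):

public class LruCache<TKey, TValue>
{
    private readonly int _capacity;
    private readonly Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>> _map =
        new Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>>();
    private readonly LinkedList<KeyValuePair<TKey, TValue>> _order =
        new LinkedList<KeyValuePair<TKey, TValue>>();   // front = most recently used

    public LruCache(int capacity) { _capacity = capacity; }

    public bool TryGet(TKey key, out TValue value)
    {
        LinkedListNode<KeyValuePair<TKey, TValue>> node;
        if (_map.TryGetValue(key, out node))
        {
            _order.Remove(node);      // bump to the front on every access
            _order.AddFirst(node);
            value = node.Value.Value;
            return true;
        }
        value = default(TValue);
        return false;
    }

    public void Add(TKey key, TValue value)
    {
        LinkedListNode<KeyValuePair<TKey, TValue>> existing;
        if (_map.TryGetValue(key, out existing))
        {
            _order.Remove(existing);  // replacing an existing key
        }
        else if (_map.Count >= _capacity)
        {
            var last = _order.Last;   // evict the least recently used entry
            _order.RemoveLast();
            _map.Remove(last.Value.Key);
        }
        var node = new LinkedListNode<KeyValuePair<TKey, TValue>>(
            new KeyValuePair<TKey, TValue>(key, value));
        _order.AddFirst(node);
        _map[key] = node;
    }
}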
If you want to expire items immediately, I would still only do that when they are accessed: when the cached object is accessed and its time has expired, refetch it.
You could also, on any change to the cache, (re-)start a Timer with its Interval set to the closest expiry timestamp. This will not be accurate to the millisecond and depends on a message pump running, but it is not very resource-demanding.
Harald Scheirich's answer is better, though, if you don't mind that objects hang around forever when the cache is not updated.
You could clear suitably old items out of the cache on the first access that occurs more than a minute after the last time items were cleared:
private DateTime nextFlush;

public object getItem(object key)
{
    DateTime now = DateTime.Now;
    if (now > nextFlush)
    {
        Flush();                        // removes expired entries from the underlying store
        nextFlush = now.AddMinutes(1);
    }
    return fetchItem(key);              // normal lookup (expired items return nothing)
}
