In my application I use MemoryCache, but I don't expect items to expire. Items are therefore inserted into the cache with the default policy, without AbsoluteExpiration or SlidingExpiration being set.
Recently, under high server load, I experienced problems with the cache: it returned null values in place of values that had been inserted into the cache earlier. It turned out that not only items eligible to expire (those with an expiration date explicitly set) are removed from the cache. Under memory pressure, when the CacheMemoryLimit and/or PhysicalMemoryLimit values are exceeded, MemoryCache removes other elements as well.
How can I prevent this? How can I be sure that once an element has been set in the cache, it can safely be fetched from it again?
I considered setting PollingInterval to some huge value, but this only delays the potential problem (and the documentation describes the polling interval as a maximum, not an exact or minimum time). Setting PhysicalMemoryLimitPercentage to 100% also does not solve the problem, since it refers to the physically installed memory and not to the whole available virtual memory. Or am I wrong, and would it indeed help?
CacheItemPolicy has a Priority property which can be set to NotRemovable.
You do need to be aware of how much data you are adding to the cache with this setting, though. Continuously adding data to the cache and never removing it will eventually exhaust memory.
A cache is typically used where it's acceptable for an item to no longer exist in the cache, in which case the value is retrieved again from persistent storage (a database or file, for example).
In your case, it sounds like your code requires the item to exist, which may suggest looking for another approach (a static ConcurrentDictionary as mentioned in the comments, for example).
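As a minimal sketch of the NotRemovable setting using System.Runtime.Caching (the key and value below are illustrative):

```csharp
using System;
using System.Runtime.Caching;

class Program
{
    static void Main()
    {
        var cache = MemoryCache.Default;

        // NotRemovable tells the cache not to evict this entry under
        // memory pressure; it only leaves the cache via an explicit Remove().
        var policy = new CacheItemPolicy
        {
            Priority = CacheItemPriority.NotRemovable
        };

        cache.Set("config", "some value", policy);

        // The entry survives memory-pressure trimming until removed.
        Console.WriteLine(cache.Get("config"));
    }
}
```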
Related
Please excuse my noob question, as I am still a junior coder. What's the difference between LRU caching using a Dictionary and a linked list, and MemoryCache in C#? How would one implement an LRU policy on, say, MemoryCache?
Thanks in advance.
LRU is an algorithm for expiring the cache when adding a new item: it evicts the least recently used item when the cache is full.
MemoryCache is a class in .NET 4 and later that implements caching inside the heap memory. Caching can be categorized in different ways. Based on the medium, you can cache on the hard drive or in memory; based on the location of the memory, you can categorize it as in-memory (inside the heap) or out-of-memory (memory outside the heap, for example on another server). MemoryCache in C# uses in-memory caching, and you have to be careful because it can use all the memory of your application. So it's better not to use it if you have more than one node.
Another thing to take into consideration is that when you cache an object in out-of-memory storage, the object must be serializable, whereas in-memory caching can cache any object without serialization.
Least recently used (LRU) evicts the key-value pair that was used least recently when the cache is full and needs to add a value, whereas MemoryCache evicts the oldest key-value pairs, or those past their 'use-by date' if they happen to have one.
Say the first key-value pair you added is vital and you happen to read it all the time: in an LRU cache it would be kept, but in MemoryCache it would eventually disappear and need to be replaced. Sometimes, though, older key-value pairs disappearing is exactly what you're after, so that up-to-date values get pulled through from your backend (e.g. a database).
Consider also whether adding an existing key-value pair should count as a 'use' (so recently updated entries tend to stay around), or whether 'use' only means reading a key-value pair, so you simply favour the things your readers access. As always, consider concurrency if more than one task or thread uses the cache.
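A minimal sketch of such an LRU cache, combining a Dictionary for O(1) lookup with a LinkedList that tracks recency (names are illustrative; not thread-safe, and capacity is assumed to be at least 1):

```csharp
using System.Collections.Generic;

// Dictionary maps key -> linked-list node; the list is ordered from
// most- to least-recently used, so eviction just drops the tail.
public class LruCache<TKey, TValue>
{
    private readonly int capacity;
    private readonly Dictionary<TKey, LinkedListNode<(TKey Key, TValue Value)>> map = new();
    private readonly LinkedList<(TKey Key, TValue Value)> order = new();

    public LruCache(int capacity) { this.capacity = capacity; }

    public bool TryGet(TKey key, out TValue value)
    {
        if (map.TryGetValue(key, out var node))
        {
            // A read counts as a "use": move the node to the front.
            order.Remove(node);
            order.AddFirst(node);
            value = node.Value.Value;
            return true;
        }
        value = default;
        return false;
    }

    public void Put(TKey key, TValue value)
    {
        if (map.TryGetValue(key, out var existing))
        {
            order.Remove(existing);
            map.Remove(key);
        }
        else if (map.Count >= capacity)
        {
            // Evict the least recently used entry (the tail).
            var lru = order.Last;
            order.RemoveLast();
            map.Remove(lru.Value.Key);
        }
        var node = new LinkedListNode<(TKey Key, TValue Value)>((key, value));
        order.AddFirst(node);
        map[key] = node;
    }
}
```

For concurrent use you would wrap TryGet and Put in a lock, or build on ConcurrentDictionary with a more elaborate recency structure.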
I have a large data set that is updated once a day. I am caching the results of an expensive query on that data but I want to update that cache each day. I am considering using CacheItemRemovedCallback to reload my cache on a daily interval, but I had the following concerns:
Isn't it possible that the CacheItemRemovedCallback could be called before my expiration (in the case of running out of memory)? Which means reloading it immediately doesn't seem like a good idea.
Does the CacheItemRemovedCallback get called before or after the item is actually removed? If it is after, doesn't this theoretically leave a period of time where the cache would be unavailable?
Are these concerns relevant and if using CacheItemRemovedCallback to reload your cache is a bad idea, then when is it useful?
If you're going to reload, be sure to check the CacheItemRemovedReason. I recently had to debug an issue where a developer decided they should immediately re-populate the cache in this method, and under low memory conditions, it basically sat chewing up CPU while it got stuck in a loop of building the cache objects, adding them to the cache, expiring, repeat.
The callback is fired after the item is removed.
From everyone's responses and from further reading I have come to the following conclusion:
My concerns are valid. Using CacheItemRemovedCallback to refresh cached items is not a good idea. The only practical use for this callback seems to be logging information about when your cached item is removed.
It seems that CacheItemUpdateCallback is the more appropriate way of refreshing your cache on a regular interval.
Ultimately, I have decided not to use either of these calls. Instead I will write a service action so the database import job can notify my application when it needs to refresh its data. This avoids using a timed refresh altogether.
Yes, there is a chance that the method could be fired for a variety of reasons. However, whether to reload the cache immediately or wait depends on what is best for the typical use case in your application.
CacheItemRemovedCallback does indeed fire after the item is removed from the cache. Right before the item is to be removed, you can use the CacheItemUpdateCallback method to decide whether or not you want to flush the cache at that time. There may be good reasons to delay flushing the cache, for example when you currently have users in your application and rebuilding the cache takes a long time.
Generally speaking, the best practice is to test that your cached item actually exists in the cache before using its data. If the data doesn't exist, you can rebuild the cache at that time (causing a slightly longer response for the user) or choose to do something else.
This really isn't so much a cache of individual values as it is a snapshot of an entire dataset. As such, you don't benefit from using the Cache class here.
I'd recommend loading a static collection on startup and replacing it every 24 hours by setting a timer. The idea would be to create a new collection and atomically assign it, as the old one may still be in use and we want it to remain self-consistent.
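One way to sketch that idea (the names and 24-hour interval are illustrative; the real loader would run the expensive daily query). Because reference assignments are atomic in .NET, readers always see either the old snapshot or the new one, never a half-built collection:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Hypothetical snapshot holder: a timer builds a fresh dataset off to
// the side every 24 hours and then publishes it with one atomic write.
public static class DataSnapshot
{
    private static IReadOnlyList<string> current = LoadFromDatabase();
    private static readonly Timer refreshTimer =
        new Timer(_ => Refresh(), null, TimeSpan.FromHours(24), TimeSpan.FromHours(24));

    // Readers get a complete, self-consistent collection.
    public static IReadOnlyList<string> Current => Volatile.Read(ref current);

    private static void Refresh()
    {
        var fresh = LoadFromDatabase();     // build completely off to the side
        Volatile.Write(ref current, fresh); // then swap it in atomically
    }

    private static IReadOnlyList<string> LoadFromDatabase()
    {
        // Placeholder for the expensive daily query.
        return new List<string> { "row1", "row2" };
    }
}
```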
I'm trying to cache a price value using HttpRuntime.Cache.Insert(), but it only appears to hold the value for a couple of hours or so before clearing it out. What am I doing wrong? I want the value to stay in the cache for 3 days.
HttpRuntime.Cache.Insert(CacheName, Price, null, DateTime.Now.AddDays(3), TimeSpan.Zero);
Short answer
Your application pool or website is being shutdown too soon. Extend the idle timeout on the site, extend the application pool lifetime for the pool running the site. Raise the memory allocation and request limits.
Full answer
If you want to know when and why something is being removed from the cache, you need to log the item removal using the CacheItemRemovedCallback option on the insertion... Then you can log the reason using the CacheItemRemovedReason argument. You can thus log the reason as one of the four listed reasons:
Removed The item is removed from the cache by a Remove method call or by an Insert method call that specified the same key.
Expired The item is removed from the cache because it expired.
Underused The item is removed from the cache because the system removed it to free memory.
DependencyChanged The item is removed from the cache because the cache dependency associated with it changed.
Typically, you will find Expired and Underused being the reasons for items that have no explicit Remove calls made against the cache and no dependencies.
You will likely find out, while tracing through this fun stuff, that your items are not being expired or underused. Rather, I suspect you'll find that the AppDomain is getting unloaded.
One way this can happen is when the web.config (or the bin directory, .aspx files, etc.) gets changed. For more information on when this occurs, see the Application Restarts section of this page. When that happens, the currently pending requests are drained, the cache is emptied, and the AppDomain is unloaded. You can detect this situation by checking AppDomain.IsFinalizingForUnload and logging that during the callback.
Another reason for the AppDomain to recycle is when IIS decides to recycle the AppPool for any of the reasons it has been configured with. Examples of that are xxx memory has been allocated over the lifetime, yyy seconds of runtime for the AppPool, ttt scheduled recycle time, or iiii idle time (no requests incoming). For further details check this article for IIS6 or this article for IIS7
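A sketch of the logging approach described above, using the Insert overload that takes a removal callback (the cache key, value, and logging target are illustrative):

```csharp
using System;
using System.Web.Caching;

// Sketch (assumes an ASP.NET-hosted app or a System.Web reference):
// insert with a removal callback so the CacheItemRemovedReason can be
// logged when the entry disappears.
public static class PriceCache
{
    public static void InsertPrice(decimal price)
    {
        System.Web.HttpRuntime.Cache.Insert(
            "Price",
            price,
            null,                        // no dependencies
            DateTime.Now.AddDays(3),     // absolute expiration
            Cache.NoSlidingExpiration,
            CacheItemPriority.Default,
            OnRemoved);
    }

    private static void OnRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        // reason is Removed, Expired, Underused, or DependencyChanged.
        // Logging the AppDomain unload state alongside it helps spot recycles.
        System.Diagnostics.Trace.WriteLine(
            $"Cache key '{key}' removed: {reason}; " +
            $"unloading={AppDomain.CurrentDomain.IsFinalizingForUnload()}");
    }
}
```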
The Cache object doesn't guarantee that it will hold onto cached objects at all, much less for the full amount of time that you suggest.
If you want to more strongly encourage it to do so, you can set CacheItemPriority.High or CacheItemPriority.NotRemovable when you insert an item into the Cache. With the default Normal priority, the runtime has a fairly aggressive policy of letting go of objects when memory pressure increases.
On top of that, by default the IIS AppPool will recycle once/day or so, which will clear everything in the Cache.
The docs http://msdn.microsoft.com/en-us/library/4y13wyk9.aspx say that Cache.NoSlidingExpiration must be used if using an absolute expiration.
HttpRuntime.Cache.Insert(CacheName, Price, null, DateTime.Now.AddDays(3), Cache.NoSlidingExpiration);
This may not be your problem, though; I just found that Cache.NoSlidingExpiration should be the same as TimeSpan.Zero.
Next I would check that your app pool isn't recycling, and check how much cache you are using. If it's a high-traffic site using a lot of memory (i.e. in-memory cache), then cache items will be expired as the memory is needed for other things.
Also check the last comment here: http://bytes.com/topic/net/answers/717129-c-asp-net-page-cache-getting-removed-too-soon — someone seems to have found a solution to your problem.
Check the recycle time on your App Pool.
By default, items added to the cache have no set expiration, so this is definitely something outside the cache. I agree with Josh, you should check the recycle time on your App Pool.
Check out this page to see an example of how you can add a delegate to let you know exactly when your item is being removed from the cache. This might help you in troubleshooting if it's not your App Pool:
http://msdn.microsoft.com/en-us/library/system.web.caching.cache.add.aspx
I know that most people recommend using HttpRuntime.Cache because it has more flexibility... etc. But what if you want the object to persist in the cache for the life of the application? Is there any big downside to using the Application[] object to cache things?
As long as you don't abuse the application state, I don't see a problem in using it for items that you don't want to expire.
Alternatively, I would probably use a static variable near the code that uses it. That way you avoid going through HttpApplicationState and being forced to take a reference to System.Web just to access your data.
But be sure to think through how you use the object(s) that you store in HttpApplicationState. If it's a DataSet which you keep adding data to on each request, then at some point you will eat up too much memory on the web server. The same can happen if you keep adding items to HttpApplicationState as you process requests; at some point you will force the application to restart.
That's probably the advantage of using Cache in your situation. Consuming larger amounts of memory isn't as fatal, because you allow ASP.NET to release the items in your cache when memory becomes scarce.
Application state is effectively superseded by Cache. If you need something with application scope, you should either create it as a static member of a class or use the Cache. If you want to go the Cache route but don't ever want the item to expire, use the CacheItemPriority.NotRemovable option when you Insert the value into the cache. Note that it is possible to use this priority and still use cache dependencies, for instance if your data depends on something in the file system. All the CacheItemPriority does is prevent HttpRuntime.Cache from clearing the item when it feels memory pressure and uses its least-recently-used algorithm to purge items that aren't seeing much use.
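A sketch of combining NotRemovable with a file dependency; the cache key, the loader, and the idea of reading a settings file are all hypothetical stand-ins:

```csharp
using System;
using System.Web.Caching;

// Sketch: a cache entry exempt from memory-pressure eviction, but still
// invalidated if a (hypothetical) settings file changes on disk.
public static class CacheConfig
{
    public static void InsertSettings(string settingsPath)
    {
        System.Web.HttpRuntime.Cache.Insert(
            "settings",
            System.IO.File.ReadAllText(settingsPath), // stand-in for real loading
            new CacheDependency(settingsPath),        // removed if the file changes
            Cache.NoAbsoluteExpiration,
            Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable,           // exempt from LRU scavenging
            null);
    }
}
```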
Use the cache when you want items to automatically expire or be reclaimed when memory is scarce. Otherwise use static variables if you can, because they will yield better performance than digging through the ApplicationState collection. I'm not exactly sure when ApplicationState would be the right choice, but there are sure to be some cases.
I've got a nice little class built that acts as a cache. Each item has an expiration TimeSpan or DateTime. Each time an attempt to access an item in the cache is made, the item's expiration is checked, and if it's expired, the item is removed from the cache and nothing is returned.
That's great for objects that are accessed frequently, but if an item is put in the cache and never accessed again, it's never removed, even though it's expired.
What's a good methodology for expiring such items from the cache?
Should I have a background thread infinitely enumerating every item in the cache to check if it's expired?
The best code is no code. Use the ASP.NET cache instead. You can reference it as System.Web.HttpRuntime.Cache in any application, not just web applications.
In my experience, maintaining a custom caching mechanism became more trouble than it was worth. There are several libraries out there that have already solved these problems. I would suggest using one of them. A popular one in .Net is the Enterprise Library, although I have limited experience with its caching abilities.
If you must use a custom caching mechanism, then I see no problem with the watchful-thread idea you suggested. That is, if your application is a server-based application and not a web app. If it's a web app, you already have built-in sliding expiration. You can then just wrap it in a strongly typed wrapper to avoid referencing cache items by key each time.
You can implement an LRU (least recently used) strategy: keep your items sorted by access time, and when a new item is inserted into a full cache, evict the item that is last in that list. See Cache algorithms on Wikipedia.
If you want items to expire immediately, I would still only do that when they are accessed, i.e. when the cached object is accessed and its time has expired, refetch it.
You could also, on any change to the cache, (re-)start a Timer with its interval set to the closest expiry timestamp. This will not be accurate to the millisecond and depends on a message pump running, but it is not very resource-demanding.
Harald Scheirich's answer is better though, if you don't mind that objects are hanging around forever, when the cache is not updated.
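A rough sketch of that single-timer approach (names are illustrative): one Timer is re-armed to the earliest expiry after every change or sweep, and reads also check expiry so stale entries are never returned even if the sweep hasn't fired yet.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;

public class TimedCache
{
    private readonly object gate = new object();
    private readonly Dictionary<string, (object Value, DateTime Expiry)> items = new();
    private readonly Timer timer;

    public TimedCache()
    {
        // Created disarmed; Rearm() schedules the next sweep.
        timer = new Timer(_ => Sweep(), null, Timeout.Infinite, Timeout.Infinite);
    }

    public void Set(string key, object value, TimeSpan ttl)
    {
        lock (gate)
        {
            items[key] = (value, DateTime.UtcNow + ttl);
            Rearm();
        }
    }

    public object Get(string key)
    {
        lock (gate)
        {
            // Expiry is checked on read too, so a late sweep can't leak stale data.
            return items.TryGetValue(key, out var e) && e.Expiry > DateTime.UtcNow
                ? e.Value : null;
        }
    }

    private void Sweep()
    {
        lock (gate)
        {
            var now = DateTime.UtcNow;
            foreach (var key in items.Where(p => p.Value.Expiry <= now)
                                     .Select(p => p.Key).ToList())
                items.Remove(key);
            Rearm();
        }
    }

    private void Rearm()
    {
        if (items.Count == 0) return;
        var next = items.Values.Min(e => e.Expiry) - DateTime.UtcNow;
        if (next < TimeSpan.Zero) next = TimeSpan.Zero;
        timer.Change(next, Timeout.InfiniteTimeSpan);
    }
}
```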
You could clear suitably old items out of the cache on the first access that occurs more than one minute after the last flush:

private DateTime nextFlush = DateTime.Now.AddMinutes(1);

public object getItem(object key)
{
    DateTime now = DateTime.Now;
    if (now > nextFlush)
    {
        Flush();                       // removes all expired entries
        nextFlush = now.AddMinutes(1);
    }
    return fetchItem(key);
}