I've got a simple object being cached like this:
_myCache.Add(someKey, someObj, policy);
Where _myCache is declared as ObjectCache (but injected via DI as MemoryCache.Default), someObj is the object I'm adding, and policy is a CacheItemPolicy.
If I have a CacheItemPolicy like this:
var policy = new CacheItemPolicy
{
Priority = CacheItemPriority.Default,
SlidingExpiration = TimeSpan.FromHours(1)
};
It means the item expires after an hour without being accessed. Cool.
But the unlucky first user after that hour has to take the hit of a cache miss.
Is there any way I can hook into an "expired" event/delegate and refresh the cache manually?
I see there is a mention of CacheEntryChangeMonitor, but I can't find any meaningful documentation or examples on how to use it in my scenario.
PS. I know I can use CacheItemPriority.NotRemovable and expire the item manually, but I can't do that in my current example because the cached data is a bit too complicated (e.g. I would need to "invalidate" it in about 10 different places in my code).
Any ideas?
There's a property on the CacheItemPolicy called RemovedCallback, which is of type CacheEntryRemovedCallback. Not sure why they didn't go the standard event route, but it should do what you need.
http://msdn.microsoft.com/en-us/library/system.runtime.caching.cacheitempolicy.removedcallback.aspx
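For illustration, a minimal sketch of wiring RemovedCallback up so that an expired entry gets re-added straight away (LoadSomeObj is a hypothetical loader standing in for however you actually build the cached value):
using System;
using System.Runtime.Caching;

public static class SelfRefreshingCache
{
    private static readonly ObjectCache Cache = MemoryCache.Default;

    public static void Add(string key, object value)
    {
        var policy = new CacheItemPolicy
        {
            SlidingExpiration = TimeSpan.FromHours(1),
            RemovedCallback = args =>
            {
                // Only refresh when the entry aged out, not when it was
                // explicitly removed or evicted under memory pressure.
                if (args.RemovedReason == CacheEntryRemovedReason.Expired)
                {
                    Add(args.CacheItem.Key, LoadSomeObj(args.CacheItem.Key));
                }
            }
        };

        Cache.Add(key, value, policy);
    }

    // Hypothetical loader; replace with however someObj is really built.
    private static object LoadSomeObj(string key)
    {
        return new object();
    }
}
One caveat: the callback fires after the entry has already been removed, so there is a brief window where the key is missing; if that matters, the update-callback approach discussed below avoids it.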
Late to the party with this one, but I've just noticed an interesting difference between the CacheItemUpdate and CacheItemRemoved callbacks.
http://msdn.microsoft.com/en-us/library/system.web.caching.cacheitemupdatereason.aspx
In particular this comment:
Unlike the CacheItemRemovedReason enumeration, this enumeration does
not include the Removed or Underused values. Updateable cache items
are not removable and can thus never be automatically removed by
ASP.NET even if there is a need to free memory.
Here is how I use the RemovedCallback when a cached item expires.
Sharing it for anyone who's interested.
public static void SetObjectToCache<T>(string cacheItemName, T obj, long expireTime)
{
    ObjectCache cache = MemoryCache.Default;

    // Drop any existing entry first (Set would overwrite it anyway).
    if (cache.Contains(cacheItemName))
    {
        cache.Remove(cacheItemName);
    }

    CacheItemPolicy policy = new CacheItemPolicy()
    {
        AbsoluteExpiration = DateTimeOffset.Now.AddMilliseconds(expireTime),
        RemovedCallback = new CacheEntryRemovedCallback(CacheRemovedCallback)
    };

    cache.Set(cacheItemName, obj, policy);
}
public static void CacheRemovedCallback(CacheEntryRemovedArguments arguments)
{
    // App-specific context pulled from the current principal (custom extension methods).
    var configServerIpAddress = Thread.CurrentPrincipal.ConfigurationServerIpAddress();
    long configId = Thread.CurrentPrincipal.ConfigurationId();
    int userId = Thread.CurrentPrincipal.UserId();

    var tagInfoService = new TagInfoService();
    string returnCode = string.Empty;

    if (arguments.CacheItem.Key.Contains("DatatableTags_"))
    {
        // Reload the expired data on a background thread and re-cache it.
        Task.Run(() =>
        {
            // app-specific refresh logic goes here
        });
    }
}
I am using MemoryCache in one of my projects to cache some keys and values. I have attached a listener to my MemoryCache so that whenever it expires an item, that listener gets called and I can remove the key from my CacheKeys HashSet as well. Basically I want CacheKeys to stay consistent with what is in the MemoryCache.
I read about MemoryCache and it looks like I can use RegisterPostEvictionCallback as shown below:
private static readonly HashSet<string> CacheKeys = new HashSet<string>();

private bool CacheEntries<T>(MemoryCache memoryCache, string cacheKey, T value, Configuration config, Action<MemoryCacheEntryOptions> otherOptions = null)
{
    int minutes = randomGenerator.Next(config.LowTime, config.HighTime);

    MemoryCacheEntryOptions options = new MemoryCacheEntryOptions()
    {
        Size = config.Size,
        Priority = config.Priority,
        AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(minutes)
    };

    // attaching listener
    options.RegisterPostEvictionCallback(callback: EvictionCallback, state: this);

    if (otherOptions != null) otherOptions(options);

    CacheKeys.Add(cacheKey);
    memoryCache.Set<T>(cacheKey, value, options);
    return true;
}

private void EvictionCallback(object key, object value, EvictionReason reason, object state)
{
    // the callback hands the key back as object, so cast it for the HashSet
    CacheKeys.Remove((string)key);
    var message = $"Entry was evicted. Reason: {reason}.";
    Console.WriteLine(message);
}
It looks like there is an issue where items don't expire automatically, as per this thread.
So, to avoid the issue mentioned in that thread, do I need to remove AbsoluteExpirationRelativeToNow and use a CancellationTokenSource here?
If yes, how should I go about making that change? I don't want to alter the behaviour of my original code, so if I swap AbsoluteExpirationRelativeToNow for a CancellationTokenSource, will it behave any differently from what my original code does?
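For context, the token-based approach discussed in that thread would look roughly like this inside CacheEntries (just a sketch reusing the fields from the code above; it additionally needs the System.Threading and Microsoft.Extensions.Primitives namespaces for CancellationTokenSource and CancellationChangeToken):
var cts = new CancellationTokenSource(TimeSpan.FromMinutes(minutes));

MemoryCacheEntryOptions options = new MemoryCacheEntryOptions()
{
    Size = config.Size,
    Priority = config.Priority
};

// The token cancels itself after 'minutes', which actively expires the entry
// and fires the eviction callback, rather than waiting for the entry to be
// touched again.
options.AddExpirationToken(new CancellationChangeToken(cts.Token));
options.RegisterPostEvictionCallback((evictedKey, evictedValue, reason, state) =>
{
    CacheKeys.Remove((string)evictedKey);
    cts.Dispose();
});

CacheKeys.Add(cacheKey);
memoryCache.Set(cacheKey, value, options);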
I'm new to C# and trying to understand how to work with Lazy.
I need to handle concurrent requests by waiting for the result of an already-running operation. Requests for data may come in simultaneously with the same or different credentials.
For each unique set of credentials there can be at most one GetDataInternal call in progress, with the result from that one call returned to all queued waiters when it is ready.
private readonly ConcurrentDictionary<Credential, Lazy<Data>> Cache
    = new ConcurrentDictionary<Credential, Lazy<Data>>();

public Data GetData(Credential credential)
{
    // This instance will be thrown away if a cached
    // value with our "credential" key already exists.
    Lazy<Data> newLazy = new Lazy<Data>(
        () => GetDataInternal(credential),
        LazyThreadSafetyMode.ExecutionAndPublication
    );

    Lazy<Data> lazy = Cache.GetOrAdd(credential, newLazy);
    bool added = ReferenceEquals(newLazy, lazy); // If true, we won the race.

    Data data;

    try
    {
        // Wait for the GetDataInternal call to complete.
        data = lazy.Value;
    }
    finally
    {
        // Only the thread which created the cache value
        // is allowed to remove it, to prevent races.
        if (added)
        {
            Cache.TryRemove(credential, out lazy);
        }
    }

    return data;
}
Is that the right way to use Lazy, or is my code not safe?
Update:
Is it a good idea to start using MemoryCache instead of ConcurrentDictionary? If yes, how do I create the key, given that MemoryCache.Default.AddOrGetExisting() takes a string key?
This is correct. This is a standard pattern (except for the removal) and it's a really good cache because it prevents cache stampeding.
I'm not sure you want to remove from the cache when the computation is done because the computation will be redone over and over that way. If you don't need the removal you can simplify the code by basically deleting the second half.
Note that Lazy has a problem in the case of an exception: the exception is stored and the factory will never be re-executed. The problem persists forever (until a human restarts the app). In my mind this makes Lazy completely unsuitable for production use in most cases.
This means that a transient error such as a network issue can render the app unavailable permanently.
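If you keep entries cached long-term (i.e. you drop the removal in the finally block), one common mitigation, sketched here against the question's own fields, is to evict the Lazy as soon as its factory throws so that a later call gets a fresh attempt:
public Data GetData(Credential credential)
{
    Lazy<Data> lazy = Cache.GetOrAdd(credential, key => new Lazy<Data>(
        () => GetDataInternal(key),
        LazyThreadSafetyMode.ExecutionAndPublication));

    try
    {
        return lazy.Value;
    }
    catch
    {
        // The Lazy has captured the exception; drop it from the dictionary so
        // the next caller builds a fresh one instead of rethrowing it forever.
        Cache.TryRemove(credential, out _);
        throw;
    }
}
Strictly speaking, that plain TryRemove could, in a rare race, also evict a newer healthy entry added after the failure; if that matters, remove by key and value instead.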
This answer is directed at the updated part of the original question. See @usr's answer regarding thread-safety with Lazy<T> and the potential pitfalls.
I would like to know how to avoid using ConcurrentDictionary<TKey, TValue> and start
using MemoryCache? How to implement
MemoryCache.Default.AddOrGetExisting()?
If you're looking for a cache which has a mechanism for auto expiry, then MemoryCache is a good choice if you don't want to implement the mechanics yourself.
In order to utilize MemoryCache, which forces a string key, you'll need to create a unique string representation of a credential, perhaps a given user id or a unique username?
If you can, create an override of ToString that returns your unique identifier, or simply use that property directly, and utilize MemoryCache like this:
public class Credential
{
    public Credential(int userId)
    {
        UserId = userId;
    }

    public int UserId { get; private set; }
}
And now your method will look like this:
private const int EvictionIntervalMinutes = 10;

public Data GetData(Credential credential)
{
    Lazy<Data> newLazy = new Lazy<Data>(
        () => GetDataInternal(credential), LazyThreadSafetyMode.ExecutionAndPublication);

    CacheItemPolicy evictionPolicy = new CacheItemPolicy
    {
        AbsoluteExpiration = DateTimeOffset.UtcNow.AddMinutes(EvictionIntervalMinutes)
    };

    var result = MemoryCache.Default.AddOrGetExisting(
        new CacheItem(credential.UserId.ToString(), newLazy), evictionPolicy);

    // Value is null when our newLazy was just added; otherwise it's the existing entry's Lazy.
    return result?.Value != null ? ((Lazy<Data>)result.Value).Value : newLazy.Value;
}
MemoryCache provides you with a thread-safe implementation; this means that two threads accessing AddOrGetExisting will only cause a single cache item to be added or retrieved. Further, Lazy<T> with ExecutionAndPublication guarantees only a single unique invocation of the factory method.
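If you want to see that guarantee for yourself, here's a throwaway, self-contained variant of the method above (simple types instead of Credential/Data, plus an invocation counter); with the stampede protection in place it should print 1:
using System;
using System.Runtime.Caching;
using System.Threading;
using System.Threading.Tasks;

class StampedeDemo
{
    static int factoryCalls;

    static string GetData(int userId)
    {
        var newLazy = new Lazy<string>(
            () => GetDataInternal(userId), LazyThreadSafetyMode.ExecutionAndPublication);

        var policy = new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.UtcNow.AddMinutes(10)
        };

        // Returns the previously cached Lazy, or null if ours was just added.
        var existing = (Lazy<string>)MemoryCache.Default.AddOrGetExisting(
            userId.ToString(), newLazy, policy);

        return (existing ?? newLazy).Value;
    }

    static string GetDataInternal(int userId)
    {
        Interlocked.Increment(ref factoryCalls);
        Thread.Sleep(200); // stand-in for the expensive call
        return "data for " + userId;
    }

    static void Main()
    {
        Parallel.For(0, 50, _ => GetData(42));
        Console.WriteLine(factoryCalls); // expected: 1
    }
}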
The following test fails intermittently. It caches an item in MemoryCache with an absolute expiration time and an update callback that should be called before the item is removed. However, sometimes the callback is invoked before the test finishes, and sometimes it is not invoked at all.
With a large enough buffer time it will always be invoked at least once, but that does not serve my purposes, since I require that the cache always attempts to update the data before it expires.
Now, in my real-world scenario I will not have a 10-second expiration time and granularity, but it still bothers me that this test fails intermittently.
Anyone have thoughts on why this is happening?
Note: Also intermittently fails with 60 second expiry and 5 second buffer.
using System;
using System.Runtime.Caching;
using System.Threading.Tasks;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class MemoryCacheTest
{
    private const double ExpiryInSeconds = 10;
    private const double ExpiryBufferInSeconds = 5;

    private readonly object updateItemCounterLock = new object();
    private int updateItemCounter = 0;

    [TestMethod]
    public async Task MemoryCacheUpdateTest()
    {
        // Set item in cache with the absolute expiration defined above
        MemoryCache cache = MemoryCache.Default;
        CacheItem cacheItem = new CacheItem("key", "value");
        CacheItemPolicy cacheItemPolicy = new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.Now + TimeSpan.FromSeconds(ExpiryInSeconds),
            UpdateCallback = new CacheEntryUpdateCallback(this.UpdateItem)
        };
        cache.Set(cacheItem, cacheItemPolicy);

        // Delay for the absolute expiration time + buffer
        await Task.Delay(TimeSpan.FromSeconds(ExpiryInSeconds) + TimeSpan.FromSeconds(ExpiryBufferInSeconds));

        // Test that the update callback was invoked once
        Assert.AreEqual(1, updateItemCounter);
    }

    // Increments the updateItemCounter
    private void UpdateItem(CacheEntryUpdateArguments args)
    {
        lock (updateItemCounterLock)
        {
            updateItemCounter++;
        }
    }
}
Minor aside: wrapping the method in new CacheEntryUpdateCallback(...) is redundant. You can just write:
UpdateCallback = this.UpdateItem instead.
Since there was no solution to this question, I abstracted the MemoryCache methods that I needed into an interface and tested against that. At that point the test became invalid because I would have just been testing my own implementation of the interface.
I'm trying to use a MemoryCache in .NET 4.5 to keep track of and automatically update various items, but it seems that no matter what I set as the AbsoluteExpiration, the item only ever expires after 15 seconds or more.
I want the cache items to expire every 5 seconds, but they always take at least 15 seconds to expire; if I push the expiration time out further, it ends up being something like 15 seconds plus my refresh interval, but never less than 15 seconds.
Is there some internal timer resolution that I'm not seeing? I looked through a bit of the reflected System.Runtime.Caching.MemoryCache code and nothing stood out to me, and I haven't been able to find anybody else who has this issue out on the internet.
I have a very basic example below that illustrates the problem.
What I want is for CacheEntryUpdate to be hit every 5 seconds or so and update with new data, but, as I've said, it only ever gets hit in 15+ seconds.
static MemoryCache MemCache;
static int RefreshInterval = 5000;

protected void Page_Load(object sender, EventArgs e)
{
    if (MemCache == null)
        MemCache = new MemoryCache("MemCache");

    if (!MemCache.Contains("cacheItem"))
    {
        var cacheObj = new object();
        var policy = new CacheItemPolicy
        {
            UpdateCallback = new CacheEntryUpdateCallback(CacheEntryUpdate),
            AbsoluteExpiration = DateTimeOffset.UtcNow.AddMilliseconds(RefreshInterval)
        };
        var cacheItem = new CacheItem("cacheItem", cacheObj);
        MemCache.Set("cacheItem", cacheItem, policy);
    }
}

private void CacheEntryUpdate(CacheEntryUpdateArguments args)
{
    var cacheItem = MemCache.GetCacheItem(args.Key);
    var cacheObj = cacheItem.Value;
    cacheItem.Value = cacheObj;
    args.UpdatedCacheItem = cacheItem;

    var policy = new CacheItemPolicy
    {
        UpdateCallback = new CacheEntryUpdateCallback(CacheEntryUpdate),
        AbsoluteExpiration = DateTimeOffset.UtcNow.AddMilliseconds(RefreshInterval)
    };
    args.UpdatedCacheItemPolicy = policy;
}
I've figured it out. There's an internal static readonly TimeSpan on System.Runtime.Caching.CacheExpires called _tsPerBucket that is hardcoded at 20 seconds.
Apparently, this field is what's used on the internal timers that run and check to see if cache items are expired.
I'm working around this by overwriting the value using reflection and clearing the default MemoryCache instance to reset everything. It seems to work, even if it is a giant hack.
Here's the updated code:
static MemoryCache MemCache;
static int RefreshInterval = 1000;

protected void Page_Load(object sender, EventArgs e)
{
    if (MemCache == null)
    {
        // Override the internal 20-second CacheExpires._tsPerBucket interval via reflection
        const string assembly = "System.Runtime.Caching, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a";
        var type = Type.GetType("System.Runtime.Caching.CacheExpires, " + assembly, true, true);
        var field = type.GetField("_tsPerBucket", BindingFlags.Static | BindingFlags.NonPublic);
        field.SetValue(null, TimeSpan.FromSeconds(1));

        // Clear the cached MemoryCache.Default instance so it is recreated
        // (and picks up the new interval) on its next use
        type = typeof(MemoryCache);
        field = type.GetField("s_defaultCache", BindingFlags.Static | BindingFlags.NonPublic);
        field.SetValue(null, null);

        MemCache = new MemoryCache("MemCache");
    }

    if (!MemCache.Contains("cacheItem"))
    {
        var cacheObj = new object();
        var policy = new CacheItemPolicy
        {
            UpdateCallback = new CacheEntryUpdateCallback(CacheEntryUpdate),
            AbsoluteExpiration = DateTimeOffset.UtcNow.AddMilliseconds(RefreshInterval)
        };
        var cacheItem = new CacheItem("cacheItem", cacheObj);
        MemCache.Set("cacheItem", cacheItem, policy);
    }
}

private void CacheEntryUpdate(CacheEntryUpdateArguments args)
{
    var cacheItem = MemCache.GetCacheItem(args.Key);
    var cacheObj = cacheItem.Value;
    cacheItem.Value = cacheObj;
    args.UpdatedCacheItem = cacheItem;

    var policy = new CacheItemPolicy
    {
        UpdateCallback = new CacheEntryUpdateCallback(CacheEntryUpdate),
        AbsoluteExpiration = DateTimeOffset.UtcNow.AddMilliseconds(RefreshInterval)
    };
    args.UpdatedCacheItemPolicy = policy;
}
Would you be willing/able to change from the older System.Runtime.Caching to the newer Microsoft.Extensions.Caching? Version 1.x supports netstandard1.3 and net451. If so, the improved API supports the usage you describe without any reflection hackery.
The MemoryCacheOptions object has an ExpirationScanFrequency property that lets you control how often the cache scans for expired entries; see https://learn.microsoft.com/en-us/dotnet/api/microsoft.extensions.caching.memory.memorycacheoptions.expirationscanfrequency?view=aspnetcore-2.0
Be aware that there is no longer timer-based expiration (a deliberate performance decision); instead, memory pressure or calling one of the Get() based methods for a cached item triggers expiration. However, you can force time-based expiration using cancellation tokens; see this SO answer for an example: https://stackoverflow.com/a/47949111/3140853.
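For illustration, setting ExpirationScanFrequency looks like this (a minimal sketch; the five-second values are arbitrary):
using System;
using Microsoft.Extensions.Caching.Memory;

class ScanFrequencyExample
{
    static void Main()
    {
        var cache = new MemoryCache(new MemoryCacheOptions
        {
            // How often the cache scans for expired entries when it is used;
            // expiration is still evaluated lazily (on access or when a scan
            // runs), not by a dedicated background timer.
            ExpirationScanFrequency = TimeSpan.FromSeconds(5)
        });

        cache.Set("key", "value", new MemoryCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(5)
        });
    }
}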
To MatteoSp: the pollingInterval in the configuration (or in the NameValueCollection passed to the constructor) is a different timer. It is the interval at which the cache uses the two other config properties to decide whether memory is at a level that requires entries to be removed via the Trim method.
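For completeness, on the older System.Runtime.Caching MemoryCache that polling interval is set through the constructor's NameValueCollection (or the equivalent config section); a sketch with made-up values:
using System.Collections.Specialized;
using System.Runtime.Caching;

var settings = new NameValueCollection
{
    { "pollingInterval", "00:00:10" },        // how often the memory limits are checked
    { "cacheMemoryLimitMegabytes", "100" },   // trim if the cache itself exceeds 100 MB
    { "physicalMemoryLimitPercentage", "90" } // trim if machine memory use exceeds 90%
};

var cache = new MemoryCache("myCustomCache", settings);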
An updated version based on @Jared's answer. Instead of modifying the default MemoryCache instance, this creates a new one.
using System;
using System.Reflection;
using System.Runtime.Caching;

class FastExpiringCache
{
    public static MemoryCache Default { get; } = Create();

    private static MemoryCache Create()
    {
        MemoryCache instance = null;

        Assembly assembly = typeof(CacheItemPolicy).Assembly;
        Type type = assembly.GetType("System.Runtime.Caching.CacheExpires");
        if (type != null)
        {
            FieldInfo field = type.GetField("_tsPerBucket", BindingFlags.Static | BindingFlags.NonPublic);
            if (field != null && field.FieldType == typeof(TimeSpan))
            {
                // Shorten the bucket interval just long enough for the new cache to
                // capture it during construction, then put the original value back.
                TimeSpan originalValue = (TimeSpan)field.GetValue(null);
                field.SetValue(null, TimeSpan.FromSeconds(3));
                instance = new MemoryCache("FastExpiringCache");
                field.SetValue(null, originalValue); // reset to original value
            }
        }

        return instance ?? new MemoryCache("FastExpiringCache");
    }
}
I have written my own custom change monitor class for the .NET MemoryCache. It seems to initialize fine, but when I attempt to add it to the cache, it throws an InvalidOperationException: "The method has already been invoked, and can only be invoked once."
My change monitor class:
using System;
using System.Runtime.Caching;
using System.Timers;

internal class MyChangeMonitor : ChangeMonitor
{
    private Timer _timer;
    private readonly string _uniqueId;
    private readonly TypeAsOf _typeAsOf;
    private readonly string _tableName;

    public MyChangeMonitor(TypeAsOf typeAsOf, string tableName)
    {
        bool initComplete = false;
        try
        {
            _typeAsOf = typeAsOf;
            _tableName = tableName;
            _uniqueId = Guid.NewGuid().ToString();

            // Poll for changes every 5 minutes.
            TimeSpan ts = new TimeSpan(0, 0, 5, 0, 0);
            _timer = new Timer { Interval = ts.TotalMilliseconds };
            _timer.Elapsed += CheckForChanges;
            _timer.Enabled = true;
            _timer.Start();

            initComplete = true;
        }
        finally
        {
            base.InitializationComplete();
            if (!initComplete)
                Dispose(true);
        }
    }

    // ChangeMonitor requires this override.
    public override string UniqueId
    {
        get { return _uniqueId; }
    }

    void CheckForChanges(object sender, ElapsedEventArgs e)
    {
        //check for changes, if different
        base.OnChanged(_typeAsOf);
    }

    // ChangeMonitor requires this override.
    protected override void Dispose(bool disposing)
    {
        if (disposing && _timer != null)
        {
            _timer.Dispose();
            _timer = null;
        }
    }
}
The code I use to create the cache policy and add the key/value pair to the cache:
CacheItemPolicy policy = new CacheItemPolicy
{
    UpdateCallback = OnCacheEntryUpdateCallback
};
policy.AbsoluteExpiration = SystemTime.Today.AddHours(24);

//monitor the table for changes
string tableName = QuickRefreshItems[type];
MyChangeMonitor cm = new MyChangeMonitor(typeAsOf, tableName);
policy.ChangeMonitors.Add(cm);
cm.NotifyOnChanged(OnRefreshQuickLoadCacheItems);

MyCache.Set(cacheKey, value, policy);
The Set call throws the invalid operation exception which is weird because, according to the MSDN documentation, it only throws the ArgumentNull, Argument, ArgumentOutOfRange, and NotSupported exceptions.
I am sure that I must be making a simple mistake. But it's hard to find good documentation or examples on writing your own custom change monitor. Any help would be appreciated.
I know the comments have the answer, but I wanted it to be more obvious...
When a ChangeMonitor is used, it will fire immediately if the cache entry does not exist.
MSDN documentation states it this way:
A monitored entry is considered to have changed for any of the
following reasons:
A) The key does not exist at the time of the call to the
CreateCacheEntryChangeMonitor method. In that case, the resulting
CacheEntryChangeMonitor instance is immediately set to a changed
state. This means that when code subsequently binds a
change-notification callback, the callback is triggered immediately.
B) The associated cache entry was removed from the cache. This can
occur if the entry is explicitly removed, if it expires, or if it is
evicted to recover memory
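Case A is easy to see in a few lines (a throwaway sketch with a made-up key):
using System;
using System.Runtime.Caching;

class ImmediateFireDemo
{
    static void Main()
    {
        var cache = MemoryCache.Default;

        // "missingKey" is not in the cache, so the monitor starts out in the
        // "changed" state (case A above)...
        CacheEntryChangeMonitor monitor =
            cache.CreateCacheEntryChangeMonitor(new[] { "missingKey" });

        // ...and binding a callback therefore triggers it immediately.
        monitor.NotifyOnChanged(state => Console.WriteLine("Fired immediately."));
    }
}
The practical consequence is that if such a monitor sits in a CacheItemPolicy, the associated entry is evicted as soon as it is added, so the monitored keys generally need to exist beforehand.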
I've had the exact same error:
Source: System.Runtime.Caching
Exception type: System.InvalidOperationException
Message: The method has already been invoked, and can only be invoked once.
Stacktrace: at System.Runtime.Caching.ChangeMonitor.NotifyOnChanged(OnChangedCallback onChangedCallback)
at System.Runtime.Caching.MemoryCacheEntry.CallNotifyOnChanged()
at System.Runtime.Caching.MemoryCacheStore.AddToCache(MemoryCacheEntry entry)
at System.Runtime.Caching.MemoryCacheStore.Set(MemoryCacheKey key, MemoryCacheEntry entry)
at System.Runtime.Caching.MemoryCache.Set(String key, Object value, CacheItemPolicy policy, String regionName)
I searched for it for hours, until the light of logic struck me:
I was using a static policy object that was being reused (some unconscious habit of mine reuses objects whenever they are identical; maybe I'm afraid of constructing objects that consume a few bytes of memory).
By creating a new policy object for every item in the cache, the error was gone. Pretty logical if you think about it.
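In other words, a simplified sketch of the pattern that throws versus the one that works (the dependency key is made up):
using System;
using System.Runtime.Caching;

class PolicyReuseDemo
{
    static readonly MemoryCache Cache = MemoryCache.Default;

    static void Main()
    {
        Cache.Set("dependencyKey", "x", new CacheItemPolicy());

        // BROKEN: one policy, and therefore one ChangeMonitor instance, shared by
        // two entries. The first Set calls NotifyOnChanged on the monitor; the
        // second Set calls it again and throws "The method has already been
        // invoked, and can only be invoked once."
        var shared = CreatePolicy();
        Cache.Set("a", "valueA", shared);
        //Cache.Set("b", "valueB", shared);   // InvalidOperationException

        // OK: a fresh policy (and a fresh monitor) per entry.
        Cache.Set("b", "valueB", CreatePolicy());
    }

    static CacheItemPolicy CreatePolicy()
    {
        var policy = new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.UtcNow.AddMinutes(5)
        };
        policy.ChangeMonitors.Add(
            Cache.CreateCacheEntryChangeMonitor(new[] { "dependencyKey" }));
        return policy;
    }
}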
Posting a late answer as I've just faced the same issue and conducted my own investigation.
When you register your change monitor with a cached item policy (policy.ChangeMonitors.Add(cm)), the CacheItemPolicy implementation registers its own change callback on it via ChangeMonitor.NotifyOnChanged. You're not supposed to call cm.NotifyOnChanged to register yet another callback, or it will throw "The method has already been invoked, and can only be invoked once" at that point.
Instead, use CacheItemPolicy.UpdateCallback or CacheItemPolicy.RemovedCallback to update/remove the cache item, e.g. as described in this blog post.
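Applied to the code in the question, that roughly means dropping the cm.NotifyOnChanged(...) line and doing the refresh through the policy's own callback instead; a sketch reusing the question's variables:
CacheItemPolicy policy = new CacheItemPolicy
{
    AbsoluteExpiration = SystemTime.Today.AddHours(24),
    // Fires when the entry is removed for any reason, including when the
    // change monitor signals a change; check args.RemovedReason if needed.
    RemovedCallback = args =>
    {
        // re-query and re-add the entry here (what OnRefreshQuickLoadCacheItems did)
    }
};

string tableName = QuickRefreshItems[type];
policy.ChangeMonitors.Add(new MyChangeMonitor(typeAsOf, tableName));
// No cm.NotifyOnChanged(...) call; the cache registers its own callback on
// the monitor when the item is added.

MyCache.Set(cacheKey, value, policy);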