How to clear ASP.NET cache thread-safely? - c#

In my project I have some cached values implemented using the singleton pattern; it looks like this:
Roles GetRoles
{
    get
    {
        var cached = HttpContext.Current.Cache["key"];
        if (cached == null)
        {
            // Load from the database and store the result in the cache
            cached = HttpContext.Current.Cache["key"] = GetRolesFromDb(...);
        }
        return cached as Roles;
    }
}
When I change the roles I clear the cache (iterating over all keys).
I don't think this is thread-safe: if a request sees a non-null cached value and the cache is cleared in the meantime, GetRoles can end up returning null.
Is something like the following double-checked locking version the right way to do it?

private object lockRoles = new object();

public Roles GetRoles
{
    get
    {
        object cached = HttpContext.Current.Cache["key"];
        if (cached == null)
        {
            lock (lockRoles)
            {
                cached = HttpContext.Current.Cache["key"];
                if (cached == null)
                {
                    cached = GetRolesFromDb(...);
                    HttpContext.Current.Cache["key"] = cached;
                }
            }
        }
        return (Roles)cached;
    }
}

public void ClearRoles()
{
    HttpContext.Current.Cache.Remove("key");
}
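For what it's worth, the double-checked pattern above is the standard approach. Below is a minimal sketch (my addition, not part of the original code) of how the same idea can be factored into a reusable helper so every cached value goes through one code path; the GetOrLoad name, the two-hour expiration, and the per-key lock dictionary are assumptions.

// Sketch only. Requires: using System; using System.Collections.Concurrent;
// using System.Web; using System.Web.Caching;
private static readonly ConcurrentDictionary<string, object> _locks =
    new ConcurrentDictionary<string, object>();

public static T GetOrLoad<T>(string key, Func<T> load) where T : class
{
    var cached = HttpRuntime.Cache[key] as T;
    if (cached != null)
        return cached;

    // One lock object per cache key, so unrelated keys don't block each other.
    var keyLock = _locks.GetOrAdd(key, _ => new object());
    lock (keyLock)
    {
        cached = HttpRuntime.Cache[key] as T;
        if (cached == null)
        {
            cached = load();
            HttpRuntime.Cache.Insert(key, cached, null,
                DateTime.Now.AddHours(2), Cache.NoSlidingExpiration);
        }
    }
    return cached;
}

With this in place, clearing stays a plain Cache.Remove("key"); a reader that raced with the clear either returns the value it already loaded or repopulates the entry on its next call.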

Related

C# cache layer issues, locks & DB

I have the cache layer described below. The problem is that I need to get a list (or a specific element of a list) from the cache, modify it, persist it to the DB, and then update the cached list.
To avoid conflicts between the cache layer and the DB layer, when getting data from the cache I should hand back a copy of it, so that it doesn't get changed along the way (if it changes, the DB layer throws an error).
My questions are: how should this copy strategy be approached, and should a lock also be taken for the add/remove operations? (If so, what is the right way to lock before modifying?) One possible copy approach is sketched after the class below.
The cache is registered as a singleton through DI:
services.AddSingleton<MyCache>();
public class MyCache
{
    private readonly AsyncLock _mutex = new AsyncLock();

    public MemoryCache Cache { get; set; }

    public MyCache()
    {
        Cache = new MemoryCache(new MemoryCacheOptions());
        // populate cache from DB
    }

    public async Task<List<UDT>> GetUDTsForUser(string id, bool fetchFromDB = false)
    {
        List<UDT> list = new List<UDT>();
        Cache.TryGetValue(id, out list);
        if (list == null)
        {
            using (await _mutex.LockAsync())
            {
                Cache.TryGetValue(id, out list);
                if (list == null && fetchFromDB)
                {
                    var DAO = new DAO();
                    list = (await DAO.GetInfoForUser(id))?.UDTs;
                    if (list != null)
                    {
                        Cache.Set(id, list);
                    }
                }
            }
        }
        return list;
    }

    public async Task<UDT> GetUDTForUser(string id, long udtId)
    {
        List<UDT> list = new List<UDT>();
        Cache.TryGetValue(id, out list);
        if (list == null)
        {
            using (await _mutex.LockAsync())
            {
                Cache.TryGetValue(id, out list);
                if (list == null)
                {
                    var DAO = new DAO();
                    list = (await DAO.GetInfoForUser(id)).UDTs;
                    Cache.Set(id, list);
                    return list.FirstOrDefault(u => u.Id == udtId);
                }
            }
        }
        return list.FirstOrDefault(u => u.Id == udtId);
    }

    public void AddElement(UDT udt)
    {
        if (udt == null)
            return;
        var udtList = Cache.GetOrCreate(udt.GroupId, entry => {
            return new List<UDT>();
        });
        if (udtList.Contains(udt) == false)
        {
            udtList.Add(udt);
            Cache.Set(udt.GroupId, udtList);
        }
    }

    public void RemoveElement(string groupId, long udtId)
    {
        var udtList = Cache.Get<List<UDT>>(groupId);
        if (udtList != null)
        {
            udtList.RemoveAll(e => e.Id == udtId);
            Cache.Set(groupId, udtList);
        }
    }
}
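To illustrate the copy strategy asked about above, here is a minimal sketch (my own addition, not part of the original class); UDT.ShallowCopy() is a hypothetical per-element copy helper such as a copy constructor or a MemberwiseClone wrapper.

// Sketch only. Requires: using System.Linq;
public async Task<List<UDT>> GetUDTsCopyForUser(string id)
{
    var cached = await GetUDTsForUser(id, fetchFromDB: true);
    // Hand back a detached copy so the caller's edits stay local until they are
    // persisted to the DB and the cache is updated explicitly.
    return cached == null
        ? new List<UDT>()
        : cached.Select(u => u.ShallowCopy()).ToList();
}

As for the second question: AddElement and RemoveElement mutate a list instance that readers may be enumerating concurrently, so if that can happen they should be serialized through the same _mutex (or a dedicated lock) as the read-through path.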

Looking for a way to do less locking while caching

I am using the code below to cache items. It's pretty basic.
The issue I have is that every time it caches an item, that section of the code locks. So with roughly a million items arriving every hour or so, this is a problem.
I've tried creating a dictionary of static lock objects per cacheKey so that locking is granular, but that in itself becomes an issue with managing their expiration, etc.
Is there a better way to implement minimal locking?
private static readonly object cacheLock = new object();

public static T GetFromCache<T>(string cacheKey, Func<T> GetData) where T : class {
    // Returns null if the string does not exist, prevents a race condition
    // where the cache invalidates between the contains check and the retrieval.
    T cachedData = MemoryCache.Default.Get(cacheKey) as T;
    if (cachedData != null) {
        return cachedData;
    }

    lock (cacheLock) {
        // Check to see if anyone wrote to the cache while we were
        // waiting our turn to write the new value.
        cachedData = MemoryCache.Default.Get(cacheKey) as T;
        if (cachedData != null) {
            return cachedData;
        }

        // The value still did not exist so we now write it in to the cache.
        cachedData = GetData();
        MemoryCache.Default.Set(cacheKey, cachedData, new CacheItemPolicy(...));
        return cachedData;
    }
}
You may want to consider using ReaderWriterLockSlim, with which you acquire a write lock only when needed.
Using cacheLock.EnterReadLock() and cacheLock.EnterWriteLock() should greatly improve performance.
The ReaderWriterLockSlim documentation even has an example of a cache, which is exactly what you need; I copy it here:
public class SynchronizedCache
{
    private ReaderWriterLockSlim cacheLock = new ReaderWriterLockSlim();
    private Dictionary<int, string> innerCache = new Dictionary<int, string>();

    public int Count
    { get { return innerCache.Count; } }

    public string Read(int key)
    {
        cacheLock.EnterReadLock();
        try
        {
            return innerCache[key];
        }
        finally
        {
            cacheLock.ExitReadLock();
        }
    }

    public void Add(int key, string value)
    {
        cacheLock.EnterWriteLock();
        try
        {
            innerCache.Add(key, value);
        }
        finally
        {
            cacheLock.ExitWriteLock();
        }
    }

    public bool AddWithTimeout(int key, string value, int timeout)
    {
        if (cacheLock.TryEnterWriteLock(timeout))
        {
            try
            {
                innerCache.Add(key, value);
            }
            finally
            {
                cacheLock.ExitWriteLock();
            }
            return true;
        }
        else
        {
            return false;
        }
    }

    public AddOrUpdateStatus AddOrUpdate(int key, string value)
    {
        cacheLock.EnterUpgradeableReadLock();
        try
        {
            string result = null;
            if (innerCache.TryGetValue(key, out result))
            {
                if (result == value)
                {
                    return AddOrUpdateStatus.Unchanged;
                }
                else
                {
                    cacheLock.EnterWriteLock();
                    try
                    {
                        innerCache[key] = value;
                    }
                    finally
                    {
                        cacheLock.ExitWriteLock();
                    }
                    return AddOrUpdateStatus.Updated;
                }
            }
            else
            {
                cacheLock.EnterWriteLock();
                try
                {
                    innerCache.Add(key, value);
                }
                finally
                {
                    cacheLock.ExitWriteLock();
                }
                return AddOrUpdateStatus.Added;
            }
        }
        finally
        {
            cacheLock.ExitUpgradeableReadLock();
        }
    }

    public void Delete(int key)
    {
        cacheLock.EnterWriteLock();
        try
        {
            innerCache.Remove(key);
        }
        finally
        {
            cacheLock.ExitWriteLock();
        }
    }

    public enum AddOrUpdateStatus
    {
        Added,
        Updated,
        Unchanged
    };

    ~SynchronizedCache()
    {
        if (cacheLock != null) cacheLock.Dispose();
    }
}
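A quick usage sketch (my addition) of the class above:

var cache = new SynchronizedCache();
cache.Add(1, "one");
var value = cache.Read(1);                 // "one"
var status = cache.AddOrUpdate(1, "uno");  // AddOrUpdateStatus.Updated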
I don't know how MemoryCache.Default is implemented, or whether or not you have control over it.
But in general, prefer ConcurrentDictionary over a Dictionary with a lock in a multithreaded environment.
GetFromCache would just become:
ConcurrentDictionary<string, T> cache = new ConcurrentDictionary<string, T>();
...
cache.GetOrAdd("someKey", (key) =>
{
    var data = PullDataFromDatabase(key);
    return data;
});
There are two more things to take care of.
Expiry
Instead of storing T as the dictionary value, you can define a type:
struct CacheItem<T>
{
    public T Item { get; set; }
    public DateTime Expiry { get; set; }
}
And store each cached value as a CacheItem with a defined expiry:
cache.GetOrAdd("someKey", (key) =>
{
var data = PullDataFromDatabase(key);
return new CacheItem<T>() { Item = data, Expiry = DateTime.UtcNow.Add(TimeSpan.FromHours(1)) };
});
Now you can implement expiration on a background timer:
Timer expirationTimer = new Timer(ExpireCache, null, 60000, 60000);
...
void ExpireCache(object state)
{
    var needToExpire = cache.Where(c => DateTime.UtcNow >= c.Value.Expiry).Select(c => c.Key);
    foreach (var key in needToExpire)
    {
        cache.TryRemove(key, out CacheItem<T> _);
    }
}
Once a minute, you search for all cache entries that need to be expired, and remove them.
"Locking"
Using ConcurrentDictionary guarantees that simultaneous read/writes won't corrupt the dictionary or throw an exception.
But, you can still end up with a situation where two simultaneous reads cause you to fetch the data from the database twice.
One neat trick to solve this is to wrap the dictionary value in a Lazy<T>:
ConcurrentDictionary<string, Lazy<CacheItem<T>>> cache = new ConcurrentDictionary<string, Lazy<CacheItem<T>>>();
...
var data = cache.GetOrAdd("someKey", key => new Lazy<CacheItem<T>>(() =>
{
    var value = PullDataFromDatabase(key);
    return new CacheItem<T>() { Item = value, Expiry = DateTime.UtcNow.Add(TimeSpan.FromHours(1)) };
})).Value;
Explanation
With GetOrAdd you might end up invoking the "get from database if not in cache" delegate multiple times in the case of simultaneous requests.
However, GetOrAdd will only keep one of the values the delegates returned, and because that value is a Lazy, you guarantee that only one Lazy will actually be invoked.
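Putting these pieces together, a small self-contained cache might look like the sketch below (my consolidation of the ideas above, with an extra expiry check on read so a stale entry is treated as a miss even before the timer sweeps it; the class and method names are my own).

// Sketch only. Requires: using System; using System.Collections.Concurrent;
public class LazyExpiringCache<T> where T : class
{
    private struct CacheItem
    {
        public T Item;
        public DateTime Expiry;
    }

    private readonly ConcurrentDictionary<string, Lazy<CacheItem>> _cache =
        new ConcurrentDictionary<string, Lazy<CacheItem>>();
    private readonly TimeSpan _ttl;

    public LazyExpiringCache(TimeSpan ttl) { _ttl = ttl; }

    public T GetOrAdd(string key, Func<string, T> getData)
    {
        // Only one Lazy per key ends up in the dictionary, so getData runs once
        // even under simultaneous requests.
        var lazy = _cache.GetOrAdd(key, k => new Lazy<CacheItem>(() =>
            new CacheItem { Item = getData(k), Expiry = DateTime.UtcNow.Add(_ttl) }));

        var entry = lazy.Value;
        if (DateTime.UtcNow < entry.Expiry)
            return entry.Item;

        // Expired: drop the stale entry and rebuild it on the next pass.
        _cache.TryRemove(key, out _);
        return GetOrAdd(key, getData);
    }
}

Usage would be along the lines of new LazyExpiringCache<SomeType>(TimeSpan.FromHours(1)) followed by cache.GetOrAdd("someKey", PullDataFromDatabase), where SomeType stands in for whatever is being cached.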

Response Cookie not updating after first time being set

I have a cookie helper class that gets and sets data for a cookie.
In my controller action I'm trying to update a List collection and persist that to the cookie.
UPDATE: It seems that HttpContext.Current.Response.Cookies.Add() does an upsert whether or not the cookie already exists, and works correctly.
So what is the purpose of Response.Cookies.Set() then?
private List<int> _TestNumbers = new List<int>();
cookie = new CookieHelper(_searchCookieName);
cookie.SetData("testNumbers", _TestNumbers);
_TestNumbers.Add(1);
cookie.SetData("testNumbers", _TestNumbers);
_TestNumbers.Add(2);
cookie.SetData("testNumbers", _TestNumbers);
_TestNumbers.Add(3);
cookie.SetData("testNumbers", _TestNumbers);
The cookie helper class
public class CookieHelper
{
    // Fields referenced below; their declarations were not shown in the original post.
    private string _cookieName;
    private HttpCookie _cookie;
    private int _cookieExpiration; // expiration in days

    public CookieHelper(string cookieName = null, HttpContext context = null)
    {
        // Set param defaults
        context = context ?? HttpContext.Current;
        if (cookieName != null)
            _cookieName = cookieName;

        // Load cookie if it exists, if not create one.
        _cookie = context.Request.Cookies[_cookieName] ?? new HttpCookie(_cookieName);
        Save();
    }

    public object GetData(string name)
    {
        return _cookie[name] == null ? null : new Base64Serializer().Deserialize(_cookie[name]);
    }

    public void SetData(string name, object value)
    {
        _cookie[name] = new Base64Serializer().Serialize(value);
        Save();
    }

    public void Save()
    {
        _cookie.Expires = DateTime.UtcNow.AddDays(_cookieExpiration);
        // Create the cookie if it doesn't exist
        if (HttpContext.Current.Request.Cookies.Get(_cookieName) == null)
            HttpContext.Current.Response.Cookies.Add(_cookie);
        else
            HttpContext.Current.Response.Cookies.Set(_cookie);
    }
}
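Incidentally, a simpler Save() along these lines (my sketch, not the original code) would avoid consulting Request.Cookies at all, on the assumption that Response.Cookies.Set() updates an existing response cookie by name or adds it when it is not there yet:

// Sketch: always write through the response collection.
public void Save(HttpContext context = null)
{
    context = context ?? HttpContext.Current;
    _cookie.Expires = DateTime.UtcNow.AddDays(_cookieExpiration);
    // Set() replaces a response cookie with the same name, or adds it otherwise
    // (assumption based on HttpCookieCollection.Set's documented behaviour).
    context.Response.Cookies.Set(_cookie);
}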

Per-Method Thread Synchronization Locking

I'd like to create a static Cached class for an ASP.NET MVC site for quick access to cached items like dropdown lists. It needs locking so that when a key comes back empty, the value can be pulled from the repository while any other request threads wait for it. As such, it needs per-method locking (rather than one shared lock). My first thought was to use nameof as the lock for each method instead of declaring a separate lock object per method. A simplified version would look something like this:
public static class Cached
{
    public static List<Country> GetCountriesList()
    {
        List<Country> cacheItem = null;
        if (HttpContext.Current.Cache["CountriesList"] != null)
            cacheItem = (List<Country>)HttpContext.Current.Cache["CountriesList"];
        else
        {
            lock (nameof(GetCountriesList))
            {
                // Check once more in case it got stored while waiting on the lock
                if (HttpContext.Current.Cache["CountriesList"] == null)
                {
                    using (var repo = new Repository())
                    {
                        cacheItem = repo.SelectCountries();
                        HttpContext.Current.Cache.Insert("CountriesList", cacheItem, null, DateTime.Now.AddHours(2), TimeSpan.Zero);
                    }
                }
                else
                    cacheItem = (List<Country>)HttpContext.Current.Cache["CountriesList"];
            }
        }
        return cacheItem;
    }

    public static List<State> GetStatesList()
    {
        List<State> cacheItem = null;
        if (HttpContext.Current.Cache["StatesList"] != null)
            cacheItem = (List<State>)HttpContext.Current.Cache["StatesList"];
        else
        {
            lock (nameof(GetStatesList))
            {
                // Check once more in case it got stored while waiting on the lock
                if (HttpContext.Current.Cache["StatesList"] == null)
                {
                    using (var repo = new Repository())
                    {
                        cacheItem = repo.SelectStates();
                        HttpContext.Current.Cache.Insert("StatesList", cacheItem, null, DateTime.Now.AddHours(2), TimeSpan.Zero);
                    }
                }
                else
                    cacheItem = (List<State>)HttpContext.Current.Cache["StatesList"];
            }
        }
        return cacheItem;
    }
}
Is there anything glaringly wrong with an approach like this?
UPDATE:
Per the advice that it is a bad idea to lock on strings, I've changed it to a pattern that I found in SO's Opserver code that uses a ConcurrentDictionary to store a lock object per cache key. Is there anything wrong with the following:
public static class Cached
{
    private static readonly ConcurrentDictionary<string, object> _cacheLocks = new ConcurrentDictionary<string, object>();

    private const string KEY_COUNTRIES_LIST = "CountriesList";

    public static List<Country> GetCountriesList()
    {
        List<Country> cacheItem = null;
        var nullLoadLock = _cacheLocks.AddOrUpdate(KEY_COUNTRIES_LIST, k => new object(), (k, old) => old);
        if (HttpContext.Current.Cache[KEY_COUNTRIES_LIST] != null)
            cacheItem = (List<Country>)HttpContext.Current.Cache[KEY_COUNTRIES_LIST];
        else
        {
            lock (nullLoadLock)
            {
                // Check once more in case it got stored while waiting on the lock
                if (HttpContext.Current.Cache[KEY_COUNTRIES_LIST] == null)
                {
                    using (var repo = new Repository())
                    {
                        cacheItem = repo.SelectCountries();
                        HttpContext.Current.Cache.Insert(KEY_COUNTRIES_LIST, cacheItem, null, DateTime.Now.AddHours(2), TimeSpan.Zero);
                    }
                }
                else
                    cacheItem = (List<Country>)HttpContext.Current.Cache[KEY_COUNTRIES_LIST];
            }
        }
        return cacheItem;
    }

    private const string KEY_STATES_LIST = "StatesList";

    public static List<State> GetStatesList()
    {
        List<State> cacheItem = null;
        var nullLoadLock = _cacheLocks.AddOrUpdate(KEY_STATES_LIST, k => new object(), (k, old) => old);
        if (HttpContext.Current.Cache[KEY_STATES_LIST] != null)
            cacheItem = (List<State>)HttpContext.Current.Cache[KEY_STATES_LIST];
        else
        {
            lock (nullLoadLock)
            {
                // Check once more in case it got stored while waiting on the lock
                if (HttpContext.Current.Cache[KEY_STATES_LIST] == null)
                {
                    using (var repo = new Repository())
                    {
                        cacheItem = repo.SelectStates();
                        HttpContext.Current.Cache.Insert(KEY_STATES_LIST, cacheItem, null, DateTime.Now.AddHours(2), TimeSpan.Zero);
                    }
                }
                else
                    cacheItem = (List<State>)HttpContext.Current.Cache[KEY_STATES_LIST];
            }
        }
        return cacheItem;
    }
}
Based on what you posted so far, I think you're over-thinking this. :) I don't see a need to populate yet another dictionary with your locking objects. Since you are using them in explicitly named methods, just declare them as fields as needed.
First, the advice not to lock on string values is sound, but it is based on the problem that two string values can appear identical while still being different objects. You could avoid that in your scenario by storing the appropriate string value in a const field:
public static class Cached
{
    private const string _kcountries = "CountriesList";
    private const string _kstates = "StatesList";

    public static List<Country> GetCountriesList()
    {
        List<Country> cacheItem = (List<Country>)HttpContext.Current.Cache[_kcountries];
        if (cacheItem == null)
        {
            lock (_kcountries)
            {
                // Check once more in case it got stored while waiting on the lock
                cacheItem = (List<Country>)HttpContext.Current.Cache[_kcountries];
                if (cacheItem == null)
                {
                    using (var repo = new Repository())
                    {
                        cacheItem = repo.SelectCountries();
                        HttpContext.Current.Cache.Insert(_kcountries, cacheItem, null, DateTime.Now.AddHours(2), TimeSpan.Zero);
                    }
                }
            }
        }
        return cacheItem;
    }

    public static List<State> GetStatesList()
    {
        // Same as above, except using _kstates instead of _kcountries
    }
}
Note that you shouldn't be using string literals throughout the code anyway. It's much better practice to define const fields to represent those values. So you kill two birds with one stone doing the above. :)
The only remaining problem is that you are still using a possibly-public value to lock, since the string literals are interned, and if the exact same string was used somewhere else, it would likely be the same interned value as well. This is of debatable concern; it's my preference to avoid doing so, to ensure no other code outside my control could take the same lock my code is trying to use, but there are those who feel such concerns are overblown. YMMV. :)
If you do care (as I do) about using the possibly-public value, then you can associate a unique object value instead of using the string reference:
public static class Cached
{
    private const string _kcountriesKey = "CountriesList";
    private const string _kstatesKey = "StatesList";

    private static readonly object _kcountriesLock = new object();
    private static readonly object _kstatesLock = new object();

    public static List<Country> GetCountriesList()
    {
        List<Country> cacheItem = (List<Country>)HttpContext.Current.Cache[_kcountriesKey];
        if (cacheItem == null)
        {
            lock (_kcountriesLock)
            {
                // Check once more in case it got stored while waiting on the lock
                cacheItem = (List<Country>)HttpContext.Current.Cache[_kcountriesKey];
                if (cacheItem == null)
                {
                    using (var repo = new Repository())
                    {
                        cacheItem = repo.SelectCountries();
                        HttpContext.Current.Cache.Insert(_kcountriesKey, cacheItem, null, DateTime.Now.AddHours(2), TimeSpan.Zero);
                    }
                }
            }
        }
        return cacheItem;
    }

    // etc.
}
I.e. use the ...Key field for your cache (since it does require string values for keys) but the ...Lock field for locking (so that you are sure no code outside your control would have access to the object value used for the lock).
I'll note that you do have an opportunity to reduce the repetition in the code, by writing a single Get...() implementation that can be shared by your various types of data:
public static class Cached
{
    private const string _kcountriesKey = "CountriesList";
    private const string _kstatesKey = "StatesList";

    private static readonly object _kcountriesLock = new object();
    private static readonly object _kstatesLock = new object();

    public static List<Country> GetCountriesList()
    {
        // Assuming SelectCountries() is in fact declared to return List<Country>
        // then you should actually be able to omit the type parameter in the method
        // call and let type inference figure it out. Same thing for the call to
        // _GetCachedData<State>() in the GetStatesList() method.
        return _GetCachedData<Country>(_kcountriesKey, _kcountriesLock, repo => repo.SelectCountries());
    }

    public static List<State> GetStatesList()
    {
        return _GetCachedData<State>(_kstatesKey, _kstatesLock, repo => repo.SelectStates());
    }

    private static List<T> _GetCachedData<T>(string key, object lockObject, Func<Repository, List<T>> selector)
    {
        List<T> cacheItem = (List<T>)HttpContext.Current.Cache[key];
        if (cacheItem == null)
        {
            lock (lockObject)
            {
                // Check once more in case it got stored while waiting on the lock
                cacheItem = (List<T>)HttpContext.Current.Cache[key];
                if (cacheItem == null)
                {
                    using (var repo = new Repository())
                    {
                        cacheItem = selector(repo);
                        HttpContext.Current.Cache.Insert(key, cacheItem, null, DateTime.Now.AddHours(2), TimeSpan.Zero);
                    }
                }
            }
        }
        return cacheItem;
    }

    // etc.
}
Finally, I'll note that since the underlying cache (i.e. System.Web.Caching.Cache) is thread-safe, you could just skip all of this altogether, and instead choose to blindly populate the cache if your item (the List<T> in question) isn't found. The only downside is that you in some cases could retrieve the same list more than once. The upside is that the code is a lot simpler.
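As a sketch of that last option (my own illustration, mirroring the _GetCachedData helper above with the lock simply removed):

// Sketch: the "no locking" variant. System.Web.Caching.Cache is thread-safe, so the
// worst case is two concurrent requests both hitting the repository, with the second
// Insert harmlessly overwriting the first.
private static List<T> _GetCachedDataNoLock<T>(string key, Func<Repository, List<T>> selector)
{
    var cacheItem = (List<T>)HttpContext.Current.Cache[key];
    if (cacheItem == null)
    {
        using (var repo = new Repository())
        {
            cacheItem = selector(repo);
        }
        HttpContext.Current.Cache.Insert(key, cacheItem, null, DateTime.Now.AddHours(2), TimeSpan.Zero);
    }
    return cacheItem;
}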

Does a singleton need to be refreshed with data from webservice?

I have a singleton provider whose main function is to retrieve an object from a webservice and cache it according to the cache headers in the webservice response. This object will be accessed quite a lot. My question is: when the data in the webservice changes, will the change automatically be reflected in subsequent calls to the singleton?
public class ConfigurationProvider
{
    #region Private Member Variables

    private static readonly Lazy<ConfigurationProvider> _instance = new Lazy<ConfigurationProvider>(() => new ConfigurationProvider());
    private static readonly HttpCache _cache = new HttpCache();

    #endregion

    #region Constructors

    private ConfigurationProvider()
    {
    }

    #endregion

    #region Public Properties

    public static ConfigurationProvider Instance
    {
        get { return _instance.Value; }
    }

    public ShowJsonResponse Configuration
    {
        get
        {
            // Try to get the configuration from the webservice and add it to the cache
            var cacheExpiry = 0;
            return _cache.GetAndSet(WebApiConstant.ProxyCacheKeys.ShowJsonKey, ref cacheExpiry, () => GetConfiguration(ref cacheExpiry));
        }
    }

    #endregion

    #region Private Methods

    private ShowJsonResponse GetConfiguration(ref int cacheExpiry)
    {
        var httpClient = new HttpClient();
        try
        {
            var response = httpClient.GetAsync(WebApiConstant.Configuration.WebserviceUrl).Result;
            if (response.IsSuccessStatusCode)
            {
                var showResponse = response.Content.ReadAsAsync<ShowJsonResponse>().Result;
                if (response.Headers.CacheControl.Public && response.Headers.CacheControl.MaxAge.HasValue)
                {
                    cacheExpiry = (int)response.Headers.CacheControl.MaxAge.Value.TotalSeconds;
                }
                // TODO: Remove when finished testing
                // Default to 20 seconds for testing
                cacheExpiry = 20;
                return showResponse;
            }
        }
        catch (HttpRequestException ex)
        {
        }
        cacheExpiry = 0;
        return null;
    }

    #endregion
}
The HttpCache class is just a wrapper around HttpRuntime.Cache. The GetAndSet method tries to retrieve the cached object and sets it if it is not found:
public override T GetAndSet<T>(string key, ref int duration, Func<T> method)
{
    var data = _cache == null ? default(T) : (T) _cache[key];
    if (data == null)
    {
        data = method();
        if (duration > 0 && data != null)
        {
            lock (sync)
            {
                _cache.Insert(key, data, null, DateTime.Now.AddSeconds(duration), Cache.NoSlidingExpiration);
            }
        }
    }
    return data;
}
Usage example:
ConfigurationProvider.Instance.Configuration.Blah
Is there any perceived benefit to using the singleton pattern in this scenario, or would instantiating the class normally be OK?
I think the singleton pattern fits your case better, and you won't need a separate object instance either. Are you taking care of concurrency inside your HttpCache wrapper? That is important to prevent concurrent threads from making multiple webservice requests when two or more of them access the cache at the same time, or before the first webservice request returns.
I would suggest using the double-checked locking pattern:
public override T GetAndSet<T>(string key, ref int duration, Func<T> method) {
    var data = _cache == null ? default(T) : (T) _cache[key];
    if (data == null) { // check
        lock (sync) { // lock
            // this avoids a waiting thread reloading the configuration again
            data = _cache == null ? default(T) : (T) _cache[key];
            if (data == null) { // check again
                data = method();
                if (duration > 0 && data != null) {
                    _cache.Insert(key, data, null, DateTime.Now.AddSeconds(duration), Cache.NoSlidingExpiration);
                }
            }
        }
    }
    return data;
}
