I have an MVC application in which certain action methods must be executed in a certain order. Recently I have been running into some strange issues, and I suspect it is because I do not do any thread synchronization. I have barely worked with multithreading and I don't know much about it. I tried to implement some kind of locking where I lock according to an Id, so I implemented the class below to get the required lock objects.
public class ReportLockProvider : IReportLockProvider
{
    readonly ConcurrentDictionary<long, object> lockDictionary
        = new ConcurrentDictionary<long, object>();

    public object ProvideLockObject(long reportId)
    {
        return lockDictionary.GetOrAdd(reportId, new object());
    }
}
I tried to use this as below:
ReportLockProvider lockProvider = new ReportLockProvider();

public async Task<ActionResult> MyAction(long reportId)
{
    lock (lockProvider.ProvideLockObject(reportId))
    {
        // Some operations
        await Something();
        // Some operations
    }
}
I hoped that it would work, but it didn't even compile, because I had used await inside the lock body. I searched a bit and came across SemaphoreSlim in this answer. Now the problem is that I have to get the lock object according to the Id. How can I do this? Is it OK to create multiple SemaphoreSlim objects? Is it OK if I modify the code like below:
public class ReportLockProvider : IReportLockProvider
{
    readonly ConcurrentDictionary<long, SemaphoreSlim> lockDictionary
        = new ConcurrentDictionary<long, SemaphoreSlim>();

    public SemaphoreSlim ProvideLockObject(long reportId)
    {
        return lockDictionary.GetOrAdd(reportId, new SemaphoreSlim(1, 1));
    }
}
public async Task<ActionResult> MyAction(long reportId)
{
    var lockObject = lockProvider.ProvideLockObject(reportId);
    await lockObject.WaitAsync();
    try
    {
        // Some operations
        await Something();
        // Some operations
    }
    finally
    {
        lockObject.Release();
    }
}
The other question is: can I use SemaphoreSlim in non-async methods? Is there any better option?
I think you are missing a static keyword in front of your lockDictionary, but it depends on how you instantiate the provider.
Here is a sample, with a few small changes, that I cooked up in LINQPad:
async Task Main()
{
    ReportLockProvider reportLockProvider = new ReportLockProvider();
    List<Task> tasks = new List<Task>(10);

    for (long i = 1; i <= 5; i++)
    {
        var local = i;
        tasks.Add(Task.Run(() => Enter(local)));
        tasks.Add(Task.Run(() => Enter(local)));
    }

    async Task Enter(long id)
    {
        Console.WriteLine(id + " waiting to enter");
        await reportLockProvider.WaitAsync(id);
        Console.WriteLine(id + " entered!");
        Thread.Sleep(1000 * (int)id);
        Console.WriteLine(id + " releasing");
        reportLockProvider.Release(id);
    }

    await Task.WhenAll(tasks.ToArray());
}
public class ReportLockProvider
{
    static readonly ConcurrentDictionary<long, SemaphoreSlim> lockDictionary
        = new ConcurrentDictionary<long, SemaphoreSlim>();

    public async Task WaitAsync(long reportId)
    {
        // The factory overload avoids allocating a throwaway SemaphoreSlim on every call.
        await lockDictionary.GetOrAdd(reportId, _ => new SemaphoreSlim(1, 1)).WaitAsync();
    }

    public void Release(long reportId)
    {
        SemaphoreSlim semaphore;
        if (lockDictionary.TryGetValue(reportId, out semaphore))
        {
            semaphore.Release();
        }
    }
}
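As for the last part of the question: SemaphoreSlim also exposes a synchronous Wait(), so the same per-id locks can be used from non-async code as well. A rough sketch, assuming a Wait method is added to the provider above and reportLockProvider is an instance of it:

public void Wait(long reportId)
{
    // Blocks the calling thread instead of awaiting.
    lockDictionary.GetOrAdd(reportId, _ => new SemaphoreSlim(1, 1)).Wait();
}

// Non-async usage (DoWork is just an illustrative name):
void DoWork(long reportId)
{
    reportLockProvider.Wait(reportId);
    try
    {
        // Some synchronous operations
    }
    finally
    {
        reportLockProvider.Release(reportId);
    }
}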
Let's say I have a method that gets data from a server:
Task<Result> GetDataFromServerAsync(...)
If there is an ongoing call in progress, I don't want to start a new request to the server but wait for the original one to finish.
Let's say I have
var result = await objet.GetDataFromServerAsync(...);
and in a different place, called almost at the same time, I have a second call
var result2 = await objet.GetDataFromServerAsync(...);
I don't want the second call to start a new request to the server if the first hasn't finished. I want both calls to get the same result as soon as the first call finishes. This is a proof of concept; I have options, but I wanted to see how easy it is to do this.
Here is a quick example using Lazy<Task<T>>:
var lazyGetDataFromServer = new Lazy<Task<Result>>
(() => objet.GetDataFromServerAsync(...));
var result = await lazyGetDataFromServer.Value;
var result2 = await lazyGetDataFromServer.Value;
It doesn't matter if these two awaits are done from separate threads, as Lazy<T> is thread-safe, so result2, if awaited second, will still wait for and use the same output as result.
Using the code from here you can wrap this up in a class called AsyncLazy<T> and add a custom GetAwaiter, so that you can just await it without needing to go through .Value, very tidy =)
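To illustrate the idea, here is a minimal sketch of such a wrapper (assuming Task.Run is acceptable for kicking off the factory; the linked code and Stephen Cleary's AsyncEx have more complete versions):

public class AsyncLazy<T> : Lazy<Task<T>>
{
    public AsyncLazy(Func<Task<T>> taskFactory)
        : base(() => Task.Run(taskFactory)) { }

    // Lets callers write "await myAsyncLazy" directly, without ".Value".
    // Requires: using System.Runtime.CompilerServices;
    public TaskAwaiter<T> GetAwaiter() { return Value.GetAwaiter(); }
}

// Usage:
var lazyGetDataFromServer = new AsyncLazy<Result>(() => objet.GetDataFromServerAsync(...));
var result = await lazyGetDataFromServer;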
You can use anything for synchronization in your method.
For example, I used SemaphoreSlim:
public class PartyMaker
{
    private bool _isProcessing;
    private readonly SemaphoreSlim _slowStuffSemaphore = new SemaphoreSlim(1, 1);
    private DateTime _something;

    public async Task<DateTime> ShakeItAsync()
    {
        try
        {
            var needNewRequest = !_isProcessing;
            await _slowStuffSemaphore.WaitAsync().ConfigureAwait(false);
            if (!needNewRequest) return _something;

            _isProcessing = true;
            _something = await ShakeItSlowlyAsync().ConfigureAwait(false);
            return _something;
        }
        finally
        {
            _isProcessing = false;
            _slowStuffSemaphore.Release();
        }
    }

    private async Task<DateTime> ShakeItSlowlyAsync()
    {
        await Task.Delay(TimeSpan.FromSeconds(1)).ConfigureAwait(false);
        return DateTime.UtcNow;
    }
}
Usage:
var maker = new PartyMaker();

var tasks = new[] { maker.ShakeItAsync(), maker.ShakeItAsync() };
Task.WaitAll(tasks);
foreach (var task in tasks)
{
    Console.WriteLine(task.Result);
}

Console.WriteLine(maker.ShakeItAsync().Result);
Here is the result:
17.01.2017 22:28:39
17.01.2017 22:28:39
17.01.2017 22:28:41
UPD
With this modification you can call async methods with args:
public class PartyMaker
{
    private readonly SemaphoreSlim _slowStuffSemaphore = new SemaphoreSlim(1, 1);
    private readonly ConcurrentDictionary<int, int> _requestCounts = new ConcurrentDictionary<int, int>();
    private readonly ConcurrentDictionary<int, DateTime> _cache = new ConcurrentDictionary<int, DateTime>();

    public async Task<DateTime> ShakeItAsync(Argument argument)
    {
        var key = argument.GetHashCode();
        DateTime result;
        try
        {
            if (!_requestCounts.ContainsKey(key))
            {
                _requestCounts[key] = 1;
            }
            else
            {
                ++_requestCounts[key];
            }
            var needNewRequest = _requestCounts[key] == 1;

            await _slowStuffSemaphore.WaitAsync().ConfigureAwait(false);
            if (!needNewRequest)
            {
                _cache.TryGetValue(key, out result);
                return result;
            }

            _cache.TryAdd(key, default(DateTime));
            result = await ShakeItSlowlyAsync().ConfigureAwait(false);
            _cache[key] = result;
            return result;
        }
        finally
        {
            _requestCounts[key]--;
            if (_requestCounts[key] == 0)
            {
                int temp;
                _requestCounts.TryRemove(key, out temp);
                _cache.TryRemove(key, out result);
            }
            _slowStuffSemaphore.Release();
        }
    }

    private async Task<DateTime> ShakeItSlowlyAsync()
    {
        await Task.Delay(TimeSpan.FromSeconds(1)).ConfigureAwait(false);
        return DateTime.UtcNow;
    }
}
public class Argument
{
    public Argument(int value)
    {
        Value = value;
    }

    public int Value { get; }

    public override int GetHashCode()
    {
        return Value.GetHashCode();
    }
}
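For completeness, a usage sketch for the argument-based version (assuming the two classes above):

var maker = new PartyMaker();
var arg = new Argument(42);

// Two concurrent calls with the same argument share a single slow request.
var tasks = new[] { maker.ShakeItAsync(arg), maker.ShakeItAsync(arg) };
Task.WaitAll(tasks);
foreach (var task in tasks)
{
    Console.WriteLine(task.Result);
}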
This code seems to do a good job of caching async method results. I would like to add some sort of expiration to it. I have tried a Tuple, but I was not successful in getting it to fully work / compile.
private static readonly ConcurrentDictionary<object, SemaphoreSlim> _keyLocks = new ConcurrentDictionary<object, SemaphoreSlim>();
private static readonly ConcurrentDictionary<object, Tuple<List<UnitDTO>, DateTime>> _cache = new ConcurrentDictionary<object, Tuple<List<UnitDTO>, DateTime>>();

public async Task<string> GetSomethingAsync(string key)
{
    string value;
    // get the semaphore specific to this key
    var keyLock = _keyLocks.GetOrAdd(key, x => new SemaphoreSlim(1));
    await keyLock.WaitAsync();
    try
    {
        // try to get value from cache
        if (!_cache.TryGetValue(key, out value))
        {
            // if value isn't cached, get it the long way asynchronously
            value = await GetSomethingTheLongWayAsync();
            // cache value
            _cache.TryAdd(key, value);
        }
    }
    finally
    {
        keyLock.Release();
    }
    return value;
}
Classical approaches and quotations
From MSDN, by Stephen Cleary:
Asynchronous code is often used to initialize a resource that's then cached and shared. There isn't a built-in type for this, but Stephen Toub developed an AsyncLazy<T> that acts like a merge of Task<T> and Lazy<T>. The original type is described on his blog, and an updated version is available in my AsyncEx library.
public class AsyncLazy<T> : Lazy<Task<T>>
{
    public AsyncLazy(Func<T> valueFactory) :
        base(() => Task.Factory.StartNew(valueFactory)) { }

    public AsyncLazy(Func<Task<T>> taskFactory) :
        base(() => Task.Factory.StartNew(() => taskFactory()).Unwrap()) { }
}
Context
Let’s say in our program we have one of these AsyncLazy instances:
static string LoadString() { … }
static AsyncLazy<string> m_data = new AsyncLazy<string>(LoadString);
Usage
Thus, we can write an asynchronous method that does:
string data = await m_data.Value;
Lazy<T> would be appropriate here, but unfortunately it seems to lack an input parameter to index the result. The same issue was solved here, where it is explained how to cache the results of a long-running, resource-intensive method in case it is not async.
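To give the idea before getting to the expiration part, a minimal sketch of indexing such lazy tasks by key with a ConcurrentDictionary (assuming GetSomethingTheLongWayAsync can accept the key):

private static readonly ConcurrentDictionary<string, Lazy<Task<string>>> _lazyCache =
    new ConcurrentDictionary<string, Lazy<Task<string>>>();

public static Task<string> GetSomethingLazilyAsync(string key)
{
    // GetOrAdd plus Lazy ensures the long-running call is started at most once per key,
    // because only the Lazy instance actually stored in the dictionary is ever awaited.
    return _lazyCache.GetOrAdd(key,
        k => new Lazy<Task<string>>(() => GetSomethingTheLongWayAsync(k))).Value;
}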
Back to your proposed solution
Before I show the main changes related to the cache management and specific to your proposed implementation, let me suggest a couple of marginal optimization options, based on the following concerns.
often with locks, when you access them they're uncontended, and in such cases you really want acquiring and releasing the lock to be as low-overhead as possible; in other words, accessing uncontended locks should involve a fast path
Since these are just performance optimization tricks, I will leave them commented out in the code so that you can measure their effects in your specific situation first.
You need to test TryGetValue again after awaiting because another parallel process could have added that value in the meantime
You don't need to keep the lock while you're awaiting
This balance of overhead vs cache misses was already pointed out in a previous answer to a similar question.
Obviously, there's overhead keeping SemaphoreSlim objects around to prevent cache misses, so it may not be worth it depending on the use case. But if guaranteeing no cache misses is important, then this accomplishes that.
My main answer: the cache management
Regarding cache expiration, I would suggest adding the creation DateTime to the value of the dictionary (i.e. the time when the value is returned from GetSomethingTheLongWayAsync) and consequently discarding the cached value after a fixed time span.
Find a draft below.
private static readonly ConcurrentDictionary<object, SemaphoreSlim> _keyLocks = new ConcurrentDictionary<object, SemaphoreSlim>();
private static readonly ConcurrentDictionary<object, Tuple<string, DateTime>> _cache = new ConcurrentDictionary<object, Tuple<string, DateTime>>();

// expiration time span, in minutes (as in the generic version below)
private static int Expiration = 1;

private static bool IsExpiredDelete(Tuple<string, DateTime> value, string key)
{
    bool _is_exp = (DateTime.Now - value.Item2).TotalMinutes > Expiration;
    if (_is_exp)
    {
        _cache.TryRemove(key, out value);
    }
    return _is_exp;
}

public async Task<string> GetSomethingAsync(string key)
{
    Tuple<string, DateTime> cached;
    // get the semaphore specific to this key
    var keyLock = _keyLocks.GetOrAdd(key, x => new SemaphoreSlim(1));
    await keyLock.WaitAsync();
    try
    {
        // try to get value from cache
        if (!_cache.TryGetValue(key, out cached) || IsExpiredDelete(cached, key))
        {
            // possible performance optimization: measure it before uncommenting
            //keyLock.Release();

            string value = await GetSomethingTheLongWayAsync(key);
            DateTime creation = DateTime.Now;

            // in case of performance optimization
            // get the semaphore specific to this key
            //keyLock = _keyLocks.GetOrAdd(key, x => new SemaphoreSlim(1));
            //await keyLock.WaitAsync();

            bool notFound;
            if (notFound = !_cache.TryGetValue(key, out cached) || IsExpiredDelete(cached, key))
            {
                cached = new Tuple<string, DateTime>(value, creation);
                _cache.TryAdd(key, cached);
            }
            else
            {
                if (!notFound && cached.Item2 < creation)
                {
                    cached = new Tuple<string, DateTime>(value, creation);
                    // overwrite the older entry (TryAdd would fail here because the key already exists)
                    _cache[key] = cached;
                }
            }
        }
    }
    finally
    {
        keyLock.Release();
    }
    return cached?.Item1;
}
Please adapt the above code to your specific needs.
Making it more generic
Finally you may want to generalize it a little bit.
By the way, notice that the dictionaries are not static, since one could cache two different methods with the same signature.
public class Cached<FromT, ToT>
{
    private Func<FromT, Task<ToT>> GetSomethingTheLongWayAsync;

    public Cached(Func<FromT, Task<ToT>> _GetSomethingTheLongWayAsync, int expiration_min)
    {
        GetSomethingTheLongWayAsync = _GetSomethingTheLongWayAsync;
        Expiration = expiration_min;
    }

    int Expiration = 1;

    private ConcurrentDictionary<FromT, SemaphoreSlim> _keyLocks = new ConcurrentDictionary<FromT, SemaphoreSlim>();
    private ConcurrentDictionary<FromT, Tuple<ToT, DateTime>> _cache = new ConcurrentDictionary<FromT, Tuple<ToT, DateTime>>();

    private bool IsExpiredDelete(Tuple<ToT, DateTime> value, FromT key)
    {
        bool _is_exp = (DateTime.Now - value.Item2).TotalMinutes > Expiration;
        if (_is_exp)
        {
            _cache.TryRemove(key, out value);
        }
        return _is_exp;
    }

    public async Task<ToT> GetSomethingAsync(FromT key)
    {
        Tuple<ToT, DateTime> cached;
        // get the semaphore specific to this key
        var keyLock = _keyLocks.GetOrAdd(key, x => new SemaphoreSlim(1));
        await keyLock.WaitAsync();
        try
        {
            // try to get value from cache
            if (!_cache.TryGetValue(key, out cached) || IsExpiredDelete(cached, key))
            {
                // possible performance optimization: measure it before uncommenting
                //keyLock.Release();

                ToT value = await GetSomethingTheLongWayAsync(key);
                DateTime creation = DateTime.Now;

                // in case of performance optimization
                // get the semaphore specific to this key
                //keyLock = _keyLocks.GetOrAdd(key, x => new SemaphoreSlim(1));
                //await keyLock.WaitAsync();

                bool notFound;
                if (notFound = !_cache.TryGetValue(key, out cached) || IsExpiredDelete(cached, key))
                {
                    cached = new Tuple<ToT, DateTime>(value, creation);
                    _cache.TryAdd(key, cached);
                }
                else
                {
                    if (!notFound && cached.Item2 < creation)
                    {
                        cached = new Tuple<ToT, DateTime>(value, creation);
                        // overwrite the older entry (TryAdd would fail here because the key already exists)
                        _cache[key] = cached;
                    }
                }
            }
        }
        finally
        {
            keyLock.Release();
        }
        return cached.Item1;
    }
}
For a generic FromT, an IEqualityComparer<FromT> may be needed for the dictionaries, as sketched below.
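A sketch of one way to plug that in, as an extra constructor overload (not part of the original draft) that passes the comparer to both dictionaries:

public Cached(Func<FromT, Task<ToT>> _GetSomethingTheLongWayAsync, int expiration_min,
              IEqualityComparer<FromT> comparer)
    : this(_GetSomethingTheLongWayAsync, expiration_min)
{
    // Both dictionaries must compare keys the same way.
    _keyLocks = new ConcurrentDictionary<FromT, SemaphoreSlim>(comparer);
    _cache = new ConcurrentDictionary<FromT, Tuple<ToT, DateTime>>(comparer);
}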
Usage/Demo
private static async Task<string> GetSomethingTheLongWayAsync(int key)
{
    await Task.Delay(15000);
    Console.WriteLine("Long way for: " + key);
    return key.ToString();
}
static void Main(string[] args)
{
    Test().Wait();
}

private static async Task Test()
{
    int key;
    string val;
    key = 1;

    var cache = new Cached<int, string>(GetSomethingTheLongWayAsync, 1);

    Console.WriteLine("getting " + key);
    val = await cache.GetSomethingAsync(key);
    Console.WriteLine("getting " + key + " resulted in " + val);
    Console.WriteLine("getting " + key);
    val = await cache.GetSomethingAsync(key);
    Console.WriteLine("getting " + key + " resulted in " + val);

    await Task.Delay(65000);

    Console.WriteLine("getting " + key);
    val = await cache.GetSomethingAsync(key);
    Console.WriteLine("getting " + key + " resulted in " + val);

    Console.ReadKey();
}
Sophisticated alternatives
There are also more sophisticated possibilities, like combining the GetOrAdd/AddOrUpdate overloads that take a delegate with Lazy objects, to ensure that the generator function is called only once (instead of using semaphores and locks).
public class AsyncCache<FromT, ToT>
{
    private Func<FromT, Task<ToT>> GetSomethingTheLongWayAsync;

    public AsyncCache(Func<FromT, Task<ToT>> _GetSomethingTheLongWayAsync, int expiration_min)
    {
        GetSomethingTheLongWayAsync = _GetSomethingTheLongWayAsync;
        Expiration = expiration_min;
    }

    int Expiration;

    private ConcurrentDictionary<FromT, Tuple<Lazy<Task<ToT>>, DateTime>> _cache =
        new ConcurrentDictionary<FromT, Tuple<Lazy<Task<ToT>>, DateTime>>();

    private bool IsExpiredDelete(Tuple<Lazy<Task<ToT>>, DateTime> value, FromT key)
    {
        bool _is_exp = (DateTime.Now - value.Item2).TotalMinutes > Expiration;
        if (_is_exp)
        {
            _cache.TryRemove(key, out value);
        }
        return _is_exp;
    }

    public async Task<ToT> GetSomethingAsync(FromT key)
    {
        var res = _cache.AddOrUpdate(key,
            t => new Tuple<Lazy<Task<ToT>>, DateTime>(
                new Lazy<Task<ToT>>(() => GetSomethingTheLongWayAsync(key)),
                DateTime.Now),
            (k, t) =>
            {
                if (IsExpiredDelete(t, k))
                {
                    return new Tuple<Lazy<Task<ToT>>, DateTime>(
                        new Lazy<Task<ToT>>(() => GetSomethingTheLongWayAsync(k)),
                        DateTime.Now);
                }
                return t;
            });
        return await res.Item1.Value;
    }
}
Same usage, just use AsyncCache instead of Cached.
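For instance, mirroring the demo above (assuming the same GetSomethingTheLongWayAsync helper):

var asyncCache = new AsyncCache<int, string>(GetSomethingTheLongWayAsync, 1);
var val = await asyncCache.GetSomethingAsync(1);  // first call runs the long way once
var val2 = await asyncCache.GetSomethingAsync(1); // awaits the same cached Lazy<Task<string>> until it expires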
Is there a possibility to cache a collection retrieved using WCF from an OData service?
The situation is the following:
I generated a WCF service client with Visual Studio 2015 using the metadata of the OData service. VS generated a class inheriting from System.Data.Services.Client.DataServiceContext. This class has some properties of type System.Data.Services.Client.DataServiceQuery<T>. The data behind some of these properties changes seldom. For performance reasons I want the WCF client to load these properties just the first time and not every time I use them in the code.
Is there a built-in possibility to cache the data of these properties? Or can I tell the service client not to reload specific properties every time?
Assuming the service client class is ODataClient and one of its properties is Area, for now I get the values in the following way:
var client = new ODataClient("url_to_the_service");
client.IgnoreMissingProperties = true;
var propertyInfo = client.GetType().GetProperty("Area");
var area = propertyInfo.GetValue(client) as IEnumerable<object>;
The reason why I do this in such a complicated way is that the client should be very generic: the properties to be handled can be configured in a configuration file.
* EDIT *
I already tried to find properties for the caching in the System.Data.Services.Client.DataServiceContext class or the System.Data.Services.Client.DataServiceQuery<T> class, but I wasn't able to find any.
To my knowledge there is no "out of the box" caching concept on the client. There are options for caching the output of a request on the server, which is something you might want to consider as well. Googling "WCF Caching" will get you a bunch of info on this.
Regarding client-side caching... @Evk is correct, it is pretty straightforward. Here is a sample using MemoryCache.
using System;
using System.Runtime.Caching;

namespace Services.Util
{
    // Interface inferred from the wrapper below; it was not shown explicitly.
    public interface ICacheWrapper
    {
        void ClearCache();
        T GetFromCache<T>(string key, Func<T> missedCacheCall);
        T GetFromCache<T>(string key, Func<T> missedCacheCall, TimeSpan timeToLive);
        void InvalidateCache(string key);
    }

    public class CacheWrapper : ICacheWrapper
    {
        ObjectCache _cache = MemoryCache.Default;

        public void ClearCache()
        {
            // Note: disposing MemoryCache.Default makes the default instance unusable;
            // consider removing entries individually or using a dedicated MemoryCache instance instead.
            MemoryCache.Default.Dispose();
            _cache = MemoryCache.Default;
        }

        public T GetFromCache<T>(string key, Func<T> missedCacheCall)
        {
            return GetFromCache<T>(key, missedCacheCall, TimeSpan.FromMinutes(5));
        }

        public T GetFromCache<T>(string key, Func<T> missedCacheCall, TimeSpan timeToLive)
        {
            var result = _cache.Get(key);
            if (result == null)
            {
                result = missedCacheCall();
                if (result != null)
                {
                    _cache.Set(key, result, new CacheItemPolicy { AbsoluteExpiration = DateTimeOffset.Now.Add(timeToLive) });
                }
            }
            return (T)result;
        }

        public void InvalidateCache(string key)
        {
            _cache.Remove(key);
        }
    }
}
This is an example of code that uses the cache...
private class DataAccessTestStub
{
    public const string DateTimeTicksCacheKey = "GetDateTimeTicks";

    ICacheWrapper _cache;

    public DataAccessTestStub(ICacheWrapper cache)
    {
        _cache = cache;
    }

    public string GetDateTimeTicks()
    {
        return _cache.GetFromCache(DateTimeTicksCacheKey, () =>
        {
            var result = DateTime.Now.Ticks.ToString();
            Thread.Sleep(100); // Create some delay
            return result;
        });
    }

    public string GetDateTimeTicks(TimeSpan timeToLive)
    {
        return _cache.GetFromCache(DateTimeTicksCacheKey, () =>
        {
            var result = DateTime.Now.Ticks.ToString();
            Thread.Sleep(500); // Create some delay
            return result;
        }, timeToLive);
    }

    public void ClearDateTimeTicks()
    {
        _cache.InvalidateCache(DateTimeTicksCacheKey);
    }

    public void ClearCache()
    {
        _cache.ClearCache();
    }
}
And some tests if you fancy...
[TestClass]
public class CacheWrapperTest
{
    private DataAccessTestStub _dataAccessTestClass;

    [TestInitialize]
    public void Init()
    {
        _dataAccessTestClass = new DataAccessTestStub(new CacheWrapper());
    }

    [TestMethod]
    public void GetFromCache_ShouldExecuteCacheMissCall()
    {
        var original = _dataAccessTestClass.GetDateTimeTicks();
        Assert.IsNotNull(original);
    }

    [TestMethod]
    public void GetFromCache_ShouldReturnCachedVersion()
    {
        var copy1 = _dataAccessTestClass.GetDateTimeTicks();
        var copy2 = _dataAccessTestClass.GetDateTimeTicks();
        Assert.AreEqual(copy1, copy2);
    }

    [TestMethod]
    public void GetFromCache_ShouldRespectTimeToLive()
    {
        _dataAccessTestClass.ClearDateTimeTicks();
        var copy1 = _dataAccessTestClass.GetDateTimeTicks(TimeSpan.FromSeconds(2));
        var copy2 = _dataAccessTestClass.GetDateTimeTicks();
        Assert.AreEqual(copy1, copy2);

        Thread.Sleep(3000);
        var copy3 = _dataAccessTestClass.GetDateTimeTicks();
        Assert.AreNotEqual(copy1, copy3);
    }

    [TestMethod]
    public void InvalidateCache_ShouldClearCachedVersion()
    {
        var original = _dataAccessTestClass.GetDateTimeTicks();
        _dataAccessTestClass.ClearDateTimeTicks();
        var updatedVersion = _dataAccessTestClass.GetDateTimeTicks();
        Assert.AreNotEqual(original, updatedVersion);
    }
}
I have the following cache implementation for my application:
public static class Keys
{
    public const string CacheKey = "cachekey";
}

public interface ICache
{
    string QueryCachedData(string param);
}
The data is loaded when the application starts, in Global.asax:
//Global.asax
protected void Application_Start(object sender, EventArgs e)
{
    //instantiates the repository
    HttpContext.Current.Application[Keys.CacheKey] = repository.getDataView();
}
The implementation recovers the data from HttpContext.Current:
public class Cache : ICache
{
    private Cache() { }

    private static Cache _instance = null;

    public static Cache GetInstance()
    {
        if (_instance == null)
            _instance = new Cache();
        return _instance;
    }

    private System.Data.DataView GetCachedData()
    {
        if (HttpContext.Current.Application[Keys.CacheKey] == null)
        {
            //instantiates the repository
            HttpContext.Current.Application[Keys.CacheKey] = repository.getDataView();
        }
        return HttpContext.Current.Application[Keys.CacheKey] as System.Data.DataView;
    }

    private readonly Object _lock = new Object();

    public string QueryCachedData(string param)
    {
        lock (_lock)
        {
            var data = GetCachedData();
            //Execute query
            return result;
        }
    }
}
At some point I need to consume a third-party web service with the following class, using the cache...
public class ThirdPartyWebserviceConsumer
{
    ICache _cache;
    int _provider;

    public ThirdPartyWebserviceConsumer(int provider, ICache cache)
    {
        _cache = cache;
        _provider = provider;
    }

    public Result DoSomething()
    {
        var info = _cache.QueryCachedData(param);
    }
}
...using multiple threads:
public List<Result> Foo(ICache cache, List<int> collectionOfProviders)
{
    List<Result> results = new List<Result>();
    List<Task> taskList = new List<Task>();

    foreach (var provider in collectionOfProviders)
    {
        var task = new Task<Result>(() => new ThirdPartyWebserviceConsumer(provider, cache).DoSomething());
        task.Start();
        task.ContinueWith(t =>
        {
            results.Add(t.Result);
        });
        taskList.Add(task);
    }

    Task.WaitAll(taskList.ToArray());

    return results;
}
My problem is that HttpContext.Current.Application is null in the thread context.
What options do I have? Is there some way to access the HttpContext from a thread? Or maybe another type of cache that could be shared between the threads?
My problem is that HttpContext.Current.Application is null in the thread context. What options do I have?
HttpContext.Current is bound to the managed thread processing the current request.
If you need data from the current context for another thread, you need to copy that data out of the current context first and pass it to your separate thread.
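For example, a hedged sketch of that idea applied to the Foo method above: read the cached data on the request thread first, then hand it to the worker tasks, which never touch HttpContext (PrefetchedCache is a hypothetical ICache implementation introduced only for illustration):

public List<Result> Foo(List<int> collectionOfProviders)
{
    // Read from HttpContext.Current while still on the request thread...
    var dataView = HttpContext.Current.Application[Keys.CacheKey] as System.Data.DataView;

    var results = new ConcurrentBag<Result>(); // thread-safe collection for the parallel adds
    var taskList = new List<Task>();

    foreach (var provider in collectionOfProviders)
    {
        var localProvider = provider;
        // ...and pass the captured data into the tasks instead of resolving it there.
        taskList.Add(Task.Run(() =>
            results.Add(new ThirdPartyWebserviceConsumer(localProvider, new PrefetchedCache(dataView)).DoSomething())));
    }

    Task.WaitAll(taskList.ToArray());
    return results.ToList();
}

// Hypothetical ICache implementation that queries the already-captured DataView
// instead of going back to HttpContext.
public class PrefetchedCache : ICache
{
    private readonly System.Data.DataView _data;

    public PrefetchedCache(System.Data.DataView data) { _data = data; }

    public string QueryCachedData(string param)
    {
        // Execute query against _data
        return null; // placeholder, mirroring the elided query in the question
    }
}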