I need to cache a generic list so I don't have to query the database multiple times. In a web application I would just add it to HttpContext.Current.Cache. What is the proper way to cache objects in console applications?
Keep it as an instance member of the containing class. In a web app you can't do this, since the page class's object is recreated on every request.
However, .NET 4.0 also has the MemoryCache class for this purpose.
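For example, a minimal sketch using MemoryCache from System.Runtime.Caching (the cache key, the expiration, and the data-access method below are invented):

using System;
using System.Collections.Generic;
using System.Runtime.Caching;

public static class CustomerCache
{
    public static List<string> GetCustomers()
    {
        // Return the cached list if it is still present.
        var cached = MemoryCache.Default.Get("customers") as List<string>;
        if (cached != null)
        {
            return cached;
        }

        // Otherwise load it once and keep it for 30 minutes.
        List<string> customers = LoadCustomersFromDatabase(); // hypothetical data-access call
        MemoryCache.Default.Set("customers", customers, DateTimeOffset.Now.AddMinutes(30));
        return customers;
    }

    private static List<string> LoadCustomersFromDatabase()
    {
        // Placeholder for the real database query.
        return new List<string> { "Alice", "Bob" };
    }
}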
In a class-level variable. Presumably, in the Main method of your console app you instantiate at least one object of some sort. In that object's class, you declare a class-level variable (a List<string> or whatever) in which you cache whatever needs caching.
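For instance, a tiny sketch of that idea (the class, field, and method names here are invented):

using System.Collections.Generic;

public class ReportRunner
{
    // Class-level cache: filled on first use, reused for the lifetime of the object.
    private List<string> _cachedNames;

    public List<string> GetNames()
    {
        if (_cachedNames == null)
        {
            _cachedNames = QueryNamesFromDatabase(); // hypothetical database call
        }
        return _cachedNames;
    }

    private List<string> QueryNamesFromDatabase()
    {
        return new List<string> { "example" }; // placeholder for the real query
    }
}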
Here is a very simple cache class I use in console apps; it cleans itself up and is easy to drop in.
The Usage:
return Cache.Get("MyCacheKey", 30, () => { return new Model.Guide().ChannelListings.BuildChannelList(); });
The Class:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Timers;
namespace MyAppNamespace
{
public static class Cache
{
private static Timer cleanupTimer = new Timer() { AutoReset = true, Enabled = true, Interval = 60000 };
private static readonly Dictionary<string, CacheItem> internalCache = new Dictionary<string, CacheItem>();
static Cache()
{
cleanupTimer.Elapsed += Clean;
cleanupTimer.Start();
}
private static void Clean(object sender, ElapsedEventArgs e)
{
internalCache.Keys.ToList().ForEach(x => { try { if (internalCache[x].ExpireTime <= e.SignalTime) { Remove(x); } } catch (Exception) { /*swallow it*/ } });
}
public static T Get<T>(string key, int expiresMinutes, Func<T> refreshFunction)
{
if (internalCache.ContainsKey(key) && internalCache[key].ExpireTime > DateTime.Now)
{
return (T)internalCache[key].Item;
}
var result = refreshFunction();
Set(key, result, expiresMinutes);
return result;
}
public static void Set(string key, object item, int expiresMinutes)
{
Remove(key);
internalCache.Add(key, new CacheItem(item, expiresMinutes));
}
public static void Remove(string key)
{
if (internalCache.ContainsKey(key))
{
internalCache.Remove(key);
}
}
private struct CacheItem
{
public CacheItem(object item, int expiresMinutes)
: this()
{
Item = item;
ExpireTime = DateTime.Now.AddMinutes(expiresMinutes);
}
public object Item { get; private set; }
public DateTime ExpireTime { get; private set; }
}
}
}
// Consider this pseudo code for using Cache
public DataSet GetMySearchData(string search)
{
// if it is in my cache already (notice search criteria is the cache key)
string cacheKey = "Search " + search;
if (Cache[cacheKey] != null)
{
return (DataSet)(Cache[cacheKey]);
}
else
{
DataSet result = yourDAL.DoSearch(search);
Cache.Insert(cacheKey, result); // more parameters are available here (expiration etc.)...
return result;
}
}
Ref: How do I cache a dataset to stop round trips to db?
You might be able to just use a simple Dictionary. The thing that makes the Cache so special in the web environment is that it persists and is scoped in such a way that many users can access it. In a console app you don't have those issues. If your needs are simple enough, a dictionary or similar structure can be used to quickly look up values you pull out of a database.
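For example, a plain dictionary keyed by ID is often enough (the Customer type and the data-access call below are invented):

private readonly Dictionary<int, Customer> _customersById = new Dictionary<int, Customer>();

public Customer GetCustomer(int id)
{
    Customer customer;
    if (!_customersById.TryGetValue(id, out customer))
    {
        customer = LoadCustomerFromDatabase(id); // hypothetical database call
        _customersById[id] = customer;
    }
    return customer;
}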
There are many ways to implement caches, depending on what exactly you are doing. Usually you will use a dictionary to hold the cached values. Here is my simple implementation of a cache, which caches values only for a limited time:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
namespace CySoft.Collections
{
public class Cache<TKey,TValue>
{
private readonly Dictionary<TKey, CacheItem> _cache = new Dictionary<TKey, CacheItem>();
private TimeSpan _maxCachingTime;
/// <summary>
/// Creates a cache which holds the cached values for an infinite time.
/// </summary>
public Cache()
: this(TimeSpan.MaxValue)
{
}
/// <summary>
/// Creates a cache which holds the cached values for a limited time only.
/// </summary>
/// <param name="maxCachingTime">Maximum time for which the a value is to be hold in the cache.</param>
public Cache(TimeSpan maxCachingTime)
{
_maxCachingTime = maxCachingTime;
}
/// <summary>
/// Tries to get a value from the cache. If the cache contains the value and the maximum caching time is
/// not exceeded (if any is defined), then the cached value is returned, else a new value is created.
/// </summary>
/// <param name="key">Key of the value to get.</param>
/// <param name="createValue">Function creating a new value.</param>
/// <returns>A cached or a new value.</returns>
public TValue Get(TKey key, Func<TValue> createValue)
{
CacheItem cacheItem;
if (_cache.TryGetValue(key, out cacheItem) && (DateTime.Now - cacheItem.CacheTime) <= _maxCachingTime) {
return cacheItem.Item;
}
TValue value = createValue();
_cache[key] = new CacheItem(value);
return value;
}
private struct CacheItem
{
public CacheItem(TValue item)
: this()
{
Item = item;
CacheTime = DateTime.Now;
}
public TValue Item { get; private set; }
public DateTime CacheTime { get; private set; }
}
}
}
You can pass a lambda expression to the Get method which, for instance, retrieves the values from a database.
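For example (LoadCustomerFromDb and the Customer type are hypothetical stand-ins for your own data access):

var cache = new Cache<int, Customer>(TimeSpan.FromMinutes(5));
Customer customer = cache.Get(42, () => LoadCustomerFromDb(42));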
Use Singleton Pattern.
http://msdn.microsoft.com/en-us/library/ff650316.aspx
I need to traverse a collection of disjoint folders; each folder is associated with a visited time configured somewhere in the folder.
I then sort the folders and process the one with the earliest visited time first. Note that the processing is generally slower than the traversing.
My code targets .NET Framework 4.8.1; currently my implementation is as follows:
public class BySeparateThread
{
ConcurrentDictionary<string, DateTime?> _dict = new ConcurrentDictionary<string, DateTime?>();
private readonly object _lock = new object(); // must be initialized, otherwise lock(_lock) throws NullReferenceException
/// <summary>
/// this will be called by producer thread;
/// </summary>
/// <param name="address"></param>
/// <param name="time"></param>
public void add(string address,DateTime? time) {
_dict.TryAdd(address, time);
}
/// <summary>
/// called by subscriber thread;
/// </summary>
/// <returns></returns>
public string? next() {
lock (_lock) {
var r = _dict.FirstOrDefault();
//return sortedList.FirstOrDefault().Value;
if (r.Key is null)
{
return r.Key;
}
if (r.Value is null)
{
_dict.TryRemove(r.Key, out var _);
return r.Key;
}
var key = r.Key;
foreach (var item in _dict.Skip(1) )
{
if (item.Value is null)
{
_dict.TryRemove(item.Key, out var _);
return item.Key;
}
if (item.Value< r.Value)
{
r=item;
}
}
_dict.TryRemove(key, out var _);
return key;
}
}
/// <summary>
/// this will be set to false by the producer thread;
/// </summary>
public bool _notComplete = true;
/// <summary>
/// shared configuration for subscribers;
/// </summary>
fs.addresses_.disjoint.deV_._bak.Io io; //.io_._CfgX.Create(cancel, git)
/// <summary>
/// run this in a separate thread other than <see cref="add(string, DateTime?)"/>
/// </summary>
/// <param name="sln"></param>
/// <returns></returns>
public async Task _asyn_ofAddress(string sln)
{
while (_notComplete)
{
var f = next();
if (f is null )
{
await Task.Delay(30*1000);
//await Task.Yield();
continue;
}
/// degree of concurrency is controlled by a semaphore; for instance, at most 4 are handled at a time:
new dev.srcs.each.sln_.delvable.Bak_srcsInAddresses(io)._startTask_ofAddress(sln);
}
}
}
For the above, I'm concerned about the while (_notComplete) part, as it looks like many iterations of that loop would do nothing. I think there should be a better way to remove the while loop by exploiting the fact that the collection can signal whether it is empty at various points, such as when we add.
There is probably a better implementation based on some mature framework. I have been considering the options below, but I keep stopping short at implementation details:
BlockingCollection
for this one, I don't know how to keep the collection sorted while items are added dynamically and the producer and consumer are both running;
Channel
again, I could not come up with an approach that fits my need after reading its examples;
Pipeline
I have not fully understood it;
Rx
I tried to implement an observable and an observer. It only gives me the macroscopic framework; when I get into the details, I end up with what I'm currently doing, and I begin to wonder whether I need Rx here at all.
Dataflow
Shall I implement my own BufferBlock or ActionBlock? It seems the built-in BufferBlock cannot be customized to sort items before releasing them to the next block.
Sorting buffered Observables seems similar to my problem, but it ends with a solution similar to the one I currently have and am not satisfied with, as stated above.
Could someone give me some sample code? Please be as concrete as you can; as you can see, I have researched the general ideas/paths, and what stops me short is the details, which are often glossed over in the docs.
I just found one solution that is better than my current one. I believe there are even better ones, so please do post your answers if you find some; my current one is just what I could hack together with what I know so far.
I found Prioritized queues in Task Parallel Library and wrote a similar one for my case:
using System;
using System.Collections;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Reactive.Subjects;
using System.Threading;
using System.Threading.Tasks;
namespace nilnul.dev.srcs.every.slns._bak
{
public class BySortedSet : IProducerConsumerCollection<(string, DateTime)>
{
private class _Comparer : IComparer<(string, DateTime)>
{
public int Compare((string, DateTime) first, (string, DateTime) second)
{
var returnValue = first.Item2.CompareTo(second.Item2);
if (returnValue == 0)
returnValue = first.Item1.CompareTo(second.Item1);
return returnValue;
}
static public _Comparer Singleton
{
get
{
return nilnul._obj.typ_.nilable_.unprimable_.Singleton<_Comparer>.Instance;// just some magic to get an instance
}
}
}
SortedSet<(string, DateTime)> _dict = new SortedSet<(string, DateTime)>(
_Comparer.Singleton
);
private object _lock=new object();
public int Count
{
get
{
lock(_lock){
return _dict.Count;
}
}
}
public object SyncRoot => _lock;
public bool IsSynchronized => true;
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
//throw new NotImplementedException();
}
public void CopyTo((string, DateTime)[] array, int index)
{
lock (_lock)
{
foreach (var item in _dict)
{
array[index++] = item;
}
}
}
public void CopyTo(Array array, int index)
{
lock (_lock)
{
foreach (var item in _dict)
{
array.SetValue(item, index++);
}
}
}
public bool TryAdd((string, DateTime) item)
{
lock (_lock)
{
return _dict.Add(item);
}
}
public bool TryTake(out (string, DateTime) item)
{
lock (_lock)
{
item = _dict.Min;
if (item==default)
{
return false;
}
return _dict.Remove(item);
}
}
public (string, DateTime)[] ToArray()
{
lock (_lock)
{
return this._dict.ToArray();
}
}
public IEnumerator<(string, DateTime)> GetEnumerator()
{
return ToArray().AsEnumerable().GetEnumerator();
}
/// <summary>
/// </summary>
/// <returns></returns>
public BlockingCollection<(string, DateTime)> asBlockingCollection() {
return new BlockingCollection<(string, DateTime)>(
this
);
}
}
}
Then I can use that like:
static public void ExampleUse(CancellationToken cancellationToken) {
var s = new BySortedSet().asBlockingCollection();
/// traversal thread:
s.Add(("", DateTime.MinValue));
//...
s.CompleteAdding();
/// tackler thread:
///
foreach (var item in s.GetConsumingEnumerable(cancellationToken))
{
/// process the item;
/// todo: degree of parallelism is controlled by the tackler, or is there a better way like in dataflow or Rx or sth else?
}
}
Thanks!
public class CacheController
{
public IMemoryCache _memoryCache {get; set;}
public string getCacheMethodOne(string token)
{
string cacheValue = null;
string cacheKey = null;
if (!_memoryCache.TryGetValue<string>("123456", out cacheValue))
{
cacheValue = token;
cacheKey = "123456";
var cacheEntryOptions = new MemoryCacheEntryOptions().SetSlidingExpiration(TimeSpan.FromMinutes(2));
_memoryCache.Set<string>("123456", cacheValue, cacheEntryOptions);
}
return cacheKey;
}
}
Problem with this line of code:
string otp = new CacheController().getCacheMethodOne(ClientJsonOtp.ToString());
It throws an exception:
Object reference not set to an instance of an object.
Should I create a new instance of IMemoryCache?
If I do so, will it affect the cache, since it may lose the previous cache instance?
try
{
var finalResult = result.Content.ReadAsStringAsync().Result;
var ClientJsonOtp = JsonConvert.DeserializeObject(finalResult);
string otp = new CacheController().getCacheMethodOne(ClientJsonOtp.ToString());
return Json(ClientJsonOtp, JsonRequestBehavior.AllowGet);
}
You need to create one, at least once. Otherwise it will always be null.
You can do that when you call the empty constructor:
public CacheController()
{
this._memoryCache = new MemoryCache(new MemoryCacheOptions()); // or whatever memory cache implementation you choose (this assumes Microsoft.Extensions.Caching.Memory)
}
You can even better inject it somewhere using dependency injection. The place depends on application type.
But best of all, try to have only one cache. Each time you create one you lose the previous, so either use the singleton pattern, or inject it with a single-instance configuration and let the DI container handle the rest.
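As a sketch with Microsoft.Extensions.DependencyInjection (assuming the Microsoft.Extensions.Caching.Memory package; adapt this to whatever container your application uses):

// Registration, done once at startup:
var services = new ServiceCollection();
services.AddMemoryCache(); // registers IMemoryCache as a singleton
var provider = services.BuildServiceProvider();

// Controllers then receive the shared instance through the constructor:
// public CacheController(IMemoryCache memoryCache) { _memoryCache = memoryCache; }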
An example for the singleton implementation: here
You can access by using:
Cache.Instance.Read(//what)
Here is the cache implementation
using System;
using System.Configuration;
using System.Runtime.Caching;
namespace Client.Project.HelperClasses
{
/// <summary>
/// Thread Safe Singleton Cache Class
/// </summary>
public sealed class Cache
{
private static volatile Cache instance; // volatile ensures other threads see the fully constructed instance
private static MemoryCache memoryCache;
private static object syncRoot = new Object();
private static string settingMemoryCacheName;
private static double settingCacheExpirationTimeInMinutes;
private Cache() { }
/// <summary>
/// Singleton Cache Instance
/// </summary>
public static Cache Instance
{
get
{
if (instance == null)
{
lock (syncRoot)
{
if (instance == null)
{
InitializeInstance();
}
}
}
return instance;
}
}
private static void InitializeInstance()
{
var appSettings = ConfigurationManager.AppSettings;
settingMemoryCacheName = appSettings["MemoryCacheName"];
if (settingMemoryCacheName == null)
throw new Exception("Please enter a name for the cache in app.config, under 'MemoryCacheName'");
if (! Double.TryParse(appSettings["CacheExpirationTimeInMinutes"], out settingCacheExpirationTimeInMinutes))
throw new Exception("Please enter how many minutes the cache should be kept in app.config, under 'CacheExpirationTimeInMinutes'");
instance = new Cache();
memoryCache = new MemoryCache(settingMemoryCacheName);
}
/// <summary>
/// Writes Key Value Pair to Cache
/// </summary>
/// <param name="Key">Key to associate Value with in Cache</param>
/// <param name="Value">Value to be stored in Cache associated with Key</param>
public void Write(string Key, object Value)
{
memoryCache.Add(Key, Value, DateTimeOffset.Now.AddMinutes(settingCacheExpirationTimeInMinutes));
}
/// <summary>
/// Returns Value stored in Cache
/// </summary>
/// <param name="Key"></param>
/// <returns>Value stored in cache</returns>
public object Read(string Key)
{
return memoryCache.Get(Key);
}
/// <summary>
/// Returns Value stored in Cache, null if non existent
/// </summary>
/// <param name="Key"></param>
/// <returns>Value stored in cache</returns>
public object TryRead(string Key)
{
try
{
return memoryCache.Get(Key);
}
catch (Exception)
{
return null;
}
}
}
}
I have read lots of information about page caching and partial page caching in a MVC application. However, I would like to know how you would cache data.
In my scenario I will be using LINQ to Entities (entity framework). On the first call to GetNames (or whatever the method is) I want to grab the data from the database. I want to save the results in cache and on the second call to use the cached version if it exists.
Can anyone show an example of how this would work, where it should be implemented (the model?), and whether it would work.
I have seen this done in traditional ASP.NET apps, typically for very static data.
Here's a nice and simple cache helper class/service I use:
using System.Runtime.Caching;
public class InMemoryCache: ICacheService
{
public T GetOrSet<T>(string cacheKey, Func<T> getItemCallback) where T : class
{
T item = MemoryCache.Default.Get(cacheKey) as T;
if (item == null)
{
item = getItemCallback();
MemoryCache.Default.Add(cacheKey, item, DateTime.Now.AddMinutes(10));
}
return item;
}
}
interface ICacheService
{
T GetOrSet<T>(string cacheKey, Func<T> getItemCallback) where T : class;
}
Usage:
cacheProvider.GetOrSet("cache key", (delegate method if cache is empty));
The cache provider will check whether there is anything stored under "cache key" in the cache; if there is not, it will call the delegate method to fetch the data and store it in the cache.
Example:
var products=cacheService.GetOrSet("catalog.products", ()=>productRepository.GetAll())
Reference the System.Web dll in your model and use System.Web.Caching.Cache
public string[] GetNames()
{
string[] names = Cache["names"] as string[];
if(names == null) //not in cache
{
names = DB.GetNames();
Cache["names"] = names;
}
return names;
}
A bit simplified but I guess that would work. This is not MVC specific and I have always used this method for caching data.
I'm referring to TT's post and suggest the following approach:
Reference the System.Web dll in your model and use System.Web.Caching.Cache
public string[] GetNames()
{
var noms = Cache["names"];
if(noms == null)
{
noms = DB.GetNames();
Cache["names"] = noms;
}
return ((string[])noms);
}
You should not return a value re-read from the cache, since you'll never know if at that specific moment it is still in the cache. Even if you inserted it in the statement before, it might already be gone or has never been added to the cache - you just don't know.
So you add the data read from the database and return it directly, not re-reading from the cache.
For .NET 4.5+ framework
add reference: System.Runtime.Caching
add using statement:
using System.Runtime.Caching;
public string[] GetNames()
{
var noms = System.Runtime.Caching.MemoryCache.Default["names"];
if(noms == null)
{
noms = DB.GetNames();
System.Runtime.Caching.MemoryCache.Default["names"] = noms;
}
return ((string[])noms);
}
In the .NET Framework 3.5 and earlier versions, ASP.NET provided an in-memory cache implementation in the System.Web.Caching namespace. In previous versions of the .NET Framework, caching was available only in the System.Web namespace and therefore required a dependency on ASP.NET classes. In the .NET Framework 4, the System.Runtime.Caching namespace contains APIs that are designed for both Web and non-Web applications.
More info:
https://msdn.microsoft.com/en-us/library/dd997357(v=vs.110).aspx
https://learn.microsoft.com/en-us/dotnet/framework/performance/caching-in-net-framework-applications
Steve Smith did two great blog posts which demonstrate how to use his CachedRepository pattern in ASP.NET MVC. It uses the repository pattern effectively and allows you to get caching without having to change your existing code.
http://ardalis.com/Introducing-the-CachedRepository-Pattern
http://ardalis.com/building-a-cachedrepository-via-strategy-pattern
In these two posts he shows you how to set up this pattern and also explains why it is useful. By using this pattern you get caching without your existing code seeing any of the caching logic. Essentially you use the cached repository as if it were any other repository.
I have used it in this way and it works for me.
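The gist of the pattern is a caching decorator that implements the same repository interface and wraps the real repository, so callers never see the caching. A rough sketch (the interface, entity, and cache key below are invented; System.Runtime.Caching is assumed for the cache itself):

using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Caching;

public interface IProductRepository
{
    IEnumerable<Product> List();
}

public class CachedProductRepository : IProductRepository
{
    private readonly IProductRepository _inner;

    public CachedProductRepository(IProductRepository inner)
    {
        _inner = inner;
    }

    public IEnumerable<Product> List()
    {
        // Consumers use this exactly like the real repository; the caching stays hidden.
        var cached = MemoryCache.Default.Get("products") as List<Product>;
        if (cached == null)
        {
            cached = _inner.List().ToList();
            MemoryCache.Default.Set("products", cached, DateTimeOffset.Now.AddMinutes(10));
        }
        return cached;
    }
}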
https://msdn.microsoft.com/en-us/library/system.web.caching.cache.add(v=vs.110).aspx
Parameter info for System.Web.Caching.Cache.Add.
public string GetInfo()
{
string name = string.Empty;
if(System.Web.HttpContext.Current.Cache["KeyName"] == null)
{
name = GetNameMethod();
System.Web.HttpContext.Current.Cache.Add("KeyName", name, null, DateTime.Noew.AddMinutes(5), Cache.NoSlidingExpiration, CacheitemPriority.AboveNormal, null);
}
else
{
name = System.Web.HttpContext.Current.Cache["KeyName"] as string;
}
return name;
}
AppFabric Caching is a distributed, in-memory caching technology that stores data in key-value pairs using physical memory across multiple servers. AppFabric provides performance and scalability improvements for .NET Framework applications. Concepts and Architecture
Extending #Hrvoje Hudo's answer...
Code:
using System;
using System.Runtime.Caching;
public class InMemoryCache : ICacheService
{
public TValue Get<TValue>(string cacheKey, int durationInMinutes, Func<TValue> getItemCallback) where TValue : class
{
TValue item = MemoryCache.Default.Get(cacheKey) as TValue;
if (item == null)
{
item = getItemCallback();
MemoryCache.Default.Add(cacheKey, item, DateTime.Now.AddMinutes(durationInMinutes));
}
return item;
}
public TValue Get<TValue, TId>(string cacheKeyFormat, TId id, int durationInMinutes, Func<TId, TValue> getItemCallback) where TValue : class
{
string cacheKey = string.Format(cacheKeyFormat, id);
TValue item = MemoryCache.Default.Get(cacheKey) as TValue;
if (item == null)
{
item = getItemCallback(id);
MemoryCache.Default.Add(cacheKey, item, DateTime.Now.AddMinutes(durationInMinutes));
}
return item;
}
}
interface ICacheService
{
TValue Get<TValue>(string cacheKey, int durationInMinutes, Func<TValue> getItemCallback) where TValue : class;
TValue Get<TValue, TId>(string cacheKeyFormat, TId id, int durationInMinutes, Func<TId, TValue> getItemCallback) where TValue : class;
}
Examples
Single item caching (when each item is cached based on its ID because caching the entire catalog for the item type would be too intensive).
Product product = cache.Get("product_{0}", productId, 10, productData.getProductById);
Caching all of something
IEnumerable<Categories> categories = cache.Get("categories", 20, categoryData.getCategories);
Why TId
The second helper is especially nice because most data keys are not composite. Additional methods could be added if you use composite keys often. This way you avoid doing all sorts of string concatenation or string.Format calls to build the key to pass to the cache helper. It also makes passing the data access method easier because you don't have to pass the ID into the wrapper method... the whole thing becomes very terse and consistent for the majority of use cases.
Here's an improvement to Hrvoje Hudo's answer. This implementation has a couple of key improvements:
Cache keys are created automatically based on the function to update data and the object passed in that specifies dependencies
Pass in time span for any cache duration
Uses a lock for thread safety
Note that this has a dependency on Newtonsoft.Json to serialize the dependsOn object, but that can be easily swapped out for any other serialization method.
ICache.cs
public interface ICache
{
T GetOrSet<T>(Func<T> getItemCallback, object dependsOn, TimeSpan duration) where T : class;
}
InMemoryCache.cs
using System;
using System.Reflection;
using System.Runtime.Caching;
using Newtonsoft.Json;
public class InMemoryCache : ICache
{
private static readonly object CacheLockObject = new object();
public T GetOrSet<T>(Func<T> getItemCallback, object dependsOn, TimeSpan duration) where T : class
{
string cacheKey = GetCacheKey(getItemCallback, dependsOn);
T item = MemoryCache.Default.Get(cacheKey) as T;
if (item == null)
{
lock (CacheLockObject)
{
// Re-check inside the lock so only one thread ends up populating the entry.
item = MemoryCache.Default.Get(cacheKey) as T;
if (item == null)
{
item = getItemCallback();
MemoryCache.Default.Add(cacheKey, item, DateTime.Now.Add(duration));
}
}
}
return item;
}
private string GetCacheKey<T>(Func<T> itemCallback, object dependsOn) where T : class
{
var serializedDependants = JsonConvert.SerializeObject(dependsOn);
// Use the target method (not the delegate type) so different callbacks get different keys.
MethodInfo method = itemCallback.Method;
return method.DeclaringType.FullName + "." + method.Name + serializedDependants;
}
}
Usage:
var order = _cache.GetOrSet(
() => _session.Set<Order>().SingleOrDefault(o => o.Id == orderId)
, new { id = orderId }
, new TimeSpan(0, 10, 0)
);
public sealed class CacheManager
{
private static volatile CacheManager instance;
private static object syncRoot = new Object();
private ObjectCache cache = null;
private CacheItemPolicy defaultCacheItemPolicy = null;
private CacheEntryRemovedCallback callback = null;
private bool allowCache = true;
private CacheManager()
{
cache = MemoryCache.Default;
callback = new CacheEntryRemovedCallback(this.CachedItemRemovedCallback);
defaultCacheItemPolicy = new CacheItemPolicy();
defaultCacheItemPolicy.AbsoluteExpiration = DateTime.Now.AddHours(1.0);
defaultCacheItemPolicy.RemovedCallback = callback;
allowCache = StringUtils.Str2Bool(ConfigurationManager.AppSettings["AllowCache"]);
}
public static CacheManager Instance
{
get
{
if (instance == null)
{
lock (syncRoot)
{
if (instance == null)
{
instance = new CacheManager();
}
}
}
return instance;
}
}
public IEnumerable GetCache(String Key)
{
if (Key == null || !allowCache)
{
return null;
}
try
{
String Key_ = Key;
if (cache.Contains(Key_))
{
return (IEnumerable)cache.Get(Key_);
}
else
{
return null;
}
}
catch (Exception)
{
return null;
}
}
public void ClearCache(string key)
{
AddCache(key, null);
}
public bool AddCache(String Key, IEnumerable data, CacheItemPolicy cacheItemPolicy = null)
{
if (!allowCache) return true;
try
{
if (Key == null)
{
return false;
}
if (cacheItemPolicy == null)
{
cacheItemPolicy = defaultCacheItemPolicy;
}
String Key_ = Key;
lock (Key_)
{
return cache.Add(Key_, data, cacheItemPolicy);
}
}
catch (Exception)
{
return false;
}
}
private void CachedItemRemovedCallback(CacheEntryRemovedArguments arguments)
{
String strLog = String.Concat("Reason: ", arguments.RemovedReason.ToString(), " | Key-Name: ", arguments.CacheItem.Key, " | Value-Object: ", arguments.CacheItem.Value.ToString());
LogManager.Instance.Info(strLog);
}
}
I use two classes. The first one is the cache core object:
public class Cacher<TValue>
where TValue : class
{
#region Properties
private Func<TValue> _init;
public string Key { get; private set; }
public TValue Value
{
get
{
var item = HttpRuntime.Cache.Get(Key) as TValue;
if (item == null)
{
item = _init();
HttpRuntime.Cache.Insert(Key, item); // use HttpRuntime.Cache consistently; HttpContext.Current can be null outside a request
}
return item;
}
}
#endregion
#region Constructor
public Cacher(string key, Func<TValue> init)
{
Key = key;
_init = init;
}
#endregion
#region Methods
public void Refresh()
{
HttpRuntime.Cache.Remove(Key);
}
#endregion
}
The second one is a list of cache objects:
public static class Caches
{
static Caches()
{
Languages = new Cacher<IEnumerable<Language>>("Languages", () =>
{
using (var context = new WordsContext())
{
return context.Languages.ToList();
}
});
}
public static Cacher<IEnumerable<Language>> Languages { get; private set; }
}
I will say that implementing a singleton for this persisting-data issue can be a solution, in case you find the previous solutions too complicated.
public class GPDataDictionary
{
private Dictionary<string, object> configDictionary = new Dictionary<string, object>();
/// <summary>
/// Configuration values dictionary
/// </summary>
public Dictionary<string, object> ConfigDictionary
{
get { return configDictionary; }
}
private static GPDataDictionary instance;
public static GPDataDictionary Instance
{
get
{
if (instance == null)
{
instance = new GPDataDictionary();
}
return instance;
}
}
// private constructor
private GPDataDictionary() { }
} // singleton
HttpContext.Current.Cache.Insert("subjectlist", subjectlist);
You can also try and use the caching built into ASP MVC:
Add the following attribute to the controller method you'd like to cache:
[OutputCache(Duration=10)]
In this case the ActionResult of this will be cached for 10 seconds.
More on this here
I'm building a LINQ extension to streamline database access through EF, and part of that process is mapping the data entity to the business entity.
I use a Dictionary<string, int> to decide what navigational properties to include, and to what depth.
Example:
public static class LinqExtensions
{
private static readonly Dictionary<string, int> Dictionary = new Dictionary<string, int>();
/// <summary>
/// Adds the navigational property identified by value to be included in the query and entity mapping, recursing a maximum of depth times.
/// </summary>
/// <param name="value">Navigational Property to add</param>
/// <param name="depth">Desired recursion depth</param>
public static TSource With<TSource>(this TSource source, string value, int depth = 0)
{
Dictionary.Add(value, depth);
return source;
}
/// <summary>
/// Clears the navigational property dictionary
/// </summary>
public static void Reset()
{
Dictionary.Clear();
}
/// <summary>
/// Builds and executes a query, dynamically including desired navigational properties in a asynchronous fashion.
/// The result is then mapped to the provided TResult business entity and returned as a list.
/// </summary>
/// <returns>Null or a list of mapped domain Entities</returns>
public static async Task<IEnumerable<TResult>> BuildQueryAsync<TSource, TResult>(this IQueryable<TSource> dbEntity) where TResult : class, new()
{
var query = dbEntity;
var localDictionary = new Dictionary<string, int>(Dictionary);
Reset();
foreach (var i in localDictionary)
{
query = query.Include(i.Key);
}
List<TSource> result = await (from entity in query select entity).ToListAsync();
return Equals(result, default(TSource)) ? null : result.Select(u => u.BuildEntity(new TResult(), localDictionary));
}
/// <summary>
/// Maps values from sourceEntity to targetEntity, recursing into properties defined in localDictionary.
/// </summary>
public static TTarget BuildEntity<TSource, TTarget>(this TSource sourceEntity, TTarget targetEntity, Dictionary<string, int> localDictionary)
{
return (TTarget)targetEntity.InjectFrom(new SinglePropertyDepthInjection(localDictionary), sourceEntity);
}
}
This lets me access stuff in my repository and services as follows:
public override async Task<IEnumerable<User>> GetAllAsync()
{
return await _context.Users.With("Messages", 1).With("Notifications", 2).BuildQueryAsync<Data.Entities.User, User>();
}
Now I'm well aware that this isn't feasible, due to static properties being shared across all requests.
I know I could easily add a dictionary as a method parameter and solve it like this:
public override async Task<IEnumerable<User>> GetAllAsync()
{
var dict = new Dictionary<string, int>();
dict.Add("Messages", 1);
dict.Add("Notifications", 2);
return await _context.Users.BuildQueryAsync<Data.Entities.User, User>(dict);
}
But I was wondering if there is perhaps a more elegant solution, ideally keeping it as part of the LINQ query.
I know there is HttpContext.Current, but since the methods involved are async I'm not sure how good of an idea it is to go back onto the context thread.
Any ideas?
I think CallContext might be what you are looking for.
In combination with the disposable pattern, such things can be scoped pretty easily.
public class IncludeScope : IDisposable
{
private const string CallContextKey = "IncludeScopKey";
private object oldValue;
public IncludeScope(IDictionary<string,int> values)
{
this.oldValue = CallContext.GetData(CallContextKey);
this.Includes = new Dictionary<string,int>(values);
CallContext.SetData(CallContextKey, this);
}
public Dictionary<string,int> Includes { get; private set; }
public static IncludeScope Current {
get { return CallContext.GetData(CallContextKey) as IncludeScope; }
}
private bool _disposed;
protected virtual bool IsDisposed
{
get
{
return _disposed;
}
}
~IncludeScope()
{
Dispose(false);
}
public void Dispose()
{
Dispose(true);
GC.SuppressFinalize(this);
}
protected virtual void Dispose(bool disposing)
{
if (!_disposed) {
if (disposing) {
CallContext.SetData(CallContextKey, oldValue);
}
_disposed = true;
}
}
}
The scope can be declared like this.
using (var scope = new IncludeScope(new Dictionary<string, int> { { "Messages", 1 }, { "Notifications", 2 } })) {
var query = await GetQueryAsync<User>();
…
}
In any method call within the using the scope can be accessed like this.
private static Task<IQueryable<T>> GetQueryAsync<T>() where T : class {
IQueryable<T> baseQuery = context.Set<T>();
foreach (var include in IncludeScope.Current.Includes) {
baseQuery = baseQuery.Include(include.Key); // apply each configured navigational property
}
return Task.FromResult(baseQuery);
}
I have created a cache using the MemoryCache class. I add some items to it but when I need to reload the cache I want to clear it first. What is the quickest way to do this? Should I loop through all the items and remove them one at a time or is there a better way?
Dispose the existing MemoryCache and create a new MemoryCache object.
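Something along these lines, assuming you hold the cache in a field you control rather than using MemoryCache.Default (the field and cache names are illustrative):

private MemoryCache _cache = new MemoryCache("MyCache");

public void ResetCache()
{
    var old = _cache;
    _cache = new MemoryCache("MyCache"); // readers immediately see a fresh, empty cache
    old.Dispose();                       // releases the entries held by the old instance
}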
The problem with enumeration
The MemoryCache.GetEnumerator() Remarks section warns: "Retrieving an enumerator for a MemoryCache instance is a resource-intensive and blocking operation. Therefore, the enumerator should not be used in production applications."
Here's why, explained in pseudocode of the GetEnumerator() implementation:
Create a new Dictionary object (let's call it AllCache)
For Each per-processor segment in the cache (one Dictionary object per processor)
{
Lock the segment/Dictionary (using lock construct)
Iterate through the segment/Dictionary and add each name/value pair one-by-one
to the AllCache Dictionary (using references to the original MemoryCacheKey
and MemoryCacheEntry objects)
}
Create and return an enumerator on the AllCache Dictionary
Since the implementation splits the cache across multiple Dictionary objects, it must bring everything together into a single collection in order to hand back an enumerator. Every call to GetEnumerator executes the full copy process detailed above. The newly created Dictionary contains references to the original internal key and value objects, so your actual cached data values are not duplicated.
The warning in the documentation is correct. Avoid GetEnumerator() -- including the answers here that use LINQ queries.
A better and more flexible solution
Here's an efficient way of clearing the cache that simply builds on the existing change monitoring infrastructure. It also provides the flexibility to clear either the entire cache or just a named subset and has none of the problems discussed above.
// By Thomas F. Abraham (http://www.tfabraham.com)
namespace CacheTest
{
using System;
using System.Diagnostics;
using System.Globalization;
using System.Runtime.Caching;
public class SignaledChangeEventArgs : EventArgs
{
public string Name { get; private set; }
public SignaledChangeEventArgs(string name = null) { this.Name = name; }
}
/// <summary>
/// Cache change monitor that allows an app to fire a change notification
/// to all associated cache items.
/// </summary>
public class SignaledChangeMonitor : ChangeMonitor
{
// Shared across all SignaledChangeMonitors in the AppDomain
private static event EventHandler<SignaledChangeEventArgs> Signaled;
private string _name;
private string _uniqueId = Guid.NewGuid().ToString("N", CultureInfo.InvariantCulture);
public override string UniqueId
{
get { return _uniqueId; }
}
public SignaledChangeMonitor(string name = null)
{
_name = name;
// Register instance with the shared event
SignaledChangeMonitor.Signaled += OnSignalRaised;
base.InitializationComplete();
}
public static void Signal(string name = null)
{
if (Signaled != null)
{
// Raise shared event to notify all subscribers
Signaled(null, new SignaledChangeEventArgs(name));
}
}
protected override void Dispose(bool disposing)
{
SignaledChangeMonitor.Signaled -= OnSignalRaised;
}
private void OnSignalRaised(object sender, SignaledChangeEventArgs e)
{
if (string.IsNullOrWhiteSpace(e.Name) || string.Compare(e.Name, _name, true) == 0)
{
Debug.WriteLine(
_uniqueId + " notifying cache of change.", "SignaledChangeMonitor");
// Cache objects are obligated to remove entry upon change notification.
base.OnChanged(null);
}
}
}
public static class CacheTester
{
public static void TestCache()
{
MemoryCache cache = MemoryCache.Default;
// Add data to cache
for (int idx = 0; idx < 50; idx++)
{
cache.Add("Key" + idx.ToString(), "Value" + idx.ToString(), GetPolicy(idx));
}
// Flush cached items associated with "NamedData" change monitors
SignaledChangeMonitor.Signal("NamedData");
// Flush all cached items
SignaledChangeMonitor.Signal();
}
private static CacheItemPolicy GetPolicy(int idx)
{
string name = (idx % 2 == 0) ? null : "NamedData";
CacheItemPolicy cip = new CacheItemPolicy();
cip.AbsoluteExpiration = System.DateTimeOffset.UtcNow.AddHours(1);
cip.ChangeMonitors.Add(new SignaledChangeMonitor(name));
return cip;
}
}
}
From http://connect.microsoft.com/VisualStudio/feedback/details/723620/memorycache-class-needs-a-clear-method
The workaround is:
List<string> cacheKeys = MemoryCache.Default.Select(kvp => kvp.Key).ToList();
foreach (string cacheKey in cacheKeys)
{
MemoryCache.Default.Remove(cacheKey);
}
var cacheItems = cache.ToList();
foreach (KeyValuePair<String, Object> a in cacheItems)
{
cache.Remove(a.Key);
}
If performance isn't an issue then this nice one-liner will do the trick:
cache.ToList().ForEach(a => cache.Remove(a.Key));
It seems that there is a Trim method.
So to clear all contents you'd just do
cache.Trim(100)
EDIT:
after digging some more, it seems that looking into Trim is not worth your time
https://connect.microsoft.com/VisualStudio/feedback/details/831755/memorycache-trim-method-doesnt-evict-100-of-the-items
How do I clear a System.Runtime.Caching.MemoryCache
Ran across this, and based on it, wrote a slightly more effective, parallel clear method:
public void ClearAll()
{
var allKeys = _cache.Select(o => o.Key);
Parallel.ForEach(allKeys, key => _cache.Remove(key));
}
You could also do something like this:
Dim _Qry = (From n In CacheObject.AsParallel()
Select n).ToList()
For Each i In _Qry
CacheObject.Remove(i.Key)
Next
You can dispose the MemoryCache.Default cache and then re-set the private field singleton to null, to make it recreate the MemoryCache.Default.
var field = typeof(MemoryCache).GetField("s_defaultCache",
BindingFlags.Static |
BindingFlags.NonPublic);
field.SetValue(null, null);
I was only interested in clearing the cache and found this as an option, when using the C# GlobalCachingProvider:
var cache = GlobalCachingProvider.Instance.GetAllItems();
if (dbOperation.SuccessLoadingAllCacheToDB(cache))
{
cache.Clear();
}
A slightly improved version of Magritte's answer:
var cacheKeys = MemoryCache.Default.Where(kvp => kvp.Value is MyType).Select(kvp => kvp.Key).ToList();
foreach (string cacheKey in cacheKeys)
{
MemoryCache.Default.Remove(cacheKey);
}
This discussion is also being done here:
https://learn.microsoft.com/en-us/answers/answers/983399/view.html
I wrote an answer there and I'll transcribe it here:
using System.Collections.Generic;
using Microsoft.Extensions.Caching.Memory;
using ServiceStack;
public static class IMemoryCacheExtensions
{
static readonly List<object> entries = new();
/// <summary>
/// Removes all entries, added via the "TryGetValueExtension()" method
/// </summary>
/// <param name="cache"></param>
public static void Clear(this IMemoryCache cache)
{
for (int i = 0; i < entries.Count; i++)
{
cache.Remove(entries[i]);
}
entries.Clear();
}
/// <summary>
/// Use this extension method, to be able to remove all your entries later using "Clear()" method
/// </summary>
/// <typeparam name="TItem"></typeparam>
/// <param name="cache"></param>
/// <param name="key"></param>
/// <param name="value"></param>
/// <returns></returns>
public static bool TryGetValueExtension<TItem>(this IMemoryCache cache, object key, out TItem value)
{
entries.AddIfNotExists(key);
if (cache.TryGetValue(key, out object result))
{
if (result == null)
{
value = default;
return true;
}
if (result is TItem item)
{
value = item;
return true;
}
}
value = default;
return false;
}
}