I have created a cache using the MemoryCache class. I add some items to it but when I need to reload the cache I want to clear it first. What is the quickest way to do this? Should I loop through all the items and remove them one at a time or is there a better way?
Dispose the existing MemoryCache and create a new MemoryCache object.
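A minimal sketch of that approach (assuming you hold the cache behind a field you control rather than using MemoryCache.Default directly; the Interlocked swap is just one way to keep concurrent readers safe):
using System.Runtime.Caching;
using System.Threading;
public static class CacheHolder
{
    // All readers go through this field, so the whole cache can be replaced at once.
    private static MemoryCache _cache = new MemoryCache("MyCache");
    public static MemoryCache Cache
    {
        get { return _cache; }
    }
    public static void Clear()
    {
        // Swap in a fresh, empty cache and dispose the old one.
        MemoryCache old = Interlocked.Exchange(ref _cache, new MemoryCache("MyCache"));
        old.Dispose();
    }
}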
The problem with enumeration
The MemoryCache.GetEnumerator() Remarks section warns: "Retrieving an enumerator for a MemoryCache instance is a resource-intensive and blocking operation. Therefore, the enumerator should not be used in production applications."
Here's why, explained in pseudocode of the GetEnumerator() implementation:
Create a new Dictionary object (let's call it AllCache)
For Each per-processor segment in the cache (one Dictionary object per processor)
{
Lock the segment/Dictionary (using lock construct)
Iterate through the segment/Dictionary and add each name/value pair one-by-one
to the AllCache Dictionary (using references to the original MemoryCacheKey
and MemoryCacheEntry objects)
}
Create and return an enumerator on the AllCache Dictionary
Since the implementation splits the cache across multiple Dictionary objects, it must bring everything together into a single collection in order to hand back an enumerator. Every call to GetEnumerator executes the full copy process detailed above. The newly created Dictionary contains references to the original internal key and value objects, so your actual cached data values are not duplicated.
The warning in the documentation is correct: avoid GetEnumerator(), including the answers here that clear the cache via LINQ queries, since LINQ enumerates the cache under the covers.
A better and more flexible solution
Here's an efficient way of clearing the cache that simply builds on the existing change monitoring infrastructure. It also provides the flexibility to clear either the entire cache or just a named subset and has none of the problems discussed above.
// By Thomas F. Abraham (http://www.tfabraham.com)
namespace CacheTest
{
using System;
using System.Diagnostics;
using System.Globalization;
using System.Runtime.Caching;
public class SignaledChangeEventArgs : EventArgs
{
public string Name { get; private set; }
public SignaledChangeEventArgs(string name = null) { this.Name = name; }
}
/// <summary>
/// Cache change monitor that allows an app to fire a change notification
/// to all associated cache items.
/// </summary>
public class SignaledChangeMonitor : ChangeMonitor
{
// Shared across all SignaledChangeMonitors in the AppDomain
private static event EventHandler<SignaledChangeEventArgs> Signaled;
private string _name;
private string _uniqueId = Guid.NewGuid().ToString("N", CultureInfo.InvariantCulture);
public override string UniqueId
{
get { return _uniqueId; }
}
public SignaledChangeMonitor(string name = null)
{
_name = name;
// Register instance with the shared event
SignaledChangeMonitor.Signaled += OnSignalRaised;
base.InitializationComplete();
}
public static void Signal(string name = null)
{
if (Signaled != null)
{
// Raise shared event to notify all subscribers
Signaled(null, new SignaledChangeEventArgs(name));
}
}
protected override void Dispose(bool disposing)
{
SignaledChangeMonitor.Signaled -= OnSignalRaised;
}
private void OnSignalRaised(object sender, SignaledChangeEventArgs e)
{
if (string.IsNullOrWhiteSpace(e.Name) || string.Compare(e.Name, _name, true) == 0)
{
Debug.WriteLine(
_uniqueId + " notifying cache of change.", "SignaledChangeMonitor");
// Cache objects are obligated to remove entry upon change notification.
base.OnChanged(null);
}
}
}
public static class CacheTester
{
public static void TestCache()
{
MemoryCache cache = MemoryCache.Default;
// Add data to cache
for (int idx = 0; idx < 50; idx++)
{
cache.Add("Key" + idx.ToString(), "Value" + idx.ToString(), GetPolicy(idx));
}
// Flush cached items associated with "NamedData" change monitors
SignaledChangeMonitor.Signal("NamedData");
// Flush all cached items
SignaledChangeMonitor.Signal();
}
private static CacheItemPolicy GetPolicy(int idx)
{
string name = (idx % 2 == 0) ? null : "NamedData";
CacheItemPolicy cip = new CacheItemPolicy();
cip.AbsoluteExpiration = System.DateTimeOffset.UtcNow.AddHours(1);
cip.ChangeMonitors.Add(new SignaledChangeMonitor(name));
return cip;
}
}
}
From http://connect.microsoft.com/VisualStudio/feedback/details/723620/memorycache-class-needs-a-clear-method
The workaround is:
List<string> cacheKeys = MemoryCache.Default.Select(kvp => kvp.Key).ToList();
foreach (string cacheKey in cacheKeys)
{
MemoryCache.Default.Remove(cacheKey);
}
var cacheItems = cache.ToList();
foreach (KeyValuePair<String, Object> a in cacheItems)
{
cache.Remove(a.Key);
}
If performance isn't an issue then this nice one-liner will do the trick:
cache.ToList().ForEach(a => cache.Remove(a.Key));
It seems that there is a Trim method.
So to clear all contents you'd just do
cache.Trim(100)
EDIT:
After digging some more, it seems that Trim is not worth your time -- it does not reliably evict 100% of the items:
https://connect.microsoft.com/VisualStudio/feedback/details/831755/memorycache-trim-method-doesnt-evict-100-of-the-items
Ran across How do I clear a System.Runtime.Caching.MemoryCache and, based on it, wrote a slightly more efficient, parallel clear method:
public void ClearAll()
{
var allKeys = _cache.Select(o => o.Key);
Parallel.ForEach(allKeys, key => _cache.Remove(key));
}
You could also do something like this:
Dim _Qry = (From n In CacheObject.AsParallel()
Select n).ToList()
For Each i In _Qry
CacheObject.Remove(i.Key)
Next
You can dispose the MemoryCache.Default cache and then re-set the private field singleton to null, to make it recreate the MemoryCache.Default.
// Dispose the existing default cache first, then null out the private singleton field
// so the next access to MemoryCache.Default creates a fresh instance.
// Note: s_defaultCache is an internal implementation detail and may change between framework versions.
MemoryCache.Default.Dispose();
var field = typeof(MemoryCache).GetField("s_defaultCache",
                BindingFlags.Static |
                BindingFlags.NonPublic);
field.SetValue(null, null);
I was only interested in clearing the cache and found this as an option when using the C# GlobalCachingProvider:
var cache = GlobalCachingProvider.Instance.GetAllItems();
if (dbOperation.SuccessLoadingAllCacheToDB(cache))
{
cache.Clear();
}
A slightly improved version of magritte's answer:
var cacheKeys = MemoryCache.Default.Where(kvp => kvp.Value is MyType).Select(kvp => kvp.Key).ToList();
foreach (string cacheKey in cacheKeys)
{
MemoryCache.Default.Remove(cacheKey);
}
This discussion is also taking place here:
https://learn.microsoft.com/en-us/answers/answers/983399/view.html
I wrote an answer there and I'll transcribe it here:
using System.Collections.Generic;
using Microsoft.Extensions.Caching.Memory;
using ServiceStack;
public static class IMemoryCacheExtensions
{
static readonly List<object> entries = new();
/// <summary>
/// Removes all entries, added via the "TryGetValueExtension()" method
/// </summary>
/// <param name="cache"></param>
public static void Clear(this IMemoryCache cache)
{
for (int i = 0; i < entries.Count; i++)
{
cache.Remove(entries[i]);
}
entries.Clear();
}
/// <summary>
/// Use this extension method, to be able to remove all your entries later using "Clear()" method
/// </summary>
/// <typeparam name="TItem"></typeparam>
/// <param name="cache"></param>
/// <param name="key"></param>
/// <param name="value"></param>
/// <returns></returns>
public static bool TryGetValueExtension<TItem>(this IMemoryCache cache, object key, out TItem value)
{
entries.AddIfNotExists(key);
if (cache.TryGetValue(key, out object result))
{
if (result == null)
{
value = default;
return true;
}
if (result is TItem item)
{
value = item;
return true;
}
}
value = default;
return false;
}
}
I need to traverse a collection of disjoint folders; each folder has a visited time configured somewhere inside it.
I then sort the folders and process the one with the earliest visited time first. Note that the processing is generally slower than the traversing.
My code targets .NET Framework 4.8.1; currently my implementation is as follows:
public class BySeparateThread
{
ConcurrentDictionary<string, DateTime?> _dict = new ConcurrentDictionary<string, DateTime?>();
private object _lock = new object();
/// <summary>
/// this will be called by producer thread;
/// </summary>
/// <param name="address"></param>
/// <param name="time"></param>
public void add(string address,DateTime? time) {
_dict.TryAdd(address, time);
}
/// <summary>
/// called by subscriber thread;
/// </summary>
/// <returns></returns>
public string? next() {
lock (_lock) {
var r = _dict.FirstOrDefault();
//return sortedList.FirstOrDefault().Value;
if (r.Key is null)
{
return r.Key;
}
if (r.Value is null)
{
_dict.TryRemove(r.Key, out var _);
return r.Key;
}
var key = r.Key;
foreach (var item in _dict.Skip(1) )
{
if (item.Value is null)
{
_dict.TryRemove(item.Key, out var _);
return item.Key;
}
if (item.Value< r.Value)
{
r=item;
}
}
_dict.TryRemove(key, out var _);
return key;
}
}
/// <summary>
/// this will be assigned of false by producer thread;
/// </summary>
public bool _notComplete = true;
/// <summary>
/// shared configuration for subscribers;
/// </summary>
fs.addresses_.disjoint.deV_._bak.Io io; //.io_._CfgX.Create(cancel, git)
/// <summary>
/// run this in a separate thread other than <see cref="add(string, DateTime?)"/>
/// </summary>
/// <param name="sln"></param>
/// <returns></returns>
public async Task _asyn_ofAddress(string sln)
{
while (_notComplete)
{
var f = next();
if (f is null )
{
await Task.Delay(30*1000);
//await Task.Yield();
continue;
}
/// degree of concurrency is controlled by a semaphore; for instance, at most 4 are tackled:
new dev.srcs.each.sln_.delvable.Bak_srcsInAddresses(io)._startTask_ofAddress(sln);
}
}
}
For the above, I'm concerned about the while (_notComplete) part, as it looks like there would be many loop iterations doing nothing. I think there should be a better way to remove the while loop by using the fact that the collection can signal whether it is empty at various points, such as when we add.
There is probably a better implementation based on some mature framework, such as the ones I have been considering these days, but I often get stuck on the implementation details:
BlockingCollection
for this one, I don't know how to keep the collection sorted while items are added dynamically and the producer and subscriber are both running;
Channel
again, I could not come up with something fitting my need after reading its examples;
Pipeline
I have not fully understood it;
Rx
I tried to implement an observable and an observer. It only gives me a macroscopic framework, and when I get into the details I end up with what I'm currently doing, so I begin to wonder whether I even need Rx here.
Dataflow
Shall I implement my own BufferBlock or ActionBlock? It seems the built-in BufferBlock cannot be customized to sort items before releasing them to the next block.
Sorting buffered Observables seems similar to my problem, but it ends with a solution similar to the one I currently have and am not satisfied with, as stated above.
Could someone give me some sample code? Please make the code as concrete as you can; as you can see, I have researched some general ideas/paths, and what finally stops me short is the details, which are often glossed over in the docs.
I just found one solution which is better than my current one. I believe there are even better ones, so please do post your answers if you find some; my current one is just what I could hack together with what I know so far.
I found Prioritized queues in Task Parallel Library and wrote a similar one for my case:
using System;
using System.Collections;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Reactive.Subjects;
using System.Threading;
using System.Threading.Tasks;
namespace nilnul.dev.srcs.every.slns._bak
{
public class BySortedSet : IProducerConsumerCollection<(string, DateTime)>
{
private class _Comparer : IComparer<(string, DateTime)>
{
public int Compare((string, DateTime) first, (string, DateTime) second)
{
var returnValue = first.Item2.CompareTo(second.Item2);
if (returnValue == 0)
returnValue = first.Item1.CompareTo(second.Item1);
return returnValue;
}
static public _Comparer Singleton
{
get
{
return nilnul._obj.typ_.nilable_.unprimable_.Singleton<_Comparer>.Instance;// just some magic to get an instance
}
}
}
SortedSet<(string, DateTime)> _dict = new SortedSet<(string, DateTime)>(
_Comparer.Singleton
);
private object _lock=new object();
public int Count
{
get
{
lock(_lock){
return _dict.Count;
}
}
}
public object SyncRoot => _lock;
public bool IsSynchronized => true;
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
//throw new NotImplementedException();
}
public void CopyTo((string, DateTime)[] array, int index)
{
lock (_lock)
{
foreach (var item in _dict)
{
array[index++] = item;
}
}
}
public void CopyTo(Array array, int index)
{
lock (_lock)
{
foreach (var item in _dict)
{
array.SetValue(item, index++);
}
}
}
public bool TryAdd((string, DateTime) item)
{
lock (_lock)
{
return _dict.Add(item);
}
}
public bool TryTake(out (string, DateTime) item)
{
lock (_lock)
{
item = _dict.Min;
if (item==default)
{
return false;
}
return _dict.Remove(item);
}
}
public (string, DateTime)[] ToArray()
{
lock (_lock)
{
return this._dict.ToArray();
}
}
public IEnumerator<(string, DateTime)> GetEnumerator()
{
return ToArray().AsEnumerable().GetEnumerator();
}
/// <summary>
/// </summary>
/// <returns></returns>
public BlockingCollection<(string, DateTime)> asBlockingCollection() {
return new BlockingCollection<(string, DateTime)>(
this
);
}
}
}
Then I can use that like:
static public void ExampleUse(CancellationToken cancellationToken) {
var s = new BySortedSet().asBlockingCollection();
/// traversal thread:
s.Add(("", DateTime.MinValue));
//...
s.CompleteAdding();
/// tackler thread:
///
foreach (var item in s.GetConsumingEnumerable(cancellationToken))
{
/// process the item;
/// todo: degree of parallelism is controlled by the tackler, or is there a better way, like in Dataflow or Rx or something else?
}
}
Thanks!
I have read lots of information about page caching and partial page caching in a MVC application. However, I would like to know how you would cache data.
In my scenario I will be using LINQ to Entities (entity framework). On the first call to GetNames (or whatever the method is) I want to grab the data from the database. I want to save the results in cache and on the second call to use the cached version if it exists.
Can anyone show an example of how this would work, where this should be implemented (model?) and if it would work.
I have seen this done in traditional ASP.NET apps, typically for very static data.
Here's a nice and simple cache helper class/service I use:
using System.Runtime.Caching;
public class InMemoryCache: ICacheService
{
public T GetOrSet<T>(string cacheKey, Func<T> getItemCallback) where T : class
{
T item = MemoryCache.Default.Get(cacheKey) as T;
if (item == null)
{
item = getItemCallback();
MemoryCache.Default.Add(cacheKey, item, DateTime.Now.AddMinutes(10));
}
return item;
}
}
interface ICacheService
{
T GetOrSet<T>(string cacheKey, Func<T> getItemCallback) where T : class;
}
Usage:
cacheProvider.GetOrSet("cache key", (delegate method if cache is empty));
The cache provider will check if there's anything stored under the given cache key, and if there isn't, it will call the delegate method to fetch the data and store it in the cache.
Example:
var products=cacheService.GetOrSet("catalog.products", ()=>productRepository.GetAll())
Reference the System.Web dll in your model and use System.Web.Caching.Cache
public string[] GetNames()
{
string[] names = Cache["names"] as string[];
if(names == null) //not in cache
{
names = DB.GetNames();
Cache["names"] = names;
}
return names;
}
A bit simplified but I guess that would work. This is not MVC specific and I have always used this method for caching data.
I'm referring to TT's post and suggest the following approach:
Reference the System.Web dll in your model and use System.Web.Caching.Cache
public string[] GetNames()
{
var noms = Cache["names"];
if(noms == null)
{
noms = DB.GetNames();
Cache["names"] = noms;
}
return ((string[])noms);
}
You should not return a value re-read from the cache, since you'll never know if at that specific moment it is still in the cache. Even if you inserted it in the statement before, it might already be gone or has never been added to the cache - you just don't know.
So you add the data read from the database and return it directly, not re-reading from the cache.
For .NET 4.5+ framework
add reference: System.Runtime.Caching
add using statement:
using System.Runtime.Caching;
public string[] GetNames()
{
var noms = System.Runtime.Caching.MemoryCache.Default["names"];
if(noms == null)
{
noms = DB.GetNames();
System.Runtime.Caching.MemoryCache.Default["names"] = noms;
}
return ((string[])noms);
}
In the .NET Framework 3.5 and earlier versions, ASP.NET provided an in-memory cache implementation in the System.Web.Caching namespace. In previous versions of the .NET Framework, caching was available only in the System.Web namespace and therefore required a dependency on ASP.NET classes. In the .NET Framework 4, the System.Runtime.Caching namespace contains APIs that are designed for both Web and non-Web applications.
More info:
https://msdn.microsoft.com/en-us/library/dd997357(v=vs.110).aspx
https://learn.microsoft.com/en-us/dotnet/framework/performance/caching-in-net-framework-applications
Steve Smith did two great blog posts which demonstrate how to use his CachedRepository pattern in ASP.NET MVC. It uses the repository pattern effectively and allows you to get caching without having to change your existing code.
http://ardalis.com/Introducing-the-CachedRepository-Pattern
http://ardalis.com/building-a-cachedrepository-via-strategy-pattern
In these two posts he shows you how to set up this pattern and also explains why it is useful. By using this pattern you get caching without your existing code seeing any of the caching logic. Essentially you use the cached repository as if it were any other repository.
I have used it in this way and it works for me.
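The gist of the pattern is a caching decorator that implements the same interface as the real repository. This is my own hedged sketch, not Steve Smith's exact code; IAlbumRepository and Album are made-up names for illustration:
using System;
using System.Collections.Generic;
using System.Runtime.Caching;
public class Album
{
    public string Title { get; set; }
}
public interface IAlbumRepository
{
    IEnumerable<Album> List();
}
public class CachedAlbumRepository : IAlbumRepository
{
    private readonly IAlbumRepository _inner;
    public CachedAlbumRepository(IAlbumRepository inner)
    {
        _inner = inner;
    }
    public IEnumerable<Album> List()
    {
        // Callers use this class exactly like the real repository;
        // only the decorator knows anything about caching.
        var albums = MemoryCache.Default.Get("albums") as IEnumerable<Album>;
        if (albums == null)
        {
            albums = _inner.List();
            MemoryCache.Default.Add("albums", albums, DateTimeOffset.Now.AddMinutes(10));
        }
        return albums;
    }
}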
https://msdn.microsoft.com/en-us/library/system.web.caching.cache.add(v=vs.110).aspx
Parameter info for System.Web.Caching.Cache.Add:
public string GetInfo()
{
string name = string.Empty;
if(System.Web.HttpContext.Current.Cache["KeyName"] == null)
{
name = GetNameMethod();
System.Web.HttpContext.Current.Cache.Add("KeyName", name, null, DateTime.Now.AddMinutes(5), Cache.NoSlidingExpiration, CacheItemPriority.AboveNormal, null);
}
else
{
name = System.Web.HttpContext.Current.Cache["KeyName"] as string;
}
return name;
}
AppFabric Caching is a distributed, in-memory caching technology that stores data in key-value pairs using physical memory across multiple servers. AppFabric provides performance and scalability improvements for .NET Framework applications. Concepts and Architecture
Extending #Hrvoje Hudo's answer...
Code:
using System;
using System.Runtime.Caching;
public class InMemoryCache : ICacheService
{
public TValue Get<TValue>(string cacheKey, int durationInMinutes, Func<TValue> getItemCallback) where TValue : class
{
TValue item = MemoryCache.Default.Get(cacheKey) as TValue;
if (item == null)
{
item = getItemCallback();
MemoryCache.Default.Add(cacheKey, item, DateTime.Now.AddMinutes(durationInMinutes));
}
return item;
}
public TValue Get<TValue, TId>(string cacheKeyFormat, TId id, int durationInMinutes, Func<TId, TValue> getItemCallback) where TValue : class
{
string cacheKey = string.Format(cacheKeyFormat, id);
TValue item = MemoryCache.Default.Get(cacheKey) as TValue;
if (item == null)
{
item = getItemCallback(id);
MemoryCache.Default.Add(cacheKey, item, DateTime.Now.AddMinutes(durationInMinutes));
}
return item;
}
}
interface ICacheService
{
    TValue Get<TValue>(string cacheKey, int durationInMinutes, Func<TValue> getItemCallback) where TValue : class;
    TValue Get<TValue, TId>(string cacheKeyFormat, TId id, int durationInMinutes, Func<TId, TValue> getItemCallback) where TValue : class;
}
Examples
Single item caching (when each item is cached based on its ID because caching the entire catalog for the item type would be too intensive).
Product product = cache.Get("product_{0}", productId, 10, productData.getProductById);
Caching all of something
IEnumerable<Categories> categories = cache.Get("categories", 20, categoryData.getCategories);
Why TId
The second helper is especially nice because most data keys are not composite. Additional methods could be added if you use composite keys often. In this way you avoid doing all sorts of string concatenation or string.Format calls to build the key to pass to the cache helper. It also makes passing the data access method easier, because you don't have to pass the ID into the wrapper method... the whole thing becomes very terse and consistent for the majority of use cases.
Here's an improvement to Hrvoje Hudo's answer. This implementation has a couple of key improvements:
Cache keys are created automatically based on the function to update data and the object passed in that specifies dependencies
Pass in time span for any cache duration
Uses a lock for thread safety
Note that this has a dependency on Newtonsoft.Json to serialize the dependsOn object, but that can be easily swapped out for any other serialization method.
ICache.cs
public interface ICache
{
T GetOrSet<T>(Func<T> getItemCallback, object dependsOn, TimeSpan duration) where T : class;
}
InMemoryCache.cs
using System;
using System.Reflection;
using System.Runtime.Caching;
using Newtonsoft.Json;
public class InMemoryCache : ICache
{
private static readonly object CacheLockObject = new object();
public T GetOrSet<T>(Func<T> getItemCallback, object dependsOn, TimeSpan duration) where T : class
{
string cacheKey = GetCacheKey(getItemCallback, dependsOn);
T item = MemoryCache.Default.Get(cacheKey) as T;
if (item == null)
{
    lock (CacheLockObject)
    {
        // Re-check inside the lock so only one thread runs the callback and populates the entry.
        item = MemoryCache.Default.Get(cacheKey) as T;
        if (item == null)
        {
            item = getItemCallback();
            MemoryCache.Default.Add(cacheKey, item, DateTime.Now.Add(duration));
        }
    }
}
return item;
}
private string GetCacheKey<T>(Func<T> itemCallback, object dependsOn) where T: class
{
var serializedDependants = JsonConvert.SerializeObject(dependsOn);
var methodType = itemCallback.GetType();
return methodType.FullName + serializedDependants;
}
}
Usage:
var order = _cache.GetOrSet(
() => _session.Set<Order>().SingleOrDefault(o => o.Id == orderId)
, new { id = orderId }
, new TimeSpan(0, 10, 0)
);
public sealed class CacheManager
{
private static volatile CacheManager instance;
private static object syncRoot = new Object();
private ObjectCache cache = null;
private CacheItemPolicy defaultCacheItemPolicy = null;
private CacheEntryRemovedCallback callback = null;
private bool allowCache = true;
private CacheManager()
{
cache = MemoryCache.Default;
callback = new CacheEntryRemovedCallback(this.CachedItemRemovedCallback);
defaultCacheItemPolicy = new CacheItemPolicy();
defaultCacheItemPolicy.AbsoluteExpiration = DateTime.Now.AddHours(1.0);
defaultCacheItemPolicy.RemovedCallback = callback;
allowCache = StringUtils.Str2Bool(ConfigurationManager.AppSettings["AllowCache"]);
}
public static CacheManager Instance
{
get
{
if (instance == null)
{
lock (syncRoot)
{
if (instance == null)
{
instance = new CacheManager();
}
}
}
return instance;
}
}
public IEnumerable GetCache(String Key)
{
if (Key == null || !allowCache)
{
return null;
}
try
{
String Key_ = Key;
if (cache.Contains(Key_))
{
return (IEnumerable)cache.Get(Key_);
}
else
{
return null;
}
}
catch (Exception)
{
return null;
}
}
public void ClearCache(string key)
{
AddCache(key, null);
}
public bool AddCache(String Key, IEnumerable data, CacheItemPolicy cacheItemPolicy = null)
{
if (!allowCache) return true;
try
{
if (Key == null)
{
return false;
}
if (cacheItemPolicy == null)
{
cacheItemPolicy = defaultCacheItemPolicy;
}
String Key_ = Key;
lock (Key_)
{
return cache.Add(Key_, data, cacheItemPolicy);
}
}
catch (Exception)
{
return false;
}
}
private void CachedItemRemovedCallback(CacheEntryRemovedArguments arguments)
{
String strLog = String.Concat("Reason: ", arguments.RemovedReason.ToString(), " | Key-Name: ", arguments.CacheItem.Key, " | Value-Object: ", arguments.CacheItem.Value.ToString());
LogManager.Instance.Info(strLog);
}
}
I use two classes. First one the cache core object:
public class Cacher<TValue>
where TValue : class
{
#region Properties
private Func<TValue> _init;
public string Key { get; private set; }
public TValue Value
{
get
{
var item = HttpRuntime.Cache.Get(Key) as TValue;
if (item == null)
{
item = _init();
HttpContext.Current.Cache.Insert(Key, item);
}
return item;
}
}
#endregion
#region Constructor
public Cacher(string key, Func<TValue> init)
{
Key = key;
_init = init;
}
#endregion
#region Methods
public void Refresh()
{
HttpRuntime.Cache.Remove(Key);
}
#endregion
}
Second one is list of cache objects:
public static class Caches
{
static Caches()
{
Languages = new Cacher<IEnumerable<Language>>("Languages", () =>
{
using (var context = new WordsContext())
{
return context.Languages.ToList();
}
});
}
public static Cacher<IEnumerable<Language>> Languages { get; private set; }
}
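For context, using the two classes above might look like this (a short sketch based on the code shown):
// The lambda passed to the Cacher only runs on a cache miss.
IEnumerable<Language> languages = Caches.Languages.Value;
// Force a reload on the next access.
Caches.Languages.Refresh();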
I will say that implementing a Singleton for this data-persistence issue can be a solution, in case you find the previous solutions too complicated.
public class GPDataDictionary
{
private Dictionary<string, object> configDictionary = new Dictionary<string, object>();
/// <summary>
/// Configuration values dictionary
/// </summary>
public Dictionary<string, object> ConfigDictionary
{
get { return configDictionary; }
}
private static GPDataDictionary instance;
public static GPDataDictionary Instance
{
get
{
if (instance == null)
{
instance = new GPDataDictionary();
}
return instance;
}
}
// private constructor
private GPDataDictionary() { }
} // singleton
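Usage could then look like this (a sketch; subjectlist stands in for whatever collection you want to keep around, and note that a plain Dictionary is not thread-safe, so synchronize access if several threads touch it):
// Store the value for the lifetime of the AppDomain...
GPDataDictionary.Instance.ConfigDictionary["subjectlist"] = subjectlist;
// ...and read it back later.
var subjects = GPDataDictionary.Instance.ConfigDictionary["subjectlist"] as List<string>;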
You can also put the data straight into the ASP.NET cache:
HttpContext.Current.Cache.Insert("subjectlist", subjectlist);
You can also try and use the caching built into ASP MVC:
Add the following attribute to the controller method you'd like to cache:
[OutputCache(Duration=10)]
In this case the ActionResult of this will be cached for 10 seconds.
More on this here
I started using PostSharp recently, mainly because I want to implement Method Cache. I found few examples of how to do it, but all these examples are based on implementing a cache method attribute and decorating methods which results should be cached:
http://www.postsharp.net/blog/post/SOLID-Caching
http://doc.postsharp.net/example-cache
I would like to implement the cache in a way that lets me set some additional parameters for each method at run time, based on configuration. For example, I would like to register a particular method's cache with an individual expiration time, but I do not know that value at compile time.
What I do not know is how to do this at runtime, without decorating the methods.
This should be simple, though you will need to fill in the details for your own case. The important parts follow; I'm pasting my code, so some variables are custom to us, modify them accordingly. Once you have created the code below, apply the [ViewrCache] attribute to the relevant methods and PostSharp will weave the interception into them accordingly.
Create a custom caching attribute extended from PostSharp.Aspects.MethodInterceptionAspect as follows:
[Serializable]
public sealed class ViewrCacheAttribute : MethodInterceptionAspect
Class Variables:
// Cache Expiration from Config file
private static readonly int CacheTimeOut = Convert.ToInt32(WebConfigurationManager.AppSettings[CacheSettings.CacheTimeOutKey]);
// Parameters to be ignored from the Cache Key
private string[] IgnoreParameters { get; set; }
Override the method OnInvoke(MethodInterceptionArgs args) as follows:
// My custom implementation that you may want to change
public override void OnInvoke(MethodInterceptionArgs args)
{
// Fetch standard Cache
var cache = MemoryCache.Default;
// Fetch Cache Key using the Method arguments
string cacheKey = GetCacheKey(args); // Code pasted below
var cacheValue = cache.Get(cacheKey);
if (cacheValue != null)
{
args.ReturnValue = cacheValue;
return;
}
ReloadCache(cacheKey, args);
}
// Get Cache Key method
private string GetCacheKey(MethodInterceptionArgs args)
{
//need to exclude paging Parameters from Key
var builder = new StringBuilder();
var returnKey = new ViewrCacheKey { CallingMethodFullName = args.Method.ToString() };
// Loop through the Call arguments / parameters
foreach (var argument in args.Arguments)
{
if (argument != null)
{
if ((IgnoreParameters == null) ||
(IgnoreParameters != null && !IgnoreParameters.Contains(argument.ToString())))
builder.Append(JsonConvert.SerializeObject(argument));
}
}
returnKey.ControllerHash = builder.ToString();
return JsonConvert.SerializeObject(returnKey);
}
// Reload Cache
private Object ReloadCache(string cacheKey, MethodInterceptionArgs args)
{
//call the actual Method
base.OnInvoke(args);
//Save result into local cache
InsertCache(cacheKey, args.ReturnValue);
return args.ReturnValue;
}
// Insert Cache
private void InsertCache(string key, object value)
{
// Setting Cache Item Policy
var policy = new CacheItemPolicy
{
SlidingExpiration = new TimeSpan(0, 0, CacheTimeOut)
};
// sliding Expiration Timeout in Seconds
ObjectCache cache = MemoryCache.Default;
// Set the key,value in the cache
cache.Set(key, value, policy);
}
// ViewR Cache Key
public class ViewrCacheKey
{
/// <summary>
///
/// </summary>
public string CallingMethodFullName { get; set; }
/// <summary>
///
/// </summary>
public string ControllerHash { get; set; }
}
I need to cache a generic list so I don't have to query the database multiple times. In a web application I would just add it to HttpContext.Current.Cache. What is the proper way to cache objects in console applications?
Keep it as an instance member of the containing class. In a web app you can't do this, since the page class's object is recreated on every request.
However, .NET 4.0 also has the MemoryCache class for this purpose.
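A minimal sketch of the MemoryCache option in a console app (add a reference to System.Runtime.Caching; LoadNamesFromDatabase is a made-up stand-in for your data access):
using System;
using System.Runtime.Caching;
class Program
{
    static void Main()
    {
        ObjectCache cache = MemoryCache.Default;
        var names = cache.Get("names") as string[];
        if (names == null)
        {
            names = LoadNamesFromDatabase();
            cache.Add("names", names, DateTimeOffset.Now.AddMinutes(10));
        }
        Console.WriteLine(string.Join(", ", names));
    }
    static string[] LoadNamesFromDatabase()
    {
        // Stand-in for a real database call.
        return new[] { "Alice", "Bob" };
    }
}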
In a class-level variable. Presumably, in the main method of your console app you instantiate at least one object of some sort. In this object's class, you declare a class-level variable (a List<String> or whatever) in which you cache whatever needs caching.
Here is a very simple cache class I use in console apps; it cleans itself up and is easy to use.
The Usage:
return Cache.Get("MyCacheKey", 30, () => { return new Model.Guide().ChannelListings.BuildChannelList(); });
The Class:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Timers;
namespace MyAppNamespace
{
public static class Cache
{
private static Timer cleanupTimer = new Timer() { AutoReset = true, Enabled = true, Interval = 60000 };
private static readonly Dictionary<string, CacheItem> internalCache = new Dictionary<string, CacheItem>();
static Cache()
{
cleanupTimer.Elapsed += Clean;
cleanupTimer.Start();
}
private static void Clean(object sender, ElapsedEventArgs e)
{
internalCache.Keys.ToList().ForEach(x => { try { if (internalCache[x].ExpireTime <= e.SignalTime) { Remove(x); } } catch (Exception) { /*swallow it*/ } });
}
public static T Get<T>(string key, int expiresMinutes, Func<T> refreshFunction)
{
if (internalCache.ContainsKey(key) && internalCache[key].ExpireTime > DateTime.Now)
{
return (T)internalCache[key].Item;
}
var result = refreshFunction();
Set(key, result, expiresMinutes);
return result;
}
public static void Set(string key, object item, int expiresMinutes)
{
Remove(key);
internalCache.Add(key, new CacheItem(item, expiresMinutes));
}
public static void Remove(string key)
{
if (internalCache.ContainsKey(key))
{
internalCache.Remove(key);
}
}
private struct CacheItem
{
public CacheItem(object item, int expiresMinutes)
: this()
{
Item = item;
ExpireTime = DateTime.Now.AddMinutes(expiresMinutes);
}
public object Item { get; private set; }
public DateTime ExpireTime { get; private set; }
}
}
}
// Consider this pseudo code for using Cache
public DataSet GetMySearchData(string search)
{
// if it is in my cache already (notice search criteria is the cache key)
string cacheKey = "Search " + search;
if (Cache[cacheKey] != null)
{
return (DataSet)(Cache[cacheKey]);
}
else
{
DataSet result = yourDAL.DoSearch(search);
Cache.Insert(cacheKey, result); // There are more params available here...
return result;
}
}
Ref: How do I cache a dataset to stop round trips to db?
You might be able to just use a simple Dictionary. The thing that makes the Cache so special in the web environment is that it persists and is scoped in such a way that many users can access it. In a console app, you don't have those issues. If your needs are simple enough, the dictionary or similar structures can be used to quickly lookup values you pull out of a database.
There are many ways to implement caches, depending on what exactly you are doing. Usually you will use a dictionary to hold the cached values. Here is my simple implementation of a cache, which caches values only for a limited time:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
namespace CySoft.Collections
{
public class Cache<TKey,TValue>
{
private readonly Dictionary<TKey, CacheItem> _cache = new Dictionary<TKey, CacheItem>();
private TimeSpan _maxCachingTime;
/// <summary>
/// Creates a cache which holds the cached values for an infinite time.
/// </summary>
public Cache()
: this(TimeSpan.MaxValue)
{
}
/// <summary>
/// Creates a cache which holds the cached values for a limited time only.
/// </summary>
/// <param name="maxCachingTime">Maximum time for which a value is to be held in the cache.</param>
public Cache(TimeSpan maxCachingTime)
{
_maxCachingTime = maxCachingTime;
}
/// <summary>
/// Tries to get a value from the cache. If the cache contains the value and the maximum caching time is
/// not exceeded (if any is defined), then the cached value is returned, else a new value is created.
/// </summary>
/// <param name="key">Key of the value to get.</param>
/// <param name="createValue">Function creating a new value.</param>
/// <returns>A cached or a new value.</returns>
public TValue Get(TKey key, Func<TValue> createValue)
{
CacheItem cacheItem;
if (_cache.TryGetValue(key, out cacheItem) && (DateTime.Now - cacheItem.CacheTime) <= _maxCachingTime) {
return cacheItem.Item;
}
TValue value = createValue();
_cache[key] = new CacheItem(value);
return value;
}
private struct CacheItem
{
public CacheItem(TValue item)
: this()
{
Item = item;
CacheTime = DateTime.Now;
}
public TValue Item { get; private set; }
public DateTime CacheTime { get; private set; }
}
}
}
You can pass a lambda expression to the Get method, which retrieves values from a db for instance.
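For example (a sketch; Customer and GetCustomerFromDb are hypothetical placeholders for your own type and data-access call):
var customerCache = new Cache<int, Customer>(TimeSpan.FromMinutes(10));
// The lambda only runs when the key is missing or the cached entry has expired.
Customer customer = customerCache.Get(42, () => GetCustomerFromDb(42));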
Use Singleton Pattern.
http://msdn.microsoft.com/en-us/library/ff650316.aspx
I'm looking for a priority queue with an interface like this:
class PriorityQueue<T>
{
public void Enqueue(T item, int priority)
{
}
public T Dequeue()
{
}
}
All the implementations I've seen assume that item is an IComparable but I don't like this approach; I want to specify the priority when I'm pushing it onto the queue.
If a ready-made implementation doesn't exist, what's the best way to go about doing this myself? What underlying data structure should I use? Some sort of self-balancing tree, or what? A standard C#.net structure would be nice.
If you have an existing priority queue implementation based on IComparable, you can easily use that to build the structure you need:
public class CustomPriorityQueue<T> // where T need NOT be IComparable
{
private class PriorityQueueItem : IComparable<PriorityQueueItem>
{
private readonly T _item;
private readonly int _priority;
// obvious constructor, CompareTo implementation and Item accessor
}
// the existing PQ implementation where the item *does* need to be IComparable
private readonly PriorityQueue<PriorityQueueItem> _inner = new PriorityQueue<PriorityQueueItem>();
public void Enqueue(T item, int priority)
{
_inner.Enqueue(new PriorityQueueItem(item, priority));
}
public T Dequeue()
{
return _inner.Dequeue().Item;
}
}
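For completeness, the elided PriorityQueueItem members from the sketch above might look like this (a straightforward fill-in of what the comment describes):
private class PriorityQueueItem : IComparable<PriorityQueueItem>
{
    private readonly T _item;
    private readonly int _priority;
    public PriorityQueueItem(T item, int priority)
    {
        _item = item;
        _priority = priority;
    }
    public T Item
    {
        get { return _item; }
    }
    public int CompareTo(PriorityQueueItem other)
    {
        // Lower numbers dequeue first; flip the comparison for a max-priority queue.
        return _priority.CompareTo(other._priority);
    }
}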
You can add safety checks and what not, but here is a very simple implementation using SortedList:
class PriorityQueue<T> {
SortedList<Pair<int>, T> _list;
int count;
public PriorityQueue() {
_list = new SortedList<Pair<int>, T>(new PairComparer<int>());
}
public void Enqueue(T item, int priority) {
_list.Add(new Pair<int>(priority, count), item);
count++;
}
public T Dequeue() {
T item = _list[_list.Keys[0]];
_list.RemoveAt(0);
return item;
}
}
I'm assuming that smaller values of priority correspond to higher priority items (this is easy to modify).
If multiple threads will be accessing the queue you will need to add a locking mechanism too. This is easy, but let me know if you need guidance here.
Note that SortedList is implemented internally over sorted arrays (lookups are a binary search, but insertion is O(n)); SortedDictionary is the binary-search-tree-based alternative if insertion cost matters.
The above implementation needs the following helper classes. This addresses Lasse V. Karlsen's comment that items with the same priority cannot be added using the naive implementation with a SortedList.
class Pair<T> {
public T First { get; private set; }
public T Second { get; private set; }
public Pair(T first, T second) {
First = first;
Second = second;
}
public override int GetHashCode() {
return First.GetHashCode() ^ Second.GetHashCode();
}
public override bool Equals(object other) {
Pair<T> pair = other as Pair<T>;
if (pair == null) {
return false;
}
return (this.First.Equals(pair.First) && this.Second.Equals(pair.Second));
}
}
class PairComparer<T> : IComparer<Pair<T>> where T : IComparable {
public int Compare(Pair<T> x, Pair<T> y) {
if (x.First.CompareTo(y.First) < 0) {
return -1;
}
else if (x.First.CompareTo(y.First) > 0) {
return 1;
}
else {
return x.Second.CompareTo(y.Second);
}
}
}
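Usage of the SortedList-based queue is then straightforward; items with equal priority come out in insertion order because the count is part of the key:
var pq = new PriorityQueue<string>();
pq.Enqueue("low", 5);
pq.Enqueue("high", 1);
pq.Enqueue("also high", 1);
Console.WriteLine(pq.Dequeue()); // "high"
Console.WriteLine(pq.Dequeue()); // "also high"
Console.WriteLine(pq.Dequeue()); // "low"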
You could write a wrapper around one of the existing implementations that modifies the interface to your preference:
using System;
class PriorityQueueThatYouDontLike<T> where T: IComparable<T>
{
public void Enqueue(T item) { throw new NotImplementedException(); }
public T Dequeue() { throw new NotImplementedException(); }
}
class PriorityQueue<T>
{
class ItemWithPriority : IComparable<ItemWithPriority>
{
public ItemWithPriority(T t, int priority)
{
Item = t;
Priority = priority;
}
public T Item {get; private set;}
public int Priority {get; private set;}
public int CompareTo(ItemWithPriority other)
{
return Priority.CompareTo(other.Priority);
}
}
PriorityQueueThatYouDontLike<ItemWithPriority> q = new PriorityQueueThatYouDontLike<ItemWithPriority>();
public void Enqueue(T item, int priority)
{
q.Enqueue(new ItemWithPriority(item, priority));
}
public T Dequeue()
{
return q.Dequeue().Item;
}
}
This is the same as itowlson's suggestion. I just took longer to write mine because I filled out more of the methods. :-s
Here's a very simple lightweight implementation that has O(log(n)) performance for both push and pop. It uses a heap data structure built on top of a List<T>.
/// <summary>Implements a priority queue of T, where T has an ordering.</summary>
/// Elements may be added to the queue in any order, but when we pull
/// elements out of the queue, they will be returned in 'ascending' order.
/// Adding new elements into the queue may be done at any time, so this is
/// useful to implement a dynamically growing and shrinking queue. Both adding
/// an element and removing the first element are log(N) operations.
///
/// The queue is implemented using a priority-heap data structure. For more
/// details on this elegant and simple data structure see "Programming Pearls"
/// in our library. The tree is implemented atop a list, where 2N and 2N+1 are
/// the child nodes of node N. The tree is balanced and left-aligned so there
/// are no 'holes' in this list.
/// <typeparam name="T">Type T, should implement IComparable[T];</typeparam>
public class PriorityQueue<T> where T : IComparable<T> {
/// <summary>Clear all the elements from the priority queue</summary>
public void Clear () {
mA.Clear ();
}
/// <summary>Add an element to the priority queue - O(log(n)) time operation.</summary>
/// <param name="item">The item to be added to the queue</param>
public void Add (T item) {
// We add the item to the end of the list (at the bottom of the
// tree). Then, the heap-property could be violated between this element
// and it's parent. If this is the case, we swap this element with the
// parent (a safe operation to do since the element is known to be less
// than it's parent). Now the element move one level up the tree. We repeat
// this test with the element and it's new parent. The element, if lesser
// than everybody else in the tree will eventually bubble all the way up
// to the root of the tree (or the head of the list). It is easy to see
// this will take log(N) time, since we are working with a balanced binary
// tree.
int n = mA.Count; mA.Add (item);
while (n != 0) {
int p = n / 2; // This is the 'parent' of this item
if (mA[n].CompareTo (mA[p]) >= 0) break; // Item >= parent
T tmp = mA[n]; mA[n] = mA[p]; mA[p] = tmp; // Swap item and parent
n = p; // And continue
}
}
/// <summary>Returns the number of elements in the queue.</summary>
public int Count {
get { return mA.Count; }
}
/// <summary>Returns true if the queue is empty.</summary>
/// Trying to call Peek() or Next() on an empty queue will throw an exception.
/// Check using Empty first before calling these methods.
public bool Empty {
get { return mA.Count == 0; }
}
/// <summary>Allows you to look at the first element waiting in the queue, without removing it.</summary>
/// This element will be the one that will be returned if you subsequently call Next().
public T Peek () {
return mA[0];
}
/// <summary>Removes and returns the first element from the queue (least element)</summary>
/// <returns>The first element in the queue, in ascending order.</returns>
public T Next () {
// The element to return is of course the first element in the array,
// or the root of the tree. However, this will leave a 'hole' there. We
// fill up this hole with the last element from the array. This will
// break the heap property. So we bubble the element downwards by swapping
// it with it's lower child until it reaches it's correct level. The lower
// child (one of the original elements with index 1 or 2) will now be at the
// head of the queue (root of the tree).
T val = mA[0];
int nMax = mA.Count - 1;
mA[0] = mA[nMax]; mA.RemoveAt (nMax); // Move the last element to the top
int p = 0;
while (true) {
// c is the child we want to swap with. If there
// is no child at all, then the heap is balanced
int c = p * 2; if (c >= nMax) break;
// If the second child is smaller than the first, that's the one
// we want to swap with this parent.
if (c + 1 < nMax && mA[c + 1].CompareTo (mA[c]) < 0) c++;
// If the parent is already smaller than this smaller child, then
// we are done
if (mA[p].CompareTo (mA[c]) <= 0) break;
// Otherwise, swap parent and child, and follow down the parent
T tmp = mA[p]; mA[p] = mA[c]; mA[c] = tmp;
p = c;
}
return val;
}
/// <summary>The List we use for implementation.</summary>
List<T> mA = new List<T> ();
}
That is the exact interface used by my highly optimized C# priority-queue.
It was developed specifically for pathfinding applications (A*, etc.), but should work perfectly for any other application as well.
public class User
{
public string Name { get; private set; }
public User(string name)
{
Name = name;
}
}
...
var priorityQueue = new SimplePriorityQueue<User>();
priorityQueue.Enqueue(new User("Jason"), 1);
priorityQueue.Enqueue(new User("Joseph"), 10);
//Because it's a min-priority queue, the following line will return "Jason"
User user = priorityQueue.Dequeue();
What would be so terrible about something like this?
class PriorityQueue<TItem, TPriority> where TPriority : IComparable
{
private SortedList<TPriority, Queue<TItem>> pq = new SortedList<TPriority, Queue<TItem>>();
public int Count { get; private set; }
public void Enqueue(TItem item, TPriority priority)
{
++Count;
if (!pq.ContainsKey(priority)) pq[priority] = new Queue<TItem>();
pq[priority].Enqueue(item);
}
public TItem Dequeue()
{
--Count;
var queue = pq.ElementAt(0).Value;
if (queue.Count == 1) pq.RemoveAt(0);
return queue.Dequeue();
}
}
class PriorityQueue<TItem> : PriorityQueue<TItem, int> { }
I realise that your question specifically asks for a non-IComparable-based implementation, but I want to point out a recent article from Visual Studio Magazine.
http://visualstudiomagazine.com/articles/2012/11/01/priority-queues-with-c.aspx
This article, together with @itowlson's answer, can give a complete answer.
A little late, but I'll add it here for reference:
https://github.com/ERufian/Algs4-CSharp
Key-value-pair priority queues are implemented in Algs4/IndexMaxPQ.cs, Algs4/IndexMinPQ.cs and Algs4/IndexPQDictionary.cs
Notes:
If the Priorities are not IComparable's, an IComparer can be specified in the constructor
Instead of enqueueing the object and its priority, what is enqueued is an index and its priority (and, for the original question, a separate List or T[] would be needed to convert that index to the expected result)
.NET 6 finally offers a built-in PriorityQueue<TElement, TPriority> API.
See here
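For example, System.Collections.Generic.PriorityQueue<TElement, TPriority> matches the requested shape, with the priority supplied at enqueue time:
var queue = new PriorityQueue<string, int>();
queue.Enqueue("low priority", 10);
queue.Enqueue("high priority", 1);
// It is a min-priority queue: smaller priority values dequeue first.
Console.WriteLine(queue.Dequeue()); // "high priority"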
Seems like you could roll your own with a series of Queues, one for each priority, held in a Dictionary keyed by priority, and just add each item to the appropriate one.
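A hedged sketch of that idea, using a SortedDictionary of queues so the bucket with the lowest priority value can be found quickly (fine when the set of priority values is small; this is just one way to spell it out, not the author's code):
using System;
using System.Collections.Generic;
using System.Linq;
class BucketPriorityQueue<T>
{
    // One FIFO queue per priority value, kept ordered by priority.
    private readonly SortedDictionary<int, Queue<T>> _buckets = new SortedDictionary<int, Queue<T>>();
    public void Enqueue(T item, int priority)
    {
        Queue<T> bucket;
        if (!_buckets.TryGetValue(priority, out bucket))
        {
            bucket = new Queue<T>();
            _buckets[priority] = bucket;
        }
        bucket.Enqueue(item);
    }
    public T Dequeue()
    {
        if (_buckets.Count == 0) throw new InvalidOperationException("Queue is empty.");
        // Take from the bucket with the smallest priority value first.
        var first = _buckets.First();
        T item = first.Value.Dequeue();
        if (first.Value.Count == 0) _buckets.Remove(first.Key);
        return item;
    }
}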