Completely revising the earlier version: can the following implementation serve as a thread-safe List implementation? I just need to know whether it would truly be thread safe or not; I know that performance-wise there would still be issues. The current version uses ReaderWriterLockSlim; I have another implementation using lock that does the same job.
using System.Collections.Generic;
using System.Threading;

/// <summary>
/// Thread safe version of the List using ReaderWriterLockSlim
/// </summary>
/// <typeparam name="T"></typeparam>
public class ThreadSafeListWithRWLock<T> : IList<T>
{
    // Internal private list which would be accessed in a thread safe manner
    private List<T> internalList;

    // ReaderWriterLockSlim object to take care of thread safe access between multiple readers and writers
    private readonly ReaderWriterLockSlim rwLockList;

    /// <summary>
    /// Public constructor with variable initialization code
    /// </summary>
    public ThreadSafeListWithRWLock()
    {
        internalList = new List<T>();
        rwLockList = new ReaderWriterLockSlim();
    }

    /// <summary>
    /// Get the Enumerator to the Thread safe list
    /// </summary>
    /// <returns></returns>
    public IEnumerator<T> GetEnumerator()
    {
        return Clone().GetEnumerator();
    }

    /// <summary>
    /// System.Collections.IEnumerable.GetEnumerator implementation to get the IEnumerator type
    /// </summary>
    /// <returns></returns>
    System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
    {
        return Clone().GetEnumerator();
    }

    /// <summary>
    /// Clone method to create an in-memory copy of the Thread safe list
    /// </summary>
    /// <returns></returns>
    public List<T> Clone()
    {
        List<T> clonedList = new List<T>();
        rwLockList.EnterReadLock();
        internalList.ForEach(element => { clonedList.Add(element); });
        rwLockList.ExitReadLock();
        return clonedList;
    }

    /// <summary>
    /// Add an item to the Thread safe list
    /// </summary>
    /// <param name="item"></param>
    public void Add(T item)
    {
        rwLockList.EnterWriteLock();
        internalList.Add(item);
        rwLockList.ExitWriteLock();
    }

    /// <summary>
    /// Remove an item from the Thread safe list
    /// </summary>
    /// <param name="item"></param>
    /// <returns></returns>
    public bool Remove(T item)
    {
        bool isRemoved;
        rwLockList.EnterWriteLock();
        isRemoved = internalList.Remove(item);
        rwLockList.ExitWriteLock();
        return isRemoved;
    }

    /// <summary>
    /// Clear all elements of the Thread safe list
    /// </summary>
    public void Clear()
    {
        rwLockList.EnterWriteLock();
        internalList.Clear();
        rwLockList.ExitWriteLock();
    }

    /// <summary>
    /// Check whether an item is contained in the Thread safe list
    /// </summary>
    /// <param name="item"></param>
    /// <returns></returns>
    public bool Contains(T item)
    {
        bool containsItem;
        rwLockList.EnterReadLock();
        containsItem = internalList.Contains(item);
        rwLockList.ExitReadLock();
        return containsItem;
    }

    /// <summary>
    /// Copy elements of the Thread safe list to a compatible array, starting at the specified index in the array
    /// </summary>
    /// <param name="array"></param>
    /// <param name="arrayIndex"></param>
    public void CopyTo(T[] array, int arrayIndex)
    {
        rwLockList.EnterReadLock();
        internalList.CopyTo(array, arrayIndex);
        rwLockList.ExitReadLock();
    }

    /// <summary>
    /// Count of elements in the Thread safe list
    /// </summary>
    public int Count
    {
        get
        {
            int count;
            rwLockList.EnterReadLock();
            count = internalList.Count;
            rwLockList.ExitReadLock();
            return count;
        }
    }

    /// <summary>
    /// Check whether the Thread safe list is read only
    /// </summary>
    public bool IsReadOnly
    {
        get { return false; }
    }

    /// <summary>
    /// Index of an item in the Thread safe list
    /// </summary>
    /// <param name="item"></param>
    /// <returns></returns>
    public int IndexOf(T item)
    {
        int itemIndex;
        rwLockList.EnterReadLock();
        itemIndex = internalList.IndexOf(item);
        rwLockList.ExitReadLock();
        return itemIndex;
    }

    /// <summary>
    /// Insert an item at a specified index in the Thread safe list
    /// </summary>
    /// <param name="index"></param>
    /// <param name="item"></param>
    public void Insert(int index, T item)
    {
        rwLockList.EnterWriteLock();
        if (index <= internalList.Count - 1 && index >= 0)
            internalList.Insert(index, item);
        rwLockList.ExitWriteLock();
    }

    /// <summary>
    /// Remove an item at a specified index in the Thread safe list
    /// </summary>
    /// <param name="index"></param>
    public void RemoveAt(int index)
    {
        rwLockList.EnterWriteLock();
        if (index <= internalList.Count - 1 && index >= 0)
            internalList.RemoveAt(index);
        rwLockList.ExitWriteLock();
    }

    /// <summary>
    /// Indexer for the Thread safe list
    /// </summary>
    /// <param name="index"></param>
    /// <returns></returns>
    public T this[int index]
    {
        get
        {
            T returnItem = default(T);
            rwLockList.EnterReadLock();
            if (index <= internalList.Count - 1 && index >= 0)
                returnItem = internalList[index];
            rwLockList.ExitReadLock();
            return returnItem;
        }
        set
        {
            rwLockList.EnterWriteLock();
            if (index <= internalList.Count - 1 && index >= 0)
                internalList[index] = value;
            rwLockList.ExitWriteLock();
        }
    }
}
Implementing a custom List<T> that encapsulates thread-safety is rarely worth the effort. You're likely best off just using lock whenever you access the List<T>.
But being in a performance-intensive industry myself, I have seen cases where this becomes a bottleneck. The main drawback to lock is the possibility of context switching, which is, relatively speaking, extremely expensive in both wall-clock time and CPU cycles.
The best way around this is using immutability. Have all readers access an immutable list and writers "update" it using Interlocked operations to replace it with a new instance. This is a lock-free design that makes reads free of synchronization and writes lock-free (eliminates context switching).
I will stress that in almost all cases this is overkill and I wouldn't even consider going down this path unless you're positive you need to and you understand the drawbacks. A couple of the obvious ones are readers getting point-in-time snapshots and wasting memory creating copies.
ImmutableList<T>, now shipped in the System.Collections.Immutable package (formerly Microsoft.Bcl.Immutable), is also worth a look. It's entirely thread-safe.
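For illustration, here is a minimal sketch of this pattern using ImmutableList<T> and ImmutableInterlocked (the class and field names are illustrative, not from the question):

using System.Collections.Generic;
using System.Collections.Immutable;
using System.Threading;

// Illustrative sketch: readers take no locks at all; writers copy-and-swap.
public class ImmutableItems
{
    private ImmutableList<string> items = ImmutableList<string>.Empty;

    public void Add(string item)
    {
        // ImmutableInterlocked runs the CAS retry loop for us: build a new
        // list from the current one and atomically swap it into the field.
        ImmutableInterlocked.Update(ref items, (list, x) => list.Add(x), item);
    }

    public IReadOnlyList<string> Snapshot()
    {
        // Readers get a point-in-time snapshot without any synchronization.
        return Volatile.Read(ref items);
    }
}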
That is not threadsafe.
The method GetEnumerator() will not retain any lock after the enumerator has been returned, so any thread would be free to use the returned enumerator without any locking to prevent them from doing so.
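To make the failure mode concrete, this is what a non-snapshotting GetEnumerator would look like (a hypothetical variant, using the question's field names; note that the Clone-based version above enumerates a point-in-time copy instead):

// Broken sketch: the read lock is released before the caller ever calls
// MoveNext(), so enumeration races with concurrent writers, and a mutation
// mid-foreach makes the enumerator throw InvalidOperationException.
public IEnumerator<T> GetEnumerator()
{
    rwLockList.EnterReadLock();
    try
    {
        return internalList.GetEnumerator();
    }
    finally
    {
        rwLockList.ExitReadLock(); // lock gone; the enumerator is unprotected
    }
}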
In general, trying to create a threadsafe list type is very difficult.
See this StackOverflow thread for some discussion: No ConcurrentList<T> in .Net 4.0?
If you are trying to use some kind of reader/writer lock, rather than a simple locking scheme for both reading and writing, your concurrent reads probably greatly outnumber your writes. In that case, a copy-on-write approach, as suggested by Zer0, can be appropriate.
In an answer to a related question I posted a generic utility function that helps turn any modification of any data structure into a thread-safe and highly concurrent operation.
Code
using System;
using System.Threading;

static class CopyOnWriteSwapper
{
    public static void Swap<T>(ref T obj, Func<T, T> cloner, Action<T> op)
        where T : class
    {
        while (true)
        {
            var objBefore = Volatile.Read(ref obj);
            var newObj = cloner(objBefore);
            op(newObj);
            if (Interlocked.CompareExchange(ref obj, newObj, objBefore) == objBefore)
                return;
        }
    }
}
Usage
CopyOnWriteSwapper.Swap(ref _myList,
    orig => new List<string>(orig),
    clone => clone.Add("asdf"));
More details about what you can do with it, and a couple of caveats, can be found in the original answer.
Related
I would like to implement a simple in-memory LRU cache system, and I was thinking about a solution based on an IDictionary implementation which could handle a hashed LRU mechanism.
Coming from Java, I have experience with LinkedHashMap, which works fine for what I need; I can't find a similar solution anywhere for .NET.
Has anyone developed it, or has anyone had experiences like this?
This is a very simple and fast implementation we developed for a web site we own.
We tried to improve the code as much as possible, while keeping it thread safe.
I think the code is very simple and clear, but if you need some explanation or a guide on how to use it, don't hesitate to ask.
using System.Collections.Generic;
using System.Runtime.CompilerServices;

namespace LRUCache
{
    public class LRUCache<K, V>
    {
        private int capacity;
        private Dictionary<K, LinkedListNode<LRUCacheItem<K, V>>> cacheMap = new Dictionary<K, LinkedListNode<LRUCacheItem<K, V>>>();
        private LinkedList<LRUCacheItem<K, V>> lruList = new LinkedList<LRUCacheItem<K, V>>();

        public LRUCache(int capacity)
        {
            this.capacity = capacity;
        }

        [MethodImpl(MethodImplOptions.Synchronized)]
        public V get(K key)
        {
            LinkedListNode<LRUCacheItem<K, V>> node;
            if (cacheMap.TryGetValue(key, out node))
            {
                V value = node.Value.value;
                lruList.Remove(node);
                lruList.AddLast(node);
                return value;
            }
            return default(V);
        }

        [MethodImpl(MethodImplOptions.Synchronized)]
        public void add(K key, V val)
        {
            if (cacheMap.TryGetValue(key, out var existingNode))
            {
                lruList.Remove(existingNode);
            }
            else if (cacheMap.Count >= capacity)
            {
                RemoveFirst();
            }

            LRUCacheItem<K, V> cacheItem = new LRUCacheItem<K, V>(key, val);
            LinkedListNode<LRUCacheItem<K, V>> node = new LinkedListNode<LRUCacheItem<K, V>>(cacheItem);
            lruList.AddLast(node);
            // cacheMap.Add(key, node) would throw here if the key already exists, so use the indexer
            cacheMap[key] = node;
        }

        private void RemoveFirst()
        {
            // Remove from the LRU priority list
            LinkedListNode<LRUCacheItem<K, V>> node = lruList.First;
            lruList.RemoveFirst();
            // Remove from cache
            cacheMap.Remove(node.Value.key);
        }
    }

    class LRUCacheItem<K, V>
    {
        public LRUCacheItem(K k, V v)
        {
            key = k;
            value = v;
        }

        public K key;
        public V value;
    }
}
There is nothing in the base class libraries that does this.
On the free side, maybe something like C5's HashedLinkedList would work.
If you're willing to pay, maybe check out this C# toolkit. It contains an implementation.
I've recently released a class called LurchTable to address the need for a C# variant of the LinkedHashMap. A brief discussion of the LurchTable can be found here.
Basic features:
Linked Concurrent Dictionary by Insertion, Modification, or Access
Dictionary/ConcurrentDictionary interface support
Peek/TryDequeue/Dequeue access to 'oldest' entry
Allows hard-limit on items enforced at insertion
Exposes events for add, update, and remove
Source Code: http://csharptest.net/browse/src/Library/Collections/LurchTable.cs
GitHub: https://github.com/csharptest/CSharpTest.Net.Collections
HTML Help: http://help.csharptest.net/
PM> Install-Package CSharpTest.Net.Collections
The LRUCache answer with sample code above uses MethodImplOptions.Synchronized, which is equivalent to putting lock(this) around each method call. Whilst correct, this global lock will significantly reduce throughput under concurrent load.
To solve this I implemented a thread safe pseudo LRU designed for concurrent workloads. Performance is very close to ConcurrentDictionary, ~10x faster than MemoryCache and hit rate is better than a conventional LRU. Full analysis provided in the github link below.
Usage looks like this:
int capacity = 666;
var lru = new ConcurrentLru<int, SomeItem>(capacity);
var value = lru.GetOrAdd(1, (k) => new SomeItem(k));
GitHub: https://github.com/bitfaster/BitFaster.Caching
Install-Package BitFaster.Caching
Found your answer while googling; I also found this:
http://code.google.com/p/csharp-lru-cache/
csharp-lru-cache: LRU cache collection class library.
This is a collection class that functions as a least-recently-used cache. It implements ICollection<T>, but also exposes three other members:
Capacity, the maximum number of items the cache can contain. Once the collection is at capacity, adding a new item to the cache will cause the least recently used item to be discarded. If the Capacity is set to 0 at construction, the cache will not automatically discard items.
Oldest, the oldest (i.e. least recently used) item in the collection.
DiscardingOldestItem, an event raised when the cache is about to discard its oldest item.
This is an extremely simple implementation. While its Add and Remove methods are thread-safe, it shouldn't be used in heavy multithreading environments because the entire collection is locked during those methods.
This takes Martin's code with Mr T's suggestions and makes it Stylecop friendly. Oh, it also allows for disposal of values as they cycle out of the cache.
namespace LruCache
{
    using System;
    using System.Collections.Generic;

    /// <summary>
    /// A least-recently-used cache stored like a dictionary.
    /// </summary>
    /// <typeparam name="TKey">
    /// The type of the key to the cached item.
    /// </typeparam>
    /// <typeparam name="TValue">
    /// The type of the cached item.
    /// </typeparam>
    /// <remarks>
    /// Derived from https://stackoverflow.com/a/3719378/240845
    /// </remarks>
    public class LruCache<TKey, TValue>
    {
        private readonly Dictionary<TKey, LinkedListNode<LruCacheItem>> cacheMap =
            new Dictionary<TKey, LinkedListNode<LruCacheItem>>();

        private readonly LinkedList<LruCacheItem> lruList =
            new LinkedList<LruCacheItem>();

        private readonly Action<TValue> dispose;

        /// <summary>
        /// Initializes a new instance of the <see cref="LruCache{TKey, TValue}"/>
        /// class.
        /// </summary>
        /// <param name="capacity">
        /// Maximum number of elements to cache.
        /// </param>
        /// <param name="dispose">
        /// When elements cycle out of the cache, disposes them. May be null.
        /// </param>
        public LruCache(int capacity, Action<TValue> dispose = null)
        {
            this.Capacity = capacity;
            this.dispose = dispose;
        }

        /// <summary>
        /// Gets the capacity of the cache.
        /// </summary>
        public int Capacity { get; }

        /// <summary>Gets the value associated with the specified key.</summary>
        /// <param name="key">
        /// The key of the value to get.
        /// </param>
        /// <param name="value">
        /// When this method returns, contains the value associated with the specified
        /// key, if the key is found; otherwise, the default value for the type of the
        /// <paramref name="value" /> parameter. This parameter is passed
        /// uninitialized.
        /// </param>
        /// <returns>
        /// true if the <see cref="T:System.Collections.Generic.Dictionary`2" />
        /// contains an element with the specified key; otherwise, false.
        /// </returns>
        public bool TryGetValue(TKey key, out TValue value)
        {
            lock (this.cacheMap)
            {
                LinkedListNode<LruCacheItem> node;
                if (this.cacheMap.TryGetValue(key, out node))
                {
                    value = node.Value.Value;
                    this.lruList.Remove(node);
                    this.lruList.AddLast(node);
                    return true;
                }

                value = default(TValue);
                return false;
            }
        }

        /// <summary>
        /// Looks for a value for the matching <paramref name="key"/>. If not found,
        /// calls <paramref name="valueGenerator"/> to retrieve the value and add it to
        /// the cache.
        /// </summary>
        /// <param name="key">
        /// The key of the value to look up.
        /// </param>
        /// <param name="valueGenerator">
        /// Generates a value if one isn't found.
        /// </param>
        /// <returns>
        /// The requested value.
        /// </returns>
        public TValue Get(TKey key, Func<TValue> valueGenerator)
        {
            lock (this.cacheMap)
            {
                LinkedListNode<LruCacheItem> node;
                TValue value;
                if (this.cacheMap.TryGetValue(key, out node))
                {
                    value = node.Value.Value;
                    this.lruList.Remove(node);
                    this.lruList.AddLast(node);
                }
                else
                {
                    value = valueGenerator();
                    if (this.cacheMap.Count >= this.Capacity)
                    {
                        this.RemoveFirst();
                    }

                    LruCacheItem cacheItem = new LruCacheItem(key, value);
                    node = new LinkedListNode<LruCacheItem>(cacheItem);
                    this.lruList.AddLast(node);
                    this.cacheMap.Add(key, node);
                }

                return value;
            }
        }

        /// <summary>
        /// Adds the specified key and value to the dictionary.
        /// </summary>
        /// <param name="key">
        /// The key of the element to add.
        /// </param>
        /// <param name="value">
        /// The value of the element to add. The value can be null for reference types.
        /// </param>
        public void Add(TKey key, TValue value)
        {
            lock (this.cacheMap)
            {
                if (this.cacheMap.Count >= this.Capacity)
                {
                    this.RemoveFirst();
                }

                LruCacheItem cacheItem = new LruCacheItem(key, value);
                LinkedListNode<LruCacheItem> node =
                    new LinkedListNode<LruCacheItem>(cacheItem);
                this.lruList.AddLast(node);
                this.cacheMap.Add(key, node);
            }
        }

        private void RemoveFirst()
        {
            // Remove from the LRU priority list
            LinkedListNode<LruCacheItem> node = this.lruList.First;
            this.lruList.RemoveFirst();

            // Remove from cache
            this.cacheMap.Remove(node.Value.Key);

            // Dispose the evicted value, if a dispose callback was provided
            this.dispose?.Invoke(node.Value.Value);
        }

        private class LruCacheItem
        {
            public LruCacheItem(TKey k, TValue v)
            {
                this.Key = k;
                this.Value = v;
            }

            public TKey Key { get; }

            public TValue Value { get; }
        }
    }
}
The Caching Application Block of EntLib has an LRU scavenging option out of the box and can be in-memory. It might be a bit heavyweight for what you want, though.
I don't believe so. I've certainly seen hand-rolled ones implemented several times in various unrelated projects, which more or less confirms this; if there were one, surely at least one of those projects would have used it.
It's pretty simple to implement, and usually gets done by creating a class which contains both a Dictionary and a List.
The keys go in the list (in order) and the items go in the dictionary.
When you Add a new item to the collection, the function checks the length of the list, pulls out the last key (if the list is too long) and then evicts that key and its value from the dictionary to match. Not much more to it, really.
I like Lawrence's implementation. A Hashtable + LinkedList is a good solution.
Regarding threading, I would not lock this with [MethodImpl(MethodImplOptions.Synchronized)], but rather use ReaderWriterLockSlim or a spin lock (since contention is usually short-lived) instead.
In the Get function I would first check whether the item is already the most recently used one, rather than always removing and re-adding it. That gives you the possibility of keeping the fast path inside a read lock, which does not block other readers; a sketch follows.
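A minimal sketch of that read-optimized Get, assuming the LRUCache<K, V> class above gains a ReaderWriterLockSlim field (cacheMap and lruList are the fields from that implementation; this is illustrative, not tested production code):

// Assumed addition to the LRUCache<K, V> class shown earlier in the thread.
private readonly ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim();

public V Get(K key)
{
    LinkedListNode<LRUCacheItem<K, V>> node;

    rwLock.EnterReadLock();
    try
    {
        // Fast path: a hit on the most recently used entry needs no list
        // mutation, so concurrent readers proceed in parallel.
        if (cacheMap.TryGetValue(key, out node) && node == lruList.Last)
            return node.Value.value;
    }
    finally
    {
        rwLock.ExitReadLock();
    }

    rwLock.EnterWriteLock();
    try
    {
        // Re-check under the write lock: the entry may have been evicted
        // between releasing the read lock and acquiring the write lock.
        if (!cacheMap.TryGetValue(key, out node))
            return default(V);

        lruList.Remove(node);
        lruList.AddLast(node);
        return node.Value.value;
    }
    finally
    {
        rwLock.ExitWriteLock();
    }
}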
I just now accidentally found LruCache.cs in aws-sdk-net: https://github.com/aws/aws-sdk-net/blob/master/sdk/src/Core/Amazon.Runtime/Internal/Util/LruCache.cs
If it's an ASP.NET app you can use the Cache class [1], but you'll be competing for space with other cached stuff, which may or may not be what you want.
[1] http://msdn.microsoft.com/en-us/library/system.web.caching.cache.aspx
Is this do-able?
Here's my situation; please suggest an easier or more efficient way if what I'm trying to do isn't advisable.
We're talking about a report generation page here. First, I have a stored procedure that takes a REALLY long time to finish executing if no filters/conditions are set. Meaning it is a "view all"; this stored proc returns a list. This list then populates a table in my view. It could be just 10 records or up to thousands, but the execution is long because it computes this and that against thousands of records. To make it short: I won't alter my stored procedure.
Now from this first view, I have a printable-version button which calls another page with the same contents but a print-friendly layout. I don't want to execute the painful stored proc again to get the same list; I want to re-use what has already been generated. How can I do this?
From what I understand you are thinking of implementing some way of caching the data that you calculated through a painfully slow stored procedure?
One option would be to implement a CacheManager and cache the results for a certain period of time:
using System;
using System.Collections;
using System.Web;

/// <summary>
/// Cache Manager Singleton
/// </summary>
public class CacheManager
{
    /// <summary>
    /// The instance
    /// </summary>
    private static MemoryCache instance = null;

    /// <summary>
    /// Gets the instance of memoryCache.
    /// </summary>
    /// <value>The instance of memoryCache.</value>
    public static MemoryCache Instance
    {
        get
        {
            if (instance == null)
            {
                instance = new MemoryCache();
            }

            return instance;
        }
    }
}

/// <summary>
/// Cache Manager
/// </summary>
public class MemoryCache
{
    /// <summary>
    /// Gets the expiration date used for entries that never expire absolutely
    /// </summary>
    /// <value>The no absolute expiration.</value>
    public DateTime NoAbsoluteExpiration
    {
        get { return DateTime.MaxValue; }
    }

    /// <summary>
    /// Retrieve the object in cache.
    /// If the object doesn't exist in cache or is obsolete, getItemCallback is called to fill the object.
    /// </summary>
    /// <typeparam name="T">Object type to put or get in cache</typeparam>
    /// <param name="cacheId">Object identifier in cache - must be unique</param>
    /// <param name="getItemCallback">Callback method to fill the object</param>
    /// <param name="slidingExpiration">Sliding expiration window (defaults to 30 seconds)</param>
    /// <returns>Object put in cache</returns>
    public T Get<T>(string cacheId, Func<T> getItemCallback, TimeSpan? slidingExpiration = null) where T : class
    {
        T item = HttpRuntime.Cache.Get(cacheId) as T;
        if (item == null)
        {
            item = getItemCallback();
            if (slidingExpiration == null)
            {
                slidingExpiration = TimeSpan.FromSeconds(30);
            }

            HttpRuntime.Cache.Insert(cacheId, item, null, this.NoAbsoluteExpiration, slidingExpiration.Value);
        }

        return item;
    }

    /// <summary>
    /// Retrieve the object in cache.
    /// If the object doesn't exist in cache or is obsolete, null is returned.
    /// </summary>
    /// <typeparam name="T">Object type to put or get in cache</typeparam>
    /// <param name="cacheId">Object identifier in cache - must be unique</param>
    /// <returns>Object put in cache, or null</returns>
    public T Get<T>(string cacheId) where T : class
    {
        T item = HttpRuntime.Cache.Get(cacheId) as T;
        if (item == null)
        {
            return null;
        }

        return item;
    }

    /// <summary>
    /// Delete an object using its unique id
    /// </summary>
    /// <param name="cacheId">Object identifier in cache</param>
    /// <returns><c>true</c> if the object was found and removed, <c>false</c> otherwise</returns>
    public bool Clear(string cacheId)
    {
        var item = HttpRuntime.Cache.Get(cacheId);
        if (item != null)
        {
            HttpRuntime.Cache.Remove(cacheId);
            return true;
        }

        return false;
    }

    /// <summary>
    /// Delete all objects in cache, optionally only those whose key matches a filter
    /// </summary>
    /// <param name="filter">Optional filter; "*" or a substring of the keys to delete</param>
    public void ClearAll(string filter = null)
    {
        var item = HttpRuntime.Cache.GetEnumerator();
        while (item.MoveNext())
        {
            DictionaryEntry entry = (DictionaryEntry)item.Current;
            var key = entry.Key.ToString();
            if (filter != null && (key.ToLower().Contains(filter.ToLower()) || filter == "*")) // if filter, only delete if the key contains the filter value
            {
                HttpRuntime.Cache.Remove(entry.Key.ToString());
            }
            else if (filter == null) // no filter, delete everything
            {
                HttpRuntime.Cache.Remove(entry.Key.ToString());
            }
        }
    }
}
Note: I didn't write this myself but can't find the original source.
This is how you use it:
// Retrieve object from cache if it exists; the callback is invoked to create, cache and return it otherwise
UserObject userObject = CacheManager.Instance.Get("singleId", () =>
{
    return new UserObject()
    {
        Id = Guid.NewGuid()
    };
});

// Retrieve object from cache if it exists, return null otherwise
UserObject userObjectRetrieved = CacheManager.Instance.Get<UserObject>("singleId");

// Remove object from cache
CacheManager.Instance.Clear("singleId");

// Remove all objects from cache
CacheManager.Instance.ClearAll();
Not sure how your view is designed, but you could also populate the printable version at the same time and hide it until the button is clicked.
The C# app that I'm working on uses an ObservableDictionary. Its performance is not nearly fast enough to accommodate its functionality. The ObservableDictionary is interacted with very rapidly (removing, adding, and updating elements) and has to sort every single time a change is made. Is there an alternative to ObservableDictionary that focuses on performance and can still sort rapidly?
It's very simple to write your own that does not cause a rebinding each time an add, update, or delete is done. We wrote our own for this very reason. Essentially, we disable the notification that a change has been made until all objects in the collection have been processed, and then we generate a single notification. It looks something like this. As you can see, we use MvvmLight as our MVVM framework library. This will improve performance tremendously.
using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Collections.Specialized;
using System.Linq;
using FS.Mes.Client.Mvvm.MvvmTools.MvvmLight;

namespace FS.Mes.Client.Mvvm.MvvmTools
{
    /// <summary>
    /// Our implementation of ObservableCollection. This fixes one significant limitation in the .NET Framework implementation. When items
    /// are added or removed from an observable collection, the OnCollectionChanged event is fired for each item. Depending on how the collection
    /// is bound, this can cause significant performance issues. This implementation gets around this by suppressing the notification until all
    /// items are added or removed. This implementation is also thread safe. Operations against this collection are always done on the thread that
    /// owns the collection.
    /// </summary>
    /// <typeparam name="T">Collection type</typeparam>
    public class FsObservableCollection<T> : ObservableCollection<T>
    {
        #region [Constructor]

        /// <summary>
        /// Constructor
        /// </summary>
        public FsObservableCollection()
        {
            DispatcherHelper.Initialize();
        }

        #endregion

        #region [Public Properties]

        /// <summary>
        /// Gets or sets a property that determines if we are delaying notifications on updates.
        /// </summary>
        public bool DelayOnCollectionChangedNotification { get; set; }

        #endregion

        /// <summary>
        /// Add a range of IEnumerable items to the observable collection and optionally delay notification until the operation is complete.
        /// </summary>
        /// <param name="items">Items to add</param>
        /// <param name="delayCollectionChangedNotification">Value indicating whether delay notification will be turned on/off</param>
        public void AddRange(IEnumerable<T> items, bool delayCollectionChangedNotification = true)
        {
            if (items == null)
                throw new ArgumentNullException("items");

            DoDispatchedAction(() =>
            {
                DelayOnCollectionChangedNotification = delayCollectionChangedNotification;

                // Do we have any items to add?
                if (items.Any())
                {
                    try
                    {
                        foreach (var item in items)
                            this.Add(item);
                    }
                    finally
                    {
                        // We're done. Turn delay notification off and call the OnCollectionChanged() method and tell it we had a 'dramatic' change
                        // in the collection.
                        DelayOnCollectionChangedNotification = false;
                        this.OnCollectionChanged(new NotifyCollectionChangedEventArgs(NotifyCollectionChangedAction.Reset));
                    }
                }
            });
        }

        /// <summary>
        /// Clear the items in the ObservableCollection and optionally delay notification until the operation is complete.
        /// </summary>
        public void ClearItems(bool delayCollectionChangedNotification = true)
        {
            // Do we have anything to remove?
            if (!this.Any())
                return;

            DoDispatchedAction(() =>
            {
                try
                {
                    DelayOnCollectionChangedNotification = delayCollectionChangedNotification;
                    this.Clear();
                }
                finally
                {
                    // We're done. Turn delay notification off and call the OnCollectionChanged() method and tell it we had a 'dramatic' change
                    // in the collection.
                    DelayOnCollectionChangedNotification = false;
                    this.OnCollectionChanged(new NotifyCollectionChangedEventArgs(NotifyCollectionChangedAction.Reset));
                }
            });
        }

        /// <summary>
        /// Override the virtual OnCollectionChanged() method on the ObservableCollection class.
        /// </summary>
        /// <param name="e">Event arguments</param>
        protected override void OnCollectionChanged(NotifyCollectionChangedEventArgs e)
        {
            DoDispatchedAction(() =>
            {
                if (!DelayOnCollectionChangedNotification)
                    base.OnCollectionChanged(e);
            });
        }

        /// <summary>
        /// Makes sure 'action' is executed on the thread that owns the object. Otherwise, things will go boom.
        /// </summary>
        /// <param name="action">The action which should be executed</param>
        private static void DoDispatchedAction(Action action)
        {
            DispatcherHelper.CheckInvokeOnUI(action);
        }
    }
}
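A usage sketch (the collection and items are hypothetical); the point is that the UI rebinds once per batch instead of once per item:

var names = new FsObservableCollection<string>();

// A single Reset notification fires after all three items are added,
// instead of three separate Add notifications.
names.AddRange(new[] { "alpha", "beta", "gamma" });

// Clearing batches notifications the same way.
names.ClearItems();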
I need to fill some dropdown boxes with reference data (e.g. city list, country list, etc.), and I need to fill them in various webforms. I think we should cache this data in our application so that we don't hit the database on every form. I am new to caching and ASP.NET. Please suggest how to do this.
I always add the following class to all my projects which give me easy access to the Cache object. Implementing this, following Hasan Khan's answer would be a good way to go.
using System;
using System.Web;

public static class CacheHelper
{
    /// <summary>
    /// Insert value into the cache using
    /// appropriate name/value pairs
    /// </summary>
    /// <typeparam name="T">Type of cached item</typeparam>
    /// <param name="o">Item to be cached</param>
    /// <param name="key">Name of item</param>
    /// <param name="timeout">Absolute expiration, in minutes</param>
    public static void Add<T>(T o, string key, double timeout)
    {
        HttpContext.Current.Cache.Insert(
            key,
            o,
            null,
            DateTime.Now.AddMinutes(timeout),
            System.Web.Caching.Cache.NoSlidingExpiration);
    }

    /// <summary>
    /// Remove item from cache
    /// </summary>
    /// <param name="key">Name of cached item</param>
    public static void Clear(string key)
    {
        HttpContext.Current.Cache.Remove(key);
    }

    /// <summary>
    /// Check for item in cache
    /// </summary>
    /// <param name="key">Name of cached item</param>
    /// <returns>True if the item exists in the cache</returns>
    public static bool Exists(string key)
    {
        return HttpContext.Current.Cache[key] != null;
    }

    /// <summary>
    /// Retrieve cached item
    /// </summary>
    /// <typeparam name="T">Type of cached item</typeparam>
    /// <param name="key">Name of cached item</param>
    /// <param name="value">Cached value. Default(T) if item doesn't exist.</param>
    /// <returns>True if the item was found in the cache</returns>
    public static bool Get<T>(string key, out T value)
    {
        try
        {
            if (!Exists(key))
            {
                value = default(T);
                return false;
            }

            value = (T)HttpContext.Current.Cache[key];
        }
        catch
        {
            value = default(T);
            return false;
        }

        return true;
    }
}
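A usage sketch for filling one of those dropdowns (the "Cities" key, the City type, LoadCitiesFromDb(), and cityDropDown are hypothetical names for illustration):

List<City> cities;
if (!CacheHelper.Get("Cities", out cities))
{
    cities = LoadCitiesFromDb();           // hypothetical DAL call
    CacheHelper.Add(cities, "Cities", 60); // cache for 60 minutes
}

cityDropDown.DataSource = cities;
cityDropDown.DataBind();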
From another question of yours I read that you're using a 3-layer architecture with DAL, business, and presentation layers.
So I assume that you have some data access class. The ideal thing to do would be to have a cached implementation of the same class and do the caching in that.
E.g.: if you have an interface IUserRepository, then the UserRepository class would implement it and add/delete/update entries in the db via its methods. You can then also have a CachedUserRepository which contains an instance of the UserRepository object. On get methods it first looks into the cache against some key (derived from the method parameters); if the item is found it returns it, otherwise it calls the method on the internal object, gets the data, adds it to the cache, and then returns it.
Your CachedUserRepository will obviously also have an instance of the cache object. You can look at http://msdn.microsoft.com/en-us/library/18c1wd61(v=vs.85).aspx for details on how to use the Cache object.
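A minimal sketch of that decorator, under the assumptions above (IUserRepository, User, GetById, and the ten-minute expiration are all illustrative names, not from your code):

using System;
using System.Web;
using System.Web.Caching;

public class User { public int Id; public string Name; }

public interface IUserRepository
{
    User GetById(int id);
}

// Decorator that consults the cache before delegating to the real repository.
public class CachedUserRepository : IUserRepository
{
    private readonly IUserRepository inner;

    public CachedUserRepository(IUserRepository inner)
    {
        this.inner = inner;
    }

    public User GetById(int id)
    {
        // Key derived from the method parameters, as described above.
        string key = "User:" + id;
        var cached = HttpRuntime.Cache[key] as User;
        if (cached != null)
        {
            return cached;
        }

        var user = inner.GetById(id);
        if (user != null)
        {
            HttpRuntime.Cache.Insert(key, user, null,
                DateTime.Now.AddMinutes(10), Cache.NoSlidingExpiration);
        }

        return user;
    }
}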