C# Priority Queue

I'm looking for a priority queue with an interface like this:
class PriorityQueue<T>
{
public void Enqueue(T item, int priority)
{
}
public T Dequeue()
{
}
}
All the implementations I've seen assume that item is an IComparable but I don't like this approach; I want to specify the priority when I'm pushing it onto the queue.
If a ready-made implementation doesn't exist, what's the best way to go about doing this myself? What underlying data structure should I use? Some sort of self-balancing tree, or what? A standard .NET structure would be nice.

If you have an existing priority queue implementation based on IComparable, you can easily use that to build the structure you need:
public class CustomPriorityQueue<T> // where T need NOT be IComparable
{
private class PriorityQueueItem : IComparable<PriorityQueueItem>
{
private readonly T _item;
private readonly int _priority;
public PriorityQueueItem(T item, int priority)
{
_item = item;
_priority = priority;
}
public T Item { get { return _item; } }
public int CompareTo(PriorityQueueItem other)
{
return _priority.CompareTo(other._priority);
}
}
// the existing PQ implementation where the item *does* need to be IComparable
private readonly PriorityQueue<PriorityQueueItem> _inner = new PriorityQueue<PriorityQueueItem>();
public void Enqueue(T item, int priority)
{
_inner.Enqueue(new PriorityQueueItem(item, priority));
}
public T Dequeue()
{
return _inner.Dequeue().Item;
}
}

You can add safety checks and what not, but here is a very simple implementation using SortedList:
class PriorityQueue<T> {
SortedList<Pair<int>, T> _list;
int count;
public PriorityQueue() {
_list = new SortedList<Pair<int>, T>(new PairComparer<int>());
}
public void Enqueue(T item, int priority) {
_list.Add(new Pair<int>(priority, count), item);
count++;
}
public T Dequeue() {
T item = _list[_list.Keys[0]];
_list.RemoveAt(0);
return item;
}
}
I'm assuming that smaller values of priority correspond to higher priority items (this is easy to modify).
If multiple threads will be accessing the queue you will need to add a locking mechanism too. This is easy, but let me know if you need guidance here.
Note that SortedList<TKey, TValue> is actually implemented internally as a pair of sorted arrays (so inserts are O(n)); the binary-tree-based alternative is SortedDictionary<TKey, TValue>.
The above implementation needs the following helper classes. This addresses Lasse V. Karlsen's comment that items with the same priority cannot be added using a naive SortedList-based implementation.
class Pair<T> {
public T First { get; private set; }
public T Second { get; private set; }
public Pair(T first, T second) {
First = first;
Second = second;
}
public override int GetHashCode() {
return First.GetHashCode() ^ Second.GetHashCode();
}
public override bool Equals(object other) {
Pair<T> pair = other as Pair<T>;
if (pair == null) {
return false;
}
return (this.First.Equals(pair.First) && this.Second.Equals(pair.Second));
}
}
class PairComparer<T> : IComparer<Pair<T>> where T : IComparable {
public int Compare(Pair<T> x, Pair<T> y) {
if (x.First.CompareTo(y.First) < 0) {
return -1;
}
else if (x.First.CompareTo(y.First) > 0) {
return 1;
}
else {
return x.Second.CompareTo(y.Second);
}
}
}
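To illustrate the tiebreaker in action, here is a quick usage sketch (assuming the three classes above are compiled together); two items enqueued with the same priority come out in insertion order:
var queue = new PriorityQueue<string>();
queue.Enqueue("low", 5);
queue.Enqueue("highA", 1);
queue.Enqueue("highB", 1); // same priority as highA; allowed thanks to the count tiebreaker
Console.WriteLine(queue.Dequeue()); // "highA" (smallest priority value first)
Console.WriteLine(queue.Dequeue()); // "highB" (FIFO among equal priorities)
Console.WriteLine(queue.Dequeue()); // "low"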

You could write a wrapper around one of the existing implementations that modifies the interface to your preference:
using System;
class PriorityQueueThatYouDontLike<T> where T: IComparable<T>
{
public void Enqueue(T item) { throw new NotImplementedException(); }
public T Dequeue() { throw new NotImplementedException(); }
}
class PriorityQueue<T>
{
class ItemWithPriority : IComparable<ItemWithPriority>
{
public ItemWithPriority(T t, int priority)
{
Item = t;
Priority = priority;
}
public T Item {get; private set;}
public int Priority {get; private set;}
public int CompareTo(ItemWithPriority other)
{
return Priority.CompareTo(other.Priority);
}
}
PriorityQueueThatYouDontLike<ItemWithPriority> q = new PriorityQueueThatYouDontLike<ItemWithPriority>();
public void Enqueue(T item, int priority)
{
q.Enqueue(new ItemWithPriority(item, priority));
}
public T Dequeue()
{
return q.Dequeue().Item;
}
}
This is the same as itowlson's suggestion. I just took longer to write mine because I filled out more of the methods. :-s

Here's a very simple lightweight implementation that has O(log(n)) performance for both push and pop. It uses a heap data structure built on top of a List<T>.
using System;
using System.Collections.Generic;

/// <summary>Implements a priority queue of T, where T has an ordering.</summary>
/// Elements may be added to the queue in any order, but when we pull
/// elements out of the queue, they will be returned in 'ascending' order.
/// Adding new elements into the queue may be done at any time, so this is
/// useful to implement a dynamically growing and shrinking queue. Both adding
/// an element and removing the first element are log(N) operations.
///
/// The queue is implemented using a priority-heap data structure. For more
/// details on this elegant and simple data structure see "Programming Pearls"
/// in our library. The tree is implemented atop a list, where 2N and 2N+1 are
/// the child nodes of node N. The tree is balanced and left-aligned so there
/// are no 'holes' in this list.
/// <typeparam name="T">Type T, should implement IComparable[T];</typeparam>
public class PriorityQueue<T> where T : IComparable<T> {
/// <summary>Clear all the elements from the priority queue</summary>
public void Clear () {
mA.Clear ();
}
/// <summary>Add an element to the priority queue - O(log(n)) time operation.</summary>
/// <param name="item">The item to be added to the queue</param>
public void Add (T item) {
// We add the item to the end of the list (at the bottom of the
// tree). Then, the heap property could be violated between this element
// and its parent. If this is the case, we swap this element with the
// parent (a safe operation to do since the element is known to be less
// than its parent). Now the element moves one level up the tree. We repeat
// this test with the element and its new parent. The element, if less
// than everybody else in the tree, will eventually bubble all the way up
// to the root of the tree (or the head of the list). It is easy to see
// this will take log(N) time, since we are working with a balanced binary
// tree.
int n = mA.Count; mA.Add (item);
while (n != 0) {
int p = n / 2; // This is the 'parent' of this item
if (mA[n].CompareTo (mA[p]) >= 0) break; // Item >= parent
T tmp = mA[n]; mA[n] = mA[p]; mA[p] = tmp; // Swap item and parent
n = p; // And continue
}
}
/// <summary>Returns the number of elements in the queue.</summary>
public int Count {
get { return mA.Count; }
}
/// <summary>Returns true if the queue is empty.</summary>
/// Trying to call Peek() or Next() on an empty queue will throw an exception.
/// Check using Empty first before calling these methods.
public bool Empty {
get { return mA.Count == 0; }
}
/// <summary>Allows you to look at the first element waiting in the queue, without removing it.</summary>
/// This element will be the one that will be returned if you subsequently call Next().
public T Peek () {
return mA[0];
}
/// <summary>Removes and returns the first element from the queue (least element)</summary>
/// <returns>The first element in the queue, in ascending order.</returns>
public T Next () {
// The element to return is of course the first element in the array,
// or the root of the tree. However, this will leave a 'hole' there. We
// fill up this hole with the last element from the array. This will
// break the heap property. So we bubble the element downwards by swapping
// it with its lower child until it reaches its correct level. The lower
// child (one of the original elements with index 1 or 2) will now be at the
// head of the queue (root of the tree).
T val = mA[0];
int nMax = mA.Count - 1;
mA[0] = mA[nMax]; mA.RemoveAt (nMax); // Move the last element to the top
int p = 0;
while (true) {
// c is the child we want to swap with. If there
// is no child at all, then the heap is balanced
int c = p * 2; if (c >= nMax) break;
// If the second child is smaller than the first, that's the one
// we want to swap with this parent.
if (c + 1 < nMax && mA[c + 1].CompareTo (mA[c]) < 0) c++;
// If the parent is already smaller than this smaller child, then
// we are done
if (mA[p].CompareTo (mA[c]) <= 0) break;
// Otherwise, swap parent and child, and follow down the parent
T tmp = mA[p]; mA[p] = mA[c]; mA[c] = tmp;
p = c;
}
return val;
}
/// <summary>The List we use for implementation.</summary>
List<T> mA = new List<T> ();
}
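A minimal usage sketch of the class above; int already implements IComparable<int>, so it can be used directly:
var pq = new PriorityQueue<int>();
pq.Add(5);
pq.Add(1);
pq.Add(3);
while (!pq.Empty)
    Console.Write(pq.Next() + " "); // prints: 1 3 5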

That is the exact interface used by my highly optimized C# priority-queue.
It was developed specifically for pathfinding applications (A*, etc.), but should work perfectly for any other application as well.
public class User
{
public string Name { get; private set; }
public User(string name)
{
Name = name;
}
}
...
var priorityQueue = new SimplePriorityQueue<User>();
priorityQueue.Enqueue(new User("Jason"), 1);
priorityQueue.Enqueue(new User("Joseph"), 10);
//Because it's a min-priority queue, the following line will return "Jason"
User user = priorityQueue.Dequeue();

What would be so terrible about something like this?
class PriorityQueue<TItem, TPriority> where TPriority : IComparable
{
private SortedList<TPriority, Queue<TItem>> pq = new SortedList<TPriority, Queue<TItem>>();
public int Count { get; private set; }
public void Enqueue(TItem item, TPriority priority)
{
++Count;
if (!pq.ContainsKey(priority)) pq[priority] = new Queue<TItem>();
pq[priority].Enqueue(item);
}
public TItem Dequeue()
{
--Count;
var queue = pq.ElementAt(0).Value;
if (queue.Count == 1) pq.RemoveAt(0);
return queue.Dequeue();
}
}
class PriorityQueue<TItem> : PriorityQueue<TItem, int> { }
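A short usage sketch: because each priority level maps to its own Queue<TItem>, items with equal priority dequeue in insertion order (note that ElementAt in Dequeue needs using System.Linq):
var pq = new PriorityQueue<string>();
pq.Enqueue("b", 2);
pq.Enqueue("a1", 1);
pq.Enqueue("a2", 1);
Console.WriteLine(pq.Dequeue()); // "a1"
Console.WriteLine(pq.Dequeue()); // "a2" (FIFO within priority 1)
Console.WriteLine(pq.Dequeue()); // "b"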

I realise that your question specifically asks for a non-IComparable-based implementation, but I want to point out a recent article from Visual Studio Magazine.
http://visualstudiomagazine.com/articles/2012/11/01/priority-queues-with-c.aspx
Together with itowlson's answer, this article can give a complete picture.

A little late but I'll add it here for reference
https://github.com/ERufian/Algs4-CSharp
Key-value-pair priority queues are implemented in Algs4/IndexMaxPQ.cs, Algs4/IndexMinPQ.cs and Algs4/IndexPQDictionary.cs
Notes:
If the priorities are not IComparable, an IComparer can be specified in the constructor
Instead of enqueueing the object and its priority, what is enqueued is an index and its priority (and, for the original question, a separate List<T> or T[] would be needed to convert that index to the expected result)

.NET 6 finally offers a built-in PriorityQueue<TElement, TPriority>, in the System.Collections.Generic namespace.
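A minimal example of the built-in type; like most of the implementations above, it is a min-queue, so the element with the smallest priority value dequeues first:
using System;
using System.Collections.Generic;

var queue = new PriorityQueue<string, int>();
queue.Enqueue("walk the dog", 2);
queue.Enqueue("do taxes", 1);
Console.WriteLine(queue.Dequeue()); // "do taxes"
if (queue.TryDequeue(out var item, out var priority))
    Console.WriteLine($"{item} (priority {priority})"); // "walk the dog (priority 2)"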

Seems like you could roll your own with a series of Queues, one for each priority, held in a Dictionary keyed by priority: just add each item to the appropriate queue.


How to implement a sorted buffer?

I need to traverse a collection of disjoint folders; each folder is associated with a visited time configured somewhere in the folder.
I then sort the folders and process the one with the earliest visited time first. Note that the processing is generally slower than the traversing.
My code targets .NET Framework 4.8.1. Currently my implementation is as follows:
public class BySeparateThread
{
ConcurrentDictionary<string, DateTime?> _dict = new ConcurrentDictionary<string, DateTime?>();
private object _lock = new object();
/// <summary>
/// this will be called by producer thread;
/// </summary>
/// <param name="address"></param>
/// <param name="time"></param>
public void add(string address,DateTime? time) {
_dict.TryAdd(address, time);
}
/// <summary>
/// called by subscriber thread;
/// </summary>
/// <returns></returns>
public string? next() {
lock (_lock) {
var r = _dict.FirstOrDefault();
//return sortedList.FirstOrDefault().Value;
if (r.Key is null)
{
return r.Key;
}
if (r.Value is null)
{
_dict.TryRemove(r.Key, out var _);
return r.Key;
}
var key = r.Key;
foreach (var item in _dict.Skip(1) )
{
if (item.Value is null)
{
_dict.TryRemove(item.Key, out var _);
return item.Key;
}
if (item.Value< r.Value)
{
r=item;
}
}
_dict.TryRemove(key, out var _);
return key;
}
}
/// <summary>
/// this will be assigned of false by producer thread;
/// </summary>
public bool _notComplete = true;
/// <summary>
/// shared configuration for subscribers;
/// </summary>
fs.addresses_.disjoint.deV_._bak.Io io; //.io_._CfgX.Create(cancel, git)
/// <summary>
/// run this in a separate thread other than <see cref="add(string, DateTime?)"/>
/// </summary>
/// <param name="sln"></param>
/// <returns></returns>
public async Task _asyn_ofAddress(string sln)
{
while (_notComplete)
{
var f = next();
if (f is null )
{
await Task.Delay(30*1000);
//await Task.Yield();
continue;
}
/// degree of concurrency is controlled by a semaphore; for instance, at most 4 are tackled:
new dev.srcs.each.sln_.delvable.Bak_srcsInAddresses(io)._startTask_ofAddress(sln);
}
}
}
For the above, I'm concerned about the while (_notComplete) part, as it looks like there would be many loop iterations doing nothing. I think there should be a better way to remove the while loop by utilizing the fact that the collection can report whether it's empty at various stages, such as when we add.
There are probably better implementations based on some mature framework, such as the ones I have been considering these days, but I often get stuck on the implementation details:
BlockingCollection
for this one, I don't know how to keep the collection sorted as items are added dynamically while the producer and subscriber are both running;
Channel
again, I could not come up with anything fitting my needs after reading its examples;
Pipeline
I have not fully understood it yet;
Rx
I tried to implement an observable and an observer. It only gives me a macroscopic framework, but when I got into the details, I ended up with what I'm currently doing, and I began to wonder whether I even need Rx here.
Dataflow
Shall I implement my own BufferBlock or ActionBlock? It seems the built-in BufferBlock cannot be customized to sort things before releasing them to the next block.
Sorting buffered Observables seems similar to my problem, but it ends with a solution similar to the one I currently have and am not satisfied with, as stated above.
Could someone give me some sample code? Please give as concrete code as you can; as you can see, I have researched some general ideas/paths, but what finally stops me short is the details, which are often glossed over in the docs.
I just found one solution which is better than my current one. I believe there are even better ones, so please do post your answers if you find some; my current one is just what I can hack together with what I know so far.
I found Prioritized queues in Task Parallel Library, and I wrote a similar one for my case:
using System;
using System.Collections;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Reactive.Subjects;
using System.Threading;
using System.Threading.Tasks;
namespace nilnul.dev.srcs.every.slns._bak
{
public class BySortedSet : IProducerConsumerCollection<(string, DateTime)>
{
private class _Comparer : IComparer<(string, DateTime)>
{
public int Compare((string, DateTime) first, (string, DateTime) second)
{
var returnValue = first.Item2.CompareTo(second.Item2);
if (returnValue == 0)
returnValue = first.Item1.CompareTo(second.Item1);
return returnValue;
}
static public _Comparer Singleton
{
get
{
return nilnul._obj.typ_.nilable_.unprimable_.Singleton<_Comparer>.Instance;// just some magic to get an instance
}
}
}
SortedSet<(string, DateTime)> _dict = new SortedSet<(string, DateTime)>(
_Comparer.Singleton
);
private object _lock=new object();
public int Count
{
get
{
lock(_lock){
return _dict.Count;
}
}
}
public object SyncRoot => _lock;
public bool IsSynchronized => true;
IEnumerator IEnumerable.GetEnumerator()
{
return GetEnumerator();
//throw new NotImplementedException();
}
public void CopyTo((string, DateTime)[] array, int index)
{
lock (_lock)
{
foreach (var item in _dict)
{
array[index++] = item;
}
}
}
public void CopyTo(Array array, int index)
{
lock (_lock)
{
foreach (var item in _dict)
{
array.SetValue(item, index++);
}
}
}
public bool TryAdd((string, DateTime) item)
{
lock (_lock)
{
return _dict.Add(item);
}
}
public bool TryTake(out (string, DateTime) item)
{
lock (_lock)
{
item = _dict.Min;
if (item==default)
{
return false;
}
return _dict.Remove(item);
}
}
public (string, DateTime)[] ToArray()
{
lock (_lock)
{
return this._dict.ToArray();
}
}
public IEnumerator<(string, DateTime)> GetEnumerator()
{
return ToArray().AsEnumerable().GetEnumerator();
}
/// <summary>
/// </summary>
/// <returns></returns>
public BlockingCollection<(string, DateTime)> asBlockingCollection() {
return new BlockingCollection<(string, DateTime)>(
this
);
}
}
}
Then I can use that like:
static public void ExampleUse(CancellationToken cancellationToken) {
var s = new BySortedSet().asBlockingCollection();
/// traversal thread:
s.Add(("", DateTime.MinValue));
//...
s.CompleteAdding();
/// tackler thread:
///
foreach (var item in s.GetConsumingEnumerable(cancellationToken))
{
/// process the item;
/// todo: degree of parallelism is controlled by the tackler; or is there a better way, as in Dataflow or Rx or something else?
}
}
Thanks!

LINQ continue after Take

Say we have an IEnumerable<T> stuff;
Is there a concise way to Take n elements and then another m elements after the first, without re-evaluating?
example code:
stuff.Take(10);
stuff.Skip(10).Take(20); // re-evaluates stuff
What I was thinking was maybe this (not working code)
var it = stuff.GetEnumerator();
it.Take(10);
it.Take(20);
Edit to add to the difficulty and to clarify the complexity of what I would like to accomplish: I want to continue the query after the Take, i.e.
it.Take(10);
var cont = it.Select(Mutate);
cont.Take(20);
cont = cont.Where(Filter);
cont.Take(5);
You can use the Publish extension method in the System.Interactive NuGet package put out by Microsoft to accomplish this. This is a fantastic library that provides some 'missing' LINQ functions. From the documentation, the Publish method:
Creates a buffer with a view over the source sequence, causing each enumerator to obtain access to the remainder of the sequence from the current index in the buffer.
I.e. it allows you to partially enumerate a sequence and the next time you enumerate the sequence you will pick up where the previous enumeration left off.
var publishedSource = stuff.Publish();
var firstTenItems = publishedSource.Take(10).ToArray();
var nextTwentyTransformedItems = publishedSource.Take(20).Select(Mutate).ToArray();
// How you apply 'Where' depends on what you want to achieve.
// This returns the next 5 items that match the filter but if there are less
// than 5 items that match the filter you could end up enumerating the
// entire remainder of the sequence.
var nextFiveFilteredItems = publishedSource.Where(Filter).Take(5).ToArray();
// This enumerates _only_ the next 5 items and yields any that match the filter.
var nextOfFiveItemsThatPassFilter = publishedSource.Take(5).Where(Filter).ToArray();
If you want to just create a wrapper for IEnumerable that will handle any LINQ appended on and take one pass through the source, use this class and extension:
public static class EnumerableOnceExt {
public static EnumerableOnce<IEnumerable<T>, T> EnumerableOnce<T>(this IEnumerable<T> src) => new EnumerableOnce<IEnumerable<T>, T>(src);
}
public class EnumerableOnce<T, V> : IEnumerable<V>, IDisposable where T : IEnumerable<V> {
EnumeratorOnce<V> onceEnum;
public EnumerableOnce(T src) {
onceEnum = new EnumeratorOnce<V>(src.GetEnumerator());
}
public IEnumerator<V> GetEnumerator() {
return onceEnum;
}
IEnumerator IEnumerable.GetEnumerator() {
return onceEnum;
}
public void DoSkip(int n) {
while (n > 0 && onceEnum.MoveNext())
--n;
}
public void DoTake(int n) {
while (n > 0 && onceEnum.MoveNext())
--n;
}
#region IDisposable Support
private bool disposedValue = false; // To detect redundant calls
protected virtual void Dispose(bool disposing) {
if (!disposedValue) {
if (disposing) {
onceEnum.ActuallyDispose();
}
disposedValue = true;
}
}
// This code added to correctly implement the disposable pattern.
public void Dispose() {
Dispose(true);
}
#endregion
}
public class EnumeratorOnce<V> : IEnumerator<V> {
IEnumerator<V> origEnum;
public EnumeratorOnce(IEnumerator<V> src) {
origEnum = src;
}
public V Current => origEnum.Current;
object IEnumerator.Current => origEnum.Current;
public bool MoveNext() => origEnum.MoveNext();
public void Reset() {
origEnum.Reset();
}
public void ActuallyDispose() {
origEnum.Dispose();
}
#region IDisposable Support
protected virtual void Dispose(bool disposing) {
// don't allow disposing early
}
// This code added to correctly implement the disposable pattern.
public void Dispose() {
Dispose(true);
}
#endregion
}
Now your sample code will work if you call EnumerableOnce() to wrap the source, as long as you execute the enumerations:
var it1 = it.EnumerableOnce();
it1.Take(10).ToList();
var @continue = it1.Select(Mutate);
@continue.Take(20).ToList();
@continue = @continue.Where(Filter);
@continue.Take(5).ToList();
(Note that #continue is not a valid C# identifier; @continue escapes the continue keyword.)
You can also use the DoSkip and DoTake methods defined on EnumerableOnce (shown in the class above); they advance the underlying enumerator directly without building intermediate lists:
var it1 = it.EnumerableOnce();
it1.DoTake(10);
var @continue = it1.Select(Mutate).EnumerableOnce();
@continue.DoSkip(20);
var filtered = @continue.Where(Filter).EnumerableOnce();
filtered.DoTake(5);
(Each derived sequence is re-wrapped with EnumerableOnce() so that DoSkip/DoTake are available on it; the wrapper shares the underlying enumerator, so the position is preserved.)

Correct Concurrent Collection for storing timed non-recurring structures

I am searching for right thread-safe collection (concurrent collection) for the following scenario:
I receive requests from an external source which generates GUIDs (so they are unique and non-recurring). I need to store, say, the last 100 requests and check whether a delivered GUID is a duplicate. Due to some limitations I cannot save more than 100 or so GUIDs.
Now the problem is that when this mechanism is used in a service, it must be bound to 100 items, and searching by GUID is vital.
I decided to use ConcurrentDictionary, yet I doubt it is a good decision since I will have to replace keys once all 100 slots are used. I need a good mechanism to replace the oldest request when the dictionary is full.
Any idea is much appreciated.
A code snippet is provided to show my incomplete implementation
public static ConcurrentDictionary<string, TimedProto> IncidentsCreated = new ConcurrentDictionary<string, TimedProto>(20, 100);
private static bool AddTo_AddedIncidents(proto ReceivedIncident)
{
try
{
int OldestCounter = 0;
DateTime OldestTime = DateTime.Now;
if (IncidentsCreated.Count < 100)
{
TimedProto tp = new TimedProto();
tp.IncidentProto = ReceivedIncident;
tp.time = DateTime.Now;
IncidentsCreated.AddOrUpdate(ReceivedIncident.IncidentGUID, tp,
(s,i) => i);
return true;
}
else //array is full, a replace oldest mechanism is required
{
}
return true;
}
catch (Exception ex)
{
LogEvent("AddTo_AddedIncidents\n"+ex.ToString(), EventLogEntryType.Error);
return false;
}
}
public struct proto
{
public string IncidentGUID;
//other variables
}
public struct TimedProto
{
public proto IncidentProto;
public DateTime time;
}
Thanks
Try this one: http://ayende.com/blog/162529/trivial-lru-cache-impl?key=02e8069c-62f8-4042-a7d2-d93806369824&utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+AyendeRahien+%28Ayende+%40+Rahien%29
Your implementation is flawed since it uses DateTime.Now, which has a granularity of roughly 15 ms. This means you could accidentally delete even your most recent GUIDs if you have a high inflow.
public class LruCache<TKey, TValue>
{
private readonly int _capacity;
private readonly Stopwatch _stopwatch = Stopwatch.StartNew();
class Reference<T> where T : struct
{
public T Value;
}
private class Node
{
public TValue Value;
public volatile Reference<long> Ticks;
}
private readonly ConcurrentDictionary<TKey, Node> _nodes = new ConcurrentDictionary<TKey, Node>();
public LruCache(int capacity)
{
Debug.Assert(capacity > 10);
_capacity = capacity;
}
public void Set(TKey key, TValue value)
{
var node = new Node
{
Value = value,
Ticks = new Reference<long> { Value = _stopwatch.ElapsedTicks }
};
_nodes.AddOrUpdate(key, node, (_, __) => node);
if (_nodes.Count > _capacity)
{
// Evict the oldest 10%; order by the tick value (Reference<long> itself is not comparable)
foreach (var source in _nodes.OrderBy(x => x.Value.Ticks.Value).Take(_nodes.Count / 10))
{
Node _;
_nodes.TryRemove(source.Key, out _);
}
}
}
public bool TryGet(TKey key, out TValue value)
{
Node node;
if (_nodes.TryGetValue(key, out node))
{
node.Ticks = new Reference<long> {Value = _stopwatch.ElapsedTicks};
value = node.Value;
return true;
}
value = default(TValue);
return false;
}
}
I would use a Circular Buffer for this - there are plenty of implementations around, including this one, and making a thread-safe wrapper for one of them wouldn't be hard.
With only 100 or so slots, looking up by key would be reasonably efficient, and inserting would be extremely efficient (no reallocation as old items are discarded and replaced by new ones).
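A rough sketch of that idea (illustrative code, names are made up): a fixed-size array used as a ring, guarded by a lock, with a linear scan for duplicate detection:
using System;

public class RecentGuidBuffer
{
    private readonly string[] _slots;
    private int _next;
    private readonly object _lock = new object();

    public RecentGuidBuffer(int capacity)
    {
        _slots = new string[capacity];
    }

    // Returns false if the GUID is already present; otherwise records it,
    // overwriting the oldest entry once the buffer is full.
    public bool TryAdd(string guid)
    {
        lock (_lock)
        {
            if (Array.IndexOf(_slots, guid) >= 0) return false;
            _slots[_next] = guid;
            _next = (_next + 1) % _slots.Length;
            return true;
        }
    }
}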

How to clear MemoryCache?

I have created a cache using the MemoryCache class. I add some items to it but when I need to reload the cache I want to clear it first. What is the quickest way to do this? Should I loop through all the items and remove them one at a time or is there a better way?
Dispose the existing MemoryCache and create a new MemoryCache object.
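A sketch of that approach, assuming the cache lives in a field you control (disposing MemoryCache.Default is not useful, because the Default property will not recreate it; see the reflection workaround further down):
using System.Runtime.Caching;
using System.Threading;

public class CacheHolder
{
    private MemoryCache _cache = new MemoryCache("MyCache");

    public void ResetCache()
    {
        // Swap in a fresh cache, then dispose the old one to release its entries.
        MemoryCache old = Interlocked.Exchange(ref _cache, new MemoryCache("MyCache"));
        old.Dispose();
    }
}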
The problem with enumeration
The MemoryCache.GetEnumerator() Remarks section warns: "Retrieving an enumerator for a MemoryCache instance is a resource-intensive and blocking operation. Therefore, the enumerator should not be used in production applications."
Here's why, explained in pseudocode of the GetEnumerator() implementation:
Create a new Dictionary object (let's call it AllCache)
For Each per-processor segment in the cache (one Dictionary object per processor)
{
Lock the segment/Dictionary (using lock construct)
Iterate through the segment/Dictionary and add each name/value pair one-by-one
to the AllCache Dictionary (using references to the original MemoryCacheKey
and MemoryCacheEntry objects)
}
Create and return an enumerator on the AllCache Dictionary
Since the implementation splits the cache across multiple Dictionary objects, it must bring everything together into a single collection in order to hand back an enumerator. Every call to GetEnumerator executes the full copy process detailed above. The newly created Dictionary contains references to the original internal key and value objects, so your actual cached data values are not duplicated.
The warning in the documentation is correct. Avoid GetEnumerator() -- including any of the answers here that use LINQ queries over the cache.
A better and more flexible solution
Here's an efficient way of clearing the cache that simply builds on the existing change monitoring infrastructure. It also provides the flexibility to clear either the entire cache or just a named subset and has none of the problems discussed above.
// By Thomas F. Abraham (http://www.tfabraham.com)
namespace CacheTest
{
using System;
using System.Diagnostics;
using System.Globalization;
using System.Runtime.Caching;
public class SignaledChangeEventArgs : EventArgs
{
public string Name { get; private set; }
public SignaledChangeEventArgs(string name = null) { this.Name = name; }
}
/// <summary>
/// Cache change monitor that allows an app to fire a change notification
/// to all associated cache items.
/// </summary>
public class SignaledChangeMonitor : ChangeMonitor
{
// Shared across all SignaledChangeMonitors in the AppDomain
private static event EventHandler<SignaledChangeEventArgs> Signaled;
private string _name;
private string _uniqueId = Guid.NewGuid().ToString("N", CultureInfo.InvariantCulture);
public override string UniqueId
{
get { return _uniqueId; }
}
public SignaledChangeMonitor(string name = null)
{
_name = name;
// Register instance with the shared event
SignaledChangeMonitor.Signaled += OnSignalRaised;
base.InitializationComplete();
}
public static void Signal(string name = null)
{
if (Signaled != null)
{
// Raise shared event to notify all subscribers
Signaled(null, new SignaledChangeEventArgs(name));
}
}
protected override void Dispose(bool disposing)
{
SignaledChangeMonitor.Signaled -= OnSignalRaised;
}
private void OnSignalRaised(object sender, SignaledChangeEventArgs e)
{
if (string.IsNullOrWhiteSpace(e.Name) || string.Compare(e.Name, _name, true) == 0)
{
Debug.WriteLine(
_uniqueId + " notifying cache of change.", "SignaledChangeMonitor");
// Cache objects are obligated to remove entry upon change notification.
base.OnChanged(null);
}
}
}
public static class CacheTester
{
public static void TestCache()
{
MemoryCache cache = MemoryCache.Default;
// Add data to cache
for (int idx = 0; idx < 50; idx++)
{
cache.Add("Key" + idx.ToString(), "Value" + idx.ToString(), GetPolicy(idx));
}
// Flush cached items associated with "NamedData" change monitors
SignaledChangeMonitor.Signal("NamedData");
// Flush all cached items
SignaledChangeMonitor.Signal();
}
private static CacheItemPolicy GetPolicy(int idx)
{
string name = (idx % 2 == 0) ? null : "NamedData";
CacheItemPolicy cip = new CacheItemPolicy();
cip.AbsoluteExpiration = System.DateTimeOffset.UtcNow.AddHours(1);
cip.ChangeMonitors.Add(new SignaledChangeMonitor(name));
return cip;
}
}
}
From http://connect.microsoft.com/VisualStudio/feedback/details/723620/memorycache-class-needs-a-clear-method
The workaround is:
List<string> cacheKeys = MemoryCache.Default.Select(kvp => kvp.Key).ToList();
foreach (string cacheKey in cacheKeys)
{
MemoryCache.Default.Remove(cacheKey);
}
Or, given a MemoryCache instance named cache:
var cacheItems = cache.ToList();
foreach (KeyValuePair<String, Object> a in cacheItems)
{
cache.Remove(a.Key);
}
If performance isn't an issue then this nice one-liner will do the trick:
cache.ToList().ForEach(a => cache.Remove(a.Key));
It seems that there is a Trim method.
So to clear all contents you'd just do
cache.Trim(100)
EDIT:
after digging some more, it seems that looking into Trim is not worth your time
https://connect.microsoft.com/VisualStudio/feedback/details/831755/memorycache-trim-method-doesnt-evict-100-of-the-items
How do I clear a System.Runtime.Caching.MemoryCache
Ran across this, and based on it, wrote a slightly more effective, parallel clear method:
public void ClearAll()
{
var allKeys = _cache.Select(o => o.Key);
Parallel.ForEach(allKeys, key => _cache.Remove(key));
}
You could also do something like this (VB.NET):
Dim _Qry = (From n In CacheObject.AsParallel()
Select n).ToList()
For Each i In _Qry
CacheObject.Remove(i.Key)
Next
You can dispose the MemoryCache.Default cache and then re-set the private field singleton to null, to make it recreate the MemoryCache.Default.
var field = typeof(MemoryCache).GetField("s_defaultCache",
BindingFlags.Static |
BindingFlags.NonPublic);
field.SetValue(null, null);
I was only interested in clearing the cache and found this as an option when using the C# GlobalCachingProvider:
var cache = GlobalCachingProvider.Instance.GetAllItems();
if (dbOperation.SuccessLoadingAllCacheToDB(cache))
{
cache.Clear();
}
A bit improved version of Magritte's answer:
var cacheKeys = MemoryCache.Default.Where(kvp => kvp.Value is MyType).Select(kvp => kvp.Key).ToList();
foreach (string cacheKey in cacheKeys)
{
MemoryCache.Default.Remove(cacheKey);
}
This discussion is also being done here:
https://learn.microsoft.com/en-us/answers/answers/983399/view.html
I wrote an answer there and I'll transcribe it here:
using System.Collections.Generic;
using Microsoft.Extensions.Caching.Memory;
using ServiceStack;
public static class IMemoryCacheExtensions
{
static readonly List<object> entries = new();
/// <summary>
/// Removes all entries, added via the "TryGetValueExtension()" method
/// </summary>
/// <param name="cache"></param>
public static void Clear(this IMemoryCache cache)
{
for (int i = 0; i < entries.Count; i++)
{
cache.Remove(entries[i]);
}
entries.Clear();
}
/// <summary>
/// Use this extension method, to be able to remove all your entries later using "Clear()" method
/// </summary>
/// <typeparam name="TItem"></typeparam>
/// <param name="cache"></param>
/// <param name="key"></param>
/// <param name="value"></param>
/// <returns></returns>
public static bool TryGetValueExtension<TItem>(this IMemoryCache cache, object key, out TItem value)
{
entries.AddIfNotExists(key);
if (cache.TryGetValue(key, out object result))
{
if (result == null)
{
value = default;
return true;
}
if (result is TItem item)
{
value = item;
return true;
}
}
value = default;
return false;
}
}

c# Adding a Remove(int index) method to the .NET Queue class

I would like to use the generic Queue<T> class from the .NET Framework (3.5),
but I will need a Remove(int index) method to remove items from the queue. Can I achieve this functionality with an extension method? Anyone care to point me in the right direction?
What you want is a List<T> where you always call RemoveAt(0) when you want to get the item from the Queue. Everything else is the same, really (calling Add would add an item to the end of the Queue).
Combining both casperOne's and David Anderson's suggestions to the next level. The following class inherits from List<T> and hides the methods that would be detrimental to the FIFO concept while adding the three Queue methods (Enqueue, Dequeue, Peek).
public class ListQueue<T> : List<T>
{
new public void Add(T item) { throw new NotSupportedException(); }
new public void AddRange(IEnumerable<T> collection) { throw new NotSupportedException(); }
new public void Insert(int index, T item) { throw new NotSupportedException(); }
new public void InsertRange(int index, IEnumerable<T> collection) { throw new NotSupportedException(); }
new public void Reverse() { throw new NotSupportedException(); }
new public void Reverse(int index, int count) { throw new NotSupportedException(); }
new public void Sort() { throw new NotSupportedException(); }
new public void Sort(Comparison<T> comparison) { throw new NotSupportedException(); }
new public void Sort(IComparer<T> comparer) { throw new NotSupportedException(); }
new public void Sort(int index, int count, IComparer<T> comparer) { throw new NotSupportedException(); }
public void Enqueue(T item)
{
base.Add(item);
}
public T Dequeue()
{
var t = base[0];
base.RemoveAt(0);
return t;
}
public T Peek()
{
return base[0];
}
}
Test code:
class Program
{
static void Main(string[] args)
{
ListQueue<string> queue = new ListQueue<string>();
Console.WriteLine("Item count in ListQueue: {0}", queue.Count);
Console.WriteLine();
for (int i = 1; i <= 10; i++)
{
var text = String.Format("Test{0}", i);
queue.Enqueue(text);
Console.WriteLine("Just enqueued: {0}", text);
}
Console.WriteLine();
Console.WriteLine("Item count in ListQueue: {0}", queue.Count);
Console.WriteLine();
var peekText = queue.Peek();
Console.WriteLine("Just peeked at: {0}", peekText);
Console.WriteLine();
var textToRemove = "Test5";
queue.Remove(textToRemove);
Console.WriteLine("Just removed: {0}", textToRemove);
Console.WriteLine();
var queueCount = queue.Count;
for (int i = 0; i < queueCount; i++)
{
var text = queue.Dequeue();
Console.WriteLine("Just dequeued: {0}", text);
}
Console.WriteLine();
Console.WriteLine("Item count in ListQueue: {0}", queue.Count);
Console.WriteLine();
Console.WriteLine("Now try to ADD an item...should cause an exception.");
queue.Add("shouldFail");
}
}
Here's how you remove a specific item from the queue with one line of LINQ (it's recreating the queue, but for lack of a better method...):
//replace "<string>" with your actual underlying type
myqueue = new Queue<string>(myqueue.Where(s => s != itemToBeRemoved));
I know it's not removing by index, but still, someone might find this useful (this question ranks in Google for "remove specific item from a c# queue" so I decided to add this answer, sorry)
It's a pretty late answer, but I'm writing it for future readers.
List<T> is exactly what you need, but it has a big disadvantage compared to Queue<T>: it's implemented with an array, so Dequeue() is pretty expensive (in terms of time) because all remaining items must be shifted one step back with Array.Copy. Queue<T> also uses an array, but together with two indices (for head and tail).
In your case you also need Remove/RemoveAt, and its performance won't be good for the same reason: all items after the removed one must be shifted with Array.Copy.
A better data structure for quick Dequeue/Remove times is a linked list. You'll sacrifice a little Enqueue performance, but assuming your queue has an equal number of Enqueue/Dequeue operations, you'll see a great gain in performance, especially as it grows.
Let's see a simple skeleton for its implementation (I'll skip implementation for IEnumerable<T>, IList<T> and other helper methods).
class LinkedQueue<T>
{
public int Count
{
get { return _items.Count; }
}
public void Enqueue(T item)
{
_items.AddLast(item);
}
public T Dequeue()
{
if (_items.First == null)
throw new InvalidOperationException("...");
var item = _items.First.Value;
_items.RemoveFirst();
return item;
}
public void Remove(T item)
{
_items.Remove(item);
}
public void RemoveAt(int index)
{
// O(n): walk to the index-th node (Skip needs System.Linq); note that with
// duplicate values this removes the first equal item, not necessarily the one at index
Remove(_items.Skip(index).First());
}
private LinkedList<T> _items = new LinkedList<T>();
}
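Usage mirrors Queue<T>, plus the extra removal methods (the RemoveAt shown above relies on System.Linq for Skip):
var q = new LinkedQueue<string>();
q.Enqueue("a");
q.Enqueue("b");
q.Enqueue("c");
q.RemoveAt(1);                  // removes "b" in O(n)
Console.WriteLine(q.Dequeue()); // "a" -- O(1), no array shifting
Console.WriteLine(q.Dequeue()); // "c"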
For a quick comparison:
          Queue         List          LinkedList
Enqueue   O(1)/O(n)*    O(1)/O(n)*    O(1)
Dequeue   O(1)          O(n)          O(1)
Remove    n/a           O(n)          O(n)
* O(1) is the typical case, but sometimes it'll be O(n) (when the internal array needs to be resized).
Of course you'll pay something for what you gain: memory usage is bigger (especially for small T, the per-node overhead will be significant). The right implementation (List<T> vs. LinkedList<T>) must be chosen carefully according to your usage scenario; you may also convert that code to use a singly linked list to cut the link overhead roughly in half.
Although there isn't a built-in way, you shouldn't need to switch to a List or another structure if RemoveAt isn't a frequent operation.
If you are normally enqueuing and dequeuing but only occasionally removing, then you should be able to afford a queue rebuild when removing.
public static void Remove<T>(this Queue<T> queue, T itemToRemove) where T : class
{
var list = queue.ToList(); //Needs to be copy, so we can clear the queue
queue.Clear();
foreach (var item in list)
{
if (item == itemToRemove)
continue;
queue.Enqueue(item);
}
}
public static void RemoveAt<T>(this Queue<T> queue, int itemIndex)
{
var list = queue.ToList(); //Needs to be copy, so we can clear the queue
queue.Clear();
for (int i = 0; i < list.Count; i++)
{
if (i == itemIndex)
continue;
queue.Enqueue(list[i]);
}
}
The following approach might be more efficient, using less memory, and thus less GC:
public static void RemoveAt<T>(this Queue<T> queue, int itemIndex)
{
var cycleAmount = queue.Count;
for (int i = 0; i < cycleAmount; i++)
{
T item = queue.Dequeue();
if (i == itemIndex)
continue;
queue.Enqueue(item);
}
}
Someone will probably develop a better solution, but from what I see you will need to return a new Queue object in your Remove method. You'll want to check if the index is out of bounds and I may have got the ordering of the items being added wrong, but here's a quick and dirty example that could be made into an extension quite easily.
public class MyQueue<T> : Queue<T> {
public MyQueue()
: base() {
// Default constructor
}
public MyQueue(Int32 capacity)
: base(capacity) {
// Default constructor
}
/// <summary>
/// Removes the item at the specified index and returns a new Queue
/// </summary>
public MyQueue<T> RemoveAt(Int32 index) {
MyQueue<T> retVal = new MyQueue<T>(Count - 1);
for (Int32 i = 0; i < this.Count; i++) {
if (i != index) {
retVal.Enqueue(this.ElementAt(i));
}
}
return retVal;
}
}
I do not believe we should be using List<T> to emulate a queue; for a queue, the enqueue and dequeue operations should be highly performant, which they would not be when using a List<T>. For the RemoveAt method, however, it is acceptable to be non-performant, as that is not the primary purpose of a Queue<T>.
My approach to implementing RemoveAt is O(n), but the queue still maintains a largely O(1) enqueue (sometimes the internal array needs reallocating, which makes the operation O(n)) and always O(1) dequeue.
Here is my implementation of a RemoveAt(int) extension method for a Queue<T>:
public static void RemoveAt<T>(this Queue<T> queue, int index)
{
Contract.Requires(queue != null);
Contract.Requires(index >= 0);
Contract.Requires(index < queue.Count);
var i = 0;
// Move all the items before the one to remove to the back
for (; i < index; ++i)
{
queue.MoveHeadToTail();
}
// Remove the item at the index
queue.Dequeue();
// Move all subsequent items to the tail end of the queue.
var queueCount = queue.Count;
for (; i < queueCount; ++i)
{
queue.MoveHeadToTail();
}
}
Where MoveHeadToTail is defined as follows:
private static void MoveHeadToTail<T>(this Queue<T> queue)
{
Contract.Requires(queue != null);
var dequed = queue.Dequeue();
queue.Enqueue(dequed);
}
This implementation also modifies the actual Queue<T> rather than returning a new Queue<T> (which I think is more in keeping with other RemoveAt implementations).
In fact, this defeats the whole purpose of Queue, and the class you'll eventually come up with will violate the FIFO semantics altogether.
If the Queue is being used to preserve the order of the items in the collection, and if you will not have duplicate items, then a SortedSet might be what you are looking for. The SortedSet acts much like a List<T>, but stays ordered. Great for things like drop down selections.
David Anderson's solution is probably the best, but it has some overhead.
Are you using custom objects in the queue? If so, add a boolean flag such as cancel.
Have the workers that process the queue check whether that flag is set, and skip the item if it is.
Note that with a list you can make the "removal" process more efficient if you don't actually remove the item but merely mark it as removed. Yes, you have to add a bit of code to deal with how you've done it, but the payoff is the efficiency.
Just as one example: say you have a List<string>. Then you can, for example, just set that particular item to null and be done with it.
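A minimal sketch of that lazy-deletion idea, here as a hypothetical wrapper over Queue<T> that marks removed items and skips them when they reach the front:
using System;
using System.Collections.Generic;

public class LazyRemoveQueue<T>
{
    private readonly Queue<T> _queue = new Queue<T>();
    private readonly HashSet<T> _removed = new HashSet<T>();

    public void Enqueue(T item) => _queue.Enqueue(item);

    // O(1): just mark the item; it is discarded when it reaches the front.
    public void Remove(T item) => _removed.Add(item);

    public T Dequeue()
    {
        while (_queue.Count > 0)
        {
            T item = _queue.Dequeue();
            if (_removed.Remove(item)) continue; // skip tombstoned entries
            return item;
        }
        throw new InvalidOperationException("Queue is empty.");
    }
}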
The queue class is so difficult to understand. Use a generic list instead.
