Timed semaphore - C#

I have the following class to manage access to a resource:
class Sync : IDisposable
{
private static readonly SemaphoreSlim Semaphore = new SemaphoreSlim(20);
private Sync()
{
}
public static async Task<Sync> Acquire()
{
await Semaphore.WaitAsync();
return new Sync();
}
public void Dispose()
{
Semaphore.Release();
}
}
Usage:
using (await Sync.Acquire())
{
// use a resource here
}
Currently it allows no more than 20 concurrent usages.
How can I modify this class to allow no more than N usages per unit of time (for example, no more than 20 per second)?

"20 per second" is completely different than "20 at a time". I recommend that you leave the thread synchronization behind and use higher-level abstractions capable of working more naturally with time as a concept.
In particular, Reactive Extensions has a number of different throttling operators.
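For example, here is a minimal sketch of rate limiting with Rx, assuming the System.Reactive NuGet package; the workItems stream and HandleItem method are placeholders for your own request source and processing logic, not part of the original code:
// Sketch only. Zip pairs each work item with a timer tick, so at most
// one item is let through every 50 ms - roughly 20 per second.
using System;
using System.Reactive.Linq;
using System.Reactive.Subjects;

var workItems = new Subject<int>();

IDisposable subscription = workItems
    .Zip(Observable.Interval(TimeSpan.FromMilliseconds(50)), (item, _) => item)
    .Subscribe(item => HandleItem(item)); // HandleItem is a placeholder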

Here's a basic reimplementation which calls Semaphore.Release either when the specified time period has elapsed, or (optionally - see code comments in Dispose()) when the Sync instance is disposed.
class Sync : IDisposable
{
private static readonly SemaphoreSlim Semaphore = new SemaphoreSlim(20);
// 0 : semaphore needs to be released.
// 1 : semaphore already released.
private int State = 0;
private Sync()
{
}
// Renamed to conform to Microsoft's guidelines.
public static async Task<Sync> AcquireAsync(TimeSpan releaseAfter)
{
var sync = new Sync();
await Semaphore.WaitAsync().ConfigureAwait(false);
try
{
return sync;
}
finally
{
// Fire-and-forget, not awaited.
sync.DelayedRelease(releaseAfter);
}
}
private async void DelayedRelease(TimeSpan releaseAfter)
{
await Task.Delay(releaseAfter).ConfigureAwait(false);
this.ReleaseOnce();
}
private void ReleaseOnce()
{
// Ensure that we call Semaphore.Release() at most
// once during the lifetime of this instance -
// either via DelayedRelease, or via Dispose.
if (Interlocked.Exchange(ref this.State, 1) == 0)
{
Semaphore.Release();
}
}
public void Dispose()
{
// Uncomment if you want the ability to
// release the semaphore via Dispose
// thus bypassing the throttling.
//this.ReleaseOnce();
}
}
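Usage then looks much like before, with the caller supplying the throttling window; for example, for a "20 per second" limit each acquisition is held for one second:
// Each slot is returned one second after acquisition,
// regardless of how quickly the resource itself is used.
using (await Sync.AcquireAsync(TimeSpan.FromSeconds(1)))
{
    // use a resource here
}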

Related

Implementing a Starve method ("Unrelease"/"Hold") for SemaphoreSlim

I'm using a SemaphoreSlim with a FIFO behaviour and now I want to add to it a Starve(int amount) method to remove threads from the pool, sort of the opposite to Release().
If there are any running tasks, they will of course continue until they are done, since for the moment the semaphore does not keep track of what is actually running and still "owes" it a Release call.
The reason is that the user will dynamically control how many processes are allowed at any time for a given semaphore.
The strategy I'm following is:
if there are threads available, i.e., CurrentCount > 0, then call Wait() on the SemaphoreSlim without releasing back.
if there are no more threads available, because presumably tasks are running and potentially even queuing, then the next time Release() is called, ignore it to prevent threads from being released (an int variable keeps count)
I have added the code I have so far below. The main issues I'm struggling with are how to ensure thread safety, no deadlocks and no surprising race conditions.
Given that I cannot access the private lock() of the semaphore, I created a new object to at least try to prevent several threads from manipulating the new variables (within the wrapper) at the same time.
However, I fear that other variables like CurrentCount, which live inside the SemaphoreSlim, could also change halfway through and mess things up... I would expect the lock in the Release() method to prevent changes to CurrentCount, but maybe I should also apply the lock to Wait and WaitAsync (which could potentially also change CurrentCount)? That would probably also result in unnecessary locks between two calls to Wait (?)
Is the call to semaphore.Wait() in this situation any better or worse than await semaphore.WaitAsync()?
Are there any better ways to extend the functionality of a class such as SemaphoreSlim, which contains many private variables that potentially are needed or that would be useful to have access to?
I briefly considered creating a new class which inherits from SemaphoreSlim, or looking at extension methods, maybe using reflection to access the private variables,... but none seem to be obvious or valid.
public class SemaphoreQueue
{
private SemaphoreSlim semaphore;
private ConcurrentQueue<TaskCompletionSource<bool>> queue = new ConcurrentQueue<TaskCompletionSource<bool>>();
private int releasesToIgnore;
private object lockObj;
private const int NO_MAXIMUM = Int32.MaxValue; // cannot access SemaphoreSlim.NO_MAXIMUM
public SemaphoreQueue(int initialCount) : this(initialCount, NO_MAXIMUM) { }
public SemaphoreQueue(int initialCount, int maxCount)
{
semaphore = new SemaphoreSlim(initialCount, maxCount);
lockObj = new object();
releasesToIgnore = 0;
}
public void Starve(int amount)
{
lock (lockObj)
{
// a maximum of CurrentCount threads can be immediately starved by calling Wait without releasing
while ((semaphore.CurrentCount > 0) && (amount > 0))
{
semaphore.Wait();
amount -= 1;
}
// presumably there are still tasks running. The next Releases will be ignored.
if (amount > 0)
releasesToIgnore += amount;
}
}
public int Release()
{
return Release(1);
}
public int Release(int num)
{
lock (lockObj)
{
if (releasesToIgnore > num)
{
releasesToIgnore -= num;
return semaphore.CurrentCount;
}
else
{
int oldReleasesToIgnore = releasesToIgnore;
releasesToIgnore = 0;
return semaphore.Release(num - oldReleasesToIgnore);
}
}
}
public void Wait(CancellationToken token)
{
WaitAsync(token).Wait();
}
public Task WaitAsync(CancellationToken token)
{
var tcs = new TaskCompletionSource<bool>();
queue.Enqueue(tcs);
QueuedAwait(token);
return tcs.Task;
}
public int CurrentCount { get => this.semaphore.CurrentCount; }
private void QueuedAwait(CancellationToken token)
{
semaphore.WaitAsync(token).ContinueWith(t =>
{
TaskCompletionSource<bool> popped;
if (queue.TryDequeue(out popped))
popped.SetResult(true);
});
}
public void Dispose()
{
semaphore.Dispose();
}
}

Producer/ Consumer pattern using threads and EventWaitHandle

I guess it is sort of a code review, but here is my implementation of the producer/consumer pattern. What I would like to know is whether there could be a case in which the while loops in the ReceivingThread() or SendingThread() methods might stop executing. Please note that EnqueueSend(DataSendEnqeueInfo info) is called from multiple different threads, and I probably can't use tasks here since I definitely have to consume commands in a separate thread.
private Thread mReceivingThread;
private Thread mSendingThread;
private Queue<DataRecievedEnqeueInfo> mReceivingThreadQueue;
private Queue<DataSendEnqeueInfo> mSendingThreadQueue;
private readonly object mReceivingQueueLock = new object();
private readonly object mSendingQueueLock = new object();
private bool mIsRunning;
EventWaitHandle mRcWaitHandle;
EventWaitHandle mSeWaitHandle;
private void ReceivingThread()
{
while (mIsRunning)
{
mRcWaitHandle.WaitOne();
DataRecievedEnqeueInfo item = null;
while (mReceivingThreadQueue.Count > 0)
{
lock (mReceivingQueueLock)
{
item = mReceivingThreadQueue.Dequeue();
}
ProcessReceivingItem(item);
}
mRcWaitHandle.Reset();
}
}
private void SendingThread()
{
while (mIsRunning)
{
mSeWaitHandle.WaitOne();
while (mSendingThreadQueue.Count > 0)
{
DataSendEnqeueInfo item = null;
lock (mSendingQueueLock)
{
item = mSendingThreadQueue.Dequeue();
}
ProcessSendingItem(item);
}
mSeWaitHandle.Reset();
}
}
internal void EnqueueRecevingData(DataRecievedEnqeueInfo info)
{
lock (mReceivingQueueLock)
{
mReceivingThreadQueue.Enqueue(info);
mRcWaitHandle.Set();
}
}
public void EnqueueSend(DataSendEnqeueInfo info)
{
lock (mSendingQueueLock)
{
mSendingThreadQueue.Enqueue(info);
mSeWaitHandle.Set();
}
}
P.S. The idea here is that I am using WaitHandles to put the threads to sleep when the queues are empty, and to signal them to start when new items are enqueued.
UPDATE
I am just going to leave this link, https://blogs.msdn.microsoft.com/benwilli/2015/09/10/tasks-are-still-not-threads-and-async-is-not-parallel/ , for people who might be trying to implement the Producer/Consumer pattern using TPL or tasks.
Use a BlockingCollection instead of Queue, EventWaitHandle and lock objects:
public class DataInfo { }
private Thread mReceivingThread;
private Thread mSendingThread;
private BlockingCollection<DataInfo> queue = new BlockingCollection<DataInfo>();
private CancellationTokenSource receivingCts = new CancellationTokenSource();
private void ReceivingThread()
{
try
{
while (!receivingCts.IsCancellationRequested)
{
// This will block until an item is added to the queue or the cancellation token is cancelled
DataInfo item = queue.Take(receivingCts.Token);
ProcessReceivingItem(item);
}
}
catch (OperationCanceledException)
{
}
}
internal void EnqueueRecevingData(DataInfo info)
{
// When a new item is produced, just add it to the queue
queue.Add(info);
}
// To cancel the receiving thread, cancel the token
private void CancelReceivingThread()
{
receivingCts.Cancel();
}
Personally, for simple producer-consumer problems, I would just use BlockingCollection. There would be no need to manually code your own synchronization logic. The consuming threads will also block if there are no items present in the queue.
Here is what your code might look like if you use this class:
private BlockingCollection<DataRecievedEnqeueInfo> mReceivingThreadQueue = new BlockingCollection<DataRecievedEnqeueInfo>();
private BlockingCollection<DataSendEnqeueInfo> mSendingThreadQueue = new BlockingCollection<DataSendEnqeueInfo>();
public void Stop()
{
// No need for mIsRunning. This makes the enumerables in the
// GetConsumingEnumerable() calls below complete.
mReceivingThreadQueue.CompleteAdding();
mSendingThreadQueue.CompleteAdding();
}
private void ReceivingThread()
{
foreach (DataRecievedEnqeueInfo item in mReceivingThreadQueue.GetConsumingEnumerable())
{
ProcessReceivingItem(item);
}
}
private void SendingThread()
{
foreach (DataSendEnqeueInfo item in mSendingThreadQueue.GetConsumingEnumerable())
{
ProcessSendingItem(item);
}
}
internal void EnqueueRecevingData(DataRecievedEnqeueInfo info)
{
// You can also use TryAdd() if there is a possibility that you
// can add items after you have stopped. Otherwise, this can throw
// an exception after CompleteAdding() has been called.
mReceivingThreadQueue.Add(info);
}
public void EnqueueSend(DataSendEnqeueInfo info)
{
mSendingThreadQueue.Add(info);
}
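For completeness, here is a sketch of how the consuming threads might be started and shut down with this version. It reuses the mReceivingThread and mSendingThread fields from the original code; the Start and Shutdown method names are assumptions, not part of the answer above.
// Sketch: start one consumer thread per queue, then shut down by
// completing the collections so GetConsumingEnumerable() ends.
public void Start()
{
    mReceivingThread = new Thread(ReceivingThread) { IsBackground = true };
    mSendingThread = new Thread(SendingThread) { IsBackground = true };
    mReceivingThread.Start();
    mSendingThread.Start();
}

public void Shutdown()
{
    // Stop() calls CompleteAdding(), which lets both foreach loops
    // finish once the remaining items are drained; Join waits for that.
    Stop();
    mReceivingThread.Join();
    mSendingThread.Join();
}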
As suggested in the comments, you can also give the TPL Dataflow blocks a try.
As far as I can see, you have two similar pipelines, for receive and send, so I assume that your class hierarchy is like this:
class EnqueueInfo { }
class DataRecievedEnqeueInfo : EnqueueInfo { }
class DataSendEnqeueInfo : EnqueueInfo { }
We can assemble an abstract class which will encapsulate the logic for creating the pipeline, and providing the interface for processing the items, like this:
abstract class EnqueueInfoProcessor<T>
where T : EnqueueInfo
{
// here we will store all the messages received before the handling
private readonly BufferBlock<T> _buffer;
// simple action block for actual handling the items
private ActionBlock<T> _action;
// cancellation token to cancel the pipeline
public EnqueueInfoProcessor(CancellationToken token)
{
_buffer = new BufferBlock<T>(new DataflowBlockOptions { CancellationToken = token });
_action = new ActionBlock<T>(item => ProcessItem(item), new ExecutionDataflowBlockOptions
{
MaxDegreeOfParallelism = Environment.ProcessorCount,
CancellationToken = token
});
// we are linking two blocks so all the items from buffer
// will flow down to action block in order they've been received
_buffer.LinkTo(_action, new DataflowLinkOptions { PropagateCompletion = true });
}
public void PostItem(T item)
{
// synchronously wait for posting to complete
_buffer.Post(item);
}
public async Task SendItemAsync(T item)
{
// asynchronously wait for message to be posted
await _buffer.SendAsync(item);
}
// abstract method to implement
protected abstract void ProcessItem(T item);
}
Note that you can also encapsulate the link between two blocks by using the Encapsulate<TInput, TOutput> method, but in that case you have to handle the Completion of the buffer block properly, if you're using it.
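As an illustration of that option, here is a minimal sketch that hides a buffer linked to a transform behind a single propagator block; the CreateSendInfo helper is an assumption for the sake of the example:
// Sketch: expose a BufferBlock linked to a TransformBlock as one block.
// DataflowBlock.Encapsulate takes a target (the buffer) and a source
// (the transform); completion is propagated through the link options.
var buffer = new BufferBlock<DataRecievedEnqeueInfo>();
var transform = new TransformBlock<DataRecievedEnqeueInfo, DataSendEnqeueInfo>(
    item => CreateSendInfo(item)); // CreateSendInfo is an assumed helper
buffer.LinkTo(transform, new DataflowLinkOptions { PropagateCompletion = true });

IPropagatorBlock<DataRecievedEnqeueInfo, DataSendEnqeueInfo> pipeline =
    DataflowBlock.Encapsulate(buffer, transform);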
After this, we just need to implement the two classes for the receive and send handling logic:
public class SendEnqueueInfoProcessor : EnqueueInfoProcessor<DataSendEnqeueInfo>
{
public SendEnqueueInfoProcessor(CancellationToken token)
: base(token)
{
}
protected override void ProcessItem(DataSendEnqeueInfo item)
{
// send logic here
}
}
public class RecievedEnqueueInfoProcessor : EnqueueInfoProcessor<DataRecievedEnqeueInfo>
{
public RecievedEnqueueInfoProcessor(CancellationToken token)
: base(token)
{
}
protected override void ProcessItem(DataRecievedEnqeueInfo item)
{
// recieve logic here
}
}
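Wiring the processors up could then look roughly like this (a sketch only, assumed to run inside an async method; the surrounding host code is not part of the original answer):
// Sketch: create the processors with a shared cancellation token and
// feed them items; the ActionBlock inside each processor handles them.
var cts = new CancellationTokenSource();
var receiveProcessor = new RecievedEnqueueInfoProcessor(cts.Token);
var sendProcessor = new SendEnqueueInfoProcessor(cts.Token);

receiveProcessor.PostItem(new DataRecievedEnqeueInfo());
await sendProcessor.SendItemAsync(new DataSendEnqeueInfo());

// Cancelling the token tears down both pipelines.
cts.Cancel();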
You can also create a more complicated pipeline with a TransformBlock<DataRecievedEnqeueInfo, DataSendEnqeueInfo>, if your message flow is one where a ReceiveInfo message becomes a SendInfo message.

API methods must wait until critical method is done

I have an MVC API controller.
One method in this controller is critical.
This means that all other API methods must wait until this method is done.
My basic idea is to block the threads in the constructor.
But I am not sure if this is such a smart idea.
public class TestApi : Controller
{
private static bool wait = false;
public TestApi()
{
// wait if critical method is working.
while (wait)
{
System.Threading.Thread.Sleep(100);
}
}
[HttpPost]
public void PostCriticalMethod()
{
try
{
wait = true;
// do critical work
}
finally
{
wait = false;
}
}
// Many non critical API methods...
}
Solution two:
public class TestApi : Controller
{
private static bool wait = false;
private static AutoResetEvent waitHandle = new AutoResetEvent(false);
public TestApi()
{
// wait if critical method is working.
if (wait) waitHandle.WaitOne();
}
[HttpPost]
public void PostCriticalMethod()
{
try
{
wait = true;
// do critical work
}
finally {
waitHandle.Set();
wait = false;
}
}
// Many non critical API methods...
}
My solution (this is the async version, but the non-async one is even simpler):
In the base class (common for all controllers) I add the method BlockOtherRequestsBeforeExecute:
private static readonly SemaphoreSlim semaphoreInit = new SemaphoreSlim(1, 1);
protected async Task BlockOtherRequestsBeforeExecute(Func<Task> criticalAction)
{
await semaphoreInit.WaitAsync();
try
{
await criticalAction();
}
finally
{
semaphoreInit.Release();
}
}
Then I can call the method in a safe way if I need to:
await BlockOtherRequestsBeforeExecute(async () => await RestoreDatabase());
The important part is that semaphoreInit must be used in all critical places.
This can be done in the constructor of the base class, so that all APIs are blocked until the critical action has finished.
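For the non-critical actions, a companion helper in the same base class can reuse the same semaphore so they also wait while the critical action runs; a minimal sketch (the method name is an assumption, and note that with a SemaphoreSlim(1, 1) this also serializes the non-critical actions against each other):
protected async Task<T> WaitForCriticalThenExecute<T>(Func<Task<T>> action)
{
    // Same gate as BlockOtherRequestsBeforeExecute, so this waits
    // whenever the critical action currently holds the semaphore.
    await semaphoreInit.WaitAsync();
    try
    {
        return await action();
    }
    finally
    {
        semaphoreInit.Release();
    }
}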

How to create a FIFO/strong semaphore

I need to code my own FIFO/strong semaphore in C#, using a semaphore class of my own as a base. I found this example, but it's not quite right since I'm not supposed to be using Monitor.Enter/Exit yet.
These are the methods for my regular semaphore, and I was wondering if there was a simple way to adapt it to be FIFO.
public virtual void Acquire()
{
lock (this)
{
while (uintTokens == 0)
{
Monitor.Wait(this);
}
uintTokens--;
}
}
public virtual void Release(uint tokens = 1)
{
lock (this)
{
uintTokens += tokens;
Monitor.PulseAll(this);
}
}
SemaphoreSlim gives us a good starting place, so we'll begin by wrapping one of those in a new class and directing everything but the wait method to that semaphore.
To get queue-like behavior we'll want a queue object, and to make sure it's safe in the face of multithreaded access, we'll use a ConcurrentQueue.
In this queue we'll put TaskCompletionSource objects. When we want to have something start waiting, it can create a TCS, add it to the queue, and then tell the semaphore to asynchronously pop the next item off of the queue and mark it as "completed" when the wait finishes. That way we know there will never be more continuations than there are items in the queue.
Then we just wait on the Task from the TCS.
We can also trivially create a WaitAsync method that returns a task, by just returning it instead of waiting on it.
public class SemaphoreQueue
{
private SemaphoreSlim semaphore;
private ConcurrentQueue<TaskCompletionSource<bool>> queue =
new ConcurrentQueue<TaskCompletionSource<bool>>();
public SemaphoreQueue(int initialCount)
{
semaphore = new SemaphoreSlim(initialCount);
}
public SemaphoreQueue(int initialCount, int maxCount)
{
semaphore = new SemaphoreSlim(initialCount, maxCount);
}
public void Wait()
{
WaitAsync().Wait();
}
public Task WaitAsync()
{
var tcs = new TaskCompletionSource<bool>();
queue.Enqueue(tcs);
semaphore.WaitAsync().ContinueWith(t =>
{
TaskCompletionSource<bool> popped;
if (queue.TryDequeue(out popped))
popped.SetResult(true);
});
return tcs.Task;
}
public void Release()
{
semaphore.Release();
}
}
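A quick usage sketch, assumed to run inside an async method, with three workers contending for a single slot; they complete in the order they called WaitAsync:
var semaphoreQueue = new SemaphoreQueue(1);

async Task UseResourceAsync(int id)
{
    await semaphoreQueue.WaitAsync();
    try
    {
        Console.WriteLine("worker {0} has the slot", id);
        await Task.Delay(100); // simulated work
    }
    finally
    {
        semaphoreQueue.Release();
    }
}

await Task.WhenAll(UseResourceAsync(1), UseResourceAsync(2), UseResourceAsync(3));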
I have created a FifoSemaphore class and I am successfully using it in my solutions. Its current limitation is that it behaves like a Semaphore(1, 1).
public class FifoSemaphore
{
private readonly object lockObj = new object();
private List<Semaphore> WaitingQueue = new List<Semaphore>();
private Semaphore RequestNewSemaphore()
{
lock (lockObj)
{
Semaphore newSemaphore = new Semaphore(1, 1);
newSemaphore.WaitOne();
return newSemaphore;
}
}
#region Public Functions
public void Release()
{
lock (lockObj)
{
WaitingQueue.RemoveAt(0);
if (WaitingQueue.Count > 0)
{
WaitingQueue[0].Release();
}
}
}
public void WaitOne()
{
Semaphore semaphore = RequestNewSemaphore();
lock (lockObj)
{
WaitingQueue.Add(semaphore);
semaphore.Release();
if(WaitingQueue.Count > 1)
{
semaphore.WaitOne();
}
}
semaphore.WaitOne();
}
#endregion
}
Usage is just like with a regular semaphore:
FifoSemaphore fifoSemaphore = new FifoSemaphore();
On each thread:
fifoSemaphore.WaitOne();
//do work
fifoSemaphore.Release();

Wrap asynchronous calls with synchronous method

I have a 3rd party DLL with an asynchronous method that I want to wrap with another method that waits for its result.
I started writing a class to hide the functionality, but now I can't work out how to wait for Doc.Completed to be called by the DLL after this.version.DownloadFile(this) in Doc.Download.
The DLL calls InitTransfer, then OnProgressNotify a number of times, then Completed. OnError may be called at any stage, but Completed is always called last. I don't care about InitTransfer, OnProgressNotify or OnError.
I have read Asynchronous call in synchronous method and Turn asynchronous calls into synchronous, but I don't understand how to apply the answers to this case.
I'm using C# 4.
public class Doc : SomeInterfaceFromTheDll
{
private readonly IVersion version; // An interface from the DLL.
private bool downloadSuccessful;
public Doc(IVersion version)
{
this.version = version;
}
public bool Download()
{
this.version.DownloadFile(this);
return ??? // I want to return this.downloadSuccessful after Completed() runs.
}
public void Completed(short reason)
{
Trace.WriteLine(string.Format("Notify.Completed({0})", reason));
this.downloadSuccessful = reason == 0 ? true : false;
}
public void InitTransfer(int totalSize)
{
Trace.WriteLine(string.Format("Notify.InitTransfer({0})", totalSize));
}
public void OnError(string errorText)
{
Trace.WriteLine(string.Format("Notify.OnError({0})", errorText));
}
public void OnProgressNotify(int bytesRead)
{
Trace.WriteLine(string.Format("Notify.OnProgressNotify({0})", bytesRead));
}
}
This can be achieved using a ManualResetEvent as shown below. There are a few caveats, though. The primary one is that this mechanism does not permit you to call Download() on the same Doc instance from multiple threads at the same time. If you need to do this, then a different approach may be required.
public class Doc : SomeInterfaceFromTheDll
{
private readonly IVersion version; // An interface from the DLL.
private readonly ManualResetEvent _complete = new ManualResetEvent(false);
private bool downloadSuccessful;
// ...
public bool Download()
{
this.version.DownloadFile(this);
// Wait for the event to be signalled...
_complete.WaitOne();
return this.downloadSuccessful;
}
public void Completed(short reason)
{
Trace.WriteLine(string.Format("Notify.Completed({0})", reason));
this.downloadSuccessful = reason == 0;
// Signal that the download is complete
_complete.Set();
}
// ...
}
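If the third-party call can hang, it may also be worth bounding the wait. WaitOne accepts a timeout, so a variant of Download() could look like this sketch (the 30-second value is just an illustration):
public bool Download()
{
    this.version.DownloadFile(this);
    // Give up if Completed() has not been called within 30 seconds.
    if (!_complete.WaitOne(TimeSpan.FromSeconds(30)))
    {
        return false;
    }
    return this.downloadSuccessful;
}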
