Reset and Dispose observable subscriber, Reactive Extensions - c#

Suppose I have this:
public class UploadDicomSet
{
public UploadDicomSet()
{
var cachCleanTimer = Observable.Interval(TimeSpan.FromMinutes(2));
cachCleanTimer.Subscribe(CheckUploadSetList);
//Start subscriber
}
void CheckUploadSetList(long interval)
{
//Stop and dispose subscriber
}
public void AddDicomFile(SharedLib.DicomFile dicomFile)
{
//Renew subscriber, call CheckUploadSetList 2 minutes later
}
}
1- In CheckUploadSetList I want to dispose of or complete the observable.
2- In AddDicomFile I want to reset it, so that CheckUploadSetList is called two minutes later.
(as described in the comments in the methods above)
UPDATE:
I can do it with a Timer, like this:
public class UploadDicomSet : ImportBaseSet
{
Timer _timer;
public UploadDicomSet()
{
_timer = new Timer(CheckUploadSetList, null, 120000, Timeout.Infinite);
}
void CheckUploadSetList(object state)
{
Logging logging = new Logging(LogFile);
try
{
_timer.Dispose(); //Stop the subscription
//dispose everything
}
catch (Exception exp)
{
logging.Log(ErrorCode.Error, "CheckUploadSetList() failed..., EXP:{0}", exp.ToString());
}
}
public void AddDicomFile(SharedLib.DicomFile dicomFile)
{
_timer.Change(120000, Timeout.Infinite);
}
}
Thanks in advance.

You should use Switch() for this kind of thing.
Something like this:
public class UploadDicomSet : ImportBaseSet
{
IDisposable subscription;
Subject<IObservable<long>> subject = new Subject<IObservable<long>>();
public UploadDicomSet()
{
subscription = subject.Switch().Subscribe(s => CheckUploadSetList(s));
subject.OnNext(Observable.Interval(TimeSpan.FromMinutes(2)));
}
void CheckUploadSetList(long interval)
{
subject.OnNext(Observable.Never<long>());
// Do other things
}
public void AddDicomFile(SharedLib.DicomFile dicomFile)
{
subject.OnNext(Observable.Interval(TimeSpan.FromMinutes(2)));
// Reset the subscription to go off in 2 minutes from now
// Do other things
}
}
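For reference, here is a minimal standalone sketch (not part of the answer above, and assuming the System.Reactive package) of what Switch() is doing: each OnNext replaces the inner sequence being listened to, so pushing Observable.Never<long>() effectively pauses the callbacks and pushing a new Interval restarts the clock.
// Standalone sketch of Switch() semantics: only the most recently pushed inner
// observable is subscribed to; the previous one is unsubscribed automatically.
using System;
using System.Reactive.Linq;
using System.Reactive.Subjects;
using System.Threading;
class SwitchDemo
{
    static void Main()
    {
        var subject = new Subject<IObservable<long>>();
        using (subject.Switch().Subscribe(x => Console.WriteLine("tick " + x)))
        {
            subject.OnNext(Observable.Interval(TimeSpan.FromSeconds(1)));
            Thread.Sleep(2500);                       // a couple of ticks arrive
            subject.OnNext(Observable.Never<long>()); // previous interval is dropped: "paused"
            Thread.Sleep(2500);                       // silence
            subject.OnNext(Observable.Interval(TimeSpan.FromSeconds(1)));
            Thread.Sleep(2500);                       // ticks resume, counting from 0 again
        }
    }
}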

Using Reactive Extensions for just a timer function seems like overkill to me. Why not use an ordinary timer for this, and start/stop it at the appropriate times? That said, let me give you an idea of how it could look with Rx.
public class UploadDicomSet : ImportBaseSet
{
IDisposable subscription;
public void CreateSubscription()
{
var cachCleanTimer = Observable.Interval(TimeSpan.FromMinutes(2));
if(subscription != null)
subscription.Dispose();
subscription = cachCleanTimer.Subscribe(s => CheckUploadSetList(s));
}
public UploadDicomSet()
{
CreateSubscription();
// Do other things
}
void CheckUploadSetList(long interval)
{
subscription.Dispose(); // Stop the subscription
// Do other things
}
public void AddDicomFile(SharedLib.DicomFile dicomFile)
{
CreateSubscription(); // Reset the subscription to go off in 2 minutes from now
// Do other things
}
}
Background material
I can really recommend these sites:
http://www.introtorx.com/
http://rxwiki.wikidot.com/101samples

Related

How to make a queued message broker in pure C#

Background
I need a queued message broker that dispatches messages in a distributed manner (spread over consecutive frames). In the example shown below it will process no more than 10 subscribers, and then wait for the next frame before processing further.
(For clarification, for those not familiar with Unity3D: the Process() method is run using Unity's built-in StartCoroutine() method and, in this case, will last for the lifetime of the game, either waiting or processing items from the queue.)
So I have this relatively simple class:
public class MessageBus : IMessageBus
{
private const int LIMIT = 10;
private readonly WaitForSeconds Wait;
private Queue<IMessage> Messages;
private Dictionary<Type, List<Action<IMessage>>> Subscribers;
public MessageBus()
{
Wait = new WaitForSeconds(2f);
Messages = new Queue<IMessage>();
Subscribers = new Dictionary<Type, List<Action<IMessage>>>();
}
public void Submit(IMessage message)
{
Messages.Enqueue(message);
}
public IEnumerator Process()
{
var processed = 0;
while (true)
{
if (Messages.Count == 0)
{
yield return Wait;
}
else
{
while(Messages.Count > 0)
{
var message = Messages.Dequeue();
foreach (var subscriber in Subscribers[message.GetType()])
{
if (processed >= LIMIT)
{
processed = 0;
yield return null;
}
processed++;
subscriber?.Invoke(message);
}
}
processed = 0;
}
}
}
public void Subscribe<T>(Action<IMessage> handler) where T : IMessage
{
if (!Subscribers.ContainsKey(typeof(T)))
{
Subscribers[typeof(T)] = new List<Action<IMessage>>();
}
Subscribers[typeof(T)].Add(handler);
}
public void Unsubscribe<T>(Action<IMessage> handler) where T : IMessage
{
if (!Subscribers.ContainsKey(typeof(T)))
{
return;
}
Subscribers[typeof(T)].Remove(handler);
}
}
And it works and behaves just as expected, but there is one problem.
The problem
I would like to use it (from the subscriber's point of view) like this:
public void Run()
{
MessageBus.Subscribe<TestEvent>(OnTestEvent);
}
public void OnTestEvent(TestEvent message)
{
message.SomeTestEventMethod();
}
But this obviously fails because Action<IMessage> cannot be converted to Action<TestEvent>.
The only way I can use it is like this:
public void Run()
{
MessageBus.Subscribe<TestEvent>(OnTestEvent);
}
public void OnTestEvent(IMessage message)
{
((TestEvent)message).SomeTestEventMethod();
}
But this feels inelegant and very wasteful, as every subscriber needs to do the casting on its own.
What I have tried
I was experimenting with "casting" actions like this:
public void Subscribe<T>(Action<T> handler) where T : IMessage
{
if (!Subscribers.ContainsKey(typeof(T)))
{
Subscribers[typeof(T)] = new List<Action<IMessage>>();
}
Subscribers[typeof(T)].Add((IMessage a) => handler((T)a));
}
And this works for the subscribe part, but obviously not for unsubscribe. I could cache the newly created handler-wrapper lambdas somewhere for use when unsubscribing, but to be honest I don't think that is the real solution.
The question
How can I make this work the way I would like? Preferably with some C# "magic" if possible, but I'm aware it may require a completely different approach.
Also, because this will be used in a game and will run for its lifetime, I would like a garbage-free solution if possible.
So the problem is that you are trying to store lists of different types as values in the subscriber dictionary.
One way to get around this might be to store a List<Delegate> and then to use Delegate.DynamicInvoke.
Here's some test code that summarizes the main points:
Dictionary<Type, List<Delegate>> Subscribers = new Dictionary<Type, List<Delegate>>();
void Main()
{
Subscribe<Evt>(ev => Console.WriteLine($"hello {ev.Message}"));
IMessage m = new Evt("spender");
foreach (var subscriber in Subscribers[m.GetType()])
{
subscriber?.DynamicInvoke(m);
}
}
public void Subscribe<T>(Action<T> handler) where T : IMessage
{
if (!Subscribers.ContainsKey(typeof(T)))
{
Subscribers[typeof(T)] = new List<Delegate>();
}
Subscribers[typeof(T)].Add(handler);
}
public interface IMessage{}
public class Evt : IMessage
{
public Evt(string message)
{
this.Message = message;
}
public string Message { get; }
}
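To also cover the Unsubscribe case raised in the question (the wrapper lambda cannot be found again later), one option is to remember the wrapper keyed by the original delegate. This is a rough sketch, with the dictionary shape assumed rather than taken from the answer above:
// Sketch: store the IMessage wrapper per original handler so Unsubscribe can remove it again.
Dictionary<Type, Dictionary<Delegate, Action<IMessage>>> Subscribers =
    new Dictionary<Type, Dictionary<Delegate, Action<IMessage>>>();
public void Subscribe<T>(Action<T> handler) where T : IMessage
{
    if (!Subscribers.ContainsKey(typeof(T)))
    {
        Subscribers[typeof(T)] = new Dictionary<Delegate, Action<IMessage>>();
    }
    // The wrapper does the cast once and is remembered under the original delegate.
    Subscribers[typeof(T)][handler] = (IMessage m) => handler((T)m);
}
public void Unsubscribe<T>(Action<T> handler) where T : IMessage
{
    if (Subscribers.ContainsKey(typeof(T)))
    {
        Subscribers[typeof(T)].Remove(handler); // removes the cached wrapper
    }
}
// Dispatching would enumerate Subscribers[message.GetType()].Values and invoke each wrapper;
// note that each Subscribe still allocates one wrapper delegate, so it is not completely garbage-free.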

Reset Observable timer if an operation is called

I am a newbie with Reactive Extensions, but what is the best way to implement the scenario below using Reactive Extensions:
1- Subscribe to an event that fires every minute, in the constructor
2- If an operation gets called, that subscriber gets reset
3- If nothing happens, i.e. that operation is not called for a minute, the event from step 1 gets fired
Something like this:
public class ImportClient : Carrier<IImportService>, IImportService
{
IObservable<long> proxyCleaner;
void DisposeProxy(long interval)
{
this.Close();
//Dispose proxy
}
public void RunDisposeTimer()
{
proxyCleaner = Observable.Interval(TimeSpan.FromMinutes(1));
proxyCleaner.Subscribe(DisposeProxy);
}
public ImportClient(String endpointConfigurationName) : base(endpointConfigurationName)
{
RunDisposeTimer();
}
public Attach_DTO_OUT AttachImage(AttachImage_DTO_IN source_C)
{
//Reset timer here
//Reset proxyCleaner
using (OperationContextScope scope = new OperationContextScope(this.InnerChannel))
{
AddMessageHeader<Token>(Token);
return base.Channel.AttachImage(source_C);
}
}
}
Because my WCF service is sessionful, I need to dispose of it manually after a specific time.
UPDATE:
I think it might be possible to do this with an ObservableCollection, but how?
private ObservableCollection<string> collection;
public void RunDisposeTimer()
{
collection = new ObservableCollection<string>();
collection.CollectionChanged += Collection_CollectionChanged;
}
private void Collection_CollectionChanged(object sender, System.Collections.Specialized.NotifyCollectionChangedEventArgs e)
{
//Here reset timer
throw new NotImplementedException();
}
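This is essentially the same reset-the-timer scenario as the first question on this page. A hedged sketch of how the Switch() approach from that answer could be applied here: RunDisposeTimer and DisposeProxy are taken from the question, ResetDisposeTimer is an assumed helper, and Observable.Timer is used instead of Interval so the dispose fires only once per idle period.
// Sketch only: every reset pushes a fresh one-shot timer; Switch() drops the previous one.
IDisposable timerSubscription;
readonly Subject<IObservable<long>> timerResets = new Subject<IObservable<long>>();
public void RunDisposeTimer()
{
    timerSubscription = timerResets.Switch().Subscribe(DisposeProxy);
    ResetDisposeTimer();
}
public void ResetDisposeTimer()
{
    // Observable.Timer emits a single value after one minute and then completes.
    timerResets.OnNext(Observable.Timer(TimeSpan.FromMinutes(1)));
}
// AttachImage would call ResetDisposeTimer() at the top of the method,
// so the proxy is only disposed after a full minute without calls.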

Producer/Consumer pattern using threads and EventWaitHandle

I guess this is sort of a code review, but here is my implementation of the producer/consumer pattern. What I would like to know is whether there is a case in which the while loops in the ReceivingThread() or SendingThread() methods might stop executing. Please note that EnqueueSend(DataSendEnqeueInfo info) is called from multiple different threads, and I probably can't use tasks here since I definitely have to consume commands in a separate thread.
private Thread mReceivingThread;
private Thread mSendingThread;
private Queue<DataRecievedEnqeueInfo> mReceivingThreadQueue;
private Queue<DataSendEnqeueInfo> mSendingThreadQueue;
private readonly object mReceivingQueueLock = new object();
private readonly object mSendingQueueLock = new object();
private bool mIsRunning;
EventWaitHandle mRcWaitHandle;
EventWaitHandle mSeWaitHandle;
private void ReceivingThread()
{
while (mIsRunning)
{
mRcWaitHandle.WaitOne();
DataRecievedEnqeueInfo item = null;
while (mReceivingThreadQueue.Count > 0)
{
lock (mReceivingQueueLock)
{
item = mReceivingThreadQueue.Dequeue();
}
ProcessReceivingItem(item);
}
mRcWaitHandle.Reset();
}
}
private void SendingThread()
{
while (mIsRunning)
{
mSeWaitHandle.WaitOne();
while (mSendingThreadQueue.Count > 0)
{
DataSendEnqeueInfo item = null;
lock (mSendingQueueLock)
{
item = mSendingThreadQueue.Dequeue();
}
ProcessSendingItem(item);
}
mSeWaitHandle.Reset();
}
}
internal void EnqueueRecevingData(DataRecievedEnqeueInfo info)
{
lock (mReceivingQueueLock)
{
mReceivingThreadQueue.Enqueue(info);
mRcWaitHandle.Set();
}
}
public void EnqueueSend(DataSendEnqeueInfo info)
{
lock (mSendingQueueLock)
{
mSendingThreadQueue.Enqueue(info);
mSeWaitHandle.Set();
}
}
P.S. The idea here is that I am using WaitHandles to put the threads to sleep when the queues are empty, and to signal them to start when new items are enqueued.
UPDATE
I am just going to leave this here: https://blogs.msdn.microsoft.com/benwilli/2015/09/10/tasks-are-still-not-threads-and-async-is-not-parallel/, for people who might be trying to implement the producer/consumer pattern using TPL or tasks.
Use a BlockingCollection instead of Queue, EventWaitHandle and lock objects:
public class DataInfo { }
private Thread mReceivingThread;
private Thread mSendingThread;
private BlockingCollection<DataInfo> queue;
private CancellationTokenSource receivingCts = new CancellationTokenSource();
private void ReceivingThread()
{
try
{
while (!receivingCts.IsCancellationRequested)
{
// This will block until an item is added to the queue or the cancellation token is cancelled
DataInfo item = queue.Take(receivingCts.Token);
ProcessReceivingItem(item);
}
}
catch (OperationCanceledException)
{
}
}
internal void EnqueueRecevingData(DataInfo info)
{
// When a new item is produced, just add it to the queue
queue.Add(info);
}
// To cancel the receiving thread, cancel the token
private void CancelReceivingThread()
{
receivingCts.Cancel();
}
Personally, for simple producer-consumer problems, I would just use BlockingCollection. There would be no need to manually code your own synchronization logic. The consuming threads will also block if there are no items present in the queue.
Here is what your code might look like if you use this class:
private BlockingCollection<DataRecievedEnqeueInfo> mReceivingThreadQueue = new BlockingCollection<DataRecievedEnqeueInfo>();
private BlockingCollection<DataSendEnqeueInfo> mSendingThreadQueue = new BlockingCollection<DataSendEnqeueInfo>();
public void Stop()
{
// No need for mIsRunning. This causes the enumerables returned by the
// GetConsumingEnumerable() calls below to complete.
mReceivingThreadQueue.CompleteAdding();
mSendingThreadQueue.CompleteAdding();
}
private void ReceivingThread()
{
foreach (DataRecievedEnqeueInfo item in mReceivingThreadQueue.GetConsumingEnumerable())
{
ProcessReceivingItem(item);
}
}
private void SendingThread()
{
foreach (DataSendEnqeueInfo item in mSendingThreadQueue.GetConsumingEnumerable())
{
ProcessSendingItem(item);
}
}
internal void EnqueueRecevingData(DataRecievedEnqeueInfo info)
{
// You can also use TryAdd() if there is a possibility that you
// might add items after you have stopped. Otherwise, Add() can throw
// an exception after CompleteAdding() has been called.
mReceivingThreadQueue.Add(info);
}
public void EnqueueSend(DataSendEnqeueInfo info)
{
mSendingThreadQueue.Add(info);
}
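For completeness, here is a rough sketch of how the consuming threads from the question might be started and shut down with this version; the thread fields are assumed to be the same ones declared in the question, and Start/Shutdown are names chosen here, not part of the answer.
// Sketch: start the consumers once, then let CompleteAdding() end both loops on shutdown.
public void Start()
{
    mReceivingThread = new Thread(ReceivingThread) { IsBackground = true };
    mSendingThread = new Thread(SendingThread) { IsBackground = true };
    mReceivingThread.Start();
    mSendingThread.Start();
}
public void Shutdown()
{
    Stop();                  // CompleteAdding() lets GetConsumingEnumerable() finish draining
    mReceivingThread.Join(); // wait for both consumers to exit their foreach loops
    mSendingThread.Join();
}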
As suggested in the comments, you can also give the TPL Dataflow blocks a try.
As far as I can see, you have two similar pipelines, for receive and send, so I assume that your class hierarchy looks like this:
class EnqueueInfo { }
class DataRecievedEnqeueInfo : EnqueueInfo { }
class DataSendEnqeueInfo : EnqueueInfo { }
We can assemble an abstract class which encapsulates the logic for creating the pipeline and provides the interface for processing the items, like this:
abstract class EnqueueInfoProcessor<T>
where T : EnqueueInfo
{
// here we will store all the messages received before the handling
private readonly BufferBlock<T> _buffer;
// simple action block for actual handling the items
private ActionBlock<T> _action;
// cancellation token to cancel the pipeline
public EnqueueInfoProcessor(CancellationToken token)
{
_buffer = new BufferBlock<T>(new DataflowBlockOptions { CancellationToken = token });
_action = new ActionBlock<T>(item => ProcessItem(item), new ExecutionDataflowBlockOptions
{
MaxDegreeOfParallelism = Environment.ProcessorCount,
CancellationToken = token
});
// we are linking the two blocks so all the items from the buffer
// will flow down to the action block in the order they were received
_buffer.LinkTo(_action, new DataflowLinkOptions { PropagateCompletion = true });
}
public void PostItem(T item)
{
// post the item to the buffer synchronously
_buffer.Post(item);
}
public async Task SendItemAsync(T item)
{
// asynchronously wait for message to be posted
await _buffer.SendAsync(item);
}
// abstract method to implement
protected abstract void ProcessItem(T item);
}
Note that you can also encapsulate the link between the two blocks by using the Encapsulate<TInput, TOutput> method, but in that case you have to handle the Completion of the buffer block properly, if you're using it.
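For illustration, a rough sketch of what that encapsulation might look like; the conversion delegate is a placeholder, not something from the answer:
// Sketch: hide a BufferBlock -> TransformBlock pair behind a single IPropagatorBlock.
IPropagatorBlock<DataRecievedEnqeueInfo, DataSendEnqeueInfo> BuildPipeline(CancellationToken token)
{
    var buffer = new BufferBlock<DataRecievedEnqeueInfo>(
        new DataflowBlockOptions { CancellationToken = token });
    var transform = new TransformBlock<DataRecievedEnqeueInfo, DataSendEnqeueInfo>(
        item => ConvertToSendInfo(item),   // placeholder conversion
        new ExecutionDataflowBlockOptions { CancellationToken = token });
    // PropagateCompletion ensures completing the outer block completes the inner ones too.
    buffer.LinkTo(transform, new DataflowLinkOptions { PropagateCompletion = true });
    return DataflowBlock.Encapsulate(buffer, transform);
}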
After assembling the base class, we just need to override ProcessItem with the receive and send handling logic:
public class SendEnqueueInfoProcessor : EnqueueInfoProcessor<DataSendEnqeueInfo>
{
public SendEnqueueInfoProcessor(CancellationToken token)
: base(token)
{
}
protected override void ProcessItem(DataSendEnqeueInfo item)
{
// send logic here
}
}
public class RecievedEnqueueInfoProcessor : EnqueueInfoProcessor<DataRecievedEnqeueInfo>
{
public RecievedEnqueueInfoProcessor(CancellationToken token)
: base(token)
{
}
protected override void ProcessItem(DataRecievedEnqeueInfo item)
{
// recieve logic here
}
}
You can also create a more complicated pipeline with a TransformBlock<DataRecievedEnqeueInfo, DataSendEnqeueInfo>, if your message flow is one where a DataRecievedEnqeueInfo message becomes a DataSendEnqeueInfo.

Xamarin PCL - Making API calls based on user entry, how to dampen requests?

I have a search task, inside a portable class library, that makes a request to an API when the user enters text in the textbox. This works as expected, but I have a concern about performance at scale: when we have a large userbase all making requests to this API on every key press, I can foresee performance issues.
I have limited the API call so that it only fires when there are at least three valid characters, but I want to dampen this further. I could implement a timer on top of this, but it does not feel like a good solution and Timer is not present in the PCL framework.
Is there a recommended pattern to achieve this type of request dampening?
private async Task GetClubs()
{
try
{
if (!string.IsNullOrWhiteSpace(ClubSearch) && ClubSearch.Replace(" ", "").Length >= 3)
{
Clubs = await dataService.GetClubs(ClubSearch);
}
}
catch (DataServiceException ex)
{
...
}
}
Usually that is done with a timer. When the search text changes, you start (or reuse) a timer which will fire after a delay and execute the search request. If more text is typed during that delay, the timer is reset. Sample code:
public class MyClass {
private readonly Timer _timer;
const int ThrottlePeriod = 500; // ms
public MyClass() {
_timer = new System.Threading.Timer(_ => {
ExecuteRequest();
}, null, Timeout.Infinite, Timeout.Infinite);
}
private string _searchTerm;
public string SearchTerm
{
get { return _searchTerm; }
set
{
_searchTerm = value;
ResetTimer();
}
}
private void ResetTimer() {
_timer.Change(ThrottlePeriod, Timeout.Infinite);
}
private void ExecuteRequest() {
Console.WriteLine(SearchTerm);
}
}
If a timer is not available, you can do the same with Task.Delay:
public class MyClass
{
const int ThrottlePeriod = 500; // ms
private string _searchTerm;
public string SearchTerm
{
get { return _searchTerm; }
set
{
_searchTerm = value;
SearchWithDelay();
}
}
private async void SearchWithDelay() {
var before = this.SearchTerm;
await Task.Delay(ThrottlePeriod);
if (before == this.SearchTerm) {
// did not change while we were waiting
ExecuteRequest();
}
}
private void ExecuteRequest()
{
Console.WriteLine(SearchTerm);
}
}
A cheap/fast way to implement this is with Task.Delay on a background thread:
new Thread(new ThreadStart(async delegate {
    while (true) {
        if (!String.IsNullOrWhiteSpace(seachText)) {
            YourSearchMethod(seachText);
        }
        InvokeOnMainThread(() => {
            // Refresh your datasource on the UI thread
        });
        await Task.Delay(2000);
    }
})).Start();
A PCL-based solution (and an amazingly clean way, with a great framework) is to use ReactiveUI's throttling (Throttle); then you can do things like:
// Throttle searching to every 2 seconds
this.WhenAnyValue(x => x.SearchText)
.Where(x => !String.IsNullOrWhiteSpace(x))
.Throttle(TimeSpan.FromSeconds(2))
.InvokeCommand(SearchCommand);
Ref: http://reactiveui.net
Ref: http://docs.reactiveui.net/en/user-guide/when-any/index.html
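If taking a ReactiveUI dependency is not an option, plain Rx (the System.Reactive package) has the same Throttle operator. A hedged sketch, reusing Clubs, dataService and ClubSearch from the question, with the Subject and method name assumed:
// Sketch: push each keystroke into a Subject and let Throttle collapse bursts of typing.
Subject<string> searchTerms = new Subject<string>();
IDisposable WireUpSearch()
{
    return searchTerms
        .Where(text => !string.IsNullOrWhiteSpace(text) && text.Replace(" ", "").Length >= 3)
        .Throttle(TimeSpan.FromSeconds(2))   // only a value followed by 2 quiet seconds gets through
        .DistinctUntilChanged()
        .Subscribe(async text => Clubs = await dataService.GetClubs(text));
    // Note: Throttle delivers on a scheduler thread; marshal back to the UI thread if needed.
}
// In the text-changed handler: searchTerms.OnNext(ClubSearch);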

Simple in-memory message queue

Our existing implementation of domain events limits (by blocking) publishing to one thread at a time to avoid reentrant calls to handlers:
public interface IDomainEvent {} // Marker interface
public class Dispatcher : IDisposable
{
private readonly SemaphoreSlim semaphore = new SemaphoreSlim(1, 1);
// Subscribe code...
public void Publish(IDomainEvent domainEvent)
{
semaphore.Wait();
try
{
// Get event subscriber(s) from concurrent dictionary...
foreach (Action<IDomainEvent> subscriber in eventSubscribers)
{
subscriber(domainEvent);
}
}
finally
{
semaphore.Release();
}
}
// Dispose pattern...
}
If a handler publishes an event, this will deadlock.
How can I rewrite this to serialize calls to Publish? In other words, if subscribing handler A publishes event B, I'll get:
Handler A called
Handler B called
while preserving the condition of no reentrant calls to handlers in a multithreaded environment.
I do not want to change the public method signature; there's no place in the application to call a method to publish a queue, for instance.
We came up with a way to do it synchronously.
public class Dispatcher : IDisposable
{
private readonly ConcurrentQueue<IDomainEvent> queue = new ConcurrentQueue<IDomainEvent>();
private readonly SemaphoreSlim semaphore = new SemaphoreSlim(1, 1);
// Subscribe code...
public void Publish(IDomainEvent domainEvent)
{
queue.Enqueue(domainEvent);
if (IsPublishing)
{
return;
}
PublishQueue();
}
private void PublishQueue()
{
IDomainEvent domainEvent;
while (queue.TryDequeue(out domainEvent))
{
InternalPublish(domainEvent);
}
}
private void InternalPublish(IDomainEvent domainEvent)
{
semaphore.Wait();
try
{
// Get event subscriber(s) from concurrent dictionary...
foreach (Action<IDomainEvent> subscriber in eventSubscribers)
{
subscriber(domainEvent);
}
}
finally
{
semaphore.Release();
}
// Necessary, as calls to Publish during publishing could have queued events and returned.
PublishQueue();
}
private bool IsPublishing
{
get { return semaphore.CurrentCount < 1; }
}
// Dispose pattern for semaphore...
}
You will have to make Publish asynchronous to achieve that. A naive implementation would be as simple as:
public class Dispatcher : IDisposable {
private readonly BlockingCollection<IDomainEvent> _queue = new BlockingCollection<IDomainEvent>(new ConcurrentQueue<IDomainEvent>());
private readonly CancellationTokenSource _cts = new CancellationTokenSource();
public Dispatcher() {
new Thread(Consume) {
IsBackground = true
}.Start();
}
private List<Action<IDomainEvent>> _subscribers = new List<Action<IDomainEvent>>();
public void AddSubscriber(Action<IDomainEvent> sub) {
_subscribers.Add(sub);
}
private void Consume() {
try {
foreach (var @event in _queue.GetConsumingEnumerable(_cts.Token)) {
try {
foreach (Action<IDomainEvent> subscriber in _subscribers) {
subscriber(@event);
}
}
catch (Exception ex) {
// log, handle
}
}
}
catch (OperationCanceledException) {
// expected
}
}
public void Publish(IDomainEvent domainEvent) {
_queue.Add(domainEvent);
}
public void Dispose() {
_cts.Cancel();
}
}
It can't be done with that interface. You can process the event subscriptions asynchronously to remove the deadlock while still running them serially, but then you can't guarantee the order you described. Another call to Publish might enqueue something (event C) while the handler for event A is running but before it publishes event B. Then event B ends up behind event C in the queue.
As long as Handler A is on equal footing with other clients when it comes to getting an item in the queue, it either has to wait like everyone else (deadlock) or it has to play fairly (first come, first served). The interface you have there doesn't allow the two to be treated differently.
That's not to say you couldn't get up to some shenanigans in your logic to attempt to differentiate them (e.g. based on thread id or something else identifiable, as sketched below), but anything along those lines would be unreliable if you don't control the subscriber code as well.
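For what it's worth, here is a hypothetical sketch of that thread-based differentiation (not a recommendation, and it still deadlocks if a handler hops to another thread before publishing):
// Sketch: a [ThreadStatic] flag marks the thread that is currently dispatching, so a
// reentrant Publish from a handler is queued instead of blocking, while other threads
// still wait their turn on the lock.
public class ReentrancyAwareDispatcher
{
    [ThreadStatic] private static bool isDispatching;
    private readonly object gate = new object();
    private readonly ConcurrentQueue<IDomainEvent> queue = new ConcurrentQueue<IDomainEvent>();
    private readonly List<Action<IDomainEvent>> subscribers = new List<Action<IDomainEvent>>();
    public void Publish(IDomainEvent domainEvent)
    {
        queue.Enqueue(domainEvent);
        if (isDispatching)
        {
            return; // handler A published event B: the loop below picks it up next
        }
        lock (gate)
        {
            isDispatching = true;
            try
            {
                IDomainEvent next;
                while (queue.TryDequeue(out next))
                {
                    foreach (Action<IDomainEvent> subscriber in subscribers)
                    {
                        subscriber(next);
                    }
                }
            }
            finally
            {
                isDispatching = false;
            }
        }
    }
}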
