How to run a Task on a custom TaskScheduler using await? - c#

I have some methods returning Task<T> on which I can await at will. I'd like to have those Tasks executed on a custom TaskScheduler instead of the default one.
var task = GetTaskAsync ();
await task;
I know I can create a new TaskFactory (new CustomScheduler ()) and do a StartNew () from it, but StartNew () takes an action and creates the Task, and I already have the Task (returned behind the scenes by a TaskCompletionSource).
How can I specify my own TaskScheduler for await?

I think what you really want is to do a Task.Run, but with a custom scheduler. StartNew doesn't work intuitively with asynchronous methods; Stephen Toub has a great blog post about the differences between Task.Run and TaskFactory.StartNew.
So, to create your own custom Run, you can do something like this:
private static readonly TaskFactory myTaskFactory = new TaskFactory(
CancellationToken.None, TaskCreationOptions.DenyChildAttach,
TaskContinuationOptions.None, new MyTaskScheduler());
private static Task RunOnMyScheduler(Func<Task> func)
{
return myTaskFactory.StartNew(func).Unwrap();
}
private static Task<T> RunOnMyScheduler<T>(Func<Task<T>> func)
{
return myTaskFactory.StartNew(func).Unwrap();
}
private static Task RunOnMyScheduler(Action func)
{
return myTaskFactory.StartNew(func);
}
private static Task<T> RunOnMyScheduler<T>(Func<T> func)
{
return myTaskFactory.StartNew(func);
}
Then you can execute synchronous or asynchronous methods on your custom scheduler.
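For example, a call site could look like the following sketch (DoCpuWork and DoWorkAsync are hypothetical placeholders for your own methods; awaiting the returned task observes the result or exception as usual):
public static async Task Demo()
{
    // Both delegates are queued to MyTaskScheduler; the async one is
    // unwrapped, so the await completes only when the inner work is done.
    int sum = await RunOnMyScheduler(() => DoCpuWork());            // Func<T>
    string page = await RunOnMyScheduler(() => DoWorkAsync("url")); // Func<Task<T>>
    Console.WriteLine($"{sum}, {page.Length}");
}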

The TaskCompletionSource<T>.Task is constructed without any action and the scheduler
is assigned on the first call to ContinueWith(...) (from Asynchronous Programming with the Reactive Framework and the Task Parallel Library — Part 3).
Thankfully, you can customize the await behavior slightly by implementing your own class deriving from INotifyCompletion and then using it in a pattern similar to await SomeTask.ConfigureAwait(false). The scheduler the continuation should use is configured in the OnCompleted(Action continuation) method (from await anything;).
Here is the usage:
TaskCompletionSource<object> source = new TaskCompletionSource<object>();
public async Task Foo() {
// Force await to schedule the task on the supplied scheduler
await SomeAsyncTask().ConfigureScheduler(scheduler);
}
public Task SomeAsyncTask() { return source.Task; }
Here is a simple implementation of ConfigureScheduler using a Task extension method with the important part in OnCompleted:
public static class TaskExtension {
public static CustomTaskAwaitable ConfigureScheduler(this Task task, TaskScheduler scheduler) {
return new CustomTaskAwaitable(task, scheduler);
}
}
public struct CustomTaskAwaitable {
CustomTaskAwaiter awaitable;
public CustomTaskAwaitable(Task task, TaskScheduler scheduler) {
awaitable = new CustomTaskAwaiter(task, scheduler);
}
public CustomTaskAwaiter GetAwaiter() { return awaitable; }
public struct CustomTaskAwaiter : INotifyCompletion {
Task task;
TaskScheduler scheduler;
public CustomTaskAwaiter(Task task, TaskScheduler scheduler) {
this.task = task;
this.scheduler = scheduler;
}
public void OnCompleted(Action continuation) {
// ContinueWith sets the scheduler to use for the continuation action
task.ContinueWith(x => continuation(), scheduler);
}
public bool IsCompleted { get { return task.IsCompleted; } }
public void GetResult() { }
}
}
Here's a working sample that will compile as a console application:
using System;
using System.Collections.Generic;
using System.Runtime.CompilerServices;
using System.Threading.Tasks;
namespace Example {
class Program {
static TaskCompletionSource<object> source = new TaskCompletionSource<object>();
static TaskScheduler scheduler = new CustomTaskScheduler();
static void Main(string[] args) {
Console.WriteLine("Main Started");
var task = Foo();
Console.WriteLine("Main Continue ");
// Continue Foo() using CustomTaskScheduler
source.SetResult(null);
Console.WriteLine("Main Finished");
}
public static async Task Foo() {
Console.WriteLine("Foo Started");
// Force await to schedule the task on the supplied scheduler
await SomeAsyncTask().ConfigureScheduler(scheduler);
Console.WriteLine("Foo Finished");
}
public static Task SomeAsyncTask() { return source.Task; }
}
public struct CustomTaskAwaitable {
CustomTaskAwaiter awaitable;
public CustomTaskAwaitable(Task task, TaskScheduler scheduler) {
awaitable = new CustomTaskAwaiter(task, scheduler);
}
public CustomTaskAwaiter GetAwaiter() { return awaitable; }
public struct CustomTaskAwaiter : INotifyCompletion {
Task task;
TaskScheduler scheduler;
public CustomTaskAwaiter(Task task, TaskScheduler scheduler) {
this.task = task;
this.scheduler = scheduler;
}
public void OnCompleted(Action continuation) {
// ContinueWith sets the scheduler to use for the continuation action
task.ContinueWith(x => continuation(), scheduler);
}
public bool IsCompleted { get { return task.IsCompleted; } }
public void GetResult() { }
}
}
public static class TaskExtension {
public static CustomTaskAwaitable ConfigureScheduler(this Task task, TaskScheduler scheduler) {
return new CustomTaskAwaitable(task, scheduler);
}
}
public class CustomTaskScheduler : TaskScheduler {
protected override IEnumerable<Task> GetScheduledTasks() { yield break; }
protected override bool TryExecuteTaskInline(Task task, bool taskWasPreviouslyQueued) { return false; }
protected override void QueueTask(Task task) {
TryExecuteTask(task);
}
}
}

There is no way to embed rich async features into a custom TaskScheduler. This class was not designed with async/await in mind. The standard way to use a custom TaskScheduler is as an argument to the Task.Factory.StartNew method. This method does not understand async delegates. It is possible to provide an async delegate, but it is treated as any other delegate that returns some result. To get the actual awaited result of the async delegate one must call Unwrap() on the task returned.
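For instance, a minimal sketch of the Unwrap step (nothing here is specific to custom schedulers):
// StartNew with an async delegate produces a nested Task<Task<int>>;
// Unwrap gives a proxy Task<int> that completes with the inner result.
Task<Task<int>> nested = Task.Factory.StartNew(async () =>
{
    await Task.Delay(100);
    return 42;
});
Task<int> unwrapped = nested.Unwrap(); // completes with 42 once the inner delay finishes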
This is not the problem though. The problem is that the TaskScheduler infrastructure does not treat the async delegate as a single unit of work. Each task is split into multiple mini-tasks (using every await as a separator), and each mini-task is processed individually. This severely restricts the asynchronous functionality that can be implemented on top of this class. As an example here is a custom TaskScheduler that is intended to queue the supplied tasks one at a time (to limit the concurrency in other words):
public class MyTaskScheduler : TaskScheduler
{
private readonly SemaphoreSlim _semaphore = new SemaphoreSlim(1);
protected async override void QueueTask(Task task)
{
await _semaphore.WaitAsync();
try
{
await Task.Run(() => base.TryExecuteTask(task));
await task;
}
finally
{
_semaphore.Release();
}
}
protected override bool TryExecuteTaskInline(Task task,
bool taskWasPreviouslyQueued) => false;
protected override IEnumerable<Task> GetScheduledTasks() { yield break; }
}
The SemaphoreSlim should ensure that only one Task would run at a time. Unfortunately it doesn't work. The semaphore is released prematurely, because the Task passed in the call QueueTask(task) is not the task that represents the whole work of the async delegate, but only the part until the first await. The other parts are passed to the TryExecuteTaskInline method. There is no way to correlate these task-parts, because no identifier or other mechanism is provided. Here is what happens in practice:
var taskScheduler = new MyTaskScheduler();
var tasks = Enumerable.Range(1, 5).Select(n => Task.Factory.StartNew(async () =>
{
Console.WriteLine($"{DateTime.Now:HH:mm:ss.fff} Item {n} Started");
await Task.Delay(1000);
Console.WriteLine($"{DateTime.Now:HH:mm:ss.fff} Item {n} Finished");
}, default, TaskCreationOptions.None, taskScheduler))
.Select(t => t.Unwrap())
.ToArray();
Task.WaitAll(tasks);
Output:
05:29:58.346 Item 1 Started
05:29:58.358 Item 2 Started
05:29:58.358 Item 3 Started
05:29:58.358 Item 4 Started
05:29:58.358 Item 5 Started
05:29:59.358 Item 1 Finished
05:29:59.374 Item 5 Finished
05:29:59.374 Item 4 Finished
05:29:59.374 Item 2 Finished
05:29:59.374 Item 3 Finished
Disaster, all tasks are queued at once.
Conclusion: Customizing the TaskScheduler class is not the way to go when advanced async features are required.
Update: Here is another observation, regarding custom TaskSchedulers in the presence of an ambient SynchronizationContext. The await mechanism by default captures the current SynchronizationContext, or the current TaskScheduler, and invokes the continuation on either the captured context
or the scheduler. If both are present, the current SynchronizationContext is preferred, and the current TaskScheduler is ignored. Below is a demonstration of this behavior, in a WinForms application¹:
private async void Button1_Click(object sender, EventArgs e)
{
await Task.Factory.StartNew(async () =>
{
MessageBox.Show($"{Thread.CurrentThread.ManagedThreadId}, {TaskScheduler.Current}");
await Task.Delay(1000);
MessageBox.Show($"{Thread.CurrentThread.ManagedThreadId}, {TaskScheduler.Current}");
}, default, TaskCreationOptions.None,
TaskScheduler.FromCurrentSynchronizationContext()).Unwrap();
}
Clicking the button causes two messages to pop up sequentially, with this information:
1, System.Threading.Tasks.SynchronizationContextTaskScheduler
1, System.Threading.Tasks.ThreadPoolTaskScheduler
This experiment shows that only the first part of the asynchronous delegate, the part before the first await, was scheduled on the non-default scheduler.
This behavior limits even further the practical usefulness of custom TaskSchedulers in an async/await-enabled environment.
¹ Windows Forms applications have a WindowsFormsSynchronizationContext installed automatically, when the Application.Run method is called.

Can you fit your work into this method call?
await Task.Factory.StartNew(
() => { /* to do what you need */ },
CancellationToken.None, /* you can change as you need */
TaskCreationOptions.None, /* you can change as you need */
customScheduler);

After the comments it looks like you want to control the scheduler on which the code after the await is run.
The compiler creates a continuation from the await that runs on the current SynchronizationContext by default. So your best shot is to set up the SynchronizationContext before calling await.
There are some ways to await a specific context. See Configure Await from Jon Skeet, especially the part about SwitchTo, for more information on how to implement something like this.
EDIT:
The SwitchTo method from TaskEx has been removed, as it was too easy to misuse. See the MSDN Forum for reasons.
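A minimal sketch of the "set up the SynchronizationContext before awaiting" idea mentioned above (MySyncContext is a hypothetical SynchronizationContext of yours that posts callbacks wherever you want the continuation to run):
var previous = SynchronizationContext.Current;
SynchronizationContext.SetSynchronizationContext(new MySyncContext());
try
{
    // The continuation after this await is posted to MySyncContext.
    await GetTaskAsync();
}
finally
{
    SynchronizationContext.SetSynchronizationContext(previous);
}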

Faced with the same issue, I tried to use LimitedConcurrencyLevelTaskScheduler, but it does not support async tasks. So...
I just wrote my own small, simple scheduler that allows running async Tasks on the global ThreadPool (via the Task.Run method), with the ability to limit the current maximum degree of parallelism. It is enough for my exact purposes; maybe it will also help you.
Main demo code (console app, .NET Core 3.1):
static async Task Main(string[] args)
{
//5 tasks to run per time
int concurrentLimit = 5;
var scheduler = new ThreadPoolConcurrentScheduler(concurrentLimit);
//catch all errors in separate event handler
scheduler.OnError += Scheduler_OnError;
// just monitor "live" state and output to console
RunTaskStateMonitor(scheduler);
// simulate adding new tasks "on the fly"
SimulateAddingTasksInParallel(scheduler);
Console.WriteLine("start adding 50 tasks");
//add 50 tasks
for (var i = 1; i <= 50; i++)
{
scheduler.StartNew(myAsyncTask);
}
Console.WriteLine("50 tasks added to scheduler");
Thread.Sleep(1000000);
}
Supporting code (place it in the same class):
private static void Scheduler_OnError(Exception ex)
{
Console.WriteLine(ex.ToString());
}
private static int currentTaskFinished = 0;
//your sample of async task
static async Task myAsyncTask()
{
Console.WriteLine("task started ");
using (HttpClient httpClient = new HttpClient())
{
//just make an http request to ... wikipedia!
//sorry, Jimmy Wales! I assume you will not DDoS wiki :)
var uri = new Uri("https://wikipedia.org/");
var response = await httpClient.GetAsync(uri);
string result = await response.Content.ReadAsStringAsync();
if (string.IsNullOrEmpty(result))
Console.WriteLine("error, await is not working");
else
Console.WriteLine($"task result : site length is {result.Length}");
}
//or simulate it using a sync sleep
//Thread.Sleep(1000);
//and for testing an exception:
//throw new Exception("my custom error");
Console.WriteLine("task finished ");
//just incrementing the count of finished tasks to output in the console
Interlocked.Increment(ref currentTaskFinished);
}
static void SimulateAddingTasksInParallel(ThreadPoolConcurrentScheduler taskScheduler)
{
int runCount = 0;
Task.Factory.StartNew(() =>
{
while (true)
{
runCount++;
if (runCount > 5)
break;
//every 10 sec 5 times
Thread.Sleep(10000);
//adding new 5 tasks from outer task
Console.WriteLine("start adding new 5 tasks!");
for (var i = 1; i <= 5; i++)
{
taskScheduler.StartNew(myAsyncTask);
}
Console.WriteLine("new 5 tasks added!");
}
}, TaskCreationOptions.LongRunning);
}
static void RunTaskStateMonitor(ThreadPoolConcurrentScheduler taskScheduler)
{
int prev = -1;
int prevQueueSize = -1;
int prevFinished = -1;
Task.Factory.StartNew(() =>
{
while (true)
{
// getting current thread count in working state
var currCount = taskScheduler.GetCurrentWorkingThreadCount();
// getting inner queue state
var queueSize = taskScheduler.GetQueueTaskCount();
//just output overall state if something changed
if (prev != currCount || queueSize != prevQueueSize || prevFinished != currentTaskFinished)
{
Console.WriteLine($"Monitor : running tasks:{currCount}, queueLength:{queueSize}. total Finished tasks : " + currentTaskFinished);
prev = currCount;
prevQueueSize = queueSize;
prevFinished = currentTaskFinished;
}
// check it every 10 ms
Thread.Sleep(10);
}
}
, TaskCreationOptions.LongRunning);
}
Scheduler:
public class ThreadPoolConcurrentScheduler
{
private readonly int _limitParallelThreadsCount;
private int _threadInProgressCount = 0;
public delegate void onErrorDelegate(Exception ex);
public event onErrorDelegate OnError;
private ConcurrentQueue<Func<Task>> _taskQueue;
private readonly object _queueLocker = new object();
public ThreadPoolConcurrentScheduler(int limitParallelThreadsCount)
{
//set maximum parallel tasks to run
_limitParallelThreadsCount = limitParallelThreadsCount;
// thread-safe queue to store tasks
_taskQueue = new ConcurrentQueue<Func<Task>>();
}
//main method to start async task
public void StartNew(Func<Task> task)
{
lock (_queueLocker)
{
// checking limit
if (_threadInProgressCount >= _limitParallelThreadsCount)
{
//no free slot - queue the task until a running one finishes
_scheduleTask(task);
}
else
{
_startNewTask(task);
}
}
}
private void _startNewTask(Func<Task> task)
{
Interlocked.Increment(ref _threadInProgressCount);
Task.Run(async () =>
{
try
{
await task();
}
catch (Exception e)
{
//Console.WriteLine(e);
OnError?.Invoke(e);
}
}).ContinueWith(_onTaskEnded);
}
//will be called on task end
private void _onTaskEnded(Task task)
{
lock (_queueLocker)
{
Interlocked.Decrement(ref _threadInProgressCount);
//queue has more priority, so if thread is free - let's check queue first
if (!_taskQueue.IsEmpty)
{
if (_taskQueue.TryDequeue(out var result))
{
_startNewTask(result);
}
}
}
}
private void _scheduleTask(Func<Task> task)
{
_taskQueue.Enqueue(task);
}
//returning in progress task count
public int GetCurrentWorkingThreadCount()
{
return _threadInProgressCount;
}
//return number of tasks waiting to run
public int GetQueueTaskCount()
{
lock (_queueLocker) return _taskQueue.Count;
}
}
A few notes:
First, check the comments on it - maybe it is the worst code ever!
I did not test it in production.
I did not implement cancellation tokens or any other functionality that should be there, but I'm too lazy. Sorry

Related

How can I get the result of my work in a Task

I need to do work in a Task (an infinite loop for monitoring), but how can I get the result of this work?
Is my logic for doing this wrong? I think this is a scope problem.
Here is a simplified example:
The variable is "first" and I want "edit".
namespace my{
public class Program{
public static void Main(string[] args){
Logic p = new Logic();
Task t = new Task(p.process);
t.Start();
Console.WriteLine(p.getVar());// result="first"
}
}
public class Logic{
public string test = "first";
public void process(){
while(true){
//If condition here
this.test = "edit";
}
}
public String getVar(){
return this.test;
}
}
}
It can be done using a custom event. In your case it can be something like:
public event Action<string> OnValueChanged;
Then attach to it
p.OnValueChanged += (newValue) => Console.WriteLine(newValue);
And do not forget to fire it
this.test = "edit";
OnValueChanged?.Invoke(this.test);
Tasks aren't threads; they don't need a .Start call to start them. All examples and tutorials show the use of Task.Run or Task.Factory.StartNew for a reason - tasks are a promise that a function will execute at some point in the future and produce a result. They will run on threads pulled from a ThreadPool when a task scheduler decides they should. Creating cold tasks and calling .Start doesn't guarantee they will start; it simply makes the code a lot more difficult to read.
In the simplest case, polling e.g. a remote HTTP endpoint could be as simple as:
public static async Task Main()
{
var client=new HttpClient { BaseAddress = new Uri(serverUrl) };
while(true)
{
var response=await client.GetAsync(relativeServiceUrl);
if(!response.IsSuccessStatusCode)
{
//That was an error, do something with it
}
await Task.Delay(1000);
}
}
There's no need to start a new Task because GetAsync is asynchronous. WCF and ADO.NET also provide asynchronous execution methods.
If there's no asynchronous method to call, or if we need to perform some heavy work before the async call, we can use Task.Run to start a method in parallel and await it to finish:
public bool CheckThatService(string serviceUrl)
{
....
}
public static async Task Main()
{
var url="...";
//...
while(true)
{
var ok=await Task.Run(()=>CheckThatService(url));
if(!ok)
{
//That was an error, do something with it
}
await Task.Delay(1000);
}
}
What if we want to test multiple systems in parallel? We can start multiple tasks in parallel, await all of them to complete and check their results:
public static async Task Main()
{
var urls=new[]{"...","..."};
//...
while(true)
{
var tasks=urls.Select(url=>Task.Run(()=>CheckThatService(url)));
var responses=await Task.WhenAll(tasks);
foreach(var response in responses)
{
//Check the value, do something
}
await Task.Delay(1000);
}
}
Task.WhenAll returns an array with the results in the order the tasks were created, so we could check the index to find the original URL. A better idea would be to return the result and the URL together, e.g. using tuples:
public static (bool ok,string url) CheckThatService(string serviceUrl)
{
....
return (true,serviceUrl);
}
The code wouldn't change a lot:
var tasks=urls.Select(url=>Task.Run(()=>CheckThatService(url)));
var responses=await Task.WhenAll(tasks);
foreach(var response in responses.Where(resp=>!resp.ok))
{
//Check the value, do something
}
What if we wanted to store the results from all the calls? We can't use a List or Queue because they aren't thread safe. We can use a ConcurrentQueue instead:
static ConcurrentQueue<string> _results=new ConcurrentQueue<string>();
public static (bool ok,string url) CheckThatService(string serviceUrl)
{
....
_results.Enqueue(someresult);
return (true,url);
}
If we want to report progress regularly we can use IProgress<T> as shown in Enabling Progress and Cancellation in Async APIs.
We could put all the monitoring code in a separate method/class that accepts an IProgress<T> parameter with a progress object that can report success, error messages and the URL that caused them, e.g.:
class MonitorDTO
{
public string Url{get;set;}
public bool Success{get;set;}
public string Message{get;set;}
public MonitorDTO(string url,bool success,string msg)
{
//...
}
}
class MyMonitor
{
string[] _urls;
public MyMonitor(string[] urls)
{
_urls=urls;
}
public async Task Run(IProgress<MonitorDTO> progress)
{
while(true)
{
foreach(var url in _urls)
{
var result=await Task.Run(()=>CheckThatService(url));
if(!result.ok)
{
progress.Report(new MonitorDTO(url,result.ok,"some message"));
}
}
await Task.Delay(1000);
}
}
}
This class could be used in this way:
public static async Task Main()
{
var urls=new[]{....};
var monitor=new MyMonitor(urls);
var progress=new Progress<MonitorDTO>(pg=>{
Console.WriteLine($"{pg.Success} for {pg.Url}: {pg.Message}");
});
await monitor.Run(progress);
}
Enabling Progress and Cancellation in Async APIs shows how to use the CancellationTokenSource to implement another important part of a monitoring class - cancelling it. The monitoring method could check the status of a cancellation token periodically and stop monitoring when it's raised:
public async Task Run(IProgress<MonitorDTO> progress,CancellationToken ct)
{
while(!ct.IsCancellationRequested)
{
//...
}
}
public static async Task Main()
{
var urls=new[]{....};
var monitor=new MyMonitor(urls);
var progress=new Progress<MonitorDTO>(pg=>{
Console.WriteLine($"{pg.Success} for {pg.Url}: {pg.Message}");
});
var cts = new CancellationTokenSource();
//Not awaiting yet!
var monitorTask=monitor.Run(progress,cts.Token);
//Keep running until the first keypress
Console.ReadKey();
//Cancel and wait for the monitoring class to gracefully stop
cts.Cancel();
await monitorTask;
}
In this case the loop will exit when the CancellationToken is raised. By not awaiting on MyMonitor.Run() we can keep working on the main thread until an event occurs that signals monitoring should stop.
The getVar method is executed before the process method.
Make sure that you wait until your task is finished before you call the getVar method.
Logic p = new Logic();
Task t = new Task(p.process);
t.Start();
t.Wait(); // Add this line!
Console.WriteLine(p.getVar());
If you want to learn more about the Wait method, please check this link.

Task scheduler: when awaiting in a Task.Factory.StartNew, is the thread returned to the pool?

I'm implementing a worker engine with an upper limit to concurrency. I'm using a semaphore to wait until concurrency drops below the maximum, then use Task.Factory.StartNew to wrap the async handler in a try/catch, with a finally which releases the semaphore.
I realise this creates threads on the thread pool - but my question is, when one of those task-running threads actually awaits (on a real IO call or wait handle), is the thread returned to the pool, as I'd hope it would be?
If there's a better way to implement a task scheduler with limited concurrency where the work handler is an async method (returns Task), I'd love to hear it too. Or, let's say ideally, if there's a way to queue up an async method (again, it's a Task-returning async method) that feels less dodgy than wrapping it in a synchronous delegate and passing it to Task.Factory.StartNew, that would seem perfect..?
(This also makes me think that there are two kinds of parallelism here: how many tasks are being processed overall, but also how many continuations are running on different threads concurrently. Might be cool to have configurable options for both, though not a fixed requirement..)
Edit: snippet:
concurrencySemaphore.Wait(cancelToken);
deferRelease = false;
try
{
var result = GetWorkItem();
if (result == null)
{ // no work, wait for new work or exit signal
signal = WaitHandle.WaitAny(signals);
continue;
}
deferRelease = true;
tasks.Add(Task.Factory.StartNew(() =>
{
try
{
DoWorkHereAsync(result); // guess I'd think to .GetAwaiter().GetResult() here.. not run this yet
}
finally
{
concurrencySemaphore.Release();
}
}, cancelToken));
}
finally
{
if (!deferRelease)
{
concurrencySemaphore.Release();
}
}
Here is an example of a TaskWorker that will not produce countless worker threads.
The magic is done by awaiting SemaphoreSlim.WaitAsync(), which completes asynchronously without blocking a thread while waiting.
class TaskWorker
{
private readonly SemaphoreSlim _semaphore;
public TaskWorker(int maxDegreeOfParallelism)
{
if (maxDegreeOfParallelism <= 0)
{
throw new ArgumentOutOfRangeException(nameof(maxDegreeOfParallelism));
}
_semaphore = new SemaphoreSlim(maxDegreeOfParallelism, maxDegreeOfParallelism);
}
public async Task RunAsync(Func<Task> taskFactory, CancellationToken cancellationToken = default(CancellationToken))
{
// No ConfigureAwait(false) here to keep the SyncContext if any
// for the real task
await _semaphore.WaitAsync(cancellationToken);
try
{
await taskFactory().ConfigureAwait(false);
}
finally
{
_semaphore.Release(1);
}
}
public async Task<T> RunAsync<T>(Func<Task<T>> taskFactory, CancellationToken cancellationToken = default(CancellationToken))
{
await _semaphore.WaitAsync(cancellationToken);
try
{
return await taskFactory().ConfigureAwait(false);
}
finally
{
_semaphore.Release(1);
}
}
}
and a simple console app to test
class Program
{
static void Main(string[] args)
{
var worker = new TaskWorker(1);
var cts = new CancellationTokenSource();
var token = cts.Token;
var tasks = Enumerable.Range(1, 10)
.Select(e => worker.RunAsync(() => SomeWorkAsync(e, token), token))
.ToArray();
Task.WhenAll(tasks).GetAwaiter().GetResult();
}
static async Task SomeWorkAsync(int id, CancellationToken cancellationToken)
{
Console.WriteLine($"Some Started {id}");
await Task.Delay(2000, cancellationToken).ConfigureAwait(false);
Console.WriteLine($"Some Finished {id}");
}
}
Update
TaskWorker implementing IDisposable
class TaskWorker : IDisposable
{
private readonly CancellationTokenSource _cts = new CancellationTokenSource();
private readonly SemaphoreSlim _semaphore;
private readonly int _maxDegreeOfParallelism;
public TaskWorker(int maxDegreeOfParallelism)
{
if (maxDegreeOfParallelism <= 0)
{
throw new ArgumentOutOfRangeException(nameof(maxDegreeOfParallelism));
}
_maxDegreeOfParallelism = maxDegreeOfParallelism;
_semaphore = new SemaphoreSlim(maxDegreeOfParallelism, maxDegreeOfParallelism);
}
public async Task RunAsync(Func<Task> taskFactory, CancellationToken cancellationToken = default(CancellationToken))
{
ThrowIfDisposed();
using (var cts = CancellationTokenSource.CreateLinkedTokenSource(cancellationToken, _cts.Token))
{
// No ConfigureAwait(false) here to keep the SyncContext if any
// for the real task
await _semaphore.WaitAsync(cts.Token);
try
{
await taskFactory().ConfigureAwait(false);
}
finally
{
_semaphore.Release(1);
}
}
}
public async Task<T> RunAsync<T>(Func<Task<T>> taskFactory, CancellationToken cancellationToken = default(CancellationToken))
{
ThrowIfDisposed();
using (var cts = CancellationTokenSource.CreateLinkedTokenSource(cancellationToken, _cts.Token))
{
await _semaphore.WaitAsync(cts.Token);
try
{
return await taskFactory().ConfigureAwait(false);
}
finally
{
_semaphore.Release(1);
}
}
}
private void ThrowIfDisposed()
{
if (disposedValue)
{
throw new ObjectDisposedException(this.GetType().FullName);
}
}
#region IDisposable Support
private bool disposedValue = false;
protected virtual void Dispose(bool disposing)
{
if (!disposedValue)
{
if (disposing)
{
_cts.Cancel();
// consume all semaphore slots
for (int i = 0; i < _maxDegreeOfParallelism; i++)
{
_semaphore.WaitAsync().GetAwaiter().GetResult();
}
_semaphore.Dispose();
_cts.Dispose();
}
disposedValue = true;
}
}
public void Dispose()
{
Dispose(true);
}
#endregion
}
You can think of the thread as being returned to the ThreadPool, even though it is not actually a "return". The thread simply picks up the next queued item when the async operation starts.
I would suggest you look at Task.Run instead of Task.Factory.StartNew; see Task.Run vs Task.Factory.StartNew.
Also have a look at TPL Dataflow. I think it will match your requirements.
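A rough illustration of that point (a minimal sketch; the exact thread ids printed will vary, and the second may differ from the first):
await Task.Run(async () =>
{
    Console.WriteLine($"Before await: thread {Thread.CurrentThread.ManagedThreadId}");
    await Task.Delay(1000); // no thread is blocked while the delay is pending
    Console.WriteLine($"After await: thread {Thread.CurrentThread.ManagedThreadId}");
});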

Queue of async tasks with throttling which supports multi-threading

I need to implement a library to call the vk.com API. The problem is that the API supports only 3 requests per second. I would like the API to be asynchronous.
Important: the API should support safe access from multiple threads.
My idea is to implement some class called throttler which allows no more than 3 requests/second and delays other requests.
The interface is as follows:
public interface IThrottler : IDisposable
{
Task<TResult> Throttle<TResult>(Func<Task<TResult>> task);
}
The usage is like this:
var audio = await throttler.Throttle(() => api.MyAudio());
var messages = await throttler.Throttle(() => api.ReadMessages());
var audioLyrics = await throttler.Throttle(() => api.AudioLyrics(audioId));
/// A delay should happen here because 3 requests have already executed
var photo = await throttler.Throttle(() => api.MyPhoto());
How do I implement the throttler?
Currently I have implemented it as a queue which is processed by a background thread.
public Task<TResult> Throttle<TResult>(Func<Task<TResult>> task)
{
/// TaskRequest has method Run() to run task
/// TaskRequest uses TaskCompletionSource to provide new task
/// which is resolved when queue processed til this element.
var request = new TaskRequest<TResult>(task);
requestQueue.Enqueue(request);
return request.ResultTask;
}
This is shortened code of the background thread loop which processes the queue:
private void ProcessQueue(object state)
{
while (true)
{
IRequest request;
while (requestQueue.TryDequeue(out request))
{
/// Delay method calculates actual delay value and calls Thread.Sleep()
Delay();
request.Run();
}
}
}
Is it possible to implement this without a background thread?
So we'll start out with a solution to a simpler problem, that of creating a queue that processes up to N tasks concurrently, rather than throttling to N tasks started per second, and build on that:
public class TaskQueue
{
private SemaphoreSlim semaphore;
public TaskQueue()
{
semaphore = new SemaphoreSlim(1);
}
public TaskQueue(int concurrentRequests)
{
semaphore = new SemaphoreSlim(concurrentRequests);
}
public async Task<T> Enqueue<T>(Func<Task<T>> taskGenerator)
{
await semaphore.WaitAsync();
try
{
return await taskGenerator();
}
finally
{
semaphore.Release();
}
}
public async Task Enqueue(Func<Task> taskGenerator)
{
await semaphore.WaitAsync();
try
{
await taskGenerator();
}
finally
{
semaphore.Release();
}
}
}
We'll also use the following helper methods to propagate the result of a Task to a TaskCompletionSource:
public static void Match<T>(this TaskCompletionSource<T> tcs, Task<T> task)
{
task.ContinueWith(t =>
{
switch (t.Status)
{
case TaskStatus.Canceled:
tcs.SetCanceled();
break;
case TaskStatus.Faulted:
tcs.SetException(t.Exception.InnerExceptions);
break;
case TaskStatus.RanToCompletion:
tcs.SetResult(t.Result);
break;
}
});
}
public static void Match<T>(this TaskCompletionSource<T> tcs, Task task)
{
Match(tcs, task.ContinueWith(t => default(T)));
}
Now for our actual solution: each time we need to perform a throttled operation, we create a TaskCompletionSource, then go into our TaskQueue and add an item that starts the task, matches the TCS to its result, doesn't await it, and then delays the task queue for 1 second. The task queue will then not allow a task to start until there are no longer N tasks started in the past second, while the result of the operation itself is the same as the created Task:
public class Throttler
{
private TaskQueue queue;
public Throttler(int requestsPerSecond)
{
queue = new TaskQueue(requestsPerSecond);
}
public Task<T> Enqueue<T>(Func<Task<T>> taskGenerator)
{
TaskCompletionSource<T> tcs = new TaskCompletionSource<T>();
var unused = queue.Enqueue(() =>
{
tcs.Match(taskGenerator());
return Task.Delay(TimeSpan.FromSeconds(1));
});
return tcs.Task;
}
public Task Enqueue(Func<Task> taskGenerator)
{
TaskCompletionSource<bool> tcs = new TaskCompletionSource<bool>();
var unused = queue.Enqueue(() =>
{
tcs.Match(taskGenerator());
return Task.Delay(TimeSpan.FromSeconds(1));
});
return tcs.Task;
}
}
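Usage then mirrors the interface from the question, with Enqueue in place of Throttle (the api.* calls are the hypothetical vk.com wrappers from the question):
var throttler = new Throttler(3);
// At most 3 of these are started within any one-second window.
var audio = await throttler.Enqueue(() => api.MyAudio());
var messages = await throttler.Enqueue(() => api.ReadMessages());
var audioLyrics = await throttler.Enqueue(() => api.AudioLyrics(audioId));
var photo = await throttler.Enqueue(() => api.MyPhoto()); // delayed until a slot frees up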
I solved a similar problem using a wrapper around SemaphoreSlim. In my scenario, I had some other throttling mechanisms as well, and I needed to make sure that requests didn't hit the external API too often even if request number 1 took longer to reach the API than request number 3. My solution was to use a wrapper around SemaphoreSlim that had to be released by the caller, but the actual SemaphoreSlim would not be released until a set time had passed.
public class TimeGatedSemaphore
{
private readonly SemaphoreSlim semaphore;
public TimeGatedSemaphore(int maxRequest, TimeSpan minimumHoldTime)
{
semaphore = new SemaphoreSlim(maxRequest);
MinimumHoldTime = minimumHoldTime;
}
public TimeSpan MinimumHoldTime { get; }
public async Task<IDisposable> WaitAsync()
{
await semaphore.WaitAsync();
return new InternalReleaser(semaphore, Task.Delay(MinimumHoldTime));
}
private class InternalReleaser : IDisposable
{
private readonly SemaphoreSlim semaphoreToRelease;
private readonly Task notBeforeTask;
public InternalReleaser(SemaphoreSlim semaphoreSlim, Task dependantTask)
{
semaphoreToRelease = semaphoreSlim;
notBeforeTask = dependantTask;
}
public void Dispose()
{
notBeforeTask.ContinueWith(_ => semaphoreToRelease.Release());
}
}
}
Example usage:
private TimeGatedSemaphore requestThrottler = new TimeGatedSemaphore(3, TimeSpan.FromSeconds(1));
public async Task<T> MyRequestSenderHelper(string endpoint)
{
using (await requestThrottler.WaitAsync())
return await SendRequestToAPI(endpoint);
}
Here is one solution that uses a Stopwatch:
public class Throttler : IThrottler
{
private readonly Stopwatch m_Stopwatch;
private int m_NumberOfRequestsInLastSecond;
private readonly int m_MaxNumberOfRequestsPerSecond;
public Throttler(int max_number_of_requests_per_second)
{
m_MaxNumberOfRequestsPerSecond = max_number_of_requests_per_second;
m_Stopwatch = Stopwatch.StartNew();
}
public async Task<TResult> Throttle<TResult>(Func<Task<TResult>> task)
{
var elapsed = m_Stopwatch.Elapsed;
if (elapsed > TimeSpan.FromSeconds(1))
{
m_NumberOfRequestsInLastSecond = 1;
m_Stopwatch.Restart();
return await task();
}
if (m_NumberOfRequestsInLastSecond >= m_MaxNumberOfRequestsPerSecond)
{
TimeSpan time_to_wait = TimeSpan.FromSeconds(1) - elapsed;
await Task.Delay(time_to_wait);
m_NumberOfRequestsInLastSecond = 1;
m_Stopwatch.Restart();
return await task();
}
m_NumberOfRequestsInLastSecond++;
return await task();
}
}
Here is how this code can be tested:
class Program
{
static void Main(string[] args)
{
DoIt();
Console.ReadLine();
}
static async Task DoIt()
{
Func<Task<int>> func = async () =>
{
await Task.Delay(100);
return 1;
};
Throttler throttler = new Throttler(3);
for (int i = 0; i < 10; i++)
{
var result = await throttler.Throttle(func);
Console.WriteLine(DateTime.Now);
}
}
}
You can use this as a generic helper (the tasks passed in must be cold, i.e. created but not yet started, since it calls Start on them):
public class TaskThrottle
{
private readonly SemaphoreSlim _semaphore;
public TaskThrottle(int maxTasksToRunInParallel)
{
_semaphore = new SemaphoreSlim(maxTasksToRunInParallel);
}
public void TaskThrottler<T>(IEnumerable<Task<T>> tasks, int timeoutInMilliseconds, CancellationToken cancellationToken = default(CancellationToken)) where T : class
{
// Get Tasks as List
var taskList = tasks as List<Task<T>> ?? tasks.ToList();
var postTasks = new List<Task<int>>();
// When a task completes, release its semaphore slot
taskList.ForEach(x =>
{
postTasks.Add(x.ContinueWith(y => _semaphore.Release(), cancellationToken));
});
taskList.ForEach(x =>
{
// Wait for an open slot
_semaphore.Wait(timeoutInMilliseconds, cancellationToken);
cancellationToken.ThrowIfCancellationRequested();
x.Start();
});
Task.WaitAll(taskList.ToArray(), cancellationToken);
}
}
Edit: this solution works, but use it only if it is OK to process all requests serially (on one thread). Otherwise use the solution accepted as the answer.
Well, thanks to Best way in .NET to manage queue of tasks on a separate (single) thread.
My question is almost a duplicate, except for adding a delay before execution, which is actually simple.
The main helper here is the SemaphoreSlim class, which allows restricting the degree of parallelism.
So, first create a semaphore:
// Semaphore allows 1 thread to run concurrently.
private readonly SemaphoreSlim semaphore = new SemaphoreSlim(1, 1);
And the final version of Throttle looks like this:
public async Task<TResult> Throttle<TResult>(Func<Task<TResult>> task)
{
await semaphore.WaitAsync();
try
{
await delaySource.Delay();
return await task();
}
finally
{
semaphore.Release();
}
}
The delay source is also pretty simple:
private class TaskDelaySource
{
private readonly int maxTasks;
private readonly TimeSpan inInterval;
private readonly Queue<long> ticks = new Queue<long>();
public TaskDelaySource(int maxTasks, TimeSpan inInterval)
{
this.maxTasks = maxTasks;
this.inInterval = inInterval;
}
public async Task Delay()
{
// We will measure time of last maxTasks tasks.
while (ticks.Count > maxTasks)
ticks.Dequeue();
if (ticks.Any())
{
var now = DateTime.UtcNow.Ticks;
var lastTick = ticks.First();
// Calculate interval between last maxTasks task and current time
var intervalSinceLastTask = TimeSpan.FromTicks(now - lastTick);
if (intervalSinceLastTask < inInterval)
await Task.Delay((int)(inInterval - intervalSinceLastTask).TotalMilliseconds);
}
ticks.Enqueue(DateTime.UtcNow.Ticks);
}
}

Background thread loop and two-way-communication

I have already used BackgroundWorker and Task to do something in the background and then post results back to the UI. I have even used BackgroundWorker and ReportProgress with an endless loop (besides cancellation) to continuously post things to the UI thread.
But this time I need a more controllable scenario:
The background thread continuously polls other systems. With Invoke it can send updates to the UI. But how can the UI send messages to the background thread? Like changed settings.
In fact I am asking for the best .NET practice to have a worker thread with these specifics:
Runs in background, does not block UI
Can send updates to UI (Invoke, Dispatch)
Runs in endless loop but can be paused, resumed and stopped in a proper way
UI thread can send updated settings to the background thread
In my scenario I still use WinForms but I guess it should not matter? I will convert the application to WPF later.
Which best practice do you suggest?
I would use TPL and a custom task scheduler for this, similar to Stephen Toub's StaTaskScheduler. That's what WorkerWithTaskScheduler implements below. In this case, the worker thread is also a task scheduler, which can run arbitrary Task items (with ExecutePendingTasks) while doing the work on its main loop.
Executing a lambda wrapped as a TPL Task on the worker thread's context is a very convenient way to send the worker thread a message and get back the result. This can be done synchronously with WorkerWithTaskScheduler.Run().Wait/Result or asynchronously with await WorkerWithTaskScheduler.Run(). Note how ContinueExecution and WaitForPendingTasks are used to pause/resume/end the worker's main loop. I hope the code is self-explanatory, but let me know if I should clarify anything.
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
namespace Console_21628490
{
// Test
class Program
{
static async Task DoWorkAsync()
{
Console.WriteLine("Initial thread: " + Thread.CurrentThread.ManagedThreadId);
// the worker thread lambda
Func<WorkerWithTaskScheduler<int>, int> workAction = (worker) =>
{
var result = 0;
Console.WriteLine("Worker thread: " + Thread.CurrentThread.ManagedThreadId);
while (worker.ContinueExecution)
{
// observe cancellation
worker.Token.ThrowIfCancellationRequested();
// executed pending tasks scheduled with WorkerWithTaskScheduler.Run
worker.ExecutePendingTasks();
// do the work item
Thread.Sleep(200); // simulate work payload
result++;
Console.Write("\rDone so far: " + result);
if (result > 100)
break; // done after 100 items
}
return result;
};
try
{
// cancel in 30s
var cts = new CancellationTokenSource(30000);
// start the worker
var worker = new WorkerWithTaskScheduler<int>(workAction, cts.Token);
// pause upon Enter
Console.WriteLine("\nPress Enter to pause...");
Console.ReadLine();
worker.WaitForPendingTasks = true;
// resume upon Enter
Console.WriteLine("\nPress Enter to resume...");
Console.ReadLine();
worker.WaitForPendingTasks = false;
// send a "message", i.e. run a lambda inside the worker thread
var response = await worker.Run(() =>
{
// do something in the context of the worker thread
return Thread.CurrentThread.ManagedThreadId;
}, cts.Token);
Console.WriteLine("\nReply from Worker thread: " + response);
// End upon Enter
Console.WriteLine("\nPress Enter to stop...");
Console.ReadLine();
// worker.EndExecution() to get the result gracefully
worker.ContinueExecution = false; // or worker.Cancel() to throw
var result = await worker.WorkerTask;
Console.Write("\nFinished, result: " + result);
}
catch (Exception ex)
{
while (ex is AggregateException)
ex = ex.InnerException;
Console.WriteLine(ex.Message);
}
}
static void Main(string[] args)
{
DoWorkAsync().Wait();
Console.WriteLine("\nPress Enter to Exit.");
Console.ReadLine();
}
}
//
// WorkerWithTaskScheduler
//
public class WorkerWithTaskScheduler<TResult> : TaskScheduler, IDisposable
{
readonly CancellationTokenSource _workerCts;
Task<TResult> _workerTask;
readonly BlockingCollection<Task> _pendingTasks;
Thread _workerThread;
volatile bool _continueExecution = true;
volatile bool _waitForTasks = false;
// start the main loop
public WorkerWithTaskScheduler(
Func<WorkerWithTaskScheduler<TResult>, TResult> executeMainLoop,
CancellationToken token)
{
_pendingTasks = new BlockingCollection<Task>();
_workerCts = CancellationTokenSource.CreateLinkedTokenSource(token);
_workerTask = Task.Factory.StartNew<TResult>(() =>
{
_workerThread = Thread.CurrentThread;
return executeMainLoop(this);
}, _workerCts.Token, TaskCreationOptions.LongRunning, TaskScheduler.Default);
}
// a sample action for WorkerWithTaskScheduler constructor
public static void ExecuteMainLoop(WorkerWithTaskScheduler<TResult> worker)
{
while (worker.ContinueExecution)
{
worker.Token.ThrowIfCancellationRequested();
worker.ExecutePendingTasks();
}
}
// get the Task
public Task<TResult> WorkerTask
{
get { return _workerTask; }
}
// get CancellationToken
public CancellationToken Token
{
get { return _workerCts.Token; }
}
// check/set if the main loop should continue
public bool ContinueExecution
{
get { return _continueExecution; }
set { _continueExecution = value; }
}
// request cancellation
public void Cancel()
{
_workerCts.Cancel();
}
// check if we're on the correct thread
public void VerifyWorkerThread()
{
if (Thread.CurrentThread != _workerThread)
throw new InvalidOperationException("Invalid thread.");
}
// check if the worker task itself is still alive
public void VerifyWorkerTask()
{
if (_workerTask == null || _workerTask.IsCompleted)
throw new InvalidOperationException("The worker thread has ended.");
}
// make ExecutePendingTasks block or not block
public bool WaitForPendingTasks
{
get { return _waitForTasks; }
set
{
_waitForTasks = value;
if (value) // wake it up
Run(() => { }, this.Token);
}
}
// execute all pending tasks and return
public void ExecutePendingTasks()
{
VerifyWorkerThread();
while (this.ContinueExecution)
{
this.Token.ThrowIfCancellationRequested();
Task item;
if (_waitForTasks)
{
item = _pendingTasks.Take(this.Token);
}
else
{
if (!_pendingTasks.TryTake(out item))
break;
}
TryExecuteTask(item);
}
}
//
// TaskScheduler methods
//
protected override void QueueTask(Task task)
{
_pendingTasks.Add(task);
}
protected override IEnumerable<Task> GetScheduledTasks()
{
return _pendingTasks.ToArray();
}
protected override bool TryExecuteTaskInline(
Task task, bool taskWasPreviouslyQueued)
{
return _workerThread == Thread.CurrentThread &&
TryExecuteTask(task);
}
public override int MaximumConcurrencyLevel
{
get { return 1; }
}
public void Dispose()
{
if (_workerTask != null)
{
_workerCts.Cancel();
_workerTask.Wait();
_pendingTasks.Dispose();
_workerTask = null;
}
}
//
// Task.Factory.StartNew wrappers using this task scheduler
//
public Task Run(Action action, CancellationToken token)
{
VerifyWorkerTask();
return Task.Factory.StartNew(action, token, TaskCreationOptions.None, this);
}
public Task<T> Run<T>(Func<T> action, CancellationToken token)
{
VerifyWorkerTask();
return Task.Factory.StartNew(action, token, TaskCreationOptions.None, this);
}
public Task Run(Func<Task> action, CancellationToken token)
{
VerifyWorkerTask();
return Task.Factory.StartNew(action, token, TaskCreationOptions.None, this).Unwrap();
}
public Task<T> Run<T>(Func<Task<T>> action, CancellationToken token)
{
VerifyWorkerTask();
return Task.Factory.StartNew(action, token, TaskCreationOptions.None, this).Unwrap();
}
}
}
To implement worker-to-client notifications, you can use the IProgress<T> pattern (example of this).
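For reference, a minimal sketch of that pattern (statusLabel is a hypothetical WinForms control; Progress<T> captures the SynchronizationContext it is created on, so the callback runs on the UI thread):
// Created on the UI thread, e.g. inside a button click handler.
IProgress<string> progress = new Progress<string>(msg => statusLabel.Text = msg);
await Task.Run(() =>
{
    // Worker thread: Report marshals the callback back to the UI thread.
    progress.Report("Polling other systems...");
    // ... do the polling ...
    progress.Report("Done.");
});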
The first thing that comes to mind, and the cleanest approach imo, is to have the continuously running background thread method be an instance method of a class. This class instance can then expose properties/methods that allow others to change state (e.g. through the UI); some locking may be required since you are reading/updating state from different threads. A rough sketch of that shape follows below.
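All names here (PollingWorker, SetInterval) are illustrative; the lock guards the setting shared between the UI and worker threads:
public class PollingWorker
{
    private readonly object _gate = new object();
    private TimeSpan _interval = TimeSpan.FromSeconds(1);
    private volatile bool _stopRequested;

    // Called from the UI thread to push an updated setting to the worker.
    public void SetInterval(TimeSpan interval) { lock (_gate) _interval = interval; }

    public void Stop() => _stopRequested = true;

    // Runs on a background thread; progress callbacks are marshalled to the UI.
    public void Run(IProgress<string> progress)
    {
        while (!_stopRequested)
        {
            TimeSpan interval;
            lock (_gate) interval = _interval;

            progress.Report("polled other systems"); // update the UI
            Thread.Sleep(interval);
        }
    }
}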

What is correct way to combine long-running tasks with async / await pattern?

I have a "High-Precision" timer class that I need to be able to be start, stop & pause / resume. To do this, I'm tying together a couple of different examples I found on the internet, but I'm not sure if I'm using Tasks with asnyc / await correctly.
Here is my relevant code:
//based on http://haukcode.wordpress.com/2013/01/29/high-precision-timer-in-netc/
public class HighPrecisionTimer : IDisposable
{
Task _task;
CancellationTokenSource _cancelSource;
//based on http://blogs.msdn.com/b/pfxteam/archive/2013/01/13/cooperatively-pausing-async-methods.aspx
PauseTokenSource _pauseSource;
Stopwatch _watch;
Stopwatch Watch { get { return _watch ?? (_watch = Stopwatch.StartNew()); } }
public bool IsPaused
{
get { return _pauseSource != null && _pauseSource.IsPaused; }
private set
{
if (value)
{
_pauseSource = new PauseTokenSource();
}
else
{
_pauseSource.IsPaused = false;
}
}
}
public bool IsRunning { get { return !IsPaused && _task != null && _task.Status == TaskStatus.Running; } }
public void Start()
{
if (IsPaused)
{
IsPaused = false;
}
else if (!IsRunning)
{
_cancelSource = new CancellationTokenSource();
_task = new Task(ExecuteAsync, _cancelSource.Token, TaskCreationOptions.LongRunning);
_task.Start();
}
}
public void Stop()
{
if (_cancelSource != null)
{
_cancelSource.Cancel();
}
}
public void Pause()
{
if (!IsPaused)
{
if (_watch != null)
{
_watch.Stop();
}
}
IsPaused = !IsPaused;
}
async void ExecuteAsync()
{
while (!_cancelSource.IsCancellationRequested)
{
if (_pauseSource != null && _pauseSource.IsPaused)
{
await _pauseSource.Token.WaitWhilePausedAsync();
}
// DO CUSTOM TIMER STUFF...
}
if (_watch != null)
{
_watch.Stop();
_watch = null;
}
_cancelSource = null;
_pauseSource = null;
}
public void Dispose()
{
if (IsRunning)
{
_cancelSource.Cancel();
}
}
}
Can anyone please take a look and provide me some pointers on whether I'm doing this correctly?
UPDATE
I have tried modifying my code per Noseratio's comments below, but I still cannot figure out the syntax. Every attempt to pass the ExecuteAsync() method to either TaskFactory.StartNew or Task.Run results in a compilation error like the following:
"The call is ambiguous between the following methods or properties: TaskFactory.StartNew(Action, CancellationToken...) and TaskFactory.StartNew<Task>(Func<Task>, CancellationToken...)".
Finally, is there a way to specify the LongRunning TaskCreationOption without having to provide a TaskScheduler?
async **Task** ExecuteAsync()
{
while (!_cancelSource.IsCancellationRequested)
{
if (_pauseSource != null && _pauseSource.IsPaused)
{
await _pauseSource.Token.WaitWhilePausedAsync();
}
//...
}
}
public void Start()
{
//_task = Task.Factory.StartNew(ExecuteAsync, _cancelSource.Token, TaskCreationOptions.LongRunning, null);
//_task = Task.Factory.StartNew(ExecuteAsync, _cancelSource.Token);
//_task = Task.Run(ExecuteAsync, _cancelSource.Token);
}
UPDATE 2
I think I've narrowed this down, but still not sure about the correct syntax. Would this be the right way to create the task so that the consumer / calling code continues on, with the task spinning-up and starting on a new asynchronous thread?
_task = Task.Run(async () => await ExecuteAsync, _cancelSource.Token);
//**OR**
_task = Task.Factory.StartNew(async () => await ExecuteAsync, _cancelSource.Token, TaskCreationOptions.LongRunning, TaskScheduler.Default);
Here are some points:
async void methods are only good for asynchronous event handlers (more info). Your async void ExecuteAsync() returns instantly (as soon as the code flow reaches await _pauseSource inside it). Essentially, your _task is in the completed state after that, while the rest of ExecuteAsync will be executed unobserved (because it's void). It may not even continue executing at all, depending on when your main thread (and thus, the process) terminates.
Given that, you should make it async Task ExecuteAsync(), and use Task.Run or Task.Factory.StartNew instead of new Task to start it. Because you want your task's action method be async, you'd be dealing with nested tasks here, i.e. Task<Task>, which Task.Run would automatically unwrap for you. More info can be found here and here.
PauseTokenSource takes the following approach (by design, AFAIU): the consumer side of the code (the one which calls Pause) actually only requests a pause, but doesn't synchronize on it. It will continue executing after Pause, even though the producer side may not have reached the awaiting state yet, i.e. await _pauseSource.Token.WaitWhilePausedAsync(). This may be ok for your app logic, but you should be aware of it. More info here.
[UPDATE] Below is the correct syntax for using Factory.StartNew. Note Task<Task> and task.Unwrap. Also note _task.Wait() in Stop, it's there to make sure the task has completed when Stop returns (in a way similar to Thread.Join). Also, TaskScheduler.Default is used to instruct Factory.StartNew to use the thread pool scheduler. This is important if your create your HighPrecisionTimer object from inside another task, which in turn was created on a thread with non-default synchronization context, e.g. a UI thread (more info here and here).
using System;
using System.Threading;
using System.Threading.Tasks;
namespace ConsoleApplication
{
public class HighPrecisionTimer
{
Task _task;
CancellationTokenSource _cancelSource;
public void Start()
{
_cancelSource = new CancellationTokenSource();
Task<Task> task = Task.Factory.StartNew(
function: ExecuteAsync,
cancellationToken: _cancelSource.Token,
creationOptions: TaskCreationOptions.LongRunning,
scheduler: TaskScheduler.Default);
_task = task.Unwrap();
}
public void Stop()
{
_cancelSource.Cancel(); // request the cancellation
_task.Wait(); // wait for the task to complete
}
async Task ExecuteAsync()
{
Console.WriteLine("Enter ExecuteAsync");
while (!_cancelSource.IsCancellationRequested)
{
await Task.Delay(42); // for testing
// DO CUSTOM TIMER STUFF...
}
Console.WriteLine("Exit ExecuteAsync");
}
}
class Program
{
public static void Main()
{
var highPrecisionTimer = new HighPrecisionTimer();
Console.WriteLine("Start timer");
highPrecisionTimer.Start();
Thread.Sleep(2000);
Console.WriteLine("Stop timer");
highPrecisionTimer.Stop();
Console.WriteLine("Press Enter to exit...");
Console.ReadLine();
}
}
}
I'm adding code for running a long-running task (infinite, with cancellation) with internal sub-tasks:
Task StartLoop(CancellationToken cancellationToken)
{
return Task.Factory.StartNew(async () => {
while (true)
{
if (cancellationToken.IsCancellationRequested)
break;
await _taskRunner.Handle(cancellationToken);
await Task.Delay(TimeSpan.FromMilliseconds(100), cancellationToken);
}
},
cancellationToken,
TaskCreationOptions.LongRunning,
TaskScheduler.Default).Unwrap(); // Unwrap the nested Task<Task> so the returned Task completes when the loop ends
}
