We are creating a shared WCF channel to use with an async operation:
var channelFactory = new ChannelFactory<IWcfService>(new NetTcpBinding {TransferMode = TransferMode.Buffered});
channelFactory.Endpoint.Behaviors.Add(new DispatcherSynchronizationBehavior(true, 25));
var channel = channelFactory.CreateChannel(new EndpointAddress(new Uri("net.tcp://localhost:80/Service").AbsoluteUri + "/Test"));
This calls the following service:
[ServiceContract]
public interface IWcfService
{
[OperationContract]
Task<MyClass> DoSomethingAsync();
}
[ServiceBehavior(ConcurrencyMode = ConcurrencyMode.Multiple, InstanceContextMode = InstanceContextMode.PerCall)]
public class WcfServiceImpl : IWcfService
{
public Task<MyClass> DoSomethingAsync()
{
Thread.Sleep(4000);
return Task.FromResult(new MyClass());
}
}
[Serializable]
public class MyClass
{
public string SomeString { get; set; }
public MyClass Related { get; set; }
public int[] Numbers { get; set; }
}
If we start 3 requests at once and simulate a long running task on the response:
using ((IDisposable)channel)
{
var task1 = Task.Run(async () => await DoStuffAsync(channel));
var task2 = Task.Run(async () => await DoStuffAsync(channel));
var task3 = Task.Run(async () => await DoStuffAsync(channel));
Task.WaitAll(task1, task2, task3);
}
}
public static async Task DoStuffAsync(IWcfService channel)
{
await channel.DoSomethingAsync();
Console.WriteLine("Response");
// Simulate long running CPU bound operation
Thread.Sleep(5000);
Console.WriteLine("Wait completed");
}
All 3 requests reach the server concurrently, and the server responds to all 3 at the same time. However, once the responses reach the client, it processes them one at a time:
Response
// 5 second delay
Wait completed
// Instant
Response
// 5 second delay
Wait completed
// Instant
Response
The responses resume on different threads, but only one runs at a time.
If we use streamed transfer instead of buffered we get the expected behaviour: the client processes all 3 responses concurrently.
We have tried setting the max buffer size, using DispatcherSynchronizationBehavior, different concurrency modes, toggling sessions, ConfigureAwait(false) and calling channel.Open() explicitly.
There seems to be no way to get proper concurrent responses on a shared session.
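For reference, here is roughly what two of those attempts looked like (the buffer sizes are illustrative values, not the ones from our real configuration):
var channelFactory = new ChannelFactory<IWcfService>(new NetTcpBinding
{
    TransferMode = TransferMode.Buffered,
    MaxBufferSize = 10 * 1024 * 1024,            // tried raising the buffer sizes
    MaxReceivedMessageSize = 10 * 1024 * 1024
});

public static async Task DoStuffAsync(IWcfService channel)
{
    await channel.DoSomethingAsync().ConfigureAwait(false); // tried ConfigureAwait(false) on the call
    Console.WriteLine("Response");
    Thread.Sleep(5000);
    Console.WriteLine("Wait completed");
}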
Edit
I have added an image of what I believe is happening. This only occurs in buffered mode; in streamed mode the main thread does not block.
#Underscore
I was trying to solve the exact same problem recently. Although I wasn't able to identify exactly why TransferMode.Buffered causes what seems to be a global lock on the WCF channel until the thread that was using it gets released, I found this similar issue: deadlock after awaiting. They suggest a workaround, which is to add RunContinuationsAsynchronously() to your awaits, i.e. await channel.DoSomethingAsync().RunContinuationsAsynchronously(), where RunContinuationsAsynchronously() is:
public static class TaskExtensions
{
    public static Task<T> RunContinuationsAsynchronously<T>(this Task<T> task)
    {
        // TaskCreationOptions.RunContinuationsAsynchronously (.NET 4.6+) keeps awaiters of tcs.Task
        // from running inline on the thread that completes the WCF call.
        var tcs = new TaskCompletionSource<T>(TaskCreationOptions.RunContinuationsAsynchronously);
        task.ContinueWith(t =>
        {
            if (t.IsFaulted)
            {
                if (t.Exception != null) tcs.SetException(t.Exception.InnerExceptions);
            }
            else if (t.IsCanceled)
            {
                tcs.SetCanceled();
            }
            else
            {
                tcs.SetResult(t.Result);
            }
        }, TaskScheduler.Default); // hand the completion off to the thread pool
        return tcs.Task;
    }

    public static Task RunContinuationsAsynchronously(this Task task)
    {
        var tcs = new TaskCompletionSource<object>(TaskCreationOptions.RunContinuationsAsynchronously);
        task.ContinueWith(t =>
        {
            if (t.IsFaulted)
            {
                if (t.Exception != null) tcs.SetException(t.Exception.InnerExceptions);
            }
            else if (t.IsCanceled)
            {
                tcs.SetCanceled();
            }
            else
            {
                tcs.SetResult(null);
            }
        }, TaskScheduler.Default);
        return tcs.Task;
    }
}
This separates the WCF completion callback from your continuation. Apparently awaiting Task.Yield() right after the call works too.
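For completeness, here is a minimal sketch of the Task.Yield() variant, applied to the DoStuffAsync method from the question (this is my illustration, not code from the linked issue):
public static async Task DoStuffAsync(IWcfService channel)
{
    await channel.DoSomethingAsync();
    await Task.Yield(); // pushes the rest of the method onto the scheduler instead of WCF's completion thread
    Console.WriteLine("Response");
    Thread.Sleep(5000); // simulate long running CPU bound operation
    Console.WriteLine("Wait completed");
}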
It would be nice to actually understand why this is happening though.
Related
I have a console application in which I need to retrieve some data from 4 different sites. I placed each HTTP request in a task and I wait for them all to complete.
It was working when I only had to get data from 2 sites, but then I needed to add other sources of data, and with 3 or more requests Task.WaitAll() hangs.
Below is my code.
The reason I ended up using Task.WaitAll() was because I need to stop and prevent the console application from exiting - i.e. I need to perform other tasks only after all the HTTP requests come back with data.
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Net;
using System.Text;
using System.Threading.Tasks;
namespace ConsoleApp1
{
class Program
{
static Task[] tasks = new Task[3];
static void Main(string[] args)
{
try
{
Run();
}
catch (System.Exception ex)
{
}
}
public static async void Run()
{
//works when using one or two tasks
tasks[0] = HttpExtensions.GetMyData("http://www.w3.org/TR/PNG/iso_8859-1.txt");
tasks[1] = HttpExtensions.GetMyData("http://www.w3.org/TR/PNG/iso_8859-1.txt");
//fails when add 3 or more task
tasks[2] = HttpExtensions.GetMyData("http://www.w3.org/TR/PNG/iso_8859-1.txt");
//tasks[3] = HttpExtensions.GetMyData("http://www.w3.org/TR/PNG/iso_8859-1.txt");
Task.WaitAll(tasks);
var result4 = ((Task<Stream>)tasks[2]).Result;
}
}
public static class HttpExtensions
{
public static Stopwatch sw;
public static long http_ticks = 0;
public static Task<HttpWebResponse> GetResponseAsync(this HttpWebRequest request)
{
var taskComplete = new TaskCompletionSource<HttpWebResponse>();
request.BeginGetResponse(asyncResponse =>
{
try
{
HttpWebRequest responseRequest = (HttpWebRequest)asyncResponse.AsyncState;
HttpWebResponse someResponse = (HttpWebResponse)responseRequest.EndGetResponse(asyncResponse);
taskComplete.TrySetResult(someResponse);
}
catch (WebException webExc)
{
HttpWebResponse failedResponse = (HttpWebResponse)webExc.Response;
taskComplete.TrySetResult(failedResponse);
}
}, request);
return taskComplete.Task;
}
public static async Task<Stream> GetMyData(string urlToCall)
{
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(urlToCall);
request.Method = HttpMethod.Get;
HttpWebResponse response = (HttpWebResponse)await request.GetResponseAsync();
//using (var sr = new StreamReader(response.GetResponseStream()))
//{
return response.GetResponseStream();
//}
}
}
public static class HttpMethod
{
public static string Head { get { return "HEAD"; } }
public static string Post { get { return "POST"; } }
public static string Put { get { return "PUT"; } }
public static string Get { get { return "GET"; } }
public static string Delete { get { return "DELETE"; } }
public static string Trace { get { return "TRACE"; } }
public static string Options { get { return "OPTIONS"; } }
public static string Connect { get { return "CONNECT"; } }
public static string Patch { get { return "PATCH"; } }
}
}
There are a number of concerns.
First, as I mentioned in the comments above, by not returning a Task you are more or less hanging your application since it can't tell when the Task is completed.
However, once you change the Run() method to return a task, you need to invoke it via a Task.Run call in your Main method.
Second, you are over-complicating your code by using HttpWebRequest and a hand-rolled GetResponseAsync. Switch to HttpClient and take advantage of its natural async/await API.
Third, you aren't actually awaiting anything in your Run() method, so changing it to return a Task accomplishes little by itself; without an await, the body runs synchronously (no pun intended). Update your method to await a result.
Finally, WaitAll blocks the thread, which may not be what you want. You can use WhenAll instead and await that call, allowing your application to release the thread while your tasks run.
Below is a complete, working example of my recommended modifications, simplified to show a working program. The Main method recommendation is taken from https://social.msdn.microsoft.com/Forums/vstudio/en-US/fe9acdfc-66cd-4b43-9460-a8053ca51885/using-new-asyncawait-in-console-app?forum=netfxbcl
class Program
{
static Task[] tasks = new Task[3];
static HttpClient _client = new HttpClient();
static void Main(string[] args)
{
Console.WriteLine("Main start");
Task t = Run();
t.ContinueWith((str) =>
{
Console.WriteLine(str.Status.ToString());
Console.WriteLine("Main end");
});
t.Wait();
}
public static async Task Run()
{
tasks[0] = GetMyData("http://www.w3.org/TR/PNG/iso_8859-1.txt");
tasks[1] = GetMyData("http://www.w3.org/TR/PNG/iso_8859-1.txt");
tasks[2] = GetMyData("http://www.w3.org/TR/PNG/iso_8859-1.txt");
await Task.WhenAll(tasks);
var result4 = (await (Task<Stream>)tasks[2]);
}
public static async Task<Stream> GetMyData(string urlToCall)
{
return await _client.GetStreamAsync(urlToCall);
}
}
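As a side note, if you are on C# 7.1 or later you can avoid the ContinueWith/Wait dance entirely, because Main itself can be async:
// Requires C# 7.1+ (async Main)
static async Task Main(string[] args)
{
    Console.WriteLine("Main start");
    await Run();
    Console.WriteLine("Main end");
}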
I think the issue is more one of understanding Task and async/await; I may be wrong, so apologies up front.
A Task represents a piece of work that is typically scheduled onto a thread-pool thread (it is not itself a thread). Task<T> exposes a Result of type T.
You can create a Task, then Start it, then Wait on it. (Never a good idea to start and then immediately wait on a task, but for the sake of understanding...)
var task = new Task(() => DoWork());
task.Start();
task.Wait();
The task will perform the DoWork() method in a new thread.
The calling thread will BLOCK at task.Wait();
You can also give a Task a ContinueWith continuation that performs the remaining work once the task completes (by default on a scheduler thread, not necessarily the calling thread).
var task = new Task(() => DoWorkOnNewThread());
task.ContinueWith(t => MainThreadWork()); // the continuation receives the completed antecedent task
task.Start(); //Notice no more task.Wait();
So, if you're following that little bit then you can sort of use async await correctly.
The async keyword tells the compiler to wrap all the remaining code AFTER an await into a continuation that runs when the awaited operation (anything exposing a GetAwaiter()) completes. This is important because until you actually create a task (preferably one that has been started) and return it, there is nothing to await.
private Task DoWorkAsync()
{
var task = new Task(() => DoWork());
task.Start();
return task;
}
private async void Method()
{
//Main thread code...
await DoWorkAsync(); //Returns to whoever called Method()
//More main thread code to be QUEUED to run AFTER DoWorkAsync is complete.
//This portion of code, when compiled, is essentially wrapped in the ContinueWith(...
}
So if you're still following along, here's the kicker: you stay on the same thread until you await something that exposes a GetAwaiter(), such as a Task. If that Task is never started, then technically you'll await it forever. Here are some comments showing the thread transitions.
private Task DoWorkAsync()
{
Debug.WriteLine("Still on main thread")
var task = new Task(() =>
{
Debug.WriteLine("On background thread");
});
task.Start(); //On main thread.
return task; //On main thread.
}
private async void Method()
{
Debug.WriteLine("On main thread");
await DoWorkAsync(); //returns to caller after DoWorkAsync returns Task
Debug.WriteLine("Back on main thread"); //Works here after the task DoWorkAsync returned is complete
}
An easier way to return a running task is to return Task.Run(() => DoWork()); if you look at the return value of Run, it is a Task that has already been started.
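In other words, the earlier DoWorkAsync could be reduced to this (same behaviour, just shorter):
private Task DoWorkAsync()
{
    // Task.Run queues DoWork on the thread pool and returns an already-started task.
    return Task.Run(() => DoWork());
}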
Forgive me if this isn't what you wanted but I felt like there is more of a confusion about using async await correctly than there is confusion about your code. I may be wrong but I felt that if you could understand more about the Task itself and how async await works you would see your issue. If this isn't what you're looking for I'll delete the answer.
I need to implement a library to make requests to the vk.com API. The problem is that the API supports only 3 requests per second. I would like the API to be asynchronous.
Important: API should support safe accessing from multiple threads.
My idea is to implement a class called a throttler that allows no more than 3 requests per second and delays any further requests.
The interface is next:
public interface IThrottler : IDisposable
{
Task<TResult> Throttle<TResult>(Func<Task<TResult>> task);
}
The usage looks like this:
var audio = await throttler.Throttle(() => api.MyAudio());
var messages = await throttler.Throttle(() => api.ReadMessages());
var audioLyrics = await throttler.Throttle(() => api.AudioLyrics(audioId));
/// Here should be delay because 3 requests executed
var photo = await throttler.Throttle(() => api.MyPhoto());
How can I implement the throttler?
Currently I have implemented it as a queue that is processed by a background thread.
public Task<TResult> Throttle<TResult>(Func<Task<TResult>> task)
{
/// TaskRequest has method Run() to run task
/// TaskRequest uses TaskCompletionSource to provide new task
/// which is resolved when queue processed til this element.
var request = new TaskRequest<TResult>(task);
requestQueue.Enqueue(request);
return request.ResultTask;
}
This is a shortened version of the background-thread loop that processes the queue:
private void ProcessQueue(object state)
{
while (true)
{
IRequest request;
while (requestQueue.TryDequeue(out request))
{
/// Delay method calculates actual delay value and calls Thread.Sleep()
Delay();
request.Run();
}
}
}
Is it possible to implement this without a background thread?
We'll start out with a solution to a simpler problem: creating a queue that processes up to N tasks concurrently (rather than throttling to N tasks started per second), and then build on that:
public class TaskQueue
{
private SemaphoreSlim semaphore;
public TaskQueue()
{
semaphore = new SemaphoreSlim(1);
}
public TaskQueue(int concurrentRequests)
{
semaphore = new SemaphoreSlim(concurrentRequests);
}
public async Task<T> Enqueue<T>(Func<Task<T>> taskGenerator)
{
await semaphore.WaitAsync();
try
{
return await taskGenerator();
}
finally
{
semaphore.Release();
}
}
public async Task Enqueue(Func<Task> taskGenerator)
{
await semaphore.WaitAsync();
try
{
await taskGenerator();
}
finally
{
semaphore.Release();
}
}
}
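Used on its own, TaskQueue looks roughly like this (urls and httpClient are placeholders, not part of the original answer):
var queue = new TaskQueue(3); // at most 3 operations in flight at once

// Each Enqueue waits for a free slot, runs the generator, and releases the slot when the task finishes.
var results = await Task.WhenAll(
    urls.Select(url => queue.Enqueue(() => httpClient.GetStringAsync(url))));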
We'll also use the following helper methods to match the result of a Task onto a TaskCompletionSource:
public static void Match<T>(this TaskCompletionSource<T> tcs, Task<T> task)
{
task.ContinueWith(t =>
{
switch (t.Status)
{
case TaskStatus.Canceled:
tcs.SetCanceled();
break;
case TaskStatus.Faulted:
tcs.SetException(t.Exception.InnerExceptions);
break;
case TaskStatus.RanToCompletion:
tcs.SetResult(t.Result);
break;
}
});
}
public static void Match<T>(this TaskCompletionSource<T> tcs, Task task)
{
    // GetResult() rethrows any fault or cancellation, so failures still propagate to the TCS
    // instead of being swallowed and replaced with default(T).
    Match(tcs, task.ContinueWith(t => { t.GetAwaiter().GetResult(); return default(T); }));
}
Now for our actual solution: each time we need to perform a throttled operation, we create a TaskCompletionSource, then go into our TaskQueue and add an item that starts the task, matches the TCS to its result, doesn't await it, and then delays the task queue for 1 second. The task queue will then not allow an operation to start until there are no longer N operations started in the past second, while the result of the operation itself is the same as that of the created Task:
public class Throttler
{
private TaskQueue queue;
public Throttler(int requestsPerSecond)
{
queue = new TaskQueue(requestsPerSecond);
}
public Task<T> Enqueue<T>(Func<Task<T>> taskGenerator)
{
TaskCompletionSource<T> tcs = new TaskCompletionSource<T>();
var unused = queue.Enqueue(() =>
{
tcs.Match(taskGenerator());
return Task.Delay(TimeSpan.FromSeconds(1));
});
return tcs.Task;
}
public Task Enqueue(Func<Task> taskGenerator)
{
TaskCompletionSource<bool> tcs = new TaskCompletionSource<bool>();
var unused = queue.Enqueue(() =>
{
tcs.Match(taskGenerator());
return Task.Delay(TimeSpan.FromSeconds(1));
});
return tcs.Task;
}
}
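Wired up against the calls from the question, usage would look roughly like this (api and its methods come from the question; the rest mirrors the original usage sample):
var throttler = new Throttler(3); // at most 3 operations started per second

var audio = await throttler.Enqueue(() => api.MyAudio());
var messages = await throttler.Enqueue(() => api.ReadMessages());
var audioLyrics = await throttler.Enqueue(() => api.AudioLyrics(audioId));
// The fourth call is held back until a full second has passed since the first one started.
var photo = await throttler.Enqueue(() => api.MyPhoto());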
I solved a similar problem using a wrapper around SemaphoreSlim. In my scenario, I had some other throttling mechanisms as well, and I needed to make sure that requests didn't hit the external API too often even if request number 1 took longer to reach the API than request number 3. My solution was to use a wrapper around SemaphoreSlim that had to be released by the caller, but the actual SemaphoreSlim would not be released until a set time had passed.
public class TimeGatedSemaphore
{
private readonly SemaphoreSlim semaphore;
public TimeGatedSemaphore(int maxRequest, TimeSpan minimumHoldTime)
{
semaphore = new SemaphoreSlim(maxRequest);
MinimumHoldTime = minimumHoldTime;
}
public TimeSpan MinimumHoldTime { get; }
public async Task<IDisposable> WaitAsync()
{
await semaphore.WaitAsync();
return new InternalReleaser(semaphore, Task.Delay(MinimumHoldTime));
}
private class InternalReleaser : IDisposable
{
private readonly SemaphoreSlim semaphoreToRelease;
private readonly Task notBeforeTask;
public InternalReleaser(SemaphoreSlim semaphoreSlim, Task dependantTask)
{
semaphoreToRelease = semaphoreSlim;
notBeforeTask = dependantTask;
}
public void Dispose()
{
notBeforeTask.ContinueWith(_ => semaphoreToRelease.Release());
}
}
}
Example usage:
private TimeGatedSemaphore requestThrottler = new TimeGatedSemaphore(3, TimeSpan.FromSeconds(1));
public async Task<T> MyRequestSenderHelper(string endpoint)
{
using (await requestThrottler.WaitAsync())
return await SendRequestToAPI(endpoint);
}
Here is one solution that uses a Stopwatch:
public class Throttler : IThrottler
{
private readonly Stopwatch m_Stopwatch;
private int m_NumberOfRequestsInLastSecond;
private readonly int m_MaxNumberOfRequestsPerSecond;
public Throttler(int max_number_of_requests_per_second)
{
m_MaxNumberOfRequestsPerSecond = max_number_of_requests_per_second;
m_Stopwatch = Stopwatch.StartNew();
}
public async Task<TResult> Throttle<TResult>(Func<Task<TResult>> task)
{
var elapsed = m_Stopwatch.Elapsed;
if (elapsed > TimeSpan.FromSeconds(1))
{
m_NumberOfRequestsInLastSecond = 1;
m_Stopwatch.Restart();
return await task();
}
if (m_NumberOfRequestsInLastSecond >= m_MaxNumberOfRequestsPerSecond)
{
TimeSpan time_to_wait = TimeSpan.FromSeconds(1) - elapsed;
await Task.Delay(time_to_wait);
m_NumberOfRequestsInLastSecond = 1;
m_Stopwatch.Restart();
return await task();
}
m_NumberOfRequestsInLastSecond++;
return await task();
}
}
Here is how this code can be tested:
class Program
{
static void Main(string[] args)
{
DoIt();
Console.ReadLine();
}
static async Task DoIt()
{
Func<Task<int>> func = async () =>
{
await Task.Delay(100);
return 1;
};
Throttler throttler = new Throttler(3);
for (int i = 0; i < 10; i++)
{
var result = await throttler.Throttle(func);
Console.WriteLine(DateTime.Now);
}
}
}
You can use this as a generic helper:
public class TaskThrottle
{
    private readonly SemaphoreSlim _semaphore;

    public TaskThrottle(int maxTasksToRunInParallel)
    {
        _semaphore = new SemaphoreSlim(maxTasksToRunInParallel);
    }

    public void TaskThrottler<T>(IEnumerable<Task<T>> tasks, int timeoutInMilliseconds, CancellationToken cancellationToken = default(CancellationToken)) where T : class
    {
        // Materialize the tasks as a List so we can use ForEach
        var taskList = tasks as List<Task<T>> ?? tasks.ToList();
        var postTasks = new List<Task<int>>();

        // When each task completes, release its slot in the semaphore
        taskList.ForEach(x =>
        {
            postTasks.Add(x.ContinueWith(y => _semaphore.Release(), cancellationToken));
        });

        taskList.ForEach(x =>
        {
            // Wait for an open slot before starting the next task
            _semaphore.Wait(timeoutInMilliseconds, cancellationToken);
            cancellationToken.ThrowIfCancellationRequested();
            x.Start();
        });

        Task.WaitAll(taskList.ToArray(), cancellationToken);
    }
}
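Note that this variant expects cold (unstarted) tasks, since it calls x.Start() itself. A usage sketch might look like this (urls and GetData are placeholders):
var throttle = new TaskThrottle(3);

// Tasks are created cold (new Task<T>) so the throttler decides when each one starts.
var tasks = urls.Select(url => new Task<string>(() => GetData(url))).ToList();

throttle.TaskThrottler(tasks, timeoutInMilliseconds: 30000);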
Edit: this solution works, but use it only if it is OK to process all requests serially (on one thread). Otherwise use the solution accepted as the answer.
Well, thanks go to Best way in .NET to manage queue of tasks on a separate (single) thread.
My question is almost a duplicate, except for the delay added before execution, which is actually simple.
The main helper here is the SemaphoreSlim class, which allows you to restrict the degree of parallelism.
So, first create a semaphore:
// The semaphore allows only 1 operation to run at a time.
private readonly SemaphoreSlim semaphore = new SemaphoreSlim(1, 1);
And the final version of Throttle looks like this:
// delaySource is the TaskDelaySource defined below; 3 tasks per second, matching the API limit.
private readonly TaskDelaySource delaySource = new TaskDelaySource(3, TimeSpan.FromSeconds(1));

public async Task<TResult> Throttle<TResult>(Func<Task<TResult>> task)
{
    await semaphore.WaitAsync();
    try
    {
        await delaySource.Delay();
        return await task();
    }
    finally
    {
        semaphore.Release();
    }
}
Delay source is also pretty simple:
private class TaskDelaySource
{
private readonly int maxTasks;
private readonly TimeSpan inInterval;
private readonly Queue<long> ticks = new Queue<long>();
public TaskDelaySource(int maxTasks, TimeSpan inInterval)
{
this.maxTasks = maxTasks;
this.inInterval = inInterval;
}
public async Task Delay()
{
// We will measure time of last maxTasks tasks.
while (ticks.Count > maxTasks)
ticks.Dequeue();
if (ticks.Any())
{
var now = DateTime.UtcNow.Ticks;
var lastTick = ticks.First();
// Calculate interval between last maxTasks task and current time
var intervalSinceLastTask = TimeSpan.FromTicks(now - lastTick);
if (intervalSinceLastTask < inInterval)
await Task.Delay((int)(inInterval - intervalSinceLastTask).TotalMilliseconds);
}
ticks.Enqueue(DateTime.UtcNow.Ticks);
}
}
After spending a very frustrating and unproductive day on this, I'm posting here in search of help.
I am using a third-party library that initiates a network connection in an unknown manner (I do know however it's a managed wrapper for an unmanaged lib). It lets you know about the status of the connection by invoking an event StatusChanged(status).
Since obviously invoking the network is costly and I may not need it for my Service, I inject an AsyncLazy<Connection> which is then invoked if necessary. The Service is accessed by ParallelForEachAsync which is an extension I made to process Tasks concurrently, based on this post.
If accessed sequentially, all is well. Any concurrency, even 2 parallel tasks will result in a deadlock 90% of the time. I know it's definitely related to how the third-party lib interacts with my code because a) I am not able to reproduce the effect using the same structure but without invoking it and b) the event StatusChanged(Connecting) is received fine, at which point I assume the network operation is started and I never get a callback for StatusChanged(Connected).
Here's a as-faithful-as-possible repro of the code structure which doesn't reproduce the deadlock unfortunately.
Any ideas on how to go about resolving this?
class Program
{
static void Main(string[] args)
{
AsyncContext.Run(() => MainAsync(args));
}
static async Task MainAsync(string[] args)
{
var lazy = new AsyncLazy<Connection>(() => ConnectionFactory.Create());
var service = new Service(lazy);
await Enumerable.Range(0, 100)
.ParallelForEachAsync(10, async i =>
{
await service.DoWork();
Console.WriteLine("did some work");
}, CancellationToken.None);
}
}
class ConnectionFactory
{
public static Task<Connection> Create()
{
var tcs = new TaskCompletionSource<Connection>();
var session = new Session();
session.Connected += (sender, args) =>
{
Console.WriteLine("connected");
tcs.SetResult(new Connection());
};
session.Connect();
return tcs.Task;
}
}
class Connection
{
public async Task DoSomethinElse()
{
await Task.Delay(1000);
}
}
class Session
{
public event EventHandler Connected;
public void Connect()
{
Console.WriteLine("Simulate network operation with unknown scheduling");
Task.Delay(100).Wait();
Connected(this, EventArgs.Empty);
}
}
class Service
{
private static Random r = new Random();
private readonly AsyncLazy<Connection> lazy;
public Service(AsyncLazy<Connection> lazy)
{
this.lazy = lazy;
}
public async Task DoWork()
{
Console.WriteLine("Trying to do some work, will connect");
await Task.Delay(r.Next(0, 100));
var connection = await lazy;
await connection.DoSomethinElse();
}
}
public static class AsyncExtensions
{
public static async Task<AsyncParallelLoopResult> ParallelForEachAsync<T>(
this IEnumerable<T> source,
int degreeOfParallelism,
Func<T, Task> body,
CancellationToken cancellationToken)
{
var partitions = Partitioner.Create(source).GetPartitions(degreeOfParallelism);
bool wasBroken = false;
var tasks =
from partition in partitions
select Task.Run(async () =>
{
using (partition)
{
while (partition.MoveNext())
{
if (cancellationToken.IsCancellationRequested)
{
Volatile.Write(ref wasBroken, true);
break;
}
await body(partition.Current);
}
}
});
await Task.WhenAll(tasks)
.ConfigureAwait(false);
return new AsyncParallelLoopResult(!Volatile.Read(ref wasBroken)); // completed only if the loop was not broken
}
}
public class AsyncParallelLoopResult
{
public bool IsCompleted { get; private set; }
internal AsyncParallelLoopResult(bool isCompleted)
{
IsCompleted = isCompleted;
}
}
EDIT
I think I understand why it's happening, but I'm not sure how to solve it. While the context is waiting for DoWork, DoWork is waiting for the lazy connection.
This ugly hack seems to solve it:
Connection WaitForConnection()
{
connectionLazy.Start();
var awaiter = connectionLazy.GetAwaiter();
while (!awaiter.IsCompleted)
Thread.Sleep(50);
return awaiter.GetResult();
}
Any more elegant solutions?
I suspect that the 3rd-party library is requiring some kind of STA pumping. This is fairly common with old-style asynchronous code.
I have a type AsyncContextThread that you can try, passing true to the constructor to enable manual STA pumping. AsyncContextThread is just like AsyncContext except it runs the context within a new thread (an STA thread in this case).
static void Main(string[] args)
{
using (var thread = new AsyncContextThread(true))
{
thread.Factory.Run(() => MainAsync(args)).Wait();
}
}
or
static void Main(string[] args)
{
    AsyncContext.Run(async () =>
    {
        using (var thread = new AsyncContextThread(true))
        {
            await thread.Factory.Run(() => MainAsync(args));
        }
    });
}
Note that AsyncContextThread will not work in all STA scenarios. I have run into issues when doing (some rather twisted) COM interop that required a true UI thread (WPF or WinForms thread); for some reason the STA pumping wasn't sufficient for those COM objects.
I have a "High-Precision" timer class that I need to be able to be start, stop & pause / resume. To do this, I'm tying together a couple of different examples I found on the internet, but I'm not sure if I'm using Tasks with asnyc / await correctly.
Here is my relevant code:
//based on http://haukcode.wordpress.com/2013/01/29/high-precision-timer-in-netc/
public class HighPrecisionTimer : IDisposable
{
Task _task;
CancellationTokenSource _cancelSource;
//based on http://blogs.msdn.com/b/pfxteam/archive/2013/01/13/cooperatively-pausing-async-methods.aspx
PauseTokenSource _pauseSource;
Stopwatch _watch;
Stopwatch Watch { get { return _watch ?? (_watch = Stopwatch.StartNew()); } }
public bool IsPaused
{
get { return _pauseSource != null && _pauseSource.IsPaused; }
private set
{
if (value)
{
_pauseSource = new PauseTokenSource();
}
else
{
_pauseSource.IsPaused = false;
}
}
}
public bool IsRunning { get { return !IsPaused && _task != null && _task.Status == TaskStatus.Running; } }
public void Start()
{
if (IsPaused)
{
IsPaused = false;
}
else if (!IsRunning)
{
_cancelSource = new CancellationTokenSource();
_task = new Task(ExecuteAsync, _cancelSource.Token, TaskCreationOptions.LongRunning);
_task.Start();
}
}
public void Stop()
{
if (_cancelSource != null)
{
_cancelSource.Cancel();
}
}
public void Pause()
{
if (!IsPaused)
{
if (_watch != null)
{
_watch.Stop();
}
}
IsPaused = !IsPaused;
}
async void ExecuteAsync()
{
while (!_cancelSource.IsCancellationRequested)
{
if (_pauseSource != null && _pauseSource.IsPaused)
{
await _pauseSource.Token.WaitWhilePausedAsync();
}
// DO CUSTOM TIMER STUFF...
}
if (_watch != null)
{
_watch.Stop();
_watch = null;
}
_cancelSource = null;
_pauseSource = null;
}
public void Dispose()
{
if (IsRunning)
{
_cancelSource.Cancel();
}
}
}
Can anyone please take a look and provide me some pointers on whether I'm doing this correctly?
UPDATE
I have tried modifying my code per Noseratio's comments below, but I still cannot figure out the syntax. Every attempt to pass the ExecuteAsync() method to either TaskFactory.StartNew or Task.Run results in a compilation error like the following:
"The call is ambiguous between the following methods or properties: TaskFactory.StartNew(Action, CancellationToken...) and TaskFactory.StartNew<Task>(Func<Task>, CancellationToken...)".
Finally, is there a way to specify the LongRunning TaskCreationOption without having to provide a TaskScheduler?
async Task ExecuteAsync() // note: now returns Task instead of void
{
while (!_cancelSource.IsCancellationRequested)
{
if (_pauseSource != null && _pauseSource.IsPaused)
{
await _pauseSource.Token.WaitWhilePausedAsync();
}
//...
}
}
public void Start()
{
//_task = Task.Factory.StartNew(ExecuteAsync, _cancelSource.Token, TaskCreationOptions.LongRunning, null);
//_task = Task.Factory.StartNew(ExecuteAsync, _cancelSource.Token);
//_task = Task.Run(ExecuteAsync, _cancelSource.Token);
}
UPDATE 2
I think I've narrowed this down, but I'm still not sure about the correct syntax. Would this be the right way to create the task so that the consumer / calling code continues on, with the task spinning up and starting on a new asynchronous thread?
_task = Task.Run(async () => await ExecuteAsync, _cancelSource.Token);
//**OR**
_task = Task.Factory.StartNew(async () => await ExecuteAsync, _cancelSource.Token, TaskCreationOptions.LongRunning, TaskScheduler.Default);
Here are some points:
async void methods are only good for asynchronous event handlers (more info). Your async void ExecuteAsync() returns instantly (as soon as the code flow reaches await _pauseSource inside it). Essentially, your _task is in the completed state after that, while the rest of ExecuteAsync will be executed unobserved (because it's void). It may even not continue executing at all, depending on when your main thread (and thus, the process) terminates.
Given that, you should make it async Task ExecuteAsync(), and use Task.Run or Task.Factory.StartNew instead of new Task to start it. Because you want your task's action method to be async, you'd be dealing with nested tasks here, i.e. Task<Task>, which Task.Run would automatically unwrap for you. More info can be found here and here.
PauseTokenSource takes the following approach (by design, AFAIU): the consumer side of the code (the one which calls Pause) actually only requests a pause, but doesn't synchronize on it. It will continue executing after Pause, even though the producer side may not have reached the awaiting state yet, i.e. await _pauseSource.Token.WaitWhilePausedAsync(). This may be ok for your app logic, but you should be aware of it. More info here.
[UPDATE] Below is the correct syntax for using Factory.StartNew. Note Task<Task> and task.Unwrap. Also note _task.Wait() in Stop; it's there to make sure the task has completed when Stop returns (in a way similar to Thread.Join). Also, TaskScheduler.Default is used to instruct Factory.StartNew to use the thread pool scheduler. This is important if you create your HighPrecisionTimer object from inside another task, which in turn was created on a thread with a non-default synchronization context, e.g. a UI thread (more info here and here).
using System;
using System.Threading;
using System.Threading.Tasks;
namespace ConsoleApplication
{
public class HighPrecisionTimer
{
Task _task;
CancellationTokenSource _cancelSource;
public void Start()
{
_cancelSource = new CancellationTokenSource();
Task<Task> task = Task.Factory.StartNew(
function: ExecuteAsync,
cancellationToken: _cancelSource.Token,
creationOptions: TaskCreationOptions.LongRunning,
scheduler: TaskScheduler.Default);
_task = task.Unwrap();
}
public void Stop()
{
_cancelSource.Cancel(); // request the cancellation
_task.Wait(); // wait for the task to complete
}
async Task ExecuteAsync()
{
Console.WriteLine("Enter ExecuteAsync");
while (!_cancelSource.IsCancellationRequested)
{
await Task.Delay(42); // for testing
// DO CUSTOM TIMER STUFF...
}
Console.WriteLine("Exit ExecuteAsync");
}
}
class Program
{
public static void Main()
{
var highPrecisionTimer = new HighPrecisionTimer();
Console.WriteLine("Start timer");
highPrecisionTimer.Start();
Thread.Sleep(2000);
Console.WriteLine("Stop timer");
highPrecisionTimer.Stop();
Console.WriteLine("Press Enter to exit...");
Console.ReadLine();
}
}
}
I'm adding code for running a long-running task (infinite, with cancellation) that contains internal sub-tasks:
Task StartLoop(CancellationToken cancellationToken)
{
    return Task.Factory.StartNew(async () =>
    {
        while (true)
        {
            if (cancellationToken.IsCancellationRequested)
                break;

            await _taskRunner.Handle(cancellationToken);
            await Task.Delay(TimeSpan.FromMilliseconds(100), cancellationToken);
        }
    },
    cancellationToken,
    TaskCreationOptions.LongRunning,
    TaskScheduler.Default).Unwrap(); // Unwrap the Task<Task> so the returned Task tracks the whole loop
}
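A minimal usage sketch for this loop (my illustration; _taskRunner is assumed to be constructed elsewhere):
var cts = new CancellationTokenSource();
Task loop = StartLoop(cts.Token);

// ... later, when shutting down:
cts.Cancel();
try { await loop; }
catch (OperationCanceledException) { /* expected if cancellation interrupts Task.Delay */ }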
I am a new writer on SO, please bear with me.
I have a WCF service with a duplex service contract. This service contract has an operation contract that is supposed to do long data processing. I am constrained to limit the number of concurrent data-processing operations to, let's say, a maximum of 3. My problem is that after the data processing I need to get back to the same service instance context, so I call back my initiator endpoint passing the data-processing result. I need to mention that, due to various reasons, I am constrained to TPL Dataflow and WCF duplex.
Here is a demo to what I wrote so far
In a console library I simulate WCF calls
class Program
{
static void Main(string[] args)
{
// simulate service calls
Enumerable.Range(0, 5).ToList().ForEach(x =>
{
new System.Threading.Thread(new ThreadStart(async () =>
{
var service = new Service();
await service.Inc(x);
})).Start();
});
}
}
Here is what suppose to be the WCF service
// service contract
public class Service
{
static TransformBlock<Message<int>, Message<int>> transformBlock;
static Service()
{
transformBlock = new TransformBlock<Message<int>, Message<int>>(x => Inc(x), new ExecutionDataflowBlockOptions
{
MaxDegreeOfParallelism = 3
});
}
static Message<int> Inc(Message<int> input)
{
System.Threading.Thread.Sleep(100);
return new Message<int> { Token = input.Token, Data = input.Data + 1 };
}
// operation contract
public async Task Inc(int id)
{
var token = Guid.NewGuid().ToString();
transformBlock.Post(new Message<int> { Token = token, Data = id });
while (await transformBlock.OutputAvailableAsync())
{
Message<int> message;
if (transformBlock.TryReceive(m => m.Token == token, out message))
{
// do further processing using initiator service instance members
// something like Callback.IncResult(m.Data);
break;
}
}
}
}
public class Message<T>
{
public string Token { get; set; }
public T Data { get; set; }
}
The operation contract doesn't really need to be async, but I needed the OutputAvailableAsync notification.
Is this a good approach or is there a better solution for my scenario?
Thanks in advance.
First, I think you shouldn't use the token the way you do. Unique identifiers are useful when communicating between processes. But when you're inside a single process, just use reference equality.
To actually answer your question, I think the (kind of) busy loop is not a good idea.
A simpler solution for asynchronous throttling would be to use SemaphoreSlim. Something like:
static readonly SemaphoreSlim Semaphore = new SemaphoreSlim(3);
// operation contract
public async Task Inc(int id)
{
await Semaphore.WaitAsync();
try
{
Thread.Sleep(100);
var result = id + 1;
// do further processing using initiator service instance members
// something like Callback.IncResult(result);
}
finally
{
Semaphore.Release();
}
}
If you really want to (or have to?) use dataflow, you can use TaskCompletionSource for synchronization between the operation and the block. The operation method would wait on the Task of the TaskCompletionSource and the block would set it when it finished computation for that message:
private static readonly ActionBlock<Message<int>> Block =
new ActionBlock<Message<int>>(
x => Inc(x),
new ExecutionDataflowBlockOptions
{
MaxDegreeOfParallelism = 3
});
static void Inc(Message<int> input)
{
Thread.Sleep(100);
input.TCS.SetResult(input.Data + 1);
}
// operation contract
public async Task Inc(int id)
{
var tcs = new TaskCompletionSource<int>();
Block.Post(new Message<int> { TCS = tcs, Data = id });
int result = await tcs.Task;
// do further processing using initiator service instance members
// something like Callback.IncResult(result);
}