Reading from multiple WebSockets with async/await - c#

I'm writing a .NET Core Console App that needs to continuously read data from multiple WebSockets. My current approach is to create a new Task (via Task.Run) per WebSocket that runs an infinite while loop and blocks until it reads the data from the socket. However, since the data is pushed at a rather low frequency, the threads just block most of the time which seems quite inefficient.
From my understanding, the async/await pattern should be ideal for blocking I/O operations. However, I'm not sure how to apply it for my situation or even if async/await can improve this in any way - especially since it's a Console app.
I've put together a proof of concept (doing an HTTP GET instead of reading from a WebSocket for simplicity). The only way I was able to achieve this was without actually awaiting. Code:
static void Main(string[] args)
{
Console.WriteLine($"ThreadId={ThreadId}: Main");
Task task = Task.Run(() => Process("https://duckduckgo.com", "https://stackoverflow.com/"));
// Do main work.
task.Wait();
}
private static void Process(params string[] urls)
{
Dictionary<string, Task<string>> tasks = urls.ToDictionary(x => x, x => (Task<string>)null);
HttpClient client = new HttpClient();
while (true)
{
foreach (string url in urls)
{
Task<string> task = tasks[url];
if (task == null || task.IsCompleted)
{
if (task != null)
{
string result = task.Result;
Console.WriteLine($"ThreadId={ThreadId}: Length={result.Length}");
}
tasks[url] = ReadString(client, url);
}
}
Thread.Yield();
}
}
private static async Task<string> ReadString(HttpClient client, string url)
{
var response = await client.GetAsync(url);
Console.WriteLine($"ThreadId={ThreadId}: Url={url}");
return await response.Content.ReadAsStringAsync();
}
private static int ThreadId => Thread.CurrentThread.ManagedThreadId;
This seems to be working, executing on various worker threads from the ThreadPool. However, it definitely doesn't look like typical async/await code, which makes me think there has to be a better way.
Is there a more proper / more elegant way of doing this?

You've basically written a version of Task.WhenAny that uses a CPU loop to check for completed tasks rather than... whatever magic the framework method uses behind the scenes.
A more idiomatic version might look like this. (Although it might not - I feel like there should be an easier method of "re-run the completed task" than the reverse dictionary I've used here.)
static void Main(string[] args)
{
Console.WriteLine($"ThreadId={ThreadId}: Main");
// No need for Task.Run here.
var task = Process("https://duckduckgo.com", "https://stackoverflow.com/");
task.Wait();
}
private static async Task Process(params string[] urls)
{
// Set up initial dictionary mapping task (per URL) to the URL used.
HttpClient client = new HttpClient();
var tasks = urls.ToDictionary(u => client.GetAsync(u), u => u);
while (true)
{
// Wait for any task to complete, get its URL and remove it from the current tasks.
var firstCompletedTask = await Task.WhenAny(tasks.Keys);
var firstCompletedUrl = tasks[firstCompletedTask];
tasks.Remove(firstCompletedTask);
// Do work with completed task.
try
{
Console.WriteLine($"ThreadId={ThreadId}: URL={firstCompletedUrl}");
using (var response = await firstCompletedTask)
{
var content = await response.Content.ReadAsStringAsync();
Console.WriteLine($"ThreadId={ThreadId}: Length={content.Length}");
}
}
catch (Exception ex)
{
Console.WriteLine($"ThreadId={ThreadId}: Ex={ex}");
}
// Queue the task again.
tasks.Add(client.GetAsync(firstCompletedUrl), firstCompletedUrl);
}
}
private static int ThreadId => Thread.CurrentThread.ManagedThreadId;

I've accepted Rawling's answer - I believe it is correct for the exact scenario I described. However, with a bit of inverted logic, I ended up with something way simpler - leaving it in case anyone needs something like this:
static void Main(string[] args)
{
string[] urls = { "https://duckduckgo.com", "https://stackoverflow.com/" };
HttpClient client = new HttpClient();
var tasks = urls.Select(async url =>
{
while (true) await ReadString(client, url);
});
Task.WhenAll(tasks).Wait();
}
private static async Task<string> ReadString(HttpClient client, string url)
{
var response = await client.GetAsync(url);
string data = await response.Content.ReadAsStringAsync();
Console.WriteLine($"Fetched data from url={url}. Length={data.Length}");
return data;
}
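Translating this back to the original WebSocket scenario, the same "one async receive loop per connection" shape might look roughly like the sketch below. The endpoints are placeholders, it assumes C# 7.1+ for the async Main, and for brevity it treats each received frame as a complete message (real code should honor EndOfMessage):
using System;
using System.Linq;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

class MultiSocketReader
{
    static async Task Main(string[] args)
    {
        // Placeholder endpoints - substitute the real WebSocket URIs.
        string[] uris = { "wss://example.com/feed1", "wss://example.com/feed2" };

        var receiveLoops = uris.Select(async uri =>
        {
            using (var socket = new ClientWebSocket())
            {
                await socket.ConnectAsync(new Uri(uri), CancellationToken.None);
                var buffer = new byte[8192];
                while (socket.State == WebSocketState.Open)
                {
                    // Awaiting here does not block a thread; the loop resumes when data is pushed.
                    var result = await socket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
                    if (result.MessageType == WebSocketMessageType.Close) break;
                    string message = Encoding.UTF8.GetString(buffer, 0, result.Count);
                    Console.WriteLine($"{uri}: received {message.Length} chars");
                }
            }
        });

        await Task.WhenAll(receiveLoops);
    }
}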

Maybe the better question is: do you really need a thread per socket in this case? You should think of threads as a system-wide resource and take this into consideration when spawning them, especially if you don't really know the number of threads that your application will be using. This is a good read: What's the maximum number of threads in Windows Server 2003?
A few years ago the .NET team introduced asynchronous sockets.
...The client is built with an asynchronous socket, so execution of
the client application is not suspended while the server returns a
response. The application sends a string to the server and then
displays the string returned by the server on the console.
Asynchronous Client Socket Example
There are a lot more examples out there showcasing this approach. While it is a bit more complicated and "low level", it lets you be in control.
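The linked MSDN sample uses the callback-based Begin/End pattern. As a rough, untested sketch of the same idea using the Task-returning Socket methods instead (host and port are placeholders; the awaitable extensions are available on .NET Core and recent .NET Framework versions):
using System;
using System.Net.Sockets;
using System.Threading.Tasks;

class RawSocketReader
{
    // A minimal awaitable receive loop over a raw Socket.
    static async Task ReceiveLoopAsync(string host, int port)
    {
        using (var socket = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp))
        {
            await socket.ConnectAsync(host, port);
            var buffer = new byte[4096];
            while (true)
            {
                int received = await socket.ReceiveAsync(new ArraySegment<byte>(buffer), SocketFlags.None);
                if (received == 0) break; // the remote side closed the connection
                Console.WriteLine($"Received {received} bytes from {host}:{port}");
            }
        }
    }
}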

Can you avoid Task Continuations when using async/await if you want execution to continue immediately

I am working on a protocol and trying to use as much async/await as I can to make it scale well. The protocol will have to support hundreds to thousands of simultaneous connections. Below is a little bit of pseudo code to illustrate my problem.
private static async void DoSomeWork()
{
var protocol = new FooProtocol();
await protocol.Connect("127.0.0.1", 1234);
var i = 0;
while(i != int.MaxValue)
{
i++;
var request = new FooRequest();
request.Payload = "Request Nr " + i;
var task = protocol.Send(request);
_ = task.ContinueWith(async tmp =>
{
var resp = await task;
Console.WriteLine($"Request {resp.SequenceNr} Successful: {(resp.Status == 0)}");
});
}
}
And below is a little pseudo code for the protocol.
public class FooProtocol
{
private int sequenceNr = 0;
private SemaphoreSlim ss = new SemaphoreSlim(20, 20);
public Task<FooResponse> Send(FooRequest fooRequest)
{
var tcs = new TaskCompletionSource<FooResponse>();
ss.Wait();
var tmp = Interlocked.Increment(ref sequenceNr);
fooRequest.SequenceNr = tmp;
// Faking some arbitrary delay. This work is done over sockets.
Task.Run(async () =>
{
await Task.Delay(1000);
tcs.SetResult(new FooResponse() {SequenceNr = tmp});
ss.Release();
});
return tcs.Task;
}
}
I have a protocol with request and response pairs. I have used asynchronous socket programming. The FooProtocol will take care of matching up requests with responses (sequence numbers) and will also take care of the maximum number of pending requests. (This is done in the pseudo code and in my real code with a SemaphoreSlim, so I am not worried about runaway requests.) The DoSomeWork method calls the Protocol.Send method, but I don't want to await the response; I want to spin around and send the next one until I am blocked by the maximum number of pending requests. When the task does complete, I want to check the response and maybe do some work.
I would like to fix two things:
I would like to avoid using Task.ContinueWith(), because it doesn't seem to fit cleanly with the async/await pattern.
Because I have awaited on the connection, I have had to use the async modifier. Now I get warnings from the IDE: "Because this call is not awaited, execution of the current method continues before the call is completed. Consider applying the 'await' operator to the result of the call." I don't want to do that, because as soon as I do, it ruins the protocol's ability to have many requests in flight. The only way I can get rid of the warning is to use a discard. Which isn't the worst thing, but I can't help but feel like I am missing a trick and fighting this too hard.
Side note: I hope your actual code is using SemaphoreSlim.WaitAsync rather than SemaphoreSlim.Wait.
In most socket code, you do end up with a list of connections, and along with each connection is a "processor" of some kind. In the async world, this is naturally represented as a Task.
So you will need to keep a list of Tasks; at the very least, your consuming application will need to know when it is safe to shut down (i.e., all responses have been received).
Don't preemptively worry about using Task.Run; as long as you aren't blocking (e.g., SemaphoreSlim.Wait), you probably will not starve the thread pool. Remember that during the awaits, no thread pool thread is used.
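Putting those points together, one possible shape is sketched below. It reuses the FooProtocol/FooRequest types from the question; the throttle is shown at the caller and uses WaitAsync so nothing blocks, and the ContinueWith is replaced by a hypothetical SendAndHandleAsync helper method:
private static async Task DoSomeWorkAsync()
{
    var protocol = new FooProtocol();
    await protocol.Connect("127.0.0.1", 1234);

    var throttle = new SemaphoreSlim(20, 20);
    var pending = new List<Task>();

    for (var i = 0; i < 1000; i++)
    {
        await throttle.WaitAsync(); // asynchronous back-pressure, no blocked thread
        var request = new FooRequest { Payload = "Request Nr " + i };
        pending.Add(SendAndHandleAsync(protocol, request, throttle));
    }

    await Task.WhenAll(pending); // all responses received - safe to shut down
}

private static async Task SendAndHandleAsync(FooProtocol protocol, FooRequest request, SemaphoreSlim throttle)
{
    try
    {
        var resp = await protocol.Send(request);
        Console.WriteLine($"Request {resp.SequenceNr} Successful: {resp.Status == 0}");
    }
    finally
    {
        throttle.Release();
    }
}
With the semaphore at the caller and awaited asynchronously, the loop naturally stops issuing new requests once 20 are in flight, and DoSomeWorkAsync only completes when every response has been handled.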
I am not sure that it's a good idea to enforce the maximum concurrency at the protocol level. It seems to me that this responsibility belongs to the caller of the protocol. So I would remove the SemaphoreSlim, and let it do the one thing that it knows to do well:
public class FooProtocol
{
private int sequenceNr = 0;
public async Task<FooResponse> Send(FooRequest fooRequest)
{
var tmp = Interlocked.Increment(ref sequenceNr);
fooRequest.SequenceNr = tmp;
await Task.Delay(1000); // Faking some arbitrary delay
return new FooResponse() { SequenceNr = tmp };
}
}
Then I would use an ActionBlock from the TPL Dataflow library in order to coordinate the process of sending a massive number of requests through the protocol, by handling the concurrency, the backpressure (BoundedCapacity), the cancellation (if needed), the error handling, and the status of the whole operation (running, completed, failed etc.). Example:
private static async Task DoSomeWorkAsync()
{
var protocol = new FooProtocol();
var actionBlock = new ActionBlock<FooRequest>(async request =>
{
var resp = await protocol.Send(request);
Console.WriteLine($"Request {resp.SequenceNr} Status: {resp.Status}");
}, new ExecutionDataflowBlockOptions()
{
MaxDegreeOfParallelism = 20,
BoundedCapacity = 100
});
await protocol.Connect("127.0.0.1", 1234);
foreach (var i in Enumerable.Range(0, Int32.MaxValue))
{
var request = new FooRequest();
request.Payload = "Request Nr " + i;
var accepted = await actionBlock.SendAsync(request);
if (!accepted) break; // The block has failed irrecoverably
}
actionBlock.Complete();
await actionBlock.Completion; // Propagate any exceptions
}
The BoundedCapacity = 100 configuration means that the ActionBlock will store in its internal buffer at most 100 requests. When this threshold is reached, anyone who wants to send more requests to it will have to wait. The awaiting will happen in the await actionBlock.SendAsync line.

Asynchronous parallel web requests in ASP.NET MVC(C#)

I have a mini-project that requires downloading the HTML documents of multiple websites using C# and making it perform as fast as possible. In my scenario I might need to switch IP using proxies based on certain conditions. I want to take advantage of C# Asynchronous Tasks to make it execute as many requests as possible in order for the whole process to be fast and efficient.
Here's the code I have so far.
public class HTMLDownloader
{
public static List<string> URL_LIST = new List<string>();
public static List<string> HTML_DOCUMENTS = new List<string>();
public static void Main()
{
for (var i = 0; i < URL_LIST.Count; i++)
{
var html = Task.Run(() => Run(URL_LIST[i]));
HTML_DOCUMENTS.Add(html.Result);
}
}
public static async Task<string> Run(string url)
{
var client = new WebClient();
//Handle Proxy Credentials client.Proxy= new WebProxy();
string html = "";
try
{
html = await client.DownloadStringTaskAsync(new Uri(url));
//if(condition ==true)
//{
// return html;
//}
//else
//{
// Switch IP and try again
//}
}
catch (Exception e)
{
}
return html;
}
}
The problem here is that I'm not really taking advantage of sending multiple web requests because each request has to finish in order for the next one to begin. Is there a better approach to this? For example, send 10 web requests at a time and then send a new request when one of those requests is finished.
Thanks
I want to take advantage of C# Asynchronous Tasks to make it execute as many requests as possible in order for the whole process to be fast and efficient.
You can use Task.WhenAll to get asynchronous concurrency.
For example, send 10 web requests at a time and then send a new request when one of those requests is finished.
To throttle asynchronous concurrency, use SemaphoreSlim:
public static async Task Main()
{
using var limit = new SemaphoreSlim(10); // 10 at a time
var tasks = URL_LIST.Select(Process).ToList();
var results = await Task.WhenAll(tasks);
HTML_DOCUMENTS.AddRange(results);
async Task<string> Process(string url)
{
await limit.WaitAsync();
try { return await Run(url); }
finally { limit.Release(); }
}
}
One way is to use Task.WhenAll.
Creates a task that will complete when all of the supplied tasks have
completed.
The premise is: Select all the tasks into a list, await the list of tasks with Task.WhenAll, then Select the results.
public static async Task Main()
{
var tasks = URL_LIST.Select(Run).ToList(); // materialize so the same tasks are awaited and then inspected
await Task.WhenAll(tasks);
var results = tasks.Select(x => x.Result);
}
Note: the result of Task.WhenAll is the collection of results as well.
First change your Main to be async.
Then you can use LINQ Select to run the Tasks in parallel.
public static async Task Main()
{
var tasks = URL_LIST.Select(Run);
string[] documents = await Task.WhenAll(tasks);
HTML_DOCUMENTS.AddRange(documents);
}
Task.WhenAll will unwrap the Task results into an array, once all the tasks are complete.

Task.Run on method returning T VS method returning Task<T>

Scenario 1 - For each website in a string list (_websites), the caller method wraps GetWebContent into a task, waits for all the tasks to finish and returns the results.
private async Task<string[]> AsyncGetUrlStringFromWebsites()
{
List<Task<string>> tasks = new List<Task<string>>();
foreach (var website in _websites)
{
tasks.Add(Task.Run(() => GetWebContent(website)));
}
var results = await Task.WhenAll(tasks);
return results;
}
private string GetWebContent(string url)
{
var client = new HttpClient();
var content = client.GetStringAsync(url);
return content.Result;
}
Scenario 2 - For each website in a string list (_websites), the caller method calls GetWebContent (which returns Task<string>), waits for all the tasks to finish and returns the results.
private async Task<string[]> AsyncGetUrlStringFromWebsites()
{
List<Task<string>> tasks = new List<Task<string>>();
foreach (var website in _websites)
{
tasks.Add(GetWebContent(website));
}
var results = await Task.WhenAll(tasks);
return results;
}
private async Task<string> GetWebContent(string url)
{
var client = new HttpClient();
var content = await client.GetStringAsync(url);
return content;
}
Questions - Which way is the correct approach and why? How does each approach impact achieving asynchronous processing?
With Task.Run() you occupy a thread from the thread pool and tell it to wait until the web content has been received.
Why would you want to do that? Do you pay someone to stand next to your mailbox to tell you when a letter arrives?
GetStringAsync already is asynchronous. The cpu has nothing to do (with this process) while the content comes in over the network.
So the second approach is correct, no need to use extra threads from the thread pool here.
Always interesting to read: Stephen Cleary's "There is no thread"
@René Vogt gave a great explanation.
Just a minor 5 cents from my side.
In the second example there is no need to use async/await in the GetWebContent method. You can simply return Task<string> (this would also reduce the async depth).
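A minimal sketch of that suggestion - the same method as in the question, just returning the task directly instead of awaiting it:
private Task<string> GetWebContent(string url)
{
    var client = new HttpClient();
    // No async/await needed here: the caller awaits the returned task.
    return client.GetStringAsync(url);
}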

Parallel.For and httpclient crash the application C#

I want to avoid the application crashing problem caused by the parallel for loop and HttpClient, but I am unable to apply the solutions provided elsewhere on the web due to my limited knowledge of programming. My code is pasted below.
class Program
{
public static List<string> words = new List<string>();
public static int count = 0;
public static string output = "";
private static HttpClient Client = new HttpClient();
public static void Main(string[] args)
{
//input path strings...
List<string> links = new List<string>();
links.AddRange(File.ReadAllLines(input));
List<string> longList = new List<string>(File.ReadAllLines(@"a.txt"));
words.AddRange(File.ReadAllLines(output1));
System.Net.ServicePointManager.DefaultConnectionLimit = 8;
count = longList.Count;
//for (int i = 0; i < longList.Count; i++)
Task.Run(() => Parallel.For(0, longList.Count, new ParallelOptions { MaxDegreeOfParallelism = 5 }, (i, loopState) =>
{
Console.WriteLine(i);
string link = #"some link" + longList[i] + "/";
try
{
if (!links.Contains(link))
{
Task.Run(async () => { await Download(link); }).Wait();
}
}
catch (System.Exception e)
{
}
}));
//}
}
public static async Task Download(string link)
{
HtmlAgilityPack.HtmlDocument document = new HtmlDocument();
document.LoadHtml(await getURL(link));
//...stuff with html agility pack
}
public static async Task<string> getURL(string link)
{
string result = "";
HttpResponseMessage response = await Client.GetAsync(link);
Console.WriteLine(response.StatusCode);
if(response.IsSuccessStatusCode)
{
HttpContent content = response.Content;
var bytes = await response.Content.ReadAsByteArrayAsync();
result = Encoding.UTF8.GetString(bytes);
}
return result;
}
}
There are solutions, for example this one, but I don't know how to put the await keyword in my Main method, and currently the program simply exits because of its absence before Task.Run(). As you can see, I have already applied a workaround for the async Download() method in order to call it from the Main method.
I also have doubts regarding the use of the same instance of HttpClient in different parallel threads. Please advise whether I should create a new instance of HttpClient each time.
You're right that you have to block on tasks somewhere in a console application, otherwise the program will just exit before it's complete. But you're doing this more than you need to. Aim for just blocking the main thread and delegating the rest to an async method. A good practice is to create a method with a signature like private async Task MainAsync(args), put the "guts" of your program logic there, and call it from Main like this:
MainAsync(args).Wait();
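Spelled out, the skeleton might look like the sketch below (on C# 7.1 or later you could instead declare async Task Main directly):
static void Main(string[] args)
{
    MainAsync(args).Wait();
}

private static async Task MainAsync(string[] args)
{
    // Everything that used to be in Main goes here; await can be used freely.
}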
In your example, move everything from Main to MainAsync. Then you're free to use await as much as you want. Task.Run and Parallel.For are explicitly consuming new threads for I/O bound work, which is unnecessary in the async world. Use Task.WhenAll instead. The last part of your MainAsync method should end up looking something like this:
await Task.WhenAll(longList.Select(async s => {
Console.WriteLine(s);
string link = #"some link" + s + "/";
try
{
if (!links.Contains(link))
{
await Download(link);
}
}
catch (System.Exception e)
{
}
}));
There is one little wrinkle here though. Your example is throttling the parallelism at 5. If you find you still need this, TPL Dataflow is a great library for throttled parallelism in the async world. Here's a simple example.
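The linked example isn't reproduced here, but a rough sketch of the same idea - throttling with an ActionBlock from the System.Threading.Tasks.Dataflow package, reusing the Download method and the links/longList collections from the question - could look like this inside MainAsync:
// Requires: using System.Threading.Tasks.Dataflow;
var block = new ActionBlock<string>(async s =>
{
    string link = @"some link" + s + "/";
    if (!links.Contains(link))
    {
        await Download(link);
    }
}, new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 5 });

foreach (var s in longList)
{
    block.Post(s);
}

block.Complete();
await block.Completion;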
Regarding HttpClient, using a single instance across threads is completely safe and highly encouraged.

How to run multiple tasks, handle exceptions and still return results

I am updating my concurrency skillset. My problem seems to be fairly common: read from multiple Uris, parse and work with the result, etc. I have Concurrency in C# Cookbook. There are a few examples using GetStringAsync, such as
static async Task<string> DownloadAllAsync(IEnumerable<string> urls)
{
var httpClient = new HttpClient();
var downloads = urls.Select(url => httpClient.GetStringAsync(url));
Task<string>[] downloadTasks = downloads.ToArray();
string[] htmlPages = await Task.WhenAll(downloadTasks);
return string.Concat(htmlPages);
}
What I need is the asynchronous pattern for running multiple async tasks, capturing full or partial success.
Url 1 succeeds
Url 2 succeeds
Url 3 fails (timeout, bad Uri format, 401, etc)
Url 4 succeeds
... 20 more with mixed success
Waiting on the DownloadAllAsync task will throw a single aggregate exception if any fail, dropping the accumulated results. From my limited research, WhenAll and WaitAll behave the same. I want to catch the exceptions, log the failures, but continue with the remaining tasks, even if they all fail.
I could process them one by one, but wouldn't that defeat the purpose of allowing TPL to manage the whole process? Is there a link to a pattern which would accomplish this in a pure TPL way? Perhaps I'm using the wrong tool?
I want to catch the exceptions, log the failures, but continue with the remaining tasks, even if they all fail.
In this case, the cleanest solution is to change what your code does for each element. I.e., this current code:
var downloads = urls.Select(url => httpClient.GetStringAsync(url));
says "for each url, download a string". What you want it to say is "for each url, download a string and then log and ignore any errors":
static async Task<string> DownloadAllAsync(IEnumerable<string> urls)
{
var httpClient = new HttpClient();
var downloads = urls.Select(url => TryDownloadAsync(httpClient, url));
Task<string>[] downloadTasks = downloads.ToArray();
string[] htmlPages = await Task.WhenAll(downloadTasks);
return string.Concat(htmlPages);
}
static async Task<string> TryDownloadAsync(HttpClient client, string url)
{
try
{
return await client.GetStringAsync(url);
}
catch (Exception ex)
{
Log(ex);
return string.Empty; // or whatever you prefer
}
}
You can attach a continuation to each of your tasks and wait for those instead of waiting directly on the original tasks.
static async Task<string> DownloadAllAsync(IEnumerable<string> urls)
{
var httpClient = new HttpClient();
IEnumerable<Task<Task<string>>> downloads = urls.Select(url => httpClient.GetStringAsync(url).ContinueWith(p=> p, TaskContinuationOptions.ExecuteSynchronously));
Task<Task<string>>[] downloadTasks = downloads.ToArray();
Task<string>[] compleTasks = await Task.WhenAll(downloadTasks);
foreach (var task in compleTasks)
{
if (task.IsFaulted)//Or task.IsCanceled
{
//Handle it
}
}
var htmlPages = compleTasks.Where(x => x.Status == TaskStatus.RanToCompletion)
.Select(x => x.Result);
return string.Concat(htmlPages);
}
This will not stop as soon as one task fails; rather, it will wait for all the tasks to complete, and then handle the successes and failures separately.
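For completeness, a rough sketch of the same wait-for-everything-then-inspect idea without wrapping each task in a continuation: await Task.WhenAll inside a try/catch, ignore the exception it rethrows, and then examine each original task individually.
static async Task<string> DownloadAllAsync(IEnumerable<string> urls)
{
    var httpClient = new HttpClient();
    Task<string>[] downloadTasks = urls.Select(url => httpClient.GetStringAsync(url)).ToArray();
    try
    {
        await Task.WhenAll(downloadTasks);
    }
    catch
    {
        // Ignored here; each task is inspected below.
    }
    foreach (var task in downloadTasks)
    {
        if (task.IsFaulted || task.IsCanceled)
        {
            // Log task.Exception, etc.
        }
    }
    var htmlPages = downloadTasks
        .Where(t => t.Status == TaskStatus.RanToCompletion)
        .Select(t => t.Result);
    return string.Concat(htmlPages);
}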
