Multithreaded answering for HttpListener - C#

I have a single-threaded process which takes a long time to execute. I need several users to be able to invoke this process, and I chose HTTP to manage the invocation.
Naturally, while the process is running everybody else should wait until it is done: if the process is available it executes, and if not a BUSY answer is sent.
Here is the implementation:
using System;
using System.Net;
using System.Reflection;
using System.Runtime.InteropServices;
using System.Threading;
using System.Threading.Tasks;

namespace simplehttp
{
    class Program
    {
        private static System.AsyncCallback task;
        private static System.Threading.ManualResetEvent mre = new System.Threading.ManualResetEvent(false); // Notifies one or more waiting threads that an event has occurred.
        private static HttpListenerContext workingContext = null;

        public static bool isBackgroundWorking()
        {
            return mre.WaitOne(0);
        }

        static void Main(string[] args)
        {
            new Thread(() =>
            {
                Thread.CurrentThread.IsBackground = true;
                while (true)
                {
                    Console.WriteLine(" waitOne " + isBackgroundWorking());
                    mre.WaitOne(); // Blocks the current thread until the current WaitHandle receives a signal.
                    Console.WriteLine(" do job" + " [" + Thread.CurrentThread.Name + ":" + Thread.CurrentThread.ManagedThreadId + " ]\n");
                    HttpListenerRequest request = workingContext.Request;
                    HttpListenerResponse response = workingContext.Response;
                    string responseString = "WORK " + DateTime.Now;
                    byte[] buffer = System.Text.Encoding.UTF8.GetBytes(responseString);
                    response.ContentLength64 = buffer.Length;
                    System.IO.Stream output = response.OutputStream;
                    Thread.Sleep(10000);
                    output.Write(buffer, 0, buffer.Length);
                    output.Close();
                    Console.WriteLine(" " + responseString + "\t" + DateTime.Now);
                    workingContext = null;
                    mre.Reset(); // Sets the state of the event to nonsignaled, causing threads to block.
                }
            }).Start();

            // Create a listener.
            HttpListener listener = new HttpListener();
            listener.Prefixes.Add("http://localhost:6789/index/");
            listener.Start();
            Console.WriteLine("Listening..." + " [" + Thread.CurrentThread.Name + ":" + Thread.CurrentThread.ManagedThreadId + " ]\n");
            task = new AsyncCallback(ListenerCallback);
            IAsyncResult resultM = listener.BeginGetContext(task, listener);
            Console.WriteLine("Waiting for request to be processed asyncronously.");
            Console.ReadKey();
            Console.WriteLine("Request processed asyncronously.");
            listener.Close();
        }

        private static void ListenerCallback(IAsyncResult result)
        {
            HttpListener listener = (HttpListener)result.AsyncState;
            // If not listening return immediately as this method is called one last time after Close()
            if (!listener.IsListening)
                return;
            HttpListenerContext context = listener.EndGetContext(result);
            listener.BeginGetContext(task, listener);
            if (workingContext == null && !isBackgroundWorking())
            {
                // Background work
                workingContext = context;
                mre.Set(); // Sets the state of the event to signaled, allowing one or more waiting threads to proceed.
            }
            else
            {
                HttpListenerRequest request = context.Request;
                HttpListenerResponse response = context.Response;
                string responseString = "BUSY " + DateTime.Now + " [" + Thread.CurrentThread.Name + ":" + Thread.CurrentThread.ManagedThreadId;
                byte[] buffer = System.Text.Encoding.UTF8.GetBytes(responseString);
                response.ContentLength64 = buffer.Length;
                System.IO.Stream output = response.OutputStream;
                output.Write(buffer, 0, buffer.Length);
                output.Close();
                Console.WriteLine(responseString + "\t" + DateTime.Now);
            }
        }
    }
}
To test, I make 2 HTTP calls and expect 2 different answers: WORK and BUSY.
However, I see that the second request waits for the first to finish and only then executes.
waitOne False
Listening... [:10 ]
Waiting for request to be processed asyncronously.
do job [:11 ]
WORK 1/24/2016 10:34:01 AM 1/24/2016 10:34:11 AM
waitOne False
do job [:11 ]
WORK 1/24/2016 10:34:11 AM 1/24/2016 10:34:21 AM
waitOne False
What is wrong in my understanding of how this should work?
Update (since long comment threads are not encouraged by SO):
My code looks awkward because it replicates a real process. In "my" application the working process is the main process, which has the "courtesy" to run embedded C# code at particular moments. So I cannot start a new task to process the request, and it must be asynchronous, since the working process does its own job and only invokes the slave piece of code to notify clients when data is available. It is asynchronous because the code is invoked and should finish as soon as possible, or it will block the master application.
I'll try to add an additional thread with a synchronous call and see how it affects the situation.
The debugger is not used in this example, so as not to interfere with the real-time process and the timestamps printed to the console. Debugging is great and necessary, but in this case I try to substitute console output for it to avoid an extra actor in the synchronization/waiting scenario.
The application itself is not heavily loaded: 1-3 clients occasionally ask the main application for an answer. HTTP is used for convenience, not for heavy or frequent conversations. It appears that some browsers like IE work fine (Windows-to-Windows conversation?) while some like Chrome (more system-agnostic) reproduce my application's behavior. Look at the timestamps: Chrome, IE, IE, Chrome, and the last Chrome request still went to the WORK process. BTW, the code has been changed per the suggestions in the comments, and the new BeginGetContext call is now placed immediately after retrieving the previous context:
HttpListenerContext context = listener.EndGetContext(result);
listener.BeginGetContext(task, listener);
Also, following suggestions, I changed the asynchronous call to a synchronous one, and the result is still the same:
private static void ListenerCallback(IAsyncResult result)
{
    HttpListener listener = (HttpListener)result.AsyncState;
    // If not listening return immediately as this method is called one last time after Close()
    if (!listener.IsListening)
        return;
    HttpListenerContext context = listener.EndGetContext(result);
    while (true)
    {
        if (workingContext == null && !isBackgroundWorking())
        {
            // Background work
            workingContext = context;
            mre.Set(); // Sets the state of the event to signaled, allowing one or more waiting threads to proceed.
        }
        else
        {
            HttpListenerRequest request = context.Request;
            HttpListenerResponse response = context.Response;
            string responseString = "BUSY " + DateTime.Now + " [" + Thread.CurrentThread.Name + ":" +
                                    Thread.CurrentThread.ManagedThreadId;
            byte[] buffer = System.Text.Encoding.UTF8.GetBytes(responseString);
            response.ContentLength64 = buffer.Length;
            System.IO.Stream output = response.OutputStream;
            output.Write(buffer, 0, buffer.Length);
            output.Close();
            Console.WriteLine(responseString + "\t" + DateTime.Now);
        }
        context = listener.GetContext();
    }
}

The code as posted works exactly like it should:
I can't reproduce the problem. I guess that answers the question: apparently you're driving the test workload incorrectly. I drove it by repeatedly clicking the Send button in the Fiddler composer.
Thanks for posting executable code, though. I should have tried that earlier!
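(If you don't have Fiddler handy, a minimal way to drive two overlapping requests from code looks roughly like the sketch below; the URL matches the listener prefix above, and HttpClient is just a convenient test client, not something this answer prescribes.)
using System;
using System.Net.Http;
using System.Threading.Tasks;

class DriveTest
{
    static void Main()
    {
        using (var client = new HttpClient())
        {
            // Start both requests before waiting on either, so they overlap in time.
            Task<string> first = client.GetStringAsync("http://localhost:6789/index/");
            Task<string> second = client.GetStringAsync("http://localhost:6789/index/");
            Task.WaitAll(first, second);
            Console.WriteLine(first.Result);
            Console.WriteLine(second.Result); // one response should say WORK ..., the other BUSY ...
        }
    }
}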

I have not totally read and understood that code; its structure is awkward. Here's a much cleaner structure that's easy to get right:
while (true) {
    var ctx = GetContext();
    Task.Run(() => ProcessRequest(ctx));
}
That simply dispatches all incoming work. Then:
object _lock = new object(); // is work being done?

void ProcessRequest(HttpListenerContext ctx) {
    if (!Monitor.TryEnter(_lock))   // someone else is already doing the work
        SendBusy(ctx);
    else {
        try {
            ProcessInner(ctx);      // actually do some work
        }
        finally {
            Monitor.Exit(_lock);
        }
    }
}
That's really all that is necessary.
In particular, it is pointless for you to use async IO here. I assume you copied that code or the idea from somewhere; it's a common mistake. Async IO doesn't help you at all in this scenario and only convolutes the code.
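For completeness, here is a minimal self-contained sketch of how those two fragments could be wired to an HttpListener. The prefix, the response texts, and the 10-second sleep standing in for the real work are my assumptions, not part of the answer above:
using System;
using System.Net;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

class Server
{
    static readonly object _lock = new object(); // guards the single long-running job

    static void Main()
    {
        var listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:6789/index/"); // assumed prefix
        listener.Start();
        while (true)
        {
            HttpListenerContext ctx = listener.GetContext(); // blocking accept
            Task.Run(() => ProcessRequest(ctx));             // dispatch to the thread pool
        }
    }

    static void ProcessRequest(HttpListenerContext ctx)
    {
        if (!Monitor.TryEnter(_lock))
        {
            Respond(ctx, "BUSY " + DateTime.Now);
            return;
        }
        try
        {
            Thread.Sleep(10000); // stand-in for the real long-running work
            Respond(ctx, "WORK " + DateTime.Now);
        }
        finally
        {
            Monitor.Exit(_lock);
        }
    }

    static void Respond(HttpListenerContext ctx, string text)
    {
        byte[] buffer = Encoding.UTF8.GetBytes(text);
        ctx.Response.ContentLength64 = buffer.Length;
        using (var output = ctx.Response.OutputStream)
            output.Write(buffer, 0, buffer.Length);
    }
}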

Related

Handling multiple HTTP requests with HttpListener using multiple threads leads to strange behavior

I'm trying to create an HttpListener and accept HTTP requests; for each HttpListenerContext I create a dedicated thread to do the work on it, like this:
while (true)
{
    var context = listener.GetContext();
    Thread backgroundThread = new Thread(() => HandleContext(context));
    backgroundThread.Start();
}
In HandleContext I do a Thread.Sleep(5000) to simulate the work and a Console.WriteLine(DateTime.Now) for debugging purposes.
But when I open 3 tabs in Chrome at the same time, only 2 tabs return, the result is duplicated, and the last tab just keeps hanging.
As far as I know, my threads have no shared data which could lead to a deadlock, but this behavior looks like one. Can you tell me what I'm doing wrong and how to fix it?
Here's my full code:
class Program
{
    private static HttpListener _listener;

    static void Main(string[] args)
    {
        HttpListener listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:5001/");
        listener.Start();
        Listen(listener);
    }

    static void Listen(HttpListener listener)
    {
        while (true)
        {
            var context = listener.GetContext();
            Thread backgroundThread = new Thread(() => HandleContext(context));
            backgroundThread.Start();
        }
    }

    static void HandleContext(HttpListenerContext context)
    {
        Thread.Sleep(5000);
        Console.WriteLine($"Hello world, {DateTime.Now}");
        var responseContent = "Hello world";
        var buffer = Encoding.UTF8.GetBytes(responseContent);
        context.Response.OutputStream.Write(buffer, 0, buffer.Length);
        context.Response.OutputStream.Close();
        context.Response.Close();
    }
}
You didn't set the ContentLength64 property of your HttpListenerResponse!
Please try this:
context.Response.ContentLength64 = buffer.LongLength;
using (Stream stream = context.Response.OutputStream)
{
    stream.Write(buffer, 0, buffer.Length);
}
After a while, I figured out that the "deadlock"-like behavior occurred because of the "orphan" GetContext which @Mat Hatter pointed out, and that Console.WriteLine ran twice because Chrome sends 2 requests: one for the page content and one for favicon.ico. Hence, there's nothing wrong with the Console.WriteLine being duplicated.
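If the extra favicon request gets in the way while testing, one option (an assumption on my part, not something stated above) is to short-circuit it at the top of HandleContext by checking the requested path:
// Inside HandleContext, before doing the real work: answer Chrome's
// favicon probe immediately so it doesn't look like a duplicate request.
if (context.Request.Url.AbsolutePath == "/favicon.ico")
{
    context.Response.StatusCode = 404; // nothing to serve
    context.Response.Close();
    return;
}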

Splitting Loop Between Multiple Threads

My code finds open ports on a given host.
This process is very time-consuming.
Code:
for (int port = 0; port < 11; port++)
{
    string statusTCP = "Open";
    using (TcpClient tcp = new TcpClient())
    {
        try
        {
            tcp.Connect("127.0.0.1", port);
        }
        catch
        {
            statusTCP = "Close";
        }
    }
    Console.WriteLine("Port " + port + " : " + statusTCP);
}
This process takes 11 s for me!
That is very long if I check 100 or 1000 ports...
Is there any good and fast way to do this?
You don't need multiple threads to connect to multiple ports. You can use ConnectAsync to connect asynchronously; Windows has used asynchronous IO through completion ports since the NT days.
This means that no threads are blocked: the OS notifies the original thread when an IO request completes. In fact, blocking is simulated to make synchronous programming easier.
You can create a single method that connects to a single port and reports a message when finished. Writing to the console blocks, though, so I'll use the IProgress<T> interface to report progress:
public async Task ConnectToPortAsync(string host, int port, IProgress<string> progress)
{
    using (var client = new TcpClient())
    {
        try
        {
            await client.ConnectAsync(host, port).ConfigureAwait(false);
            progress.Report($"Port: {port} Open");
            //Do some more work
        }
        catch
        {
            progress.Report($"Port {port} Closed");
        }
    }
}
Assuming you have a list of ports:
var ports=new[]{80,8080,...};
or
var ports=Enumerable.Range(0,11);
You can call multiple ports simultaneously like this:
var host = "127.0.0.1";
var progress = new Progress<string>(msg => Console.WriteLine(msg));
var allTasks = ports.Select(port => ConnectToPortAsync(host, port, progress));
await Task.WhenAll(allTasks);
This code will use the main thread up until the first await. This means that all TcpClient instances will be created on the main thread. After that, though, the code executes asynchronously, using a thread from the IO completion thread pool whenever necessary.
Reporting is performed by the Progress<T> object, so the background threads don't block waiting to write to the console.
Finally, once await returns, execution continues on the original synchronization context. For a desktop application (WPF, WinForms) that would be the main thread. That's OK if you want to update the UI after an asynchronous operation, but it can cause blocking if you want to perform more work in the background. By using ConfigureAwait(false) we instruct the runtime to keep working on a background thread.
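To see how those pieces fit together end to end, here is a rough self-contained sketch of a console entry point that runs the scan; the async Main and the 0-10 port range are my assumptions, and ConnectToPortAsync is simply the method from above, repeated so the snippet compiles on its own:
using System;
using System.Linq;
using System.Net.Sockets;
using System.Threading.Tasks;

class PortScan
{
    static async Task Main()
    {
        var host = "127.0.0.1";
        var ports = Enumerable.Range(0, 11);                    // ports 0..10, as in the question
        var progress = new Progress<string>(Console.WriteLine); // prints each report as it arrives
        var allTasks = ports.Select(port => ConnectToPortAsync(host, port, progress));
        await Task.WhenAll(allTasks);                           // all connection attempts run concurrently
    }

    static async Task ConnectToPortAsync(string host, int port, IProgress<string> progress)
    {
        using (var client = new TcpClient())
        {
            try
            {
                await client.ConnectAsync(host, port).ConfigureAwait(false);
                progress.Report($"Port: {port} Open");
            }
            catch
            {
                progress.Report($"Port {port} Closed");
            }
        }
    }
}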
Further improvements:
You can add a timeout in case a connection takes too long, by using Task.Delay() and Task.WhenAny :
var connectionTask = client.ConnectAsync(host, port);
var timeout = Task.Delay(1000);
var completed = await Task.WhenAny(connectionTask, timeout);
if (completed == connectionTask)
{
    await connectionTask; // propagate a connection failure to the surrounding catch block
    progress.Report($"Port {port} Open");
}
else
{
    progress.Report($"Port {port} Timeout");
}
If what you're asking is how to run multiple lookups in parallel, take a look at the Parallel class.
Parallel.For(0, 11, new ParallelOptions { MaxDegreeOfParallelism = 5 }, (port) =>
{
    string statusTCP = "Open";
    using (TcpClient tcp = new TcpClient())
    {
        try
        {
            tcp.Connect("127.0.0.1", port);
        }
        catch { statusTCP = "Close"; }
    }
    Console.WriteLine("Port " + port + " : " + statusTCP);
});
Note the first few method parameters: I specify going from 0 to 10 (because the upper bound of 11 is exclusive) and execute at most 5 lookups in parallel via the ParallelOptions class.

C# async await for polling

I need to make a WebRequest to a certain endpoint every 2 seconds. I tried to do it with a Timer; the problem is that every call to the callback function runs on a different thread and I'm having some concurrency problems. So I decided to change my implementation, and I was thinking about using a BackgroundWorker with a two-second sleep inside, or using async/await, but I don't see the advantages of async/await. Any advice? Thank you.
This is the code that I will reimplement:
private void InitTimer()
{
    TimerCallback callback = TimerCallbackFunction;
    m_timer = new Timer(callback, null, 0, m_interval);
}

private void TimerCallbackFunction(Object info)
{
    Thread.CurrentThread.Name = "Requester thread ";
    m_object = GetMyObject();
}

public MyObject GetMyObject()
{
    MyObject myObject = new MyObject();
    try
    {
        MemoryStream responseInMemory = CreateWebRequest(m_host, ENDPOINT);
        XmlSerializer xmlSerializer = new XmlSerializer(typeof(MyObject));
        myObject = (MyObject) xmlSerializer.Deserialize(responseInMemory);
    }
    catch (InvalidOperationException ex)
    {
        m_logger.WriteError("Error getting MyObject: ", ex);
        throw new XmlException();
    }
    return myObject;
}

private MemoryStream CreateWebRequest(string host, string endpoint)
{
    WebRequest request = WebRequest.Create(host + endpoint);
    using (var response = request.GetResponse())
    {
        return (MemoryStream) response.GetResponseStream();
    }
}
EDIT: I have read this SO thread Async/await vs BackgroundWorker
async/await is also concurrency. If you have concurrency problems and you want your application to have only one thread, you should avoid using async/await.
However, the best way to issue a WebRequest is with async/await, which does not block the main UI thread.
Use the method below; it will not block anything, and it is the approach recommended by Microsoft: https://msdn.microsoft.com/en-us/library/86wf6409(v=vs.110).aspx
private async Task<MemoryStream> CreateWebRequest(string host, string endpoint)
{
    WebRequest request = WebRequest.Create(host + endpoint);
    using (var response = await request.GetResponseAsync())
    using (var responseStream = response.GetResponseStream())
    {
        // The response stream is not a MemoryStream, so copy it into one
        // instead of casting (the cast in the original code would throw).
        var memoryStream = new MemoryStream();
        await responseStream.CopyToAsync(memoryStream);
        memoryStream.Position = 0;
        return memoryStream;
    }
}
You don't mention what the concurrency problems are. It may be that a request takes so long that the next one starts before the previous one finishes. It could also be that the callback replaces the value in m_object while readers are accessing it.
You can easily make a request every X seconds, asynchronously and without blocking, by using Task.Delay, e.g.:
ConcurrentQueue<MyObject> m_Responses = new ConcurrentQueue<MyObject>();

public async Task MyPollMethod(int interval)
{
    while (...)
    {
        var result = await SomeAsyncCall();
        m_Responses.Enqueue(result);
        await Task.Delay(interval);
    }
}
This will result in a polling call X seconds after the previous one finishes.
It also avoids concurrency issues by storing the result in a concurrent queue instead of replacing the old value, perhaps while someone else was reading it.
Consumers of MyObject would call TryDequeue to retrieve MyObject instances in the order they were received.
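For example, a consumer might drain the queue like this (a sketch; how often you poll the queue and what you do with each item are up to you):
// Drain whatever responses have accumulated since the last pass.
MyObject item;
while (m_Responses.TryDequeue(out item))
{
    // Items come out in the order they were enqueued.
    Console.WriteLine("Received an object at " + DateTime.Now);
}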
You could use the ConcurrentQueue to fix the current code too:
private void TimerCallbackFunction(Object info)
{
    Thread.CurrentThread.Name = "Requester thread ";
    var result = GetMyObject();
    m_Responses.Enqueue(result);
}
or
private async void TimerCallbackFunction(Object info)
{
    Thread.CurrentThread.Name = "Requester thread ";
    var result = await GetMyObjectAsync();
    m_Responses.Enqueue(result);
}
if you want to change your GetMyObject method to work asynchronously.
Since your request seems to take a long time, it's a good idea to make it asynchronous and avoid blocking the timer's ThreadPool thread while waiting for a network response.
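The answer doesn't spell out GetMyObjectAsync; a rough sketch, reusing the async CreateWebRequest shown earlier and assuming the question's m_host, ENDPOINT and m_logger members, might look like this:
public async Task<MyObject> GetMyObjectAsync()
{
    try
    {
        // Fetch the response body asynchronously, then deserialize it.
        MemoryStream responseInMemory = await CreateWebRequest(m_host, ENDPOINT);
        var xmlSerializer = new XmlSerializer(typeof(MyObject));
        return (MyObject)xmlSerializer.Deserialize(responseInMemory);
    }
    catch (InvalidOperationException ex)
    {
        m_logger.WriteError("Error getting MyObject: ", ex);
        throw new XmlException();
    }
}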

WebRequest doomed to failure when started from a parallel thread

Consider the following console application:
public static void Request(string url)
{
    ThreadPool.QueueUserWorkItem((state) =>
    {
        try
        {
            var request = WebRequest.Create(url);
            request.Timeout = 5000;
            request.GetResponse();
        }
        catch (Exception e)
        {
            Console.Out.WriteLine(e);
        }
        Console.Out.WriteLine(url);
    });
}

static void Main(string[] args)
{
    Request("http://google.com?q=a");
    Request("http://google.com?q=b");
    Request("http://google.com?q=c");
    Request("http://google.com?q=d");
    Thread.Sleep(20000);
    Console.In.ReadLine();
}
The output finishes for 2 of the URLs, but the rest throw "The operation has timed out".
I know that there is a limit on parallel connections, which defaults to two. If I increase it to three, then three will finish.
I.e.:
ServicePointManager.DefaultConnectionLimit = 3;
But my question is: why don't the rest of them finish, instead of throwing "the operation has timed out"?
Because the timeout includes the time the process was waiting in line for a connection to become available.
The timeout represents "I want to wait at most 5000 ms after I call GetResponse() to get my response", not "I want to wait at most an additional 5000 ms after GetResponse() has waited an unlimited amount of time for its turn in the queue."
Now, you wonder, "but the query is so quick, it should not take more than 5000 ms to complete!" The problem comes from the fact that you did not close the response you got from GetResponse. From MSDN:
You must call the Close method to close the stream and release the connection. Failure to do so may cause your application to run out of connections.
Calling Dispose() implicitly calls Close(), so if you update your code to dispose of your response, the connection it used is freed and one of the waiting requests can then start.
public static void Request(string url)
{
    ThreadPool.QueueUserWorkItem((state) =>
    {
        try
        {
            var request = WebRequest.Create(url);
            request.Timeout = 5000;
            using (var response = request.GetResponse())
            {
                Console.Out.WriteLine("Response - " + url);
            }
        }
        catch (Exception e)
        {
            Console.Out.WriteLine(e);
        }
        Console.Out.WriteLine("Method End - " + url);
    });
}

static void Main(string[] args)
{
    Request("http://google.com?q=a");
    Request("http://google.com?q=b");
    Request("http://google.com?q=c");
    Request("http://google.com?q=d");
    Thread.Sleep(20000);
    Console.In.ReadLine();
}

Stopping threads in C#

I need to make TcpClient event-driven rather than polling for messages all the time, so I thought: I will create a thread that waits for a message to arrive and fires an event once it does. Here is the general idea:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using System.Net.Sockets;
using System.Net;

namespace ThreadsTesting
{
    class Program
    {
        static void Main(string[] args)
        {
            Program p = new Program();

            // imitate a remote client connecting
            TcpClient remoteClient = new TcpClient();
            remoteClient.Connect(IPAddress.Parse("127.0.0.1"), 80);

            // start listening to messages
            p.startMessageListener();

            // send some fake messages from the remote client to our server
            for (int i = 0; i < 5; i++)
            {
                remoteClient.GetStream().Write(new byte[] { 0x80 }, 0, 1);
                Thread.Sleep(200);
            }

            // sleep for a while to make sure the cpu is not used
            Console.WriteLine("Sleeping for 2sec");
            Thread.Sleep(2000);

            // attempt to stop the server
            p.stopMessageListener();
            Console.ReadKey();
        }

        private CancellationTokenSource cSource;
        private Task listener;
        private TcpListener server;
        private TcpClient client;

        public Program()
        {
            server = new TcpListener(IPAddress.Parse("127.0.0.1"), 80);
            server.Start();
        }

        private void startMessageListener()
        {
            client = server.AcceptTcpClient();
            // start listening to the messages
            cSource = new CancellationTokenSource();
            listener = Task.Factory.StartNew(() => listenToMessages(cSource.Token), cSource.Token);
        }

        private void stopMessageListener()
        {
            Console.Out.WriteLine("Close requested");
            // send cancelation signal and wait for the thread to finish
            cSource.Cancel();
            listener.Wait();
            Console.WriteLine("Closed");
        }

        private void listenToMessages(CancellationToken token)
        {
            NetworkStream stream = client.GetStream();
            // check if cancelation requested
            while (!token.IsCancellationRequested)
            {
                // wait for the data to arrive
                while (!stream.DataAvailable)
                { }

                // read the data (always 1 byte - the message will always be 1 byte)
                byte[] bytes = new byte[1];
                stream.Read(bytes, 0, 1);
                Console.WriteLine("Got Data");
                // fire the event
            }
        }
    }
}
For obvious reasons this doesn't work correctly:
while (!stream.DataAvailable) busy-waits, constantly using 25% CPU (on a 4-core CPU), even when no data is there.
listener.Wait(); will wait forever, since the busy-wait loop never lets the outer loop notice that cancel has been called.
My alternative solution would be using async calls within the listenToMessages method:
private async Task listenToMessages(CancellationToken token)
{
    NetworkStream stream = client.GetStream();
    // check if cancelation requested
    while (!token.IsCancellationRequested)
    {
        // read the data
        byte[] bytes = new byte[1];
        await stream.ReadAsync(bytes, 0, 1, token);
        Console.WriteLine("Got Data");
        // fire the event
    }
}
This works exactly as I expected:
The CPU is not busy when there are no messages, but we are still waiting for them.
The cancellation request is picked up correctly and the task finishes as expected.
I wanted to go further, though. Since listenToMessages now returns a Task itself, I thought there was no need to start a separate task to execute that method. Here is what I did:
private void startMessageListener()
{
    client = server.AcceptTcpClient();
    // start listening to the messages
    cSource = new CancellationTokenSource();
    listener = listenToMessages(cSource.Token);
}
This doesn't work as I expected, in the sense that when Cancel() is called, the ReadAsync() method doesn't seem to pick up the cancellation from the token, and the task doesn't stop; instead it is stuck on the ReadAsync() line.
Any idea why this is happening? I would have thought ReadAsync would still pick up the token, as it did before...
Thanks for all your time and help.
-- EDIT --
OK, so after more in-depth evaluation, my solution no. 2 doesn't really work as expected:
the method returns to the caller, so the caller can continue. However, the listening loop is not "dead", so if we send some data it will execute once more!
Here is an example:
// send some fake messages from the remote client to our server
for (int i = 0; i < 5; i++)
{
    remoteClient.GetStream().Write(new byte[] { 0x80 }, 0, 1);
    Thread.Sleep(200);
}

Console.WriteLine("Sleeping for 2sec");
Thread.Sleep(2000);

// attempt to stop the server
p.stopListeners();

// check what will happen if we try to write now
remoteClient.GetStream().Write(new byte[] { 0x80 }, 0, 1);
Thread.Sleep(200);
Console.ReadKey();
This outputs "Got Data" even though in theory we stopped! I will investigate further and report my findings.
With modern libraries, any time you type new Thread, you've already got legacy code.
The core solution for your situation is asynchronous socket methods. There are a few ways to approach your API design, though: Rx, TPL Dataflow, and plain TAP come to mind. If you truly want events then EAP is an option.
I have a library of EAP sockets here. It does require a synchronizing context, so you'd have to use something like ActionDispatcher (included in the same library) if you need to use it from a Console application (you don't need this if you're using it from WinForms/WPF).
ReadAsync doesn't seem to support cancellation on a NetworkStream - have a look at the answers in this thread:
NetworkStream.ReadAsync with a cancellation token never cancels
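A common workaround (a sketch of my own, not quoted from that thread) is to rewrite listenToMessages so that a callback registered on the token closes the client, which forces the pending ReadAsync to complete:
private async Task listenToMessages(CancellationToken token)
{
    NetworkStream stream = client.GetStream();
    // Closing the client unblocks a pending ReadAsync, since the token itself is ignored by NetworkStream.
    using (token.Register(() => client.Close()))
    {
        try
        {
            byte[] bytes = new byte[1];
            while (!token.IsCancellationRequested)
            {
                int read = await stream.ReadAsync(bytes, 0, 1, token);
                if (read == 0)
                    break; // remote side closed the connection
                Console.WriteLine("Got Data");
                // fire the event
            }
        }
        catch (Exception) when (token.IsCancellationRequested)
        {
            // Expected: the read was aborted because we closed the client on cancellation.
        }
    }
}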
