Asynchronous Queue Manager - C#

I've run into a problem while writing an async multi-server network app in C#. I have many jobs being taken care of by the thread pool, and these include the writes to the network sockets. This ended up allowing for the case where more than one thread could write to the socket at the same time and discombobulate my outgoing messages. My idea for getting around this was to implement a queue system where, whenever data got added to the queue, the socket would write it.
My problem is, I can't quite wrap my head around the architecture of something of this nature. I imagine having a queue object that fires an event whenever data gets added to the queue. The event then writes the data being held in the queue, but that won't work, because if two threads come by and add to the queue simultaneously, events will still be fired for both (even if the queue itself is thread-safe) and I'll run into the same problem. So then maybe some way to hold off an event if another is in progress - but then how do I continue that event once the first finishes, without simply blocking the thread on some mutex or something? This wouldn't be so hard if I wasn't trying to stay strict with my "block nothing" architecture, but this particular application requires that I allow the thread pool threads to keep doing their thing.
Any ideas?

While similar to Porges' answer, this differs a bit in implementation.
First, I usually don't queue the raw bytes to send but whole objects, serializing them in the sending thread - but I guess that's a matter of taste.
The bigger difference is the use of a ConcurrentQueue (in addition to the BlockingCollection).
So I'd end up with code similar to
BlockingCollection<Packet> sendQueue = new BlockingCollection<Packet>(new ConcurrentQueue<Packet>());

while (true)
{
    var packet = sendQueue.Take(); // this blocks if there are no items in the queue
    SendPacket(packet);            // send your packet here
}
The key takeaway here is that you have one thread which loops over this code, and all other threads can add to the queue in a thread-safe way (both BlockingCollection and ConcurrentQueue are thread-safe).
Have a look at Processing a queue of items asynchronously in C#, where I answered a similar question.
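For completeness, the producer side is just a call to Add from any other thread (Packet stands in for whatever message type you decide to queue):

// From any worker thread - Add is thread-safe and does not block
// unless the collection was constructed with a bounded capacity.
sendQueue.Add(new Packet(/* ... */));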

Sounds like you need one thread writing to the socket synchronously and a bunch of threads writing to a queue for that thread to process.
You can use a blocking collection (BlockingCollection<T>) to do the hard work:
// somewhere there is a queue:
BlockingCollection<byte[]> queue = new BlockingCollection<byte[]>();

// in the socket-writing thread, read from the queue and send the messages:
foreach (byte[] message in queue.GetConsumingEnumerable())
{
    // just an example... obviously you'd need error handling and stuff here
    socket.Send(message);
}

// in the other threads, just enqueue messages to be sent:
queue.Add(someMessage);
The BlockingCollection will handle all synchronization. You can also enforce a maximum queue length and other fun things.
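For example, bounding the queue is just a constructor argument away - a sketch, with the capacity chosen arbitrarily:

// Producers block in Add once 1000 messages are waiting, giving you
// natural back-pressure instead of unbounded memory growth.
BlockingCollection<byte[]> queue = new BlockingCollection<byte[]>(boundedCapacity: 1000);

// At shutdown, signal consumers so GetConsumingEnumerable completes:
queue.CompleteAdding();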

I don't know C#, but what I would do is have the event trigger the socket manager to start pulling from the queue and writing things out one at a time. If it is already running, the trigger won't do anything, and once there is nothing left in the queue, it stops.
This solves the problem of two threads writing to the queue simultaneously, because the second event would be a no-op.
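A minimal sketch of that "trigger once, drain until empty" idea in C#, using a ConcurrentQueue and an Interlocked flag; SendBytes is a hypothetical stand-in for the actual socket write:

using System.Collections.Concurrent;
using System.Threading;

class QueuedSender
{
    private readonly ConcurrentQueue<byte[]> _queue = new ConcurrentQueue<byte[]>();
    private int _sending; // 0 = idle, 1 = a drain loop is running

    // Called from any thread; never blocks.
    public void Enqueue(byte[] message)
    {
        _queue.Enqueue(message);

        // Start a drain loop only if one isn't already running.
        if (Interlocked.CompareExchange(ref _sending, 1, 0) == 0)
            ThreadPool.QueueUserWorkItem(_ => Drain());
    }

    private void Drain()
    {
        do
        {
            byte[] message;
            while (_queue.TryDequeue(out message))
                SendBytes(message); // one writer at a time, in order

            // Mark idle, then re-check: an item may have arrived after the
            // last TryDequeue but before the flag was cleared.
            Interlocked.Exchange(ref _sending, 0);
        }
        while (!_queue.IsEmpty &&
               Interlocked.CompareExchange(ref _sending, 1, 0) == 0);
    }

    private void SendBytes(byte[] message) { /* hypothetical: socket.Send(message); */ }
}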

You could have a thread-safe queue that all your worker threads write their results to. Then have another thread that polls the queue and sends results when it sees them waiting.
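A minimal sketch of that polling approach, assuming a hypothetical SendResult method; polling trades a little latency and idle CPU for simplicity compared with the blocking designs above:

using System.Collections.Concurrent;
using System.Threading;

class PollingSender
{
    private readonly ConcurrentQueue<byte[]> _results = new ConcurrentQueue<byte[]>();

    public PollingSender()
    {
        new Thread(PollLoop) { IsBackground = true }.Start();
    }

    // Worker threads call this from anywhere; Enqueue is thread-safe.
    public void Post(byte[] result)
    {
        _results.Enqueue(result);
    }

    private void PollLoop()
    {
        while (true)
        {
            byte[] result;
            while (_results.TryDequeue(out result))
                SendResult(result); // hypothetical: the actual socket write

            Thread.Sleep(10); // brief pause while the queue is empty
        }
    }

    private void SendResult(byte[] result) { /* ... */ }
}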

Related

How to listen to a variable value change from different threads?

I have a static Queue collection. I have a thread that enqueues to it. I have multiple waiting threads which need to listen for any enqueue event on the Queue collection and do a task. How do I achieve this in C#?
The thread which is inserting should never be blocked, but all the threads that are listening should get notified instantly, do some other work using the data, and then go back to listening as soon as the job is over.
Actually, BlockingCollection was exactly what I needed. I initially thought it would block the whole listening thread; instead, each listening thread blocks only inside the consuming call while waiting on the producer, and then handles the item just as I wanted.
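A minimal sketch of that setup - each item added by the (non-blocking) producer wakes exactly one of the waiting listener threads; Process is a hypothetical stand-in for the task:

using System.Collections.Concurrent;
using System.Threading;

class WorkDispatcher
{
    static BlockingCollection<string> _work = new BlockingCollection<string>();

    // Start several listening threads; each blocks in Take until an item arrives.
    public static void StartListeners(int count)
    {
        for (int i = 0; i < count; i++)
        {
            new Thread(() =>
            {
                while (true)
                {
                    string item = _work.Take(); // blocks only this listener
                    Process(item);              // hypothetical: do the task
                }
            }) { IsBackground = true }.Start();
        }
    }

    // The inserting thread never blocks (the collection is unbounded by default).
    public static void Produce(string data)
    {
        _work.Add(data);
    }

    static void Process(string item) { /* ... */ }
}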

How to marshal work onto a single thread in C#

I have an app that needs to process some packets from a UDP broadcast, and write the result to a SQL database.
During intervals the rate of incoming packets gets quite high, and I find that the underlying buffer overflows, even at 8 MB, because of the slow database writing.
To solve this I have come up with two options: cache some of the DB writing, and/or marshal the DB writes onto another, sequentially operating thread (the order of DB writes must be preserved).
My question is: how do I marshal the work most effectively, with built-in structures/features of C#?
Should I just have a synchronized queue, put the delegate work items onto it and have a single thread service it in an event loop?
The easiest way is to create a separate thread for the UDP reading.
The thread just receives the packets and puts them into a BlockingCollection.
(I don't know how you do the packet reading, so this will be pseudocode.)
BlockingCollection<Packet> _packets = new BlockingCollection<Packet>();
...
while (true)
{
    var packet = udp.GetNextPacket();
    _packets.Add(packet);
}
This ensures you don't miss any packets just because the reading thread is busy.
Then you create a worker thread that does the same thing as your current code, except that it reads from the BlockingCollection instead.
while (true)
{
    var nextPacket = _packets.Take();
    ...
}
If your packets are independent (which they should be, since you use UDP) you can fire up several worker threads.
If you are using .NET 4, Tasks and TaskSchedulers can help you out here. On MSDN you'll find the LimitedConcurrencyLevelTaskScheduler. Create a single instance of it with a concurrency level of 1, and each time you have work to do, create a Task and schedule it with that scheduler.
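A minimal sketch of that approach; note that LimitedConcurrencyLevelTaskScheduler is a sample class from the MSDN TaskScheduler documentation (not part of the framework itself), so you'd copy it into your project first, and WriteToDatabase is a hypothetical stand-in for your DB code:

using System.Threading.Tasks;

// One scheduler with concurrency level 1: queued tasks run strictly
// one at a time, in the order they were scheduled.
TaskScheduler scheduler = new LimitedConcurrencyLevelTaskScheduler(1);
TaskFactory factory = new TaskFactory(scheduler);

// Each DB write becomes a task; ordering is preserved.
factory.StartNew(() => WriteToDatabase(packet)); // WriteToDatabase is hypothetical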

C# switch from single to multi thread

I have a C# HttpListener that runs on a single thread and parses data sent to it by another program. My main problem is that not all the data sent to the server is received. I can only assume this is due to the limitations of it being run on a single thread. I have searched high and low for a simple multi-threading solution so it may receive all the data sent to it, and came up empty-handed. Any help in transforming this into a multi-threaded application would be much appreciated.
private void frmMain_Load(object sender, EventArgs e)
{
    Thread t = new Thread(new ThreadStart(ThreadProc));
    t.Start();
}

public static void ThreadProc()
{
    while (true)
    {
        WebBot.SimpleListenerExample(new string[] { "http://localhost:13274/" });
        //Thread t = new Thread(new ThreadStart(ThreadProc));
        //t.Start();
        Application.DoEvents();
    }
}
First things first: verify that your hypothesis is indeed correct. You need to check:
How much data is sent
How much data is received
How long does it take to send the data
How long does it take to operate on the data
HTTP works over TCP, which generally guarantees delivery, so even if it takes a long time, your server should be getting all the incoming information.
That said, if you still want to make the process multi-threaded, I would recommend the following design:
One thread like you have right now (LISTENER THREAD), that accepts incoming data.
Another set of threads that will process the incoming data (WORKER THREADS).
The listener thread will only receive the data and place it in a queue.
The worker threads will dequeue items from the queue and operate on the data.
Several notes and things to think about, though:
Take care of thread synchronization - specifically, you need to protect the queue.
Think about whether it matters which worker thread gets the data. If there are several chunks that need to be handled by a specific worker thread, you'll need to address this.
In some cases, if there is a very high load on the listener thread, the queue may become a bottleneck, or more precisely - the locking on the queue may become a bottleneck. In this case I would recommend moving to a model of N queues for N worker threads, and have the listener just pick one in a round-robin fashion. This minimizes the locking, and in fact, since you'll have one reader and one writer per queue, you can even get away without a lock (but that is out of scope for this answer).
Yet another option would be to use a thread pool. A thread pool is a pool of threads that are hibernating until they are needed. When the listener gets an incoming input it will allocate it to a free thread, or will enlarge the pool if needed; this way you don't have a queue, and your threads are optimally used.
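A minimal sketch of the listener/worker design above, using a BlockingCollection so the queue synchronization is handled for you; the prefix URL, worker count, and HandleRequest are illustrative:

using System.Collections.Concurrent;
using System.Net;
using System.Threading;

class Server
{
    static BlockingCollection<HttpListenerContext> _requests =
        new BlockingCollection<HttpListenerContext>();

    public static void Run()
    {
        // WORKER THREADS: each blocks until the listener hands it a request.
        for (int i = 0; i < 4; i++)
        {
            new Thread(() =>
            {
                foreach (HttpListenerContext context in _requests.GetConsumingEnumerable())
                    HandleRequest(context); // hypothetical: parse the data and respond
            }) { IsBackground = true }.Start();
        }

        // LISTENER THREAD: only receives and enqueues, so it gets back to
        // listening as quickly as possible.
        HttpListener listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:13274/");
        listener.Start();
        while (true)
            _requests.Add(listener.GetContext());
    }

    static void HandleRequest(HttpListenerContext context) { /* ... */ }
}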
Simplest Embedded Web Server Ever with HttpListener may help you get started.

Server multithreading overkill?

I'm creating a server-type application at the moment which will do the usual listening for connections from external clients and, when they connect, handle requests, etc.
At the moment, my implementation creates a pair of threads every time a client connects. One thread simply reads requests from the socket and adds them to a queue, and the second reads the requests from the queue and processes them.
I'm basically looking for opinions on whether or not you think having all of these threads is overkill, and importantly whether this approach is going to cause me problems.
It is important to note that most of the time these threads will be idle - I use wait handles (ManualResetEvent) in both threads. The Reader thread waits until a message is available and if so, reads it and dumps it in a queue for the Process thread. The Process thread waits until the reader signals that a message is in the queue (again, using a wait handle). Unless a particular client is really hammering the server, these threads will be sat waiting. Is this costly?
I've done a bit of testing - had 1,000 clients connected, continually nagging the server (so 2,000+ threads) - and it seemed to cope quite well.
I think your implementation is flawed. This kind of design doesn't scale because creating threads is expensive and there is a limit on how many threads can be created.
That is the reason most implementations of this type use a thread pool. That makes it easy to put a cap on the maximum number of threads while easily managing new connections and reusing the threads when the work is finished.
If all you are doing with your thread is putting items in a queue, then use the
ThreadPool.QueueUserWorkItem method to use the default .NET thread pool.
You haven't given enough information in your question to say for certain, but perhaps you now only need one other thread, constantly running, clearing down the queue; you can use a wait handle to signal when something has been added.
Just make sure to synchronise access to your queue or things will go horribly wrong.
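A minimal sketch of that single-drainer idea with a synchronized queue and a wait handle; reader threads call Post (directly or from ThreadPool.QueueUserWorkItem callbacks), and ProcessItem is a hypothetical stand-in for the actual work:

using System.Collections.Generic;
using System.Threading;

class QueueDrainer
{
    private readonly Queue<byte[]> _queue = new Queue<byte[]>();
    private readonly AutoResetEvent _signal = new AutoResetEvent(false);

    public QueueDrainer()
    {
        new Thread(DrainLoop) { IsBackground = true }.Start();
    }

    // Reader threads call this; access to the queue is synchronized.
    public void Post(byte[] item)
    {
        lock (_queue)
            _queue.Enqueue(item);
        _signal.Set(); // wake the drainer
    }

    private void DrainLoop()
    {
        while (true)
        {
            _signal.WaitOne(); // sleep until something has been added
            while (true)
            {
                byte[] item;
                lock (_queue)
                {
                    if (_queue.Count == 0) break;
                    item = _queue.Dequeue();
                }
                ProcessItem(item); // hypothetical: handle the request
            }
        }
    }

    private void ProcessItem(byte[] item) { /* ... */ }
}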
I advise using the following pattern. First you need a thread pool - built-in or custom. Have one thread that checks whether there is something available to read; if there is, it hands the work to a reader thread. The reader thread then puts the data into a queue, and a thread from the pool of processing threads picks it up. This minimizes the number of threads and the time they spend waiting.

Managing ThreadPool starvation within a multithreaded work queue processor?

I am investigating the design of a work queue processor where the QueueProcessor retrieves a Command Pattern object from the Queue and executes it in a new thread.
I am trying to get my head around a potential Queue lockup scenario where nested Commands may result in a deadlock.
E.G.
A FooCommand object is placed onto the queue which the QueueProcessor then executes in its own thread.
The executing FooCommand places a BarCommand onto the queue.
Assuming the maximum number of allowed threads is only 1, the QueueProcessor would be in a deadlocked state, since FooCommand waits forever for BarCommand to complete.
How can this situation be managed? Is a queue object the right object for the job? Are there any checks and balances that can be put into place to resolve this issue?
Many thanks. (The application uses C# / .NET 3.0.)
You could redesign things so that FooCommand doesn't use the queue to run BarCommand but runs it directly, or you could split FooCommand in two: have the first half stop immediately after queueing BarCommand, and have BarCommand queue the second half of FooCommand after it has done its work.
Queuing implicitly assumes an asynchronous execution model. By waiting for the command to exit, you are working synchronously.
Maybe you can split the command up into three parts: FooCommand1, which executes until BarCommand has to be sent; BarCommand itself; and finally FooCommand2, which continues after BarCommand has finished. These three commands can be queued separately. Of course, BarCommand should make sure that FooCommand2 is queued. A sketch of this follows.
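A minimal sketch of that continuation idea, modelling commands as plain delegates; the FooPart1/Bar/FooPart2 names are illustrative, and BlockingCollection needs .NET 4 (on 3.0 you'd use a locked Queue<Action> plus a wait handle):

using System;
using System.Collections.Concurrent;

class CommandQueue
{
    private readonly BlockingCollection<Action> _commands = new BlockingCollection<Action>();

    // The single QueueProcessor thread never waits on a nested command,
    // so one thread is always enough and no deadlock can occur.
    public void Run()
    {
        _commands.Add(FooPart1);
        foreach (Action command in _commands.GetConsumingEnumerable())
            command();
    }

    private void FooPart1()
    {
        // ... work up to the point where Bar is needed ...
        _commands.Add(Bar); // queue Bar instead of blocking on it
    }

    private void Bar()
    {
        // ... Bar's work ...
        _commands.Add(FooPart2); // Bar queues Foo's continuation when done
    }

    private void FooPart2()
    {
        // ... the rest of Foo, running after Bar has finished ...
    }
}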
For simple cases like this an additional monitoring thread that can spin off more threads on demand is helpful.
Basically, every N seconds, check whether any jobs have finished; if not, add another thread.
This won't necessarily handle even more complex deadlock problems, but it will solve this one.
My recommendation for the harder problem is to restrict waits to newly spawned work; in other words, you can only wait on something you started yourself. That way you never get deadlocks, since wait cycles are impossible in that situation.
If you are building the Queue object yourself there are a few things you can try:
Dynamically add new service threads. Use a timer and add a thread if the available thread count has been zero for too long.
If a command is trying to queue another command and wait for the result then you should synchronously execute the second command in the same thread. If the first thread simply waits for the second you won't get a concurrency benefit anyway.
I assume you want to queue BarCommand so it is able to run in parallel with FooCommand, but FooCommand will need the result at some later point. If this is the case then I would recommend using Future from the Parallel Extensions library.
Bart DeSmet has a good blog entry on this. Basically you want to do:
public void FooCommand()
{
    Future<int> BarFuture = new Future<int>(() => BarCommand());

    // Do Foo's processing - Bar will (may) be running in parallel.

    int barResult = BarFuture.Value;

    // More processing that needs barResult.
}
With libraries such as the Parallel Extensions I'd avoid "rolling your own" scheduling.
