My C# class must be able to process a high volume of events received via a TCP stream-style socket connection. The volume of event messages received from the TCP server by the class's socket is completely variable: sometimes it will receive only one event message in a period of ten seconds, and at other times it will receive sixty event messages within a second.
I am using Socket.ReceiveAsync to receive messages. ReceiveAsync returns true if the receive operation is pending, or false if data was already on the wire and the receive operation completed synchronously. If the operation is pending, the Socket will call my callback on an I/O completion thread; otherwise I call my own callback on the current (I/O completion) thread. Furthermore, mixed in with event messages I also receive responses to commands that were sent to this TCP server. Response messages are processed right away, individually, by firing off a thread-pool worker.
However, I would like to queue event messages until I have "enough" (N) of them OR until there are no more on the wire... and then fire off a thread-pool worker to process a batch of event messages. Also, I want all events to be processed sequentially, so I only want one thread-pool worker to be working on this at a time.
The processor of event messages need only copy the message buffer into an object, raise an event, and then release the message buffer back into the ring-buffer pool. So my question is: what do you think is the best strategy to accomplish this?
Do you need more info? Let me know. Thanks!!
I would not call 60 events per second high volume. At that low level of activity any socket processing method at all will be fine. I've handled 5,000 events per second on a single thread using hardware that's much less capable than current machines, just using select.
I will say that if you are looking to scale, handing off messages individually between threads is going to be a disaster. You need to batch, or your context switches will kill performance.
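One way the batching idea from the question could be sketched: accumulate received messages in a list under a lock and schedule at most one thread-pool worker at a time, so processing stays sequential. All names here (EventBatcher, ProcessBatch, the threshold N) are hypothetical, not from the original post.

```csharp
using System.Collections.Generic;
using System.Threading;

class EventBatcher
{
    private readonly object _sync = new object();
    private List<byte[]> _pending = new List<byte[]>();
    private bool _workerScheduled;   // true while a worker owns the batch
    private const int N = 32;        // arbitrary batch threshold

    // Called from the socket receive callback for each event message.
    public void Enqueue(byte[] message)
    {
        lock (_sync)
        {
            _pending.Add(message);
            // Schedule a worker only if none is running; this keeps
            // event processing sequential with a single worker.
            if (!_workerScheduled && _pending.Count >= N)
            {
                _workerScheduled = true;
                ThreadPool.QueueUserWorkItem(_ => Drain());
            }
        }
    }

    private void Drain()
    {
        while (true)
        {
            List<byte[]> batch;
            lock (_sync)
            {
                if (_pending.Count == 0)
                {
                    _workerScheduled = false; // nothing left; allow rescheduling
                    return;
                }
                batch = _pending;
                _pending = new List<byte[]>();
            }
            foreach (var msg in batch)
            {
                // Copy the buffer into an object, raise the event,
                // release the buffer back to the ring-buffer pool...
            }
        }
    }
}
```

The "no more on the wire" condition from the question could also trigger a drain of a partial batch, for example from the point where ReceiveAsync reports no synchronously available data.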
My application is a communications server that receives TCP messages from a web server and then re-broadcasts the message to a number of iPads. It's a Windows Forms application in C#. Program.cs creates an instance of the primary form, and that form creates four threads that do the communications work. There is a thread that listens for messages from the web server, a thread that processes the incoming messages into the data that needs to be transmitted, and a thread that handles sending the outbound messages. The fourth thread does database cleanup and spends 99% of its time sleeping.
The problem I'm seeing is that the GUI locks up when a load is placed on the system. One incoming message may represent 50 or 100 outgoing messages. While testing I'm restricting the system to only send 5 messages out at a time, so it requires a longer transmission time. The sending process is using async callbacks, but even if it wasn't, I can't understand why load on the thread could be stalling the GUI thread.
I've removed much of the cross-thread communication from the threads to the GUI for status updates. The pattern for communicating with the GUI is:
public void StatusOutput(string myString)
{
    if (this.lbStatus.InvokeRequired)
    {
        this.lbStatus.BeginInvoke(new DebugOutputInvoker(StatusOutput), myString);
    }
    else
    {
        lbStatus.Items.Add(myString);
        while (lbStatus.Items.Count >= 501)
        {
            lbStatus.Items.RemoveAt(0);
        }
        lbStatus.SelectedIndex = lbStatus.Items.Count - 1;
    }
} // StatusOutput() ...
Can anyone give me any advice on how to pursue this? I thought threads were completely isolated from the GUI and couldn't load it down like this.
Thanks!
As an update, I removed all thread-to-GUI communications and the GUI stopped locking up, so this proved it wasn't a problem in the GUI thread itself. I worked through the threads until I found that it's the TCPSender thread, which has a lot of async callback functions, potentially hundreds of them. Whenever this thread gets busy with its async calls, the GUI locks up. I suspect it has to do with callbacks happening while the thread has one of the GUI's methods in progress.
I've solved the issue by making a new thread that just collects data from the operational threads and then updates the user interface. General status messages that were previously transferred to the status display on the GUI thread are now queued and pushed to the GUI through this new thread.
The GUI remains responsive and actually seems to be able to display more data now.
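The pattern described above might look roughly like this (all names here are hypothetical): worker threads drop status messages into a thread-safe queue, and one dedicated thread drains the queue and marshals the updates onto the GUI thread.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// Worker threads call statusQueue.Enqueue(...) instead of touching the GUI.
ConcurrentQueue<string> statusQueue = new ConcurrentQueue<string>();

var updater = new Thread(() =>
{
    while (true)
    {
        string msg;
        while (statusQueue.TryDequeue(out msg))
        {
            string captured = msg;
            // BeginInvoke marshals the update onto the GUI thread
            // without blocking this updater thread.
            form.BeginInvoke((Action)(() => form.AppendStatus(captured)));
        }
        Thread.Sleep(100); // coarse polling is fine for status text
    }
});
updater.IsBackground = true;
updater.Start();
```

Here `form` and `AppendStatus` stand in for whatever form and status-display method the application actually uses.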
I have a static Queue collection. I have one thread that enqueues to it, and multiple waiting threads that need to be notified of any enqueue on the Queue collection so they can do a task. How do I achieve this in C#?
The thread that is inserting should never be blocked, but all the threads that are listening should be notified instantly, do some work with the data, and then go back to listening as soon as the job is over.
Actually, BlockingCollection was exactly what I needed. I initially thought it would block the whole listening thread; however, the listening thread fires events as I wanted while it is still waiting to consume from the producer.
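For reference, a minimal sketch of that setup (the names and counts here are made up): the producer enqueues without blocking (as long as the collection is unbounded), while each listener thread blocks until an item arrives. Note that BlockingCollection hands each item to exactly one consumer, not to every listener.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

var queue = new BlockingCollection<int>(); // unbounded: Add never blocks

// Several waiting consumer threads; each item goes to exactly one of them.
for (int i = 0; i < 3; i++)
{
    new Thread(() =>
    {
        foreach (var item in queue.GetConsumingEnumerable())
        {
            Console.WriteLine("processing " + item); // do the work here
        }
    }) { IsBackground = true }.Start();
}

// The producer is never blocked:
for (int i = 0; i < 10; i++) queue.Add(i);
```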
I am sending UDP messages through a .NET socket with the SendAsync method.
The question is:
What has happened until I get the completed event?
Has the message left the device on the wire? Can the elapsed time be used to judge the outgoing network speed?
Background:
I need to know the rate at which I can initiate UDP send operations. If the message would have left the device over the network at the time of the completed event, I would maybe add some buffer and initiate the next send operation. I don't care about lost messages and even about the actual transmission rate. It is just to have a rough idea at what rate I can send my messages out.
No, not at all. Not just because you use UDP, but more importantly because you use SendAsync. In general, methods that end with Async do not block the current thread; they run in the background, so you cannot use them to measure when the work actually finished. To understand this better, it is worth studying multi-threading: "Threading enables your C# program to perform concurrent processing so you can do more than one operation at a time," and Async methods operate this way, i.e. asynchronously. So the elapsed time until the completed event cannot give you a meaningful measure of the outgoing network speed.
I've run into a problem while writing an async multi-server network app in C#. I have many jobs being taken care of by the thread pool, and these include the writes to the network sockets. This ended up allowing for the case where more than one thread could write to the socket at the same time and discombobulate my outgoing messages. My idea for getting around this was to implement a queue system where, whenever data got added to the queue, the socket would write it.
My problem is, I can't quite wrap my head around the architecture of something of this nature. I imagine having a queue object that fires an event whenever data gets added to the queue. The event then writes the data being held in the queue. But that won't work, because if two threads add to the queue simultaneously, even if the queue is thread-safe, events will still be fired for both and I'll run into the same problem. So then maybe there's some way to hold off an event if another is in progress; but then how do I continue that event once the first finishes, without simply blocking the thread on a mutex or something? This wouldn't be so hard if I weren't trying to stay strict with my "block nothing" architecture, but this particular application requires that I allow the thread-pool threads to keep doing their thing.
Any ideas?
While similar to Porges' answer, this differs a bit in implementation.
First, I usually don't queue the bytes to send, but objects, and serialize them in the sending thread; but I guess that's a matter of taste.
The bigger difference is the use of a ConcurrentQueue (in addition to the BlockingCollection).
So I'd end up with code similar to:
BlockingCollection<Packet> sendQueue = new BlockingCollection<Packet>(new ConcurrentQueue<Packet>());

while (true)
{
    var packet = sendQueue.Take(); // this blocks if there are no items in the queue
    SendPacket(packet);            // send your packet here
}
The key takeaway here is that you have one thread looping over this code, while all other threads can add to the queue in a thread-safe way (both BlockingCollection and ConcurrentQueue are thread-safe).
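Wiring that up might look like this (a sketch; Packet and SendPacket are placeholders for whatever the application actually uses): the send loop runs on its own dedicated thread, and any other thread enqueues packets.

```csharp
using System.Threading;

var sendThread = new Thread(() =>
{
    while (true)
    {
        var packet = sendQueue.Take(); // blocks until a packet is available
        SendPacket(packet);            // only this thread ever touches the socket
    }
}) { IsBackground = true };
sendThread.Start();

// From any worker thread; thread-safe, and never blocks on an unbounded queue:
sendQueue.Add(packet);
```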
Have a look at Processing a queue of items asynchronously in C#, where I answered a similar question.
Sounds like you need one thread writing to the socket synchronously and a bunch of threads writing to a queue for that thread to process.
You can use a blocking collection (BlockingCollection<T>) to do the hard work:
// somewhere there is a queue:
BlockingCollection<byte[]> queue = new BlockingCollection<byte[]>();
// in socket-writing thread, read from the queue and send the messages:
foreach (byte[] message in queue.GetConsumingEnumerable())
{
    // just an example... obviously you'd need error handling and stuff here
    socket.Send(message);
}
// in the other threads, just enqueue messages to be sent:
queue.Add(someMessage);
The BlockingCollection will handle all synchronization. You can also enforce a maximum queue length and other fun things.
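For instance (a sketch, with an arbitrary capacity): the bounded constructor makes producers block once the queue is full, and CompleteAdding lets the consumer's foreach loop drain the remaining items and then exit cleanly.

```csharp
using System.Collections.Concurrent;

// At most 1000 pending messages; Add blocks once the bound is reached,
// which applies back-pressure to fast producers.
BlockingCollection<byte[]> queue = new BlockingCollection<byte[]>(boundedCapacity: 1000);

// At shutdown: no more items will be added; GetConsumingEnumerable
// yields the remaining items and then ends, so the writer thread exits.
queue.CompleteAdding();
```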
I don't know C#, but what I would do is have the event trigger the socket manager to start pulling from the queue and writing things out one at a time. If it is already running, the trigger does nothing, and once there is nothing in the queue, it stops.
This solves the problem of two threads writing to the queue simultaneously because the second event would be a no-op.
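In C#, that trigger-if-idle idea can be sketched with an atomic flag (all names here are hypothetical): only the trigger that flips the flag from idle to busy starts a drain; concurrent triggers are no-ops, and nothing blocks.

```csharp
using System.Collections.Concurrent;
using System.Threading;

ConcurrentQueue<byte[]> outbox = new ConcurrentQueue<byte[]>();
int sending = 0; // 0 = idle, 1 = a drain is in progress

void EnqueueAndTrigger(byte[] message)
{
    outbox.Enqueue(message);
    // Only the caller that flips 0 -> 1 starts a drain; other triggers no-op.
    if (Interlocked.CompareExchange(ref sending, 1, 0) == 0)
        ThreadPool.QueueUserWorkItem(_ => Drain());
}

void Drain()
{
    byte[] message;
    while (outbox.TryDequeue(out message))
        Send(message); // hypothetical: the single writer to the socket
    Volatile.Write(ref sending, 0);
    // Re-check: an item may have arrived after the last TryDequeue
    // but before the flag was cleared; if so, restart the drain.
    if (!outbox.IsEmpty && Interlocked.CompareExchange(ref sending, 1, 0) == 0)
        ThreadPool.QueueUserWorkItem(_ => Drain());
}
```

The re-check at the end closes the race where a producer enqueues just as the drain is finishing, which would otherwise strand a message in the queue.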
You could have a thread-safe queue that all your worker threads write their results to, then have another thread that polls the queue and sends results when it sees them waiting.
I have a serial connection to a device that I want to periodically monitor at a frequency of 1 to 1/10 Hz (not decided yet).
I would like the communication to be non-blocking, and thus I have decided to put the communication in a thread of some kind and let the main application receive data via events fired from the thread.
I was thinking about making a thread at application start that sends and receives data continuously. The thread empties a queue containing commands to send to the device, then listens for responses, firing an event when a response is complete.
The thread is put to sleep when no commands are in the queue, and woken when commands enter the queue.
Is this a good way of doing it?
I was thinking about maybe using some of established features of the framework, like BackgroundWorker or Task, since these might have advantages over what I'm doing.
Maybe there are other and better ways of accomplishing this?
That isn't frequent enough to justify burning up an expensive resource like a Thread. Use a System.Timers.Timer or System.Threading.Timer (better) instead. Write the device query command in the callback. Use the SerialPort.DataReceived event to receive the response and fire the event. Now everything runs on cheap threadpool threads.
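Sketched out, that might look like the following (the port name, baud rate, and query command are placeholders, and a real response would usually need to be assembled across DataReceived calls rather than read in one go):

```csharp
using System;
using System.IO.Ports;

var port = new SerialPort("COM1", 9600);
port.DataReceived += (s, e) =>
{
    // Runs on a threadpool thread; read what has arrived so far.
    string response = port.ReadExisting();
    OnDeviceResponse(response); // hypothetical: raise the app's event here
};
port.Open();

// Query the device once per second (1 Hz) from a timer callback,
// which also runs on a threadpool thread.
var timer = new System.Timers.Timer(1000);
timer.Elapsed += (s, e) => port.WriteLine("QUERY"); // hypothetical command
timer.Start();
```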
I found http://social.msdn.microsoft.com/forums/en-US/netfxbcl/thread/e36193cd-a708-42b3-86b7-adff82b19e5e/ which explains a bit more of the architecture of SerialPort.