Queue implementation in C#

I'm dealing with a hardware resource that can only handle one command at a time. I'm going to expose some of its API functions via a web interface, so obviously there's a good chance more than one command will be sent at a time. I have decided that queuing these commands when they're submitted is the best way to ensure serial processing.
I'm planning on implementing the queue in a static class. The web app code-behind will add a command by calling the method corresponding to the command it wants. I want the calling method to wait until it gets the output of its command, so no async magic is required.
Am I doing this right? Is there a better way?
How do I start implementing the queue in C# (I usually work with Java)? I assume I'll need some sort of Event to signal that a job has been added, and a Handler to initiate processing of the queue...
I'm using .NET Framework 4.

You can use the ConcurrentQueue class for your implementation and have a dedicated thread to process items in the queue.
For the waiting part you can use an AutoResetEvent: producers pass the event instance to the singleton class along with the request, then call WaitOne(), which blocks until the processor has signaled that processing is completed by calling Set().
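A minimal sketch of that pattern (the class and method names here are illustrative, and RunOnHardware stands in for the real device call): each request carries its own AutoResetEvent, the caller blocks on WaitOne(), and a single dedicated processor thread calls Set() once the command has run.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

static class CommandQueue
{
    // Each work item carries its own event so its caller can block until done.
    class WorkItem
    {
        public string Command;
        public string Result;
        public readonly AutoResetEvent Done = new AutoResetEvent(false);
    }

    static readonly ConcurrentQueue<WorkItem> queue = new ConcurrentQueue<WorkItem>();
    static readonly AutoResetEvent itemAdded = new AutoResetEvent(false);

    static CommandQueue()
    {
        var worker = new Thread(ProcessLoop) { IsBackground = true };
        worker.Start();
    }

    // Called from the web code-behind; blocks until the command has run.
    public static string Execute(string command)
    {
        var item = new WorkItem { Command = command };
        queue.Enqueue(item);
        itemAdded.Set();       // wake the processor thread
        item.Done.WaitOne();   // block until the processor signals completion
        return item.Result;
    }

    static void ProcessLoop()
    {
        while (true)
        {
            itemAdded.WaitOne();
            WorkItem item;
            while (queue.TryDequeue(out item))
            {
                item.Result = RunOnHardware(item.Command); // strictly one at a time
                item.Done.Set();                           // release the waiting caller
            }
        }
    }

    static string RunOnHardware(string cmd)
    {
        return "ok:" + cmd; // placeholder for the real hardware call
    }
}
```

Because only one thread ever dequeues, the hardware sees commands strictly serially no matter how many web requests arrive at once.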

Sounds like a good approach EXCEPT: use the generic Queue<T> collection class. Do not write your own! You would be reinventing a well-built wheel.

Related

Asynchronous fire and forget in c#

Currently I am about to develop logging for a C# application into a SQL Server table.
I have a designated class called Logger that has a static method writeToLog().
I just want to call that static function without blocking the calling thread.
How is this possible in C#, cleanly and quickly?
The functions don't return anything; they are just fire and forget.
Thanks for your advice.
There is a chance that the several available logging libraries out there have already put some thought into performance.
If you still need to develop a lightweight solution yourself, I can think of two ways:
1.) Create a Task that runs the logging function, without awaiting it.
2.) The Log method saves the log information in a queue that is written to the SQL database by a background thread.
I would recommend the second way, because the logging call itself can stay a synchronous function call without the overhead of creating a Task.
Another argument for this approach is that it makes it possible to guarantee the order of log messages. With a Task per message, the order of execution is not defined.
And a last argument: it might be more performant to write blocks of messages to the SQL table. With the queue you will be able to write messages in a bulk operation.
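A sketch of the second option under those assumptions (Logger, WriteBatch, and the Written counter are illustrative; WriteBatch stands in for the actual SQL bulk insert): Log() is a cheap synchronous enqueue, and one background task drains the queue in batches, which preserves order and enables bulk writes.

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

static class Logger
{
    public static int Written; // illustration only, so the effect is observable

    static readonly BlockingCollection<string> messages = new BlockingCollection<string>();

    static Logger()
    {
        // One long-running consumer keeps message order deterministic.
        Task.Factory.StartNew(WriteLoop, TaskCreationOptions.LongRunning);
    }

    public static void Log(string message)
    {
        messages.Add(message); // returns immediately; never blocks on the database
    }

    static void WriteLoop()
    {
        var batch = new List<string>();
        foreach (var msg in messages.GetConsumingEnumerable())
        {
            batch.Add(msg);
            // Drain whatever else is already queued so we write in bulk.
            string next;
            while (batch.Count < 100 && messages.TryTake(out next))
                batch.Add(next);
            WriteBatch(batch);
            batch.Clear();
        }
    }

    static void WriteBatch(List<string> batch)
    {
        // Real code would do a bulk INSERT or SqlBulkCopy here.
        Interlocked.Add(ref Written, batch.Count);
    }
}
```

The batching loop is why this beats one Task per message: a single round trip can carry up to 100 rows instead of one.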

Multithreaded Service Engineering Questions

I am trying to leverage .NET 4.5 new threading capabilities to engineer a system to update a list of objects in memory.
I worked with multithreading years ago in Java and I fear that my skills have become stagnant, especially on the .NET side.
Basically, I have written a Feed abstract class and I inherit from that for each of my threads. The thread classes themselves are simple and run fine.
Each of these classes runs endlessly: it blocks until an event occurs and then updates the List.
So the first question is, how might I keep the parent thread alive while these threads run? Currently, in my dev console app, I've worked around this with a Console.Read().
Second, I would like to set up a repository of List objects that I can access from the parent thread. How would I update those Lists from the child thread and expose them to another system? Trying to avoid SQL. Shared memory?
I hope I've made this clear. I was hoping for something like this: Writing multithreaded methods using async/await in .Net 4.5
except, we need the adaptability of external classes and of course, we need to somehow expose those Lists.
You can run the "parent" thread in a while loop with a flag to stop it:

while (flag)
{
    // run the thread's work here
}
You can expose a public List as a property of some class to hold your data. Remember to lock access in multithreaded code.
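For example, a minimal sketch (the names are illustrative): one lock object guards every access, and readers get a snapshot copy rather than the live list, so they never observe it mid-mutation.

```csharp
using System.Collections.Generic;

class ListRepository
{
    private readonly object sync = new object();
    private readonly List<double> items = new List<double>();

    // Child (feed) threads call this to publish updates.
    public void Add(double value)
    {
        lock (sync) { items.Add(value); }
    }

    // Other systems read a snapshot instead of the live list,
    // so iteration can't throw while a feed thread is adding.
    public List<double> Snapshot()
    {
        lock (sync) { return new List<double>(items); }
    }
}
```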
If the 'parent' thread is supposed to wait during the processing it could simply await the call(s) to the async method(s).
If it has to wait for specific events you could use a signaling object such as a Barrier.
If the thread has to 'do' things while waiting you could check the availability of the result or the progress: How to do progress reporting using Async/Await
If you're using tasks, you can use Task.WaitAll to wait for the tasks to complete. By default, Tasks and async/await use your system's ThreadPool, so I'd avoid placing anything but relatively short-running work there.
If you're using System.Threading.Thread (I prefer using these for long running threads), check out the accepted answer here: C# Waiting for multiple threads to finish
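A sketch of the Task case (FeedRunner and RunFeed are illustrative stand-ins for the Feed subclasses, shortened so they terminate): the parent thread blocks in Task.WaitAll, which keeps it alive without the Console.Read workaround.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

static class FeedRunner
{
    static int eventsHandled;

    // Stand-in for a Feed subclass's endless loop (it would normally
    // block on events; here it just records one update and returns).
    static void RunFeed()
    {
        Interlocked.Increment(ref eventsHandled);
    }

    // The parent starts the feeds and then blocks in Task.WaitAll,
    // staying alive until every feed stops.
    public static int RunAll()
    {
        var feeds = new[]
        {
            // LongRunning hints the scheduler to use dedicated threads
            // instead of tying up ThreadPool workers.
            Task.Factory.StartNew(RunFeed, TaskCreationOptions.LongRunning),
            Task.Factory.StartNew(RunFeed, TaskCreationOptions.LongRunning)
        };
        Task.WaitAll(feeds);
        return eventsHandled;
    }
}
```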
If you can fetch batches of data, you can expose services allowing access to the shared objects using self hosted Web API or something like NancyFX. WCF and remoting are also options if you prefer binary communication.
Shared memory, keep-alive TCP connections or UDP are options if you have many small transactions. Perhaps you could use ZeroMQ (it's not a traditional queue) with the C# binding they provide?
For concurrent access to the lists take a look at the classes in System.Collections.Concurrent before implementing your own locking.

How to marshal calls to specific threads in C# using TPL

I have a situation where I have a polling thread for a TCPClient (is that the best plan for a discrete TCP device?) which aggregates messages and occasionally responds to those messages by firing off events. The event producer really doesn't care much if the thread is blocked for a long time, but the consumer's design is such that I'd prefer to have it invoke the handlers on a single worker thread that I've got for handling a state machine.
The question then is this. How should I best manage the creation, configuration (thread name, is background, etc.) lifetime, and marshaling of calls for these threads using the Task library? I'm somewhat familiar with doing this explicitly using the Thread type, but when at all possible my company prefers to do what we can just through the use of Task.
Edit: I believe what I need here will be based around a SynchronizationContext on the consumer's type that ensures that tasks are schedules on a single thread tied to that context.
The question then is this. How should I best manage the creation, configuration (thread name, is background, etc.) lifetime, and marshaling of calls for these threads using the Task library?
This sounds like a perfect use case for BlockingCollection<T>. This class is designed specifically for producer/consumer scenarios, and allows you to have any threads add to the collection (which acts like a thread safe queue), and one (or more) thread or task call blockingCollection.GetConsumingEnumerable() to "consume" the items.
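A sketch of that shape (the class name is illustrative): producers can Post from any thread, and the single consumer thread drains GetConsumingEnumerable, so every handler runs on the one state-machine thread.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

class SingleThreadDispatcher
{
    readonly BlockingCollection<Action> work = new BlockingCollection<Action>();
    readonly Thread worker;

    public SingleThreadDispatcher()
    {
        // Creation and configuration (name, background flag) happen once here.
        worker = new Thread(Run) { Name = "StateMachine", IsBackground = true };
        worker.Start();
    }

    // Any thread (e.g. the TCP polling thread) can post handlers here.
    public void Post(Action handler)
    {
        work.Add(handler);
    }

    public void Shutdown()
    {
        work.CompleteAdding(); // GetConsumingEnumerable ends once drained
        worker.Join();
    }

    void Run()
    {
        foreach (var handler in work.GetConsumingEnumerable())
            handler(); // every handler runs on this single thread
    }
}
```

Because only one thread executes the handlers, the state machine needs no further locking.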
You could consider using TPL Dataflow, where you set up an ActionBlock<T> that you push messages into from your TCP thread; TPL Dataflow will then take care of the rest, scaling out the processing of the actions as much as your hardware can handle. You can also control exactly how much processing of the actions happens by configuring the ActionBlock<T> with a MaxDegreeOfParallelism.
Since processing sometimes can't keep up with the flow of incoming data, you might want to consider "linking" a BufferBlock<T> in front of the ActionBlock<T> to ensure that the TCP processing thread doesn't get too far ahead of what you can actually process. This would have the same effect as using BlockingCollection<T> with a bounded capacity.
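A sketch of that configuration, assuming the System.Threading.Tasks.Dataflow package is referenced (TcpMessagePump is an illustrative name): MaxDegreeOfParallelism = 1 keeps handling on a single worker, and BoundedCapacity provides the back-pressure, since SendAsync only completes once there is room in the block's queue.

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow; // NuGet package: System.Threading.Tasks.Dataflow

static class TcpMessagePump
{
    public static List<string> Process(IEnumerable<string> messages)
    {
        var handled = new List<string>();
        var processor = new ActionBlock<string>(
            msg => handled.Add(msg), // stand-in for the real message handler
            new ExecutionDataflowBlockOptions
            {
                MaxDegreeOfParallelism = 1, // single consumer, order preserved
                BoundedCapacity = 100       // back-pressure on the TCP thread
            });

        foreach (var msg in messages)
            processor.SendAsync(msg).Wait(); // blocks the producer only when full

        processor.Complete();        // no more input
        processor.Completion.Wait(); // drain remaining items
        return handled;
    }
}
```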
Finally, note that I'm linking to .NET 4.5 documentation because it's easiest, but TPL DataFlow is available for .NET 4.0 via a separate download. Unfortunately they never made a NuGet package out of it.

What is the difference using async/await and Task for time critical processing?

Let's say I have a method that is constantly receiving input every 10 ms and adds the data to a Queue<T>; another thread dequeues, does some processing, and writes to a file to keep everything flowing nicely. The reason for this is that processing takes longer than 10 ms.
In updating this method for .NET 4, I would start a new Task with the LongRunning option to ensure a new thread is created, have a BlockingCollection<T> to add data to, and in the task method have the BlockingCollection call GetConsumingEnumerable to process the data and write it to file.
Firstly, I'm not entirely sure whether I need the Task to create a new thread or not, but in my ignorance this seems the most beneficial way of doing it.
Secondly, with the introduction of the async and await keywords, I could possibly rewrite this again so that when data comes in, I call a method marked async which does the processing and then calls await FileStream.WriteAsync. Will this ensure that the data coming in 10 ms apart is handled okay and there is no backlog? I have found that these new keywords don't create new threads but just handle execution in a timely manner somehow.
In my scenario will this be any good to me or should I stick to creating new threads with the long running Task and BlockingCollection?
If you're receiving input every 10ms and processing takes longer than 10ms, then you will not be able to keep up unless you're doing parallel processing on the back end.
It sounds like a producer/consumer approach is best for your situation. You can use BlockingCollection<T> with separate threads/tasks, but you should also consider the Dataflow library (part of the new VS Async support). The Dataflow library allows you to set up a pipeline (or mesh) that defines how your data flows through the system; tasks are automatically created as needed.
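A sketch of that producer/consumer shape with BlockingCollection<T> (SamplePipeline is an illustrative name, and the list write stands in for the processing and file write): the collection is bounded so the 10 ms producer blocks under back-pressure instead of building an unbounded backlog if the writer falls behind.

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;

static class SamplePipeline
{
    public static List<int> Run(int sampleCount)
    {
        var samples = new BlockingCollection<int>(boundedCapacity: 1000);
        var written = new List<int>();

        // LongRunning hints the scheduler to give the consumer its own thread.
        var writer = Task.Factory.StartNew(() =>
        {
            foreach (var s in samples.GetConsumingEnumerable())
                written.Add(s); // stand-in for processing + file write
        }, TaskCreationOptions.LongRunning);

        for (int i = 0; i < sampleCount; i++) // stand-in for the 10 ms input loop
            samples.Add(i);

        samples.CompleteAdding(); // lets GetConsumingEnumerable finish
        writer.Wait();
        return written;
    }
}
```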
The async and await keywords are meant to help you write asynchronous calls to functions/APIs that take a long time to respond. The compiler wraps the call in two methods: one that launches the task and one that acts as the callback when the call is done.
In your case there seems to be no point in using async and await, since you are launching a separate task to handle the work. There is no third thing that needs to happen after the work has been done.

Are calls synchronous in WCF?

I'm writing an App using WCF where clients subscribe to a server and then updates get pushed back to the clients.
The subscribers subscribe to the server using a DuplexPipeChannel calling a Subscribe() method on the server.
The server maintains a List<> of subscribers and when there is data to push out to the subscribers it calls a PushData() method.
My intention is to iterate through the list of subscribers calling the push method on each of them in turn.
What I want to know is: Is calling the push method on my Subscriber blocking? Will a failure of connectivity or delay in connecting to one of the subscribers cause the rest of the push calls to be delayed (or worse fail)?
I'm sorry if this is an obvious question, but I've been mostly a .Net 2.0 person up until now so I know very little about WCF.
My WCF code is loosely based on this tutorial.
Another Question
Assuming it is synchronous, am I better off spawning a new thread to deal with the client side requests or would I be better off spawning a new thread for each "push serverside?"
WCF calls are synchronous by default, although they can be configured to be asynchronous; see Jarrett's answer below. Take a look here. Every message you send will receive a result back, whether you are actually expecting data or not.
Whether the call blocks depends on what your server does. If PushData on the server actually iterates through the subscriber list and sends a message to each, it will block. If PushData only enqueues the data and another thread handles sending it to the subscribers, it will only block while the server enqueues the data and returns.
Hope this helps.
Edit: Regarding spawning threads client-side vs. server-side: server-side. If a client call takes a while, that's fine, but if it takes a long time because the server is actually sending out calls to other clients within that same call, then something is wrong. I would not spawn a new thread each time, though. Just create a producer/consumer pattern on your server side so that whenever a data item is queued, a consumer picks it up. You can even have multiple consumers.
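A sketch of that server-side pattern (PushDispatcher is an illustrative name; SendToSubscriber stands in for the duplex callback, and the Pushed counter exists only so the example is observable): PushData merely enqueues and returns, and one or more consumer tasks do the actual sends, so a slow subscriber never delays the caller.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

class PushDispatcher
{
    public static int Pushed; // illustration only

    readonly BlockingCollection<string> pending = new BlockingCollection<string>();
    readonly Task[] consumers;

    public PushDispatcher(int consumerCount)
    {
        consumers = new Task[consumerCount];
        for (int i = 0; i < consumerCount; i++)
            consumers[i] = Task.Factory.StartNew(Consume, TaskCreationOptions.LongRunning);
    }

    // The service method only queues the item and returns at once.
    public void PushData(string data)
    {
        pending.Add(data);
    }

    public void Shutdown()
    {
        pending.CompleteAdding(); // consumers exit once the queue drains
        Task.WaitAll(consumers);
    }

    void Consume()
    {
        foreach (var data in pending.GetConsumingEnumerable())
            SendToSubscriber(data); // stand-in for the duplex callback to a client
    }

    static void SendToSubscriber(string data)
    {
        Interlocked.Increment(ref Pushed);
    }
}
```

With multiple consumers, one unresponsive subscriber stalls only its own consumer while the others keep pushing.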
If you right-click on the Service Reference, you have the option to create Async calls. (There's a checkbox on the setup dialog.) I usually create Async methods and then listen for a result. While it is a bit more work, I can write a much more responsive application using async service operations.
