Long running process interlock - C#

I'm creating a webservice + service bus project where a user can do something like:
public void ExecuteLongProcess(DateTime fromDate, string aggregateId) { }
This method returns immediately, but sends a request for the operation over the bus.
My problem starts when multiple users ask for the long process over the same aggregateId while another one is already running.
The solution I'm thinking about is a Task that runs continuously and looks in a Queue<LongProcessTask> for an operation that must be executed, so that I run only one process at a time (a future implementation could run multiple processes when the aggregateIds differ).
This way I don't overlap long-running processes over the same aggregate.
Other ideas?

I have created a TaskRunner that instantiates some continuously running Tasks (the number depending on processor cores) that look in a concurrent queue and run each pending operation.
TaskRunner gets the handler for each operation type from Windsor, in order to leave the processing of each operation to a separate class.
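In outline it looks roughly like this (a simplified sketch; IOperation, IOperationHandler and the resolver delegate stand in for the real Windsor pieces). Note that as written, two operations with the same aggregateId can still run at the same time, which is exactly the concern raised below:

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public interface IOperation { string AggregateId { get; } }
public interface IOperationHandler { void Handle(IOperation operation); }

public class TaskRunner
{
    private readonly BlockingCollection<IOperation> _queue =
        new BlockingCollection<IOperation>(new ConcurrentQueue<IOperation>());
    private readonly Func<IOperation, IOperationHandler> _resolveHandler;

    public TaskRunner(int workerCount, Func<IOperation, IOperationHandler> resolveHandler)
    {
        _resolveHandler = resolveHandler; // e.g. wraps a container.Resolve call
        for (int i = 0; i < workerCount; i++)
            Task.Factory.StartNew(Consume, TaskCreationOptions.LongRunning);
    }

    public void Enqueue(IOperation operation)
    {
        _queue.Add(operation);
    }

    private void Consume()
    {
        // GetConsumingEnumerable blocks while the queue is empty,
        // so idle workers sleep instead of spinning.
        foreach (IOperation operation in _queue.GetConsumingEnumerable())
            _resolveHandler(operation).Handle(operation);
    }
}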

In your answer, you say that multiple threads will take tasks from a concurrent queue. In this case there is a chance that two tasks with the same aggregateId may run at the same time. I do not know whether this is a problem for you; if so, then you must use a different queue for each aggregateId.
If task order is not an issue then I recommend using a BlockingCollection, because there is a question to answer: what are your multiple consumer threads going to do when there is no task in the concurrent queue?
while (some_condition_to_keep_thread_alive)
{
    if (!queue.TryDequeue(...))
        continue;

    // do the job
}
This code will make your cores go crazy if the queue is empty. You need a blocking mechanism, and BlockingCollection will do this for you.
Do you insist on using ConcurrentQueue? OK, then SemaphoreSlim is your friend.
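A minimal sketch of that pairing (modelling work items as plain Action delegates is an assumption): the semaphore count mirrors the queue count, so idle consumers block instead of burning a core.

using System;
using System.Collections.Concurrent;
using System.Threading;

public class SemaphoreQueue
{
    private readonly ConcurrentQueue<Action> _queue = new ConcurrentQueue<Action>();
    private readonly SemaphoreSlim _gate = new SemaphoreSlim(0);

    public void Enqueue(Action job)
    {
        _queue.Enqueue(job);
        _gate.Release(); // one permit per queued item
    }

    public void ConsumeLoop(CancellationToken token)
    {
        while (!token.IsCancellationRequested)
        {
            try
            {
                _gate.Wait(token); // blocks while the queue is empty
            }
            catch (OperationCanceledException)
            {
                break; // shutting down
            }

            Action job;
            if (_queue.TryDequeue(out job))
                job(); // do the job
        }
    }
}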

Related

How do I monitor and limit the number of QueueBackgroundWorkItem threads in ASP.net?

I have an ASP.NET application acting as an interface between systems which tend to send data in bursts via multiple requests. I would like to kick off a background task to perform some processing, but I really want only one single background task doing it. I can use HostingEnvironment.QueueBackgroundWorkItem, but that will indiscriminately launch a thread for each incoming request, which is a problem.
When I launch a background process I want it to queue up the work it has (in the connected database) and then check whether another background process is running. If there is one, it should finish, because the other process will handle the queued work. If there is no other background process running, I want it to start processing the queued work until there is no more to do, and then stop.
The process is not a heavy or long-running task, but the main constraint is that everything is processed in strict order, which makes parallel threading risky. In a single process it's easy to ensure everything gets processed in order.
How do I achieve this without shifting to an external service?
Seems like a classic producer-consumer scenario. Create a BlockingCollection that producers enqueue to. Create one permanent LongRunning Task that drains that collection.
You can drain in batches if you want.
This would not work with QueueBackgroundWorkItem, because you need to eventually exit the work that you put into QueueBackgroundWorkItem so that the worker process can shut down gracefully.
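A minimal sketch of that arrangement (WorkItem and the processing body are placeholders): requests only ever Add to the collection, and the single LongRunning task is the only consumer, so work is processed in order.

using System.Collections.Concurrent;
using System.Threading.Tasks;

public class WorkItem { /* payload */ }

public static class BackgroundProcessor
{
    private static readonly BlockingCollection<WorkItem> Queue =
        new BlockingCollection<WorkItem>();

    static BackgroundProcessor()
    {
        // The one permanent consumer; LongRunning gives it a dedicated thread.
        Task.Factory.StartNew(Drain, TaskCreationOptions.LongRunning);
    }

    // Called from request-handling code; returns immediately.
    public static void Enqueue(WorkItem item)
    {
        Queue.Add(item);
    }

    private static void Drain()
    {
        // Strict ordering holds because this is the only consumer.
        foreach (WorkItem item in Queue.GetConsumingEnumerable())
            Process(item);
    }

    private static void Process(WorkItem item) { /* ... */ }
}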

C# Multithreading Model

I have a C# single-threaded application and am currently working on making it multi-threaded with the use of thread pools. I am stuck deciding which model would work for my problem.
Here's my current scenario:
while (true)
{
    DoSomeTask();
    Thread.Sleep(time); // wait before the next iteration
}
And this is repeated almost forever. The new scenario has multiple threads, each doing the above. I could easily implement it by spawning a number of threads based on the tasks I have to perform, where all the threads perform some task and wait forever.
The issue is that I may not know the number of tasks, so I can't just blindly spawn 500 threads. I thought about using a thread pool, but because almost every thread loops forever and won't ever be freed up for new tasks in the queue, I am not sure which other model to use.
I am looking for an idea or solution where I could break the loop in the thread and free it up instead of waiting, but come back and resume the same task after the wait (when the time has elapsed, using something like a timer, or by checking the timestamp of when the task last ran).
With this I could use a limited number of threads (as in a thread pool) and serve the tasks that come in while the old threads are (virtually) waiting.
Any help is really appreciated.
If all you have is a bunch of things that happen periodically, it sounds like what you want is a bunch of timers. Create a timer for each task, to fire when appropriate. So if you have two different tasks:
using System.Threading;

// Task1 happens once per minute
Timer task1Timer = new Timer(
    s => DoTask1(),
    null,
    TimeSpan.FromMinutes(1),
    TimeSpan.FromMinutes(1));

// Task2 happens once every 47 seconds
Timer task2Timer = new Timer(
    s => DoTask2(),
    null,
    TimeSpan.FromSeconds(47),
    TimeSpan.FromSeconds(47));
The timer is a pretty lightweight object, so having a whole bunch of them isn't really a problem. The timer only takes CPU resources when it fires. The callback method will be executed on a pool thread.
There is one potential problem. If you have a whole lot of timers all with the same period, then the callbacks will all be called at the same time. The threadpool should handle that gracefully by limiting the number of concurrent tasks, but I can't say for sure. But if your wait times are staggered, this is going to work well.
If you have small wait times (less than a second), then you probably need a different technique. I'll detail that if required.
With this design, you only have one thread blocked at any time.
Have one thread (the master thread) waiting on a concurrent blocking collection, such as the BlockingCollection. This thread will be blocked by a call to TryTake until something is placed in the collection, or after a certain amount of time has passed via a timeout passed into the call (more on this later).
Once it is unblocked, it may have a unit of work to be processed. It checks to see whether there is one (i.e., the TryTake call didn't time out), and then whether there is capacity to perform this work; if so, it queues up a thread (pool, Task or whatevs) to service the work. This master thread then goes back to the blocking collection and tries to take another unit of work. The cycle continues.
As a unit of work is begun, it will be noted so that the main thread can see how many threads are working. Once this unit is completed, the notation will be removed. The thread is then freed.
You want to use a timeout so that if it is judged that too many operations are running concurrently, you will be able to re-evaluate this a set period of time down the road. Otherwise, that unit of work sits in the blocking collection until a new unit is added, which is not optimal.
Outside users of this instance can queue up new units of work by simply dropping them in the collection.
You can use a cancellation token to immediately unblock the thread when it's time to shut down operations. Have the worker operations take cancellation tokens as well so they can halt on shutdown.
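Put together, the loop might look something like this sketch (the one-second timeout, the capacity of 4 and the Action work units are all assumptions):

using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

public class MasterDispatcher
{
    private readonly BlockingCollection<Action> _work = new BlockingCollection<Action>();
    private int _running;                // how many units are currently being worked
    private const int MaxConcurrent = 4; // assumed capacity policy

    public void Enqueue(Action unit)
    {
        _work.Add(unit);
    }

    public void Run(CancellationToken token)
    {
        while (!token.IsCancellationRequested)
        {
            Action unit;
            try
            {
                // The timeout lets us re-evaluate capacity even if nothing new
                // arrives; the token unblocks us immediately on shutdown.
                if (!_work.TryTake(out unit, 1000, token))
                    continue;
            }
            catch (OperationCanceledException)
            {
                break; // shutting down
            }

            // Crude back-pressure: wait for a free slot before servicing.
            while (!token.IsCancellationRequested &&
                   Thread.VolatileRead(ref _running) >= MaxConcurrent)
                Thread.Sleep(50);
            if (token.IsCancellationRequested)
                break;

            Interlocked.Increment(ref _running); // note the unit as begun
            Task.Factory.StartNew(() =>
            {
                try { unit(); }
                finally { Interlocked.Decrement(ref _running); } // remove the notation
            });
        }
    }
}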
I could implement it with the help of a thread pool and a few conditions that check the last activity of the task before adding it back to the thread pool queue.

How to mix multithreading with a sequential requirement?

I have a program which processes price data coming from the broker. The pseudocode is as follows:
Process[] process = new Process[50];

void tickEvent(object sender, EventArgs e)
{
    int contractNumber = e.contractNumber;
    doPriceProcess(process[contractNumber], e);
}
Now I would like to use multithreading to speed up my program: if the data are for different contract numbers, I would like to fire off different threads to speed up the processing. However, if the data are from the same contract, I would like the program to wait until the current process finishes before continuing with the next data. How do I do it?
Can you provide some code please?
Thanks in advance~
You have many high-level architectural decisions to make here:
How many ticks do you expect to come from that broker?
After all, you should have some kind of dispatcher here.
Here is a simple description of what basically is to be done:
Encapsulate the incoming ticks in packages, ideally single commands that have all the data needed.
Have a queue where you can easily (and thread-safely) store those commands.
Have a dispatcher that takes an item off the queue and assigns a worker to carry out the command (or lets the command execute itself).
Given a worker, you can have multiple threads, processes or whatsoever to work multiple commands seamlessly.
Maybe you want to do some dispatching already for the input queue, depending on how many requests you want to be able to complete per time unit.
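A rough sketch of that pipeline (all names are assumptions; note that this basic form does not yet serialize ticks per contract, which the next answer addresses):

using System.Collections.Concurrent;
using System.Threading.Tasks;

// Each tick is wrapped in a self-executing command and queued for a worker.
public interface ITickCommand
{
    int ContractNumber { get; }
    void Execute();
}

public class TickDispatcher
{
    private readonly BlockingCollection<ITickCommand> _queue =
        new BlockingCollection<ITickCommand>();

    public void Submit(ITickCommand command)
    {
        _queue.Add(command); // the thread-safe store
    }

    public void Start(int workerCount)
    {
        for (int i = 0; i < workerCount; i++)
        {
            // Workers take commands off the queue and let them execute themselves.
            Task.Factory.StartNew(() =>
            {
                foreach (ITickCommand command in _queue.GetConsumingEnumerable())
                    command.Execute();
            }, TaskCreationOptions.LongRunning);
        }
    }
}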
Here is some more information that can be helpful:
Command pattern in C#
Reactor pattern (with sample code)
Rather than holding onto an array of Processes, I would hold onto an array of BlockingCollections. Each blocking collection can correspond to a particular contract. Then you can have producer threads that add work onto the end of the corresponding contract's queue, and consumer threads that consume the work from those collections. You can ensure that each thread (I would use threads for this, not processes) handles 1-n different queues, but that each queue is handled by no more than one thread. That way you can ensure that no bits of work from the same contract are worked on in parallel.
The threading aspect of this can be handled effectively using C#'s Task class. For your consumers you can create a new task for each BlockingCollection. That task's body will pretty much just be:
foreach (SomeType item in blockingCollections[contractNumber].GetConsumingEnumerable())
    processItem(item);
However, by using Tasks you let the computer schedule them as it sees fit. If it notices most of them sitting around waiting on empty queues, it will just have a few (or just one) actual threads rotating between the tasks it's using. If they have enough to do, and your computer can clearly support the load of additional threads, it will add more (possibly adding/removing dynamically as it goes). By letting much smarter people than you or I handle that scheduling, it's much more likely to be efficient without under- or over-parallelizing.
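Fleshed out slightly (the SomeType wrapper, the processItem delegate and the class name are assumptions), the setup might look like:

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public class SomeType { /* one unit of work for a contract */ }

public class ContractProcessor
{
    private readonly BlockingCollection<SomeType>[] _collections;

    public ContractProcessor(int contractCount, Action<SomeType> processItem)
    {
        _collections = new BlockingCollection<SomeType>[contractCount];
        for (int i = 0; i < contractCount; i++)
        {
            BlockingCollection<SomeType> collection = new BlockingCollection<SomeType>();
            _collections[i] = collection;
            // Exactly one consumer per collection: work within a contract stays
            // sequential, while different contracts run in parallel.
            Task.Factory.StartNew(() =>
            {
                foreach (SomeType item in collection.GetConsumingEnumerable())
                    processItem(item);
            });
        }
    }

    // The tick handler just routes each item to its contract's collection.
    public void Add(int contractNumber, SomeType item)
    {
        _collections[contractNumber].Add(item);
    }
}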

Threadpool, order of execution and long running operations

I have a need to create multiple processing threads in a new application. Each thread has the possibility of being "long running". Can someone comment on the viability of the built-in .NET ThreadPool or some existing custom thread pool for use in my application?
Requirements :
Works well within a windows service. (queued work can be removed from the queue, currently running threads can be told to halt)
Ability to spin up multiple threads.
Work needs to be started in sequential order, but multiple threads can be processing in parallel.
Hung threads can be detected and killed.
EDIT:
Comments seem to be leading towards manual threading. Unfortunately I am held to version 3.5 of the framework. The ThreadPool was appealing because it would let me queue up work and have threads created for me when resources were available. Is there a good 3.5-compatible pattern (producer/consumer perhaps) that would give me this aspect of the ThreadPool without actually using the ThreadPool?
Your requirements essentially rule out the use of the .NET ThreadPool:
It generally should not be used for long-running threads, due to the danger of exhausting the pool.
It does work well in Windows services, though, and you can spin up multiple threads - limited automatically by the pool's limits.
You cannot guarantee thread starting times with the thread pool; it may queue work items for execution until it has enough free threads, and it does not even guarantee they will be started in the sequence you submit them.
There are no easy ways to detect and kill running threads in the ThreadPool.
So essentially, you will want to look outside the ThreadPool; I might recommend that perhaps you might need 'full' System.Threading.Thread instances just due to all of your requirements. As long as you handle concurrency issues (as you must with any threading mechanism), I don't find the Thread class to be all that difficult to manage myself, really.
Simple answer, but the Task class (Fx4) meets most of your requirements.
Cancellation is cooperative, i.e. your Task code has to check for it.
But detecting hung threads is difficult; that is a very demanding requirement anyway.
But I can also read your requirements as calling for a JobQueue, where the 'work' consists of mostly similar jobs. You could roll your own system that consumes that queue and monitors execution on a few threads.
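For the JobQueue reading, a cooperative-cancellation worker might be sketched like this (DoOneJob is a placeholder for taking and running one job from your queue):

using System.Threading;
using System.Threading.Tasks;

public class JobWorker
{
    private readonly CancellationTokenSource _cts = new CancellationTokenSource();
    private Task _worker;

    public void Start()
    {
        _worker = Task.Factory.StartNew(() =>
        {
            // Cooperative cancellation: the loop checks the token between jobs.
            while (!_cts.Token.IsCancellationRequested)
                DoOneJob();
        }, TaskCreationOptions.LongRunning);
    }

    public void Stop()
    {
        _cts.Cancel();  // the loop notices on its next check
        _worker.Wait(); // a genuinely hung job will block here; detection is on you
    }

    private void DoOneJob() { /* take the next job from the queue and run it */ }
}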
I've done essentially the same thing with .NET 3.5 by creating my own thread manager:
Instantiate worker classes that know how long they've been running.
Create threads that run a worker method and add them to a Queue<Thread>.
A supervisor thread reads threads from the Queue and adds them to a Dictionary<int, Worker> as it launches them, until it hits its maximum number of running threads. The thread is added as a property of the Worker instance.
As each worker finishes it invokes a callback method from the supervisor that passes back its ManagedThreadId.
The supervisor removes the thread from the Dictionary and launches another waiting thread.
Poll the Dictionary of running workers to see if any have timed out, or put timers in the workers that invoke a callback if they take too long.
Signal a long-running worker to quit, or abort its thread.
The supervisor invokes callbacks to your main thread to inform of progress, etc.
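Compressed into a sketch (the worker body, the timeout handling and the maximum of 4 are placeholders; everything used here is available in .NET 3.5):

using System;
using System.Collections.Generic;
using System.Threading;

public class Worker
{
    public Thread Thread { get; set; }            // the thread held as a property
    public DateTime Started { get; private set; } // lets the supervisor detect timeouts

    public void Run(Action<int> onFinished)
    {
        Started = DateTime.UtcNow;
        try { /* ... the actual work ... */ }
        finally { onFinished(Thread.CurrentThread.ManagedThreadId); } // completion callback
    }
}

public class Supervisor
{
    private const int MaxRunning = 4;
    private readonly Queue<Worker> _waiting = new Queue<Worker>();
    private readonly Dictionary<int, Worker> _running = new Dictionary<int, Worker>();
    private readonly object _sync = new object();

    public void Enqueue(Worker worker)
    {
        lock (_sync)
        {
            _waiting.Enqueue(worker);
            LaunchWhileCapacity();
        }
    }

    private void OnWorkerFinished(int managedThreadId)
    {
        lock (_sync)
        {
            _running.Remove(managedThreadId); // free the slot
            LaunchWhileCapacity();            // launch another waiting worker
        }
    }

    private void LaunchWhileCapacity()
    {
        while (_running.Count < MaxRunning && _waiting.Count > 0)
        {
            Worker worker = _waiting.Dequeue();
            Thread thread = new Thread(() => worker.Run(OnWorkerFinished));
            worker.Thread = thread;
            _running[thread.ManagedThreadId] = worker;
            thread.Start();
        }
    }
}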

Managing ThreadPool starvation within a multithreaded work queue processor?

I am investigating the design of a work queue processor where the QueueProcessor retrieves a Command Pattern object from the Queue and executes it in a new thread.
I am trying to get my head around a potential Queue lockup scenario where nested Commands may result in a deadlock.
E.G.
A FooCommand object is placed onto the queue which the QueueProcessor then executes in its own thread.
The executing FooCommand places a BarCommand onto the queue.
Assuming that the maximum number of allowed threads is only 1, the QueueProcessor would be in a deadlocked state, since FooCommand is waiting forever for BarCommand to complete (and BarCommand can never start).
How can this situation be managed? Is a queue object the right object for the job? Are there any checks and balances that can be put into place to resolve this issue?
Many thanks. (The application uses C# / .NET 3.0.)
You could redesign things so that FooCommand doesn't use the queue to run BarCommand but runs it directly; or you could split FooCommand in two, have the first half stop immediately after queueing BarCommand, and have BarCommand queue the second half of FooCommand after it has done its work.
Queuing implicitly assumes an asynchronous execution model. By waiting for the command to exit, you are working synchronously.
Maybe you can split up the commands into three parts: FooCommand1, which executes until the BarCommand has to be sent; BarCommand; and finally FooCommand2, which continues after BarCommand has finished. These three commands can be queued separately. Of course, BarCommand should make sure that FooCommand2 gets queued, as in the sketch below.
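A sketch of that split (all names are assumptions, and the queue is kept .NET 3.0-friendly): no command ever waits on another; each one queues whatever must run next.

using System.Collections.Generic;

public interface ICommand { void Execute(CommandQueue queue); }

// .NET 3.0 has no ConcurrentQueue, so guard a Queue<T> with a lock.
public class CommandQueue
{
    private readonly Queue<ICommand> _items = new Queue<ICommand>();
    private readonly object _sync = new object();

    public void Enqueue(ICommand command)
    {
        lock (_sync) { _items.Enqueue(command); }
    }

    public bool TryDequeue(out ICommand command)
    {
        lock (_sync)
        {
            if (_items.Count > 0)
            {
                command = _items.Dequeue();
                return true;
            }
            command = null;
            return false;
        }
    }
}

public class FooCommand1 : ICommand
{
    public void Execute(CommandQueue queue)
    {
        // ...Foo's work up to the point where Bar is needed...
        queue.Enqueue(new BarCommand()); // queue Bar and return immediately
    }
}

public class BarCommand : ICommand
{
    public void Execute(CommandQueue queue)
    {
        int result = 0; // ...Bar's actual work produces this...
        queue.Enqueue(new FooCommand2(result)); // Bar makes sure FooCommand2 is queued
    }
}

public class FooCommand2 : ICommand
{
    private readonly int _barResult;
    public FooCommand2(int barResult) { _barResult = barResult; }

    public void Execute(CommandQueue queue)
    {
        // ...the rest of Foo's work, using _barResult...
    }
}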
For simple cases like this an additional monitoring thread that can spin off more threads on demand is helpful.
Basically, every N seconds, check whether any jobs have finished; if not, add another thread.
This won't necessarily handle even more complex deadlock problems, but it will solve this one.
My recommendation for the harder problem is to restrict waits to newly spawned work; in other words, you can only wait on something you started yourself. That way you never get deadlocks, since cycles are impossible in that situation.
If you are building the Queue object yourself there are a few things you can try:
Dynamically add new service threads. Use a timer and add a thread if the available thread count has been zero for too long.
If a command is trying to queue another command and wait for the result then you should synchronously execute the second command in the same thread. If the first thread simply waits for the second you won't get a concurrency benefit anyway.
I assume you want to queue BarCommand so it is able to run in parallel with FooCommand, but BarCommand will need the result at some later point. If this is the case then I would recommend using Future from the Parallel Extensions library.
Bart DeSmet has a good blog entry on this. Basically you want to do:
public void FooCommand()
{
    Future<int> barFuture = new Future<int>(() => BarCommand());

    // Do Foo's processing - Bar will (may) be running in parallel

    int barResult = barFuture.Value;

    // More processing that needs barResult
}
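Future<T> from the Parallel Extensions CTP was released as Task<TResult> in .NET 4, so the same shape with the shipped API looks roughly like this (using System.Threading.Tasks):

public void FooCommand()
{
    Task<int> barTask = Task.Factory.StartNew(() => BarCommand());

    // Do Foo's processing - Bar will (may) be running in parallel

    int barResult = barTask.Result; // blocks until BarCommand has completed

    // More processing that needs barResult
}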
With libraries such as the Parallel Extensions I'd avoid "rolling your own" scheduling.
