I have multiple instances of a class with a method that runs a process lasting more than an hour. I need to allow a maximum of 2 processes to run at a time across all instances; if 2 are already running, the next one has to wait until the number of running processes drops below 2. So I came up with this:
public class SomeClass
{
    private static int _ProcessesRunningCount = 0;

    public int ProcessesRunningCount
    {
        get { return Interlocked.CompareExchange(ref _ProcessesRunningCount, 0, 0); }
    }

    public void StartProcessing()
    {
        if (ProcessesRunningCount < 2)
        {
            Interlocked.Increment(ref _ProcessesRunningCount);
            Task.Factory.StartNew(() => Process());
        }
        else
        {
            //wait and start after _ProcessesRunningCount gets to less than 2
        }
    }

    private void Process()
    {
        //Do the processing
        System.Threading.Thread.Sleep(100000);
        Interlocked.Decrement(ref _ProcessesRunningCount);
    }
}
However, I am not sure how to implement the wait part, and I am not sure this is a good way to do it, but I don't want to create a manager class that handles everything.
Example:
var A = new SomeClass();
var B = new SomeClass();
var C = new SomeClass();
var D = new SomeClass();
A.StartProcessing(); //process will start
B.StartProcessing(); //process will start
C.StartProcessing(); //process will wait until _ProcessesRunningCount goes under 2
D.StartProcessing(); //process will wait until _ProcessesRunningCount goes under 2
You can use a semaphore to limit the number of processes you spin up. There's an example on MSDN that should fit right into your current design. A semaphore is similar to a mutex (lock), except that it lets a configurable number of threads into the critical section instead of just one. The thread in the example starts a Process and blocks until that process exits.
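For instance, here is a minimal sketch of how that could look in your class, assuming a SemaphoreSlim from System.Threading shared by all instances (initialized to 2 slots) and that it is acceptable for the caller to block until a slot frees up:

public class SomeClass
{
    // Shared across all instances: at most 2 processes may run at once.
    private static readonly SemaphoreSlim _slots = new SemaphoreSlim(2, 2);

    public void StartProcessing()
    {
        // Blocks (without spinning) until fewer than 2 processes are running.
        _slots.Wait();
        Task.Factory.StartNew(() =>
        {
            try
            {
                Process();
            }
            finally
            {
                // Free the slot even if Process() throws.
                _slots.Release();
            }
        });
    }

    private void Process()
    {
        //Do the processing
        System.Threading.Thread.Sleep(100000);
    }
}

If StartProcessing itself must not block, the _slots.Wait() call can be moved to the top of the lambda so the waiting happens on the worker thread instead.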
Related
My situation is simple to describe, but I'm stuck on how to solve it. I am trying to write a program that needs to execute an external process 1100 times, 4 at a time, and I am completely stumped on how to go about this. The application I am writing is a Windows Forms application, and I am using BackgroundWorker to run the tasks asynchronously.
For example, I have a list of 1100 different strings, and I want to run the process once per string, but only 4 at a time, moving on to the next 4 as they finish.
Any help would be appreciated.
Consider the following code:
private async void CodeOnUiThread()
{
    //do ui stuff before starting
    await ExecuteProcesses();
    //do ui stuff after completing.
}

private async Task ExecuteProcesses()
{
    await Task.Factory.StartNew(() =>
    {
        List<string> myStrings = GetMyStrings(); //or whatever you need
        Parallel.ForEach(myStrings,
            new ParallelOptions()
            {
                MaxDegreeOfParallelism = 4
            }, (s) =>
            {
                var process = new Process();
                process.StartInfo = new ProcessStartInfo("myProcess.exe", s);
                process.Start();
                process.WaitForExit();
            });
    });
}
This allows a maximum of 4 threads to run simultaneously, so no more than 4 processes execute at the same time.
Update:
You can also use Environment.ProcessorCount to get the number of cores; however, Parallel.ForEach already handles this sensibly by default.
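For example, if you did want to cap the parallelism explicitly (just an illustration; the default behaviour is usually fine):

var options = new ParallelOptions { MaxDegreeOfParallelism = Environment.ProcessorCount };
Parallel.ForEach(myStrings, options, s =>
{
    // launch and wait for the external process here, as above
});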
Update 2:
Parallel.ForEach blocks the thread it is called on, which is why the code above runs it inside a Task; I have updated the code accordingly.
How about this (complete example):
class Program
{
    static void Main(string[] args)
    {
        List<StringContainer> strContainers = new List<StringContainer>();
        for (int i = 0; i < 1100; i++)
        {
            strContainers.Add(new StringContainer() { str = "string" + i });
        }

        Parallel.ForEach(
            strContainers,
            new ParallelOptions() { MaxDegreeOfParallelism = 4 },
            x => ProcessString(x));

        foreach (var item in strContainers)
        {
            Console.WriteLine(item.str);
        }
        Console.ReadKey();
    }

    private static void ProcessString(StringContainer strContainer)
    {
        strContainer.str += "_processed";
    }
}

public class StringContainer
{
    public string str;
}
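Note that ProcessString above only modifies the strings in memory. If the goal is the external process from the question, the same structure works if ProcessString launches the program and waits for it; a rough sketch, with the executable name taken as a placeholder from the earlier answer and System.Diagnostics imported:

private static void ProcessString(StringContainer strContainer)
{
    // One external process per string; because Parallel.ForEach is capped at 4,
    // no more than 4 of them run at the same time.
    using (var process = Process.Start("myProcess.exe", strContainer.str))
    {
        process.WaitForExit();
    }
    strContainer.str += "_processed";
}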
Requirement: at any given point in time, only 4 threads should be calling four different functions. As soon as those threads complete, the next available threads should call the same functions.
Current code: this seems like the worst possible way to achieve something like this. The while (true) loop causes unnecessary CPU spikes, and I could see CPU usage rise to 70% when running the following code.
Question: how can I use an AutoResetEvent to signal the main thread's Process() function to start the next 4 threads once the first 4 worker threads are done processing, without wasting CPU cycles?
public class Demo
{
    object protect = new object();
    private int counter;

    public void Process()
    {
        int maxthread = 4;
        while (true)
        {
            if (counter <= maxthread)
            {
                counter++;
                Thread t = new Thread(new ThreadStart(DoSomething));
                t.Start();
            }
        }
    }

    private void DoSomething()
    {
        try
        {
            Thread.Sleep(50000); //simulate long running process
        }
        finally
        {
            lock (protect)
            {
                counter--;
            }
        }
    }
}
You can use the TPL to achieve what you want in a simpler way. If you run the code below, you'll notice that an entry is written as each task finishes, and the "Finished batch" entry is written only after all four have finished.
This sample uses Task.WaitAll to wait for all of the tasks to complete. The infinite loop is for illustration purposes only; you should calculate the hasPendingWork condition from your own requirements so that a new batch of tasks is started only when needed.
For example:
private static void Main(string[] args)
{
    bool hasPendingWork = true;
    do
    {
        var tasks = InitiateTasks();

        Task.WaitAll(tasks);

        Console.WriteLine("Finished batch...");
    } while (hasPendingWork);
}

private static Task[] InitiateTasks()
{
    var tasks = new Task[4];
    for (int i = 0; i < tasks.Length; i++)
    {
        int wait = 1000 * i;
        tasks[i] = Task.Factory.StartNew(() =>
        {
            Thread.Sleep(wait);
            Console.WriteLine("Finished waiting: {0}", wait);
        });
    }
    return tasks;
}
One other thing: from the textual requirement in your question I'm led to believe that a batch of four new threads should only start after all four previous threads have completed. However, the code you posted is not compatible with that requirement, since it starts a new thread immediately after a previous thread terminates. You should clarify what exactly your requirement is.
UPDATE:
If you want to start a thread immediately after one of the four terminates, you can still use the TPL instead of starting new threads explicitly, and limit the number of running tasks to four by using a SemaphoreSlim. For example:
private static SemaphoreSlim TaskController = new SemaphoreSlim(4);

private static void Main(string[] args)
{
    var random = new Random(570);
    while (true)
    {
        // Blocks thread without wasting CPU
        // if the number of resources (4) is exhausted
        TaskController.Wait();

        Task.Factory.StartNew(() =>
        {
            Console.WriteLine("Started");
            Thread.Sleep(random.Next(1000, 3000));
            Console.WriteLine("Completed");
            // Releases a resource meaning TaskController.Wait will unblock
            TaskController.Release();
        });
    }
}
I am not familiar with multithreading. Imagine I have a method that does some intensive search on a string and returns 2 lists of integers as out parameters.
public static void CalcModel(string s, out List<int> startPos, out List<int> len)
{
    // Do some intensive search
}
The search on a long string is very time-consuming, so I want to split the string into several fragments, search them with multiple threads, and recombine the results (adjusting startPos accordingly).
How can I integrate multithreading into this kind of process? Thanks.
I forgot to mention the following two things:
I want to set a string length cutoff and let the code decide how many fragments it needs.
I had a hard time associating the startPos of each fragment (relative to the original string) with its thread. How can I do that?
Rather than getting too bogged down in details: generally, you send each thread a "return object." Once you've started all the threads, you block on them and wait until they are all finished.
While each thread is running, it modifies its work object and terminates once it has produced its output.
So roughly this (I can't tell exactly how you want to split it up, so perhaps you can modify this):
public class WorkItem {
    public string InputString;
    public List<int> startPos;
    public List<int> len;
}

public static void CalcLotsOfStrings(string s, out List<int> startPos, out List<int> len)
{
    WorkItem wi1 = new WorkItem();
    wi1.InputString = s;
    Thread t1 = new Thread(InternalCalcThread1);
    t1.Start(wi1);

    WorkItem wi2 = new WorkItem();
    wi2.InputString = s;
    Thread t2 = new Thread(InternalCalcThread2);
    t2.Start(wi2);

    // You can now wait for the threads to complete or start new threads.
    // When you're done, wi1 and wi2 will be filled with the updated data,
    // but make sure not to use them until the threads are done!
    t1.Join();
    t2.Join();

    // The out parameters must be assigned before returning; combine both results here.
    startPos = new List<int>(wi1.startPos);
    startPos.AddRange(wi2.startPos);
    len = new List<int>(wi1.len);
    len.AddRange(wi2.len);
}

public static void InternalCalcThread1(object item)
{
    WorkItem w = item as WorkItem;
    w.startPos = new List<int>();
    w.len = new List<int>();

    // Do work here - populate the work item data
}

public static void InternalCalcThread2(object item)
{
    // Do work here
}
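Regarding mapping the results back to positions in the original string: one way (a sketch, not tied to any particular splitting scheme) is to record each fragment's offset alongside its results, then add that offset to every startPos when merging. The FragmentResult type below is hypothetical:

public class FragmentResult
{
    public int Offset;          // where this fragment starts in the original string
    public string Fragment;     // the fragment's text
    public List<int> startPos;  // positions relative to the fragment
    public List<int> len;
}

static void MergeResults(IEnumerable<FragmentResult> fragments, out List<int> startPos, out List<int> len)
{
    startPos = new List<int>();
    len = new List<int>();
    foreach (var f in fragments)
    {
        // Shift fragment-relative positions back into original-string coordinates.
        foreach (var p in f.startPos)
            startPos.Add(p + f.Offset);
        len.AddRange(f.len);
    }
}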
You can try this, but I am not sure about the performance of these methods:
// Assuming the string has been split: s1 and s2 are hypothetical fragments, and each
// call gets its own output lists, since out arguments cannot be shared between the calls.
List<int> startPos1 = null, len1 = null, startPos2 = null, len2 = null;
Parallel.Invoke(
    () => CalcModel(s1, out startPos1, out len1),
    () => CalcModel(s2, out startPos2, out len2)
);
Creating and running multiple threads is an easy task. All you need is a method that acts as the starting point for a thread.
Note that a ThreadStart delegate takes no parameters, so the CalcModel method from your original post would first need to be wrapped in a parameterless method (called CalcModelWrapper below, purely as a placeholder). With that in place you only have to do:
// instantiate the thread with a (parameterless) method as its starting point
Thread t = new Thread(new ThreadStart(CalcModelWrapper));
// run the thread
t.Start();
However, if you want the thread to return some values you need a little trick, because a thread's starting method can't return values directly the way a return statement or out parameters would.
You can "wrap" the thread in its own class and let it store its data in the class's fields:
public class ThreadClass {
    public string FieldA;
    public string FieldB;
    //...
    public void Run() {
        Thread t = new Thread(new ThreadStart(_run));
        t.Start();
    }
    private void _run() {
        //...
        FieldA = "someData";
        FieldB = "otherData";
        //...
    }
}
That's only a very rough example to illustrate the idea; it doesn't include any thread synchronization or thread control.
I would say the more difficult task is to think about how to split your CalcModel work so that it can be parallelized and, perhaps more importantly, how the partial results can be joined together into one final result.
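As a starting point for the splitting half, here is a rough sketch that cuts the input into fragments no longer than a given cutoff and records each fragment's offset, reusing the hypothetical FragmentResult type from the earlier answer. Note that a match spanning a fragment boundary would be missed unless the fragments overlap, which this simple version does not handle:

static List<FragmentResult> SplitIntoFragments(string s, int cutoff)
{
    var fragments = new List<FragmentResult>();
    for (int offset = 0; offset < s.Length; offset += cutoff)
    {
        // The last fragment may be shorter than the cutoff.
        int length = Math.Min(cutoff, s.Length - offset);
        fragments.Add(new FragmentResult
        {
            Offset = offset,
            Fragment = s.Substring(offset, length)
        });
    }
    return fragments;
}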
I'm using Parallel.For to launch an external program from many threads. But even though these are separate threads, I need to implement something like a delay. E.g. if 2 threads want to launch the external program at the same moment, one of them should wait and start e.g. 10 seconds after the other.
Is it possible?
It's possible, but given the information you've provided it seems pointless... you're enforcing single-threaded execution of the external program, so you might as well have a single thread executing it. If Thread 2 has to wait for Thread 1 in order to start the "external program," then just let Thread 1 do all the work since it already knows when it started the "external program."
The only benefit you will get from a multi-threaded approach is if you have a bunch of processing that you need to do prior to executing the "external program" and that processing has to be a good candidate for concurrent execution.
Update
OK, there are a couple of ways to do this with only one extra thread in order to keep your Main/GUI thread responsive. The first approach is a simple lock around the external resource which you're interacting with:
public class ExternalResourceHandler
{
    private readonly ExternalResource _resource;
    private readonly object _sync = new object();

    // constructors
    // ...
    // other methods

    public void PerformExternalOperation()
    {
        lock (_sync)
        {
            Result result = _resource.Execute();
            // do something with the result
        }
    }
}
Here are 3 multi-threaded versions for executing the code:
Using the Parallel.For method: recommended if the external program takes a short amount of time to execute; I'd suggest it for things under 25 seconds (although this is not necessarily a "correct" number).
Using the ThreadPool: again, I'd recommend it for things that take less than 25 seconds (with the same reservation as above).
Using a Thread: this would be recommended if the operation runs longer (i.e. more than 25 seconds), but it would do just as well if it's under 25 seconds.
Here are some examples (not necessarily functional, mostly meant to give you an idea of the different approaches):
public class Program
{
    public static ExternalResourceHandler _erh = new ExternalResourceHandler();

    static int Main()
    {
        Console.WriteLine("Type 'exit' to stop; 'parallel', 'pool' or 'thread' for the corresponding execution version.");
        string input = Console.ReadLine();
        while (input != "exit")
        {
            switch (input)
            {
                case "parallel":
                    // Run the Parallel.For version
                    ParallelForVersion();
                    break;
                case "pool":
                    // Run the threadpool version
                    ThreadPoolVersion();
                    break;
                case "thread":
                    // Run the thread version
                    ThreadVersion();
                    break;
                default:
                    break;
            }
            input = Console.ReadLine();
        }
        return 0;
    }

    public static void ParallelForVersion()
    {
        Parallel.For(0, 1, i =>
        {
            _erh.PerformExternalOperation();
        });
    }

    public static void ThreadPoolVersion()
    {
        ThreadPool.QueueUserWorkItem(o =>
        {
            _erh.PerformExternalOperation();
        });
    }

    public static void ThreadVersion()
    {
        Thread t = new Thread(() =>
        {
            _erh.PerformExternalOperation();
        });
        t.IsBackground = true;
        t.Start();
    }
}
The other option is to employ the producer/consumer design pattern, where your ExternalResourceHandler is the consumer and processes requests to the external resource from a thread-safe queue. Your main thread just places requests on the queue and immediately goes back to its own work. Here is an example:
public class ExternalResourceHandler
{
    private volatile bool _running;
    private readonly ExternalResource _resource;
    private readonly BlockingCollection<Request> _requestQueue;

    public ExternalResourceHandler(BlockingCollection<Request> requestQueue)
    {
        _requestQueue = requestQueue;
        _running = false;
    }

    public void QueueRequest(Request request)
    {
        _requestQueue.Add(request);
    }

    public void Run()
    {
        _running = true;
        while (_running)
        {
            Request request = null;
            // TryTake blocks for up to the timeout, so this loop does not spin the CPU.
            if (_requestQueue.TryTake(out request, 500) && request != null)
            {
                _resource.Execute(request);
            }
        }
    }

    // methods to stop the handler (i.e. set the _running flag to false)
}
Your main would look like this:
public class Program
{
    public static ExternalResourceHandler _erh =
        new ExternalResourceHandler(new BlockingCollection<Request>());

    static int Main()
    {
        Thread erhThread = new Thread(() => { _erh.Run(); });
        erhThread.IsBackground = true;
        erhThread.Start();

        Console.WriteLine("Type 'exit' to stop or press enter to enqueue another request.");
        string input = Console.ReadLine();
        while (input != "exit")
        {
            _erh.QueueRequest(new Request());
            input = Console.ReadLine();
        }

        // Stops the erh by setting the running flag to false
        _erh.Stop();
        // You may also need to interrupt the thread in order to
        // get it out of a blocking state prior to calling Join()
        erhThread.Join();
        return 0;
    }
}
As you can see, in both cases all the work against the external resource is funneled through a single thread, yet your main thread remains responsive.
Look at the producer-consumer pattern. The first thread produces the information "external program launched"; the second thread consumes it, waits 10 seconds, and then launches the external program.
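A rough sketch of that idea using BlockingCollection from System.Collections.Concurrent (the executable name and itemCount below are placeholders): the Parallel.For workers only enqueue launch requests, and a single consumer thread starts the program, then waits 10 seconds before taking the next request, so launches are always at least 10 seconds apart.

var launchRequests = new BlockingCollection<string>();

var consumer = new Thread(() =>
{
    foreach (var arguments in launchRequests.GetConsumingEnumerable())
    {
        using (var p = Process.Start("external.exe", arguments))
        {
            // Optionally p.WaitForExit() here if launches must never overlap.
        }
        Thread.Sleep(TimeSpan.FromSeconds(10));   // enforce the 10-second spacing
    }
});
consumer.Start();

Parallel.For(0, itemCount, i =>
{
    // ...do the per-item work on the worker threads...
    launchRequests.Add("arguments for item " + i);   // hand the launch off to the consumer
});

launchRequests.CompleteAdding();   // no more requests will arrive
consumer.Join();                   // wait for the remaining launches to happen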
I have a thread that connects to two network resources. Each time I attempt a connection, it can take 10 seconds for a reply.
void MyThread()
{
    //this takes ten seconds
    Resource r1 = MySystem.GetResource(ipAddress1);
    //this takes ten seconds
    Resource r2 = MySystem.GetResource(ipAddress2);
    //do stuff with r1 and r2
}
Total time is twenty seconds, but I'd really like it to take only ten: launching a thread each time I call GetResource, receiving the replies, and then joining each thread to return control.
What's the best way to do this? Launch two threads that each return a value? Anonymous methods that take references to local variables? My head is spinning. Code is appreciated.
How about
Resource r1 = null; // need to initialize, else compiler complains
Resource r2 = null;

ThreadStart ts1 = delegate {
    r1 = MySystem.GetResource(ipAddress1);
};
ThreadStart ts2 = delegate {
    r2 = MySystem.GetResource(ipAddress2);
};

Thread t1 = new Thread(ts1);
Thread t2 = new Thread(ts2);
t1.Start();
t2.Start();

// do some useful work here, while the threads do their thing...

t1.Join();
t2.Join();
// r1, r2 now setup
Short and sweet.
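If the Task Parallel Library is available, essentially the same thing can be written with tasks (a sketch along the lines of the code above, not a drop-in replacement):

// Start both lookups concurrently; each task wraps one call.
Task<Resource> t1 = Task.Factory.StartNew(() => MySystem.GetResource(ipAddress1));
Task<Resource> t2 = Task.Factory.StartNew(() => MySystem.GetResource(ipAddress2));

// do some useful work here, while the tasks run...

// Blocks until both have finished; total time is roughly the slower of the two calls.
Task.WaitAll(t1, t2);

Resource r1 = t1.Result;
Resource r2 = t2.Result;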
The easiest way that occurs to me is to parallelize one of the calls on a worker thread while having the main thread perform the second initialization and then wait. The following snippet should help to illustrate:
ManualResetEvent r1Handle = new ManualResetEvent(false);
Resource r1 = null;
Resource r2 = null;

// Make the threadpool responsible for populating the
// first resource.
ThreadPool.QueueUserWorkItem((state) =>
{
    r1 = MySystem.GetResource(ipAddress1);
    // Set the wait handle to signal the main thread that
    // the work is complete.
    r1Handle.Set();
});

// Populate the second resource.
r2 = MySystem.GetResource(ipAddress2);

// Wait for the threadpool worker to finish.
r1Handle.WaitOne();

// ... Do more stuff
For a more detailed discussion of thread synchronization techniques, you may wish to visit the MSDN article on the topic: http://msdn.microsoft.com/en-us/library/ms173179.aspx
These are always fun questions to ponder, and of course there are multiple ways to solve this.
One approach that's worked well for me is to provide a callback method that each thread uses to pass back results and status. In the following example, I use a List to keep track of running threads and put the results in a Dictionary.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading;
using System.Timers;

namespace ConsoleApplication1
{
    class Program
    {
        static Dictionary<int, object> threadResults = new Dictionary<int, object>();
        static int threadMax = 2;

        static void Main(string[] args)
        {
            List<Thread> runningThreads = new List<Thread>();
            for (int i = 0; i < threadMax; i++)
            {
                Worker worker = new Worker();
                worker.Callback = new Worker.CallbackDelegate(ThreadDone);
                Thread workerThread = new Thread(worker.DoSomething);
                workerThread.IsBackground = true;
                runningThreads.Add(workerThread);
                workerThread.Start();
            }
            foreach (Thread thread in runningThreads) thread.Join();
        }

        public static void ThreadDone(int threadIdArg, object resultsArg)
        {
            // Dictionary is not thread-safe, so guard concurrent callbacks with a lock.
            lock (threadResults)
            {
                threadResults[threadIdArg] = resultsArg;
            }
        }
    }

    class Worker
    {
        public delegate void CallbackDelegate(int threadIdArg, object resultArg);
        public CallbackDelegate Callback { get; set; }

        public void DoSomething()
        {
            // do your thing and put it into results
            object results = new object();
            int myThreadId = Thread.CurrentThread.ManagedThreadId;
            Callback(myThreadId, results);
        }
    }
}
Try this on MSDN: "Asynchronous programming using delegates."
http://msdn.microsoft.com/en-us/library/22t547yb.aspx
-Oisin
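A rough sketch of what that article describes, applied to the resource calls above. This assumes GetResource takes a string address (adjust the delegate type to the real signature) and relies on delegate BeginInvoke/EndInvoke, which is available in the .NET Framework but not supported on .NET Core:

// One delegate wraps the call; BeginInvoke runs it on a ThreadPool thread.
Func<string, Resource> getResource = MySystem.GetResource;

IAsyncResult ar1 = getResource.BeginInvoke(ipAddress1, null, null);
IAsyncResult ar2 = getResource.BeginInvoke(ipAddress2, null, null);

// EndInvoke blocks until the corresponding call completes and returns its result.
Resource r1 = getResource.EndInvoke(ar1);
Resource r2 = getResource.EndInvoke(ar2);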