Call Join for multiple threads simultaneously - c#

I am writing a script in C# for Unity to read messages from multiple sources. I have a function ReadMessage which takes in a port string and returns a string ID number. The issue is that I have a dozen different connections to read from, and each read has a 10 ms timeout before it stops trying and lets the code continue. This causes a drop in frame rate when I have a dozen threads each waiting a few ms for the previous one to finish joining. My thread code is as follows:
string threadOneString = null;
Thread ThreadOne;
//Repeat for 11 more threads

void Update () {
    ThreadOne = new Thread(
        () =>
        {
            threadOneString = ReadMessage(someClass.port);
        });
    ThreadOne.Start();
    //Repeat for 11 more threads
}

void LateUpdate () {
    ThreadOne.Join();
    //Repeat for 11 more threads
    UpdateClass(threadOneString);
    //Repeat for 11 more threads
    UpdateTextDisplay(); //Just updates a Unity Text object
}
And my ReadMessage code in case it matters.
private string ReadMessage(string port) //Change this name to ReadButtonPress
{
    string fullMessage = "";
    someStruct parsedMessage;
    var timeout = new System.TimeSpan(0, 0, 0, 0, 10); // Less than 10 and it starts to miss most messages; ideally this would be a bit higher
    string connectionString = PROTOCOL + CONTROLLER_IP + ":" + port;
    AsyncIO.ForceDotNet.Force();
    using (var subSocket = new SubscriberSocket())
    {
        subSocket.Connect(connectionString);
        subSocket.Subscribe("");
        subSocket.TryReceiveFrameString(timeout, out fullMessage);
        UnityEngine.Debug.Log("Message: " + fullMessage);
        subSocket.Close();
    }
    // Some message parsing and checking...
    return parsedMessage.someString;
}
What I'd like to do, but I don't know if it's possible, is to call the Joins for each thread at the same time instead of calling one, waiting, then calling the next. The threads don't interact with each other, so I'm hoping this is possible. If not, I'd greatly appreciate another solution or suggestion.
EDIT: Clarification on what's happening. When I only run a few threads I get 65-70 FPS. When I run all 12 threads my FPS drops to ~50, and I have a hard requirement of 60 FPS.

Thanks to some feedback here I am no longer creating a socket every update. Instead I create the sockets in Start() and just read what I need from a public function in the class. I based my code on https://stackoverflow.com/a/14797475/8635796
Don't do what I did and recreate the exact same threads and sockets every Update(). In hindsight, it was a really poor choice.
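For anyone landing here later, the persistent-socket layout can look roughly like this. It is only a sketch assuming NetMQ (as in the code above); the MessageReader class and its member names are illustrative, not the actual code.
using System.Threading;
using NetMQ;
using NetMQ.Sockets;

public class MessageReader
{
    private readonly string _connectionString;
    private readonly object _lock = new object();
    private Thread _readThread;
    private volatile bool _running;
    private string _latest = "";

    public MessageReader(string connectionString)
    {
        _connectionString = connectionString;
    }

    public void Start()
    {
        _running = true;
        _readThread = new Thread(ReadLoop) { IsBackground = true };
        _readThread.Start();
    }

    public void Stop()
    {
        _running = false;
        _readThread.Join();
    }

    // Called from Update()/LateUpdate(); never blocks the main thread.
    public string Latest
    {
        get { lock (_lock) return _latest; }
    }

    private void ReadLoop()
    {
        AsyncIO.ForceDotNet.Force();
        using (var subSocket = new SubscriberSocket())
        {
            subSocket.Connect(_connectionString);
            subSocket.Subscribe("");
            var timeout = System.TimeSpan.FromMilliseconds(10);
            while (_running)
            {
                string msg;
                if (subSocket.TryReceiveFrameString(timeout, out msg))
                {
                    lock (_lock) _latest = msg;   // keep only the newest message
                }
            }
            subSocket.Close();
        }
    }
}
Each connection gets one long-lived MessageReader; Update() just reads the Latest property instead of spinning up threads.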

Use tasks instead, and put them into a list.
var myTasks = new List<Task>();
Task allUpdateTasks;
...
void Update () {
    myTasks.Clear(); // otherwise the list (and the combined task below) grows every frame
    myTasks.Add(
        Task.Factory.StartNew(() =>
        {
            threadOneString = ReadMessage(someClass.port);
        }));
    //Repeat for 11 more tasks

    // Create a single task that will hold all tasks above.
    allUpdateTasks = Task.WhenAll(myTasks);
}
void LateUpdate() {
    allUpdateTasks.Wait();
    ...
}

Define a ManualResetEvent for each reader and signal it when you have the result; you can then wait on up to 64 wait handles at once with WaitHandle.WaitAll (WaitHandle.WaitAny is the variant that returns as soon as any single handle is signalled).
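A minimal sketch of that approach applied to the code in the question; ports[] and results[] are illustrative stand-ins for the twelve per-connection fields.
using System.Threading;

ManualResetEvent[] doneEvents;
string[] results;

void Update()
{
    doneEvents = new ManualResetEvent[12];
    results = new string[12];
    for (int i = 0; i < 12; i++)
    {
        int index = i;                                    // copy for the closure
        doneEvents[index] = new ManualResetEvent(false);
        ThreadPool.QueueUserWorkItem(_ =>
        {
            results[index] = ReadMessage(ports[index]);   // ports[] is assumed
            doneEvents[index].Set();                      // signal "this reader is done"
        });
    }
}

void LateUpdate()
{
    WaitHandle.WaitAll(doneEvents);     // one call waits for all 12 (max 64 handles)
    foreach (var e in doneEvents)
        e.Dispose();
    // consume results[0..11] here
}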

Related

CPU is maxed when running a while loop inside a Long Running task

I am running a constant poll to a messaging server. When a message arrives on the server I grab the message and process it. Unfortunately this is using 80-100% CPU for a simple task. UPDATE: I have reduced it to the while loop itself. The while loops inside 10 tasks are causing the CPU to max out at 100%.
int i = 0;
while (true)
{
    //At the start of every minute
    if (DateTime.Now.Second == 0)
    {
        i++;
    }
}
Is there a way that I can throttle the loop, or better yet, a better way to write this code so that it does not use 100% CPU? I have tried adding a Task.Delay of 1 second but that doesn't help much at all.
Any help or advice you can provide I would greatly appreciate it.
// First I poll 10 different locations for messages and each task polls its own queue
foreach (Queue queue in queueList)
{
    task.Add(Task.Run(() => PollIndividualQueue(queue)));
}
t = Task.WhenAll(task.ToArray()).WithAggregatedExceptions();
t.Wait();
//Then in the PollIndividualQueue method I implement the while loop that constantly polls a message queue for the next hour
private async Task PollIndividualQueue(Queue queue)
{
    var cancellationToken = new CancellationTokenSource(TimeSpan.FromMinutes(60)).Token;
    while (!cancellationToken.IsCancellationRequested)
    {
        //Poll the queue and if there is a message grab it and process it
        if (await GetMessage(queue))
        {
            //I call a stored procedure that inserts this message into the database.
            using (var conn = new SqlConnection(...))
            {
                using (var cmd = new SqlCommand(MyStoredProc, conn))
                {
                    cmd.CommandType = CommandType.StoredProcedure;
                    cmd.Parameters.Add(new SqlParameter(...InputMessage));
                    await conn.OpenAsync();
                    await cmd.ExecuteNonQueryAsync();
                }
            }
        }
        else
        {
            await Task.Delay(1000);
        }
    }
}
private Task<bool> GetMessage(Queue queue)
{
    object myLock = new object();
    try
    {
        Monitor.Enter(myLock);
        queue.Get(message);
        return Task.FromResult(true);
    }
    catch (MQException ex)
    {
        if (ex.ReasonCode == 2033)
        {
            return Task.FromResult(false);
        }
        throw;
    }
    finally
    {
        Monitor.Exit(myLock);
    }
}
EDIT: Thank you all for the comments, but they seem to be getting off track from the actual question. The question is: is there a way to use only a percentage of the total CPU?
I can literally take all of this code out of the While loop and the CPU is still at 100%. The while loop is what is driving the CPU load it seems.
Ok. Clearly, you are a newbie, so I'll attempt to be kind with my comments.
Your while loop at the top of your questions is extremely bad programming. You wonder why it is maxing out your CPU? If you are running on an Intel i9-9900K then that loop will be executed many billions of times per second. I'm not kidding!!
If you want to do something once a second then you need to do a simple time calculation then sleep. To quiet the CPU down, your code should be:
int i = 0;
while (true)
{
    //Sleep until the start of the next minute
    Thread.Sleep((int)(60000 - ((DateTime.Now.Ticks / TimeSpan.TicksPerMillisecond) % 60000)));
    i++;
}
I really don't understand why you are using locks in the GetMessage method. The Queue's Get method blocks when it attempts to retrieve a message.
If you are going to use threading, why not have each thread connect to the queue manager separately? There is no reason to share a connection between threads.
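For example, assuming the IBM MQ classes for .NET (which the MQException and reason code 2033 suggest), a blocking get with a wait interval could look roughly like this; the method name is illustrative:
using IBM.WMQ;

private bool TryGetMessage(MQQueue queue, out MQMessage message)
{
    message = new MQMessage();
    var gmo = new MQGetMessageOptions();
    gmo.Options |= MQC.MQGMO_WAIT;     // let Get block instead of spinning in a loop
    gmo.WaitInterval = 5000;           // give up after 5 seconds (milliseconds)
    try
    {
        queue.Get(message, gmo);       // returns as soon as a message arrives
        return true;
    }
    catch (MQException ex)
    {
        if (ex.ReasonCode == MQC.MQRC_NO_MSG_AVAILABLE)   // 2033
        {
            return false;              // nothing arrived within the wait interval
        }
        throw;
    }
}
Letting the queue manager hold the call open keeps the loop idle instead of burning CPU between polls.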

In C#, how do I call a method at random intervals on a separate thread but have it always process sequentially?

This seems like it has a very simple solution, but I've been looking on and off for months without finding a definitive answer.
I have an object being created by the UI thread. I'm actually creating several of the same type of object. Sometimes I'll create 1 or 2 every minute, sometimes I'll create 30 in a second. Each time an object is created, I need to perform some calculations on it. The combination of potentially 30 objects created in a short time and the expensive calculations I'm performing on those objects can really lag the UI.
I've tried performing the calculations via Tasks and BackgroundWorkers and all sorts of threading, but they all perform the calculations out of order, and it's imperative that the objects are calculated in the order they are created and that one doesn't start its calculations until the object ahead of it finishes its own.
I can find all sorts of information about how to perform these tasks in parallel, but can anyone explain how I can force them to happen sequentially, just not on the UI thread? Any help would be greatly appreciated. I've been trying to figure this out for months :(
IMO, the easiest way to run tasks in sequential order here is to use Task.ContinueWith. Note TaskContinuationOptions.LazyCancellation, it's used to make sure that cancellation doesn't break the order of the sequence:
Task _currentTask = Task.FromResult(Type.Missing);
readonly object _lock = new Object();
void QueueTask(Action action)
{
lock (_lock)
{
_currentTask = _currentTask.ContinueWith(
lastTask =>
{
// re-throw the error of the last completed task (if any)
lastTask.GetAwaiter().GetResult();
// run the new task
action();
},
CancellationToken.None,
TaskContinuationOptions.LazyCancellation,
TaskScheduler.Default);
}
}
private void button1_Click(object sender, EventArgs e)
{
for(var i = 0; i < 10; i++)
{
var sleep = 1000 - i*100;
QueueTask(() =>
{
Thread.Sleep(sleep);
Debug.WriteLine("Slept for {0} ms", sleep);
});
}
}
I think a proper solution would be to use a queue, since it is FIFO, drained by a task running in the background.
Your code could look something like this:
Edit
Edited to use Queue.Synchronized as @rwong mentioned (thanks!)
var queue = new Queue<MyCustomObject>();
//add object to the queue..
// (note: Queue.Synchronized only exists on the non-generic System.Collections.Queue)
var mySyncedQueue = Queue.Synchronized(queue);
Task.Factory.StartNew(() =>
{
    while (true)
    {
        var myObj = mySyncedQueue.Dequeue();
        if (myObj != null)
        { /* do work... */ }
    }
}, TaskCreationOptions.LongRunning);
This is just to get you started; I'm sure your code could be more efficient once you know exactly what is needed :)
Edit 2
As you are accessing the queue from multiple threads, it is better to use a ConcurrentQueue.
The method won't look much different:
var concurrentQueue = new ConcurrentQueue<MyCustomObject>();
//add object to the queue..
Task.Factory.StartNew(() =>
{
    while (true)
    {
        MyCustomObject myObj;
        if (concurrentQueue.TryDequeue(out myObj))
        { /* do work... */ }
    }
}, TaskCreationOptions.LongRunning);
Create exactly one background task. Then let it process all items in a thread-safe FIFO buffer. Add objects to the FIFO from your GUI thread.
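A short sketch of that idea, assuming a MyCustomObject type and a ProcessObject method (both illustrative); BlockingCollection acts as the thread-safe FIFO and a single consumer preserves insertion order:
using System.Collections.Concurrent;
using System.Threading.Tasks;

private readonly BlockingCollection<MyCustomObject> _fifo =
    new BlockingCollection<MyCustomObject>();

public void StartWorker()
{
    Task.Factory.StartNew(() =>
    {
        // One consumer => objects are processed strictly in the order they were added.
        foreach (MyCustomObject obj in _fifo.GetConsumingEnumerable())
        {
            ProcessObject(obj);   // the expensive calculations
        }
    }, TaskCreationOptions.LongRunning);
}

// Called from the UI thread each time an object is created.
public void Enqueue(MyCustomObject obj)
{
    _fifo.Add(obj);
}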

Adding a multithreading scenario for an application in c#

I have developed an application in c#. The class structure is as follows.
Form1 => The UI form. Has a backgroundworker, processbar, and a "ok" button.
SourceReader, TimedWebClient, HttpWorker, ReportWriter => classes that do some work
Controller => Has the overall control. On the "ok" button click, an instance of this class called "cntrlr" is created. This cntrlr is a global variable in Form1.cs.
(In the constructor of the Controller I create the SourceReader, TimedWebClient, HttpWorker and ReportWriter instances.)
Then I call the RunWorkerAsync() of the background worker.
Within it code is as follows.
private void backgroundWorker1_DoWork(object sender, DoWorkEventArgs e)
{
    int iterator = 1;
    for (iterator = 1; iterator <= this.urlList.Count; iterator++)
    {
        cntrlr.Vmain(iterator - 1);
        backgroundWorker1.ReportProgress(iterator);
    }
}
At the moment ReportProgress updates the progress bar.
The urlList mentioned above has 1000 URLs. cntrlr.Vmain(int i) handles the whole process at the moment. I want to hand the work to several threads, each one processing 100 URLs. Though access to the other instances or their methods is not prohibited, access to ReportWriter should be limited to only one thread at a time. I can't find a way to do this. If anyone has an idea or an answer, please explain.
If you do want to restrict multiple threads from using the same method concurrently, then I would use the Semaphore class to facilitate the required thread limit; here's how...
A semaphore is like a mean night club bouncer: it has been given a club capacity and is not allowed to exceed this limit. Once the club is full, no one else can enter... a queue builds up outside. Then as one person leaves another can enter (analogy thanks to J. Albahari).
A Semaphore with a value of one is equivalent to a Mutex or Lock except that the Semaphore has no owner so that it is thread ignorant. Any thread can call Release on a Semaphore whereas with a Mutex/Lock only the thread that obtained the Mutex/Lock can release it.
Now, for your case we are able to use Semaphores to limit concurrency and prevent too many threads from executing a particular piece of code at once. In the following example five threads try to enter a night club that only allows entry to three...
class BadAssClub
{
    static SemaphoreSlim sem = new SemaphoreSlim(3);

    static void Main()
    {
        for (int i = 1; i <= 5; i++)
            new Thread(Enter).Start(i);
    }

    // Enforce only three threads running this method at once.
    static void Enter(object id)
    {
        try
        {
            Console.WriteLine(id + " wants to enter.");
            sem.Wait();
            Console.WriteLine(id + " is in!");
            Thread.Sleep(1000 * (int)id);
            Console.WriteLine(id + " is leaving...");
        }
        finally
        {
            sem.Release();
        }
    }
}
Note that SemaphoreSlim is a lighter-weight version of the Semaphore class and incurs about a quarter of the overhead. It is sufficient for what you require.
I hope this helps.
I think I would have used the ThreadPool instead of the BackgroundWorker, and given each work item 1 URL to process, not 100. The thread pool will limit the number of threads it starts at once, so you won't have to worry about firing 1000 requests at once. Have a look here for a good example:
http://msdn.microsoft.com/en-us/library/3dasc8as.aspx
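A hedged sketch of that suggestion: one work item per URL inside backgroundWorker1_DoWork, a lock serializing access to the shared ReportWriter, and a CountdownEvent (.NET 4) to wait until every URL has been handled. The field names are illustrative.
using System.Threading;

private readonly object _reportLock = new object();

private void backgroundWorker1_DoWork(object sender, DoWorkEventArgs e)
{
    using (var countdown = new CountdownEvent(this.urlList.Count))
    {
        for (int i = 0; i < this.urlList.Count; i++)
        {
            int index = i;                       // copy for the closure
            ThreadPool.QueueUserWorkItem(_ =>
            {
                cntrlr.Vmain(index);             // process one URL
                lock (_reportLock)
                {
                    // only one thread at a time may touch the ReportWriter here
                }
                countdown.Signal();
            });
        }
        countdown.Wait();                        // block until all URLs are done
    }
}
backgroundWorker1.ReportProgress can still be called from inside each work item if per-URL progress updates are needed.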
Feeling a little more adventurous? Consider using TPL DataFlow to download a bunch of urls:
var urls = new[]{
"http://www.google.com",
"http://www.microsoft.com",
"http://www.apple.com",
"http://www.stackoverflow.com"};
var tb = new TransformBlock<string, string>(async url => {
using(var wc = new WebClient())
{
var data = await wc.DownloadStringTaskAsync(url);
Console.WriteLine("Downloaded : {0}", url);
return data;
}
}, new ExecutionDataflowBlockOptions{MaxDegreeOfParallelism = 4});
var ab = new ActionBlock<string>(data => {
//process your data
Console.WriteLine("data length = {0}", data.Length);
}, new ExecutionDataflowBlockOptions{MaxDegreeOfParallelism = 1});
tb.LinkTo(ab, new DataflowLinkOptions { PropagateCompletion = true }); //join output of producer to consumer block and flow completion downstream
foreach(var u in urls)
{
tb.Post(u);
}
tb.Complete();
Note how you can control the parallelism of each block explicitly, so you can gather in parallel but process without going concurrent (for example).
Just grab it with nuget. Easy.

How to limit the execution time of a function in c sharp?

I've got a problem. I'm writing a benchmark and I have a function that is either done in 2 seconds or after ~5 minutes (depending on the input data). I would like to stop that function if it has been executing for more than 3 seconds...
How can I do it?
Thanks a lot!
Well..., I had the same question, and after reading all the answers here and the referred blogs, I settled for this,
It lets me execute any block of code with a time limit. Declare the wrapper method:
public static bool ExecuteWithTimeLimit(TimeSpan timeSpan, Action codeBlock)
{
try
{
Task task = Task.Factory.StartNew(() => codeBlock());
task.Wait(timeSpan);
return task.IsCompleted;
}
catch (AggregateException ae)
{
throw ae.InnerExceptions[0];
}
}
And use that to wrap any block of code like this
// code here
bool Completed = ExecuteWithTimeLimit(TimeSpan.FromMilliseconds(1000), () =>
{
//
// Write your time bounded code here
//
});
//More code
The best way would be for your function to check its execution time often enough to decide to stop if it takes too long.
If this is not the case, then run the function in a separate thread. In your main thread, start a 3-second timer. When the timer elapses, kill the separate thread using Thread.Abort() (unless, of course, the function is already over). See sample code and precautions of usage in the function docs.
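A minimal sketch of that timer-plus-Abort approach; LongRunningFunction is a placeholder, and note that Thread.Abort is only supported on the .NET Framework (on .NET Core/5+ it throws PlatformNotSupportedException, so prefer cooperative cancellation there).
using System;
using System.Threading;

var worker = new Thread(() => LongRunningFunction());   // the 2-second-to-5-minute function
worker.Start();

// Fire once after 3 seconds and kill the thread if it is still going.
var killTimer = new Timer(_ =>
{
    if (worker.IsAlive)
        worker.Abort();      // risky: may leave shared state in an inconsistent condition
}, null, TimeSpan.FromSeconds(3), Timeout.InfiniteTimeSpan);

worker.Join();               // returns when the function finishes or is aborted
killTimer.Dispose();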
The best way in C# to stop a function in the middle is the return keyword, but how do you know when to use it, i.e. once the function has run for at least 3 seconds? The Stopwatch class from System.Diagnostics is the answer. A complicated function that lasts between 2 seconds and 5 minutes (depending on the input data) logically uses many loops, and maybe even recursion, so my solution is: on the first line of that function, create a Stopwatch instance, start it by calling its Start() method, and at the beginning of each loop add the following code:
if (stopwatch.ElapsedMilliseconds >= 3000)
{
    stopwatch.Stop();
    // or
    stopwatch.Reset();
    return;
}
(Tip: you can type it once, copy it with Ctrl+C, and then just paste it with Ctrl+V.) If the function uses recursion, make the Stopwatch a shared field rather than a local instance, and start it at the beginning of the code only if it is not already running (you can check this with the IsRunning property). Then test whether the elapsed time is more than 3 seconds; if so, stop or reset the Stopwatch and use the return keyword to unwind the recursion. That's it. As you can see, it is very simple; I tested this solution and it works. Try it yourself!
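A compact sketch of that idea applied to recursion; Node, RunBenchmark and Search are illustrative names, and the shared Stopwatch caps the total running time at 3 seconds.
using System.Diagnostics;

private static readonly Stopwatch stopwatch = new Stopwatch();

private void RunBenchmark(Node root)
{
    stopwatch.Restart();        // start the 3-second budget
    Search(root);
    stopwatch.Stop();
}

private void Search(Node node)
{
    if (stopwatch.ElapsedMilliseconds >= 3000)
    {
        return;                 // budget exhausted: unwind the recursion
    }

    // ... do the real work for this node here ...

    foreach (Node child in node.Children)
    {
        Search(child);
    }
}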
private static int LongRunningMethod()
{
var r = new Random();
var randomNumber = r.Next(1, 10);
var delayInMilliseconds = randomNumber * 1000;
Task.Delay(delayInMilliseconds).Wait();
return randomNumber;
}
And
var task = Task.Run(() =>
{
return LongRunningMethod();
});
bool isCompletedSuccessfully = task.Wait(TimeSpan.FromMilliseconds(3000));
if (isCompletedSuccessfully)
{
return task.Result;
}
else
{
throw new TimeoutException("The function has taken longer than the maximum time allowed.");
}
It works for me!
Source: https://jeremylindsayni.wordpress.com/2016/05/28/how-to-set-a-maximum-time-to-allow-a-c-function-to-run-for/
You can use the fork/join pattern, in the Task Parallel Library this is implemented with Task.WaitAll()
using System.Threading;
using System.Threading.Tasks;

void CutoffAfterThreeSeconds()
{
    // start function on separate thread
    CancellationTokenSource cts = new CancellationTokenSource();
    Task loop = Task.Factory.StartNew(() => Loop(cts.Token));
    // wait for max 3 seconds
    if (Task.WaitAll(new Task[] { loop }, 3000))
    {
        // Loop finished within 3 seconds
    }
    else
    {
        // it did not finish within 3 seconds
        cts.Cancel();
    }
}

// this one takes forever unless cancelled
void Loop(CancellationToken ct)
{
    while (!ct.IsCancellationRequested)
    {
        // your loop goes here
    }
    Console.WriteLine("Got Cancelled");
}
This will start the other task on a separate thread and then wait up to 3000 milliseconds for it to finish. If it finished within the timeout, WaitAll returns true, else false, so you can use that to decide what to do next.
You can use a CancellationToken to communicate to the other thread that its result is no longer needed so it can stop gracefully.
Regards Gert-Jan
Run this function in a thread and kill it after 3 seconds, or check the elapsed time inside the function (I assume there is a loop in there).
Use an OS callback with a high-performance counter, then kill your thread if it still exists.
It is possible to execute a function in a separate thread and limit its execution with Thread.Join(millisecondsTimeout):
using System.Threading;

Thread workThread = new Thread(DoFunc);
workThread.Start(param);
if (!workThread.Join(3000))
{
    // DoFunc() took longer than 3 seconds; the thread is still running,
    // so decide here whether to abort it or simply stop waiting for it
}

private void DoFunc(object param)
{
    // do some long work
}
Since C# and the .NET Framework are not real-time environments, you can't guarantee even the 3-second count. Even if you could get close to that, you would still have to insert something like if (timeSpan > TimeSpan.FromSeconds(3)) goto endIdentifier; before every other call in the method.
All of this is just wrong, so no, there is no reliable way to do it as far as I know.
Although you can try this solution
https://web.archive.org/web/20140222210133/http://kossovsky.net/index.php/2009/07/csharp-how-to-limit-method-execution-time
but I just wouldn't do such things in a .NET application.

Threadpool/WaitHandle resource leak/crash

I think I may need to re-think my design. I'm having a hard time narrowing down a bug that is causing my computer to completely hang, sometimes throwing an HRESULT 0x8007000E from VS 2010.
I have a console application (that I will later convert to a service) that handles transferring files based on a database queue.
I am throttling the threads allowed to transfer. This is because some systems we are connecting to can only contain a certain number of connections from certain accounts.
For example, System A can only accept 3 simultaneous connections (which means 3 separate threads). Each one of these threads has its own unique connection object, so we shouldn't run into any synchronization problems since they aren't sharing a connection.
We want to process the files from those systems in cycles. So, for example, we will allow 3 connections that can transfer up to 100 files per connection. This means, to move 1000 files from System A, we can only process 300 files per cycle, since 3 threads are allowed with 100 files each. Therefore, over the lifetime of this transfer, we will have 10 threads. We can only run 3 at a time. So, there will be 3 cycles, and the last cycle will only use 1 thread to transfer the last 100 files. (3 threads x 100 files = 300 files per cycle)
The current architecture by example is:
A System.Threading.Timer checks the queue every 5 seconds for something to do by calling GetScheduledTask()
If there's nothing to do, GetScheduledTask() simply does nothing
If there is work, create a ThreadPool thread to process the work [Work Thread A]
Work Thread A sees that there are 1000 files to transfer
Work Thread A sees that it can only have 3 threads running to the system it is getting files from
Work Thread A starts three new work threads [B,C,D] and transfers
Work Thread A waits for B,C,D [WaitHandle.WaitAll(transfersArray)]
Work Thread A sees that there are still more files in the queue (should be 700 now)
Work Thread A creates a new array to wait on [transfersArray = new TransferArray[3]], which is the max for System A but could vary per system
Work Thread A starts three new work threads [B,C,D] and waits for them [WaitHandle.WaitAll(transfersArray)]
The process repeats until there are no more files to move.
Work Thread A signals that it is done
I am using ManualResetEvent to handle the signaling.
My questions are:
Is there any glaring circumstance which would cause a resource leak or problem that I am experiencing?
Should I loop thru the array after every WaitHandle.WaitAll(array) and call array[index].Dispose()?
The Handle count under the Task Manager for this process slowly creeps up
I am calling the initial creation of Worker Thread A from a System.Threading.Timer. Is there going to be any problems with this? The code for that timer is:
(Some class code for scheduling)
private ManualResetEvent _ResetEvent;
private void Start()
{
_IsAlive = true;
ManualResetEvent transferResetEvent = new ManualResetEvent(false);
//Set the scheduler timer to 5 second intervals
_ScheduledTasks = new Timer(new TimerCallback(ScheduledTasks_Tick), transferResetEvent, 200, 5000);
}
private void ScheduledTasks_Tick(object state)
{
ManualResetEvent resetEvent = null;
try
{
resetEvent = (ManualResetEvent)state;
//Block timer until GetScheduledTask() finishes
_ScheduledTasks.Change(Timeout.Infinite, Timeout.Infinite);
GetScheduledTask();
}
finally
{
_ScheduledTasks.Change(5000, 5000);
Console.WriteLine("{0} [Main] GetScheduledTasks() finished", DateTime.Now.ToString("MMddyy HH:mm:ss:fff"));
resetEvent.Set();
}
}
private void GetScheduledTask()
{
try
{
//Check to see if the database connection is still up
if (!_IsAlive)
{
//Handle
_ConnectionLostNotification = true;
return;
}
//Get scheduled records from the database
ISchedulerTask task = null;
using (DataTable dt = FastSql.ExecuteDataTable(
_ConnectionString, "hidden for security", System.Data.CommandType.StoredProcedure,
new List<FastSqlParam>() { new FastSqlParam(ParameterDirection.Input, SqlDbType.VarChar, "#ProcessMachineName", Environment.MachineName) })) //call to static class
{
if (dt != null)
{
if (dt.Rows.Count == 1)
{ //Only 1 row is allowed
DataRow dr = dt.Rows[0];
//Get task information
TransferParam.TaskType taskType = (TransferParam.TaskType)Enum.Parse(typeof(TransferParam.TaskType), dr["TaskTypeId"].ToString());
task = ScheduledTaskFactory.CreateScheduledTask(taskType);
task.Description = dr["Description"].ToString();
task.IsEnabled = (bool)dr["IsEnabled"];
task.IsProcessing = (bool)dr["IsProcessing"];
task.IsManualLaunch = (bool)dr["IsManualLaunch"];
task.ProcessMachineName = dr["ProcessMachineName"].ToString();
task.NextRun = (DateTime)dr["NextRun"];
task.PostProcessNotification = (bool)dr["NotifyPostProcess"];
task.PreProcessNotification = (bool)dr["NotifyPreProcess"];
task.Priority = (TransferParam.Priority)Enum.Parse(typeof(TransferParam.Priority), dr["PriorityId"].ToString());
task.SleepMinutes = (int)dr["SleepMinutes"];
task.ScheduleId = (int)dr["ScheduleId"];
task.CurrentRuns = (int)dr["CurrentRuns"];
task.TotalRuns = (int)dr["TotalRuns"];
SchedulerTask scheduledTask = new SchedulerTask(new ManualResetEvent(false), task);
//Queue up task to worker thread and start
ThreadPool.QueueUserWorkItem(new WaitCallback(this.ThreadProc), scheduledTask);
}
}
}
}
catch (Exception ex)
{
//Handle
}
}
private void ThreadProc(object taskObject)
{
SchedulerTask task = (SchedulerTask)taskObject;
ScheduledTaskEngine engine = null;
try
{
engine = SchedulerTaskEngineFactory.CreateTaskEngine(task.Task, _ConnectionString);
engine.StartTask(task.Task);
}
catch (Exception ex)
{
//Handle
}
finally
{
task.TaskResetEvent.Set();
task.TaskResetEvent.Dispose();
}
}
0x8007000E is an out-of-memory error. That and the handle count seem to point to a resource leak. Ensure you're disposing of every object that implements IDisposable. This includes the arrays of ManualResetEvents you're using.
If you have time, you may also want to convert to using the .NET 4.0 Task class; it was designed to handle complex scenarios like this much more cleanly. By defining child Task objects, you can reduce your overall thread count (threads are quite expensive not only because of scheduling but also because of their stack space).
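A hedged sketch of that Task-based shape: one parent task per transfer job, children attached to the parent, and a SemaphoreSlim keeping at most 3 transfers to System A running at once. The helper names (SplitIntoBatchesOf100, TransferBatch, files) are illustrative.
using System.Threading;
using System.Threading.Tasks;

SemaphoreSlim connectionLimit = new SemaphoreSlim(3);    // System A allows 3 simultaneous connections

Task parent = Task.Factory.StartNew(() =>
{
    foreach (var batch in SplitIntoBatchesOf100(files))  // hypothetical helper and file list
    {
        Task.Factory.StartNew(() =>
        {
            connectionLimit.Wait();                      // at most 3 transfers run concurrently
            try
            {
                TransferBatch(batch);                    // hypothetical per-batch transfer
            }
            finally
            {
                connectionLimit.Release();
            }
        }, TaskCreationOptions.AttachedToParent);
    }
});

parent.Wait();   // completes only after every attached child task has finished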
I'm looking for answers to a similar problem (handle count increasing over time).
I took a look at your application architecture and would like to suggest something that could help you out:
Have you heard about IOCP (I/O Completion Ports)?
I'm not sure how difficult it is to implement this in C#, but in C/C++ it is a piece of cake.
By using this you create a unique thread pool (the number of threads in that pool is generally defined as 2 x the number of processors or processor cores in the PC or server).
You associate this pool with an IOCP handle and the pool does the work.
See the help for these functions:
CreateIoCompletionPort();
PostQueuedCompletionStatus();
GetQueuedCompletionStatus();
In general, creating and destroying threads on the fly can be time consuming and leads to performance penalties and memory fragmentation.
There is plenty of literature about IOCP on MSDN and via Google.
I think you should reconsider your architecture altogether. The fact that you can only have 3 simultaneously connections is almost begging you to use 1 thread to generate the list of files and 3 threads to process them. Your producer thread would insert all files into a queue and the 3 consumer threads will dequeue and continue processing as items arrive in the queue. A blocking queue can significantly simplify the code. If you are using .NET 4.0 then you can take advantage of the BlockingCollection class.
public class Example
{
private BlockingCollection<string> m_Queue = new BlockingCollection<string>();
public void Start()
{
var threads = new Thread[]
{
new Thread(Producer),
new Thread(Consumer),
new Thread(Consumer),
new Thread(Consumer)
};
foreach (Thread thread in threads)
{
thread.Start();
}
}
private void Producer()
{
while (true)
{
Thread.Sleep(TimeSpan.FromSeconds(5));
ScheduledTask task = GetScheduledTask();
if (task != null)
{
foreach (string file in task.Files)
{
m_Queue.Add(file);
}
}
}
}
private void Consumer()
{
// Make a connection to the resource that is assigned to this thread only.
while (true)
{
string file = m_Queue.Take();
// Process the file.
}
}
}
I have definitely oversimplified things in the example above, but I hope you get the general idea. Notice how this is much simpler as there is not much in the way of thread synchronization (most will be embedded in the blocking queue) and of course there is no use of WaitHandle objects. Obviously you would have to add in the correct mechanisms to shut down the threads gracefully, but that should be fairly easy.
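One possible shutdown mechanism for the example above: have the producer call CompleteAdding() when it is finished, and let each consumer loop over GetConsumingEnumerable() instead of calling Take() forever; the loop then ends on its own once the queue is drained.
private void Consumer()
{
    // Make a connection to the resource that is assigned to this thread only.
    foreach (string file in m_Queue.GetConsumingEnumerable())
    {
        // Process the file.
    }
    // Reached only after m_Queue.CompleteAdding() has been called by the producer
    // and every remaining item has been taken.
}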
It turns out the source of this strange problem was not related to architecture but rather because of converting the solution from 3.5 to 4.0. I re-created the solution, performing no code changes, and the problem never occurred again.
