I'm here because I'm seeing some weird behaviour with this code.
Before anything else: I KNOW this is really bad practice, so it isn't actually used anywhere. I just want to understand what is happening behind the scenes, and my knowledge in this area is really poor.
Here is the code in question:
int worker = 0;
int io = 0;
Console.WriteLine($"Worker thread {worker} Io thread {io}");
ThreadPool.GetAvailableThreads(out worker, out io);
ThreadPool.GetMaxThreads(out var workerThreadsMax, out var completionPortThreadsMax);
Console.WriteLine($"Worker thread {workerThreadsMax - worker} Io thread {completionPortThreadsMax - io}");
for (int i = 0; i < 100; i++)
{
    Task.Run(() =>
    {
        Console.WriteLine(Thread.CurrentThread.ManagedThreadId + " - Running thread");
        ThreadPool.GetAvailableThreads(out var worker2, out var io2);
        ThreadPool.GetMaxThreads(out var workerThreadsMax2, out var completionPortThreadsMax2);
        Console.WriteLine($"{Thread.CurrentThread.ManagedThreadId} - Worker thread {workerThreadsMax2 - worker2} Io thread {completionPortThreadsMax2 - io2}");
        var t1 = Task.Delay(5000);
        var t2 = Task.Delay(5000);
        Task.WaitAll(t1, t2);
        Console.WriteLine(Thread.CurrentThread.ManagedThreadId + " - End of thread");
        ThreadPool.GetAvailableThreads(out worker2, out io2);
        ThreadPool.GetMaxThreads(out workerThreadsMax2, out completionPortThreadsMax2);
        Console.WriteLine($"{Thread.CurrentThread.ManagedThreadId} - Worker thread {workerThreadsMax2 - worker2} Io thread {completionPortThreadsMax2 - io2}");
    });
}
Console.ReadLine();
So what I'm trying to do in this code is to run, or at least queue, 100 tasks (that's a lot, I know, but I was curious), while displaying the number of active threads from the ThreadPool. The first line reports 0 worker threads in the ThreadPool, which makes sense since I haven't started any Task yet. But when the first Task runs there are 8 active threads. And here is where the weird thing happens:
A new thread is spawned roughly every second (not instantly), which is not a real issue, but what I don't understand is why the Tasks are blocked. Even when the 5-second delays have elapsed, the Tasks don't finish; they stay blocked on the Task.WaitAll line even after more than a minute:
Worker thread 0 Io thread 0
Worker thread 0 Io thread 0
8 - Running thread
8 - Worker thread 8 Io thread 0
6 - Running thread
6 - Worker thread 8 Io thread 0
10 - Running thread
10 - Worker thread 8 Io thread 0
7 - Running thread
7 - Worker thread 8 Io thread 0
11 - Running thread
11 - Worker thread 8 Io thread 0
5 - Running thread
9 - Running thread
9 - Worker thread 8 Io thread 0
12 - Running thread
12 - Worker thread 8 Io thread 0
5 - Worker thread 8 Io thread 0
13 - Running thread
13 - Worker thread 9 Io thread 0
14 - Running thread
14 - Worker thread 10 Io thread 0
15 - Running thread
15 - Worker thread 11 Io thread 0
16 - Running thread
16 - Worker thread 12 Io thread 0
17 - Running thread
17 - Worker thread 13 Io thread 0
18 - Running thread
18 - Worker thread 14 Io thread 0
Is there a deadlock happening here? If someone can explain this to me, that would be really great. Thanks.
EDIT: To those who proposed using async and await Task.WhenAll(..): you are totally right! But as I said, this was for testing purposes; in real code I would use async/await instead. A friend and I were experimenting with synchronous versus asynchronous Tasks, and when testing the synchronous way we ran into this problem without knowing what was happening. Thanks to those who clarified it; it has been very instructive.
Let's look at this piece of code in greater detail:
var t1 = Task.Delay(5000);
var t2 = Task.Delay(5000);
Task.WaitAll(t1, t2);
Task.Delay is essentially a wrapper around a timer, more specifically a System.Threading.Timer. This timer delegates the actual time keeping to the OS. When the timer elapses, it raises the event on a thread pool thread, which in turn marks the delay task as completed. That completion is what lets the blocked Task.WaitAll call check whether all of its tasks are done and, if so, unblock.
However, your test is essentially designed to exhaust the thread pool, causing a classic deadlock. Every thread is blocked in Task.WaitAll waiting for its Task.Delay tasks to complete, but completing them requires an available thread pool thread, and all the pool threads are blocked in Task.WaitAll. So everything is waiting for something else, and nothing can run.
Except the designers of the thread pool anticipated this problem and added a mechanism that slowly increases the number of thread pool threads, allowing the Task.Delay tasks to complete and resolving the deadlock. But this mechanism injects roughly one new thread per second, so it should not be a surprise that there is a huge delay before the tasks complete.
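If you want to see the injection mechanism at work, one rough sketch (for experimentation only, since blocking pool threads is still the underlying problem) is to raise the minimum worker-thread count before queuing the tasks; the value used here is arbitrary:
// For experimentation only: with a higher minimum, the pool creates worker
// threads on demand instead of injecting roughly one per second, so the
// blocked Task.WaitAll calls unblock much sooner. 128 is an arbitrary value.
ThreadPool.GetMinThreads(out var minWorker, out var minIo);
Console.WriteLine($"Default minimums: {minWorker} worker, {minIo} IO");
ThreadPool.SetMinThreads(128, minIo);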
Since this is a toy example a solution might not be required, but it bears repeating: do not oversubscribe the thread pool. Use asynchronous, non-blocking code, or code that is careful about how many threads it uses.
Your Task should be:
Task.Run(async () =>
{
    var t1 = Task.Delay(5000);
    var t2 = Task.Delay(5000);
    await Task.WhenAll(t1, t2);
});
Without the async, the delegate is blocking; with async/await the task becomes non-blocking. As mentioned by Gurustron, you need to use the non-blocking Task.WhenAll, not Task.WaitAll. See WaitAll vs WhenAll.
Can anyone explain (or have a resource that explains) exactly when ThreadPool threads are released back to the ThreadPool? Here is a small example program (Dotnet fiddle: https://dotnetfiddle.net/XRso3q)
public static async Task Main()
{
    Console.WriteLine("Start");
    var t1 = ShortWork("SW1");
    var t2 = ShortWork("SW2");
    await Task.Delay(50);
    var t3 = LongWork("LW1");
    Console.WriteLine($"After starting LongWork Thread={Thread.CurrentThread.ManagedThreadId}");
    await Task.WhenAll(t1, t2);
    await t3;
    Console.WriteLine("Done");
}

public static async Task ShortWork(string name)
{
    Console.WriteLine($"SHORT Start {name} Thread={Thread.CurrentThread.ManagedThreadId}");
    await Task.Delay(500);
    Console.WriteLine($"SHORT End {name} Thread={Thread.CurrentThread.ManagedThreadId}");
}

public static async Task LongWork(string name)
{
    Console.WriteLine($"LONG Start {name} Thread={Thread.CurrentThread.ManagedThreadId}");
    await Task.Delay(2500);
    Console.WriteLine($"LONG End {name} Thread={Thread.CurrentThread.ManagedThreadId}");
}
Outputs:
Start
SHORT Start SW1 Thread=1
SHORT Start SW2 Thread=1
LONG Start LW1 Thread=5
After starting LongWork Thread=5
SHORT End SW1 Thread=7
SHORT End SW2 Thread=5
LONG End LW1 Thread=5
Done
LongWork starts on thread 5, but at some point thread 5 is released back to the thread pool, since thread 5 is able to pick up the end of SHORT SW2. When exactly does thread 5 get released back to the thread pool after await Task.Delay(2500) in LongWork? Does the await call itself release it? I don't think that's the case, because if I log the thread id right after the call to LongWork, that line still runs on thread 5. await Task.WhenAll is called on thread 5, which then releases control back up to whatever called Main - is this where it gets released, since there is no 'caller' to go back to?
My understanding of what happens:
Starts on thread 1, thread 1 executes ShortWork SW1 and SW2.
Task.Delay(50) is awaited and thread 1 gets released (as there is no more work to do?)
Thread 5 is chosen to pick up the continuation after the 50ms delay
Thread 5 kicks off LongWork, and it gets to the awaited 2500ms delay. Control gets released back up to Main, still on thread 5. t1 and t2 are awaited - control gets released back up to whatever called Main (and so thread 5's work is done - it gets released to the thread pool)
At this point no threads are 'doing' anything
When the ShortWork delay is done, thread 5 and 7 are selected from the pool for the continuation of each call. Once done with the continuation, these are released to the pool (?)
Another thread picks up the continuation between Task.WhenAll and await t3, which then immediately gets released because it is just awaiting t3
A ThreadPool thread is selected to do the continuation of the LongWork call
Finally, a ThreadPool thread picks up the last work to write done after t3 is done.
Also, as a bonus: why does thread 7 pick up the end of SW1, and thread 5 the ends of SW2 and LW1? These are the threads that were just used. Are they somehow kept as 'hot' threads and prioritised for continuations that come up?
The way await works is that it first checks its awaitable argument; if it has already completed, then the async method continues synchronously. If it is not complete, then the method returns to its caller.
A second key to understanding is that all async methods begin executing synchronously, on a normal call stack just like any other method.
The third useful piece of information here is that a Console app needs a foreground thread to keep running or it will exit. So when you have an async Main, behind the scenes the runtime blocks the main thread on the returned task. So, in your example, when the first await is hit in Main, it returns a task, and the main thread 1 spends the rest of the time blocked on that task.
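A simplified sketch of what that generated entry point amounts to (the method name here is made up, and the real generated code differs in detail):
public static void MainEntryPoint(string[] args)   // hypothetical name for the compiler-generated entry point
{
    // The main (foreground) thread blocks here until the task returned by the async Main completes.
    Main().GetAwaiter().GetResult();
}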
In this code, all continuations are run by thread pool threads. It's not specified or guaranteed which thread(s) will run which continuations.
The current implementation uses synchronous continuations, so in your example the thread id for LONG Start LW1 and After starting LongWork will always be the same. You can even place a breakpoint on After starting LongWork and see how a LongWork continuation is actually in the call stack of your Main continuation.
What actually happens is that when a pool thread starts, it takes a task from the pool's work queue, waiting if necessary. It then executes the task, and then takes a new one, again waiting if necessary. This is repeated until the thread determines that it has to die and exits.
While the thread is taking a new task, or waiting for one, you could say that it has been "released to the thread pool", but that's not really useful or helpful. Nothing from the pool actually does things to the threads, after starting them up. The threads control themselves.
When you write an async function, the compiler transforms it in a way that divides it into many pieces, each of which will be executed by a function call. Where you write await Task.Delay(), what actually happens is that your function arranges for the remainder of its execution to be scheduled when the delay completes, and then returns all the way out to the thread's main procedure, allowing that thread to take a new work item from the thread pool's queue.
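To make that concrete, here is a very loose approximation of the LongWork method above using ContinueWith; the compiler actually generates a state machine rather than a ContinueWith call, so treat this purely as an analogy for the control flow:
public static Task LongWorkApprox(string name)
{
    // Runs synchronously on the calling thread, just like the code before the first await.
    Console.WriteLine($"LONG Start {name} Thread={Thread.CurrentThread.ManagedThreadId}");

    // Schedule the "remainder" of the method and return immediately to the caller.
    return Task.Delay(2500).ContinueWith(_ =>
    {
        // This continuation runs later, on whatever thread pool thread picks it up.
        Console.WriteLine($"LONG End {name} Thread={Thread.CurrentThread.ManagedThreadId}");
    });
}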
I have some very simple code that's attempting to multi-thread an existing script.
Inspecting the Threads window in Visual Studio and calling Thread.CurrentThread.ManagedThreadId, the start of each task always reports the same thread id as the one that started the process. When a task ends, it reports a different thread id.
The threads do seem to be performing the work asynchronously, but the logging and the output from Visual Studio are making me think otherwise.
Please could someone clarify what is going on and if I've made a mistake in my approach?
namespace ResolveGoogleURLs
{
    class Program
    {
        public static void Main(string[] args)
        {
            HomeController oHC = new HomeController();
        }
    }
}
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

namespace ResolveGoogleURLs
{
    class HomeController
    {
        public static int MaxJobs = 5;
        public static int RecordsPerJob = 1000;
        public static List<Task> TaskList = new List<Task>(MaxJobs);

        public HomeController()
        {
            CreateJobs();
            MonitorTasks();
        }

        public void MonitorTasks()
        {
            while (1 == 1)
            {
                Task.WaitAny(TaskList.ToArray());
                TaskList.RemoveAll(x => x.IsCompleted);
                Console.WriteLine("Task complete! Launching new...");
                CreateJobs();
            }
        }

        public async Task CreateJob()
        {
            Console.WriteLine("Thread {0} - Start", Thread.CurrentThread.ManagedThreadId);
            // read in results from sql
            await Task.Delay(10000);
            Console.WriteLine("Thread {0} - End", Thread.CurrentThread.ManagedThreadId);
        }

        public void CreateJobs()
        {
            while (TaskList.Count < MaxJobs)
            {
                TaskList.Add(CreateJob());
            }
        }
    }
}
Output:
Thread 1 - Start
Thread 1 - Start
Thread 1 - Start
Thread 1 - Start
Thread 1 - Start
Thread 4 - End
Thread 5 - End
Thread 4 - End
Thread 6 - End
Thread 8 - End
Task complete! Launching new...
Thread 1 - Start
Thread 1 - Start
Thread 1 - Start
Thread 1 - Start
Thread 1 - Start
Thread 7 - End
Thread 6 - End
Thread 5 - End
Thread 4 - End
Thread 8 - End
Task complete! Launching new...
Thread 1 - Start
Thread 1 - Start
Thread 1 - Start
Task complete! Launching new...
Thread 1 - Start
Thread 1 - Start
Thread 10 - End
Thread 9 - End
Task complete! Launching new...
Thread 1 - Start
Thread 7 - End
Thread 4 - End
Thread 6 - End
Task complete! Launching new...
Thread 1 - Start
Thread 1 - Start
Task complete! Launching new...
Thread 1 - Start
Thread 1 - Start
Tasks (class Task) are not the same as threads (class Thread).
A thread can be imagined as a virtual CPU that runs its code at the same time as other threads. Every thread has its own stack (where local variables and function parameters are stored). In .NET a thread is mapped onto a native thread supported by the platform (the operating system), which has things like a thread kernel object, a kernel-mode stack, a thread environment block, etc. This makes a thread a pretty heavyweight object. Creating a new thread is time-consuming, and even when a thread sleeps (doesn't execute its code), it still consumes a lot of memory (its stack).
The operating system periodically goes through all running threads and, based on their priority, assigns each one a time slice during which it can use a real CPU (execute its code).
Because threads consume memory, are slow to create, and running too many of them hurts overall system performance, tasks (and thread pools) were invented.
A Task internally uses threads, because threads are the only way to run code in parallel in .NET (actually, the only way to run any code at all). But it does so in an efficient way.
When a .NET process is started, internally it creates a thread pool - a pool of threads that are used to execute tasks.
The task library is implemented so that when a process starts, the thread pool contains only a small number of threads. If you start creating new tasks, they are stored in a queue from which those threads take and execute one task after another. But .NET monitors whether the threads are overloaded with too many tasks - a situation where tasks wait 'too long' in the thread pool queue. If, based on its internal criteria, it detects that the existing threads are overloaded, it creates a new one. So the thread pool now has one more thread, and tasks can run on more of them in parallel. This process can repeat, so under heavy load the thread pool can grow to many threads.
On the other hand, if the frequency of new tasks drops and the thread pool threads have no tasks to execute, then after 'some time' (how long is again defined by internal thread pool criteria) the pool can decide to release some of the threads to conserve system resources. The number of thread pool threads can drop back down to the initial count.
So the task library internally uses threads, but it tries to use them efficiently: the number of threads is scaled up and down based on the number of tasks the program wants to execute.
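If you want to watch this scaling happen, a small sketch like the following can help; note that ThreadPool.ThreadCount only exists on .NET Core 3.0+ / .NET 5+, and on older runtimes you can approximate it from GetMaxThreads and GetAvailableThreads:
// Queue some deliberately blocking work to force the pool to grow, then
// sample the thread count once a second and watch it rise and later fall.
for (int i = 0; i < 200; i++)
{
    Task.Run(() => Thread.Sleep(2000));
}

for (int i = 0; i < 20; i++)
{
    // ThreadPool.ThreadCount: .NET Core 3.0+ / .NET 5+ only.
    Console.WriteLine($"{DateTime.Now:HH:mm:ss}  pool threads: {ThreadPool.ThreadCount}");
    Thread.Sleep(1000);
}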
In almost all cases tasks should be the preferred solution over using 'raw' threads.
I want the process below to run continuously, but I am confused about whether to use Thread or Task, and I am new to implementing either. Is the way I am implementing it right? Between Thread and Task, which gives better performance for a long-running process?
private void BtnStart_Click(object sender, EventArgs e)
{
    IClockThread();
}
This method creates a thread for every machine.
public void IClockThread()
{
    txtStatus.BeginInvoke((Action)(() => txtStatus.Text = "Thread for IClock Starts......" + Environment.NewLine));
    Thread[] ts = new Thread[IclockDetails.Count];
    for (int i = 0; i < 5; i++)
    {
        string IP = IclockDetails[i].IpAddress;
        ts[i] = new Thread(() =>
        {
            ConnectMachineIClock(IP);
        });
        ts[i].Start();
    }
    txtStatus.BeginInvoke((Action)(() => txtStatus.Text += "Thread for IClock Ends............" + Environment.NewLine));
}
This is another method which is called by every thread.
public void ConnectMachineIClock(string IP)
{
    int idwErrorCode = 0;
    var Iclock = IclockDetails.Where(a => a.IpAddress == IP).FirstOrDefault();
    if (AttnMachineRepository.IsMachineOnline(IP) == true)
    {
        Stopwatch sw = Stopwatch.StartNew();
        blnCon = CZKEM1.Connect_Net(IP.Trim(), 4370);
        sw.Stop();
        if (blnCon == true)
        {
            UpdateText(1, txtStatus, Iclock.IpAddress);
            iMachineNumber = Iclock.Id;
            LocId = Iclock.LocationId;
            MType = Iclock.MachineTypeId;
            LocName = AttnMachineRepository.GetPunchLocation(LocId);
            CZKEM1.RegEvent(iMachineNumber, 65535);
            UpdateText(2, txtStatus, Iclock.IpAddress);
            //txtStatus.BeginInvoke((Action)(() => txtStatus.Text += ("Connected with " + Iclock.IpAddress + " " + sw.Elapsed.TotalSeconds + " Seconds taken to connect") + Environment.NewLine));
            MachineIP = Iclock.IpAddress;
            Get_IClock_LogData(iMachineNumber);
        }
        else
        {
            CZKEM1.GetLastError(ref idwErrorCode);
            UpdateText(-1, txtErrorLog, Iclock.IpAddress);
            //txtErrorLog.BeginInvoke((Action)(() => txtErrorLog.Text += "Unable to connect the device with IP: " + MachineIP + ", ErrorCode = " + idwErrorCode.ToString() + "" + Environment.NewLine));
            //Application.DoEvents();
        }
    }
    else
    {
        UpdateText(-2, txtErrorLog, Iclock.IpAddress);
        //txtErrorLog.BeginInvoke((Action)(() => txtErrorLog.Text += "IP " + MachineIP + " not found" + Environment.NewLine));
        //Application.DoEvents();
    }
}
public void UpdateText(int status, TextBox text, string IP)
{
    switch (status)
    {
        case 1:
            text.BeginInvoke((Action)(() => text.Text += ("Data Processing for" + IP + " starts") + Environment.NewLine));
            Application.DoEvents();
            break;
        case 2:
            text.BeginInvoke((Action)(() => text.Text += ("Connected with " + IP) + Environment.NewLine));
            Application.DoEvents();
            break;
        case -1:
            text.BeginInvoke((Action)(() => text.Text += "Unable to connect the device with IP: " + IP + ", ErrorCode = " + -1 + "" + Environment.NewLine));
            Application.DoEvents();
            break;
        case -2:
            text.BeginInvoke((Action)(() => text.Text += "IP " + IP + " not found" + Environment.NewLine));
            Application.DoEvents();
            break;
    }
}
Analysis
So your current code basically spins up a new thread for each clock detail, so that those calls all run concurrently. It doesn't wait for any of the threads to finish.
The first problem is that creating new threads is expensive, and running 100 tasks on 100 threads is very rarely, if ever, as efficient as, say, 4 threads running in parallel and taking care of tasks as they become available.
So the first issue is that creating a bunch of new threads is not the best approach here.
Secondly, hitting the network stack with 100 threads (or however many items are in IclockDetails) is also less efficient than using fewer concurrent threads.
Difference Between Task and Thread
To directly answer your first question: it is better to use Tasks in this case. Also, the only difference you will get between doing
new Thread(() => {}).Start();
And doing
Task.Run(() => {});
is that the first will always create a brand-new thread (expensive) to do the work on, whereas the second will usually use an existing, idle thread pool thread.
Secondly, a Task runs on a background thread, not a foreground one, so the Task's thread won't keep your application alive if Main and all foreground threads finish, whereas a new Thread is a foreground thread by default and will.
That is it. Apart from possibly running on an existing idle thread, and running on a background thread, there is no difference at all in how your code executes.
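A minimal sketch to verify the foreground/background point, if you are curious:
// A new Thread is a foreground thread by default; Task.Run uses a background
// thread pool thread.
var t = new Thread(() =>
    Console.WriteLine($"new Thread: IsBackground = {Thread.CurrentThread.IsBackground}")); // False
t.Start();
t.Join();

Task.Run(() =>
    Console.WriteLine($"Task.Run:   IsBackground = {Thread.CurrentThread.IsBackground}")) // True
    .Wait();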
Standard Solution
As mentioned, creating a new thread is expensive. So the first and simple answer to your question is that it is much better to change your new Thread(() => {}).Start(); calls to Task.Run(() => {}); calls.
public void IClockThread()
{
    txtStatus.BeginInvoke((Action)(() => txtStatus.Text = "Thread for IClock Starts......" + Environment.NewLine));
    for (int i = 0; i < IclockDetails.Count; i++)
    {
        // Capture the IP locally so each task uses its own value,
        // not the shared loop variable.
        string ip = IclockDetails[i].IpAddress;
        Task.Run(() => ConnectMachineIClock(ip));
    }
    txtStatus.BeginInvoke((Action)(() => txtStatus.Text += "Thread for IClock Ends............" + Environment.NewLine));
}
I have cleaned up the code, fixed your for loop to cover the whole IclockDetails collection, captured the IP address locally so each task gets its own value, and replaced the thread with a Task.
This will now re-use pool threads as they become available, and the work runs on background threads.
Parallel to the rescue
As mentioned though, if your IclockDetails has 100 items, or anything more than say 20, it's inefficient to just run a thread for each item. Instead you could parallelise them.
Consider this:
Parallel.For(0, 100, (i) =>
{
    Console.WriteLine($"I am {i} on thread {Thread.CurrentThread.ManagedThreadId}");
    Thread.Sleep(10 * i);
});
This will create a bunch of work items which get invoked in parallel as the partitioner sees fit, so it runs with a thread count appropriate for the system it's running on.
It just writes to the console the i of each work item in the range I specified, 0 to 100, and delays longer for larger numbers. If you run this code you will see a bunch of output like this:
I am 0 on thread 1
I am 1 on thread 1
I am 2 on thread 1
I am 5 on thread 3
I am 10 on thread 4
I am 3 on thread 1
I am 15 on thread 5
I am 6 on thread 3
I am 4 on thread 1
I am 20 on thread 6
I am 25 on thread 7
I am 11 on thread 4
I am 35 on thread 11
I am 8 on thread 1
I am 30 on thread 8
I am 45 on thread 12
I am 7 on thread 3
I am 40 on thread 10
I am 50 on thread 9
I am 55 on thread 14
I am 60 on thread 15
I am 16 on thread 13
I am 65 on thread 5
I am 70 on thread 16
I am 75 on thread 18
I am 9 on thread 1
...
As you can see, the items complete out of order because they run in parallel, and the threads get re-used.
You can limit the max number of parallel tasks running at once if you like with
Parallel.For(0, 100, new ParallelOptions { MaxDegreeOfParallelism = 4 }...
to limit it to 4 in this example. However, I would personally leave it up to the partitioner to make that decision.
For clarity, if you limit MaxDegreeOfParallelism to 1, your output would be:
I am 0 on thread 1
I am 1 on thread 1
I am 2 on thread 1
I am 3 on thread 1
I am 4 on thread 1
I am 5 on thread 1
I am 6 on thread 1
I am 7 on thread 1
I am 8 on thread 1
I am 9 on thread 1
I am 10 on thread 1
...
NOTE: The Parallel.For call does not return and carry on to the next line of code until all the work is done. If you want to change that, just do Task.Run(() => Parallel.For(...));
Best Solution
So with that in mind, my proposal for your situation, and the best solution, is to use Parallel.For to split up the work, delegate it onto the right threads, and re-use those threads as the system sees fit.
Also note I only use Task.Run here so that your 'Thread for IClock' start/end outputs behave exactly the same and your IClockThread() method returns before the work is done, so as not to change your current code flow.
public void IClockThread()
{
    txtStatus.BeginInvoke((Action)(() => txtStatus.Text = "Thread for IClock Starts......" + Environment.NewLine));
    Task.Run(() =>
    {
        Parallel.For(0, IclockDetails.Count, (i) =>
        {
            ConnectMachineIClock(IclockDetails[i].IpAddress);
        });
    });
    txtStatus.BeginInvoke((Action)(() => txtStatus.Text += "Thread for IClock Ends............" + Environment.NewLine));
}
The main difference between a Task and a Thread is how the concurrency is done.
A Task is concurrency that is delegated to a thread in the application's thread pool. The idea is that a Task is a reasonably short-lived, concurrent procedure or function that is handed off to a thread pool thread, where it is executed to completion, after which the thread is returned to the pool for some other task.
See the MSDN documentation Task Class provides an overview with links to various method descriptions, etc.
The Task class represents a single operation that does not return a value and that usually executes asynchronously. Task objects are one of the central components of the task-based asynchronous pattern first introduced in the .NET Framework 4. Because the work performed by a Task object typically executes asynchronously on a thread pool thread rather than synchronously on the main application thread, you can use the Status property, as well as the IsCanceled, IsCompleted, and IsFaulted properties, to determine the state of a task. Most commonly, a lambda expression is used to specify the work that the task is to perform.
For operations that return values, you use the Task<TResult> class.
Also see Microsoft Docs Task-based Asynchronous Programming as well as Task-based Asynchronous Pattern (TAP) in Microsoft Docs which is the current recommended approach (see Asynchronous Programming Patterns for a discussion in Microsoft Docs on several patterns).
This MSDN article Asynchronous programming describes using the async and await keywords with links to additional articles including the Microsoft Docs article Async in depth which gets into the gory details.
A Thread is concurrency that is created by the application. The idea is that a Thread is a reasonably long-lived, concurrent procedure or function.
A major difference between the two, Task and Thread, is that a Task is handed off to a ready-to-run thread while a Thread has to be created and spun up. So the startup time is lower and the startup efficiency is higher for a Task, as the thread it is delegated to already exists.
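If you want to get a feel for that startup-cost difference, a rough and deliberately unscientific sketch with a Stopwatch is enough; the exact numbers will vary a lot by machine and runtime:
// Requires System.Diagnostics for Stopwatch. Treat the numbers as indicative only.
var sw = Stopwatch.StartNew();
for (int i = 0; i < 100; i++)
{
    var t = new Thread(() => { });   // a brand-new OS thread each time
    t.Start();
    t.Join();
}
Console.WriteLine($"100 x new Thread: {sw.ElapsedMilliseconds} ms");

sw.Restart();
for (int i = 0; i < 100; i++)
{
    Task.Run(() => { }).Wait();      // re-uses existing thread pool threads
}
Console.WriteLine($"100 x Task.Run:   {sw.ElapsedMilliseconds} ms");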
In my opinion the primary reason to use a Task is to be able to make a short lived action or procedure concurrent. So things such as accessing a web site or doing some kind of a data transfer or performing a calculation of some kind that requires several seconds or updating a data store are the ideal types of activities for a Task. There are a large number of Async type functions with .NET and C# and C++/CLI that are designed to be used with Task to trigger activities while allowing the UI to remain responsive.
For a Thread, ideal activities would be things such as a server that accepts a high volume of requests and acts on them, or a function that monitors several devices or sensors for a long period. Other examples would be additional UI threads to handle a complex user interface, or compute-intensive work such as graphics rendering whose throughput is improved by one or more dedicated CPU cores.
One point to remember is that Task versus Thread is not a binary, either/or decision. One or more Threads may be created to handle particular, encapsulated and self-sufficient functionality, and within one or more of those Threads the Task construct may be used to handle pieces of the Thread's work. A common example is the UI thread, which is designed to handle the various messages of a user interface, while particular actions that happen within the UI thread are handled by Tasks. Another common example would be a server thread which handles multiple concurrent connections using Tasks.
The Microsoft Docs article, The Managed Thread Pool, has this to say:
There are several scenarios in which it is appropriate to create and manage your own threads instead of using thread pool threads:
You require a foreground thread.
You require a thread to have a particular priority.
You have tasks that cause the thread to block for long periods of time. The thread pool has a maximum number of threads, so a large number of blocked thread pool threads might prevent tasks from starting.
You need to place threads into a single-threaded apartment. All ThreadPool threads are in the multithreaded apartment.
You need to have a stable identity associated with the thread, or to dedicate a thread to a task.
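As an illustration of the first two scenarios in that list, a dedicated, long-lived foreground thread with an explicit priority might be sketched like this (the monitoring loop and the thread name are placeholders only):
// Sketch: a dedicated foreground thread with its own priority and identity.
// Pool threads give you none of these knobs.
var monitor = new Thread(() =>
{
    while (true)
    {
        // long-running monitoring loop (placeholder work)
        Thread.Sleep(1000);
    }
})
{
    IsBackground = false,                  // foreground: keeps the process alive
    Priority = ThreadPriority.BelowNormal, // explicit priority
    Name = "DeviceMonitor"                 // hypothetical name, for a stable identity
};
monitor.Start();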
In addition see the following stackoverflow postings and discussions about the differences between Task and Thread.
Task vs Thread differences
What is the difference between task and thread?
I need to control one thread for my own purposes: calculating, waiting, reporting, etc...
In all other cases I'm using the ThreadPool or TaskEx.
In the debugger, when I call Thread.Sleep(), I notice that some parts of the UI become less responsive. Without the debugger, though, it seems to work fine.
The question is: if I create a new Thread and Sleep() it, can that affect the ThreadPool/Tasks?
EDIT: here are code samples:
One random place in my app:
ThreadPool.QueueUserWorkItem((state) =>
{
    LoadImageSource(imageUri, imageSourceRef);
});
Another random place in my app:
var parsedResult = await TaskEx.Run(() => JsonConvert.DeserializeObject<PocoProductItem>(resultString, Constants.JsonSerializerSettings));
My ConcurrentQueue (modified, original is taken from here):
Creating the thread that the queue needs:
public void Process(T request, bool Async = true, bool isRecurssive = false)
{
    if (processThread == null || !processThread.IsAlive)
    {
        processThread = new Thread(ProcessQueue);
        processThread.Name = "Process thread # " + Environment.TickCount;
        processThread.Start();
    }
If one of the tasks reports networking problems, I want this thread to wait a bit:
if (ProcessRequest(requestToProcess, true))
{
    RequestQueue.Dequeue();
}
else
{
    DoWhenTaskReturnedFalse();
    Thread.Sleep(3000);
}
So, the question one more time: can Thread.Sleep(3000);, called from new Thread(ProcessQueue);, affect the ThreadPool or TaskEx.Run()?
Assuming the thread you put to sleep was obtained from the thread pool, then it certainly does affect the thread pool. If you explicitly tell a pool thread to sleep, it cannot be reused by the pool during that time. This may cause the thread pool to spawn new threads if there are jobs waiting to be scheduled. Creating a new thread is always expensive - threads are system resources.
You can, however, look at the Task.Delay method (along with async and await), which suspends the executing code in a more intelligent way - allowing the thread to be reused while waiting.
Refer to this Thread.Sleep vs. Task.Delay article.
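Applied to your queue processing, the waiting part could be sketched roughly like this; ProcessRequest, RequestQueue and DoWhenTaskReturnedFalse are your own members and are assumed here to behave as in your question:
// Hedged sketch only: the same wait expressed with await Task.Delay, so the
// thread goes back to the pool for the 3 seconds instead of sleeping.
private async Task ProcessQueueItemAsync(T requestToProcess)
{
    if (ProcessRequest(requestToProcess, true))
    {
        RequestQueue.Dequeue();
    }
    else
    {
        DoWhenTaskReturnedFalse();
        await Task.Delay(3000);   // non-blocking wait; no thread is held
    }
}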
Thread.Sleep() affects the thread it's called from. If you're calling Thread.Sleep() on a ThreadPool thread and trying to queue up more work, you may be hitting the maximum number of ThreadPool threads, so new work has to wait for a thread to finish before it can execute.
http://msdn.microsoft.com/en-us/library/system.threading.threadpool.setmaxthreads.aspx
No, Thread.Sleep() only affects the current thread. From the Thread.Sleep(Int32) documentation:
The number of milliseconds for which the thread is suspended.
The following code should (at least in my opinion) create 100 Tasks, which all wait in parallel (that's the point of concurrency, right? :D) and finish at almost the same time. I guess a Timer object is created internally for every Task.Delay.
public static async Task MainAsync() {
    var tasks = new List<Task>();
    for (var i = 0; i < 100; i++) {
        Func<Task> func = async () => {
            await Task.Delay(1000);
            Console.WriteLine("Instant");
        };
        tasks.Add(func());
    }
    await Task.WhenAll(tasks);
}

public static void Main(string[] args) {
    MainAsync().Wait();
}
But! When I run this on Mono I get very strange behavior:
The Tasks do not finish at the same time; there are huge delays between them (probably about 500-600 ms).
In the console, Mono shows a lot of created threads:
Loaded assembly: /Users/xxxxx/Programming/xxxxx/xxxxxxxxxx/bin/Release/xxxxx.exe
Thread started: #2
Thread started: #3
Thread started: #4
Thread started: #5
Thread started: #6
Thread started: #7
Thread finished: #3 <-- Obviously the delay of 1000ms finished ?
Thread finished: #2 <-- Obviously the delay of 1000ms finished ?
Thread started: #8
Thread started: #9
Thread started: #10
Thread started: #11
Thread started: #12
Thread started: #13
... you get it.
Is this actually a bug? Or am I using the library wrong?
[EDIT]
I tested a custom sleep method using Timer:
public static async Task MainAsync() {
    Console.WriteLine("Started");
    var tasks = new List<Task>();
    for (var i = 0; i < 100; i++) {
        Func<Task> func = async () => {
            await SleepFast(1000);
            Console.WriteLine("Instant");
        };
        tasks.Add(func());
    }
    await Task.WhenAll(tasks);
    Console.WriteLine("Ready");
}

public static Task SleepFast(int amount) {
    var source = new TaskCompletionSource<object>();
    new Timer(state => {
        var oldSrc = (TaskCompletionSource<object>)state;
        oldSrc.SetResult(null);
    }, source, amount, 0);
    return source.Task;
}
This time, all tasks completed at virtually the same time. So I think it's either a really bad implementation or a bug.
[Edit2]
Just FYI: I've now tested the original code (using Task.Delay) on .NET on Windows 8.1 and it ran as expected (1000 Tasks, waiting 1 second in parallel and finishing together).
So the answer is: Mono's implementation of (some) methods is not perfect. In general, Task.Delay does not start a thread, and even a lot of delays should not create multiple threads.
The Task library is designed more for managing blocking tasks without blocking an entire workflow (task asynchronism, confusingly called "task parallel" by Microsoft), and not for doing large blocks of concurrent computation (parallel execution).
The task library uses a scheduler and queues jobs ready for execution. When jobs are run, they will do so on a thread-pool thread, and these are very limited in number. There is logic to expand the thread count, but unless you have hundreds of CPU cores, it's going to stay a low number.
So to answer the question, some of your tasks are queued up waiting for a thread from the pool, while the other delayed tasks have been issued by the scheduler.
The scheduler and thread-pool logic can be changed at runtime, but if you are trying to get lots of computation done quickly Task isn't right for the job. If you want to deal with lots of slow resources (like disk, database, or internet resources) Task may help keep an app responsive.
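If some piece of work genuinely has to block for a long time, one option is to hint the scheduler with TaskCreationOptions.LongRunning; with the current default scheduler this runs the delegate on a dedicated thread rather than a pool thread (an implementation detail, not a guarantee). A minimal sketch:
// Ask for a dedicated thread so the blocking work does not tie up a pool thread.
var blockingWork = Task.Factory.StartNew(
    () => Thread.Sleep(TimeSpan.FromMinutes(1)),   // stands in for slow, blocking work
    CancellationToken.None,
    TaskCreationOptions.LongRunning,
    TaskScheduler.Default);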
If you just want to learn about Task try these:
The Task library
The Scheduler
This applies to the desktop .NET Framework.
In short, there is a special VM thread which periodically checks the queue of timers and queues the timers' delegates onto the thread pool. Task.Delay does not create a new Thread, but it can still be heavyweight, and there are no guarantees about the order of execution or the precision of the deadlines. And as I understand it, cancelling a Task.Delay may end up just removing the item from the collection, with no thread pool work queued at all.
Task.Delay is scheduled as a DelayPromise by creating a new System.Threading.Timer. All timers are stored in an AppDomain-wide TimerQueue singleton. A native VM timer is used to call back into .NET and check whether any timers in the queue need to fire. Timer delegates are scheduled for execution via ThreadPool.UnsafeQueueUserWorkItem.
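A much-simplified sketch of that mechanism - a cancellable delay built from a Timer and a TaskCompletionSource - could look like this; the real DelayPromise is considerably more involved, so this only shows the shape:
// Simplified, illustrative delay: a Timer completes the task; cancellation
// just cancels the task and disposes the timer, with no pool work queued
// for the delay itself.
public static Task SimpleDelay(int milliseconds, CancellationToken token)
{
    var tcs = new TaskCompletionSource<object>();
    var timer = new Timer(_ => tcs.TrySetResult(null), null, milliseconds, Timeout.Infinite);

    token.Register(() =>
    {
        timer.Dispose();
        tcs.TrySetCanceled();
    });

    // Clean up the timer once the task completes for any reason.
    tcs.Task.ContinueWith(_ => timer.Dispose());
    return tcs.Task;
}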
From a performance point of view, it seems better to cancel the delay if it is no longer needed:
open System.Threading
open System.Threading.Tasks

// takes 0.8% CPU
while true do
    Thread.Sleep(10)
    Task.Delay(50) |> ignore

// takes 0.4% CPU
let mutable a = new CancellationTokenSource()
while true do
    Thread.Sleep(10)
    a.Cancel()
    a.Dispose()
    a <- new CancellationTokenSource()
    let token = a.Token
    Task.Delay(50, token) |> ignore