Thread execution order in C#

I have a list of threads and I want to ensure the execution order between them. This is the code:
for (int k = 0; k < RadioList.Count; k++)
{
    for (int i = 0; i < filePaths.Count(); i++)
    {
        Thread t = new Thread(delegate()
        {
            Thread_Encde_function(TempRadio.PublishPoint, filePaths[i], encodingtype);
        });
        t.Start();
        Thread.Sleep(1000);
    }
}
I want to know if Thread.Join() can do the job.

If your tasks can execute asynchronously (out of order), then threads are appropriate. However, if you have a number of tasks you want to execute in order (strictly one after the other), then why use threads? You should just execute them in your for loop, as they can't be parallelized. Using Thread.Join to wait for the thread right after you start it will do the trick, but that way you will have to wait until a task is finished before starting the next one, effectively executing them in order.
However, if some parts of the tasks can be executed simultaneously and other parts have to be sequential, take a look at the C# Task Parallel Library; it makes things like that easy.
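If you do want to keep the thread-per-file structure but run the encodes strictly one after the other, a minimal sketch of the Join-right-after-Start pattern described above could look like this (it reuses the identifiers from the question; note the local copy of the loop variable, because the delegate in the original code captures i itself and may see a later value by the time the thread runs):
for (int k = 0; k < RadioList.Count; k++)
{
    for (int i = 0; i < filePaths.Count(); i++)
    {
        string path = filePaths[i]; // copy the loop variable so the closure sees a stable value
        Thread t = new Thread(() =>
            Thread_Encde_function(TempRadio.PublishPoint, path, encodingtype));
        t.Start();
        t.Join(); // block until this encode finishes, so files are processed strictly in order
    }
}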

Related

ThreadPool or Task.Factory

I have a Windows service that receives requests, and I want to handle each request in a separate thread.
I also want to limit the number of threads, i.e. a maximum of 5 threads.
And I want to wait for all threads before I close the application.
What is the best way to do that?
What I tried:
for (int i = 0; i < 10; i++)
{
    var i1 = i;
    Task.Factory.StartNew(() => RequestHandle(i1.ToString())).ContinueWith(t => Console.WriteLine("Done"));
}
Task.WaitAll();//Not waiting actually for all threads, why?
Can I limit the number of threads this way?
Or:
var events = new ManualResetEvent[10];
ThreadPool.SetMaxThreads(5, 5);
for (int i = 0; i < 10; i++)
{
    var i1 = i;
    events[i1] = new ManualResetEvent(false); // each event has to be created before it can be set or waited on
    ThreadPool.QueueUserWorkItem(x =>
    {
        Test(i1.ToString());
        events[i1].Set();
    });
}
WaitHandle.WaitAll(events);
Is there another way to implement that?
Both approaches should work; neither guarantees that each operation will run in a different thread (and that is a good thing). Controlling the maximum number of threads is another matter...
You can set the maximum number of threads of the ThreadPool with SetMaxThreads. Remember that this change is global: you only have one ThreadPool.
The default TaskScheduler will use the thread pool (except in some particular situations where it can run the tasks inline on the same thread that is calling them), so changing the parameters of the ThreadPool will also affect the Tasks.
Now, notice I said the default TaskScheduler. You can roll your own, which will give you more control over how the tasks run, but creating a custom TaskScheduler is not advised unless you really need it and you know what you are doing.
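As a rough sketch of the SetMaxThreads route (the return value matters: the call fails and changes nothing if you ask for fewer worker threads than the current minimum, which by default is the processor count, so 5 may be rejected on a machine with more than 5 cores):
bool changed = ThreadPool.SetMaxThreads(5, 5);
if (!changed)
{
    Console.WriteLine("SetMaxThreads failed: the requested maximum is below the current minimum.");
}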
For the embedded question:
Task.WaitAll();//Not waiting actually for all threads, why?
To correctly call Task.WaitAll you need to pass the tasks you want to wait for as parameters. There is no way to wait for all currently existing tasks implicitly; you have to tell it which tasks to wait for.
Task.WaitAll();//Not waiting actually for all threads, why?
you have to do the following:
List<Task> tasks = new List<Task>();
for (int i = 0; i < 10; i++)
{
    var i1 = i;
    tasks.Add(Task.Factory.StartNew(() => RequestHandle(i1.ToString())).ContinueWith(t => Console.WriteLine("Done")));
}
Task.WaitAll(tasks.ToArray()); // WaitAll takes a Task[], so the list has to be converted
I think you should understand the difference between Task and Thread.
Tasks are not threads. A Task is just a promise of a result in the future, and your code can execute on only one thread even if you have many tasks scheduled.
A Thread is a low-level concept: if you start a thread, you know it will be a separate thread.
I think your first implementation is good enough to be sure that all your code is executed, but you should use Task.Run instead of Task.Factory.StartNew.
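A minimal sketch of that suggestion, assuming RequestHandle(string) is the handler from the question; Task.Run queues the work on the thread pool, and Task.WaitAll blocks until every handler and its continuation have finished:
var tasks = new List<Task>();
for (int i = 0; i < 10; i++)
{
    var i1 = i;
    tasks.Add(Task.Run(() => RequestHandle(i1.ToString()))
                  .ContinueWith(t => Console.WriteLine("Done")));
}
Task.WaitAll(tasks.ToArray());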

Controlling number of tasks being executed at an instance at runtime [duplicate]

I have the following code:
var factory = new TaskFactory();
for (int i = 0; i < 100; i++)
{
    var i1 = i;
    factory.StartNew(() => foo(i1));
}
static void foo(int i)
{
    Thread.Sleep(1000);
    Console.WriteLine($"foo{i} - on thread {Thread.CurrentThread.ManagedThreadId}");
}
I can see it only does 4 threads at a time (based on observation). My questions:
What determines the number of threads used at a time?
How can I retrieve this number?
How can I change this number?
P.S. My box has 4 cores.
P.P.S. I needed to have a specific number of tasks (and no more) that are concurrently processed by the TPL and ended up with the following code:
private static int count = 0; // keep track of how many concurrent tasks are running
private static void SemaphoreImplementation()
{
    var s = new Semaphore(20, 20); // allow 20 tasks at a time
    for (int i = 0; i < 1000; i++)
    {
        var i1 = i;
        Task.Factory.StartNew(() =>
        {
            try
            {
                s.WaitOne();
                Interlocked.Increment(ref count);
                foo(i1);
            }
            finally
            {
                s.Release();
                Interlocked.Decrement(ref count);
            }
        }, TaskCreationOptions.LongRunning);
    }
}
static void foo(int i)
{
    Thread.Sleep(100);
    Console.WriteLine($"foo{i:00} - on thread " +
        $"{Thread.CurrentThread.ManagedThreadId:00}. Executing concurrently: {count}");
}
When you use a Task in .NET, you are telling the TPL to schedule a piece of work (via a TaskScheduler) to be executed on the ThreadPool. Note that the work will be scheduled at its earliest opportunity, however the scheduler sees fit. This means the TaskScheduler decides how many threads will be used to run n tasks and which task is executed on which thread.
The TPL is very well tuned and continually adjusts its algorithm as it executes your tasks. So, in most cases, it tries to minimize contention. What this means is that if you are running 100 tasks and only have 4 cores (which you can get using Environment.ProcessorCount), it would not make sense to execute more than 4 threads at any given time; otherwise it would need to do more context switching. There are times when you want to explicitly override this behaviour, say when you need to wait for some sort of IO to finish, which is a whole different story.
In summary, trust the TPL. But if you are adamant to spawn a thread per task (not always a good idea!), you can use:
Task.Factory.StartNew(
    () => /* your piece of work */,
    TaskCreationOptions.LongRunning);
This tells the default TaskScheduler to explicitly spawn a new thread for that piece of work.
You can also use your own Scheduler and pass it in to the TaskFactory. You can find a whole bunch of Schedulers HERE.
Another alternative is PLINQ, which again by default analyses your query and decides whether parallelizing it would yield any benefit. In the case of blocking IO, where you are certain that starting multiple threads will result in better execution, you can force the parallelism by using WithExecutionMode(ParallelExecutionMode.ForceParallelism). You can then use WithDegreeOfParallelism to give hints on how many threads to use (see the sketch after the quote below), but remember there is no guarantee you would get that many threads, as MSDN says:
Sets the degree of parallelism to use in a query. Degree of
parallelism is the maximum number of concurrently executing tasks that
will be used to process the query.
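A rough sketch of that PLINQ approach applied to the foo from this question (the degree of parallelism here is illustrative, not a recommendation):
Enumerable.Range(0, 100)
    .AsParallel()
    .WithExecutionMode(ParallelExecutionMode.ForceParallelism) // don't let PLINQ fall back to sequential execution
    .WithDegreeOfParallelism(8)                                // at most 8 concurrently executing tasks
    .ForAll(i => foo(i));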
Finally, I highly recommend having a read of this great series of articles on Threading and the TPL.
If you increase the number of tasks to, say, 1,000,000, you will see a lot more threads spawned over time; the TPL tends to inject one every 500 ms.
The TPL thread pool does not understand IO-bound workloads (the sleep here behaves like IO). It's not a good idea to rely on the TPL to pick the right degree of parallelism in these cases; it injects more threads based on vague guesses about throughput, and also to avoid deadlocks.
Here, the TPL policy clearly is not useful, because the more threads you add the more throughput you get. Each thread can process one item per second in this contrived case, but the TPL has no idea about that. It makes no sense to limit the thread count to the number of cores.
What determines the number of threads used at a time?
Barely documented TPL heuristics. They frequently go wrong. In particular they will spawn an unlimited number of threads over time in this case. Use task manager to see for yourself. Let this run for an hour and you'll have 1000s of threads.
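A quick way to watch that thread injection from inside the process (a sketch using System.Diagnostics; note that Process.Threads counts every thread in the process, not just pool threads):
var monitor = new Thread(() =>
{
    while (true)
    {
        Console.WriteLine($"Threads in process: {Process.GetCurrentProcess().Threads.Count}");
        Thread.Sleep(1000);
    }
});
monitor.IsBackground = true; // don't keep the process alive just for the monitor
monitor.Start();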
How can I retrieve this number? How can I change this number?
You can retrieve some of these numbers but that's not the right way to go. If you need a guaranteed DOP you can use AsParallel().WithDegreeOfParallelism(...) or a custom task scheduler. You also can manually start LongRunning tasks. Do not mess with process global settings.
I would suggest using SemaphoreSlim because it does not rely on a Windows kernel object (so it can also be used in Linux C# microservices), and it has a property, SemaphoreSlim.CurrentCount, that tells you how many slots are still free, so you don't need the Interlocked.Increment or Interlocked.Decrement. Note that the copy of the loop variable (i1) is still needed: the lambda captures the variable i itself, not its value, so without the copy every task would see whatever value i holds when the task finally runs:
private static void SemaphoreImplementation()
{
    var maxTasksCount = 20; // allow 20 tasks at a time
    var semaphoreSlim = new SemaphoreSlim(maxTasksCount, maxTasksCount);
    var taskFactory = new TaskFactory();
    for (int i = 0; i < 1000; i++)
    {
        var i1 = i; // copy the loop variable so each task gets its own value
        taskFactory.StartNew(async () =>
        {
            await semaphoreSlim.WaitAsync();
            try
            {
                // CurrentCount is the number of free slots, so this is how many tasks currently hold one
                var count = maxTasksCount - semaphoreSlim.CurrentCount;
                await foo(i1, count);
            }
            finally
            {
                semaphoreSlim.Release();
            }
        }, TaskCreationOptions.LongRunning);
    }
}
static async Task foo(int i, int count)
{
    await Task.Delay(100);
    Console.WriteLine($"foo{i:00} - on thread " +
        $"{Thread.CurrentThread.ManagedThreadId:00}. Executing concurrently: {count}");
}

Threading Question - Best way to spawn multiple threads

I've created a method, let's call it MemoryStressTest(). I would like to call this from another method, let's call it InitiateMemoryStressTest(). I would like the InitiateMemoryStressTest() method to call multiple instances of MemoryStressTest() via different threads. The threads don't need to be aware of each other and will not be dependent on each other. I'm using .NET 4. What would be the best way to do this?
Be as simple as possible:
int threadCount = 100;
for (int i = 0; i < threadCount; i++)
{
    (new Thread(() => MemoryStressTest())).Start();
}
If you just want new threads - and don't want thread pool threads, tasks etc, then it's very straightforward:
for (int i = 0; i < numberOfThreads; i++)
{
    Thread t = new Thread(MemoryStressTest);
    t.Start();
    // Remember t if you need to wait for them all to finish etc
}
(One benefit of this approach over using the thread pool is that you don't get the "smart" behaviour of .NET in terms of ramping up threads in the thread pool slowly etc. All very well for normal situations, but this is slightly different :)
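If you do need that wait, a minimal sketch of the "remember t" part mentioned in the comment above is to keep the threads in a list and Join them afterwards:
var threads = new List<Thread>();
for (int i = 0; i < numberOfThreads; i++)
{
    Thread t = new Thread(MemoryStressTest);
    t.Start();
    threads.Add(t);
}
foreach (Thread t in threads)
{
    t.Join(); // blocks until that thread has finished
}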
How about using .NET 4.0 Parallel Framework tasks instead of threads, and letting the system decide how many actual threads to use?
This can be done with a parallel for, or other techniques; a rough sketch follows.
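A minimal sketch of that idea, assuming MemoryStressTest() is the method from the question and that 100 runs are wanted (Parallel.For does not return until every iteration has completed):
Parallel.For(0, 100, i => MemoryStressTest());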

Internally how does a thread work?

Thread t = new Thread(WriteY);
t.Start();
for (int i = 0; i < 1000; i++) Console.Write("x");
static void WriteY()
{
    for (int i = 0; i < 1000; i++) Console.Write("y");
}
Internally, how does a thread work? I mean: why is the output of the above code not fixed every time I run it, with the sequence of 'x' and 'y' coming out different each run?
All multitasking systems have a scheduler. A scheduler decides what unit of work will execute next. A basic scheduler can be something that runs off a high-resolution timer (say, a task switch happens every 100 ms). Clearly, modern implementations are much more complicated than that.
That said, most modern threading implementations rely on a scheduler within the kernel. Many of these schedulers are NOT deterministic. That is, there is no guarantee that a context switch (i.e. a switch between runnable instances managed by the scheduler) will happen at any specific time.
What you are seeing are the discrepancies in that scheduler for your system.

Simple Multithreading Question

Ok I should already know the answer but...
I want to execute a number of different tasks in parallel, each on a separate thread, and wait for the execution of all threads to finish before continuing. I am aware that I could have used ThreadPool.QueueUserWorkItem() or the BackgroundWorker but did not want to use either (for no particular reason).
So is the code below the correct way to execute tasks in parallel on a background thread and wait for them to finish processing?
Thread[] threads = new Thread[3];
for (int i = 0; i < threads.Length; i++)
{
    threads[i] = new Thread(SomeActionDelegate);
    threads[i].Start();
}
for (int i = 0; i < threads.Length; i++)
{
    threads[i].Join();
}
I know this question must have been asked 100 times before so thanks for your answer and patience.
Yes, this is a correct way to do this. But if SomeActionDelegate is relatively small, the overhead of creating 3 threads will be significant. That's why you probably should use the ThreadPool anyway, even though it has no clean equivalent to Join.
A BackgroundWorker is mainly useful when interacting with the GUI.
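If you do go the ThreadPool route, a sketch of one way to get a Join-like wait is a CountdownEvent (available since .NET 4 in System.Threading):
using (var countdown = new CountdownEvent(3))
{
    for (int i = 0; i < 3; i++)
    {
        ThreadPool.QueueUserWorkItem(_ =>
        {
            try { SomeActionDelegate(); }
            finally { countdown.Signal(); } // count down even if the work throws
        });
    }
    countdown.Wait(); // blocks until all three work items have signalled
}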
It's always hard to say what the "correct" way is, but your approach does work correctly.
However, by using Join, the UI thread (if the main thread is a UI thread) will be blocked, therefore freezing the UI. If that is not the case, your approach is basically OK, even if it has different problems (scalability/number of concurrent threads, etc.).
If you are (or will be) using .NET 4.0, try using the Task Parallel Library; a rough sketch follows.
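For example, a minimal sketch that runs three delegates in parallel and returns only when all of them have finished (SomeActionDelegate is the method from the question):
// Parallel.Invoke blocks until every action has completed,
// which replaces the manual Start/Join bookkeeping above.
Parallel.Invoke(
    () => SomeActionDelegate(),
    () => SomeActionDelegate(),
    () => SomeActionDelegate());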
