I'm trying to get my head around the async-await functionality in C#. I've written the code below to run several tasks asynchronously; currently all they do is raise an event after a certain amount of time.
public class Program
{
    public static Stopwatch Watch = new Stopwatch();

    public static void Main(string[] args)
    {
        AsyncClass asyncClass = new AsyncClass();
        asyncClass.WaitSecondsAsyncCompleted += asyncClass_WaitSecondsAsyncCompleted;

        List<Task> tasks = new List<Task>();
        Watch.Start();
        for (int i = 1; i < 6; i++)
        {
            tasks.Add(asyncClass.WaitSecondsAsync(i, Watch));
        }
        Task.WaitAll(tasks.ToArray());
        Console.ReadLine();
    }

    private static void asyncClass_WaitSecondsAsyncCompleted(int i)
    {
        Console.WriteLine("{1} : Async Method Called: waited for {0} seconds", i, Watch.ElapsedMilliseconds);
    }
}

public class AsyncClass
{
    public event Action<int> WaitSecondsAsyncCompleted;

    public async Task WaitSecondsAsync(int x, Stopwatch watch)
    {
        await Task.Run(() =>
        {
            Thread.Sleep(x * 500);
        });

        if (WaitSecondsAsyncCompleted != null)
        {
            WaitSecondsAsyncCompleted(x);
        }
    }
}
I'd expect a task to complete roughly once every half second, but that is not quite what I see. The first four tasks complete on time, while the final task has an extra half-second delay:
This seems very strange. The only explanation I can think of is that there is a limit on the number of threads available to tasks, that this limit is very small, and that the fifth task therefore has to wait for the first task to complete before it can start.
I added some extra output and increased the number of tasks to try to gain more information, but I can make little sense of it: the output appears to be deterministic, some threads are reused, but new ones are also created. The delay before tasks complete also keeps growing (for instance, I'd expect Task 10 to complete after 5 seconds, but instead it finishes after 8 seconds). I've attached the output below.
What I'd like to know:
Does anyone know what's going on in this particular example?
Is the limit on threads available small enough to have an effect here?
I presume asynchronous tasks are not guaranteed to start immediately, but there appears to be some other deterministic process going on here, which I hadn't expected. Does anyone know what that is?
Edit
Note that this question does not ask about the maximum number of tasks that can be run (Max tasks in TPL?) but rather why an effect can be seen when running as few as 5 tasks. I was under the impression that the default thread pool contained many more threads than this.
So, it turns out that the issue I was seeing had to do with the thread pool size. The pool's minimum thread count is apparently initially set to the number of cores on the machine (https://msdn.microsoft.com/en-us/library/system.threading.threadpool.getminthreads%28v=vs.110%29.aspx).
It can be increased, and doing so means that more of the tasks are run simultaneously from the start (https://msdn.microsoft.com/en-us/library/system.threading.threadpool.setminthreads%28v=vs.110%29.aspx).
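As an illustration, here is a minimal sketch (the thread counts are placeholder values, not recommendations) of checking and raising the thread pool's minimum worker-thread count before queuing work, so that the first few tasks can start without waiting for the pool's thread-injection heuristic:

    using System;
    using System.Threading;

    class ThreadPoolSetup
    {
        static void Main()
        {
            // Read the current minimums for worker and IO-completion threads.
            ThreadPool.GetMinThreads(out int workerMin, out int ioMin);
            Console.WriteLine("Default minimums: worker={0}, IO={1}", workerMin, ioMin);

            // Raise the worker-thread minimum so that, say, 10 queued work items
            // can start immediately (10 is an illustrative value).
            ThreadPool.SetMinThreads(Math.Max(workerMin, 10), ioMin);
        }
    }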
I have a task to write a piece of C# code that makes my CPU go to 100% consistently. I have tried infinite loops, big-integer calculations, concatenation of numerous very long strings... everything either goes to 100% for a few moments and then drops to around 60%, or doesn't reach 100% at all.
Is there a simple piece of code that can achieve this?
Also, I'm using Windows 10.
You would want to implement threading for this as was previously stated. Your thread would call a method that contains a while(true) loop. For example:
Random rand = new Random();
List<Thread> threads = new List<Thread>();

public void KillCore()
{
    long num = 0;
    while (true)
    {
        num += rand.Next(100, 1000);
        if (num > 1000000) { num = 0; }
    }
}

public void Main()
{
    // Create and start one busy thread per logical core; that is enough
    // to drive every core to 100%.
    for (int i = 0; i < Environment.ProcessorCount; i++)
    {
        Thread t = new Thread(new ThreadStart(KillCore));
        threads.Add(t);
        t.Start();
    }
}
You don't have to add the threads to a list, but you may want to if, somewhere down the road, you want to call Thread.Abort() to kill them off. To do that you would need a collection of some sort to iterate through, calling the Abort() method on every Thread instance in the collection. The method to do that would look as follows:
public void StopThreads()
{
    foreach (Thread t in threads) { t.Abort(); }
}
Use Parallel.For to maximize CPU core usage and get the most out of threading; inside each iteration create an infinite loop using while(true), and congratulations, you have **** your CPU :D
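A minimal sketch of that idea (purely illustrative; it will genuinely peg every core until the process is killed, so run it with care):

    using System;
    using System.Threading.Tasks;

    class BurnCpu
    {
        static void Main()
        {
            // One iteration per logical core; each spins forever, so every core
            // stays busy until the process is terminated.
            Parallel.For(0, Environment.ProcessorCount, _ =>
            {
                long num = 0;
                while (true)
                {
                    num++;                          // trivial work to keep the core hot
                    if (num > 1000000) { num = 0; }
                }
            });
        }
    }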
using System;
using System.Threading;
using System.Text;

class ThreadTest
{
    static StringBuilder sb = new StringBuilder();

    static void Main()
    {
        Thread t = new Thread(WriteY);
        t.Start();

        for (int i = 0; i < 1000; i++)
        {
            sb.Append("x");
        }
        Console.WriteLine(sb.ToString());
    }

    private static void WriteY()
    {
        for (int i = 0; i < 1000; i++)
        {
            sb.Append("y");
        }
    }
}
output:
{xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy}
Questions:
Why does 'x' appear before 'y'?
Does StringBuilder accept only one thread at a time?
Why doesn't the output appear like "xyxyxyxyx xyxyxyxy"?
Questions 1 and 3 are both related to the time slicing of the Windows scheduler. According to the Windows 2000 Performance Guide, a time slice on x86 processors is about 30 ms. That may have changed since Windows 2000, but it should still be in this order of magnitude. Hence, t.Start() only adds the new thread to the scheduler but does not immediately trigger a context switch to it. The main thread still has the remaining part of its time slice, which is obviously enough time to print the 'x' 1,000 times.
Furthermore, when the new thread is actually scheduled, it has a whole time slice to print out 'y'. As this is plenty of time, you don't get the pattern "xyxyxy", but rather 'x's until the time slice of the main thread runs out, then 'y's until the end of the time slice of the new thread, and then 'x's again. (At least if there are enough 'x's and 'y's to be printed, which, according to Simon's comment, is the case with 10,000 'x's and 'y's.)
Question 2 is answered by the MSDN page on the StringBuilder. Under the topic "Thread Safety" it is written that "Any public static ( Shared in Visual Basic) members of this type are thread safe. Any instance members are not guaranteed to be thread safe." As the Append method is an instance method, you cannot call this reliably from different threads in parallel without further synchronization.
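To make that concrete, here is a minimal sketch (my own illustration, not taken from the MSDN page) of the kind of synchronization that makes concurrent Append calls safe, using a dedicated lock object:

    using System;
    using System.Text;
    using System.Threading;

    class SynchronizedAppend
    {
        static StringBuilder sb = new StringBuilder();
        static readonly object gate = new object();   // guards every access to sb

        static void Append(string s)
        {
            lock (gate)   // only one thread may touch the builder at a time
            {
                sb.Append(s);
            }
        }

        static void Main()
        {
            Thread t = new Thread(() => { for (int i = 0; i < 1000; i++) Append("y"); });
            t.Start();
            for (int i = 0; i < 1000; i++) Append("x");
            t.Join();                       // wait for the worker before reading the result
            Console.WriteLine(sb.Length);   // always 2000
        }
    }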
Is this what you are looking for?
class Program
{
    static StringBuilder sb = new StringBuilder();

    static void Main()
    {
        Thread t = new Thread(WriteY);
        t.Start();

        for (int i = 0; i < 1000; i++)
        {
            //Console.Write("x");
            sb.Append("x");
            Thread.Sleep(10);
        }
        //t.Join();
        Console.WriteLine(sb.ToString());
    }

    private static void WriteY()
    {
        for (int i = 0; i < 1000; i++)
        {
            //Console.Write("y");
            sb.Append("y");
            Thread.Sleep(10);
        }
    }
}
Why does 'x' appear before 'y'?
Because the main thread is never blocked, so it continues executing before any resources are granted to the other thread, the one printing 'y'.
Does StringBuilder accept only one thread at a time?
No, that is not the case; see the example above.
Why doesn't the output appear like "xyxyxyxyx xyxyxyxy"?
There is not much work per iteration, so to see that interleaved result you need to increase how long each thread works, which is demonstrated here by using Sleep.
Update: in your example you can see the interleaving if you increase your loop count to 100000 or greater, and you also need to add t.Join(), otherwise the worker thread may not get a chance to finish its work before the output is printed.
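For reference, a minimal sketch of that variant (loop counts raised and t.Join() added; because the appends are still unsynchronized, the exact contents are not guaranteed, which is precisely what the interleaving demonstrates):

    static StringBuilder sb = new StringBuilder();

    static void Main()
    {
        Thread t = new Thread(WriteY);
        t.Start();

        for (int i = 0; i < 100000; i++)
        {
            sb.Append("x");              // enough work that the threads' time slices overlap
        }

        t.Join();                        // wait for the worker before printing
        Console.WriteLine(sb.ToString());
    }

    static void WriteY()
    {
        for (int i = 0; i < 100000; i++)
        {
            sb.Append("y");
        }
    }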
Can anyone explain the flow of this code to me?
I wonder how the main thread spawns worker threads. What I know is:
As soon as the main thread calls the .Start() method, a new thread is created.
But I'm confused about how the behavior changes when multiple threads are started in a loop in Main.
static private int counter = 0;
static private object theLock = new Object();

static void Main()
{
    Thread[] tr = new Thread[10];
    for (int i = 0; i < 10; i++)
    {
        tr[i] = new Thread(new ThreadStart(count));
        tr[i].Start();
    }
}

static private void count()
{
    for (int i = 0; i < 10; ++i)
    {
        lock (theLock)
        {
            Console.WriteLine("Count {0} Thread{1}",
                counter++, Thread.CurrentThread.GetHashCode());
        }
    }
}
Is there a good way to debug and track a multithreaded program? After googling, I found the Threads window in debug mode, but I couldn't make much use of it even after giving the threads custom names.
I just can't understand the flow: how the threads are launched, how they all work together, etc., as breakpoints seem to have no effect in a multithreaded application. (At least in my case.)
I want this output:
1 printed by Thread : 4551 [ThreadID]
2 printed by Thread : 4552
3 printed by Thread : 4553
4 printed by Thread : 4554
5 printed by Thread : 4555
6 printed by Thread : 4556
7 printed by Thread : 4557
8 printed by Thread : 4558
9 printed by Thread : 4559
10 printed by Thread : 4560
11 printed by Thread : 4551 [same thread ID appears again as in 1]
12 printed by Thread : 4552
I'll try to describe what your code is doing as it interacts with the threading subsystem. The details I'm giving are from what I remember from my OS design university classes, so the actual implementation in the host operating system and/or the CLR internals may vary a bit from what I describe.
static void Main()
{
    Thread[] tr = new Thread[10];
    for (int i = 0; i < 10; i++)
    {
        tr[i] = new Thread(new ThreadStart(count));

        // The following line puts the thread in a "runnable" thread list that is
        // managed by the OS scheduler. The scheduler will allow threads to run by
        // considering many factors, such as how many processes are running on
        // the system, how much time a runnable thread has been waiting, the process
        // priority, the thread's priority, etc. This means you have little control
        // over the order of execution. The only certain fact is that your thread will
        // run, at some point in the near future.
        tr[i].Start();
    }

    // At this point you are exiting your main function, so the program should
    // end; however, since you didn't flag your threads as background threads,
    // the program will keep running until every thread finishes.
}

static private void count()
{
    // The following loop is very short, and it is probable that the thread
    // might finish before the scheduler allows another thread to run.
    // Like user2864740 suggested, increasing the number of iterations will
    // increase the chance that you experience interleaved execution between
    // multiple threads.
    for (int i = 0; i < 10; ++i)
    {
        // Acquire a mutually-exclusive lock on theLock. Assuming that
        // theLock has been declared static, only a single thread will be
        // allowed to execute the code guarded by the lock.
        // Any running thread that tries to acquire the lock while it is
        // being held by a different thread will BLOCK. In this case, the
        // blocking operation will do the following:
        //   1. Register the thread that is about to be blocked in the
        //      lock's wait list (this is managed by a specialized class
        //      known as the Monitor).
        //   2. Remove the thread that is about to be blocked from the scheduler's
        //      runnable list. This way the scheduler won't try to yield
        //      the CPU to a thread that is waiting for a lock to be
        //      released. This saves CPU cycles.
        //   3. Yield execution (allow other threads to run).
        lock (theLock)
        {
            // Only a single thread can run the following code.
            Console.WriteLine("Count {0} Thread{1}",
                counter++, Thread.CurrentThread.GetHashCode());
        }
        // At this point the lock is released. The Monitor class will inspect
        // the released lock's wait list. If any threads were waiting for the
        // lock, one of them will be selected and returned to the scheduler's
        // runnable list, where eventually it will be given the chance to run
        // and contend for the lock. Again, many factors may be evaluated
        // when selecting which blocked thread to return to the runnable
        // list, so we can't make any guarantees on the order the threads
        // are unblocked.
    }
}
Hopefully things are clearer now. The important thing here is to acknowledge that you have little control over how individual threads are scheduled for execution, making it impossible (without a fair amount of synchronization code) to replicate the output you are expecting. At most, you can change a thread's priority to hint to the scheduler that a certain thread should be favored over the others. However, this needs to be done very carefully, as it may lead to a nasty problem known as priority inversion. Unless you know exactly what you are doing, it is usually better not to change a thread's priority.
After continued attempts, I managed to meet the requirements of my task. Here is the code:
using System;
using System.Threading;

public class EntryPoint
{
    static private int counter = 0;
    static private object theLock = new Object();
    static object obj = new object();

    static private void count()
    {
        for (int i = 0; i < 10; i++)
        {
            lock (theLock)
            {
                Console.WriteLine("Count {0} Thread{1}",
                    counter++, Thread.CurrentThread.GetHashCode());
                if (counter >= 10)
                    Monitor.Pulse(theLock);
                Monitor.Wait(theLock);
            }
        }
    }

    static void Main()
    {
        Thread[] tr = new Thread[10];
        for (int i = 0; i < 10; i++)
        {
            tr[i] = new Thread(new ThreadStart(count));
            tr[i].Start();
        }
    }
}
Monitor maintains a ready queue in sequential order, hence I achieved what I wanted.
Cheers!
I want to execute some code every second. The code I am using now is:
Task.Run((Action)ExecuteSomething);
And ExecuteSomething() is defined as below:
private void ExecuteSomething()
{
    Task.Delay(1000).ContinueWith(
        t =>
        {
            //Do something.
            ExecuteSomething();
        });
}
Does this method block a thread? Or should I use the Timer class in C#? It also seems that Timer dedicates a separate thread for execution (?)
Task.Delay uses Timer internally
With Task.Delay you can make your code a little bit clearer than with Timer. And using async-await will not block the current thread (usually the UI thread).
public async Task ExecuteEverySecond(Action execute)
{
    while (true)
    {
        execute();
        await Task.Delay(1000);
    }
}
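A possible usage sketch (the CancellationToken overload below is my own addition, not part of the answer above), showing how the loop could be started and later stopped without blocking a thread:

    // Variant with cooperative cancellation so the loop can be stopped.
    public static async Task ExecuteEverySecond(Action execute, CancellationToken token)
    {
        while (!token.IsCancellationRequested)
        {
            execute();
            await Task.Delay(1000, token);   // completes as canceled once the token is signalled
        }
    }

    public static async Task Demo()
    {
        var cts = new CancellationTokenSource();
        Task loop = ExecuteEverySecond(() => Console.WriteLine(DateTime.Now), cts.Token);

        await Task.Delay(5000);              // let it tick a few times
        cts.Cancel();                        // ask the loop to stop

        try { await loop; }
        catch (OperationCanceledException) { /* expected when cancelled mid-delay */ }
    }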
From source code: Task.Delay
// on line 5893
// ... and create our timer and make sure that it stays rooted.
if (millisecondsDelay != Timeout.Infinite) // no need to create the timer if it's an infinite timeout
{
promise.Timer = new Timer(state => ((DelayPromise)state).Complete(), promise, millisecondsDelay, Timeout.Infinite);
promise.Timer.KeepRootedWhileScheduled();
}
// ...
Microsoft's Reactive Framework is ideal for this. Just NuGet "System.Reactive" to get the bits. Then you can do this:
IDisposable subscription =
Observable
.Interval(TimeSpan.FromSeconds(1.0))
.Subscribe(x => execute());
When you want to stop the subscription just call subscription.Dispose(). On top of this the Reactive Framework can offer far more power than Tasks or basic Timers.
Can you use Task.Delay as a timer? It depends on whether you need to be waiting an exact interval, or whether you just need to wait a specified interval AFTER other work is done.
In other words, I have used a Timer many times in the past as part of a service that fetches or pushes data for synchronization purposes. The problem is that if you fetch data every 60 seconds, but there is a slowdown in the database, your retrieval may take over 60 seconds. Now you have a problem, as you are going to be fetching the same data twice. So you update the process to stop the timer until the fetch finishes, and then start the timer again. This works fine, but using Task.Delay does this work for you.
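As an illustration of that stop/restart pattern (a sketch using System.Timers.Timer; FetchData is a placeholder for the real work):

    using System;
    using System.Timers;

    class SyncService
    {
        // AutoReset = false: the timer fires once and must be re-armed manually,
        // so a slow fetch can never overlap with the next one.
        private readonly Timer _timer = new Timer(60000) { AutoReset = false };

        public SyncService()
        {
            _timer.Elapsed += (s, e) =>
            {
                try
                {
                    FetchData();          // placeholder for the real work
                }
                finally
                {
                    _timer.Start();       // only re-arm once the work is done
                }
            };
        }

        public void Start() => _timer.Start();

        private void FetchData() { /* ... */ }
    }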
So if you are just needing to Delay for a bit of time between executions, then Task.Delay works perfectly.
static class Helper
{
    public async static Task ExecuteInterval(Action execute, int millisecond, IWorker worker)
    {
        while (worker.Worked)
        {
            execute();
            await Task.Delay(millisecond);
        }
    }
}

interface IWorker
{
    bool Worked { get; }
}
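A possible usage sketch (SimpleWorker is my own illustrative implementation of IWorker, not part of the answer):

    class SimpleWorker : IWorker
    {
        public bool Worked { get; set; } = true;   // set to false to stop the loop
    }

    class Demo
    {
        static async Task Run()
        {
            var worker = new SimpleWorker();

            Task loop = Helper.ExecuteInterval(
                () => Console.WriteLine(DateTime.Now),
                1000,
                worker);

            await Task.Delay(5000);    // let it run for a few iterations
            worker.Worked = false;     // the loop exits after the current delay
            await loop;
        }
    }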
I've used the code below to implement and test a blocking queue. I test the queue by starting 5 concurrent threads (the removers) to pull items off the queue, blocking if the queue is empty, and 1 concurrent thread (the adder) to add items to the queue intermittently. However, if I leave it running for long enough I get an exception because one of the remover threads comes out of its waiting state even when the queue is empty.
Does anyone know why I get the exception? Note, I'm interested in knowing why this doesn't work as opposed to a working solution (as I can just Google that).
I'd greatly appreciate your help.
using System;
using System.Threading;
using System.Collections.Generic;

namespace Code
{
    class Queue<T>
    {
        private List<T> q = new List<T>();

        public void Add(T item)
        {
            lock (q)
            {
                q.Add(item);
                if (q.Count == 1)
                {
                    Monitor.Pulse(q);
                }
            }
        }

        public T Remove()
        {
            lock (q)
            {
                if (q.Count == 0)
                {
                    Monitor.Wait(q);
                }
                T item = q[q.Count - 1];
                q.RemoveAt(q.Count - 1);
                return item;
            }
        }
    }

    class Program
    {
        static Random r = new Random();
        static Queue<int> q = new Queue<int>();
        static int count = 1;

        static void Adder()
        {
            while (true)
            {
                Thread.Sleep(1000 * ((r.Next() % 5) + 1));
                Console.WriteLine("Will try to add");
                q.Add(count++);
            }
        }

        static void Remover()
        {
            while (true)
            {
                Thread.Sleep(1000 * ((r.Next() % 5) + 1));
                Console.WriteLine("Will try to remove");
                int item = q.Remove();
                Console.WriteLine("Removed " + item);
            }
        }

        static void Main(string[] args)
        {
            Console.WriteLine("Test");
            for (int i = 0; i < 5; i++)
            {
                Thread remover = new Thread(Remover);
                remover.Start();
            }
            Thread adder = new Thread(Adder);
            adder.Start();
        }
    }
}
if I leave it running for long enough I get an exception because one of the remover threads comes out of a waiting state even when the queue is empty. Does anyone know why I get the exception?
The question is odd, because obviously you know the answer: your first sentence answers the question asked by the second sentence. You get the exception because a remover thread comes out of the wait state when the queue is empty.
To solve the problem you'll want to use a loop instead of an "if". The correct code is:
while(q.Count == 0) Monitor.Wait(q);
not
if(q.Count == 0) Monitor.Wait(q);
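Applied to the Remove method from the question, the change looks like this (everything else stays the same):

    public T Remove()
    {
        lock (q)
        {
            // Re-check the condition every time we wake up: another consumer may
            // have emptied the queue between the Pulse and our re-acquiring the lock.
            while (q.Count == 0)
            {
                Monitor.Wait(q);
            }
            T item = q[q.Count - 1];
            q.RemoveAt(q.Count - 1);
            return item;
        }
    }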
UPDATE:
A commenter points out that perhaps your question was intended to be "under what circumstances can a consumer thread obtain the monitor when the queue is empty?"
Well, you are in a better position to answer that than we are, since you're the one running the program and looking at the output. But just off the top of my head, here's a way that could happen:
Consumer Thread 1: waiting
Consumer Thread 2: ready
Producer Thread 3: owns the monitor
There is one element in the queue.
Thread 3 pulses.
Thread 1 goes to ready state.
Thread 3 abandons the monitor.
Thread 2 enters the monitor.
Thread 2 consumes the item in the queue
Thread 2 abandons the monitor.
Thread 1 enters the monitor.
And now thread 1 is in the monitor with an empty queue.
Generally speaking when reasoning about these sorts of problems you should think of "Pulse" as being like a pigeon with a note attached to it. Once released it has no connection to the sender, and if it cannot find its home, it dies in the wilderness with its message undelivered. All you know when you Pulse is that if there is any thread waiting then one thread will move to the ready state at some time in the future; you don't know anything else about the relative timing of operations on threads.
Your code would work if there were only 1 consumer, but when there are more, this mechanism fails and it should be while (q.Count == 0) Monitor.Wait(q);
The following scenario shows how if (q.Count == 0) Monitor.Wait(q); can fail (it's different from Eric's):
consumer 1 is waiting
producer has put in an item and is pulsing
consumer 1 is ready
producer is releasing lock
consumer 2 just entered Remove, is lucky and acquires lock
consumer 2 sees 1 item, does not wait and takes item out
consumer 2 releases lock
consumer 1 re-acquires lock but queue is empty
This happens exactly as documentation says it can happen:
When the thread that invoked Pulse releases the lock, the next thread in the ready queue (which is not necessarily the thread that was pulsed) acquires the lock.
Eric is of course right; while the code appears to cover all the bases, the fact that an exception occurs shows that it doesn't.
The race condition is that between the Monitor.Wait in a remover and the Monitor.Pulse in the adder (which releases the lock, but doesn't necessarily cause a waiting thread to immediately wake up and reacquire it), a subsequent remover thread can acquire the lock and jump straight past the
if (q.Count == 0)
{
    Monitor.Wait(q);
}
statement and go straight to removing the item. Then the pulsed thread wakes up and assumes there's still an item there; but there isn't.
The way to fix it, however the race condition actually manifests, is as Eric has said.
Equally, if you read the example on Monitor.Pulse you'll see a similar setup to what you have done here, but a subtly different way of doing it.
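For comparison, here is a sketch in the spirit of that documentation example (my own condensed version, not the exact MSDN listing): the consumer waits in a loop, and the producer pulses on every add rather than only when the queue becomes non-empty:

    using System.Threading;

    class BlockingQueue<T>
    {
        private readonly System.Collections.Generic.Queue<T> q = new System.Collections.Generic.Queue<T>();
        private readonly object gate = new object();

        public void Add(T item)
        {
            lock (gate)
            {
                q.Enqueue(item);
                Monitor.Pulse(gate);      // wake one waiter for every item added
            }
        }

        public T Remove()
        {
            lock (gate)
            {
                while (q.Count == 0)      // loop: re-check after every wake-up
                {
                    Monitor.Wait(gate);
                }
                return q.Dequeue();
            }
        }
    }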