How to manage execution order of Monitor.Enter - c#

I use the ThreadPool to perform simultaneous operations. Each operation completes successfully. I also lock those operations via the Monitor.Enter method, because without the lock I get a thread collision problem. The problem is that after running my application I see that the operations are performed in the wrong order. Here is my code:
using System.Threading;
private static readonly Object obj = new Object();

public void Test()
{
    List<int> list1 = new List<int>();
    for (int i = 0; i < 10; i++) list1.Add(i);

    int toProcess = list1.Count;
    using (ManualResetEvent resetEvent = new ManualResetEvent(false))
    {
        for (int i = 0; i < list1.Count; i++)
        {
            var idx = i;
            ThreadPool.QueueUserWorkItem(
                new WaitCallback(delegate(object state)
                {
                    WriteToConsole(list1[idx]);
                    if (Interlocked.Decrement(ref toProcess) == 0)
                        resetEvent.Set();
                }), null);
        }
        resetEvent.WaitOne();
    }
}

private void WriteToConsole(int p)
{
    bool lockWasTaken = false;
    var temp = obj;
    Monitor.Enter(temp, ref lockWasTaken);
    Console.Write(p.ToString());
    Monitor.Exit(temp);
}
Output:
0
1
2
3
7
5
9
6
4
8
What should I do to fix that wrong order?
Thanks

What should I do to fix that wrong order?
Nothing. It's supposed to be like that. When you process items in parallel the order of execution is generally not guaranteed. So your options are:
Do work sequentially
Sort results after parallel execution
Don't bother with order at all

If you force the work to run in order, you are inherently processing it sequentially, and when it has to be done sequentially you don't need threads at all.
That said, you don't have to deal with the ThreadPool directly anymore. Use Task.Run or Task.Factory.StartNew; then you can use Task.WaitAll or Task.WhenAll to wait for the tasks to complete.
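For instance, a minimal sketch of the same loop rewritten with tasks might look like this (the async method name and overall shape are just illustrative, not the only way to do it):
// Sketch only. Requires: using System.Linq; using System.Threading.Tasks;
public async Task TestAsync()
{
    List<int> list1 = new List<int>();
    for (int i = 0; i < 10; i++) list1.Add(i);

    // Start one task per item; completion order is still not guaranteed.
    Task[] tasks = list1.Select(item => Task.Run(() => Console.Write(item))).ToArray();

    // Wait for all of them to finish.
    await Task.WhenAll(tasks);
}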
Also, the lock in your WriteToConsole method is superfluous. Console.Write doesn't need to be locked, as it is already thread-safe. The method could simply be written as
private void WriteToConsole(int p)
{
    Console.Write(p);
}

Related

The most efficient way to go through a list and check all of them in a post request

I'm trying to come up with the best solution for going through a list of strings and performing a POST request with each one of them.
My previous attempt was to make a Queue<string> of the strings, spawn 200 or more threads, and have each thread Dequeue a string from the queue and perform the task. It performed worse than I expected.
What am I doing wrong here?
My code:
class Checker
{
    public Queue<string> pins;

    public Checker()
    {
        pins = GetPins();
        StartThreads(1000);
    }

    public void StartThreads(int threadsCount)
    {
        Console.WriteLine("Starting Threads");
        for (int n = 0; n < 200; n++)
        {
            var thread = new Thread(Printer);
            thread.Name = String.Format("Thread Number ({0})", n);
            thread.Start();
        }
    }

    public Queue<string> GetPins()
    {
        Queue<string> numbers = new Queue<string>();
        for (int n = 0; n < 100000; n++)
        {
            numbers.Enqueue(n.ToString().PadLeft(5, '0'));
        }
        Console.WriteLine("Got Pins");
        return numbers;
    }

    void Printer()
    {
        while (pins.Count > 0)
        {
            var num = pins.Dequeue();
            Console.WriteLine();
            Console.WriteLine(String.Format("{0} - {1}", num, Thread.CurrentThread.Name));
        }
    }
}
As you can see, I generate 100,000 five-digit pins and perform a task with each one (output it to the console). Assuming I have 1,000 threads, it should be incredibly fast, but it isn't.
Please tell me what I'm doing wrong and what I can improve. Thank you!
Queue<T> is not a thread-safe collection; see ConcurrentQueue<T>.
Your StartThreads() method does not use the threadsCount argument, so it only ever starts 200 threads.
You also need to be careful with raw threads. Consider using Tasks or the ThreadPool instead. If I recall correctly, this lets your app decide how many threads it needs depending on the task count.
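To illustrate, here is a minimal sketch (method and parameter names are just for illustration) that drains a ConcurrentQueue<string> from a bounded number of tasks instead of 200 raw threads:
// Sketch only. Requires: using System.Collections.Concurrent;
// using System.Linq; using System.Threading.Tasks;
public static void ProcessPins(ConcurrentQueue<string> pins, int workerCount)
{
    var workers = Enumerable.Range(0, workerCount).Select(_ => Task.Run(() =>
    {
        // TryDequeue is safe to call from many tasks at once.
        while (pins.TryDequeue(out string pin))
        {
            Console.WriteLine("{0} - {1}", pin, Task.CurrentId);
        }
    })).ToArray();

    Task.WaitAll(workers);
}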

I am trying to call a method in a loop. It should be called only 20 times in 10 seconds. I am using a semaphore as in the code below

With the code below, firstly, some of the calls are never made: out of, say, 250, only 238 calls are made and the rest are not. Secondly, I am not sure whether the calls are actually made at a rate of 20 calls per 10 seconds.
public List<ShowData> GetAllShowAndTheirCast()
{
    ShowResponse allShows = GetAllShows();
    ShowCasts showCast = new ShowCasts();
    showCast.showCastList = new List<ShowData>();
    using (Semaphore pool = new Semaphore(20, 20))
    {
        for (int i = 0; i < allShows.Shows.Length; i++)
        {
            pool.WaitOne();
            Thread t = new Thread(new ParameterizedThreadStart((taskId) =>
            {
                showCast.showCastList.Add(MapResponse(allShows.Shows[i]));
            }));
            pool.Release();
            t.Start(i);
        }
    }
    //for (int i = 0; i < allShows.Shows.Length; i++)
    //{
    //    showCast.showCastList.Add(MapResponse(allShows.Shows[i]));
    //}
    return showCast.showCastList;
}

public ShowData MapResponse(Show s)
{
    CastResponse castres = new CastResponse();
    castres.CastlistResponse = (GetShowCast(s.id)).CastlistResponse;
    ShowData sd = new ShowData();
    sd.id = s.id;
    sd.name = s.name;
    if (castres.CastlistResponse != null && castres.CastlistResponse.Any())
    {
        sd.cast = new List<CastData>();
        foreach (var item in castres.CastlistResponse)
        {
            CastData cd = new CastData();
            cd.birthday = item.person.birthday;
            cd.id = item.person.id;
            cd.name = item.person.name;
            sd.cast.Add(cd);
        }
    }
    return sd;
}

public ShowResponse GetAllShows()
{
    ShowResponse response = new ShowResponse();
    string showUrl = ClientAPIUtils.apiUrl + "shows";
    response.Shows = JsonConvert.DeserializeObject<Show[]>(ClientAPIUtils.GetDataFromUrl(showUrl));
    return response;
}

public CastResponse GetShowCast(int showid)
{
    CastResponse res = new CastResponse();
    string castUrl = ClientAPIUtils.apiUrl + "shows/" + showid + "/cast";
    res.CastlistResponse = JsonConvert.DeserializeObject<List<Cast>>(ClientAPIUtils.GetDataFromUrl(castUrl));
    return res;
}
All the calls should be made, but I am not sure where they are getting aborted; please also let me know how to check the rate at which the calls are being made.
I'm assuming that your goal is to process all data about shows but no more than 20 at once.
For that kind of task you should probably use the ThreadPool and limit the maximum number of concurrent threads using SetMaxThreads.
https://learn.microsoft.com/en-us/dotnet/api/system.threading.threadpool?view=netframework-4.7.2
You also have to make sure that the collection you use to store your results is thread-safe.
showCast.showCastList = new List<ShowData>();
The standard List<T> is not thread-safe. A thread-safe collection is ConcurrentBag<T> (there are others as well). You can make a standard list thread-safe, but it requires more code (a minimal sketch of that follows below the link). After you are done processing, you can convert the collection to a list or an array as needed.
https://learn.microsoft.com/en-us/dotnet/api/system.collections.concurrent.concurrentbag-1?view=netframework-4.7.2
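For example, a rough sketch of the "more code" option, wrapping every write to a plain list in a lock (the field and method names are purely illustrative):
// Sketch only: protecting a plain List<T> with a lock.
private static readonly object listLock = new object();
private static readonly List<ShowData> results = new List<ShowData>();

private static void AddResult(ShowData item)
{
    lock (listLock)
    {
        results.Add(item);
    }
}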
Now to the usage of the semaphore. What your semaphore is doing is ensuring that at most 20 threads can be created at once. Assuming that this loop runs on your app's main thread, the semaphore as written has no effect. To make it work you need to release the semaphore once a thread has completed its work; but you call Release() before Start(), which means the thread's actual work executes outside the "critical area".
using (Semaphore pool = new Semaphore(20, 20))
{
    for (int i = 0; i < allShows.Shows.Length; i++)
    {
        pool.WaitOne();
        Thread t = new Thread(new ParameterizedThreadStart((taskId) =>
        {
            showCast.showCastList.Add(MapResponse(allShows.Shows[i]));
            pool.Release();
        }));
        t.Start(i);
    }
}
I did not test this solution; additional problems might arise.
Another issue with this program is that it does not wait for all threads to complete. Once all the threads have been started, the program ends. It is possible (and in your case certain) that not all threads have completed their work by then; this is why only ~240 of the calls are done when the program finishes.
thread.Join();
But if it is called right after Start(), it blocks the main thread until that thread has completed, so to keep the program concurrent you need to collect the started threads in a list and Join() them all at the end of the program (a rough sketch follows below). It is not the best solution; see also:
How to wait on all threads that program can add to ThreadPool
Wait until all threads finished their work in ThreadPool
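A rough sketch of that collect-and-join idea, assuming CreateAndStartThread is changed so that it returns the Thread it starts:
// Sketch only: keep the started threads so they can be joined later.
// Assumes CreateAndStartThread is modified to return the Thread it starts.
var threads = new List<Thread>();
for (int i = 0; i < allShows.Shows.Length; i++)
{
    threads.Add(CreateAndStartThread(allShows.Shows[i], pool, showCast));
}

// Block until every worker thread has finished its work.
foreach (Thread t in threads)
{
    t.Join();
}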
As a final note, you cannot access the loop counter like that. The lambda captures the variable i itself, not its value at the time the thread was created; in the test I ran, the code tended to process odd records twice and skip even ones, and because the loop increments the counter before the previous thread has executed, it can end up accessing elements outside the bounds of the array.
A possible solution is to create a method that creates and starts the thread. Passing allShows.Shows[i] as a parameter to that method evaluates it to a concrete Show before the next loop pass.
public void CreateAndStartThread(Show show, Semaphore pool, ShowCasts showCast)
{
    pool.WaitOne();
    Thread t = new Thread(new ParameterizedThreadStart((s) =>
    {
        showCast.showCastList.Add(MapResponse((Show)s));
        pool.Release();
    }));
    t.Start(show);
}
Concurrent programming is tricky and I would highly recommend doing some exercises covering the common pitfalls. Books on C# programming are sure to have a chapter or two on the topic, and there are plenty of online courses and tutorials to learn from.
Edit:
Working solution. Still might have some issues.
public ShowCasts GetAllShowAndTheirCast()
{
    ShowResponse allShows = GetAllShows();
    ConcurrentBag<ShowData> result = new ConcurrentBag<ShowData>();
    using (var countdownEvent = new CountdownEvent(allShows.Shows.Length))
    {
        using (Semaphore pool = new Semaphore(20, 20))
        {
            for (int i = 0; i < allShows.Shows.Length; i++)
            {
                CreateAndStartThread(allShows.Shows[i], pool, result, countdownEvent);
            }
            countdownEvent.Wait();
        }
    }
    return new ShowCasts() { showCastList = result.ToList() };
}

public void CreateAndStartThread(Show show, Semaphore pool, ConcurrentBag<ShowData> result, CountdownEvent countdownEvent)
{
    pool.WaitOne();
    Thread t = new Thread(new ParameterizedThreadStart((s) =>
    {
        result.Add(MapResponse((Show)s));
        pool.Release();
        countdownEvent.Signal();
    }));
    t.Start(show);
}

How to ensure that two threads cannot access the same folder

I am writing a multithreaded application; it is a Windows service. I have 20 folders. I create 15 threads in the OnStart method. I want to achieve the following: the 15 threads go to folders 1, 2, 3, ..., 15. When one thread finishes, it creates another thread, and that new thread must go to the 16th folder; it must not go to a folder that is already being worked on. How can I do this? That is, how can I be sure that two threads do not go to the same folder?
Could you not just have a static variable that would be a counter for the folder name?
Something like:
private static int _folderNameCounter = 0;
private static readonly object _padlock = new object();

public static int GetFolderCounter()
{
    lock (_padlock)
    {
        _folderNameCounter++;
        return _folderNameCounter;
    }
}

public static void Main()
{
    for (int i = 0; i < 20; i++)
    {
        Task.Factory.StartNew(() =>
        {
            var path = @"c:\temp\" + GetFolderCounter();
            Directory.CreateDirectory(path);
            // add your own code for the thread here
        });
    }
}
Note: I've used the TPL instead of using threads directly, since I think the TPL is a better solution. You may of course have specific requirements that make raw threads the better choice in your case.
Use a BlockingCollection<T> and fill the collection with the folder numbers. Each task handles an item of the collection, and the collection itself handles the multi-threading aspect so that each item is only handled by one consumer.
// Define the blocking collection with a maximum size of 15.
const int maxSize = 15;
var data = new BlockingCollection<int>(maxSize);

// Add the data to the collection.
// Do this in a separate task since BlockingCollection<T>.Add()
// blocks when the specified capacity is reached.
var addingTask = new Task(() =>
{
    for (int i = 1; i <= 20; i++)
    {
        data.Add(i);
    }
});
addingTask.Start();

// Define a signal-to-stop bool.
var stop = false;

// Create 15 handle tasks.
// You can change this to threads if necessary, but the general idea is that
// each consumer continues to consume until the stop-boolean is set.
for (int t = 0; t < maxSize; t++)
{
    new Task(() =>
    {
        while (!stop)
        {
            // Note: the Take method blocks until an item is/becomes available.
            int item = data.Take();
            HandleThisItem(item);
        }
    }).Start();
}

// Wait until you need to stop. When you do, set stop to true.
stop = true;
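As a side note (not part of the original answer), BlockingCollection<T> also supports a cleaner shutdown: if the producer calls data.CompleteAdding() once it has enqueued everything, each consumer can loop over data.GetConsumingEnumerable() and simply exit when the collection is drained, without a shared stop flag. A minimal sketch of that consumer loop:
// Sketch only: consumer loop that ends on its own.
// GetConsumingEnumerable() finishes once CompleteAdding() has been called
// and all remaining items have been taken.
foreach (int item in data.GetConsumingEnumerable())
{
    HandleThisItem(item);
}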

lock vs boolean

Here is the scenario:
A method is called each minute by a timer. The same method can also be called through the UI (a button). If the method is already in progress when it is called again, I do not want it to execute twice.
In my method I use a simple boolean :
private bool _isProcessing;

public void JustDoIt(Action a, int interval, int times)
{
    if (!_isProcessing)
    {
        _isProcessing = true;
        for (int i = 0; i < times; i++)
        {
            a();
            Thread.Sleep(interval);
        }
    }
    _isProcessing = false;
}
It works fine. I test this functionality with this test :
[Test]
public void Should_Output_A_String_Only_3_Times()
{
    var consoleMock = new Mock<IConsole>();
    IConsole console = consoleMock.Object;
    var doer = new Doer { Console = console };
    Action a = new Action(() => console.Writeline("TASK DONE !"));

    // Simulate a call by Timer
    var taskA = Task.Factory.StartNew(() => doer.JustDoIt(a, 1000, 3));
    // Simulate a call by UI
    var taskB = Task.Factory.StartNew(() => doer.JustDoIt(a));

    taskA.Wait();

    consoleMock.Verify(c => c.Writeline("TASK DONE !"), Times.Exactly(3));
}
A developer reviewed my code and said: "I replaced your boolean with the lock keyword. It's more thread-safe." Frankly, I haven't mastered multithreading, so I answered "OK!"
A few days later (today, to be precise), I wanted to test the difference between using lock and a simple boolean, and I was surprised to find that when I replace the boolean with the lock keyword, like this:
private object _locker = new Object();

public void JustDoIt(Action a, int interval, int times)
{
    lock (_locker)
    {
        //_isProcessing = true;
        for (int i = 0; i < times; i++)
        {
            a();
            Thread.Sleep(interval);
        }
    }
    //_isProcessing = false;
}
the previous test no longer passes:
Message : Moq.MockException : Expected invocation on the mock exactly 3 times, but was 4 times: c=>c.Writeline("TASK DONE !")
So, am I using the lock keyword badly? Should the lock object be static?
Thank you
Make _isProcessing volatile. And then do this:
public void JustDoIt(Action a, int interval, int times)
{
    if (_isProcessing) return;
    _isProcessing = true;
    for (int i = 0; i < times; i++)
    {
        a();
        Thread.Sleep(interval);
    }
    _isProcessing = false;
}
This has a minor race condition, but since your code isn't synchronized to anything anyway, I don't believe it can possibly matter.
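If that race ever does matter, one standard way to close it (not part of the original answer) is to claim the flag atomically with Interlocked.CompareExchange, using an int instead of a bool. A rough sketch:
// Sketch only: atomically claim the "processing" flag.
private int _isProcessing; // 0 = idle, 1 = running

public void JustDoIt(Action a, int interval, int times)
{
    // Only the caller that flips 0 -> 1 gets to run the body.
    if (Interlocked.CompareExchange(ref _isProcessing, 1, 0) == 1) return;
    try
    {
        for (int i = 0; i < times; i++)
        {
            a();
            Thread.Sleep(interval);
        }
    }
    finally
    {
        Interlocked.Exchange(ref _isProcessing, 0);
    }
}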
You just lock it; that means other threads that want to enter the critical section wait for the lock, and they enter it once the current thread/task releases it.
E.g.: TaskA acquires the lock; it is now in the critical section and executes the method a() 3 times. When TaskA has finished those executions, it releases the lock, and perhaps there is a context switch, so TaskB then runs the method a() (the 4th time).
After TaskB returns, the main thread says: "hey, TaskA has finished, so I verify my results".
Additionally, I don't know whether TaskA is guaranteed to run before TaskB, i.e. whether the task scheduler is FIFO.

Unordered threads problem

I asked a question about lock here and people responded that there is no problem with my lock implementation. But I have caught a problem. Here is the same lock implementation, and I am getting a weird result: I expect to see the numbers start from 1, but the output starts from 5. The example is below.
class Program
{
    static object locker = new object();

    static void Main(string[] args)
    {
        for (int j = 0; j < 100; j++)
        {
            (new Thread(new ParameterizedThreadStart(dostuff))).Start(j);
        }
        Console.ReadKey();
    }

    static void dostuff(dynamic input)
    {
        lock (locker)
        {
            Console.WriteLine(input);
        }
    }
}
The code is fine. But you cannot guarantee the order the threads are executed in. When I run the code I get:
0
1
3
5
2
4
6
10
9
11
7
12
8
etc
If you need to run the threads in a specified order, you could look into using ThreadPool.QueueUserWorkItem instead.
class Program
{
    static object locker = new object();
    static EventWaitHandle clearCount
        = new EventWaitHandle(false, EventResetMode.ManualReset);

    static void Main(string[] args)
    {
        for (int j = 0; j < 100; j++)
        {
            ThreadPool.QueueUserWorkItem(dostuff, j);
        }
        clearCount.WaitOne();
    }

    static void dostuff(dynamic input)
    {
        lock (locker)
        {
            Console.WriteLine(input);
            if (input == 99) clearCount.Set();
        }
    }
}
It doesn't make sense to put a lock where you're putting it, as you're not locking code that changes a value shared by multiple threads. The section of code you're locking doesn't change any variables at all.
The reason the numbers are out of order is that the threads aren't guaranteed to start in any particular order, unless you do something like @Mikael Svenson suggests.
For an example of a shared variable, if you use this code:
class Program
{
    static object locker = new object();
    static int count = 0;

    static void Main(string[] args)
    {
        for (int j = 0; j < 100; j++)
        {
            (new Thread(new ParameterizedThreadStart(dostuff))).Start(j);
        }
        Console.ReadKey();
    }

    static void dostuff(object Id)
    {
        lock (locker)
        {
            count++;
            Console.WriteLine("Thread {0}: Count is {1}", Id, count);
        }
    }
}
You'll probably see that the Thread numbers aren't in order, but the count is. If you remove the lock statement, the count won't be in order either.
You have a couple of problems and wrong assumptions here.
Creating 100 threads in this fashion is not recommended.
The threads are not going to execute in the order they are started.
Placing the lock where you have it will effectively serialize the execution of the threads, removing any advantage you were hoping to gain by using threading.
The best approach is to partition your problem into separate, independent chunks that can be computed simultaneously with as little thread synchronization as possible. These partitions should be executed on a small and fairly static number of threads. You can use the ThreadPool, Parallel, or Task classes for this.
I have included a sample pattern using the Parallel.For method. To keep the sample easy to understand, let's say you have a list of objects that you want to clone into a separate list. Assume the clone operation is expensive and that you want to parallelize cloning many objects. Here is how you would do it; notice the placement and limited use of the lock keyword.
public static void Main()
{
    List<ICloneable> original = GetCloneableObjects();
    List<ICloneable> copies = new List<ICloneable>();
    Parallel.For(0, 100,
        i =>
        {
            ICloneable cloneable = original[i];
            ICloneable copy = cloneable.Clone();
            lock (copies)
            {
                copies.Add(copy);
            }
        });
}
