Strange behavior doing multithreaded activity in C#

I've got a fairly simple program that reads in and serializes an object, using a lambda expression to pass the work off to another thread.
using System;
using System.Threading;
using Newtonsoft.Json;
namespace MultithreadingApplication
{
class ThreadCreationProgram
{
static void Main(string[] args)
{
myObject theObject = new myObject();
Console.WriteLine("Enter the following:");
Console.WriteLine("color:");
theObject.Color = Console.ReadLine();
Console.WriteLine("number");
theObject.Color = Console.ReadLine();
Console.WriteLine("shape:");
theObject.Shape = Console.ReadLine();
Thread myNewThread = new Thread(() => Serialize(theObject));
myNewThread.Start();
myNewThread.Abort();
Console.ReadKey();
}
public static void Serialize(myObject theObject)
{
string json = JsonConvert.SerializeObject(theObject, Formatting.Indented);
Console.WriteLine(json);
Thread.Sleep(1000);
}
}
public class myObject
{
private Int32 number;
private String color, shape;
public Int32 Number
{
get { return number; }
set { number = value; }
}
public String Color
{
get { return color; }
set { color = value; }
}
public String Shape
{
get { return shape; }
set { shape = value; }
}
public myObject()
{
}
}
}
When I run this thing, I notice that sometimes, it won't actually call the Serialize method. I've examined it with breakpoints and it will instantiate the thread with the lambda expression statement and immediately terminate it without ever going down to the Serialize method. I'm new to multithreading, so what's the deal here?

myNewThread.Start();
myNewThread.Abort();
The thread sometimes fails to make any progress because you abort it before it has a chance to execute. If you want the thread to execute, don't abort it.
The whole point of threads is that they execute independently of each other. When you call Start, the thread is instructed to begin executing, and in the meantime the calling thread is free to continue. In your case it immediately aborts the new thread, and that can happen before the thread has even gotten going.

The problem here is that you call Thread.Start(), which tells the computer: "Hey, begin doing the work in this lambda expression in a different context."
Right after that message is sent, you immediately call Thread.Abort() [http://msdn.microsoft.com/en-us/library/System.Threading.Thread_methods%28v=vs.110%29.aspx], which kills the thread, so sometimes no work gets done at all.
This SO answer should point you in the right direction: How to wait for thread to finish with .NET?
Check out Thread.Join().
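For example, here is a minimal sketch of the poster's Main with the Abort call replaced by Join, which blocks until Serialize has finished (an illustration of the suggestion above, not the only possible fix):
Thread myNewThread = new Thread(() => Serialize(theObject));
myNewThread.Start();
myNewThread.Join(); // wait for Serialize to finish instead of aborting the thread
Console.ReadKey();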

How do I guarantee execution of code only if and when optional main thread task and worker threads are finished?

Background:
I have an application I am developing that deals with a large number of addons for another application. One of its primary uses is to safely modify file records in files with fewer records so that they may be treated as one file (almost as if it is combining the files into one set of records). To do this safely, it keeps track of vital information about those files and the changes made to them, so that those changes can be undone if they don't work as expected.
When my application starts, it analyzes those files and keeps essential properties in a cache (to reduce load times). If a file is missing from the cache, the most important stuff is retrieved and then a background worker must process the file for more information. If a file that was previously modified has been updated with a new version, the UI must confirm this with the user and its modification data must be removed. All of this information, including the information about its modifications, is stored in the cache.
My Problem:
My problem is that neither of these processes (the confirmation window or the background file processor) is guaranteed to run. If either of them runs, then the cache must be updated by the main thread. I don't know enough about worker threads, or about which thread runs the BackgroundWorker.RunWorkerCompleted event handler, to decide how to guarantee that the cache updater runs after either (or both) processes are completed.
To sum up: if either process is run, they both must finish and (potentially) wait for the other to be completed before running the cache update code. How can I do this?
ADJUNCT INFO (My current intervention that doesn't seem to work very well):
I have a line in the RunWorkerCompleted handler that waits until the form reference is null before continuing and exiting, but maybe this was a mistake, as it sometimes locks my program up.
SpinWait.SpinUntil(() => overwriteForm == null);
I haven't included any more code because I anticipate that this is more of a conceptual question than a code one. However, if necessary, I can supply code if it helps.
I think CountDownTask is what you need
using System;
using System.Threading;
public class Program
{
public class AtomicInteger
{
protected int value = 0;
public AtomicInteger(int value)
{
this.value = value;
}
public int DecrementAndGet()
{
int answer = Interlocked.Decrement(ref value);
return answer;
}
}
public interface Runnable
{
void Run();
}
public class CountDownTask
{
private AtomicInteger count;
private Runnable task;
private Object lk = new Object();
private volatile bool runnable;
private bool cancelled;
public CountDownTask(Int32 count, Runnable task)
{
this.count = new AtomicInteger(count);
this.task = task;
this.runnable = false;
this.cancelled = false;
}
public void CountDown()
{
if (count.DecrementAndGet() == 0)
{
lock (lk)
{
runnable = true;
Monitor.Pulse(lk);
}
}
}
public void Await()
{
lock (lk)
{
while (!runnable)
{
Monitor.Wait(lk);
}
if (cancelled)
{
Console.WriteLine("Sorry! I was cancelled");
}
else {
task.Run();
}
}
}
public void Cancel()
{
lock (lk)
{
runnable = true;
cancelled = true;
Monitor.Pulse(lk);
}
}
}
public class HelloWorldTask : Runnable
{
public void Run()
{
Console.WriteLine("Hello World, I'm last one");
}
}
public static void Main()
{
Thread.CurrentThread.Name = "Main";
Console.WriteLine("Current Thread: " + Thread.CurrentThread.Name);
CountDownTask countDownTask = new CountDownTask(3, new HelloWorldTask());
Thread worker1 = new Thread(() => {
Console.WriteLine("Worker 1 run");
countDownTask.CountDown();
});
Thread worker2 = new Thread(() => {
Console.WriteLine("Worker 2 run");
countDownTask.CountDown();
});
Thread lastThread = new Thread(() => countDownTask.Await());
lastThread.Start();
worker1.Start();
worker2.Start();
//countDownTask.Cancel();
Console.WriteLine("Main Thread Run");
countDownTask.CountDown();
Thread.Sleep(1000);
}
}
Let me explain (you can also refer to Java's CountDownLatch).
1. To ensure a task runs only after the other tasks, we need to create a Wait function that waits until they are done, so I used:
while(!runnable) {
Monitor.Wait(lk);
}
2. When a task is done, we count down, and if the count reaches zero (meaning all of the tasks are done) we notify the blocked thread to wake up and run the final task:
if (count.DecrementAndGet() == 0)
{
lock (lk)
{
runnable = true;
Monitor.Pulse(lk);
}
}
You can read more about volatile to see why the runnable flag is declared that way. Thanks.
While dung ta van's "CountDownTask" answer isn't quite what I needed, it heavily inspired the solution below (see it for more info). Basically, all I did was add some extra functionality and, most importantly, make it so that each task "votes" on the outcome (true or false). Thanks dung ta van!
To be fair, dung ta van's solution DOES work to guarantee execution, which, as it turns out, isn't quite what I needed. My solution adds the ability to make that execution conditional.
This was my solution which worked:
public enum PendingBool
{
Unknown = -1,
False,
True
}
public interface IRunnableTask
{
void Run();
}
public class AtomicInteger
{
int integer;
public int Value { get { return integer; } }
public AtomicInteger(int value) { integer = value; }
public int Decrement() { return Interlocked.Decrement(ref integer); }
public static implicit operator int(AtomicInteger ai) { return ai.integer; }
}
public class TaskElectionEventArgs
{
public bool VoteResult { get; private set; }
public TaskElectionEventArgs(bool vote) { VoteResult = vote; }
}
public delegate void VoteEventHandler(object sender, TaskElectionEventArgs e);
public class SingleVoteTask
{
private AtomicInteger votesLeft;
private IRunnableTask task;
private volatile bool runTask = false;
private object _lock = new object();
public event VoteEventHandler VoteCast;
public event VoteEventHandler TaskCompleted;
public bool IsWaiting { get { return votesLeft.Value > 0; } }
public PendingBool Result
{
get
{
if (votesLeft > 0)
return PendingBool.Unknown;
else if (runTask)
return PendingBool.True;
else
return PendingBool.False;
}
}
public SingleVoteTask(int numberOfVotes, IRunnableTask taskToRun)
{
votesLeft = new AtomicInteger(numberOfVotes);
task = taskToRun;
}
public void CastVote(bool vote)
{
votesLeft.Decrement();
runTask |= vote;
VoteCast?.Invoke(this, new TaskElectionEventArgs(vote));
if (votesLeft == 0)
lock (_lock)
{
Monitor.Pulse(_lock);
}
}
public void Await()
{
lock(_lock)
{
while (votesLeft > 0)
Monitor.Wait(_lock);
if (runTask)
task.Run();
TaskCompleted?.Invoke(this, new TaskElectionEventArgs(runTask));
}
}
}
Implementing the above solution was as simple as creating the SingleVoteTask in the UI thread and then having each thread affecting the outcome cast a vote.
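For illustration only, here is a hypothetical wiring sketch; CacheUpdateTask, userAcceptedNewVersion, and the vote values are assumptions, not code from the original application:
// Created on the UI thread: two voters, with the cache update as the guarded task.
// CacheUpdateTask is a hypothetical IRunnableTask that runs the cache update code.
SingleVoteTask cacheUpdate = new SingleVoteTask(2, new CacheUpdateTask());

// In the BackgroundWorker.RunWorkerCompleted handler:
cacheUpdate.CastVote(true);

// When the confirmation window is resolved by the user
// (userAcceptedNewVersion: hypothetical bool taken from the dialog result):
cacheUpdate.CastVote(userAcceptedNewVersion);

// On a thread that is allowed to block until both votes are in:
new Thread(cacheUpdate.Await).Start();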

Creating a deadlock

I have some code that looks like the below. Does this create a deadlock?
private readonly object objectLock = new object();
public void MethodA()
{
lock(objectLock)
{
MethodB();
}
}
public void MethodB()
{
lock(objectLock)
{
//do something
}
}
UPDATE: There will be 2 threads running
No - but this would be:
private readonly object objectLockA = new object();
private readonly object objectLockB = new object();
public void MethodA()
{
lock(objectLockA)
{
lock(objectLockB)
{
//...
}
}
}
public void MethodB()
{
lock(objectLockB)
{
lock(objectLockA)
{
//do something
}
}
}
If you call both Methods in parallel (from 2 different threads) then you would get a deadlock...
No, it's not a deadlock. It's the same thread locking on the same synchronization object, and a thread can take nested locks; it just needs to release the lock an equal number of times.
No, you'd need two lock objects to enable a deadlock.
If this is the only mutex involved, it isn't. The same thread can lock the same mutex multiple times, as long as it unlocks it an equal number of times.
Calling MethodA produces the following sequence of operations on the same thread:
Lock objectLock.
Call MethodB.
Lock objectLock.
Unlock objectLock.
Exit MethodB.
Unlock objectLock.
So, objectLock is locked twice and unlocked twice, but there is no deadlock:
If another thread tries to call MethodA it will simply block on the first lock, but it will not deadlock.
If it calls MethodB, the same thing happens.
And if the first thread calls MethodB and then the other thread calls MethodA, again "normal" blocking takes place, but not a deadlock.
If you copy-paste the following lines, then compile and run them, you will see that "never called" is never printed to the console.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
namespace deadlocktest
{
class Program
{
static object object1 = new object();
static object object2 = new object();
public static void FunctionOne()
{
lock (object1)
{
Console.WriteLine("FunctionOne called 1");
Thread.Sleep(1000);
lock (object2)
{
Console.WriteLine("FunctionOne called 2, never called");
}
}
}
public static void FunctionTwo()
{
lock (object2)
{
Console.WriteLine("FunctionTwo called 1");
Thread.Sleep(1000);
lock (object1)
{
Console.WriteLine("FunctionTwo called 2, never called");
}
}
}
static void Main(string[] args)
{
Thread thread1 = new Thread(FunctionOne);
Thread thread2 = new Thread(FunctionTwo);
thread1.Start();
thread2.Start();
int i = 0;
while (i < 9)
{
Console.WriteLine("How bad thread!");
i++;
}
thread1.Join();
thread2.Join();
Console.ReadLine();
}
}
}
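One common way out of this, sketched below against the same object1/object2 fields (this fix is not part of the original example), is to take the locks in the same order in both functions so the circular wait can never form:
public static void FunctionTwoFixed()
{
    lock (object1) // same order as FunctionOne: object1 first, then object2
    {
        Console.WriteLine("FunctionTwo called 1");
        Thread.Sleep(1000);
        lock (object2)
        {
            Console.WriteLine("FunctionTwo called 2, now reachable");
        }
    }
}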

Is it safe to use a boolean flag to stop a thread from running in C#

My main concern is with the boolean flag... is it safe to use it without any synchronization? I've read in several places that it's atomic (including the documentation).
class MyTask
{
private ManualResetEvent startSignal;
private CountDownLatch latch;
private bool running;
MyTask(CountDownLatch latch)
{
running = false;
this.latch = latch;
startSignal = new ManualResetEvent(false);
}
// A method which runs in a thread
public void Run()
{
startSignal.WaitOne();
while(running)
{
startSignal.WaitOne();
//... some code
}
latch.Signal();
}
public void Stop()
{
running = false;
startSignal.Set();
}
public void Start()
{
running = true;
startSignal.Set();
}
public void Pause()
{
startSignal.Reset();
}
public void Resume()
{
startSignal.Set();
}
}
Is it safe to design a task in this way? Any suggestions, improvements, comments?
Note: I wrote my custom CountDownLatch class in case you're wondering where I'm getting it from.
Update:
Here is my CountDownLatch too:
public class CountDownLatch
{
private volatile int m_remain;
private EventWaitHandle m_event;
public CountDownLatch (int count)
{
if (count < 0)
throw new ArgumentOutOfRangeException();
m_remain = count;
m_event = new ManualResetEvent(false);
if (m_remain == 0)
{
m_event.Set();
}
}
public void Signal()
{
// The last thread to signal also sets the event.
if (Interlocked.Decrement(ref m_remain) == 0)
m_event.Set();
}
public void Wait()
{
m_event.WaitOne();
}
}
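For context, here is a minimal usage sketch of that latch (not from the original post): three workers each call Signal once, and the main thread blocks in Wait until all of them have finished.
CountDownLatch latch = new CountDownLatch(3);
for (int i = 0; i < 3; i++)
{
    new Thread(() =>
    {
        // ... do some work ...
        latch.Signal(); // each worker counts the latch down once
    }).Start();
}
latch.Wait(); // blocks until all three workers have signalled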
You better mark it volatile though:
The volatile keyword indicates that a field might be modified by multiple concurrently executing threads. Fields that are declared volatile are not subject to compiler optimizations that assume access by a single thread. This ensures that the most up-to-date value is present in the field at all times.
But I would change your loop:
startSignal.WaitOne();
while(running)
{
//... some code
startSignal.WaitOne();
}
As it is in your post, the 'some code' might execute once after the thread is stopped (i.e. when Stop is called), which is unexpected and may even be incorrect.
Booleans are atomic in C#; however, if you want to modify the flag in one thread and read it in another, you will need to mark it volatile at the very least. Otherwise the reading thread may only actually read it once into a register.
Booleans are atomic in C#: http://msdn.microsoft.com/en-us/library/aa691278(VS.71).aspx
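In the MyTask class from the question, that amounts to a one-word change to the field declaration (a sketch):
private volatile bool running; // written by Start()/Stop() on one thread, read by the loop in Run() on another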
BTW, I just noticed this part of the code:
// A method which runs in a thread
public void Run()
{
startSignal.WaitOne();
while(running)
{
startSignal.WaitOne();
//... some code
}
latch.Signal();
}
You will need to unblock the worker thread twice using "startSignal.Set()" for the code within the while block to execute.
Is this deliberate?

Sample code to illustrate a deadlock by using lock(this)

I've read several articles and posts that say that lock(this), lock(typeof(MyType)), lock("a string") are all bad practice because another thread could lock on the same key and cause a deadlock. In order to understand this problem, I was trying to create some sample code to illustrate the deadlock but have been unable to wrap my head around this.
Can someone write a concise bit of code that illustrates this classic problem? Please keep it short, I can digest code in smaller chunks only.
Edit:
I think lassevk sums it up well: the real problem is that you have lost control over your locks. Once that happens, you cannot control the order in which the locks are taken, and you are allowing a potential deadlock.
lock(this), lock(typeof(MyType)), etc. are all situations where you have chosen a lock that is impossible to control.
A deadlock will only occur if you have more than one lock. You need a situation where both threads hold a resource that the other needs (which means there have to be at least two resources, and the two threads have to attempt to acquire them in different orders).
So a simple example:
// thread 1
lock(typeof(int)) {
Thread.Sleep(1000);
lock(typeof(float)) {
Console.WriteLine("Thread 1 got both locks");
}
}
// thread 2
lock(typeof(float)) {
Thread.Sleep(1000);
lock(typeof(int)) {
Console.WriteLine("Thread 2 got both locks");
}
}
Assuming both threads are started within a second of each other, they will both have time to grab the first lock before either gets to the inner lock. Without the Sleep() call, one of the threads would most likely have time to get and release both locks before the other thread even got started.
The idea is that you should never lock on something you cannot control who has access to.
Type objects are singletons visible to every piece of .NET code, and you cannot control who locks on your "this" object from the outside.
The same goes for strings: since strings are immutable, the framework keeps just one instance of each "hard coded" string and puts it in a pool (the string is said to be interned), so if you write the string "hello" twice in your code, you always get the same object.
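For instance, a quick sketch (not from the original answer) shows interning in action:
string a = "hello";
string b = "hello";
Console.WriteLine(object.ReferenceEquals(a, b)); // True: both literals refer to the same interned string instance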
Consider the following example: you wrote only Thread1 in your own private code, while Thread2 is called by some library you are using on a background thread...
void Thread1()
{
lock (typeof(int))
{
Thread.Sleep(1000);
lock (typeof(long))
{
// do something
}
}
}
void Thread2()
{
lock (typeof(long))
{
Thread.Sleep(1000);
lock (typeof(int))
{
// do something
}
}
}
Sure, here you go.
Note that the common example for a deadlock is when you acquire multiple locks, and two or more threads end up waiting for each other.
For instance, two threads that lock like this:
Thread 1          Thread 2
Lock "A"          Lock "B"
Lock "B"          Lock "A"   <-- both threads will stop dead here,
                                 waiting for the lock to become available.
However, in this example I didn't bother with that; I just let one thread hold the lock indefinitely. You really don't want to lose control over your locks, so while this is a contrived example, the fact that the background thread can completely block the main thread like this is bad.
using System;
using System.Threading;
namespace ConsoleApplication7
{
public class Program
{
public static void Main(string[] args)
{
LockableClass lockable = new LockableClass();
new Thread(new ParameterizedThreadStart(BackgroundMethod)).Start(lockable);
Thread.Sleep(500);
Console.Out.WriteLine("calling Reset");
lockable.Reset();
}
private static void BackgroundMethod(Object lockable)
{
lock (lockable)
{
Console.Out.WriteLine("background thread got lock now");
Thread.Sleep(Timeout.Infinite);
}
}
}
public class LockableClass
{
public Int32 Value1 { get; set; }
public Int32 Value2 { get; set; }
public void Reset()
{
Console.Out.WriteLine("attempting to lock on object");
lock (this)
{
Console.Out.WriteLine("main thread got lock now");
Value1 = 0;
Value2 = 0;
}
}
}
}
This is pretty standard badness: grabbing the locks out of order and then sleeping while holding a lock. Two bad things to do. :)
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading;
namespace DeadLock
{
public class Program
{
static void Main(string[] args)
{
var ddt = new DontDoThat();
ddt.Go();
}
}
public class DontDoThat
{
private int _badSharedState = 0;
private readonly object _lock1 = new object();
private readonly object _lock2 = new object();
public void Go()
{
new Thread(BadGuy1).Start();
new Thread(BadGuy2).Start();
Console.WriteLine("Leaving Go!");
}
public void BadGuy1()
{
lock (_lock1)
{
Thread.Sleep(100); // yielding while holding the lock is bad
lock (_lock2)
{
_badSharedState++;
Console.Write("From Bad Guy #1: {0})", _badSharedState );
}
}
}
public void BadGuy2()
{
lock (_lock2)
{
lock (_lock1)
{
_badSharedState++;
Console.Write("From Bad Guy #2: {0})", _badSharedState);
}
}
}
}
}
The problem is that lock("a string") is locking on a singleton. This means that any other code that locks on the same string shares the lock, and could end up waiting forever.
for example:
using System;
using System.Threading;
namespace ThreadLock
{
class Program
{
static void Main(string[] args)
{
lock ("my lock")
{
ManualResetEvent evt = new ManualResetEvent(false);
WorkerObject worker = new WorkerObject(evt);
Thread t = new Thread(new ThreadStart(worker.Work));
t.Start();
evt.WaitOne();
}
}
}
class WorkerObject
{
private ManualResetEvent _evt;
public WorkerObject(ManualResetEvent evt)
{
_evt = evt;
}
public void Work()
{
lock ("my lock")
{
Console.WriteLine("worked.");
_evt.Set();
}
}
}
}
In this case, the calling code takes a lock on a string and then creates a worker object. The worker object's Work() locks on the same string, which is effectively a singleton in C# because of interning. It ends up in deadlock because the caller owns the lock and is waiting for a signal that will never come.
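The usual fix, sketched here as an assumption rather than as part of the original answer, is to lock on a private object that no outside code can reach:
class WorkerObject
{
    private readonly object _sync = new object(); // private to this instance; no other code can lock on it
    private readonly ManualResetEvent _evt;
    public WorkerObject(ManualResetEvent evt)
    {
        _evt = evt;
    }
    public void Work()
    {
        lock (_sync)
        {
            Console.WriteLine("worked.");
            _evt.Set();
        }
    }
}
With this change the caller's lock("my lock") no longer collides with the worker's lock, so the worker can set the event and the caller's WaitOne returns.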
class Character
{
public Character Other;
public string Name;
private object locker = new object();
public Character(string name)
{
Name = name;
}
public void Go()
{
lock (locker)
{
Thread.Sleep(1000);
Console.WriteLine("go in {0}", Name);
Other.Go();
}
}
}
class Program
{
static void Main(string[] args)
{
Character a = new Character("A");
Character b = new Character("B");
a.Other = b;
b.Other = a;
new Thread(a.Go).Start();
b.Go();
Console.ReadLine();
}
}

Getting list of currently active managed threads in .NET?

For a "log information for support" type of function I'd like to enumerate and dump active thread information.
I'm well aware of the fact that race conditions can make this information semi-inaccurate, but I'd like to try to get the best possible result, even if it isn't 100% accurate.
I looked at Process.Threads, but it returns ProcessThread objects; I'd like to have a collection of Thread objects, so that I can log their names and whether they're background threads or not.
Is there such a collection available, even if it is just a snapshot of the active threads when I call it?
ie.
Thread[] activeThreads = ??
Note, to be clear, that I am not asking about Process.Threads; this collection gives me a lot, but not all of what I want. I want to know how much time specific named threads in our application are currently using (which means I will have to look at connecting the two types of objects later, but the names are more important than the CPU time to begin with).
If you're willing to replace your application's Thread creations with another wrapper class, said wrapper class can track the active and inactive Threads for you. Here's a minimal workable shell of such a wrapper:
namespace ThreadTracker
{
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Threading;
public class TrackedThread
{
private static readonly IList<Thread> threadList = new List<Thread>();
private readonly Thread thread;
private readonly ParameterizedThreadStart start1;
private readonly ThreadStart start2;
public TrackedThread(ParameterizedThreadStart start)
{
this.start1 = start;
this.thread = new Thread(this.StartThreadParameterized);
lock (threadList)
{
threadList.Add(this.thread);
}
}
public TrackedThread(ThreadStart start)
{
this.start2 = start;
this.thread = new Thread(this.StartThread);
lock (threadList)
{
threadList.Add(this.thread);
}
}
public TrackedThread(ParameterizedThreadStart start, int maxStackSize)
{
this.start1 = start;
this.thread = new Thread(this.StartThreadParameterized, maxStackSize);
lock (threadList)
{
threadList.Add(this.thread);
}
}
public TrackedThread(ThreadStart start, int maxStackSize)
{
this.start2 = start;
this.thread = new Thread(this.StartThread, maxStackSize);
lock (threadList)
{
threadList.Add(this.thread);
}
}
public static int Count
{
get
{
lock (threadList)
{
return threadList.Count;
}
}
}
public static IEnumerable<Thread> ThreadList
{
get
{
lock (threadList)
{
return new ReadOnlyCollection<Thread>(threadList);
}
}
}
// either: (a) expose the thread object itself via a property or,
// (b) expose the other Thread public methods you need to replicate.
// This example uses (a).
public Thread Thread
{
get
{
return this.thread;
}
}
private void StartThreadParameterized(object obj)
{
try
{
this.start1(obj);
}
finally
{
lock (threadList)
{
threadList.Remove(this.thread);
}
}
}
private void StartThread()
{
try
{
this.start2();
}
finally
{
lock (threadList)
{
threadList.Remove(this.thread);
}
}
}
}
}
and a quick test driver of it (note I do not iterate over the list of threads, merely get the count in the list):
namespace ThreadTracker
{
using System;
using System.Threading;
internal static class Program
{
private static void Main()
{
var thread1 = new TrackedThread(DoNothingForFiveSeconds);
var thread2 = new TrackedThread(DoNothingForTenSeconds);
var thread3 = new TrackedThread(DoNothingForSomeTime);
thread1.Thread.Start();
thread2.Thread.Start();
thread3.Thread.Start(15);
while (TrackedThread.Count > 0)
{
Console.WriteLine(TrackedThread.Count);
}
Console.ReadLine();
}
private static void DoNothingForFiveSeconds()
{
Thread.Sleep(5000);
}
private static void DoNothingForTenSeconds()
{
Thread.Sleep(10000);
}
private static void DoNothingForSomeTime(object seconds)
{
Thread.Sleep(1000 * (int)seconds);
}
}
}
Not sure if you can go such a route, but it will accomplish the goal if you're able to incorporate it at an early stage of development.
Is it feasible for you to store thread information in a lookup as you create each thread in your application?
As each thread starts, you can get its ID using AppDomain.GetCurrentThreadId(). Later, you can cross-reference this with the data returned from Process.Threads.
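A minimal sketch of that idea (the ThreadRegistry class and its methods are hypothetical helpers, not an existing API): each thread registers itself at startup, keyed by the unmanaged thread ID, so it can later be matched to a ProcessThread.Id:
using System;
using System.Collections.Generic;
using System.Threading;

static class ThreadRegistry
{
    private static readonly Dictionary<int, Thread> threads = new Dictionary<int, Thread>();

    // Call this as the first line of each thread's entry method.
    public static void Register()
    {
        int osId = AppDomain.GetCurrentThreadId(); // marked obsolete in newer frameworks
        lock (threads)
        {
            threads[osId] = Thread.CurrentThread;
        }
    }

    // Later, match an osId taken from ProcessThread.Id to recover the managed Thread,
    // combining its Name/IsBackground with the ProcessThread's timing information.
    public static bool TryGet(int osId, out Thread thread)
    {
        lock (threads)
        {
            return threads.TryGetValue(osId, out thread);
        }
    }
}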
