Thread Synchronization - C#

I have a list of objects. For each object I want to run a totally separate thread (for thread safety). In a while loop I will pick one object from my list and run a thread for it, then run the next thread for the next object, and so on. All threads should be synchronized so that the resources they share (values, connections being opened/closed) do not change unexpectedly.

Starting a thread per object is not necessarily wise; you should probably have a small number of worker threads picking items off the list (or better, a Queue<T>), synchronizing access to that list/queue. An example of a thread-safe queue can be found in this thread.
Once you have a work item, there is no magic bullet for making the rest of the code you write (to process it) thread-safe. A sensible approach that keeps things simple is immutability - either true immutability (the items can't change), or simply don't change the object. You can of course implement locking around the work item, but this only helps if all your code uses the same locking strategy, which is hard to enforce.
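A minimal sketch of that worker-pool idea, assuming the work items are simple objects; here BlockingCollection<T> (wrapping a ConcurrentQueue<T>) handles the synchronization of the queue itself, and the class and variable names are made up:
using System;
using System.Collections.Concurrent;
using System.Threading;

class WorkerPoolSketch
{
    static void Main()
    {
        // The queue itself is thread-safe; no extra locking is needed to add or take items.
        var queue = new BlockingCollection<string>(new ConcurrentQueue<string>());

        // A small, fixed number of workers instead of one thread per item.
        var workers = new Thread[3];
        for (int i = 0; i < workers.Length; i++)
        {
            workers[i] = new Thread(() =>
            {
                // GetConsumingEnumerable blocks until an item is available
                // and completes once CompleteAdding has been called.
                foreach (var item in queue.GetConsumingEnumerable())
                {
                    Console.WriteLine(Thread.CurrentThread.ManagedThreadId + ": " + item);
                }
            });
            workers[i].Start();
        }

        foreach (var item in new[] { "a", "b", "c", "d", "e" })
            queue.Add(item);

        queue.CompleteAdding();                 // signal: no more work will arrive
        foreach (var w in workers) w.Join();
    }
}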

In a while loop I will pick one object from my list and run a thread for it, then run the next thread for the next object
If I really wanted a thread per object, which I probably wouldn't, I would create a class like this:
class ObjectProcessingThread
{
    Thread processingThread;

    public object TargetObject { get; set; }

    public void Start()
    {
        // start the processing thread with threadEntryPoint as the work the thread will do
        processingThread = new Thread(threadEntryPoint);
        processingThread.Start();
    }

    private void threadEntryPoint()
    {
        // do stuff with TargetObject
    }
}
Then in the while loop new up an ObjectProcessingThread for each object, setting its TargetObject property, then calling Start.
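A sketch of that driving loop, assuming a list called myObjects (the name is made up) and the ObjectProcessingThread class above:
var processors = new List<ObjectProcessingThread>();
foreach (var obj in myObjects)          // myObjects: the caller's list of objects (assumed)
{
    var processor = new ObjectProcessingThread { TargetObject = obj };
    processor.Start();                  // each item gets its own thread
    processors.Add(processor);          // keep a reference so the threads can be joined later
}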
All threads should be synchronized so that the resources they share (values, connections being opened/closed) do not change
If you don't want values to change, don't change them.

Related

Parallel execution calling a method thread safety

Having the following:
List<Person> persons = // list of persons

Parallel.ForEach(persons, (i) =>
{
    AddAge(i);
});

// Does this method need to be thread safe?
// Why?
public void AddAge(Person person)
{
    // Multiple threads execute here at once. However, they're
    // working with their own "person" object, therefore
    // each thread won't corrupt the others' "person" objects - is this assumption correct?
    person.Age += 10;
}
Since each person gets updated "separately" on its own thread and one has nothing to do with another, does the AddAge() method have to be thread-safe?
Does the CLR execute its own copy of AddAge() per thread, making it separate between threads?
Thread safety is about modifying the same data from multiple threads. If you are operating on separate data (as your Parallel.ForEach does) and you properly gate your work so that batches complete before dependent work starts, you do not need thread-safe code inside the method: you are achieving thread safety outside the method, by ensuring each thread gets its own data to work with.
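To make the distinction concrete, a small sketch (the Person type comes from the question; everything else is made up): each iteration touching only its own Person needs no locking, while anything shared across iterations does:
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

static class AgeExample
{
    static void AddAges(List<Person> persons)
    {
        // Safe without locks: each iteration writes only to its own Person instance.
        Parallel.ForEach(persons, p => p.Age += 10);

        // Shared state needs synchronization: every iteration writes the same variable.
        int total = 0;
        Parallel.ForEach(persons, p =>
        {
            Interlocked.Add(ref total, p.Age);   // or guard it with a lock
        });
    }
}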

Force thread not to give back CPU until a part is finished

Consider two threads running simultaneously: A is reading and B is writing. When A is reading, in the middle of its code, A's CPU time slice ends and thread B continues.
Is there any way to not give the CPU back until A finishes, while still letting B start or continue afterwards?
You need to understand that you have almost no control over when the CPU is given back and to whom it is given. The operating system does that. To control that, you would need to be the operating system. The only things you can usually do are:
start a thread
set a thread's priority, so some threads are more likely to get CPU time than others
put a thread to sleep immediately and ask the operating system to wake it up upon some condition, maybe with a timeout (a waiting-time limit)
As a special case, or rather a typical use case, the last point is also provided with a shorthand:
put a thread to sleep immediately for a specified amount of time
By "sleep" I mean that the thread is paused and will not get any CPU time, even if all CPUs are idle, unless the thread is woken up by the OS due to some condition.
Furthermore, in a typical case, there is no "thread A and thread B that switch CPU time between them"; there are lots of threads from various processes and the operating system itself, and your two threads. This means that when your thread A loses the CPU, most probably it will not be thread B that gets the time now. Some other thread from somewhere else will get it, and at some future point in time maybe your thread A, or maybe thread B, will get it back.
This means there is very little you can be sure of. You can be sure that your threads are
either dead
or sleeping
or proceeding 'forward' in a hard to determine order
If you need to ensure that some threads are synchronized, you must either not start them simultaneously, or put them to sleep at precise moments and wake them up in a precise order.
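For reference, a minimal sketch of the few knobs listed above, using standard System.Threading calls (the values and the work are arbitrary):
using System;
using System.Threading;

class SchedulingKnobs
{
    static void Main()
    {
        // start a thread
        var worker = new Thread(() => Console.WriteLine("working"));
        // hint at its priority; the OS still decides who actually runs
        worker.Priority = ThreadPriority.BelowNormal;
        worker.Start();

        // put the current thread to sleep for a fixed amount of time...
        Thread.Sleep(500);

        // ...or until some condition is signalled, with an optional timeout
        var signal = new ManualResetEventSlim(false);
        signal.Wait(TimeSpan.FromSeconds(5));

        worker.Join();
    }
}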
You've just said in comments:
You know, if A's CPU time finishes in the middle, the data that has been retrieved is not complete
This means that you need to ensure that thread B does not try to touch the data before thread A finishes writing it. But also, if you think about it, you need to ensure that thread A doesn't start writing the next data while thread B is still reading the previous data.
This means synchronization. This means that threads A and B must wait if the other thread is touching the data. This means that they need to be put to sleep and woken up when the other thread finishes.
In C#, the easiest way to do that is to use the lock(x) keyword. When a thread enters a lock() section, it proceeds only if it is able to get the lock; if not, it is put to sleep. It can't get the lock if any other thread was faster and got it before. A thread releases the lock when it finishes its job, and at that point one of the sleeping threads is woken up and given the lock.
lock(foo) { // <- this line means 'acquire the lock or sleep'
    iam.doing(myjob);
    very.important(things);
    thatshouldnt.be.interrupted();
    byother(threads);
} // <- this line means 'release the lock'
So, when a thread gets through the lock(foo){ line, you can't be sure it won't be interrupted. Oh, surely it will be: the OS will switch threads back and forth to other processes, and so on. But you can be sure that no other threads of your app will be inside that code block. If they tried to get inside while your thread held the lock, they'd immediately fall asleep at the first lock line. One of them will later be woken up when your thread gets out of that code.
There's one more thing. The lock() keyword requires a parameter. I wrote foo there; you need to pass something that will act as the lock. It can be any object, even a plain object:
private readonly object thelock = new object();

private void dosomething()
{
    lock (thelock)
    {
        foobarize(thebaz);
    }
}
However, you must ensure that all threads use the same lock instance. Writing code like
private void dosomething()
{
    object thelock = new object();
    lock (thelock)
    {
        foobarize(thebaz);
    }
}
is nonsense, since every thread executing those lines will try locking on its own new object instance, will see it as "free" (it's new, just created, no one took it earlier), and will immediately get into the protected code block.
Now, you wrote about using ConcurrentQueue. This class provides safety mechanisms for concurrent access. You can be sure that adding, reading, or removing items from that queue is already safe; the collection makes it safe, and you don't need to add synchronization to add or remove items. If you observe any ill effects, then most probably you have put an item into the collection and then kept modifying that item. A concurrent collection will not guard you against that: it can only make sure that add/remove/etc. are safe, but it has no knowledge of or control over what you do to the items:
In short, if some thread B tries to read items from the collection, then in thread A this is NOT safe:
concurrentcoll.Add(item);
item.x = 5;
item.foobarize();
but this is safe:
item.x = 5;
item.foobarize();
concurrentcoll.Add(item);
// and do not touch the item anymore here.
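The same point with a concrete ConcurrentQueue<T> (the Item type and the names here are made up): the queue makes Enqueue/TryDequeue safe, but it cannot protect an item you keep modifying after publishing it:
using System.Collections.Concurrent;

class Item { public int X; }

class ProducerSketch
{
    private readonly ConcurrentQueue<Item> _queue = new ConcurrentQueue<Item>();

    public void Produce(int value)
    {
        var item = new Item();
        item.X = value;          // finish building the item first...
        _queue.Enqueue(item);    // ...then publish it, and do not touch it afterwards
    }

    public bool TryConsume(out Item item)
    {
        return _queue.TryDequeue(out item);   // safe even with concurrent producers
    }
}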

Cross-Thread access of a field in C#

If a class has an array (it doesn't really matter of what), one thread is adding data to it while another thread needs to process the data that is already there. With my limited knowledge of multithreading, how could this work? The first problem I can think of is an item being added while the other thread is processing what's still there. At first I thought that wouldn't be a problem, since the processing thread would pick it up the next time it ran, but then I realized that while the processing thread removes items it has already processed, the adding thread would not see that change, possibly (?) wreaking havoc. Is there any good way to implement this behavior?
What you've described is basically the readers-writers problem. If you want to take care of multithreading, you're going to need either a concurrent collection or a lock. The simplest implementation of a lock is just locking on an object:
private readonly object myLock = new object();

public MyClass ReadFromSharedArray()
{
    lock (myLock)
    {
        // do whatever here (read and return an item)
    }
}

public void WriteToSharedArray(MyClass data)
{
    lock (myLock)
    {
        // do whatever here (add the new item)
    }
}
There are better locks, such as ReaderWriterLockSlim, but this sort of basic implementation should be a good starting point.
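A hedged sketch of that ReaderWriterLockSlim variant, reusing the MyClass placeholder from above and assuming the shared data is a list:
using System.Collections.Generic;
using System.Threading;

class SharedData
{
    private readonly ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim();
    private readonly List<MyClass> items = new List<MyClass>();

    public MyClass ReadFromSharedArray(int index)
    {
        rwLock.EnterReadLock();            // many readers may hold this at once
        try { return items[index]; }
        finally { rwLock.ExitReadLock(); }
    }

    public void WriteToSharedArray(MyClass data)
    {
        rwLock.EnterWriteLock();           // writers get exclusive access
        try { items.Add(data); }
        finally { rwLock.ExitWriteLock(); }
    }
}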
Also, you mentioned adding/removing from arrays; I'm assuming you meant a List (or better yet a Queue), and there's a ConcurrentQueue<T> which could be a good replacement.

.net Lock - Two Questions

Two questions about the lock() construct in .NET.
First, I am aware that if an object is locked within one class and another class attempts to lock the same object, this produces a deadlock. But why? I have read about it on MSDN, but MSDN is rarely all that clear.
----Edit Question One----
Still confused. I have a main thread (UI thread) that spawns many Threadpool threads. Each child thread locks the data before it works with it. This works fine every time.
If I then attempt to lock the same data from the UI thread, to check whether I should even bother creating a new thread for an edge case, I create a deadlock nearly every time.
----Edit Question Two----
Secondly, if I have a compound object that I lock, are all child objects within it locked as well? Short code demo:
internal sealed class Update
{
    // Three objects instantiated via other external assemblies
    public DataObject One { get; set; }
    public DataObject Two { get; set; }
    public ReplayStatus Status { get; set; }
}
If I call lock(UpdateObject), are each of the three internal objects and all of their child objects locked as well?
So should I do something like this to prevent threads from playing with my data objects:
lock (UpdateObject.One)
{
    lock (UpdateObject.Two)
    {
        lock (UpdateObject.Status)
        {
            // Do stuff
        }
    }
}
First, I am aware that if an object is locked within one class and another class attempts to lock the same object, this produces a deadlock.
No. If one thread locks an object and a second thread attempts to lock that object, that second thread must wait for the first thread to exit the lock.
Deadlock is something else:
1. thread1 locks instanceA
2. thread2 locks instanceB
3. thread1 attempts to lock instanceB and now must wait on thread2
4. thread2 attempts to lock instanceA and now must wait on thread1
These two threads can no longer execute, and so never release their locks. What a mess.
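That sequence looks roughly like this in code; the Thread.Sleep calls are only there to make the unlucky interleaving likely, and instanceA/instanceB are placeholders:
using System.Threading;

class DeadlockSketch
{
    static readonly object instanceA = new object();
    static readonly object instanceB = new object();

    static void Main()
    {
        var thread1 = new Thread(() =>
        {
            lock (instanceA)               // step 1
            {
                Thread.Sleep(100);         // give thread2 time to take instanceB
                lock (instanceB) { }       // step 3: waits on thread2 forever
            }
        });

        var thread2 = new Thread(() =>
        {
            lock (instanceB)               // step 2
            {
                Thread.Sleep(100);
                lock (instanceA) { }       // step 4: waits on thread1 forever
            }
        });

        thread1.Start();
        thread2.Start();
    }
}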
If I call lock(UpdateObject), are each of the three internal objects and all of their child objects locked as well?
No, the "lock" is only on the locked instance. Note: the lock doesn't prevent anything other than a second thread from acquiring a lock on that instance at the same time.
First, the whole point of a lock is that two sections of code can't get ahold of the same lock at once. This is to coordinate multiple threads working with the same stuff without interfering with each other. If you have a lock on an object, then anyone else that tries to get the lock will block (wait) until the original lock is released (only one thread can have the lock at any given time). You only have a deadlock if the first thread never gives up the lock, or if both threads are waiting for something from each other and neither can proceed until each gets what it's waiting for.
Second, if you lock an object in C#, you're not really "locking" the object in any semantic sense. You're acquiring a "lock" on the object (which you later release or relinquish). The object is purely a convenient token that is used to uniquely identify which lock you wish to obtain. So no, a lock on an object does not create a lock on any sub-parts of that object.
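Rather than nesting a lock per child object as in the question, the usual pattern is one private lock object that every piece of code touching the compound state agrees to use. A sketch under that assumption (the Modify helper and the _gate name are made up):
using System;

internal sealed class Update
{
    private readonly object _gate = new object();

    public DataObject One { get; set; }
    public DataObject Two { get; set; }
    public ReplayStatus Status { get; set; }

    public void Modify(Action<Update> change)
    {
        // This protects One/Two/Status only if *all* code that touches them
        // goes through the same _gate; the lock by itself locks nothing else.
        lock (_gate)
        {
            change(this);
        }
    }
}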

creating and terminating a thread contained in a collection

I have a custom collection (a thread-safe ObservableQueue). I implemented the business logic inside the collection class (i.e. dequeue the items one by one and expose them to the outside). This is working fine. To prevent the collection from blocking the thread it is initialised in, the ObservableQueue uses its own thread to perform that work. Now I am not perfectly sure of any pitfalls that could occur.
Is it a bad idea to initialise (not start! only initialise) the thread in the constructor? And what would be a good, if not best, practice for terminating the thread? Note, I don't need to know how to terminate a thread; that is working fine. I am rather interested in whether there is something wrong with doing it via the disposable pattern, or with creating a method which would need to be called to terminate the thread. If implementing IDisposable, are there any things I have to take into account regarding the collection/queue?
Edit: The thread is actually only pre-initialised to prevent a NullReferenceException from being thrown in the Enqueue method, where it is properly initialised again (the Enqueue method is supposed to check whether a dequeuing thread is already running and, if not, start a new one). Note that whenever all items are dequeued and the thread has done its work it will not be alive any longer either, so any time the queue is empty and a new item is added, a new thread will be started to process the queue:
if (!_dequeuingThread.IsAlive)
{
    // start the dequeuing thread
    _dequeuingThread = new Thread(new ThreadStart(StartDequeuing));
    _dequeuingThread.Name = "DeQueueThread";
    _dequeuingThread.Start();
}
The if-statement does need an initialised thread. There are other possible ways of achieving this, but pre-initialising the thread seemed the least bothersome. You can see that after checking whether the thread is alive, which it should not be when only pre-initialised, it gets initialised again properly.
I don't see anything wrong with initialising it in the constructor, but obviously bear in mind that it will be initialised on a different thread than your worker thread.
As for stopping, I generally have a volatile boolean flag that the worker checks to keep running. If your worker thread sleeps at all, then have it wait on an event rather than sleeping, so you can wake it up immediately when stopping it.
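A minimal sketch of that stop-flag-plus-event pattern (all names are made up):
using System.Threading;

class Worker
{
    private volatile bool running = true;
    private readonly ManualResetEventSlim wakeUp = new ManualResetEventSlim(false);
    private Thread thread;

    public void Start()
    {
        thread = new Thread(() =>
        {
            while (running)
            {
                // do one unit of work here, then wait on the event instead of
                // Thread.Sleep so that Stop() can wake the worker immediately
                wakeUp.Wait(1000);
                wakeUp.Reset();
            }
        });
        thread.Start();
    }

    public void Stop()
    {
        running = false;
        wakeUp.Set();      // wake the worker so it notices the flag right away
        thread.Join();
    }
}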
There seems to be a problem with the fact that the consumer will initialize this collection object by calling its constructor and will think the object is initialized (that is what a constructor is supposed to do), which is not correct, because the initialization is happening on a separate thread created by the constructor. So basically you need to implement some sort of asynchronous API on this object: the consumer calls an initialize method (after creating the object via the constructor) and then, either by passing a callback to the initialize method or by subscribing to an event on the collection object, the consumer gets to know that the initialization has completed.
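One way to shape that asynchronous initialization, sketched with an event the consumer subscribes to before calling Initialize (all names here are assumptions, not the original class):
using System;
using System.Threading;

class ObservableQueueSketch<T>
{
    // Consumers subscribe to this before calling Initialize.
    public event EventHandler Initialized;

    public void Initialize()
    {
        var worker = new Thread(() =>
        {
            // ... long-running setup happens off the constructor's thread ...
            Initialized?.Invoke(this, EventArgs.Empty);
        });
        worker.IsBackground = true;
        worker.Start();
    }
}

// usage (assumed):
// var q = new ObservableQueueSketch<int>();
// q.Initialized += (s, e) => Console.WriteLine("ready");
// q.Initialize();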
