Control.BeginInvoke Execution Order - c#

When calling BeginInvoke(), will the delegates come back in the same order the method was called, or is there no guarantee which delegates will come back first?
public Form1()
{
    InitializeComponent();
    for (int i = 0; i < 100; i++)
    {
        Thread t = new Thread(DisplayCount);
        t.Start(i);
    }
}

public void DisplayCount(object count)
{
    if (InvokeRequired)
    {
        BeginInvoke(new Action<object>(DisplayCount), count);
        return;
    }
    listBox1.Items.Add(count);
}
The list of integers comes back out of order.

Control.BeginInvoke() will execute the action asynchronously, but on the UI thread.
If you call BeginInvoke() multiple times with different actions, they come back in whatever order the calls happen to be queued, which is not necessarily the order in which you started the threads.
As a side note, you should probably use some sort of synchronization mechanism around your listBox1.Items.Add(count) calls, perhaps locking on its SyncRoot property.
From MSDN - ListBox.ObjectCollection Class
Any public static (Shared in Visual Basic) members of this type are thread safe. Any instance members are not guaranteed to be thread safe.
(Emphasis added)

If you call the same function multiple times, they should come back in the same order, but maybe not! If one call is analysing a 1 TB dataset and another is just doing some logging, I don't think they will come back in the same order.
It also depends on the DispatcherPriority you have set for BeginInvoke (this applies to WPF's Dispatcher.BeginInvoke rather than WinForms' Control.BeginInvoke). A low priority like SystemIdle will be executed later than a higher priority like Send.
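To be clear, that priority overload belongs to WPF's Dispatcher.BeginInvoke, not to the WinForms Control.BeginInvoke in the question. A minimal WPF sketch (listBox1 is an assumed control name inside a Window's code-behind):
// Inside a WPF Window's code-behind (requires using System.Windows.Threading):
Dispatcher.BeginInvoke(DispatcherPriority.Send,
    new Action(() => listBox1.Items.Add("queued at Send priority")));        // dispatched early
Dispatcher.BeginInvoke(DispatcherPriority.SystemIdle,
    new Action(() => listBox1.Items.Add("queued at SystemIdle priority")));  // dispatched only when the dispatcher is idle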

If you start a thread using Thread.Start(), the thread function executes asynchronously at some unpredictable time after that call.
That, in my opinion, is why the numbers come back in a random order.
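If deterministic ordering matters, one option (a sketch only, reusing the question's Form1 and listBox1) is to skip the worker threads and queue the updates from one thread, since delegates posted from a single thread are dispatched in the order they were queued:
public Form1()
{
    InitializeComponent();
    for (int i = 0; i < 100; i++)
    {
        int count = i; // copy the loop variable so each delegate captures its own value
        // Posted from one thread, these run later on the UI thread in posting order.
        BeginInvoke(new Action(() => listBox1.Items.Add(count)));
    }
}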

Invoke not switching back to separate thread

I have the method below that is running on a separate thread from the main UI thread, and I am trying to update the ListBox control on the main thread. The code does work and the field does get updated, but when the Invoke method runs it switches to the main thread. The problem is that the code after the Invoke also runs on the main thread, but I need it to run on the separate thread.
public static void Status_Message(string str, int destination, int prompt)
{
    //Clear_System_Message_Area();
    sysmsg++;
    ListBox tl = Application.OpenForms["GMLEC"].Controls["groupBox2"].Controls["TestList"] as ListBox;
    if (!tl.InvokeRequired)
    {
        tl.Items.Add(str);
        tl.Refresh();
    }
    else
    {
        tl.Invoke(new Action<string, int, int>(Status_Message), str, destination, prompt);
    }
    if (destination == 1)
    {
        Printer.Output(str);
    }
    if (prompt == 1)
    {
        Pause(false);
    }
    if (sysmsg > 23)
    {
        Pause(true);
    }
}
Is there a way to make it go back to the separate thread?
If you don't want code run on the UI thread, don't invoke the method that contains it.
For what it's worth, I disagree with any code that uses InvokeRequired. First of all, you ought to know from the context whether invoke is required or not. If you don't know which thread the code that's executing is on, then there is too much coupling between the UI and background task parts of the code.
But secondly, the Control.Invoke() method has to check which thread is current anyway, because it has to work whether you are on the UI thread or not. You can always call it safely from the UI thread, and when you do, it can't go queueing up your delegate for invocation and then waiting for it, because that would deadlock. It has to just invoke the delegate directly, but only in that case, which means it's doing the InvokeRequired check anyway.
So, taking all of that into account, just write your code to always invoke the part that needs invoking, and be done with it.
For example:
public static void Status_Message(string str, int destination, int prompt)
{
    //Clear_System_Message_Area();
    sysmsg++;
    ListBox tl = Application.OpenForms["GMLEC"].Controls["groupBox2"].Controls["TestList"] as ListBox;
    tl.Invoke((MethodInvoker)(() =>
    {
        tl.Items.Add(str);
        tl.Refresh();
    }));
    if (destination == 1)
    {
        Printer.Output(str);
    }
    if (prompt == 1)
    {
        Pause(false);
    }
    if (sysmsg > 23)
    {
        Pause(true);
    }
}
Now, some other notes about this:
It's doubtful that you should be calling Refresh(). Let Winforms deal with updating on its own. If you've somehow interfered with it refreshing the window normally, fix that. Don't hack around it by calling Refresh() yourself.
It's almost certain that there's a better way to encapsulate the ListBox object than by always looking it up from the top of the UI control graph. For example, maybe the actual object should have been referenced directly (e.g. from a TestList field) and passed to the code that will eventually need it.
Finally, and most important, the fact that you're using Invoke() at all is a big warning flag in modern code. There's a decent chance that your overall code could be refactored to use async/await in a way that allows it to read more naturally and still work correctly, but at the very least it would be better to use Progress<T> to mediate cross-thread updates like this.
To address any of these issues would be beyond the scope of the current question, but I do encourage you to take the suggestions under consideration.
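As a rough sketch of the Progress<T> suggestion (assuming TestList is a ListBox field on the form rather than being looked up by name, and the loop body stands in for real work):
// Progress<T> captures the SynchronizationContext of the thread it is created on,
// so constructing it on the UI thread makes its callback run on the UI thread.
void StartBackgroundWork(IProgress<string> progress)
{
    Task.Run(() =>
    {
        for (int i = 0; i < 10; i++)
        {
            // ... do some background work here ...
            progress.Report("step " + i);   // marshalled back to the UI thread
        }
    });
}

// Called from the UI thread, e.g. in a button handler:
// StartBackgroundWork(new Progress<string>(msg => TestList.Items.Add(msg)));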
This might help...
Normally I use Invoke() to run a part of the script after a certain amount of time. Invoke() does not repeat; if you want it to repeat, you can use InvokeRepeating().
Another option is to use multi-threading. Here's how:
using System.Threading;

public static Thread newThread = new Thread(MultiThread);

private void Start()
{
    newThread.Start();
    // also newThread.Abort() to quit the thread
}

private static void MultiThread()
{
    // this is the separate thread
    // I normally use this for a "while (true)" loop because it will stop responding
    // otherwise
}
Sorry for any typos.
Hopefully this helps.

calling static method on a new thread

Suppose I have a static method like so:
public static string ProcessMessage()
{
    string testString = " this is test ";
    ....
    return testString;
}
and another method SendMessage:
public void SendMessage()
{
    Thread th = new Thread(this.ProcessMessage);
    th.Start();
    th.Join();
}
What happens when SendMessage is called several times, one right after another?
Suppose the first thread calls ProcessMessage and is at line 1 when another thread calls ProcessMessage. What happens to the first thread? Will it ever finish, since ProcessMessage is a static method? What will ProcessMessage's variables look like? Will thread 2 overwrite thread 1?
To answer your general question, multiple threads can call a static method, and each thread will process that method call separately. If static variables are used within the static method, then you may run into cross-threading issues, due to those variables being shared among multiple threads, but the method code itself is simply a set of instructions that will be followed by whatever threads enter it.
In regards to your specific example, the Thread Constructor can only be passed either a ParameterizedThreadStart or a ThreadStart object (and possibly an Int32). Both types of ThreadStart parameters are delegates with void signatures (they cannot return anything).
Also, you are calling Thread.Start followed by Thread.Join. Thread.Join blocks the calling thread until the waited-on thread completes. Since that is the case, multiple calls to SendMessage() from the same thread will never spawn multiple simultaneous threads; each call creates a new thread and then waits for it to finish before moving on to the next call to SendMessage(). This is no better than simply calling ProcessMessage on the original thread.
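To make both points concrete, here is a small self-contained sketch (names are made up, and the signature is changed to a void method taking an object, as the Thread constructor requires): each thread gets its own copy of the locals inside the static method, and dropping the per-call Join is what actually lets the calls overlap.
using System;
using System.Threading;

class Demo
{
    // A static method: the local variable below lives on each calling thread's
    // own stack, so concurrent callers do not overwrite each other.
    static void ProcessMessage(object id)
    {
        string testString = " this is test " + id;
        Thread.Sleep(100);                 // pretend to do work
        Console.WriteLine(testString);     // each thread prints its own value
    }

    static void Main()
    {
        var threads = new Thread[5];
        for (int i = 0; i < threads.Length; i++)
        {
            threads[i] = new Thread(ProcessMessage);  // ParameterizedThreadStart
            threads[i].Start(i);                      // no Join here, so the calls overlap
        }
        foreach (var t in threads) t.Join();          // wait for all of them at the end
    }
}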

Under what conditions can a thread enter a lock (Monitor) region more than once concurrently?

(question revised): So far, the answers all include a single thread re-entering the lock region linearly, through things like recursion, where you can trace the steps of a single thread entering the lock twice. But is it possible somehow, for a single thread (perhaps from the ThreadPool, perhaps as a result of timer events or async events, or a thread going to sleep and being awakened/reused in some other chunk of code separately) to somehow be spawned in two different places independently of each other, and hence run into the lock re-entrance problem when the developer didn't expect it by simply reading their own code?
The Remarks section of the ThreadPool class documentation seems to suggest that sleeping threads can be reused when they're not in use, rather than being wasted by sleeping.
But the Monitor.Enter reference page says "It is legal for the same thread to invoke Enter more than once without it blocking." So I figure there must be something I'm supposed to be careful to avoid. What is it? How is it even possible for a single thread to enter the same lock region twice?
Suppose you have some lock region that takes an unfortunately long time. This might be realistic, for example, if you access some memory that has been paged out (or whatever.) The thread in the locked region might go to sleep or something. Does the same thread become eligible to run more code, which might accidentally step into the same lock region? The following does NOT, in my testing, get multiple instances of the same thread to run into the same lock region.
So how does one produce the problem? What exactly do you need to be careful to avoid?
class myClass
{
    private object myLockObject;

    public myClass()
    {
        this.myLockObject = new object();
        int[] myIntArray = new int[100]; // Just create a bunch of things so I may easily launch a bunch of Parallel things
        Array.Clear(myIntArray, 0, myIntArray.Length); // Just create a bunch of things so I may easily launch a bunch of Parallel things
        Parallel.ForEach<int>(myIntArray, i => MyParallelMethod());
    }

    private void MyParallelMethod()
    {
        lock (this.myLockObject)
        {
            Console.Error.WriteLine("ThreadId " + Thread.CurrentThread.ManagedThreadId.ToString() + " starting...");
            Thread.Sleep(100);
            Console.Error.WriteLine("ThreadId " + Thread.CurrentThread.ManagedThreadId.ToString() + " finished.");
        }
    }
}
Suppose you have a queue that contains actions:
public static Queue<Action> q = whatever;
Suppose Queue<T> has a method Dequeue that returns a bool indicating whether the queue could be successfully dequeued.
And suppose you have a loop:
static void Main()
{
    q.Add(M);
    q.Add(M);
    Action action;
    while (q.Dequeue(out action))
        action();
}

static object lockObject = new object();

static void M()
{
    Action action;
    lock (lockObject)
    {
        if (q.Dequeue(out action))
            action();
    }
}
Clearly the main thread enters the lock in M twice; this code is re-entrant. That is, it enters itself, through an indirect recursion.
Does this code look implausible to you? It should not. This is how Windows works. Every window has a message queue, and when a message queue is "pumped", methods are called corresponding to those messages. When you click a button, a message goes in the message queue; when the queue is pumped, the click handler corresponding to that message gets invoked.
It is therefore extremely common, and extremely dangerous, to write Windows programs where a lock contains a call to a method which pumps a message loop. If you got into that lock as a result of handling a message in the first place, and if the message is in the queue twice, then the code will enter itself indirectly, and that can cause all manner of craziness.
The way to eliminate this is (1) never do anything even slightly complicated inside a lock, and (2) when you are handling a message, disable the handler until the message is handled.
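As a rough WinForms illustration of point (2), assuming a button1 and its click handler (names are placeholders):
private void button1_Click(object sender, EventArgs e)
{
    button1.Enabled = false;              // disable the handler while this message is handled
    try
    {
        // Anything here that pumps messages (a modal dialog, DoEvents, etc.)
        // can no longer re-enter this handler via a queued click.
        MessageBox.Show("working...");
    }
    finally
    {
        button1.Enabled = true;           // the message is fully handled; allow clicks again
    }
}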
Re-Entrance is possible if you have a structure like so:
Object lockObject = new Object();

void Foo(bool recurse)
{
    lock (lockObject)
    {
        Console.WriteLine("In Lock");
        if (recurse) { Foo(false); }
    }
}
While this is a pretty simplistic example, it's possible in many scenarios where you have interdependent or recursive behaviour.
For example:
ComponentA.Add(): locks a common 'ComponentA' object, adds new item to ComponentB.
ComponentB.OnNewItem(): new item triggers data-validation on each item in list.
ComponentA.ValidateItem(): locks a common 'ComponentA' object to validate the item.
Same-thread re-entry on the same lock is needed to ensure you don't get deadlocks occurring with your own code.
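A compressed sketch of that chain (hypothetical classes; the synchronous event stands in for "new item triggers data-validation"):
using System;

class ComponentB
{
    public event Action<string> NewItem;
    public void Store(string item) { NewItem?.Invoke(item); }   // raised synchronously
}

class ComponentA
{
    private readonly object sync = new object();
    private readonly ComponentB b = new ComponentB();

    public ComponentA() { b.NewItem += item => ValidateItem(item); }

    public void Add(string item)
    {
        lock (sync)                  // first entry
        {
            b.Store(item);           // fires NewItem on the same thread...
        }
    }

    public void ValidateItem(string item)
    {
        lock (sync)                  // ...which lands back here: re-entry, not deadlock
        {
            Console.WriteLine("validating " + item);
        }
    }
}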
One of the more subtle ways you can recurse into a lock block is in GUI frameworks. For example, you can asynchronously invoke code on a single UI thread (a Form class)
private object locker = new Object();

public void Method(int a)
{
    lock (locker)
    {
        this.BeginInvoke((MethodInvoker)(() => Method(a)));
    }
}
Of course, this also creates an infinite loop; in practice you'd have a condition controlling whether to recurse, at which point you wouldn't have an infinite loop.
Using lock is not a good way to sleep/awaken threads. I would use an existing framework like the Task Parallel Library (TPL) to create abstract tasks (see Task) and let the underlying framework handle creating new threads and putting them to sleep when needed.
IMHO, Re-entering a lock is not something you need to take care to avoid (given many people's mental model of locking this is, at best, dangerous, see Edit below). The point of the documentation is to explain that a thread cannot block itself using Monitor.Enter. This is not always the case with all synchronization mechanisms, frameworks, and languages. Some have non-reentrant synchronization in which case you have to be careful that a thread doesn't block itself. What you do need to be careful about is always calling Monitor.Exit for every Monitor.Enter call. The lock keyword does this for you automatically.
A trivial example with re-entrance:
private object locker = new object();

public void Method()
{
    lock (locker)
    {
        lock (locker) { Console.WriteLine("Re-entered the lock."); }
    }
}
The thread has entered the lock on the same object twice so it must be released twice. Usually it is not so obvious and there are various methods calling each other that synchronize on the same object. The point is that you don't have to worry about a thread blocking itself.
That said, you should generally try to minimize the amount of time you hold a lock. Acquiring a lock is not computationally expensive, contrary to what you may hear (it is on the order of a few nanoseconds). Lock contention is what is expensive.
Edit
Please read Eric's comments below for additional details, but the summary is that when you see a lock your interpretation of it should be that "all activations of this code block are associated with a single thread", and not, as it is commonly interpreted, "all activations of this code block execute as a single atomic unit".
For example:
public static void Main()
{
    Method();
}

private static int i = 0;
private static object locker = new object();

public static void Method()
{
    lock (locker)
    {
        int j = ++i;
        if (i < 2)
        {
            Method();
        }
        if (i != j)
        {
            throw new Exception("Boom!");
        }
    }
}
Obviously, this program blows up. Without the lock, it is the same result. The danger is that the lock leads you into a false sense of security that nothing could modify state on you between initializing j and evaluating the if. The problem is that you (perhaps unintentionally) have Method recursing into itself and the lock won't stop that. As Eric points out in his answer, you might not realize the problem until one day someone queues up too many actions simultaneously.
ThreadPool threads cannot be reused elsewhere just because they went to sleep; they need to finish before they're reused. A thread that is taking a long time in a lock region does not become eligible to run more code at some other independent point of control. The only way to experience lock re-entry is by recursion or executing methods or delegates inside a lock that re-enter the lock.
Let's think about something other than recursion.
In some business logic, you may want to control the synchronization behavior yourself.
In one such pattern, Monitor.Enter is invoked in one place and Monitor.Exit is invoked somewhere else later. Here is some code to illustrate the idea:
public partial class Infinity : IEnumerable<int>
{
    IEnumerator IEnumerable.GetEnumerator()
    {
        return this.GetEnumerator();
    }

    public IEnumerator<int> GetEnumerator()
    {
        for (; ; )
            yield return ~0;
    }

    public static readonly Infinity Enumerable = new Infinity();
}

public partial class YourClass
{
    void ReleaseLock()
    {
        for (; lockCount-- > 0; Monitor.Exit(yourLockObject))
            ;
    }

    void GetLocked()
    {
        Monitor.Enter(yourLockObject);
        ++lockCount;
    }

    void YourParallelMethod(int x)
    {
        GetLocked();
        Debug.Print("lockCount={0}", lockCount);
    }

    public static void PeformTest()
    {
        new Thread(
            () => {
                var threadCurrent = Thread.CurrentThread;
                Debug.Print("ThreadId {0} starting...", threadCurrent.ManagedThreadId);
                var intanceOfYourClass = new YourClass();
                // Parallel.ForEach(Infinity.Enumerable, intanceOfYourClass.YourParallelMethod);
                foreach (var i in Enumerable.Range(0, 123))
                    intanceOfYourClass.YourParallelMethod(i);
                intanceOfYourClass.ReleaseLock();
                Monitor.Exit(intanceOfYourClass.yourLockObject); // here SynchronizationLockException is thrown
                Debug.Print("ThreadId {0} finished. ", threadCurrent.ManagedThreadId);
            }
        ).Start();
    }

    object yourLockObject = new object();
    int lockCount;
}
If you invoke YourClass.PeformTest() and see a lockCount greater than 1, you've re-entered the lock; re-entry does not have to be concurrent.
If Monitor were not re-entrant, the thread would block itself in the foreach loop.
The line Monitor.Exit(intanceOfYourClass.yourLockObject) throws a SynchronizationLockException because we are trying to invoke Exit more times than Enter was invoked. If you use the lock keyword, you would normally not encounter this situation except through direct or indirect recursive calls. I guess that's why the lock keyword was provided: it prevents Monitor.Exit from being carelessly omitted or mismatched.
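For reference, the lock keyword expands to roughly the following (modern compilers use the Monitor.Enter(Object, ref Boolean) overload; this is a simplified sketch of what the compiler emits for lock (yourLockObject) { ... }):
bool lockTaken = false;
try
{
    Monitor.Enter(yourLockObject, ref lockTaken);
    // ... body of the lock ...
}
finally
{
    if (lockTaken)
        Monitor.Exit(yourLockObject);   // Exit always runs, even if the body throws
}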
I commented out the call to Parallel.ForEach; if you are interested, you can test it for fun.
To test the code, .NET Framework 4.0 is the minimum requirement, and the following namespaces are required, too:
using System.Threading.Tasks;
using System.Diagnostics;
using System.Threading;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
Have fun.

Many short tasks on the gui

When I must do many short tasks that interact with the GUI, should I use a separate BackgroundWorker for each task, or some other solution?
EDIT
I mean updating every cell in a DataGridView (200 rows x 50 columns) every 5 seconds. Every cell stores an image.
BackgroundWorker is better suited to long-running tasks; for this you want something like ThreadPool. Here's a very crude example:
QueueUserWorkItem allows you to specify a worker method and pass in an object for that method to work on.
ThreadPool.QueueUserWorkItem(new WaitCallback(DoWork), state);
Then you would have your DoWork method where all the magic happens:
public void DoWork(object state)
{
    object result = state; // do the actual processing of the work item here
    this.Invoke(new ThreadDone(ReportProgress), result);
}
Notice the call to this.Invoke(new ThreadDone(ReportProgress), result); this safely runs code on the main thread to update your UI with the processed data from DoWork. It is done via a private delegate:
private delegate void ThreadDone(object yourObject);
Which would call:
private void ReportProgress(object yourObject)
{
    //update UI
}
You could also use this to check when all the work is complete, by keeping track of a counter with Interlocked, e.g.:
foreach (string s in strings)
{
    ThreadPool.QueueUserWorkItem(new WaitCallback(DoWork), s);
    Interlocked.Increment(ref workItems);
}
And then decrement workItems when a work item completes, checking whether it has reached zero. Checking the return value of Interlocked.Decrement avoids a race between the decrement and the check:
if (Interlocked.Decrement(ref workItems) == 0)
{
    this.Invoke(new MethodInvoker(WorkComplete)); // WorkComplete takes no arguments and updates the UI
}
Hope this helps.
When I'm dealing with many small tasks, I tend to find that the ThreadPool is a nice way of dealing with them. You can dispatch a delegate on a background thread like so:
ThreadPool.QueueUserWorkItem(new WaitCallback(MyThreadProc));
You can also pass an object to the thread as its state; consider a delegate that takes an object as a parameter. You could call this:
ThreadPool.QueueUserWorkItem(new WaitCallback(MyThreadProc), state);
Where state is some object that your thread knows how to deal with.
Edit: A ThreadPool approach should work just fine for your scenario. Just make sure that any code that changes your GUI is invoked on the GUI thread, or you'll get cross-thread exceptions.
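For the grid scenario in the question, a rough sketch of that split, with hypothetical names (RenderAllCellImages, dataGridView1): build the images on a ThreadPool thread, then marshal one batched update back to the grid.
private void RefreshGrid()
{
    ThreadPool.QueueUserWorkItem(_ =>
    {
        // Hypothetical: render all 200 x 50 cell images off the UI thread.
        Image[,] images = RenderAllCellImages();

        // One Invoke applies the whole batch on the UI thread; assumes the grid
        // already has matching rows and image columns.
        dataGridView1.Invoke(new MethodInvoker(() =>
        {
            for (int r = 0; r < images.GetLength(0); r++)
                for (int c = 0; c < images.GetLength(1); c++)
                    dataGridView1.Rows[r].Cells[c].Value = images[r, c];
        }));
    });
}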

recursively calling method (for object reuse purpose)

I have a rather large class which contains plenty of fields (10+), a huge array (100 KB) and some unmanaged resources. Let me explain by example:
class ResourceIntensiveClass
{
    private object unmanagedResource; //let it be the expensive resource
    private byte[] buffer = new byte[1024 * 100]; //let it be the huge managed memory
    private Action<ResourceIntensiveClass> OnComplete;

    private void DoWork(object state)
    {
        //do long running task
        OnComplete(this); //notify the caller that the task completed so it can reuse the same object for another task
    }

    public void Start(object dataRequiredForCurrentTask)
    {
        ThreadPool.QueueUserWorkItem(DoWork); //initiate long running work
    }
}
The problem is that the Start method never returns; around the 10000th iteration this causes a stack overflow. I could execute the OnComplete delegate on another thread, giving the Start method a chance to return, but that uses extra CPU time and resources, as you know. So what is the best option for me?
Is there a good reason for doing your calculations recursively? It seems like a simple loop would do the trick, thus obviating the need for incredibly deep stacks. This design seems especially problematic as you are relying on main() to set up your recursion.
Recursive methods can get out of hand quite fast. Have you looked into using Parallel LINQ?
You could do something like:
(your Array).AsParallel().ForAll(item => item.CallMethod());
You could also look into the Task Parallel Library (TPL);
with tasks, you can define an action and a continue-with task.
The Reactive Framework (RX), on the other hand, could handle these on-complete events in an async manner.
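For the TPL option, a small sketch of "an action and a continue-with task" (ComputeResult and OnComplete are hypothetical placeholders):
// .NET 4.0: start the work as a task, then chain a continuation that runs when it finishes.
Task.Factory.StartNew(() => ComputeResult())
    .ContinueWith(t => OnComplete(t.Result));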
Where are you changing the value of taskData so that its length can ever equal currentTaskIndex? Since the tasks you are assigning to the data are never changing, they are being carried out forever...
I would guess that the problem arises from using the pre-increment operator here:
if (c.CurrentCount < 10000)
    c.Start(++c.CurrentCount);
I am not sure of the semantics of pre-increment in C#, perhaps the value passed to a method call is not what you expect.
But since your Start(int) method assigns the value of the input to this.CurrentCount as its first step anyway, you should be safe replacing this with:
if (c.CurrentCount < 10000)
    c.Start(c.CurrentCount + 1);
There is no point in assigning to c.CurrentCount twice.
If you are using the thread pool, I assume you are protecting the counters (c.CurrentCount); otherwise concurrent increments could cause more activity than just 10000 executions.
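A minimal sketch of what protecting that counter with Interlocked could look like, keeping the question's names and assuming CurrentCount is a plain int field (Interlocked needs a field, not a property):
// Instead of:  if (c.CurrentCount < 10000) c.Start(++c.CurrentCount);
int next = Interlocked.Increment(ref c.CurrentCount);  // atomic across threads
if (next <= 10000)
{
    c.Start(next);
}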
There's a neat tool called a ManualResetEvent that could simplify life for you.
Place a ManualResetEvent in your class and add a public OnComplete event.
When you declare your class, you can wire up the OnComplete event to some spot in your code or not wire it up and ignore it.
This would give your custom class a more correct form.
When your long process is complete (I'm guessing this is in a thread), simply call the Set method of the ManualResetEvent.
As for running your long method, it should be in a thread that uses the ManualResetEvent in a way similar to below:
private void DoWork(object state)
{
    ManualResetEvent mre = new ManualResetEvent(false);
    Thread thread1 = new Thread(
        () => {
            //do long running task
            mre.Set();
        });
    thread1.IsBackground = true;
    thread1.Name = "Screen Capture";
    thread1.Start();
    mre.WaitOne();
    OnComplete(this); //notify the caller that the task completed so it can reuse the same object for another task
}
