I wonder if the following code buys any performance gains:
if (Deployment.Current.Dispatcher.CheckAccess())
{
    DoUIWork();
}
else
{
    Deployment.Current.Dispatcher.BeginInvoke(() => DoUIWork());
}
Is the Dispatcher smart enough to short-circuit a dispatch to the UI thread if it's unnecessary?
I can't say whether the dispatcher does anything expensive when dispatching from the UI thread to itself, compared with the cost of the check. But BeginInvoke from the UI thread does behave differently from executing the operation directly: the work is at least put on the queue rather than invoked immediately, so you could observe the difference between this and removing the conditional if you had code directly afterwards.
It's certainly worth being aware of that control flow, if only to know whether the difference matters.
If it is anything like the standard Windows SynchronizationContext (and it probably is) then the two options are not the same. BeginInvoke will basically queue up the method to be executed by the dispatcher message pump after any message currently being processed has finished.
In your example the two options would be the same if you were to use Invoke instead of BeginInvoke.
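To make that ordering difference concrete, here is a minimal sketch, assuming the code is already running on the UI thread (Debug comes from System.Diagnostics):

// Already on the UI thread:
Deployment.Current.Dispatcher.BeginInvoke(() => Debug.WriteLine("queued work"));
Debug.WriteLine("code after BeginInvoke");

// Output order: "code after BeginInvoke" first, then "queued work", because the
// delegate was queued behind the message currently being processed. With the
// CheckAccess() branch calling DoUIWork() directly, the work would run before
// the following line.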
Is there any solution for this scenario:
I do some time-consuming things on a new thread (using Tasks). In this thread I want to update UI elements (text) so that the user knows what is happening. This should happen in real time, not only when a Task has finished.
I always have to explicitly call Dispatcher.Invoke() (the UI thread's dispatcher) if I want to change something. Is there a way to run something asynchronously without having to invoke explicitly? I found a solution using the TaskFactory and specifying a TaskScheduler, but this locks the UI when UI calls are made.
Is there a way to do this, without locking the UI?
EDIT: Adding this because of some slight misunderstandings.
I want a statement like
uiControl.Text = "Test 123";
made on the worker thread to be automatically invoked on the UI thread when needed. In this case it is needed, but I want to get rid of all these Invoke calls. If there is a nifty solution, great; otherwise I'll have to use explicit invokes, which is not nice but OK.
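For context, the explicit pattern being asked about looks roughly like the following sketch, assuming WPF; uiControl and the surrounding worker Task are hypothetical placeholders:

Task.Factory.StartNew(() =>
{
    // long-running work on a pool thread...

    // every UI update has to be explicitly marshaled back to the UI thread
    Application.Current.Dispatcher.Invoke((Action)(() =>
    {
        uiControl.Text = "Test 123";
    }));
});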
PostSharp has this implemented as a method-level aspect. In terms of implementation, it means you decorate a method with the [DispatchedMethod] attribute and it will be dispatched to the UI thread every time it is called (by any thread). Take a look here for more details:
http://www.postsharp.net/aspects/examples/multithreading
http://www.postsharp.net/threading/thread-dispatching
http://doc.postsharp.net/##T_PostSharp_Patterns_Threading_DispatchedAttribute
You can do the operation on the main thread and just call Application.DoEvents() from time to time to process UI events.
Note that this only works for WinForms, so it is probably not a correct answer here.
HighCore suggested this as a better answer https://stackoverflow.com/a/21884638/643085
I came across a piece of C# code like this today:
lock (obj)
{
    // perform various operations
    ...

    // send a message via a queue but in the same process, Post(yourData, callback)
    messagingBus.Post(data, () =>
    {
        // perform operation
        ...

        if (condition == true)
        {
            // perform a long running, out of process operation
            operation.Perform();
        }
    });
}
My question is this: can the callback function ever be invoked in such a way as to cause the lock(obj) to not be released before operation.Perform() is called? i.e., is there a way that the callback function can be invoked on the same thread that is holding the lock, and before that thread has released the lock?
EDIT: messagingBus.Post(...) can be assumed to be an insert on to a queue, that then returns immediately. The callback is invoked on some other thread, probably from the thread pool.
For the operation.Perform() you can read it as Thread.Sleep(10000) - just something that runs for a long time and doesn't share or mutate any state.
I'm going to guess.
Post in .net generally implies that the work will be done by another thread or at another time.
So yes, it's not only possible that the lock on obj will be released before Perform is called, it's fairly likely it will happen. However, it's not guaranteed. Perform may complete before the lock is released.
That doesn't mean it's a problem. The "perform various actions" part may need the lock. messagingBus may need the lock to queue the action. The work inside may not need the lock at all, in which case the code is thread safe.
This is all a guess because there's no notion of what work is being done, why it must be inside a lock, or what Post or Perform does. So the code may be perfectly safe, or it may be horribly flawed.
Without knowing what messagingBus.Post is doing, you can't tell. If Post invokes the delegate it is given (the lambda expression in your example) then the lock will be in place while that lambda executes. If Post schedules that delegate for execution at a later time, then the lock will not be in place while the lambda executes. It's also not clear what the lock(obj) is for: to lock calls to messagingBus.Post, or something else. Detailing the type (including full namespace) of the messagingBus variable would go a long way towards providing better details.
If the callback executes asynchronously, then yes, the lock may still be held when Perform() runs, unless Post() does something specific to avoid that case (which would be unusual).
If the callback were scheduled on the same thread as the call to Post() (e.g. in the extreme example where the thread pool has only one thread), a typical thread pool implementation would not execute the callback until the thread finishes its current task, which in this case would require releasing the lock before executing Perform().
It's impossible to answer your question without knowing how messagingBus.Post is implemented. Async APIs typically provide no guarantee that the callback will be executed truly asynchronously. For example, .NET APM methods such as FileStream.BeginRead may decide to perform the operation synchronously, in which case the callback will be executed on the same thread that called BeginRead. The returned IAsyncResult.CompletedSynchronously will be set to true in this case.
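To illustrate, here is a small sketch of the APM pattern; the file path is hypothetical, and whether the flag ends up true depends on the stream and the runtime:

byte[] buffer = new byte[1024];
using (var stream = new FileStream("data.bin", FileMode.Open, FileAccess.Read))
{
    IAsyncResult ar = stream.BeginRead(buffer, 0, buffer.Length,
        result => Console.WriteLine("callback ran"),   // may run on the calling thread
        null);

    Console.WriteLine("CompletedSynchronously = " + ar.CompletedSynchronously);

    int bytesRead = stream.EndRead(ar);   // blocks until the read has finished
    Console.WriteLine("Read " + bytesRead + " bytes");
}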
What is the most efficient way to create a “cancel” event in a C# program that is crunching a large set of data in a loop on a separate thread?
For now, I am simply using a cancel event that is triggered from my UI thread, which subsequently calls an “onCancel” function on the number crunching thread. That cancel function sets a variable to “true”, which the crunch loop checks periodically, e.g.
class Cruncher
{
    private bool cancel = false;

    public Cruncher()
    {
        crunch();
    }

    private void crunch()
    {
        while (conditions && !cancel) { /* crunch */ }
        // dispose resources
    }

    private void onCancel()
    {
        cancel = true;
    }
}
While I am not checking the cancel variable as often as in the example above (and not literally performing a NOT on cancel), I would still like to optimize this crunch method as much as possible. Any examples where this is done more efficiently would be very nice to see.
The cancel event/flag should be volatile... I asked a very similar question to yours: Is it safe to use a boolean flag to stop a thread from running in C#
I would also recommend that when you cancel your threads you wait for all of them to cancel by using something similar to the C# version of CountDownLatch. It's useful when you want to guarantee that the thread is canceled.
It will ultimately always result in something like this - although it's important that you make your cancel variable volatile, as otherwise the worker threads may not see the change from the cancelling thread.
You've got to check something periodically unless you want to go the more drastic route of interrupting the thread (which I don't recommend). Checking a single Boolean flag isn't likely to be exactly costly... if you can do a reasonable chunk of work in each iteration of the loop (enough to dwarf the cost of the check) then that's fine.
If you ever need to perform any waiting, however (in the worker thread), then you may be able to improve matters, by using a form of waiting (e.g. Monitor.Wait) which allows the cancelling thread to wake any waiting threads up early. That won't make normal operation more efficient, but it will allow the threads to terminate more quickly in the event of cancellation.
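A rough sketch of that waiting pattern, with conditions and DoChunk() as hypothetical placeholders:

private readonly object gate = new object();
private bool cancel;

private void crunch()
{
    lock (gate)
    {
        while (conditions && !cancel)
        {
            DoChunk();
            // Wait a while between chunks; Monitor.Wait releases the lock while
            // waiting, so the cancelling thread can get in and pulse us early.
            Monitor.Wait(gate, TimeSpan.FromSeconds(1));
        }
    }
}

private void onCancel()
{
    lock (gate)
    {
        cancel = true;
        Monitor.PulseAll(gate);   // wake the worker instead of waiting out the timeout
    }
}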
Especially since it's UI-triggered, I would recommend leveraging the BackgroundWorker that's already in the framework, since it raises the progress and completion events on the UI thread for you (so you don't have to marshal the calls over yourself).
Then you can just use the CancelAsync() call. Admittedly, it's not much different from what you're already doing, it's just done in the framework already (and includes the thread synchronization logic).
As Jon mentioned, you're still going to want cooperative cancellation (checking CancellationPending in your DoWork handler when using BackgroundWorker), since interrupting or aborting the thread is something you want to avoid if possible.
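A minimal BackgroundWorker sketch along those lines (conditions and percentDone are hypothetical placeholders; the progress and completion events are raised on the UI thread when the worker is started from it):

var worker = new BackgroundWorker
{
    WorkerSupportsCancellation = true,
    WorkerReportsProgress = true
};

worker.DoWork += (s, e) =>
{
    while (conditions)
    {
        if (worker.CancellationPending) { e.Cancel = true; return; }
        // crunch...
        worker.ReportProgress(percentDone);
    }
};

worker.ProgressChanged += (s, e) => { /* update progress UI */ };
worker.RunWorkerCompleted += (s, e) => { /* done or cancelled */ };

worker.RunWorkerAsync();
// later, from the UI:
// worker.CancelAsync();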
If you're on .NET 4 you can use the TPL and the new cancellation support, but again it's focused on cooperative cancellation.
I recommend using the unified cancellation model that was introduced in .NET 4.0 (if .NET 4.0 is an option).
It is very efficient, and allows integrated cancellation with Task objects and Parallel LINQ.
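For reference, a minimal sketch of that model (conditions stands in for the real loop test):

var cts = new CancellationTokenSource();
CancellationToken token = cts.Token;

Task crunchTask = Task.Factory.StartNew(() =>
{
    while (conditions && !token.IsCancellationRequested)
    {
        // crunch...
    }
    // dispose resources
}, token);

// From the UI thread, when the user cancels:
cts.Cancel();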
I would do it the same way. I would also add a Thread.Sleep into the loop to yield control to the main thread.
http://msdn.microsoft.com/en-us/library/7a2f3ay4%28VS.80%29.aspx
My problem is this:
I have two threads: my UI thread and a worker thread. My worker thread runs in a separate class that gets instantiated by the form, which passes itself as an ISynchronizeInvoke to the worker class; the worker then uses Invoke on that interface to raise its events, which provide status updates to the UI for display. This works wonderfully.
I noticed that my background thread seemed to be running slowly, though, so I changed the call from Invoke to BeginInvoke, thinking "I'm just providing progress updates, it doesn't need to be exactly synchronous, no harm done". Except that now I'm getting oddities with the progress updates: my progress bar updates, but the label's text doesn't, and if I switch to another window and try to switch back, it acts like the UI thread is locked up. So I'm wondering whether my progress calls (which happen very often) are overloading the UI thread so much that it never processes messages. Is this possible at all, or is there something else at work here?
You're definitely overloading the UI thread.
In your first sample, you were (behind the scenes) sending a message to the UI thread, waiting for it to be processed (that's the purpose of Invoke, which ultimately relies on SendMessage), and then sending another one. In the meantime, other messages were probably enqueued (WM_PAINT messages, for example) and processed.
In your second sample, by using BeginInvoke (which ultimately relies on PostMessage), you enqueued a massive number of messages that the message pump must handle sequentially. And of course, while it's handling those thousands of messages, it cannot handle the OS messages (WM_PAINT, etc.), which makes your UI look "frozen".
You're probably providing too many status updates; try to lower the feedback level (see the sketch below).
If you want to understand better how messages work in windows, this is the place to start.
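One way to lower the feedback level, as a sketch (total, progressBar and the form's BeginInvoke are assumptions from a WinForms context):

int lastReported = -1;
for (int i = 0; i < total; i++)
{
    // ... do the actual work for item i ...

    int percent = i * 100 / total;
    if (percent != lastReported)   // post at most ~100 updates instead of one per item
    {
        lastReported = percent;
        BeginInvoke((MethodInvoker)(() => progressBar.Value = percent));
    }
}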
A few thoughts:
try batching your updates; for example, there is no point updating for every iteration in a loop; depending on the speed, perhaps every 50 / 500. In the case of lists, you would buffer in a local list variable, take the list over via Invoke / BeginInvoke, and process the buffer on the UI thread
variable capture; if you are using BeginInvoke and anonymous methods, you could have problems... I'll add an example below
making the UI update efficient - especially if you are processing a list; some controls (especially list-based controls) have a pair of methods like BeginEdit / EndEdit, that stop the UI redrawing when you are making lots of updates; instead, it waits until the End* is called
capture problem... imagine (worker):
List<string> stuff = new List<string>();
for(int i = 0 ; i < 50000 ; i++) {
stuff.Add(i.ToString());
if((i % 100) == 0) {
// update UI
BeginInvoke((MethodInvoker) delegate {
foreach(string s in stuff) {
listBox.Items.Add(s);
}
});
}
}
Did you notice that at some point both threads are talking to stuff? The UI thread can be iterating it while the worker thread (which has kept running past BeginInvoke) keeps adding. This can cause issues. Not usually performance issues (unless you are catching the exceptions and taking a long time to log them), but definitely issues. Options here would include:
using Invoke to run the update synchronously
create a new buffer per update, so that the two threads never have the same list instance (you'd need to look very carefully at the variable scoping to make sure, though); see the sketch below
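The second option might look like this as a sketch (same hypothetical listBox, WinForms assumed):

List<string> stuff = new List<string>();
for (int i = 0; i < 50000; i++)
{
    stuff.Add(i.ToString());
    if ((i % 100) == 0)
    {
        List<string> batch = stuff;    // hand the current buffer to the UI thread
        stuff = new List<string>();    // the worker carries on with a fresh list
        BeginInvoke((MethodInvoker)delegate
        {
            foreach (string s in batch)
            {
                listBox.Items.Add(s);
            }
        });
    }
}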
Say I have Thread A, which is the main application thread, and a secondary Thread B. How can I check whether a function is being called from within Thread B?
Basically I am trying to implement the following code snippet:
public void ensureRunningOnCorrectThread()
{
    if( function is being called within ThreadB )
    {
        performIO();
    }
    else
    {
        // call performIO so that it is called (invoked?) on ThreadB
    }
}
Is there a way to perform this functionality within C# or is there a better way of looking at the problem?
EDIT 1
I have noticed the following within the MSDN documentation, although I'm a bit dubious as to whether or not it's a good thing to be doing:
// if function is being called within ThreadB
if( System.Threading.Thread.CurrentThread.Equals(ThreadB) )
{
}
EDIT 2
I realise that I'm looking at this problem in the wrong way (thanks to the answers below which helped me see this). All I care about is that the IO does not happen on ThreadA. This means that it could happen on ThreadB or indeed any other thread, e.g. a BackgroundWorker. I have decided that creating a new BackgroundWorker within the else portion of the above if statement ensures that the IO is performed in a non-blocking fashion (see the sketch below). I'm not entirely sure that this is the best solution to my problem, however it appears to work!
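Concretely, the approach this edit describes might look something like the following sketch; it reuses the Equals check from EDIT 1 and is not necessarily the best design, but it keeps the IO off ThreadA:

public void ensureRunningOnCorrectThread()
{
    if (System.Threading.Thread.CurrentThread.Equals(ThreadB))
    {
        performIO();
    }
    else
    {
        var worker = new System.ComponentModel.BackgroundWorker();
        worker.DoWork += (s, e) => performIO();   // runs on a thread-pool thread, not ThreadA
        worker.RunWorkerAsync();
    }
}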
Here's one way to do it:
if (System.Threading.Thread.CurrentThread.ManagedThreadId == ThreadB.ManagedThreadId)
...
I don't know enough about .NET's Thread class implementation to know whether the comparison above is equivalent to Equals() or not, but in the absence of this knowledge, comparing the IDs is a safe bet.
There may be a better (where better = easier, faster, etc.) way to accomplish what you're trying to do, depending on a few things like:
what kind of app (ASP.NET, WinForms, console, etc.) are you building?
why do you want to enforce I/O on only one thread?
what kind of I/O is this? (e.g. writes to one file? network I/O constrained to one socket? etc.)
what are your performance constraints relative to cost of locking, number of concurrent worker threads, etc?
whether the "else" clause in your code needs to be blocking, fire-and-forget, or something more sophisticated
how you want to deal with timeouts, deadlocks, etc.
Adding this info to your question would be helpful, although if yours is a WinForms app and you're talking about user-facing GUI I/O, you can skip the other questions since the scenario is obvious.
Keep in mind that // call performIO so that it is called (invoked?) on ThreadB implementation will vary depending on whether this is WinForms, ASP.NET, console, etc.
If WinForms, check out this CodeProject post for a cool way to handle it. Also see MSDN for how this is usually handled using InvokeRequired.
If it's a console or generalized server app (no GUI), you'll need to figure out how to let the main thread know that it has work waiting, and you may want to consider an alternate implementation with an I/O worker thread or thread pool that just sits around executing I/O requests you queue to it. Or you might want to consider synchronizing your I/O requests (easier) instead of marshalling calls over to one thread (harder).
If ASP.NET, you're probably approaching this in the wrong way. It's usually more effective to use ASP.NET async pages and/or (per above) to synchronize your I/O using lock{} or another synchronization method.
What you are trying to do is the opposite of what the InvokeRequired property of a windows form control does, so if it's a window form application, you could just use the property of your main form:
if (InvokeRequired) {
// running in a separate thread
} else {
// running in the main thread, so needs to send the task to the worker thread
}
The else part of your snippet (invoking performIO on ThreadB) is only going to work when ThreadB is the main thread running a message loop.
So maybe you should rethink what you are doing here; it is not a normal construction.
Does your secondary thread do anything else besides the performIO() function? If not, then an easy way to do this is to use a System.Threading.ManualResetEvent. Have the secondary thread sit in a while loop waiting for the event to be set. When the event is signaled, the secondary thread can perform the I/O processing. To signal the event, have the main thread call the Set() method of the event object.
using System.Threading;
static void Main(string[] args)
{
ManualResetEvent processEvent = new ManualResetEvent(false);
Thread thread = new Thread(delegate() {
while (processEvent.WaitOne()) {
performIO();
processEvent.Reset(); // reset for next pass...
}
});
thread.Name = "I/O Processing Thread"; // name the thread
thread.Start();
// Do GUI stuff...
// When it's time to perform the I/O processing, signal the event.
processEvent.Set();
}
Also, as an aside, get into the habit of naming any System.Threading.Thread objects as they are created. When you create the secondary thread, set the thread name via the Name property. This will help you when looking at the Threads window in Debug sessions, and it also allows you to print the thread name to the console or the Output window if the thread identity is ever in doubt.