Why does using Task.ContinueWith hurt my program's responsiveness? - c#

We have a video player written in WPF with a scroll bar. When the scroll bar is dragged left and right, CurrentFrameTime is updated and triggers UpdateFrames, which in turn grabs the frame and shows it. That works fine.
But sometimes grabbing the frame can take time (because of the disk, for example), so even though the CurrentFrameTime value may already have changed, UpdateFrames can be "stuck" still waiting for the previous frame in ...GetAsync().Result.
What I decided to do was move Dispatcher.BeginInvoke into a ContinueWith block. Now, each time CurrentFrameTime changes, the previous operation is canceled (we don't need to show the frame if the frame time has already changed) and an up-to-date frame should be shown. But for some reason this change made the application slower. When I drag the scroll bar it can take a few seconds before the image is updated.
What could have happened that moving the code into ContinueWith slowed down the video player?
MainApplication without ContinueWith
_threadUpdateUI = new Thread(new ThreadStart(UpdateFrames));

public long CurrentFrameTime
{
    get { ... }
    set
    {
        ...
        _fetchFrame.Set();
    }
}

void UpdateFrames()
{
    while (run)
    {
        _fetchFrame.WaitOne();
        var frame = Cache.Default.GetAsync(CurrentFrameTime).Result;
        Dispatcher.BeginInvoke(new Action(() => ShowFrame(frame.Time, frame.Image)));
    }
}
Cache
public Task<VideoFrame> GetAsync(long frameTime)
{
    // this is used when the cache is disabled
    if (GrabSynchronously)
    {
        var tcs = new TaskCompletionSource<VideoFrame>();
        // reading from file
        var frame2 = FrameProvider.Instance.GetFrame(frameTime);
        tcs.SetResult(frame2);
        return tcs.Task;
    }
    ...
}
MainApplication WITH ContinueWith
void ShowFrames()
{
    while (run)
    {
        _fetchFrame.WaitOne();
        _previousFrameCancellationToken.Cancel();
        _previousFrameCancellationToken = new CancellationTokenSource();
        Cache.Default.GetAsync(CurrentFrameTime).ContinueWith(task =>
        {
            var frameTime = task.Result.Time;
            var frameImage = task.Result.Image;
            Dispatcher.BeginInvoke(new Action(() => ShowFrame(frameTime, frameImage)));
        }, _previousFrameCancellationToken.Token);
    }
}

In your old way, your UpdateFrames loop would block on every .Result call. This made your loop self-metering, allowing only one request "in flight" at a time, even if _fetchFrame had .Set() called on it many times while you were waiting for .Result to finish.
In your new way, every call to _fetchFrame.Set() triggers another task to start up and be "in flight" (assuming GrabSynchronously is false), even if its result is never used. This floods your system with requests and is what causes your slowdown.
One possible solution is to add another semaphore of some kind to limit the number of concurrent frame requests you allow.
Semaphore _frameIsProcessing = new Semaphore(5, 5); // Allows up to 5 frames to be requested at once before it starts blocking requests.

private void ShowFrames()
{
    while (run)
    {
        _fetchFrame.WaitOne();
        _previousFrameCancellationToken.Cancel();
        _previousFrameCancellationToken = new CancellationTokenSource();
        // Capture this request's token so the continuation checks the right one,
        // even if the field has already been replaced by a newer request.
        var token = _previousFrameCancellationToken.Token;
        _frameIsProcessing.WaitOne();
        Cache.Default.GetAsync(CurrentFrameTime).ContinueWith(task =>
        {
            _frameIsProcessing.Release();
            if (token.IsCancellationRequested)
                return;
            var frameTime = task.Result.Time;
            var frameImage = task.Result.Image;
            Dispatcher.BeginInvoke(new Action(() => ShowFrame(frameTime, frameImage)));
        });
    }
}

Related

How to keep cancelling the task until a condition is met (TaskCanceledException)

I want to call a method after some delay when an event is raised, but any subsequent events should "restart" this delay. Quick example to illustrate, the view should be updated when scrollbar position changes, but only 1 second after the user has finished scrolling.
Now I can see many ways of implementing that, but the most intuitive would be to use Task.Delay + ContinueWith + a cancellation token. However, I am experiencing some issues; more precisely, subsequent calls to my function cause a TaskCanceledException, and I started to wonder how I could get rid of that. Here is my code:
private CancellationTokenSource? _cts;

private async void Update()
{
    _cts?.Cancel();
    _cts = new();
    await Task.Delay(TimeSpan.FromSeconds(1), _cts.Token)
        .ContinueWith(o => Debug.WriteLine("Update now!"),
            TaskContinuationOptions.OnlyOnRanToCompletion);
}
I have found a workaround that works pretty nicely, but I would like to make the first idea work.
private CancellationTokenSource? _cts;
private CancellationTokenRegistration? _cancellationTokenRegistration;

private void Update()
{
    _cancellationTokenRegistration?.Unregister();
    _cts = new();
    _cancellationTokenRegistration = _cts.Token.Register(() => Debug.WriteLine("Update now!"));
    _cts.CancelAfter(1000);
}
You should consider using Microsoft's Reactive Framework (aka Rx) - NuGet System.Reactive and add using System.Reactive.Linq;.
You didn't say what UI you're using, so for Windows Forms also add System.Reactive.Windows.Forms, and for WPF add System.Reactive.Windows.Threading.
Then you can do this:
Panel panel = new Panel(); // assuming this is a scrollable control
IObservable<EventPattern<ScrollEventArgs>> query =
    Observable
        .FromEventPattern<ScrollEventHandler, ScrollEventArgs>(
            h => panel.Scroll += h,
            h => panel.Scroll -= h)
        .Select(sea => Observable.Timer(TimeSpan.FromSeconds(1.0)).Select(_ => sea))
        .Switch();
IDisposable subscription = query.Subscribe(sea => Console.WriteLine("Hello"));
The query fires for every Scroll event and starts a one-second timer. The Switch operator watches each Timer produced and only connects to the latest one, thus ignoring the earlier Scroll events.
And that's it.
After a one-second pause in scrolling, the word "Hello" is written to the console. If you begin scrolling again, then after every further one-second pause it fires again.
In my own experience I've dealt with lots of scenarios just like the one you describe, e.g. update something one second after the mouse stops moving etc.
For a long time I would do timer restarts just the way you describe, by cancelling an old task and starting a new one. But I never really liked how messy that was, so I came up with an alternative that I use in production code. Long-term it has proven quite reliable. It takes advantage of the captured context associated with a task. Multiple instances of TaskCanceledException no longer occur.
class WatchDogTimer
{
    int _wdtCount = 0;
    public TimeSpan Interval { get; set; } = TimeSpan.FromSeconds(1);

    public void Restart(Action onRanToCompletion)
    {
        _wdtCount++;
        var capturedCount = _wdtCount;
        Task
            .Delay(Interval)
            .GetAwaiter()
            .OnCompleted(() =>
            {
                // If the captured count has not changed after awaiting the Interval,
                // it indicates that no new 'bones' have been thrown during that interval.
                if (capturedCount.Equals(_wdtCount))
                {
                    onRanToCompletion();
                }
            });
    }
}
Another nice perk is that it doesn't rely on platform timers and works just as well in iOS/Android as it does in WinForms/WPF.
For purposes of demonstration, this can be exercised in a quick console demo where the MockUpdateView() action is sent to the WDT 10 times at 500 ms intervals. It will only execute one time, 500 ms after the last restart is received.
static void Main(string[] args)
{
    Console.Title = "Test WDT";
    var wdt = new WatchDogTimer { Interval = TimeSpan.FromMilliseconds(500) };
    Console.WriteLine(DateTime.Now.ToLongTimeString());
    // "Update view 500 ms after the last restart."
    for (int i = 0; i < 10; i++)
    {
        wdt.Restart(onRanToCompletion: () => MockUpdateView());
        Thread.Sleep(TimeSpan.FromMilliseconds(500));
    }
    Console.ReadKey();
}

static void MockUpdateView()
{
    Console.WriteLine($"Update now! WDT expired {DateTime.Now.ToLongTimeString()}");
}
So, with 10 restarts at 500 ms each, this verifies a single event about 5 seconds after the start.
You can combine a state variable and a delay to avoid messing with timers or task cancellation. This is far simpler IMO.
Add this state variable to your class/form:
private DateTime _nextRefresh = DateTime.MaxValue;
And here's how you refresh:
private async void Update()
{
    await RefreshInOneSecond();
}

private async Task RefreshInOneSecond()
{
    _nextRefresh = DateTime.Now.AddSeconds(1);
    await Task.Delay(1000);
    if (_nextRefresh <= DateTime.Now)
    {
        _nextRefresh = DateTime.MaxValue;
        Refresh();
    }
}
If you call RefreshInOneSecond repeatedly, it keeps pushing the _nextRefresh timestamp further out, so any refreshes already in flight will do nothing.
Demo on DotNetFiddle
One approach is to create a timer and reset it whenever the user does something, for example using System.Timers.Timer:
timer = new Timer(1000);
timer.SynchronizingObject = myControl; // Needs a winforms object for synchronization
timer.Elapsed += OnElapsed;
timer.Start(); // Don't forget to stop the timer whenever you are done
...
private void OnUserUpdate()
{
    timer.Interval = 1000; // Setting the interval will reset the timer
}
There are multiple timers to choose from, and I believe the same pattern is possible with the other timers. DispatcherTimer might be most suitable if you use WPF.
Note that both System.Timers.Timer and Task.Delay use System.Threading.Timer in the background. It is possible to use this directly; just call the .Change method to reset it. But be aware that it raises the event on a thread-pool thread, so you need to provide your own synchronization.
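For illustration, here is a minimal sketch of that last approach with System.Threading.Timer, assuming a Windows Forms form; MyForm, _debounce, OnUserUpdate and the BeginInvoke target are placeholders for your own members:
// Debounce with System.Threading.Timer: Change() resets the due time on every call.
// Requires using System.Threading; and using System.Diagnostics;
private readonly System.Threading.Timer _debounce;

public MyForm()
{
    InitializeComponent();
    // Created "stopped": it fires only after Change() gives it a due time.
    _debounce = new System.Threading.Timer(_ => OnDebounceElapsed(),
        null, Timeout.Infinite, Timeout.Infinite);
}

private void OnUserUpdate()
{
    // Each call pushes the single shot out to one second from now.
    _debounce.Change(1000, Timeout.Infinite);
}

private void OnDebounceElapsed()
{
    // The callback runs on a thread-pool thread, so marshal back to the UI yourself.
    BeginInvoke(new Action(() => Debug.WriteLine("Update now!")));
}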
I implemented the same scenario in a JavaScript application using a timer, and I believe it's the same in the .NET world. In any case, handling this use case with Task.Delay() when the user calls the method repeatedly will put more pressure on the GC and the thread pool.
var timer = new Timer()
{
    Enabled = true,
    Interval = TimeSpan.FromSeconds(5).TotalMilliseconds,
};
timer.Elapsed += (sender, eventArgs) =>
{
    timer.Stop();
    // do stuff
};

void OnKeyUp()
{
    timer.Stop();
    timer.Start();
}

How to improve performance on wpf GUI main thread that is done in two windows

I have a WPF application. It has a mainWindow that creates an _otherWindow displayed on a secondary monitor. Both windows have elements that need to change over time (mainWindow updates an Image and also plots a graph; _otherWindow updates a shape position depending on some computations).
What is my problem? Well, I am reading a video frame by frame within a Thread (though I would like to allow a stream taken from a camera as well). And as I update the GUI for every frame within a certain time, the application takes a heavy load and gets slow...
I realized that commenting out either the mainWindow image update or the _otherWindow shape-position update makes the application run nicely, but the issue is when they run together.
Here is a detailed description:
First I compute some things inside _otherWindow and compute the position of a shape.
Then I compute some stuff related to the image and update the frame, adding some things to the bitmap.
Then I update the position of the shape inside _otherWindow.
Finally I plot the results (the plot needs data from both mainWindow and _otherWindow).
For this, I use tasks and wait for them.
I have this:
private Thread _camera;

private void CaptureVideo()
{
    _camera = new Thread(CaptureVideoCallback)
    {
        Priority = ThreadPriority.Highest
    };
    _camera.Start();
}

private VideoCapture _capture;

private void CaptureVideoCallback()
{
    //some computing here, read from a video file...
    _capture = new VideoCapture("someVideo.mp4");
    for (var i = 0; i < _capture.FrameCount; i++)
    {
        _capture.Read(_frame);
        if (_frame.Empty()) return;

        //*************task that does heavy computation in other class
        var heavyTaskOutput1 = Task.Factory.StartNew(() =>
        {
            _otherWindow.Dispatcher.Invoke(() =>
            {
                ResultFromHeavyComputationMethod1 = _otherWindow.HeavyComputationMethod1();
            });
        });

        ////*************task that does heavy computation in current class
        var heavyTaskOutput2 = Task.Factory.StartNew(() =>
        {
            ResultFromHeavyComputationMethod2 = HeavyComputationMethod2(ref _frame);
            var bitmap = getBitmapFromHeavyComputationMethod2();
            bitmap.Freeze();
            //update GUI in main thread
            Dispatcher.CurrentDispatcher.Invoke(() => ImageSource = bitmap);
        });

        ////*************wait for both tasks to complete
        Task.WaitAll(heavyTaskOutput1, heavyTaskOutput2);

        //update _otherWindow GUI
        var outputGui = Task.Factory.StartNew(() =>
        {
            _otherWindow.Dispatcher.Invoke(() =>
            {
                _otherWindow.UpdateGui();
            });
        });
        outputGui.Wait();

        ////*************plot in a chart using the results, UPDATE GUI
        Task.Run(() =>
        {
            PlotHorizontal();
        });
    }
}
What would be a good way to speed this up?
I mean, I know that GUI work needs to be done on the main thread, but this is slowing things down.
Edit
Have changed code as Clemens suggested:
//*************task that does heavy computation in other class
var heavyTaskOutput1 = Task.Run(() =>
{
    ResultFromHeavyComputationMethod1 = _otherWindow.HeavyComputationMethod1();
});

////*************task that does heavy computation in current class
var heavyTaskOutput2 = Task.Run(() =>
{
    ResultFromHeavyComputationMethod2 = HeavyComputationMethod2(ref _frame);
    var bitmap = getBitmapFromHeavyComputationMethod2();
    bitmap.Freeze();
    //update GUI in main thread
    Dispatcher.CurrentDispatcher.Invoke(() => ImageSource = bitmap);
});

////*************wait for both tasks to complete
Task.WaitAll(heavyTaskOutput1, heavyTaskOutput2);

//update _otherWindow GUI
var outputGui = Task.Run(() =>
{
    _otherWindow.Dispatcher.Invoke(() =>
    {
        _otherWindow.UpdateGui();
    });
});
outputGui.Wait();
It's a bit hard to guess. Do you have Visual Studio? I think even the Community edition has some profiling capabilities (menu: Analyze/Performance Profiler...). That may point out some non-obvious bottlenecks.
My thoughts:
getBitmapFromHeavyComputationMethod2 appears to return a new bitmap every time through. I can't infer the actual type it's returning, but it likely involves a semi-large un-managed memory allocation and implements IDisposable. You might check on whether you're disposing that appropriately.
Rather than create a new bitmap for every frame, can you use a WriteableBitmap? Be sure to lock and unlock it if you do. Perhaps ping-pong (alternate) between two bitmaps if you need to.
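For what it's worth, here is a minimal sketch of the WriteableBitmap idea; width, height, the Bgr24 format and the byte[] frame layout are assumptions, so adjust them to whatever getBitmapFromHeavyComputationMethod2 actually produces:
// Requires using System.Windows; System.Windows.Media; System.Windows.Media.Imaging;
// and System.Runtime.InteropServices. Create the bitmap once, on the UI thread,
// and bind it to the Image control's Source.
WriteableBitmap _bitmap = new WriteableBitmap(width, height, 96, 96, PixelFormats.Bgr24, null);

// Call on the UI thread (e.g. inside the Dispatcher.Invoke) with the frame's raw bytes,
// which must match the bitmap's size and stride.
void WriteFrame(byte[] pixels)
{
    _bitmap.Lock();
    try
    {
        // Copy the new pixels into the existing back buffer instead of allocating a new bitmap.
        Marshal.Copy(pixels, 0, _bitmap.BackBuffer, pixels.Length);
        _bitmap.AddDirtyRect(new Int32Rect(0, 0, _bitmap.PixelWidth, _bitmap.PixelHeight));
    }
    finally
    {
        _bitmap.Unlock();
    }
}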
It appears you may be serializing your "heavy computation" with your I/O read (first one, then the other). Perhaps launch the read as a task as well, and wait on it in your WaitAll so that the computation and I/O can happen concurrently. Something of this shape:
var readResult = Task.Run(() => _capture.Read(_frame));
for (...)
{
    // check read result
    // ...
    // launch heavy computation (heavyTaskOutput1, heavyTaskOutput2) as before
    readResult = Task.Run(() => _capture.Read(nextFrame));
    Task.WaitAll(heavyTaskOutput1, heavyTaskOutput2, readResult);
    _frame = nextFrame;
}
Note this would read N+1 times for N frames--maybe your Read method is okay with that.

C# Thread Tasks Not running Sequentially

I have an application which plays a recording stream; it has a COM interface to provide functionality to start the replay stream along with other control variables. I want to accomplish the following task (test case) using a button click event (Visual Studio desktop Forms application):
Play recording from 0 seconds (0 min) for 10 seconds
Then play recording from 60 seconds (1 min) for 15 seconds
Finally play recording from 120 seconds (2 min) for 20 seconds (stream length is approx. 3 min)
I am creating a thread for each of tasks 1-3 above, and then I am using the Join method to wait for each task to complete before starting the next thread.
My code:
private void btnReplayForward_Click(object sender, EventArgs e)
{
    Thread replayAtThread = new Thread(() => mpd.startReplayAt3(0, 10));
    replayAtThread.Start();
    replayAtThread.Join();

    Thread replayAtThread1 = new Thread(() => mpd.startReplayAt3(60, 15));
    replayAtThread1.Start();
    replayAtThread1.Join();

    Thread replayAtThread2 = new Thread(() => mpd.startReplayAt3(120, 20));
    replayAtThread2.Start();
    replayAtThread2.Join();

    replayAtThread.Abort();
    replayAtThread1.Abort();
    replayAtThread2.Abort();
}
mpd is just an instance of a class that uses the COM interface; in a nutshell, this is what the startReplayAt3 method looks like:
public bool startReplayAt3(double start, double duration)
{
    //Set start and duration
    //Start replay at start and duration values
    bool isReplay = //get replay value, true if replay running, false if stopped
    while (isReplay)
        isReplay = //get replay value, true if replay running, false if stopped
    return isReplay;
}
However, on the actual interface I am not seeing the desired results: it performs task 1, then does something with task 2 but skips the replay part entirely, then performs task 3. The next time I click the button, it skips task 1, goes straight to task 2 and ends without performing task 3. So as you can see it is very unpredictable, strange behavior.
What could be the cause? I am making sure that my threads are in sync, but it doesn't seem to be reflected in the UI of the application I am running.
I'm not entirely sure about the issue you're seeing with the threads :\. The implementation seems a little off; the .Abort() calls are redundant, since each thread will have completed by the time you call .Abort().
Why not use asynchronous functionality?
await Task.Run(() => mpd.startReplayAt3(0, 10));
await Task.Run(() => mpd.startReplayAt3(60, 15));
await Task.Run(() => mpd.startReplayAt3(120, 20));
or:
var replayTasks = new List<Task>();
replayTasks.Add(Task.Run(() => mpd.startReplayAt3(0, 10)));
replayTasks.Add(Task.Run(() => mpd.startReplayAt3(60, 15)));
replayTasks.Add(Task.Run(() => mpd.startReplayAt3(120, 20)));
Task.WaitAll(replayTasks.ToArray());
or:
var replayTasks = new Task[] {
    Task.Run(() => mpd.startReplayAt3(0, 10)),
    Task.Run(() => mpd.startReplayAt3(60, 15)),
    Task.Run(() => mpd.startReplayAt3(120, 20))
};
Task.WaitAll(replayTasks);
Like yourself I used to create many threads for executing tasks simultaneously, but I have fallen in love with async/await.
Make sure to change your _Click method to an asynchronous method using the async keyword.
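For example, the first option wrapped in an async click handler could look like this (mpd and startReplayAt3 are the names from your question):
private async void btnReplayForward_Click(object sender, EventArgs e)
{
    // Each replay runs on a thread-pool thread; await keeps them sequential
    // without blocking the UI thread.
    await Task.Run(() => mpd.startReplayAt3(0, 10));
    await Task.Run(() => mpd.startReplayAt3(60, 15));
    await Task.Run(() => mpd.startReplayAt3(120, 20));
}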
From your description of the program's behavior, it seems to me that startReplayAt3 in fact only starts the replay.
What you want to do is start the replay, then wait until it finishes, and then start the next replay.
If the COM object gives you an event that is raised when the replay completes, then you're good; you should wrap that event into a task, and use await to consume it, something like this:
public static Task ReplayAsync(this MPD mpd, int start, int duration)
{
    var tcs = new TaskCompletionSource<bool>();
    ReplayFinishedHandler handler = null;
    handler = (s, e) => { tcs.TrySetResult(true); mpd.ReplayFinished -= handler; };
    mpd.ReplayFinished += handler;
    mpd.startReplayAt3(start, duration);
    return tcs.Task;
}
used as such:
private async void btnReplayForward_Click(object sender, EventArgs e)
{
    await mpd.ReplayAsync(0, 10);
    await mpd.ReplayAsync(60, 15);
    await mpd.ReplayAsync(120, 20);
}
But some COM objects don't let you know when they're done (notably, a lot of media players are really bad at this). In that case, you'll either have to poll some kind of IsPlaying flag on the COM object, or just use a timeout like this:
public static Task ReplayAsync(this MPD mpd, int start, int duration)
{
    mpd.startReplayAt3(start, duration);
    return Task.Delay(TimeSpan.FromSeconds(duration));
}

Forcing a Task to Wait Before Updating the UI in Continuation

All, I want to update a ToolStripMenu to show an SqlConnection failure. I want the error message to display for some time timeToWaitMs (in milliseconds) and then refresh the UI back to an okay state after some time and some operations. Currently I am doing this (with some unnecessary details removed):
public void ShowErrorWithReturnTimer(string errorMessage, int timeToWaitMs = 5000)
{
    // Update the UI (and images/colors etc.).
    this.toolStripLabelState.Text = errorMessage;

    // Wait for timeToWait and return to the default UI.
    Task task = null;
    task = Task.Factory.StartNew(() =>
    {
        task.Wait(timeToWaitMs);
    });

    // Update the UI, returning to the valid connection.
    task.ContinueWith(ant =>
    {
        try
        {
            // Connection good to go (restore valid connection, update UI etc.)!
            this.toolStripLabelState.Text = "Connected";
        }
        finally
        {
            RefreshDatabaseStructure();
            task.Dispose();
        }
    }, CancellationToken.None,
       TaskContinuationOptions.None,
       mainUiScheduler);
}
The problem I have is that task.Wait(timeToWaitMs); is causing a Cursors.WaitCursor to be displayed - I don't want this. How can I force the error message to be displayed for a period, after which I return to a non-error state?
Thanks for your time.
I wouldn't use a task at all here - at least not without the async features in C# 5. In C# 5 you could just write:
await Task.Delay(millisToWait);
But until you've got that, I'd just use a timer appropriate for your UI, e.g. System.Windows.Forms.Timer or System.Windows.Threading.DispatcherTimer. Just use what you've currently got as a continuation as the "tick" handler for the timer, and schedule it appropriately.
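A minimal sketch of that with System.Windows.Forms.Timer, reusing the names from your method (this is just your continuation body moved into the Tick handler):
this.toolStripLabelState.Text = errorMessage;

var timer = new System.Windows.Forms.Timer { Interval = timeToWaitMs };
timer.Tick += (s, e) =>
{
    timer.Stop();
    timer.Dispose();
    // The Tick handler runs on the UI thread, so the label can be updated directly.
    this.toolStripLabelState.Text = "Connected";
    RefreshDatabaseStructure();
};
timer.Start();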
You can use a Timer instead of task.Wait(). You can make it wait for an amount of time; once the timer ticks, a callback can start the update.
var timer = new Timer(timeToWaitMs);
timer.Elapsed += (s, e) =>
{
    timer.Stop();
    UpdateValidConnection();
};
timer.Start();

private void UpdateValidConnection()
{
    Task.Factory.StartNew(() =>
    {
        try
        {
            this.toolStripLabelState.Text = "Connected";
        }
        finally
        {
            RefreshDatabaseStructure();
        }
    }, CancellationToken.None, TaskCreationOptions.None, mainUiScheduler);
}

Show progress only if a background operation is long

I'm developing a C# application and I would like to show a modal progress dialog, but only when an operation will be long (for example, more than 3 seconds). I execute my operations in a background thread.
The problem is that I don't know in advance whether the operation will be long or short.
Some software such as IntelliJ has a timer approach: if the operation takes more than x time, then show a dialog.
What do you think is a good pattern to implement this?
Wait on the UI thread with a timer, and show the dialog there?
Must I DoEvents() when I show the dialog?
Here's what I'd do:
1) Use a BackgroundWorker.
2) Before you call RunWorkerAsync, store the current time in a variable.
3) In the DoWork event handler, call ReportProgress. In the ProgressChanged event handler, check whether more than three seconds have elapsed since that stored time. If so, show the dialog (see the sketch below).
Here is a MSDN example for the BackgroundWorker: http://msdn.microsoft.com/en-us/library/cc221403(v=vs.95).aspx
Note: In general, I agree with Ramhound's comment. Just always display the progress. But if you're not using BackgroundWorker, I would start using it. It'll make your life easier.
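A rough sketch of those three steps; DoOneStepOfWork, totalSteps and ShowProgressDialog are placeholders for your own work and dialog:
var worker = new BackgroundWorker { WorkerReportsProgress = true };
bool dialogShown = false;
DateTime startedAt = DateTime.MinValue;

worker.DoWork += (s, e) =>
{
    for (int step = 0; step < totalSteps; step++)
    {
        DoOneStepOfWork(step);
        worker.ReportProgress(step * 100 / totalSteps); // 3) report as the work progresses
    }
};

worker.ProgressChanged += (s, e) =>
{
    // Raised back on the UI thread (when started from it); show the dialog
    // only once the operation has proven to be long.
    if (!dialogShown && DateTime.Now - startedAt > TimeSpan.FromSeconds(3))
    {
        dialogShown = true;
        ShowProgressDialog();
    }
};

startedAt = DateTime.Now;   // 2) store the current time just before starting
worker.RunWorkerAsync();    // 1) run the work in the background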
I will go with the first choice here with some modifications:
First, run the possibly long-running operation in a different thread.
Then run another thread that checks the first one's status via a wait handle with a timeout; if the timeout triggers, show the progress bar.
Something like:
private ManualResetEvent _finishLoadingNotifier = new ManualResetEvent(false);
private const int ShowProgressTimeOut = 1000 * 3; // 3 seconds

private void YourLongOperation()
{
    ....
    _finishLoadingNotifier.Set(); // after finishing your work
}

private void StartProgressIfNeededThread()
{
    int result = WaitHandle.WaitAny(new WaitHandle[] { _finishLoadingNotifier }, ShowProgressTimeOut);
    if (result == WaitHandle.WaitTimeout) // the operation did not finish within the timeout
    {
        //show the progress bar.
    }
}
Assuming you have a DoPossiblyLongOperation(), ShowProgressDialog() and HideProgressDialog() methods, you could use the TPL to do the heavy lifting for you:
// Start the work; when it completes, hide the progress dialog on the UI thread.
var longOperation = Task.Run(DoPossiblyLongOperation)
    .ContinueWith(t => myProgressDialog.Invoke(new Action(HideProgressDialog)));
// If the 3-second delay wins the race, the operation is "long": show the dialog.
if (Task.WaitAny(longOperation, Task.Delay(3000)) == 1)
    ShowProgressDialog();
I would keep the progress dialog separate from the background activity, to separate my UI logic from the rest of the application. So the sequence would be (This is essentially the same as what IntelliJ does):
UI starts the background operation (in a BackgroundWorker) and set up a timer for X seconds
When the timer expires UI shows the progress dialog (if the background task is still running)
When the background task completes the timer is cancelled and the dialog (if any) is closed
Using a timer instead of a separate thread is more resource-efficient.
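A rough sketch of that sequence with a BackgroundWorker and a UI timer; ShowProgressDialog, CloseProgressDialog and DoLongOperation are placeholders for your own code:
var worker = new BackgroundWorker();
var timer = new System.Windows.Forms.Timer { Interval = 3000 }; // X seconds

timer.Tick += (s, e) =>
{
    // The timer expired while the work is still running: show the dialog.
    timer.Stop();
    if (worker.IsBusy)
        ShowProgressDialog();
};

worker.DoWork += (s, e) => DoLongOperation();
worker.RunWorkerCompleted += (s, e) =>
{
    // The work finished: cancel the timer and close the dialog if one was shown.
    timer.Stop();
    CloseProgressDialog();
};

worker.RunWorkerAsync();
timer.Start();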
Recommended non-blocking solution and no new Threads:
try
{
    var t = DoLongProcessAsync();
    if (await Task.WhenAny(t, Task.Delay(1000)) != t) ShowProgress();
    await t;
}
finally
{
    HideProgress();
}
I got the idea from Jalal Said's answer. I needed to be able to time out or cancel the progress display. Instead of passing an additional parameter (a cancellation token handle) to the WaitAny, I changed the design to depend on Task.Delay():
private const int ShowProgressTimeOut = 750; // 750 ms

public static void Report(CancellationTokenSource cts)
{
    Task.Run(async () =>
    {
        await Task.Delay(ShowProgressTimeOut);
        if (!cts.IsCancellationRequested)
        {
            // Report progress
        }
    });
}
Use it like so:
private async Task YourLongOperation()
{
    CancellationTokenSource cts = new CancellationTokenSource();
    try
    {
        // Long running task on background thread
        await Task.Run(() =>
        {
            Report(cts);
            // Do work
            cts.Cancel();
        });
    }
    catch (Exception ex) { }
    finally { cts.Cancel(); }
}
