How to keep a long-running Task from being disturbed - C#

I have a long-running Task, something like this:
Task.Factory.StartNew(() =>
{
    var step = 0;
    while (true)
    {
        Task.Delay(100).Wait();
        Console.WriteLine(step);
        step++;
        if (step > MaxStep)
        {
            return;
        }
    }
}, CancellationToken.None, TaskCreationOptions.LongRunning, TaskScheduler.Default);
The same application runs lots of Tasks that make API calls, and when those API calls slow down, this main long-running task is disturbed as well, sometimes taking more than 150 milliseconds to reach the next step.
How can I make this task run undisturbed? Any ideas?
I tried using a Thread in place of the Task, with the same result.
I wanted to change the API calls to async, but there is too much refactoring in the old code and I'm scared to touch it.
I artificially wrapped the methods with API calls in async methods, but got the same result.

Have you tried running a BackgroundWorker instead of Tasks (they are not the same thing, though)? Another quick way to achieve this is to use a Timer or a DispatcherTimer. I have used timers to run a 24-hour (even week-long) window that shows ads continuously with no error.
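For reference, a minimal sketch of the plain-timer approach (assuming a MaxStep constant as in the question): System.Threading.Timer fires its callback on a pool thread at a fixed interval, so no loop has to block between steps.
using System;
using System.Threading;

class Stepper
{
    private const int MaxStep = 100; // assumption: defined elsewhere, as in the question
    private static int _step;
    private static Timer _timer;     // keep a reference so the timer isn't collected

    static void Main()
    {
        _timer = new Timer(_ =>
        {
            Console.WriteLine(_step);
            if (Interlocked.Increment(ref _step) > MaxStep)
            {
                _timer.Dispose(); // stop ticking once the last step is done
            }
        }, null, dueTime: 0, period: 100); // first tick now, then every 100 ms

        Console.ReadLine(); // keep the process alive while the timer runs
    }
}
Bear in mind that timer callbacks also run on the thread pool, so they can still be delayed if the pool is saturated by the blocking API calls; DispatcherTimer sidesteps that by ticking on the UI thread.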

Related

Parallel queued background tasks with hosted services in ASP.NET Core

I'm doing some tests with the new "background tasks with hosted services in ASP.NET Core" feature present in version 2.1, more specifically with queued background tasks, and a question about parallelism came to my mind.
I'm currently following the tutorial provided by Microsoft strictly, and when trying to simulate a workload with several requests made by the same user to enqueue tasks, I noticed that all workItems are executed in order, so there is no parallelism.
My question is: is this behavior expected? And if so, in order to make the request execution parallel, is it OK to fire and forget instead of awaiting each workItem's completion?
I've searched for a couple of days about this specific scenario without luck, so if anyone has any guide or examples to provide, I would be really glad.
Edit: The code from the tutorial is quite long, so the link for it is https://learn.microsoft.com/en-us/aspnet/core/fundamentals/host/hosted-services?view=aspnetcore-2.1#queued-background-tasks
The method which executes the work item is this:
public class QueuedHostedService : IHostedService
{
    ...

    public Task StartAsync(CancellationToken cancellationToken)
    {
        _logger.LogInformation("Queued Hosted Service is starting.");
        _backgroundTask = Task.Run(BackgroundProcessing);
        return Task.CompletedTask;
    }

    private async Task BackgroundProcessing()
    {
        while (!_shutdown.IsCancellationRequested)
        {
            var workItem = await TaskQueue.DequeueAsync(_shutdown.Token);
            try
            {
                await workItem(_shutdown.Token);
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, $"Error occurred executing {nameof(workItem)}.");
            }
        }
    }

    ...
}
The main point of the question is to know if anyone out there could share the knowledge of how to use this specific technology to execute several work items at the same time, since a server can handle this workload.
I tried the fire-and-forget approach when executing the work item and it worked the way I intended, with several tasks executing in parallel at the same time; I'm just not sure if this is an acceptable practice, or if there is a better or more proper way of handling this situation.
The code you posted executes the queued items in order, one at a time, but also in parallel to the web server. An IHostedService by definition runs in parallel to the web server. This article provides a good overview.
Consider the following example:
_logger.LogInformation ("Before()");
for (var i = 0; i < 10; i++)
{
var j = i;
_backgroundTaskQueue.QueueBackgroundWorkItem (async token =>
{
var random = new Random();
await Task.Delay (random.Next (50, 1000), token);
_logger.LogInformation ($"Event {j}");
});
}
_logger.LogInformation ("After()");
We add ten tasks, each of which waits a random amount of time. If you put this code in a controller method, the events will still be logged even after the controller method returns. But each item is executed in order, so the output looks like this:
Event 0
Event 1
...
Event 8
Event 9
In order to introduce parallelism we have to change the implementation of the BackgroundProcessing method in the QueuedHostedService.
Here is an example implementation that allows two Tasks to be executed in parallel:
private async Task BackgroundProcessing()
{
    var semaphore = new SemaphoreSlim(2);

    void HandleTask(Task task)
    {
        semaphore.Release();
    }

    while (!_shutdown.IsCancellationRequested)
    {
        await semaphore.WaitAsync();
        var item = await TaskQueue.DequeueAsync(_shutdown.Token);
        var task = item(_shutdown.Token);
        task.ContinueWith(HandleTask);
    }
}
Using this implementation the events are no longer logged in order, as each task waits a random amount of time. So the output could be:
Event 0
Event 1
Event 2
Event 3
Event 4
Event 5
Event 7
Event 6
Event 9
Event 8
edit: Is it ok in a production environment to execute code this way, without awaiting it?
I think the reason most devs have a problem with fire-and-forget is that it is often misused.
When you execute a Task using fire-and-forget, you are basically saying that you do not care about the result of this function. You do not care if it exits successfully, if it is canceled, or if it threw an exception. But for most Tasks you do care about the result.
You do want to make sure a database write went through.
You do want to make sure a log entry is written to the hard drive.
You do want to make sure a network packet is sent to the receiver.
And if you care about the result of the Task, then fire-and-forget is the wrong method.
That's it, in my opinion. The hard part is finding a Task where you really do not care about its result.
You can add the QueuedHostedService once or twice for every CPU in the machine.
So something like this:
for (var i = 0; i < Environment.ProcessorCount; ++i)
{
    services.AddHostedService<QueuedHostedService>();
}
You can hide this in an extension method and make the concurrency level configurable to keep things clean.
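For illustration, one possible shape for that extension method (a sketch; the name AddQueuedHostedServices is an assumption, not from the tutorial). The IHostedService descriptor is registered directly because newer versions of AddHostedService deduplicate by implementation type:
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

public static class QueuedHostedServiceExtensions
{
    // Registers 'concurrencyLevel' independent QueuedHostedService instances,
    // each draining the shared queue, so that many work items run at once.
    public static IServiceCollection AddQueuedHostedServices(
        this IServiceCollection services, int concurrencyLevel)
    {
        for (var i = 0; i < concurrencyLevel; i++)
        {
            services.AddSingleton<IHostedService, QueuedHostedService>();
        }

        return services;
    }
}
Usage would then be something like services.AddQueuedHostedServices(Environment.ProcessorCount).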

WPF GUI Performance with a large number of parallel tasks

I developed a small client (WPF) to run some stress tests on our systems. Essentially it has to call various methods on an ASP.NET Web API endpoint in parallel.
Each time you press "Start" it generates 4000 tasks (async-await) in parallel with the stress requests, waits until they all finish, then does it again, until the user clicks the stop button. The GUI is decorated with a progress bar and some counters: requests in error, completed requests, in-progress requests. I obtain this information because the object that makes the batch of stress requests exposes some events:
var stressTestTask = new stressTestTask(LogService, configuration);
stressTestTask.ErrorRequestCountChanged += stressTestTask_ErrorRequestCountChanged;
stressTestTask.GoodRequestCountChanged += stressTestTask_GoodRequestCountChanged;
stressTestTask.TryRequestCountChanged += stressTestTask_TryRequestCountChanged;
_executionCancellationToken = new CancellationTokenSource();
await Task.Run(
    () => stressTestTask.ApiStressTestTask(_executionCancellationToken.Token),
    _executionCancellationToken.Token);
The whole execution is started from an ICommand (MVVM):
private RelayCommand _startCommand;

public RelayCommand StartCommand
{
    get
    {
        return _startCommand ?? (_startCommand = new RelayCommand(
            async () =>
            {
                await StartStressTest();
            }));
    }
}
RelayCommand is an implementation of ICommand from the library Mvvm-Light.
What I don't understand is this behaviour: if I configure my batch with a "low" number of tasks, for example 2000, the GUI doesn't freeze while executing. If instead I choose 5000 tasks, after a while it freezes. If I then open another instance of my client's .exe and choose 2000 in each, the GUI is responsive in both.
My first question is: why is opening one instance with x tasks worse in terms of responsiveness than opening n instances with x/n tasks each? Is it something related to the Windows scheduler and the fact that in the first case I have only one process?
My second question is: how can I address the problem so that everything works in a single GUI? I thought about making a console application containing the single batch of stress tests and calling a command from the GUI for each instance I want, in order to spawn a process for every batch.
Are you handling those API events by invoking onto the UI context? If you have many invocations occurring, you will flood the dispatcher with operations and cause the UI to hang and lag behind user input.
Try batching the UI updates.
My first question is: why is opening one instance with x tasks worse in terms of responsiveness than opening n instances with x/n tasks?
Possibly because you are getting more events to handle on the UI thread. I guess your ErrorRequestCountChanged, GoodRequestCountChanged and TryRequestCountChanged event handlers are invoked on the UI thread, and a lot of events being raised may flood the UI thread.
As Gusdor suggests, you should probably find a way of batching the updates. Take a look at the Reactive Extensions (Rx): http://www.introtorx.com/content/v1.0.10621.0/01_WhyRx.html.
It has a Buffer method that may come in handy: http://www.introtorx.com/content/v1.0.10621.0/13_TimeShiftedSequences.html.
It also has an Observable.FromEvent method that you can use to convert an event into an IObservable: https://msdn.microsoft.com/en-us/library/hh229241(v=vs.103).aspx.
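As a minimal sketch of that batching idea (assuming the question's GoodRequestCountChanged is a standard EventHandler event; the GoodRequestCount property is illustrative):
using System;
using System.Reactive.Linq;
using System.Threading;

// Turn the event stream into an IObservable, collect events into 250 ms
// batches, and apply each batch to the UI in a single dispatcher operation.
var subscription = Observable
    .FromEventPattern(
        h => stressTestTask.GoodRequestCountChanged += h,
        h => stressTestTask.GoodRequestCountChanged -= h)
    .Buffer(TimeSpan.FromMilliseconds(250))
    .Where(batch => batch.Count > 0)
    .ObserveOn(SynchronizationContext.Current)            // marshal back to the UI thread
    .Subscribe(batch => GoodRequestCount += batch.Count); // one UI update per batch
Dispose the subscription when the stress test stops.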
My second question is: how can I address the problem so that everything works in a single GUI?
You need to find a way, one or another, of updating the UI less frequently. Batching the updates and events should be a good starting point. Raising fewer notifications is another option. Maybe you need to do both.
how can I address the problem so that everything works in a single GUI?
Send the API requests in the "proper" async-await manner, with only one thread:
private async Task SendStressRequests()
{
    var tasks = new List<Task>();
    for (int i = 0; i < 4000; i++)
    {
        var task = SendApiRequestAsync();
        tasks.Add(task);
    }
    await Task.WhenAll(tasks);
    // Update UI with results
}

multithreading in regards to Task, async, and await

I have the following code and just want to make sure I have the concept of multithreading down on a high level.
public async Task<List<Category>> GetProjectsByCategoryIDAsync(Int16 categoryid)
{
    try
    {
        using (YeagerTechEntities DbContext = new YeagerTechEntities())
        {
            DbContext.Configuration.ProxyCreationEnabled = false;
            DbContext.Database.Connection.Open();
            var category = await DbContext.Categories.Include("Projects").Where(p => p.CategoryID == categoryid).ToListAsync();
            return category;
        }
    }
    catch (Exception)
    {
        throw;
    }
}
It is my understanding that:
async - declares a method to run asynchronously instead of synchronously.
Task - declares a method to run as a task on a single thread.
await - the task waits for the operation to complete.
Where I am a little fuzzy is the await keyword. Obviously, the benefit of asynchronous programming is that the method supposedly doesn't have to wait for the task to complete before another request comes in right behind it. But with the await keyword, the task waits until the operation is finished.
With synchronous programming, everything is processed in a sequential pattern.
How does this methodology allow requests to come in simultaneously and be executed much faster than with synchronous programming?
I just need a high-level explanation to get the concept down.
Thanks so much in advance.
Consider the following code:
public async Task DoSomething()
{
    Console.WriteLine("Begin");
    int i = await DoSomethingElse();
    Console.WriteLine("End " + i);
}

public Task<int> DoSomethingElse()
{
    // Task.Run starts the work immediately on a pool thread; a bare
    // 'new Task<int>(...)' would never run unless explicitly started.
    return Task.Run(() =>
    {
        // do heavy work
        Thread.Sleep(1000);
        return 1;
    });
}
With synchronous programming, everything is processed in a sequential pattern.
The code above is asynchronous, but still sequential. The difference between that code and its synchronous version (e.g., public int DoSomethingElse) is that when you await DoSomethingElse, the main thread is freed to do other work instead of blocking while waiting for DoSomethingElse to complete.
What actually happens is: your async DoSomething method will run on thread A and be broken in two:
the first part will print "Begin", make an async call, and then return;
the second part will print "End".
After the first part of the method executes, Thread A will be free to do other work.
Meanwhile, Thread B will be executing the lambda expression that does the heavy work.
When Thread B completes, the second part of your method will be scheduled to run on Thread A, and "End" will be printed.
Notice that, while Thread B was executing the heavy work, Thread A was free to do other stuff.
How does this methodology allow requests to come in simultaneously and be executed much faster than with synchronous programming?
In frameworks such as ASP.NET MVC, your application has a finite number of threads available to handle incoming requests (let's call these "request threads"). By delegating heavy work to other threads and awaiting it, your request threads are free to handle more incoming requests while the heavy work is being done.
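For instance (a generic illustration, not code from the question; _service.GetDataAsync is hypothetical):
public async Task<ActionResult> Index()
{
    // While the awaited I/O runs, this request thread returns to the pool
    // and can serve other incoming requests.
    var data = await _service.GetDataAsync();
    return View(data);
}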
This diagram, although complex, illustrates the execution/suspension flow of threads executing asynchronous work:
Notice how at step 6 the thread was yielded, and then step 7 resumed the execution of the method.
As you can see, the await keyword effectively breaks the method in two.
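To make that concrete, here is a rough sketch of what the split amounts to (simplified; the compiler really generates a state machine, not a ContinueWith call):
public Task DoSomething()
{
    // First half: runs synchronously up to the await.
    Console.WriteLine("Begin");

    // Second half: scheduled as a continuation that runs once
    // DoSomethingElse's task completes, freeing the current thread meanwhile.
    return DoSomethingElse().ContinueWith(
        antTask => Console.WriteLine("End " + antTask.Result));
}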

Task stays in WaitingToRun state for abnormally long time

I've got a program that handles a variety of tasks running in parallel. A single task acts as a manager of sorts, making sure certain conditions are met before the next task is run. Here's the code:
mIsDisposed = false;
mTasks = new BlockingCollection<TaskWrapper>(new ConcurrentQueue<TaskWrapper>());

Task.Factory.StartNew(() => {
    while (!mIsDisposed) {
        var tTask = mTasks.Take();
        tTask.task.Start();
        while (tTask.task.Status == TaskStatus.WaitingToRun) {
            Console.WriteLine("Waiting to run... {0}", tTask.task.Id);
            Thread.Sleep(200);
        }
        tTask.ready.Wait();
    }
    mTasks.Dispose();
});
DoWork();
DoWork();
DoWork();
DoWork();
DoWorkAsync();
DoWorkAsync();
DoWorkAsync();
DoWorkAsync();
DoWorkAsync();
DoWork();
TaskWrapper is very simply defined as:
private class TaskWrapper
{
    public Task task { get; set; }
    public Task ready { get; set; }
}
And tasks are only currently added in 2 places:
public void DoWork()
{
    DoWorkAsync().Wait();
}

public Task DoWorkAsync()
{
    ManualResetEvent next = new ManualResetEvent(false);
    Task task = new Task(() => ActualWork(next));
    Task ready = Task.Factory.StartNew(() => next.Wait());
    mTasks.Add(new TaskWrapper() {
        task = task,
        ready = ready
    });
    return task;
}
Where ActualWork(next) calls next.Set().
This queues work and waits until next has been set before allowing the next work item to proceed. You can either wait for the entire task to finish before continuing by calling DoWork(), or queue multiple tasks at once (which are supposed to run after next has been set).
However, when adding a task via DoWorkAsync(), after calling tTask.task.Start(), tTask.task sits in the WaitingToRun state for a long time (like 30 seconds to a minute), then magically starts running. I've monitored this using the while loop, and "Waiting to run... #" displays for quite some time.
Calling DoWork() always runs immediately. I'm sure this has something to do with calling Wait on the task that is set to run.
I'm at a loss, here.
UPDATE:
I've managed to make the code work, but I'd still like to know why there's an issue in the first place.
After some experimental changes, I've managed to fix my own problem, but it's more of an "Oh, so I just can't do that" than a good fix. It turns out my problem was enqueuing tasks too quickly. By modifying DoWorkAsync() to no longer use Task.Factory.StartNew, and changing tTask.ready.Wait() to tTask.ready.RunSynchronously(), I've managed to solve my issue.
Is there a reason the TaskScheduler is delaying the scheduling of my tasks? Am I saturating some underlying resources? What's going on here?
The threads will be run in the system's thread pool. The thread pool has a minimum number of threads available at all times (see ThreadPool.SetMinThreads()). If you try to create more than that many threads, a delay of approximately 500ms will be introduced between each new thread starting.
There is also a maximum number of threads in the thread pool (see ThreadPool.GetMaxThreads()), and if you reach that limit no new threads will be created; the pool will wait until an old thread dies before scheduling a new one (or rather, reusing the old one to run your new task).
You are unlikely to be hitting that limit though - it's probably over 1000.
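If the thread-injection delay is what you are hitting, one mitigation to experiment with is raising the pool's minimum worker-thread count so a short burst of blocking tasks gets threads immediately (a sketch; 64 is purely illustrative, not a recommendation):
using System;
using System.Threading;

ThreadPool.GetMinThreads(out int minWorker, out int minIo);
// Raise only the worker minimum; leave the I/O completion minimum as-is.
ThreadPool.SetMinThreads(Math.Max(minWorker, 64), minIo);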
OK, I've just been faced with a similar issue: a bit of code that created and started a task ran, but the task never started (it just changed status to WaitingToRun).
Having tried the other options in this thread to no avail, I thought about it a bit more and realised that the code calling this method was itself called in a continuation task that had been specified to run on the UI task scheduler (as it needed to update the UI)...
So something like
void Main()
{
    var t1 = new Task(() => Console.WriteLine("hello, I'm task t1"));
    t1.ContinueWith(t => CreateAndRunASubTask(), TaskScheduler.FromCurrentSynchronizationContext());
    t1.Start();
    Console.WriteLine("All tasks done with");
}

// Define other methods and classes here
public void CreateAndRunASubTask()
{
    var tsk = new Task(() => Console.WriteLine("hello, I'm the sub-task"));
    tsk.Start();
    Console.WriteLine("sub-task has been told to start");
    tsk.Wait();
    // the code blocks on tsk.Wait() indefinitely, the tsk status being "WaitingToRun"
    Console.WriteLine("sub-task has finished");
}
The fix turned out to be pretty simple: when specifying the continuation task, you need to specify the TaskContinuationOptions.HideScheduler option.
This has the effect of... (taken from the XML comment)
Specifies that tasks created by the continuation by calling methods such as System.Threading.Tasks.Task.Run(System.Action) or System.Threading.Tasks.Task.ContinueWith(System.Action{System.Threading.Tasks.Task}) see the default scheduler (System.Threading.Tasks.TaskScheduler.Default) rather than the scheduler on which this continuation is running as the current scheduler.
i.e. (in my example):
t1.ContinueWith(t =>
    CreateAndRunASubTask(),
    System.Threading.CancellationToken.None,
    TaskContinuationOptions.HideScheduler,
    TaskScheduler.FromCurrentSynchronizationContext());
Hope this helps someone, as it stumped me for a good while!
I just faced a similar issue.
I have a bunch of similar tasks running infinite loops, and one of those tasks would from time to time stay in the WaitingToRun state permanently.
Creating the tasks this way did the trick for me:
_task = new Task(() => DoSmth(_cancellationTokenSource.Token), TaskCreationOptions.LongRunning);
_task.Start();

multithreading in winforms application

I'm writing a WinForms application that uses the report viewer for the creation of multiple PDF files. These PDF files are divided into 4 main parts, each part being responsible for the creation of a specific report. These processes create a minimum of 1 file, up to the number of users (currently 50).
The program already exists and uses these 4 methods sequentially. For extra performance as the number of users grows, I want to separate these methods from the main process into 4 separate threads.
While I'm new to multithreading in C#, I read a number of articles on how to achieve this. The only thing I'm not sure of is which way to start. As I read multiple blog posts, I'm not sure whether to use 4 separate threads, a thread pool, or multiple background workers (or would parallel programming be the best way?). Blog posts tell me to use a thread pool for more than 3 threads, but on the other hand they tell me to use the BackgroundWorker in WinForms. Which option is best (and why)?
In the end my main thread has to wait for all processes to end before continuing.
Can someone tell me the best solution to my problem?
* Extra information after edit *
Which I forgot to mention (after I read all your comments and possible solutions): the methods share one IEnumerable, used only for reading. After firing the methods (which don't have to run sequentially), the methods trigger events for sending status updates to the UI. I think triggering events is difficult, if not impossible, using separate threads, so there should be some kind of callback function to report status updates while running.
Some example in pseudocode:
main()
{
    private List<customclass> lcc = importCustomClass()
    export.CreatePDFKind1.create(lcc.First(), exportfolderpath, arg1)
    export.CreatePDFKind2.create(lcc, exportfolderpath)
    export.CreatePDFKind3.create(lcc.First(), exportfolderpath)
    export.CreatePDFKind4.create(customclass2, exportfolderpath)
}

namespace export
{
    class CreatePDFKind1
    {
        create(customclass cc, string folderpath)
        {
            do something;
            reportstatus(listviewItem, status, message)
        }
    }

    class CreatePDFKind2
    {
        create(IEnumerable<customclass> lcc, string folderpath)
        {
            foreach (var x in lcc)
            {
                do something;
                reportstatus(listviewItem, status, message)
            }
        }
    }
    etc.......
}
From the very basic picture you have described, I would use the Task Parallel Library (TPL), shipped with .NET Framework 4.0+.
You talk about the 'best' option being thread pools when spawning a large-to-medium number of threads. Despite this being correct [the most efficient way of managing the resources], the TPL does all of this for you, without you having to worry about a thing. The TPL also makes the use of multiple threads and waiting on their completion a doddle too...
To do what you require I would use the TPL and continuations. A continuation not only allows you to create a flow of tasks but also handles your exceptions. This is a great introduction to the TPL. But to give you some idea...
You can start a TPL task using
Task task = Task.Factory.StartNew(() =>
{
    // Do some work here...
});
Now, to start a second task when an antecedent task finishes (in error or successfully), you can use the ContinueWith method:
Task task1 = Task.Factory.StartNew(() => Console.WriteLine("Antecedant Task"));
Task task2 = task1.ContinueWith(antTask => Console.WriteLine("Continuation..."));
So as soon as task1 completes, fails, or is cancelled, task2 'fires up' and starts running. Note that if task1 completes before the second line of code is reached, task2 will be scheduled to execute immediately. The antTask argument passed to the second lambda is a reference to the antecedent task. See this link for more detailed examples...
You can also pass the antecedent task's result on to continuations:
Task.Factory.StartNew<int>(() => 1)
    .ContinueWith(antTask => antTask.Result * 4)
    .ContinueWith(antTask => antTask.Result * 4)
    .ContinueWith(antTask => Console.WriteLine(antTask.Result * 4)); // Prints 64.
Note. Be sure to read up on exception handling in the first link provided as this can lead a newcomer to TPL astray.
One last thing to look at, in particular for what you want, is child tasks. Child tasks are those created as AttachedToParent. In this case the continuation will not run until all child tasks have completed:
TaskCreationOptions atp = TaskCreationOptions.AttachedToParent;
Task.Factory.StartNew(() =>
{
    Task.Factory.StartNew(() => { SomeMethod(); }, atp);
    Task.Factory.StartNew(() => { SomeOtherMethod(); }, atp);
}).ContinueWith(cont => { Console.WriteLine("Finished!"); });
So in your case you would start your four tasks, then wait on their completion on the main thread.
I hope this helps.
Using a BackgroundWorker is helpful if you need to interact with the UI with respect to your background process. If you don't, then I wouldn't bother with it. You can just start the Task objects directly:
tasks.Add(Task.Factory.StartNew(() => DoStuff()));
tasks.Add(Task.Factory.StartNew(() => DoStuff2()));
tasks.Add(Task.Factory.StartNew(() => DoStuff3()));
If you do need to interact with the UI, possibly by updating it to reflect when the tasks are finished, then I would suggest starting one BackgroundWorker and then using tasks again to process each individual unit of work. Since there is some additional overhead in using a BackgroundWorker, I would avoid starting lots of them if you can avoid it.
BackgroundWorker bgw = new BackgroundWorker();
bgw.DoWork += (_, args) =>
{
    List<Task> tasks = new List<Task>();
    tasks.Add(Task.Factory.StartNew(() => DoStuff()));
    tasks.Add(Task.Factory.StartNew(() => DoStuff2()));
    tasks.Add(Task.Factory.StartNew(() => DoStuff3()));
    Task.WaitAll(tasks.ToArray());
};
bgw.RunWorkerCompleted += (_, args) => updateUI();
bgw.RunWorkerAsync();
You could of course use just Task methods to do all of this, but I still find BackgroundWorkers a bit simpler to work with for the simpler cases. Using .NET 4.5 you could use Task.WhenAll to run a continuation on the UI thread when all 4 tasks finish, but doing that in 4.0 wouldn't be quite as simple.
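A minimal sketch of that .NET 4.5 approach (startButton_Click is an illustrative name): awaiting Task.WhenAll from an async event handler resumes on the UI thread, so updateUI() needs no explicit marshalling.
private async void startButton_Click(object sender, EventArgs e)
{
    var tasks = new[]
    {
        Task.Run(() => DoStuff()),
        Task.Run(() => DoStuff2()),
        Task.Run(() => DoStuff3())
    };

    await Task.WhenAll(tasks); // continuation resumes on the UI thread
    updateUI();
}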
Without further information it's impossible to tell. The fact that they're in four separate methods doesn't make much of a difference if they're accessing the same resources - the PDF file, for example. If you're having trouble understanding what I mean, post some of the code for each method and I'll go into a little more detail.
Since the number of "parts" you have is fixed, it won't make a big difference whether you use separate threads, background workers, or a thread pool. I'm not sure why people are recommending background workers; most likely because it's a simpler approach to multithreading and more difficult to screw up.
