Parallel task execution at different times - C#

I have a console app which performs a couple of tasks in parallel. In my case, the important part is that I want the tasks to complete at the same time. I know how long each task is going to take, so the idea is to delay each task in the Parallel.ForEach by a custom amount so they all finish together; in the end, every task completes at approximately the same time as the one that takes the longest.
Example:
interface IService
{
    void DoWork();
}

class Program
{
    static void Main(string[] args)
    {
        var services = new List<IService>();
        services.Add(new ServiceA()); // Takes 500ms to DoWork()
        services.Add(new ServiceB()); // Takes 1000ms to DoWork()
        services.Add(new ServiceC()); // Takes 5000ms to DoWork()

        Parallel.ForEach(services, z =>
        {
            Stopwatch sw = new Stopwatch();
            sw.Start();
            z.DoWork();
            Console.WriteLine($"Ready for {sw.Elapsed}");
        });
    }
}
Output:
Ready for 00:00:00.5006837
Ready for 00:00:01.0002284
Ready for 00:00:05.0010202
Press any key to continue . . .
How to modify the code so the output will look like:
Ready for 00:00:05.5006837
Ready for 00:00:05.0002284
Ready for 00:00:05.0010202
Press any key to continue . . .
I guess the most obvious solution would be to check which Service z is inside the loop and add a custom Thread.Sleep before calling z.DoWork(), but I am looking for a smarter solution.

Thanks for the help! So this is what I wanted to achieve: all the services finish their job at the same time. Is this the right way of doing it?
class Program
{
    static void Main(string[] args)
    {
        ServiceA a = new ServiceA();
        ServiceB b = new ServiceB();
        ServiceC c = new ServiceC();

        int maxDelay = 3000;

        var sw = new Stopwatch();
        sw.Start();

        Task taskA = Task.Delay(maxDelay - a.ResponseTime).ContinueWith(x =>
        {
            a.DoWork();
            Console.WriteLine($"taskA ready for {sw.Elapsed}");
        });
        Task taskB = Task.Delay(maxDelay - b.ResponseTime).ContinueWith(x =>
        {
            b.DoWork();
            Console.WriteLine($"taskB ready for {sw.Elapsed}");
        });
        Task taskC = Task.Delay(maxDelay - c.ResponseTime).ContinueWith(x =>
        {
            c.DoWork();
            Console.WriteLine($"taskC ready for {sw.Elapsed}");
        });

        taskA.Wait();
        taskB.Wait();
        taskC.Wait();
        Console.WriteLine(sw.Elapsed);
    }
}

class ServiceA
{
    public int ResponseTime { get => 500; }
    public void DoWork() => Thread.Sleep(500);
}

class ServiceB
{
    public int ResponseTime { get => 1000; }
    public void DoWork() => Thread.Sleep(1000);
}

class ServiceC
{
    public int ResponseTime { get => 3000; }
    public void DoWork() => Thread.Sleep(3000);
}
Output:
taskA ready for 00:00:03.0084344
taskC ready for 00:00:03.0102184
taskB ready for 00:00:03.0102588
*Thread.Sleep(x) represents some time-consuming operation.
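For comparison, the same idea can be expressed with async/await and Task.WhenAll instead of ContinueWith and Wait(). This is only a sketch built on the ServiceA/B/C classes above (and their ResponseTime properties), not a drop-in replacement:

// A minimal async/await sketch of the same approach, reusing the ServiceA/B/C
// classes defined above. Requires System.Diagnostics and System.Threading.Tasks.
static async Task RunAllAsync(ServiceA a, ServiceB b, ServiceC c)
{
    int maxDelay = 3000;
    var sw = Stopwatch.StartNew();

    async Task RunDelayed(int responseTime, Action work, string name)
    {
        // Delay each task so that delay + work time is roughly maxDelay for all of them.
        await Task.Delay(maxDelay - responseTime);
        work();
        Console.WriteLine($"{name} ready for {sw.Elapsed}");
    }

    await Task.WhenAll(
        RunDelayed(a.ResponseTime, a.DoWork, "taskA"),
        RunDelayed(b.ResponseTime, b.DoWork, "taskB"),
        RunDelayed(c.ResponseTime, c.DoWork, "taskC"));

    Console.WriteLine(sw.Elapsed);
}

Task.WhenAll waits for all three tasks without blocking a thread per Wait() call, which keeps the timing logic in one place.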

Related

Await Task.Delay() spends too much time

In C# I have an example:
public async static Task TaskTest(int i)
{
await Task.Delay(1);
Console.WriteLine($"{i}. {DateTime.Now.ToString("HH:mm:ss fff")} " +
$"ThreadId:{Thread.CurrentThread.ManagedThreadId} Start");
int count = 1;
while (true)
{
DoSomeThing(count);
var stopWatch = new Stopwatch();
stopWatch.Start();
await Task.Delay(100);
stopWatch.Stop();
if (stopWatch.Elapsed.TotalMilliseconds > 200)
Console.ForegroundColor = ConsoleColor.Red;
Console.WriteLine($"Id:{count} Time:{DateTime.Now.ToString("HH:mm:ss fff")} " +
$"ThreadID:{Thread.CurrentThread.ManagedThreadId} Time Delay:{stopWatch.Elapsed.TotalMilliseconds }");
Console.ForegroundColor = ConsoleColor.White;
count++;
}
}
public async static Task DoSomeThing(int index)
{
await Task.Delay(1);
Task.Delay(1000).Wait();
}
private static void Main(string[] args)
{
int i = 1;
while (i < 2)
{
TaskTest(i);
Task.Delay(1).Wait();
i++;
}
Console.ReadKey();
}
Here is my result
Id:8 Time:23:03:59 972 ThreadID:12 Time Delay:582.6348
Id:22 Time:23:04:01 974 ThreadID:14 Time Delay:552.7234000000001
Id:42 Time:23:04:04 967 ThreadID:8 Time Delay:907.3214
I don't know why the task sometimes delays more than 200 milliseconds.
Update:
Thanks for all the answers.
I updated my code to use Thread, Thread.Sleep(), and Task.Run(). I increased the number of threads that run forever to 500. I tested for 30 minutes and the 500 threads were never delayed by more than 200ms.
Do you think that is bad code?
Please leave a comment!
Thank you so much!
public static void TaskTest(object i)
{
    Console.WriteLine($"{i} Start");
    int count = 1;
    while (true)
    {
        // Open Task to do work
        Task.Run(() => { DoSomeThing(count); });
        var stopWatch = new Stopwatch();
        stopWatch.Start();
        Thread.Sleep(100);
        stopWatch.Stop();
        if (stopWatch.Elapsed.TotalMilliseconds > 200)
        {
            Console.WriteLine($"Id:{count} Time:{DateTime.Now.ToString("HH:mm:ss fff")} " +
                $"ThreadID:{Thread.CurrentThread.ManagedThreadId} Time Delay:{stopWatch.Elapsed.TotalMilliseconds}");
        }
        count++;
    }
}

public static void DoSomeThing(int index)
{
    Thread.Sleep(1000); // Time spent completing the work
}

private static void Main(string[] args)
{
    int i = 0;
    while (i < 500)
    {
        // Open Thread for TaskTest
        Thread tesThread = new Thread(TaskTest);
        tesThread.IsBackground = true;
        tesThread.Start(i);
        i++;
    }
    Console.WriteLine("Finish init");
    Console.ReadKey();
}
Task.Delay, like any other multi-threaded sleep function, yields the thread it's running on back to the system (or in the case of the thread pool, back to the thread pool scheduler), asking to be re-scheduled some time after the amount of time specified.
That is the only guarantee you have: it will wait at least the amount specified. How long it will actually wait depends heavily on your thread pool load (you're delaying an awful lot of tasks there), on general system load (there are thousands of threads to be scheduled at any given point in time on an average computer OS), and on your CPU's and OS's ability to schedule threads quickly (in Windows, look at timeBeginPeriod).
Long story short, if precise timing matters to you, don't relinquish your thread.
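If the extra delay comes from thread pool starvation (blocking pool threads with Wait() or Thread.Sleep as in DoSomeThing above), one mitigation that is sometimes suggested is raising the pool's minimum worker thread count so bursts of blocked threads don't stall the Task.Delay continuations. A sketch, with an arbitrary illustrative value; the real fix is still to stop blocking pool threads:

// Sketch: inspect and raise the thread pool minimums. The value 200 is purely
// illustrative; this mitigates starvation but does not fix the blocking calls.
ThreadPool.GetMinThreads(out int workerMin, out int ioMin);
Console.WriteLine($"Current minimums: worker={workerMin}, IO={ioMin}");
ThreadPool.SetMinThreads(200, ioMin);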

How to handle threads that hang when using SemaphoreSlim

I have some code that runs thousands of URLs through a third-party library. Occasionally a method in the library hangs, which ties up a thread. After a while all the threads are taken up by processes doing nothing and everything grinds to a halt.
I am using a SemaphoreSlim to control adding new threads so I can keep an optimal number of tasks running. I need a way to identify tasks that have been running too long, kill them, and also release a slot in the SemaphoreSlim so a new task can be created.
I was struggling with the approach here, so I made some test code that imitates what I am doing. It creates tasks that have a 10% chance of hanging, so very quickly all the threads have hung.
How should I be checking for these and killing them off?
Here is the code:
class Program
{
    public static SemaphoreSlim semaphore;
    public static List<Task> taskList;

    static void Main(string[] args)
    {
        List<string> urlList = new List<string>();
        Console.WriteLine("Generating list");
        for (int i = 0; i < 1000; i++)
        {
            //adding random strings to simulate a large list of URLs to process
            urlList.Add(Path.GetRandomFileName());
        }
        Console.WriteLine("Queueing tasks");
        semaphore = new SemaphoreSlim(10, 10);
        Task.Run(() => QueueTasks(urlList));
        Console.ReadLine();
    }

    static void QueueTasks(List<string> urlList)
    {
        taskList = new List<Task>();
        foreach (var url in urlList)
        {
            Console.WriteLine("{0} tasks can enter the semaphore.",
                semaphore.CurrentCount);
            semaphore.Wait();
            taskList.Add(DoTheThing(url));
        }
    }

    static async Task DoTheThing(string url)
    {
        Random rand = new Random();
        // simulate the IO process
        await Task.Delay(rand.Next(2000, 10000));
        // add a 10% chance that the thread will hang simulating what happens occasionally with http request
        int chance = rand.Next(1, 100);
        if (chance <= 10)
        {
            while (true)
            {
                await Task.Delay(1000000);
            }
        }
        semaphore.Release();
        Console.WriteLine(url);
    }
}
As people have already pointed out, aborting threads is bad in general and there is no guaranteed way of doing it in C#. Using a separate process to do the work and then killing it is a slightly better idea than attempting Thread.Abort, but still not the best way to go. Ideally, you want co-operative threads/processes which use IPC to decide when to bail out themselves; that way the cleanup is done properly.
With all that said, you can use code like the below to do what you intend. I have written it assuming your task will be done in a thread; with slight changes, you can use the same logic to do your task in a process.
The code is by no means bullet-proof and is meant to be illustrative. The concurrent code is not really well tested; locks are held for longer than needed, and in some places I am not locking at all (like the Log function).
class TaskInfo {
    public Thread Task;
    public DateTime StartTime;

    public TaskInfo(ParameterizedThreadStart startInfo, object startArg) {
        Task = new Thread(startInfo);
        Task.Start(startArg);
        StartTime = DateTime.Now;
    }
}

class Program {
    const int MAX_THREADS = 1;
    const int TASK_TIMEOUT = 6; // in seconds
    const int CLEANUP_INTERVAL = TASK_TIMEOUT; // in seconds

    public static SemaphoreSlim semaphore;
    public static List<TaskInfo> TaskList;
    public static object TaskListLock = new object();
    public static Timer CleanupTimer;

    static void Main(string[] args) {
        List<string> urlList = new List<string>();
        Log("Generating list");
        for (int i = 0; i < 2; i++) {
            //adding random strings to simulate a large list of URLs to process
            urlList.Add(Path.GetRandomFileName());
        }
        Log("Queueing tasks");
        semaphore = new SemaphoreSlim(MAX_THREADS, MAX_THREADS);
        Task.Run(() => QueueTasks(urlList));
        CleanupTimer = new Timer(CleanupTasks, null, CLEANUP_INTERVAL * 1000, CLEANUP_INTERVAL * 1000);
        Console.ReadLine();
    }

    // TODO: Guard against re-entrancy
    static void CleanupTasks(object state) {
        Log("CleanupTasks started");
        lock (TaskListLock) {
            var now = DateTime.Now;
            int n = TaskList.Count;
            for (int i = n - 1; i >= 0; --i) {
                var task = TaskList[i];
                Log($"Checking task with ID {task.Task.ManagedThreadId}");
                // kill processes running for longer than anticipated
                if (task.Task.IsAlive && now.Subtract(task.StartTime).TotalSeconds >= TASK_TIMEOUT) {
                    Log("Cleaning up hung task");
                    task.Task.Abort();
                }
                // remove task if it is not alive
                if (!task.Task.IsAlive) {
                    Log("Removing dead task from list");
                    TaskList.RemoveAt(i);
                    continue;
                }
            }
            if (TaskList.Count == 0) {
                Log("Disposing cleanup thread");
                CleanupTimer.Dispose();
            }
        }
        Log("CleanupTasks done");
    }

    static void QueueTasks(List<string> urlList) {
        TaskList = new List<TaskInfo>();
        foreach (var url in urlList) {
            Log($"Trying to schedule url = {url}");
            semaphore.Wait();
            Log("Semaphore acquired");
            ParameterizedThreadStart taskRoutine = obj => {
                try {
                    DoTheThing((string)obj);
                } finally {
                    Log("Releasing semaphore");
                    semaphore.Release();
                }
            };
            var task = new TaskInfo(taskRoutine, url);
            lock (TaskListLock)
                TaskList.Add(task);
        }
        Log("All tasks queued");
    }

    // simulate all processes get hung
    static void DoTheThing(string url) {
        while (true)
            Thread.Sleep(5000);
    }

    static void Log(string msg) {
        Console.WriteLine("{0:HH:mm:ss.fff} Thread {1,2} {2}", DateTime.Now, Thread.CurrentThread.ManagedThreadId.ToString(), msg);
    }
}
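If the third-party call can be wrapped in a task, a co-operative alternative to Abort is to race each item against a timeout and always give the semaphore slot back. A rough sketch, where ProcessUrlAsync is a hypothetical async wrapper around the library call:

// Sketch only: ProcessUrlAsync(url, token) is a hypothetical wrapper around the
// third-party call. A truly hung call is abandoned rather than killed, but the
// semaphore slot is always released so new work can start.
static async Task DoTheThingWithTimeout(string url, SemaphoreSlim semaphore)
{
    await semaphore.WaitAsync();
    try
    {
        using var cts = new CancellationTokenSource();
        var work = ProcessUrlAsync(url, cts.Token);
        var finished = await Task.WhenAny(work, Task.Delay(TimeSpan.FromSeconds(30)));

        if (finished == work)
        {
            await work; // propagate any exception from the wrapped call
            Console.WriteLine(url);
        }
        else
        {
            cts.Cancel(); // best effort; the library may or may not honour it
            Console.WriteLine($"Timed out: {url}");
        }
    }
    finally
    {
        semaphore.Release();
    }
}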

How to implement robust thread monitoring in C#?

I have 2 tasks running in parallel, and here is the task information:
Task 1 - Launch and run the application.
Task 2 - Monitor the application's run duration. If it exceeds 30 minutes, issue a stop command to the Task 1 application and restart both tasks.
The Task 1 application is fairly heavy and prone to memory leaks during longer runs.
How can I implement a robust threading solution for this situation?
using QuickTest;
using System;
using System.Threading;
using System.Threading.Tasks;

namespace TaskParallelExample
{
    internal class Program
    {
        private static void Main(string[] args)
        {
            Parallel.Invoke(RunApplication, MonitorApplication);
        }

        private static void RunApplication()
        {
            Application uftInstance = new Application();
            uftInstance.Launch();
            QuickTest.Test uftTestInstance = uftInstance.Test;
            uftInstance.Open(@"C:\Tasks\Test1");
            uftInstance.Test.Run(); // It may run for more or less than 30 minutes; if it exceeds 30 minutes, MonitorApplication should stop it.
        }

        private static void MonitorApplication()
        {
            Application uftInstance = new Application();
            try
            {
                DateTime uftTestRunMonitor = DateTime.Now;
                int runningTime = (DateTime.Now - uftTestRunMonitor).Minutes;
                while (runningTime <= 30)
                {
                    Thread.Sleep(5000);
                    runningTime = (DateTime.Now - uftTestRunMonitor).Minutes;
                    if (!uftInstance.Test.IsRunning)
                    {
                        break;
                    }
                }
            }
            catch (Exception exception)
            {
                //To-do
            }
            finally
            {
                if (uftInstance.Test.IsRunning)
                {
                    // Assume it is still running and has exceeded 30 minutes
                    uftInstance.Test.Stop();
                    uftInstance.Test.Close();
                    uftInstance.Quit();
                }
            }
        }
    }
}
Thanks,
Ram
Could you use a CancellationTokenSource with timeout set to 30 mins?
var stopAfter30Mins = new CancellationTokenSource(TimeSpan.FromMinutes(30));
Then you would pass that to your worker method:
var task = Task.Run(() => worker(stopAfter30Mins.Token), stopAfter30Mins.Token);
...
static void worker(CancellationToken cancellation)
{
    while (true) // Or until work completed.
    {
        cancellation.ThrowIfCancellationRequested();
        Thread.Sleep(1000); // Simulate work.
    }
}
Note that if the worker task cannot periodically check the cancellation status, there is NO robust way to handle task timeout.
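On the caller side, the timeout then shows up as a cancelled task. A small sketch of observing it, assuming the task and stopAfter30Mins variables from above:

// Sketch: observing the 30-minute timeout from the caller. Wait() wraps the
// OperationCanceledException in an AggregateException.
try
{
    task.Wait();
}
catch (AggregateException ex) when (ex.InnerException is OperationCanceledException)
{
    Console.WriteLine("Worker was cancelled after 30 minutes; restart both tasks here.");
}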
System.Threading.Tasks.Task does the job:
CancellationTokenSource cts = new CancellationTokenSource();
CancellationToken token = cts.Token;

Task myTask = Task.Factory.StartNew(() =>
{
    for (int i = 0; i < 2000; i++)
    {
        token.ThrowIfCancellationRequested();
        // Body of for loop.
    }
}, token);

// Do something else.
// If cancellation is needed:
cts.Cancel();
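For the original requirement (stop after 30 minutes and restart), the token-based approach can be wrapped in a small supervisor loop. This is only a sketch; RunApplication(token) stands in for the UFT launch-and-run logic and is assumed to call ThrowIfCancellationRequested periodically:

// Sketch: restart the run whenever it exceeds 30 minutes. RunApplication(token)
// is a hypothetical stand-in for the launch/run logic and must observe the token.
static async Task SuperviseAsync()
{
    while (true)
    {
        using var cts = new CancellationTokenSource(TimeSpan.FromMinutes(30));
        try
        {
            await Task.Run(() => RunApplication(cts.Token), cts.Token);
            break; // finished within 30 minutes
        }
        catch (OperationCanceledException)
        {
            Console.WriteLine("Run exceeded 30 minutes; stopping and restarting...");
            // Stop/Close/Quit cleanup for the application would go here.
        }
    }
}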

Repeat a task (TPL) in windows service, using ContinueWith

I have a Windows service (written in C#) that uses the Task Parallel Library DLL to perform some parallel tasks (5 tasks at a time).
After the tasks have executed once, I would like to repeat the same tasks on an ongoing basis (hourly), i.e. call the QueuePeek method again.
Do I use a timer, or a counter like I have set up in the code snippet below?
I am using a counter to set up the tasks; once I reach five I exit the loop, but I also use a .ContinueWith to decrement the counter, so my thought was that the counter value would drop below 5 and the loop would continue. However, my ContinueWith seems to execute on the main thread, and the loop then exits.
The call to DecrementTaskCounter using the ContinueWith does not seem to work.
FYI: the Importer class loads some libraries using MEF and does the work.
This is my code sample:
private void QueuePeek()
{
    var list = SetUpJobs();
    while (taskCounter < 5)
    {
        int j = taskCounter;
        Task task = null;
        task = new Task(() =>
        {
            DoLoad(j);
        });
        taskCounter += 1;
        tasks[j] = task;
        task.ContinueWith((t) => DecrementTaskCounter());
        task.Start();
        ds.SetJobStatus(1);
    }
    if (taskCounter == 0)
        Console.WriteLine("Completed all tasks.");
}

private void DoLoad(int i)
{
    ILoader loader;
    DataService.DataService ds = new DataService.DataService();
    Dictionary<int, dynamic> results = ds.AssignRequest(i);
    var data = results.Where(x => x.Key == 2).First();
    int loaderId = (int)data.Value;

    Importer imp = new Importer();
    loader = imp.Run(GetLoaderType(loaderId));

    LoaderProcessor lp = new LoaderProcessor(loader);
    lp.ExecuteLoader();
}

private void DecrementTaskCounter()
{
    Console.WriteLine(string.Format("Decrementing task counter with threadId: {0}", Thread.CurrentThread.ManagedThreadId));
    taskCounter--;
}
I see a few issues with your code that can potentially lead to some hard-to-track-down bugs. First, if you use a counter that all of the tasks can potentially read and write at the same time, use Interlocked. For example:
Interlocked.Increment(ref _taskCounter); // or Interlocked.Decrement(ref _taskCounter);
If I understand what you're trying to accomplish, I think what you want to do is to use a timer that you re-schedule after each group of tasks is finished.
public class Worker
{
    private System.Threading.Timer _timer;
    private int _timeUntilNextCall = 3600000;

    public void Start()
    {
        _timer = new Timer(new TimerCallback(QueuePeek), null, 0, Timeout.Infinite);
    }

    private void QueuePeek(object state)
    {
        int numberOfTasks = 5;
        Task[] tasks = new Task[numberOfTasks];
        for (int i = 0; i < numberOfTasks; i++)
        {
            tasks[i] = new Task(() =>
            {
                DoLoad();
            });
            tasks[i].Start();
        }
        // When all tasks are complete, set to run this method again in x milliseconds
        Task.Factory.ContinueWhenAll(tasks, (t) => { _timer.Change(_timeUntilNextCall, Timeout.Infinite); });
    }

    private void DoLoad() { }
}
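Hooking this into the service is then just a matter of starting the worker once. A hypothetical OnStart following the standard ServiceBase pattern (the _worker field is illustrative):

// Hypothetical wiring inside the Windows service (ServiceBase). The first
// QueuePeek fires immediately, then re-schedules itself an hour after each batch.
private Worker _worker;

protected override void OnStart(string[] args)
{
    _worker = new Worker();
    _worker.Start();
}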

Async Task with sync method call

I'm starting out with the new .NET 4.5 async programming and I ran into the situation in the code below: I have a sync method, but I would like to make several calls to it and have them run in parallel. However, in this code all the calls to the sync method run with id = 10, and I'm not sure why (probably I misunderstood something about this new approach).
class Program
{
    static void Main(string[] args)
    {
        var tasks = new List<Task>();
        for (int i = 0; i < 10; i++)
        {
            var foo = new Foo();
            var fooTask = Task.Run(() => foo.FooNonAsyncMethod(i));
            tasks.Add(fooTask);
        }
        tasks.ForEach(t => t.Wait());
        Console.WriteLine("All Done!");
        Console.ReadLine();
    }
}

public class Foo
{
    public void FooNonAsyncMethod(int id)
    {
        Console.WriteLine("Starting {0}", id);
        Thread.Sleep(4000);
        Console.WriteLine("Ending {0}", id);
    }
}
// Output:
// Starting 10
// Starting 10
// Starting 10
// Starting 10
// Ending 10
// Starting 10
// Starting 10
// Ending 10
// Ending 10
// ...
That's because there is only one variable i, and lambda expressions capture the variable itself, not its value.
You can fix this by using:
for (int i = 0; i < 10; i++)
{
    int newI = i;
    var foo = new Foo();
    var fooTask = Task.Run(() => foo.FooNonAsyncMethod(newI));
    tasks.Add(fooTask);
}
As @Yuriy mentioned, this answer has a lot more info on this particular behaviour: Is there a reason for C#'s reuse of the variable in a foreach?
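If you would rather not introduce the copy variable, another option (shown here only as a sketch) is to pass the loop value to the task as its state argument, using the Task.Factory.StartNew overload that accepts an object state:

// Sketch: pass the current value of i as the task's state instead of capturing it.
for (int i = 0; i < 10; i++)
{
    var foo = new Foo();
    var fooTask = Task.Factory.StartNew(
        state => foo.FooNonAsyncMethod((int)state), i);
    tasks.Add(fooTask);
}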
