I am starting a TcpListener like this, and when the job is done I close the socket.
I would like to know if the thread which I start like this
ThreadPool.QueueUserWorkItem(ConnectClientsThredProc, args);
is going to clean itself up, so that I don't need any external control over it.
Would anyone be so kind as to explain whether or not I have to worry about it?
Thank you!
class TCPListenerManager
{
TcpListener tcpListener;
HostListenerItem hostListener;
private bool _isServerWorking = false;
public TCPListenerManager(HostListenerItem hostListenerItem)
{
hostListener = hostListenerItem;
tcpListener = new TcpListener(IPAddress.Parse(hostListenerItem.IP4), hostListenerItem.Port);
var t = Task.Factory.StartNew(async () =>
{
await StartAsync(hostListenerItem.ClientsMax);
});
}
public async Task StartAsync(int clientsMax)
{
tcpListener.Start();
_isServerWorking = true;
for (int i = 0; i < clientsMax; i++)
{
if (_isServerWorking)
{
ServerConnectedEventArgs args = new ServerConnectedEventArgs();
args.TcpClient = await tcpListener.AcceptTcpClientAsync();
args.HostListener = hostListener;
OnServerConnected(args);
ThreadPool.QueueUserWorkItem(ConnectClientsThredProc, args);
}
}
}
private void ConnectClientsThredProc(object obj)
{
var args = (ServerConnectedEventArgs)obj;
if (args.TcpClient.Connected)
{
// Do some job and disconnect
args.TcpClient.Client.Close();
args.TcpClient.Client = null;
}
}
}
When ConnectClientsThredProc exits, the thread is not "gone" but returned to the pool (that's why the thread pool exists in the first place). Either way, you should not have to care about that, unless you have a long-running task performed in ConnectClientsThredProc. If it is long-running, better not to use a thread pool thread, but to start a new one (via Task.Factory.StartNew + TaskCreationOptions.LongRunning, for example).
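Just to illustrate that last option, a rough sketch might look like this (it reuses ConnectClientsThredProc and args from your snippet and assumes using System.Threading and System.Threading.Tasks; this is only one possible overload):
// Sketch: ask the scheduler for a dedicated thread instead of a thread-pool thread.
var longRunning = Task.Factory.StartNew(
    () => ConnectClientsThredProc(args),
    CancellationToken.None,
    TaskCreationOptions.LongRunning, // hint: do not consume a pool thread
    TaskScheduler.Default);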
Also, you use Task.Factory, async/await and ThreadPool.QueueUserWorkItem, all mixed together in a short piece of code. Maybe you need to understand a bit better what those tools are and what the similarities and differences between them are (especially async/await). For example, what is the reason for this:
var t = Task.Factory.StartNew(async () =>
{
await StartAsync(hostListenerItem.ClientsMax);
});
You start a task/thread, inside which you start yet another task and then wait for it to exit - that makes little sense.
Instead of ThreadPool.QueueUserWorkItem you might use Task.Run to the same effect.
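For example (a sketch only, reusing the names from your snippet):
// Instead of: ThreadPool.QueueUserWorkItem(ConnectClientsThredProc, args);
Task.Run(() => ConnectClientsThredProc(args));
// Task.Run also queues the work to the thread pool, but it hands you back a Task
// that you can await or inspect for exceptions if you ever need to.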
I want to check whether a port is open on a group of IPs. I do not want to check the IPs one after another sequentially; I want to do it in parallel, with the user determining the number of threads. The number of IPs is divided among the threads:
private async void button3_Click(object sender, EventArgs e)
{
await Task.Run(() =>
{
var lines = File.ReadAllLines("ip.txt");
int cc = (lines.Length) / (int.Parse(thread.Text));
if (lines.Length % int.Parse(thread.Text) == 0)
{
for (int s = 0; s < lines.Length; s = s + cc)
{
Parallel.For(0, 1, a =>
{
checkopen(s, (s + cc));
});
}
}
else
{
MessageBox.Show("enter true thread num");
}
});
}
This is the code that checks for an open port:
void checkopen(int first,int last)
{
int port = Convert.ToInt32(portnum.Text);
var lines = File.ReadAllLines("ip.txt");
for (int i = first; i < last; ++i)
{
var line = lines[i];
using (TcpClient tcpClient = new TcpClient())
{
try
{
tcpClient.Connect(line, port);
this.Invoke(new Action(() =>
{
listBox1.Items.Add(line); // open port
}));
}
catch (Exception)
{
this.Invoke(new Action(() =>
{
listBox2.Items.Add(line); //close port
}));
}
}
}
}
I see this problem every day.
There is no point putting IO-bound operations in Parallel.For; it's not designed for IO work or for the async pattern (and trust me, the async pattern is what you want here).
Why do we want the async await pattern?
Because it is designed to give threads back while it's awaiting an IO completion port or an awaitable workload.
When you run IO work in Parallel.For/Parallel.ForEach like this, the task scheduler just won't give you threads to block; it uses all sorts of heuristics to work out how many threads you should have, and it takes a dim view of blocking them.
So what should we use?
The async and await pattern.
Why?
Because we can let IO be IO: the system creates an IO completion port, .NET gives the thread back to the thread pool until the completion port calls back, and then the method continues.
So, there are many options for this, but first and foremost, await the awaitable async methods of the libraries you are using.
From there you can either create a list of tasks and use something like an awaitable SemaphoreSlim to limit concurrency plus a WhenAll (a sketch of this follows below).
Or you could use something like an ActionBlock out of TPL Dataflow, which is designed to work with both CPU and IO bound workloads.
The real-world benefits can't be overstated. Your Parallel.For approach will just run a handful of threads and block them; an async version will be able to run hundreds of operations simultaneously.
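For the SemaphoreSlim route, a rough sketch applied to your port check might look like this (ScanAsync and CheckPortAsync are names I made up for illustration, not part of any library; maxConcurrency would be your user-supplied thread count):
public async Task ScanAsync(IEnumerable<string> ips, int port, int maxConcurrency)
{
    using (var throttle = new SemaphoreSlim(maxConcurrency))
    {
        var tasks = ips.Select(async ip =>
        {
            await throttle.WaitAsync(); // limit how many checks run at once
            try { return (ip: ip, open: await CheckPortAsync(ip, port)); }
            finally { throttle.Release(); }
        });
        var results = await Task.WhenAll(tasks); // wait for every check to finish
        // results now holds (ip, open) pairs you can push into your list boxes
    }
}

// Hypothetical helper: tries to connect and reports whether the port answered.
static async Task<bool> CheckPortAsync(string ip, int port)
{
    using (var client = new TcpClient())
    {
        try { await client.ConnectAsync(ip, port); return true; }
        catch (SocketException) { return false; }
    }
}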
Dataflow example
You can get the NuGet package (System.Threading.Tasks.Dataflow) here.
public async Task DoWorkLoads(List<WorkLoad> workloads)
{
var options = new ExecutionDataflowBlockOptions
{
// add pepper and salt to taste
MaxDegreeOfParallelism = 100,
EnsureOrdered = false
};
// create an action block
var block = new ActionBlock<WorkLoad>(MyMethodAsync, options);
// Queue them up
foreach (var workLoad in workloads)
block.Post(workLoad );
// wait for them to finish
block.Complete();
await block.Completion;
}
...
// Notice we are using the async / await pattern
public async Task MyMethodAsync(WorkLoad workLoad)
{
try
{
Console.WriteLine("Doing some IO work async);
await DoIoWorkAsync;
}
catch (Exception)
{
// probably best to add some error checking some how
}
}
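A call site might look like this (the WorkLoad type and how you populate the list are assumptions carried over from the sketch above):
// Somewhere in an async method:
var workloads = new List<WorkLoad>();
// ... populate workloads from your own source ...
await DoWorkLoads(workloads);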
I am using async/await with the Task.Factory.StartNew method.
public async Task<JobDto> ProcessJob(JobDto jobTask)
{
try
{
var T = Task.Factory.StartNew(() =>
{
JobWorker jobWorker = new JobWorker();
jobWorker.Execute(jobTask);
});
await T;
}
I am calling this method inside a loop, like this:
for(int i=0; i < jobList.Count(); i++)
{
tasks[i] = ProcessJob(jobList[i]);
}
What I notice is that new tasks open up inside Process Explorer and they also start working (based on the log file). However, out of 10, sometimes only 8 or 7 finish; the rest of them just never come back.
Why would that be happening?
Are they timing out? Where can I set a timeout for my tasks?
UPDATE
Basically, above, I would like each task to start running as soon as it is called and then wait for the response at the await T keyword. I am assuming that once they finish, each of them will come back at await T and do the next action. I am already seeing this result for 7 out of 10 tasks, but 3 of them are not coming back.
Thanks
It is hard to say what the issue is without the rest of the code, but your code can be simplified by making ProcessJob synchronous and then calling Task.Run with it.
public JobDto ProcessJob(JobDto jobTask)
{
JobWorker jobWorker = new JobWorker();
return jobWorker.Execute(jobTask);
}
Start tasks and wait for all tasks to finish. Prefer using Task.Run rather than Task.Factory.StartNew as it provides more favourable defaults for pushing work to the background. See here.
for(int i = 0; i < jobList.Count(); i++)
{
var job = jobList[i]; // copy to a local so the lambda does not capture the loop variable i
tasks[i] = Task.Run(() => ProcessJob(job));
}
try
{
await Task.WhenAll(tasks);
}
catch(Exception ex)
{
// handle exception
}
First, let's make a reproducible version of your code. This is NOT the best way to achieve what you are doing, but it shows you what is happening in your code!
I'll keep the code almost the same as yours, except that I'll use a simple int rather than your JobDto, and on completion of the job's Execute() I'll write to a file that we can verify later. Here's the code:
public class SomeMainClass
{
public void StartProcessing()
{
var jobList = Enumerable.Range(1, 10).ToArray();
var tasks = new Task[10];
//[1] start 10 jobs, one-by-one
for (int i = 0; i < jobList.Count(); i++)
{
tasks[i] = ProcessJob(jobList[i]);
}
//[4] here we have 10 awaitable Task in tasks
//[5] do all other unrelated operations
Thread.Sleep(1500); //assume it works for 1.5 sec
// Task.WaitAll(tasks); //[6] wait for tasks to complete
// The PROCESS IS COMPLETE here
}
public async Task ProcessJob(int jobTask)
{
try
{
//[2] start job in a ThreadPool, Background thread
var T = Task.Factory.StartNew(() =>
{
JobWorker jobWorker = new JobWorker();
jobWorker.Execute(jobTask);
});
//[3] await here will keep context of calling thread
await T; //... and release the calling thread
}
catch (Exception) { /*handle*/ }
}
}
public class JobWorker
{
static object locker = new object();
const string _file = @"C:\YourDirectory\out.txt";
public void Execute(int jobTask) //on complete, writes in file
{
Thread.Sleep(500); //let's assume does something for 0.5 sec
lock(locker)
{
File.AppendAllText(_file,
Environment.NewLine + "Writing the value-" + jobTask);
}
}
}
After running just the StartProcessing(), this is what I get in the file
Writing the value-4
Writing the value-2
Writing the value-3
Writing the value-1
Writing the value-6
Writing the value-7
Writing the value-8
Writing the value-5
So, 8 out of 10 jobs completed. Obviously, every time you run this, the number and order might change. But the point is, not all of the jobs completed!
Now, if I un-comment the step [6] Task.WaitAll(tasks);, this is what I get in my file
Writing the value-2
Writing the value-3
Writing the value-4
Writing the value-1
Writing the value-5
Writing the value-7
Writing the value-8
Writing the value-6
Writing the value-9
Writing the value-10
So, all my jobs completed here!
Why the code behaves like this is already explained in the code comments. The main thing to note is that your tasks run on ThreadPool-based background threads, so if you do not wait for them, they will be killed when the MAIN process ends and the main thread exits!!
If you still don't want to await the tasks there, you can return the list of tasks from this first method and await the tasks at the very end of the process, something like this
public Task[] StartProcessing()
{
...
for (int i = 0; i < jobList.Count(); i++)
{
tasks[i] = ProcessJob(jobList[i]);
}
...
return tasks;
}
//in the MAIN METHOD of your application/process
var tasks = new SomeMainClass().StartProcessing();
// do all other stuffs here, and just at the end of process
Task.WaitAll(tasks);
Hope this clears all confusion.
It's possible your code is swallowing exceptions. I would add a ContinueWith call to the end of the part of the code that starts the new task. Something like this untested code:
var T = Task.Factory.StartNew(() =>
{
JobWorker jobWorker = new JobWorker();
jobWorker.Execute(jobTask);
}).ContinueWith(tsk =>
{
var flattenedException = tsk.Exception.Flatten();
Console.WriteLine("Exception! " + flattenedException);
}, TaskContinuationOptions.OnlyOnFaulted); // Only runs if the task faulted
Another possibility is that something in one of the tasks is timing out (like you mentioned) or deadlocking. To track down whether a timeout (or maybe deadlock) is the root cause, you could add some timeout logic (as described in this SO answer):
int timeout = 1000; //set to something much greater than the time it should take your task to complete (at least for testing)
var task = TheMethodWhichWrapsYourAsyncLogic(cancellationToken);
if (await Task.WhenAny(task, Task.Delay(timeout, cancellationToken)) == task)
{
// Task completed within timeout.
// Consider that the task may have faulted or been canceled.
// We re-await the task so that any exceptions/cancellation is rethrown.
await task;
}
else
{
// timeout/cancellation logic
}
Check out the documentation on exception handling in the TPL on MSDN.
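For reference, a small sketch of what observing those exceptions looks like with a blocking wait; the TPL wraps everything in an AggregateException (tasks here is the array from the snippet above):
try
{
    Task.WaitAll(tasks); // blocking wait: any faults surface here as one AggregateException
}
catch (AggregateException ae)
{
    foreach (var inner in ae.Flatten().InnerExceptions)
        Console.WriteLine("Task failed: " + inner.Message);
}
// With 'await Task.WhenAll(tasks)' only the first exception is rethrown directly,
// but the rest are still available on the Exception property of the task WhenAll returns.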
I'm having an odd problem with HttpClient and timers. I have a large number of objects (up to 10,000) that post to a web service. These objects are on timers and post to the service at some time n after creation. The problem is that the post stalls until all of the timers have started. See the code below for an example.
Q: Why does the Post hang until all Timers have started? How can it be fixed so that the post functions correctly while the other Timers start?
public class MyObject
{
public void Run()
{
var result = Post(someData).Result;
DoOtherStuff();
}
}
static async Task<string> Post(string data)
{
using (var client = new HttpClient())
{
//Hangs here until all timers are started
var response = await client.PostAsync(new Uri(url), data).ConfigureAwait(continueOnCapturedContext: false);
var text = await response.Content.ReadAsStringAsync().ConfigureAwait(continueOnCapturedContext: false);
return text;
}
}
static void Main(string[] args)
{
for (int i = 0; i < 1000; i++)
{
TimeSpan delay = TimeSpan.FromSeconds(1);
if (i % 2 == 0) delay = TimeSpan.FromDays(1);
System.Timers.Timer timer = new System.Timers.Timer();
timer.AutoReset = false;
timer.Interval = delay.TotalMilliseconds;
timer.Elapsed += (x, y) =>
{
MyObject o = new MyObject();
o.Run();
};
timer.Start();
}
Console.ReadKey();
}
Because you're using up all the ThreadPool threads.
There's a lot wrong with your sample code. You're killing any chance of having reasonable performance, not to mention that the whole thing is inherently unstable.
You're creating a thousand timers in a loop. You're not keeping a reference to any of them, so they will be collected the next time the GC runs - so I'd expect that in practice, very few of them will actually run, unless there's very little memory allocated until they actually get to run.
The timer's Elapsed event will be invoked on a ThreadPool thread. In that thread, you synchronously wait for a bunch of asynchronous calls to complete. That means you're now wasting a thread pool thread, and completely wasting the underlying asynchronicity of the asynchronous method.
The continuation to the asynchronous I/O will be posted to the ThreadPool as well - however, the ThreadPool is full of timer callbacks. It will slowly start creating more and more threads to accommodate the amount of work scheduled, until it is finally able to execute the first callback from the asynchronous I/O and slowly untangles itself. At this point, you likely have more than 1000 threads, and are showing a complete misunderstanding of how to do asynchronous programming.
One way (still rather bad) to fix both problems is to simply make use of asynchronicity all the time:
public class MyObject
{
public async Task Run()
{
var result = await Post(someData);
DoOtherStuff();
}
}
static async Task<string> Post(string data)
{
using (var client = new HttpClient())
{
//Hangs here until all timers are started
var response = await client.PostAsync(new Uri(url), new StringContent(data)).ConfigureAwait(continueOnCapturedContext: false);
var text = await response.Content.ReadAsStringAsync().ConfigureAwait(continueOnCapturedContext: false);
return text;
}
}
static void Main(string[] args)
{
var tasks = new List<Task>();
for (int i = 0; i < 1000; i++)
{
TimeSpan delay = TimeSpan.FromSeconds(1);
if (i % 2 == 0) delay = TimeSpan.FromDays(1);
tasks.Add(Task.Delay(delay).ContinueWith((_) => new MyObject().Run()).Unwrap()); // Unwrap so WaitAll waits for Run to finish, not just to start
}
Task.WaitAll(tasks.ToArray());
Console.WriteLine("Work done");
Console.ReadKey();
}
A much better way would be to implement some scheduler that handles dispatching the asynchronous I/O with the throttling you need. You probably want to limit the number of concurrent requests or something like that, rather than running the requests at pre-defined intervals (and ignoring the fact that some requests might take very long, time out, or the like).
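A very rough sketch of that throttling idea, using one shared HttpClient and a SemaphoreSlim (the limit of 50 and the payload shape are arbitrary assumptions, not requirements):
static readonly HttpClient SharedClient = new HttpClient();     // reuse one client rather than one per call
static readonly SemaphoreSlim Throttle = new SemaphoreSlim(50); // at most 50 requests in flight

static async Task<string> ThrottledPostAsync(string url, string data)
{
    await Throttle.WaitAsync();
    try
    {
        var response = await SharedClient.PostAsync(new Uri(url), new StringContent(data))
            .ConfigureAwait(false);
        return await response.Content.ReadAsStringAsync().ConfigureAwait(false);
    }
    finally
    {
        Throttle.Release();
    }
}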
As mentioned in another reply, that Result property is the problem: when you use it, async becomes sync. If you want to run async operations in console or Windows service applications, try the Nito AsyncEx library. It creates an AsyncContext. You can then change void Run to Task Run, which is awaitable and doesn't need the blocking Result property, and in that case await Post will work inside the Run method.
static void Main(string[] args)
{
AsyncContext.Run(async () =>
{
var data = await ...;
});
}
As you're running a console application, which has no UI SynchronizationContext (await continuations simply resume on ThreadPool threads), you shouldn't really be experiencing the "hanging" feeling you would get in a UI application. I assume it's because Post takes longer to return than allocating 1000 timers does.
In order for your method to run async, it has to go "async all the way". Using the Task.Result property, as mentioned before, will simply block on the asynchronous operation until it completes.
Let's see what we need to do for this to be "async all the way":
First, let's change Run from void to async Task so we can await the Post method:
public async Task Run()
{
var result = await Post(someData);
DoOtherStuff();
}
Now, since Run has become awaitable, as it returns a Task, we can turn Timer.Elapsed into an async event handler and await Run.
static void Main(string[] args)
{
for (int i = 0; i < 1000; i++)
{
TimeSpan delay = TimeSpan.FromSeconds(1);
if (i % 2 == 0) delay = TimeSpan.FromDays(1);
System.Timers.Timer timer = new System.Timers.Timer();
timer.AutoReset = false;
timer.Interval = delay.TotalMilliseconds;
timer.Elapsed += async (x, y) =>
{
MyObject o = new MyObject();
await o.Run();
};
timer.Start();
}
Console.ReadKey();
}
That's it, now we flow async all the way down to the HTTP request and back.
It's because the timers are set up very quickly, so they have all finished setting up by the time PostAsync completes. Try putting a Thread.Sleep(1000) after timer.Start(); this will slow down the setup of your timers and you should see some PostAsync executions complete.
By the way, Task.Result is a blocking operation, which can cause a deadlock when run from a GUI application. There is more information in this article.
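For what it's worth, a minimal sketch of the deadlock that article describes (this assumes a UI framework such as WinForms or WPF, and a version of Post without ConfigureAwait(false)):
// UI thread: blocks here, waiting for the task to finish...
private void Button_Click(object sender, EventArgs e)
{
    var text = Post("some data").Result; // blocks the UI thread
}
// ...while the await inside Post wants to resume on that same captured UI
// SynchronizationContext, which is stuck in .Result -> deadlock.
// Going async all the way (or using ConfigureAwait(false) inside Post) avoids it.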
The main idea here is to fetch some data from somewhere, when it's fetched start writing it, and then prepare the next batch of data to be written, while waiting for the previous write to be complete.
I know that a Task cannot be restarted or reused (nor should it be), but I am trying to find a way to do something like this:
//The "WriteTargetData" method should take the "data" variable
//created in the loop below as a parameter
//WriteData basically does a shedload of MongoDB upserts in a separate thread,
//it takes approx. 20-30 secs to run
var task = new Task(() => WriteData(somedata));
//GetData also takes some time.
foreach (var data in queries.Select(GetData))
{
if (task.Status != TaskStatus.Running)
{
//start task with "data" as a parameter
//continue the loop to prepare the next batch of data to be written
}
else
{
//wait for task to be completed
//"restart" task
//continue the loop to prepare the next batch of data to be written
}
}
Any suggestion appreciated ! Thanks. I don't necessarily want to use Task, I just think it might be the way to go.
This may be oversimplifying your requirements, but would simply "waiting" for the previous task to complete work for you? You can use Task.WaitAny and Task.WaitAll to wait for previous operations to complete.
pseudo code:
// Method that makes calls to fetch and write data.
public async Task DoStuff()
{
Task currTask = null;
object somedata = await FetchData();
while (somedata != null)
{
// Wait for previous task.
if (currTask != null)
Task.WaitAny(currTask);
currTask = WriteData(somedata);
somedata = await FetchData();
}
}
// Whatever method fetches data.
public Task<object> FetchData()
{
var data = new object();
return Task.FromResult(data);
}
// Whatever method writes data.
public Task WriteData(object somedata)
{
return Task.Factory.StartNew(() => { /* write data */});
}
The Task class is not designed to be restarted, so you need to create a new task and run the body with the same parameters. Also, I do not see where you actually start a task with the WriteData function in its body; as written, the check if (task.Status != TaskStatus.Running) will never see a running task. As far as I know there are only the Task and Thread classes, where a Task is just the abstraction of an action that is scheduled by the TaskScheduler and executed on different threads (when we are talking about the common task scheduler, the one you get from TaskFactory.Scheduler), and the number of threads is roughly equal to the number of processor cores.
As for your business app: why do you wait for the execution of WriteData? Would it not be a lot easier to gather all the data and then submit it in one big write?
Something like this?
public void Do()
{
var task = StartTask(500);
var array = new[] {1000, 2000, 3000};
foreach (var data in array)
{
if (task.IsCompleted)
{
task = StartTask(data);
}
else
{
task.Wait();
task = StartTask(data);
}
}
}
private Task StartTask(int data)
{
var task = new Task(DoSmth, data);
task.Start();
return task;
}
private void DoSmth(object time)
{
Thread.Sleep((int) time);
}
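And if you prefer the "gather everything, then one big write" idea mentioned above, a minimal sketch could be (WriteAll is a hypothetical bulk-upsert method; queries and GetData are from your snippet):
public void GatherThenWrite()
{
    // Prepare all the batches first...
    var allData = queries.Select(GetData).ToList();
    // ...then submit them in a single bulk write at the end.
    WriteAll(allData);
}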
You can use a thread and an AutoResetEvent. I have code like this for several different threads in my program:
These are variable declarations that belong to the main program.
public AutoResetEvent StartTask = new AutoResetEvent(false);
public bool IsStopping = false;
public Thread RepeatingTaskThread;
Somewhere in your initialization code:
RepeatingTaskThread = new Thread(new ThreadStart(RepeatingTaskProcessor)) { IsBackground = true };
RepeatingTaskThread.Start();
Then the method that runs the repeating task would look something like this:
private void RepeatingTaskProcessor() {
// Keep looping until the program is going down.
while (!IsStopping) {
// Wait to receive notification that there's something to process.
StartTask.WaitOne();
// Exit if the program is stopping now.
if (IsStopping) return;
// Execute your task
PerformTask();
}
}
If there are several different tasks you want to run, you can add a variable that would indicate which one to process and modify the logic in PerformTask to pick which one to run.
I know that it doesn't use the Task class, but there's more than one way to skin a cat & this will work.
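A rough sketch of that "which task to run" variable (the enum and field names are invented for illustration):
// Set by whoever calls StartTask.Set(), read by the worker loop.
enum TaskKind { None, Scan, Cleanup }
volatile TaskKind _pendingTask = TaskKind.None;

private void PerformTask()
{
    switch (_pendingTask)
    {
        case TaskKind.Scan:    /* run the scan */    break;
        case TaskKind.Cleanup: /* run the cleanup */ break;
    }
}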
I am adding some data to a list and it might take a while to do this. Therefore I perform this action asynchronously. I do it like this:
ScanDelegate worker = StartScan;
AsyncCallback completedCallback = ScanCompletedCallback;
lock (_sync)
{
var async = AsyncOperationManager.CreateOperation(null);
worker.BeginInvoke(completedCallback, async);
}
The information added in StartScan() is correctly added to the list. When the scan is complete, I want to perform a different scan, depending on the data in the list. So I start the different scan in the ScanCompletedCallback() method. But at this point, my list is empty. I'm guessing that this is because the callback method is invoked when the worker has been started, and not when it returns.
Is this true?
And if it is, how can I know when my worker has completed its tasks?
Edit: I can't get my head around this. It doesn't make sense. I came to think of the list I am adding to: I couldn't just add to it, I had to wrap the add in a Dispatcher call. This must be the problem, right? Is there a way I can make my async method StartScan() wait for this Dispatcher?
Thanks in advance!
StartScan()
private void StartScan()
{
//For testing
for (int t = 0; t < 1; t++)
{
var app2 = Application.Current;
if (app2 != null)
{
app2.Dispatcher.BeginInvoke(DispatcherPriority.Background,
new DispatcherOperationCallback(AddComputerToList),
new ComputerModel()
{
HostName = "Andreas-PC",
IpAddress = "192.168.1.99",
ActiveDirectoryPath = "ikke i AD",
Name = "Andreas-PC",
OperatingSystem = "Microsoft Windows 7 Enterprise"
});
}
}
}
ScanCompletedCallback()
private void ScanCompletedCallback(IAsyncResult ar)
{
var worker = (ScanDelegate)((AsyncResult)ar).AsyncDelegate;
var async = (AsyncOperation)ar.AsyncState;
worker.EndInvoke(ar);
lock (_sync)
{
IsScanning = false;
}
var completedArgs = new AsyncCompletedEventArgs(null, false, null);
async.PostOperationCompleted(e => OnScanCompleted((AsyncCompletedEventArgs)e), completedArgs);
}
AddComputerToList()
private object AddComputerToList(Object objComputer)
{
var computer = objComputer as ComputerModel;
if (computer != null)
{
ComputersList.Add(computer);
OnPropertyChanged("ComputersList");
}
return null;
}
As a direct answer to your question: the callback (ScanCompletedCallback) will occur when the worker delegate (StartScan) completes, not when it has merely been started.
If you have a list that you believe should have some data in it at that point, then there appears to be something else wrong; posting some more code may help with this.