Create and execute tasks in a row - C#

I am developing an application in which I want to execute tasks from a single place, so that every time I add a new task it is appended to a queue to be executed. I also want a priority for each task: if I set the priority to HIGH, the task is added to the front of the queue so it is executed next; if I set the priority to LOW, it is added to the end of the queue, and so on.
I thought about using Task and ContinueWith, but I have no clue where to start to build a class that fully handles my needs.
I am sorry for not providing any code, but I hope someone can see what I am getting at and help me. Thank you in advance.

Well, if you didn't need to make room for high-priority tasks, you could make a simple helper class using Task and ContinueWith:
public class SimpleWorkQueue
{
private Task _main = null;
public void AddTask(Action task)
{
if (_main == null)
{
_main = new Task(task);
_main.Start();
}
else
{
Action<Task> next = (t) => task();
_main = _main.ContinueWith(next);
}
}
}
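A quick usage sketch (my own illustration, not part of the original answer): tasks added this way run one after another on the thread pool. Note that AddTask itself is not synchronized, so if several threads add work concurrently you would want a lock around it.
var queue = new SimpleWorkQueue();
queue.AddTask(() => Console.WriteLine("first"));
queue.AddTask(() => Console.WriteLine("second")); // runs only after "first" completes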
If you do need high-priority tasks, you probably need to handle more stuff yourself. Here is a producer/consumer example where all incoming tasks are inserted into a list in AddTask(), and a single worker thread consumes tasks from that list:
public class PrioritizedWorkQueue
{
List<Action> _queuedWork;
object _queueLocker;
Thread _workerThread;
public PrioritizedWorkQueue()
{
_queueLocker = new object();
_queuedWork = new List<Action>();
_workerThread = new Thread(LookForWork);
_workerThread.IsBackground = true;
_workerThread.Start();
}
private void LookForWork()
{
while (true)
{
Action work;
lock (_queueLocker)
{
while (!_queuedWork.Any()) { Monitor.Wait(_queueLocker); }
work = _queuedWork.First();
_queuedWork.RemoveAt(0);
}
work();
}
}
public void AddTask(Action task, bool highPriority)
{
lock (_queueLocker)
{
if (highPriority)
{
_queuedWork.Insert(0, task);
}
else
{
_queuedWork.Add(task);
}
Monitor.Pulse(_queueLocker);
}
}
}
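For illustration (my own addition, not from the original answer), usage would look like this; a high-priority task jumps ahead of anything still waiting in the list:
var queue = new PrioritizedWorkQueue();
queue.AddTask(() => Console.WriteLine("routine work"), highPriority: false);
queue.AddTask(() => Console.WriteLine("urgent work"), highPriority: true); // inserted at the front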


How do I guarantee execution of code only if and when optional main thread task and worker threads are finished?

Background:
I have an application I am developing that deals with a large number of addons for another application. One of its primary uses is to safely modify file records in files with fewer records so that they may be treated as one file (almost as if it is combining the files into one set of records). To do this safely, it keeps track of vital information about those files and the changes made to them so that those changes can be undone if they don't work as expected.
When my application starts, it analyzes those files and keeps essential properties in a cache (to reduce load times). If a file is missing from the cache, the most important data is retrieved and then a background worker must process the file for more information. If a file that was previously modified has been updated with a new version, the UI must confirm this with the user and its modification data must be removed. All of this information, including the modification data, is stored in the cache.
My Problem:
My problem is that neither of these processes is guaranteed to run (the confirmation window or the background file processor). If either of them runs, then the cache must be updated by the main thread. I don't know enough about worker threads, or about which thread runs the BackgroundWorker.RunWorkerCompleted event handler, to decide how to guarantee that the cache updater runs after either (or both) processes have completed.
To sum up: if either process is run, they both must finish and (potentially) wait for the other to be completed before running the cache update code. How can I do this?
ADJUNCT INFO (My current intervention that doesn't seem to work very well):
I have a line in the RunWorkerCompleted handler that waits until the form reference is null before continuing and exiting, but maybe this was a mistake, as it sometimes locks my program up.
SpinWait.SpinUntil(() => overwriteForm == null);
I haven't included any more code because I anticipate that this is more of a conceptual question than a code one. However, if necessary, I can supply code if it helps.
I think a CountDownTask is what you need:
using System;
using System.Threading;
public class Program
{
public class AtomicInteger
{
protected int value = 0;
public AtomicInteger(int value)
{
this.value = value;
}
public int DecrementAndGet()
{
int answer = Interlocked.Decrement(ref value);
return answer;
}
}
public interface Runnable
{
void Run();
}
public class CountDownTask
{
private AtomicInteger count;
private Runnable task;
private Object lk = new Object();
private volatile bool runnable;
private bool cancelled;
public CountDownTask(Int32 count, Runnable task)
{
this.count = new AtomicInteger(count);
this.task = task;
this.runnable = false;
this.cancelled = false;
}
public void CountDown()
{
if (count.DecrementAndGet() == 0)
{
lock (lk)
{
runnable = true;
Monitor.Pulse(lk);
}
}
}
public void Await()
{
lock (lk)
{
while (!runnable)
{
Monitor.Wait(lk);
}
if (cancelled)
{
Console.WriteLine("Sorry! I was cancelled");
}
else {
task.Run();
}
}
}
public void Cancel()
{
lock (lk)
{
runnable = true;
cancelled = true;
Monitor.Pulse(lk);
}
}
}
public class HelloWorldTask : Runnable
{
public void Run()
{
Console.WriteLine("Hello World, I'm last one");
}
}
public static void Main()
{
Thread.CurrentThread.Name = "Main";
Console.WriteLine("Current Thread: " + Thread.CurrentThread.Name);
CountDownTask countDownTask = new CountDownTask(3, new HelloWorldTask());
Thread worker1 = new Thread(() => {
Console.WriteLine("Worker 1 run");
countDownTask.CountDown();
});
Thread worker2 = new Thread(() => {
Console.WriteLine("Worker 2 run");
countDownTask.CountDown();
});
Thread lastThread = new Thread(() => countDownTask.Await());
lastThread.Start();
worker1.Start();
worker2.Start();
//countDownTask.Cancel();
Console.WriteLine("Main Thread Run");
countDownTask.CountDown();
Thread.Sleep(1000);
}
}
Let me explain (you can also refer to Java's CountDownLatch).
1. To ensure a task runs only after the other tasks, we need a Wait function that waits until they are done, so I used:
while(!runnable) {
Monitor.Wait(lk);
}
2. When a task is done, we count down, and if the count reaches zero (meaning all of the tasks are done) we notify the blocked thread to wake up and run the task:
if (count.DecrementAndGet() == 0) {
lock (lk) {
runnable = true;
Monitor.Pulse(lk);
}
}
Read more about volatile. Thanks!
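As a side note (my addition, not part of the original answer), .NET 4 and later ship a built-in System.Threading.CountdownEvent that covers the same count-down/wait pattern, so a hand-rolled class is only needed for extras such as the cancellation and run-a-task behaviour above. A minimal sketch:
using System;
using System.Threading;

class CountdownEventDemo
{
    static void Main()
    {
        using (var countdown = new CountdownEvent(3)) // three signals expected
        {
            for (int i = 1; i <= 3; i++)
            {
                int id = i;
                new Thread(() =>
                {
                    Console.WriteLine("Worker " + id + " done");
                    countdown.Signal();           // roughly CountDown()
                }).Start();
            }

            countdown.Wait();                     // roughly Await()
            Console.WriteLine("All workers finished, run the final task here");
        }
    }
}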
While dung ta van's "CountDownTask" answer isn't quite what I needed, it heavily inspired the solution below (see it for more info). Basically, all I did was add some extra functionality and, most importantly, make it so that each task "votes" on the outcome (true or false). Thanks dung ta van!
To be fair, dung ta van's solution DOES work to guarantee execution which as it turns out isn't quite what I needed. My solution adds the ability to make that execution conditional.
This was my solution which worked:
public enum PendingBool
{
Unknown = -1,
False,
True
}
public interface IRunnableTask
{
void Run();
}
public class AtomicInteger
{
int integer;
public int Value { get { return integer; } }
public AtomicInteger(int value) { integer = value; }
public int Decrement() { return Interlocked.Decrement(ref integer); }
public static implicit operator int(AtomicInteger ai) { return ai.integer; }
}
public class TaskElectionEventArgs
{
public bool VoteResult { get; private set; }
public TaskElectionEventArgs(bool vote) { VoteResult = vote; }
}
public delegate void VoteEventHandler(object sender, TaskElectionEventArgs e);
public class SingleVoteTask
{
private AtomicInteger votesLeft;
private IRunnableTask task;
private volatile bool runTask = false;
private object _lock = new object();
public event VoteEventHandler VoteCast;
public event VoteEventHandler TaskCompleted;
public bool IsWaiting { get { return votesLeft.Value > 0; } }
public PendingBool Result
{
get
{
if (votesLeft > 0)
return PendingBool.Unknown;
else if (runTask)
return PendingBool.True;
else
return PendingBool.False;
}
}
public SingleVoteTask(int numberOfVotes, IRunnableTask taskToRun)
{
votesLeft = new AtomicInteger(numberOfVotes);
task = taskToRun;
}
public void CastVote(bool vote)
{
votesLeft.Decrement();
runTask |= vote;
VoteCast?.Invoke(this, new TaskElectionEventArgs(vote));
if (votesLeft == 0)
lock (_lock)
{
Monitor.Pulse(_lock);
}
}
public void Await()
{
lock(_lock)
{
while (votesLeft > 0)
Monitor.Wait(_lock);
if (runTask)
task.Run();
TaskCompleted?.Invoke(this, new TaskElectionEventArgs(runTask));
}
}
}
Implementing the above solution was as simple as creating the SingleVoteTask in the UI thread and then having each thread affecting the outcome cast a vote.
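For completeness, here is a rough wiring sketch (my own illustration; CacheUpdateTask and the vote values are hypothetical): one thread runs Await(), and each process that may or may not require the cache update casts a vote.
using System;
using System.Threading;

// Hypothetical task; anything implementing IRunnableTask from the code above works.
public class CacheUpdateTask : IRunnableTask
{
    public void Run() => Console.WriteLine("Updating cache...");
}

public static class VotingDemo
{
    public static void Main()
    {
        var voteTask = new SingleVoteTask(2, new CacheUpdateTask());

        // Waiter: runs the cache update only if at least one vote was true.
        var waiter = new Thread(voteTask.Await);
        waiter.Start();

        // Each participating process casts its vote when it finishes.
        new Thread(() => voteTask.CastVote(true)).Start();  // e.g. the background file processor did work
        new Thread(() => voteTask.CastVote(false)).Start(); // e.g. the user declined the overwrite

        waiter.Join();
    }
}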

C# concurrent: Is it a good idea to use many AutoResetEvent?

Suppose there are many threads calling Do(), and only one worker thread handles the actual job.
void Do(Job job)
{
concurrentQueue.Enqueue(job);
// wait for job done
}
void workerThread()
{
while (true)
{
Job job;
if (concurrentQueue.TryDequeue(out job))
{
// do job
}
}
}
Do() should wait until the job is done before returning. So I wrote the following code:
class Task
{
public Job job;
public AutoResetEvent ev;
}
void Do(Job job)
{
using (var ev = new AutoResetEvent(false))
{
concurrentQueue.Enqueue(new Task { job = job, ev = ev });
ev.WaitOne();
}
}
void workerThread()
{
while (true)
{
Task task;
if (concurrentQueue.TryDequeue(out task))
{
// do job
task.ev.Set();
}
}
}
After some tests I found it works as expected. However, I'm not sure whether it is a good idea to allocate that many AutoResetEvents, or whether there is a better way to accomplish this.
Since all clients must wait for a single thread to do the job, there is no real need for a queue. So I suggest using the Monitor class instead, and specifically the Wait/Pulse functionality. It is a bit low level and verbose, though.
class Worker<TResult> : IDisposable
{
private readonly object _outerLock = new object();
private readonly object _innerLock = new object();
private Func<TResult> _currentJob;
private TResult _currentResult;
private Exception _currentException;
private bool _disposed;
public Worker()
{
var thread = new Thread(MainLoop);
thread.IsBackground = true;
thread.Start();
}
private void MainLoop()
{
lock (_innerLock)
{
while (true)
{
Monitor.Wait(_innerLock); // Wait for client requests
if (_disposed) break;
try
{
_currentResult = _currentJob.Invoke();
_currentException = null;
}
catch (Exception ex)
{
_currentException = ex;
_currentResult = default;
}
Monitor.Pulse(_innerLock); // Notify the waiting client that the job is done
}
} // We are done
}
public TResult DoWork(Func<TResult> job)
{
TResult result;
Exception exception;
lock (_outerLock) // Accept only one client at a time
{
lock (_innerLock) // Acquire inner lock
{
if (_disposed) throw new InvalidOperationException();
_currentJob = job;
Monitor.Pulse(_innerLock); // Notify worker thread about the new job
Monitor.Wait(_innerLock); // Wait for worker thread to process the job
result = _currentResult;
exception = _currentException;
// Clean up
_currentJob = null;
_currentResult = default;
_currentException = null;
}
}
// Throw the exception, if occurred, preserving the stack trace
if (exception != null) ExceptionDispatchInfo.Capture(exception).Throw();
return result;
}
public void Dispose()
{
lock (_outerLock)
{
lock (_innerLock)
{
_disposed = true;
Monitor.Pulse(_innerLock); // Notify worker thread to exit loop
}
}
}
}
Usage example:
var worker = new Worker<int>();
int result = worker.DoWork(() => 1); // Accepts a function as argument
Console.WriteLine($"Result: {result}");
worker.Dispose();
Output:
Result: 1
Update: The previous solution is not await-friendly, so here is one that allows proper awaiting. It uses a TaskCompletionSource for each job, stored in a BlockingCollection.
class Worker<TResult> : IDisposable
{
private BlockingCollection<TaskCompletionSource<TResult>> _blockingCollection
= new BlockingCollection<TaskCompletionSource<TResult>>();
public Worker()
{
var thread = new Thread(MainLoop);
thread.IsBackground = true;
thread.Start();
}
private void MainLoop()
{
foreach (var tcs in _blockingCollection.GetConsumingEnumerable())
{
var job = (Func<TResult>)tcs.Task.AsyncState;
try
{
var result = job.Invoke();
tcs.SetResult(result);
}
catch (Exception ex)
{
tcs.TrySetException(ex);
}
}
}
public Task<TResult> DoWorkAsync(Func<TResult> job)
{
var tcs = new TaskCompletionSource<TResult>(job,
TaskCreationOptions.RunContinuationsAsynchronously);
_blockingCollection.Add(tcs);
return tcs.Task;
}
public TResult DoWork(Func<TResult> job) // Synchronous call
{
var task = DoWorkAsync(job);
try { task.Wait(); } catch { } // Swallow the AggregateException
// Throw the original exception, if occurred, preserving the stack trace
if (task.IsFaulted) ExceptionDispatchInfo.Capture(task.Exception.InnerException).Throw();
return task.Result;
}
public void Dispose()
{
_blockingCollection.CompleteAdding();
}
}
Usage example
var worker = new Worker<int>();
int result = await worker.DoWorkAsync(() => 1); // Accepts a function as argument
Console.WriteLine($"Result: {result}");
worker.Dispose();
Output:
Result: 1
From a synchronization perspective this is working fine.
But it seems useless to do it this way. If you want to execute jobs one after the other you can just use a lock:
lock (lockObject) {
RunJob();
}
What is your intention with this code?
There also is an efficiency question because each task creates an OS event and waits on it. If you use the more modern TaskCompletionSource this will use the same thing under the hood if you synchronously wait on that task. You can use asynchronous waiting (await myTCS.Task;) to possibly increase efficiency a bit. Of course this infects the entire call stack with async/await. If this is a fairly low volume operation you won't gain much.
In general I think this would work, although when you say "many" threads are calling Do(), it might not scale well ... suspended threads use resources.
Another problem with this code is that at idle times you will have a "hard loop" in workerThread, which will cause your application to show high CPU utilization. You may want to add this code to workerThread:
if (concurrentQueue.IsEmpty) Thread.Sleep(1);
You might also want to introduce a timeout to the WaitOne call to avoid a log jam.
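A sketch of both suggestions applied to the question's code (the 30-second timeout is an arbitrary example, and Task here is the question's own wrapper class):
void workerThread()
{
    while (true)
    {
        Task task;
        if (concurrentQueue.TryDequeue(out task))
        {
            // do job
            task.ev.Set();
        }
        else
        {
            Thread.Sleep(1); // yield instead of spinning at 100% CPU while idle
        }
    }
}

void Do(Job job)
{
    using (var ev = new AutoResetEvent(false))
    {
        concurrentQueue.Enqueue(new Task { job = job, ev = ev });
        if (!ev.WaitOne(TimeSpan.FromSeconds(30))) // avoid waiting forever
        {
            // handle the timeout (log, retry, ...)
        }
    }
}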

Multiple timer/callbacks — best approach to prevent duplicates and to monitor them

I have a C# console application that I have made into a Windows service, which I would like to run reliably and constantly.
I want to prevent overlap of the same timer firing again
I want to prevent different timers trying to use the same resource at once
I want to be able to monitor the timers and interact with then.
It has a few aspects to it, and each runs very regularly. I have previously read about Task Scheduler vs. a Windows service for running this kind of thing, and have opted for this approach because something is running almost constantly.
TaskType1
TaskType2
TaskType3
TaskType4
I'm using timer callbacks, each with their own, similar to this simplified version:
class Program
{
static PollingService _service;
static void Main()
{
_service = new PollingService();
TimerCallback tc1 = _service.TaskType1;
TimerCallback tc2 = _service.TaskType2;
TimerCallback tc3 = _service.TaskType3A;
TimerCallback tc4 = _service.TaskType3B;
Timer t1 = new Timer(tc1, null, 1000, 5000);
Timer t2 = new Timer(tc2, null, 2000, 8000);
Timer t3 = new Timer(tc3, null, 3000, 11000);
Timer t4 = new Timer(tc4, null, 4000, 13000);
Console.WriteLine("Press Q to quit");
while (Console.ReadKey(true).KeyChar != 'q')
{
}
}
}
class PollingService
{
public void TaskType1(object state)
{
for (int i = 1; i <= 10; i++)
{
Console.WriteLine($"TaskOne numbering {i}");
Thread.Sleep(100);
}
}
public void TaskType2(object state)
{
for (int i = 10; i <= 100; i++)
{
Console.WriteLine($"TaskTwo numbering {i}");
Thread.Sleep(100);
}
}
public void TaskType3A(object state)
{
Increment(200000000);
}
public void TaskType3B(object state)
{
Increment(40000);
}
private void Increment(int startNumber)
{
for (int i = startNumber; i <= startNumber + 1000; i++)
{
Console.WriteLine($"Private {startNumber} numbering {i}");
Thread.Sleep(5);
}
}
}
1. Firstly, I want to ensure these don't get tied up with each other when one sometimes runs long.
E.g. if task one sometimes takes 20 seconds to run, I want to prevent a duplicate timer firing while the previous one might still be running, and the same for all of the timers. E.g. if t2 is running a little longer than usual, don't start another. I've read a little about if (Monitor.TryEnter(lockObject)); is that the best way to handle that requirement?
2. Secondly, if two of them access the same resource (in my case an EF context), such that t3 is already using it and t4 tries to do so, is there a way of asking the timer to wait until the other finishes?
3. Lastly, is there a way I can monitor these timers/callbacks? I'd like to provide a UI to see their state when I have this running as a Windows service. My endgame is a UI where users can see whether a task is running and, if not, trigger it on demand if it isn't due to run for a little while, but in the same breath not create a duplicate while one is running.
I have wondered whether I should've asked these as separate questions, but they seem so entwined with the decision of each other.
If you have to make sure that the callbacks don't overlap, you can use the Timer.Change(int, int) method to stop the timer at the start of the callback and then restart it at the end of the callback. You can also do some magic with a ManualResetEvent for each timer, but it'll get messy.
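A sketch of that Timer.Change idea (my illustration, reusing one of the question's callbacks and intervals): create the timer as one-shot and re-arm it at the end of the callback, so a long run simply delays the next tick instead of overlapping it.
using System.Threading;

class NonOverlappingPolling
{
    private Timer _t1;

    public void Start()
    {
        // Period Timeout.Infinite makes the timer fire only once; it is re-armed manually.
        _t1 = new Timer(TaskType1, null, 1000, Timeout.Infinite);
    }

    private void TaskType1(object state)
    {
        try
        {
            // ... the actual work; it can run long without a second callback starting
        }
        finally
        {
            _t1.Change(5000, Timeout.Infinite); // schedule the next run 5 s after this one ends
        }
    }
}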
I'm not a fan of timers for threading and try to avoid them whenever I can. If you can sacrifice the "each task must run every n seconds" requirement, do it. Use tasks with a cancellation token instead; that will solve your overlap problem. For example:
A.
public class Foo
{
private CancellationTokenSource _cts;
//In case you care about what tasks you have.
private List< Task > _tasks;
public Foo()
{
this._cts = new CancellationTokenSource();
this._tasks = new List< Task >();
this._tasks.Add(Task.Factory.StartNew(this.Method1, this._cts.Token));
this._tasks.Add(Task.Factory.StartNew(this.Method2, this._cts.Token));
this._tasks.Add(Task.Factory.StartNew(this.Method3, this._cts.Token));
this._tasks.Add(Task.Factory.StartNew(this.Method4, this._cts.Token));
}
private void Method1(object state)
{
var token = (CancellationToken) state;
while ( !token.IsCancellationRequested )
{
//do stuff
}
}
private void Method2(object state)
{
var token = (CancellationToken)state;
while (!token.IsCancellationRequested)
{
//do stuff
}
}
private void Method3(object state)
{
var token = (CancellationToken)state;
while (!token.IsCancellationRequested)
{
//do stuff
}
}
private void Method4(object state)
{
var token = (CancellationToken)state;
while (!token.IsCancellationRequested)
{
//do stuff
}
}
public void StopExecution()
{
this._cts.Cancel();
}
}
An EF context will throw an exception if used by more than one thread at a time. There is a way to synchronize it, using lock. It would look something like this, given the example above:
B.
public class Foo
{
private object _efLock;
public Foo()
{
this._efLock = new object();
}
.
.
.
private void MethodX(object state)
{
var token = (CancellationToken)state;
while (!token.IsCancellationRequested)
{
lock(this._efLock)
{
using(.......
}
}
}
}
You'll have to do that in each thread that accesses your EF context. Keep in mind that, again, maintenance gets annoying because of the cognitive load that goes with complex lock scenarios.
I recently developed an application in which I needed multiple threads to access the same EF context. As I mentioned above, the locking got to be too much (and there was a performance requirement), so I devised a solution where each thread adds its object to a common queue, and a separate thread does nothing but pull data from the queue and call into EF. That way the EF context is only ever accessed by one thread. Problem solved. Here is what that would look like given the sample above:
C.
public class Foo
{
private struct InternalEFData
{
public int SomeProperty;
}
private CancellationTokenSource _dataCreatorCts;
private CancellationTokenSource _efCts;
//In case you care about what tasks you have.
private List< Task > _tasks;
private Task _entityFrameworkTask;
private ConcurrentBag< InternalEFData > _efData;
public Foo()
{
this._efData = new ConcurrentBag< InternalEFData >();
this._tasks = new List< Task >();
this._dataCreatorCts = new CancellationTokenSource();
this._efCts = new CancellationTokenSource();
this._entityFrameworkTask = Task.Factory.StartNew(this.ProcessEFData, this._efCts.Token);
this._tasks.Add(Task.Factory.StartNew(this.Method1, this._dataCreatorCts.Token));
this._tasks.Add(Task.Factory.StartNew(this.Method2, this._dataCreatorCts.Token));
.
.
.
}
private void ProcessEFData(object state)
{
var token = (CancellationToken) state;
while ( !token.IsCancellationRequested )
{
InternalEFData item;
if (this._efData.TryTake(out item))
{
using ( var efContext = new MyDbContext() )
{
//Do processing.
}
}
}
}
private void Method1(object state)
{
var token = (CancellationToken) state;
while ( !token.IsCancellationRequested )
{
//Get data from whatever source
this._efData.Add(new InternalEFData());
}
}
private void Method2(object state)
{
var token = (CancellationToken) state;
while ( !token.IsCancellationRequested )
{
//Get data from whatever source
this._efData.Add(new InternalEFData());
}
}
public void StopExecution()
{
this._dataCreatorCts.Cancel();
this._efCts.Cancel();
}
}
When it comes to reading data from executing threads, I generally use a SynchronizationContext. I don't know if it's the right object to use and someone else can probably comment on that. Create a Synchronization object, pass it to your threads and have them update it with the necessary data and post it to your UI/Console thread:
D.
public struct SyncObject
{
public int SomeField;
}
public delegate void SyncHandler(SyncObject s);
public class Synchronizer
{
public event SyncHandler OnSynchronization;
private SynchronizationContext _context;
public Synchronizer()
{
this._context = new SynchronizationContext();
}
public void PostUpdate(SyncObject o)
{
var handleNullRefs = this.OnSynchronization;
if ( handleNullRefs != null )
{
this._context.Post(state => handleNullRefs((SyncObject)state), o);
}
}
}
public class Foo
{
private Synchronizer _sync;
public Foo(Synchronizer s)
{
this._sync = s;
}
private void Method1(object state)
{
var token = (CancellationToken) state;
while ( !token.IsCancellationRequested )
{
//do things
this._sync.PostUpdate(new SyncObject());
}
}
}
Again, that's how I do it, I don't know if it's the proper way.
1. Basically, yes, or an AutoResetEvent.
2. You can stop it, wait for the resource to free up, and then restart it.
3. Keep a list of states associated with your timers, and update those states from the timers (set to running when starting, set to waiting when done, or something along those lines).
1: Likely the best is not to do anything in the timers but start a task, if and when a flag is set or not set. Look at Interlocked (the class) for how to implement that without locking.
2: Monitor. But seriously, why do they share an EF context?
3: Sure. Create performance counters and monitor them. The API has been in Windows for many, many years.
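A sketch of the Interlocked idea from point 1 (class and member names are illustrative): the timer callback only starts work if it can atomically flip a flag from 0 to 1, so a tick that arrives while the previous run is still going simply returns.
using System.Threading;

class GuardedCallback
{
    private int _running; // 0 = idle, 1 = a run is in progress

    public void OnTimer(object state)
    {
        // Atomically set the flag; if it was already 1, another run is still active.
        if (Interlocked.CompareExchange(ref _running, 1, 0) != 0)
            return;

        try
        {
            // ... start the task / do the work
        }
        finally
        {
            Interlocked.Exchange(ref _running, 0); // clear the flag for the next tick
        }
    }
}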

WinForm asynchronously update UI status from console application call

I want to asynchronously update the UI status while doing a long-running task. The program is a console application; however, when I execute the async operations, the UI thread exits soon after the task begins.
How can I make the UI thread wait until my long-running task finishes?
I have simplified my code as below:
public static class Program
{
static void Main()
{
WorkerWrapper wp = new WorkerWrapper();
wp.ProcessData();
}
}
public class WorkerWrapper
{
private RateBar bar;
public void ProcessData()
{
bar = new RateBar();
bar.Show();
Worker wk = new Worker();
wk.WorkProcess += wk_WorkProcess;
Action handler = new Action(wk.DoWork);
var result = handler.BeginInvoke(new AsyncCallback(this.AsyncCallback), handler);
}
private void AsyncCallback(IAsyncResult ar)
{
Action handler = ar.AsyncState as Action;
handler.EndInvoke(ar);
}
private void wk_WorkProcess(object sender, PrecentArgs e)
{
if (e.Precent < 100)
{
bar.Precent = e.Precent;
}
}
}
public class Worker
{
public event EventHandler<PrecentArgs> WorkProcess;
public void DoWork()
{
for (int i = 0; i < 100; i++)
{
WorkProcess(this, new PrecentArgs(i));
Thread.Sleep(100);
}
}
}
public class PrecentArgs : EventArgs
{
public int Precent { get; set; }
public PrecentArgs(int precent)
{
Precent = precent;
}
}
public partial class RateBar : Form
{
public int Precent
{
set
{
System.Windows.Forms.MethodInvoker invoker = () => this.progressBar1.Value = value;
if (this.progressBar1.InvokeRequired)
{
this.progressBar1.Invoke(invoker);
}
else
{
invoker();
}
}
}
public RateBar()
{
InitializeComponent();
}
}
However, in the ProcessData() method, if I add result.AsyncWaitHandle.WaitOne() at the end to wait for my operation to complete, the form freezes.
Is there anything wrong with the way I wait for the thread to complete?
The reason your application exits before your "background threads" complete is that, when there are multiple threads, the application exits as soon as there are no foreground threads left. This is explained in more detail here: http://msdn.microsoft.com/en-us/library/system.threading.thread.isbackground(v=vs.110).aspx
You should add a proper mechanism for waiting for your background threads to complete. There are multiple ways of letting other threads know that a thread is complete; please refer to: How to wait for thread to finish with .NET?
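A minimal sketch of that point (my own illustration): a foreground thread, or an explicit Join, keeps the process alive until the work is done.
using System;
using System.Threading;

static class WaitDemo
{
    static void Main()
    {
        var worker = new Thread(() =>
        {
            Thread.Sleep(2000); // stand-in for the long-running work
            Console.WriteLine("Work finished");
        });
        worker.IsBackground = false; // a foreground thread keeps the process alive on its own
        worker.Start();

        worker.Join(); // or wait for it explicitly before Main returns
        Console.WriteLine("Main can exit now");
    }
}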
You shouldn't block the UI thread waiting for the result, but rather retrieve the result from EndInvoke. Your deadlock probably occurs because you are using both result.AsyncWaitHandle.WaitOne() and EndInvoke; both will block until the result is available.
In my opinion the best option is to not call result.AsyncWaitHandle.WaitOne() and just complete the call in the AsyncCallback:
private void AsyncCallback(IAsyncResult ar)
{
Action handler = ar.AsyncState as Action;
handler.EndInvoke(ar); // for an Action there is no return value; a Func delegate would return its result here
}
More information here. Also, if you are using .NET 4.0 or higher, this sort of thing can be done much more easily with async/await.
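For example, on .NET 4.5 or later the BeginInvoke/EndInvoke pair could be replaced with Task.Run and await; here is a rough sketch reusing the question's types (the same caveat about the form needing a message loop still applies):
public async Task ProcessDataAsync()
{
    bar = new RateBar();
    bar.Show();

    Worker wk = new Worker();
    wk.WorkProcess += wk_WorkProcess;

    // Runs DoWork on a thread-pool thread; the await completes when it finishes,
    // without blocking the thread that created the form.
    await Task.Run(() => wk.DoWork());
}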
I am writing down this solution in the hope that it may help others with the same question.
The key to this problem is to use a new thread to run the RateBar's ShowDialog function.
public void ProcessData()
{
new Thread(() => new RateBar().ShowDialog()).Start();
Worker wk = new Worker();
wk.WorkProcess += wk_WorkProcess;
Action handler = new Action(wk.DoWork);
var result = handler.BeginInvoke(new AsyncCallback(this.AsyncCallback), handler);
}

How to close the opened thread after it ends?

I wonder how to abort my thread with Thread.Abort() after my function ends.
My application runs files, and each file is opened in a different thread:
int _counter;
int _parallelThreads;
Queue _queue = new Queue();
public void transmit()
{
while (_counter < _parallelThreads)
{
lock (_queue)
{
string file = (string)_queue.Dequeue();
ThreadStart ts = delegate { processFile(file); };
Thread thread = new Thread(ts);
thread.IsBackground = true;
thread.Start();
_counter++;
}
}
}
private void processFile(string file)
{
WiresharkFile wf = new WiresharkFile(file, _selectedOutputDevice, 1);
wf.OnFinishPlayEvent += wf_OnFinishPlayEvent;
wf.sendBuffer();
}
and this is the event raised when my file has finished:
private void wf_OnFinishPlayEvent(MyClass myClass)
{
// here i want to abort my thread
}
The reason I want to abort my thread when it finishes is that I think this is the cause of my memory leak when I open a lot of parallel threads and run them over and over (my application's memory usage reads more than 1 GB).
When you abort a thread, a lot of unexpected things can go wrong, particularly when you work with files. When I had to do that (for example, for a "cancel" button) I used a little trick.
I had an IsCanceled flag in a scope both threads can see, which gets set to true; the worker thread, every few statements, checks that flag, closes all open files, and ends itself.
This might not work well for your situation, depending on the wf.sendBuffer() logic. Let me know.
Example:
private void processFile(string file)
{
WiresharkFile wf = new WiresharkFile(file, _selectedOutputDevice, 1);
wf.OnFinishPlayEvent += wf_OnFinishPlayEvent;
if(IsCanceled == false)
{
wf.sendBuffer();
}
}
And if the sendBuffer() method logic is too long, then:
public void sendBuffer()
{
// some logic
if(IsCanceled)
{
// close open streams
return;
}
// some logic
}
As for the flag itself, a singleton class could do just fine for that, or a class that all the other classes know about:
public class Singleton
{
private static Singleton instance;
private bool isCanceled;
private Singleton()
{
isCanceled = false;
}
public static Singleton Instance
{
get
{
if (instance == null)
{
instance = new Singleton();
}
return instance;
}
}
public bool IsCanceled
{
get
{
return isCanceled;
}
set
{
isCanceled = value;
}
}
}
Notice that the singleton class is open to everyone; you might want to use a class known only by the threads that need to check it. That depends on your security needs.
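One detail worth adding (my note, not part of the original answer): a plain bool read in a loop by another thread may not see the update promptly, so marking the field volatile (or switching to a CancellationTokenSource) is safer. A minimal variant of the flag:
public class CancellationFlag
{
    // volatile ensures a write on one thread becomes visible to reads on other threads
    private volatile bool isCanceled;

    public bool IsCanceled
    {
        get { return isCanceled; }
        set { isCanceled = value; }
    }
}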
You should not abort the threads; a thread quits automatically when the code in it finishes. Maybe you just want to wait for the thread to finish and do something else after that.
You can use a list to store the threads, and use Thread.Join() to wait for all the threads to end.
List<Thread> threadList = new List<Thread>();
public void transmit()
{
while (_counter < _parallelThreads)
{
lock (_queue)
{
string file = (string)_queue.Dequeue();
ThreadStart ts = delegate { processFile(file); };
Thread thread = new Thread(ts);
thread.IsBackground = true;
threadList.Add(thread); //add thread to list
thread.Start();
_counter++;
}
}
//wait threads to end
foreach(Thread t in threadList)
t.Join();
}
private void processFile(string file)
{
WiresharkFile wf = new WiresharkFile(file, _selectedOutputDevice, 1);
wf.OnFinishPlayEvent += wf_OnFinishPlayEvent;
wf.sendBuffer();
}
