I have an application that uses timers to occasionally run monitoring tasks on secondary threads. Some of these cleanup tasks take a lot of time, and I would like to be able to abort them (gracefully if possible) when the user ends the program.
Is there any way to abort the thread programmatically, as I can with Thread.Abort(), or would I have to add a flag indicating that the thread should finish and check for it in various places in the code that is started by the timer?
You can stop the timer before its callback executes by calling Change, but once the callback starts executing you should use an application-level flag to allow your code to exit.
As a side note, you shouldn't use Thread.Abort() unless you are absolutely 100% sure that you know the state the thread will be left in. It can seriously destabilize your application in strange ways.
There is no way to know ahead of time which Thread a Threading.Timer callback will run on, so there is no general way to abort it. It is possible to have the callback itself communicate the Thread instance, but that opens up a couple of race conditions.
Note: in general, using Abort is bad practice. It's a fairly reliable way to end up with hard-to-detect deadlocks and/or resource leaks. It's much better to use a passive mechanism like CancellationToken.
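For example, a cooperative version of a timer-driven monitor might look roughly like this (a sketch only; the class and the callback body are made up):
using System;
using System.Threading;

class MonitoringService
{
    private readonly CancellationTokenSource _cts = new CancellationTokenSource();
    private readonly Timer _timer;

    public MonitoringService()
    {
        _timer = new Timer(Tick, null, TimeSpan.Zero, TimeSpan.FromSeconds(30));
    }

    private void Tick(object state)
    {
        if (_cts.IsCancellationRequested) return;   // cooperative exit point
        // ... long-running work, re-checking _cts.Token between expensive steps ...
    }

    public void Stop()
    {
        _cts.Cancel();                                       // ask a running callback to stop
        _timer.Change(Timeout.Infinite, Timeout.Infinite);   // stop scheduling new callbacks
    }
}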
Use this.Timer.Change(Timeout.Infinite, Timeout.Infinite);
I first used the Dispose method, but it did not work in my case, so I used Timer.Change.
This is the best solution I have found.
You mean something along these lines?
using System;
using System.Threading ;
class AppCore : IDisposable
{
Timer TimerInstance ;
string[] Args ;
public AppCore( string[] args )
{
if ( args == null ) throw new ArgumentNullException("args") ;
this.TimerInstance = new Timer( Tick , null , new TimeSpan(0,0,30) , new TimeSpan(0,0,15) ) ;
this.Args = args ;
this.Cancelled = false ;
this.Disposed = false ;
return ;
}
public int Run()
{
// do something useful
return 0 ;
}
private bool Cancelled ;
public void Cancel()
{
lock( TimerInstance )
{
Cancelled = true ;
TimerInstance.Change( System.Threading.Timeout.Infinite , System.Threading.Timeout.Infinite ) ;
}
return ;
}
private void Tick( object state )
{
if ( !Cancelled )
{
// do something on each tick
}
return ;
}
private bool Disposed ;
public void Dispose()
{
lock ( TimerInstance )
{
if ( !Disposed )
{
using ( WaitHandle handle = new EventWaitHandle( false , EventResetMode.ManualReset ) )
{
TimerInstance.Dispose( handle ) ;
handle.WaitOne() ;
}
Disposed = true ;
}
}
return ;
}
}
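For completeness, wiring it up from the application's entry point might look roughly like this (Main and the shutdown trigger are placeholders, not part of the class above):
static int Main(string[] args)
{
    using (AppCore app = new AppCore(args))
    {
        int result = app.Run();
        // when the user ends the program:
        app.Cancel();   // stops further ticks and flags any running callback
        return result;
    }   // Dispose() waits on the wait handle until the timer has fully shut down
}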
public void StopTimer()
{
myThreadingTimer.Dispose();   // stop further callbacks from firing
myThreadingTimer = null;
}
Explanation: disposing the timer (and dropping the reference) stops the callbacks.
As stated in the title, I have this situation:
lock ( _myLockObj )
{
// protected section here ( select + update over SQL Tables)
}
That works great, but sometimes I do not need thread safety, because the data guarantees that nothing wrong can happen (even if two threads run in parallel), and I need speed.
At the moment the program can tell when the threads could have a thread-safety problem and when not (and when they don't, I need to be as fast as possible).
What I would like to achieve is to make that lock instruction optional, so that it only takes effect when a certain condition is true.
e.g.:
lock ( _myLockObj ) && flag
I am pretty sure the lock keyword does not provide that semantics; what I would like to understand is the proper way to achieve that behavior.
The lock statement is syntactic sugar for Monitor.Enter + Monitor.Exit in a try/finally block.
You can use these methods directly:
bool flag = false; // set to true when this call actually needs the lock
bool acquiredLock = false;
try
{
if (flag)
{
Monitor.Enter(_myLockObj, ref acquiredLock);
}
// protected section here (select + update over SQL tables)
}
finally
{
if (acquiredLock)
{
Monitor.Exit(_myLockObj);
}
}
Personally I think the following is nicer:
private bool flag;
public T DoStuff() {
T DoStuffUnsafe() {
// ...
}
if (flag) {
lock (_myLockObj) {
return DoStuffUnsafe();
}
}
else {
return DoStuffUnsafe();
}
}
I am writing a Windows service application that collects data from sensors such as temperature, pressure, volume, etc.
The frequency at which the data is read is pretty high: there could be a hundred sensors, and the data could arrive at roughly one reading per second per sensor.
I need to store this data in an Oracle database, and for obvious reasons I don't want to hit the database at such a high rate.
Hence I want to create a buffer.
My plan is to create a buffer using the standard .NET Queue: a few threads keep enqueueing data into the queue, and another timer-driven thread keeps writing to the database at regular intervals.
What I want to know is: is this thread safe?
If it is not, what is the best way to create an in-memory buffer?
To answer your question: as long as you lock the accesses, you can have multiple threads access a regular queue.
For me though, I didn't go that route; I wanted to use plain queues with locks to keep them thread safe. I have been doing this in C# in one of my programs: I just use a regular Queue and put a lock around every access to it (Enqueue, Dequeue, Count). It is completely thread safe as long as you lock every access.
My setup comes from the tutorial/example here: http://www.albahari.com/threading/part2.aspx#_ProducerConsumerQWaitHandle
My situation is a little different from yours, but pretty similar. For me, data can come in very fast, and if I don't queue it I lose data when multiple items arrive at the same time. Then I have a thread running that slowly takes items off the queue and processes them. This hand-off uses an AutoResetEvent to hold my worker thread until data is ready to be processed. In your case you would use a timer or something that runs regularly.
I copy/pasted my code and tried to change the names. Hopefully I didn't completely break it by missing some name changes, but you should be able to get the gist.
public class MyClass : IDisposable
{
private Thread sensorProcessingThread = null;
private Queue<SensorData> sensorQueue = new Queue<SensorData>();
private readonly object _sensorQueueLocker = new object();
private EventWaitHandle _whSensorEvent = new AutoResetEvent(false);
public MyClass () {
sensorProcessingThread = new Thread(sensorProcessingThread_DoWork);
sensorProcessingThread.Start();
}
public void Dispose()
{
// Signal the end by sending 'null'
EnqueueSensorEvent(null);
sensorProcessingThread.Join();
_whSensorEvent.Close();
}
// The fast sensor data comes in, locks queue, and then
// enqueues the data, and releases the EventWaitHandle
private void EnqueueSensorEvent( SensorData wd )
{
lock ( _sensorQueueLocker )
{
sensorQueue.Enqueue(wd);
_whSensorEvent.Set();
}
}
// When asynchronous events come in, I just throw them into queue
private void OnSensorEvent( object sender, MySensorArgs e )
{
EnqueueSensorEvent(new SensorData(sender, e));
}
// I have several types of events that can come in,
// they just get packaged up into the same "SensorData"
// struct, and I worry about the contents later
private void FileSystem_Changed( object sender, System.IO.FileSystemEventArgs e )
{
EnqueueSensorEvent(new SensorData(sender, e));
}
// This is the slower process that waits for new SensorData,
// and processes it. Note, if it sees 'null' as data,
// then it knows it should quit the while(true) loop.
private void sensorProcessingThread_DoWork( object obj )
{
while ( true )
{
SensorData wd = null;
lock ( _sensorQueueLocker )
{
if ( sensorQueue.Count > 0 )
{
wd = sensorQueue.Dequeue();
if ( wd == null )
{
// Quit the loop, thread finishes
return;
}
}
}
if ( wd != null )
{
try
{
// Call specific handlers for the type of SensorData that was received
if ( wd.isSensorDataType1 )
{
SensorDataType1_handler(wd.sender, wd.SensorDataType1Content);
}
else
{
FileSystemChanged_handler(wd.sender, wd.FileSystemChangedContent);
}
}
catch ( Exception exc )
{
// My sensor processing also has a chance of failing to process completely, so I have a retry
// methodology that gives up after 5 attempts
if ( wd.NumFailedUpdateAttempts < 5 )
{
wd.NumFailedUpdateAttempts++;
lock ( _sensorQueueLocker )
{
sensorQueue.Enqueue(wd);
}
}
else
{
log.Fatal("Can no longer try processing data", exc);
}
}
}
else
_whSensorEvent.WaitOne(); // No more tasks, wait for a signal
}
}
Something you could maybe look at is Reactive Extensions (Rx) for .NET from Microsoft. Check out https://msdn.microsoft.com/en-us/data/gg577611.aspx, and at the bottom of the page there is a PDF tutorial, "Curing the asynchronous blues": http://go.microsoft.com/fwlink/?LinkId=208528. This is something very different, but maybe you will see something you like.
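As a rough idea of how Rx could fit the buffering scenario (a sketch only, assuming the System.Reactive package; SensorData and WriteBatchToDatabase are placeholders):
using System;
using System.Collections.Generic;
using System.Reactive.Linq;
using System.Reactive.Subjects;

class SensorBuffer
{
    private readonly Subject<SensorData> _readings = new Subject<SensorData>();

    public SensorBuffer()
    {
        // Collect readings into 10-second batches and write each batch in one database round trip.
        _readings.Buffer(TimeSpan.FromSeconds(10))
                 .Where(batch => batch.Count > 0)
                 .Subscribe(batch => WriteBatchToDatabase(batch));
    }

    // Called by the sensor threads at high frequency.
    public void OnReading(SensorData reading) => _readings.OnNext(reading);

    private void WriteBatchToDatabase(IList<SensorData> batch)
    {
        // one batched INSERT per call instead of one per reading
    }
}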
I have 2 threads that are triggered at the same time and run in parallel. These 2 threads are going to be manipulating a string value, but I want to make sure that there are no data inconsistencies. For that I want to use a lock with Monitor.Pulse and Monitor.Wait. I used a method that I found in another question/answer, but whenever I run my program, the first thread gets stuck at the Monitor.Wait call. I think that's because the second thread has already "Pulsed" and "Waited". Here is some code to look at:
string currentInstruction;
public void nextInstruction()
{
Action[] actions = {
fetch,
decode
};
Parallel.Invoke(actions);
_pc++;
}
public void fetch()
{
lock(irLock)
{
currentInstruction = "blah";
GiveTurnTo(2, irLock);
WaitTurn(1, irLock);
}
decodeEvent.WaitOne();
}
public void decode()
{
decodeEvent.Set();
lock(irLock)
{
WaitTurn(2, irLock);
currentInstruction = "decoding...";
GiveTurnTo(1, irLock);
}
}
// Below are the methods I talked about before.
// Wait for turn to use lock object
public static void WaitTurn(int threadNum, object _lock)
{
// While( not this threads turn )
while (threadInControl != threadNum)
{
// "Let go" of lock on SyncRoot and wait utill
// someone finishes their turn with it
Monitor.Wait(_lock);
}
}
// Pass turn over to other thread
public static void GiveTurnTo(int nextThreadNum, object _lock)
{
threadInControl = nextThreadNum;
// Notify waiting threads that it's someone else's turn
Monitor.Pulse(_lock);
}
Any idea how to get 2 parallel threads to communicate (manipulate the same resources) within the same cycle using locks or anything else?
You want to run 2 pieces of code in parallel, but you lock them at the start using the same variable?
As nvoigt mentioned, that already sounds wrong. What you have to do is remove the lock from there. Use it only when you are about to access something exclusively.
By the way, "data inconsistencies" can be avoided by simply not having them. Do not use the currentInstruction field directly (is it a field?), but provide a thread-safe CurrentInstruction property.
private object _currentInstructionLock = new object();
private string _currentInstruction;
public string CurrentInstruction
{
get { return _currentInstruction; }
set
{
lock(_currentInstructionLock)
_currentInstruction = value;
}
}
Another thing is naming: local variable names starting with _ are bad style. Some people (including me) use that prefix to distinguish private fields. Property names should start with an uppercase letter and local variables with a lowercase one.
I'm building a multithreaded app in .net.
I have a thread that listens to a connection (abstract, serial, tcp...).
When it receives a new message, it adds it to the queue via AddMessage, which then calls startSpool. startSpool checks whether the spool is already running; if it is, it returns, otherwise it starts the spool in a new thread. The reason for this is that the messages HAVE to be processed serially, FIFO.
So, my questions are...
Am I going about this the right way?
Are there better, faster, cheaper patterns out there?
My apologies if there is a typo in my code, I was having problems copying and pasting.
ConcurrentQueue<IMyMessage> messages = new ConcurrentQueue<IMyMessage>();
const int maxSpoolInstances = 1;
object lcurrentSpoolInstances = new object();
int currentSpoolInstances = 0;
Thread spoolThread;
public void AddMessage(IMyMessage message)
{
this.messages.Enqueue(message);
this.startSpool();
}
private void startSpool()
{
bool run = false;
lock (lcurrentSpoolInstances)
{
if (currentSpoolInstances <= maxSpoolInstances)
{
this.currentSpoolInstances++;
run = true;
}
else
{
return;
}
}
if (run)
{
this.spoolThread = new Thread(new ThreadStart(spool));
this.spoolThread.Start();
}
}
private void spool()
{
IMyMessage message;
while (this.messages.Count > 0)
{
// TODO: Is this below line necessary or does the TryDequeue cover this?
message = null;
this.messages.TryDequeue(out message);
if (message != null)
{
// My long running thing that does something with this message.
}
}
lock (lcurrentSpoolInstances)
{
this.currentSpoolInstances--;
}
}
This would be easier using BlockingCollection<T> instead of ConcurrentQueue<T>.
Something like this should work:
class MessageProcessor : IDisposable
{
BlockingCollection<IMyMessage> messages = new BlockingCollection<IMyMessage>();
public MessageProcessor()
{
// Started from the constructor to prevent the race condition in the existing code (which could start multiple threads).
Task.Factory.StartNew(this.Spool, TaskCreationOptions.LongRunning);
}
public void AddMessage(IMyMessage message)
{
this.messages.Add(message);
}
private void Spool()
{
foreach(IMyMessage message in this.messages.GetConsumingEnumerable())
{
// long running thing that does something with this message.
}
}
public void FinishProcessing()
{
// This will tell the spooling you're done adding, so it shuts down
this.messages.CompleteAdding();
}
void IDisposable.Dispose()
{
this.FinishProcessing();
}
}
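Usage is then roughly this (incomingMessage is just a placeholder for whatever the listener received):
using (var processor = new MessageProcessor())
{
    // Any number of listener threads can call this concurrently;
    // the single consumer task still processes the messages strictly FIFO.
    processor.AddMessage(incomingMessage);

    // ...
}   // Dispose -> FinishProcessing -> CompleteAdding, so the consumer drains the queue and exits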
Edit: If you wanted to support multiple consumers, you could handle that via a separate constructor. I'd refactor this to:
public MessageProcessor(int numberOfConsumers = 1)
{
for (int i=0;i<numberOfConsumers;++i)
StartConsumer();
}
private void StartConsumer()
{
// Each consumer gets its own long-running task, all reading from the same BlockingCollection.
Task.Factory.StartNew(this.Spool, TaskCreationOptions.LongRunning);
}
This would allow you to start any number of consumers. Note that this breaks the rule of having it be strictly FIFO - with this change, up to numberOfConsumers elements will potentially be processed in parallel.
Multiple producers are already supported. The above is thread safe, so any number of threads can call Add(message) in parallel, with no changes.
I think that Reed's answer is the best way to go, but for the sake of academics, here is an example using the concurrent queue -- you had some races in the code that you posted (depending upon how you handle incrementing currentSpoolInstances).
The changes I made (below) were:
Switched to a Task instead of a Thread (uses thread pool instead of incurring the cost of creating a new thread)
added the code to increment/decrement your spool instance count
changed the "if currentSpoolInstances <= max ... to just < to avoid having one too many workers (probably just a typo)
changed the way that empty queues were handled to avoid a race: I think you had a race, where your while loop could have tested false, (you thread begins to exit), but at that moment, a new item is added (so your spool thread is exiting, but your spool count > 0, so your queue stalls).
private ConcurrentQueue<IMyMessage> messages = new ConcurrentQueue<IMyMessage>();
const int maxSpoolInstances = 1;
object lcurrentSpoolInstances = new object();
int currentSpoolInstances = 0;
public void AddMessage(IMyMessage message)
{
this.messages.Enqueue(message);
this.startSpool();
}
private void startSpool()
{
lock (lcurrentSpoolInstances)
{
if (currentSpoolInstances < maxSpoolInstances)
{
this.currentSpoolInstances++;
Task.Factory.StartNew(spool, TaskCreationOptions.LongRunning);
}
}
}
private void spool()
{
IMyMessage message;
while (true)
{
// you do not need to null message because it is an "out" parameter, had it been a "ref" parameter, you would want to null it.
if(this.messages.TryDequeue(out message))
{
// My long running thing that does something with this message.
}
else
{
lock (lcurrentSpoolInstances)
{
if (this.messages.IsEmpty)
{
this.currentSpoolInstances--;
return;
}
}
}
}
}
Check 'Pipelines pattern': http://msdn.microsoft.com/en-us/library/ff963548.aspx
Use BlockingCollection for the 'buffers'.
Each processor (e.g. ReadStrings, CorrectCase, ...) should run in its own Task.
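A minimal two-stage sketch of that idea (the stage contents and types here are placeholders, not taken from the article):
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class PipelineSketch
{
    static void Main()
    {
        var rawReadings = new BlockingCollection<string>(1000);   // stage 1 input buffer
        var parsedValues = new BlockingCollection<int>(1000);     // stage 2 input buffer

        // Stage 1: parse raw strings into values.
        var parseStage = Task.Factory.StartNew(() =>
        {
            foreach (var line in rawReadings.GetConsumingEnumerable())
                parsedValues.Add(int.Parse(line));
            parsedValues.CompleteAdding();   // propagate shutdown to the next stage
        }, TaskCreationOptions.LongRunning);

        // Stage 2: consume the parsed values (stand-in for batching them to the database).
        var writeStage = Task.Factory.StartNew(() =>
        {
            foreach (var value in parsedValues.GetConsumingEnumerable())
                Console.WriteLine(value);
        }, TaskCreationOptions.LongRunning);

        // Producer side: add work, then signal that no more input is coming.
        rawReadings.Add("42");
        rawReadings.CompleteAdding();
        Task.WaitAll(parseStage, writeStage);
    }
}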
HTH..
Is it possible to cancel a LINQ to SQL query? For example, if I have built a query that takes a while to run, I would like the user to be able to cancel it. Does anyone have any good ideas on this?
If you set the CommandTimeout (seconds) property of DataContext, it will automatically throw an exception after the timeout elapses.
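For example (MyDataContext, the Customers table, and the connection string are placeholders; 30 seconds is an arbitrary value):
using (var db = new MyDataContext(connectionString))
{
    db.CommandTimeout = 30;   // seconds; a query that runs longer than this throws a timeout exception
    var activeCustomers = db.Customers.Where(c => c.IsActive).ToList();
}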
So, according to Richard Szalay's comment:
Your best bet is to run the query in a background thread and simply unsubscribe from the container object's events when the user hits Cancel.
And I think I agree that this is an OK work-around for now. What I would love to see is some async query functionality built into the framework, but until that happens this will have to do.
Haven't started to implement this yet (have to finish some other things first), but one way it could work:
In the working thread, run the queries on a separate query thread and then Join that thread until it is finished.
When the user hits cancel, call the Interrupt method of the working thread, which will then get a ThreadInterruptedException and stop waiting for the query thread to finish.
I may add some code later when I build it. But we'll see how pretty it turns out :p
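A rough sketch of that idea (not the poster's code; RunQuery and the thread wiring are assumptions):
Thread queryThread = new Thread(() => RunQuery());   // RunQuery executes the long LINQ to SQL query
Thread workingThread = new Thread(() =>
{
    queryThread.Start();
    try
    {
        queryThread.Join();   // wait for the query to finish
        // ... consume the results ...
    }
    catch (ThreadInterruptedException)
    {
        // The user cancelled: stop waiting. The query thread is simply abandoned
        // and its results are ignored when it eventually completes.
    }
});
workingThread.Start();

// When the user hits Cancel:
workingThread.Interrupt();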
I know this answer is kind of late but this is how I do it:
class Program
{
public class Person
{
public string Name;
public int Age;
}
public static void ExecuteQueryAsync ( IEnumerable<Person> collectionToQuery , Action<List<Person>> onQueryTerminated , out Action stopExecutionOfQuery )
{
var abort = false;
stopExecutionOfQuery = () =>
{
abort = true;
};
Task.Factory.StartNew( () =>
{
try
{
var query = collectionToQuery.Where( x =>
{
if ( abort )
throw new NotImplementedException( "Query aborted" );
// query logic:
if ( x.Age < 25 )
return true;
return
false;
} );
onQueryTerminated( query.ToList() );
}
catch
{
onQueryTerminated( null );
}
});
}
static void Main ( string[] args )
{
Random random = new Random();
Person[] people = new Person[ 1000000 ];
// populate array
for ( var i = 0 ; i < people.Length ; i++ )
people[ i ] = new Person() { Age = random.Next( 0 , 100 ) };
Action abortQuery;
ExecuteQueryAsync( people , OnQueryDone , out abortQuery );
// if after some time user wants to stop query:
abortQuery();
Console.Read();
}
static void OnQueryDone ( List<Person> results )
{
if ( results == null )
Console.WriteLine( "Query was canceled by the user" );
else
Console.WriteLine( "Query yield " + results.Count + " results" );
}
}
I'd say you'd probably need to run it on a separate Thread and cancel that instead.