Best practice for a continually running process in C#

I am working on a project in C#.NET using the .NET framework version 3.5.
My project has a class called Focuser.cs which represents a physical device, a telescope focuser, that can communicate with a PC via a serial (RS-232) port. My class (Focuser) has properties such as CurrentPosition, CurrentTemperature, etc., which represent the current conditions of the focuser and can change at any time. So, my Focuser class needs to continually poll the device for these values and update its internal fields. My question is: what is the best way to perform this continual polling? Occasionally, the user will need to switch the device into a different mode, which requires the ability to stop the polling, perform some action, and then resume polling.
My first attempt was to use a timer that ticks every 500 ms and then kicks off a BackgroundWorker, which polls for one position and one temperature and then returns. When the timer ticks, if the BackgroundWorker IsBusy, it just returns and tries again 500 ms later. Someone suggested that I get rid of the BackgroundWorker altogether and just do the poll in the timer's tick event. So I set the AutoReset property of the timer to false and restart the timer every time a poll finishes. These two techniques behaved exactly the same way in my application, so I am not sure whether one is better than the other. I also tried creating a new thread for every poll operation, using a new ThreadStart and all that. This also seemed to work fine.
I should mention one other thing. This class is part of a COM object server, which basically means that the resulting class library will be called via COM. I am not sure whether this has any influence on the answer, but I thought I should throw it out there.
The reason I am asking all of this is that my test harness runs and debug builds work just fine, but when I do a release build and try to make calls to my class from another application, that application freezes up, and I am having a hard time determining the cause.
Any advice, suggestions, comments would be appreciated.
Thanks, Jordan

Remember that the timer hides its own background worker thread, which basically sleeps for the interval and then fires its Elapsed event. Knowing that, it makes sense just to put the polling in Elapsed. This would be the best practice IMO, rather than starting a thread from a thread. You can start and stop Timers as well, so the code that switches modes can Stop() the Timer, perform the task, then Start() it again, and the Timer doesn't even have to know whether the telescope is busy.
However, what I WOULD keep track of is whether another instance of the Elapsed event handler is still running. You could lock the Elapsed handler's code, or you could set a flag, visible from any thread, that indicates another Elapsed handler is still working; handlers that see this flag set can exit immediately, avoiding concurrency problems when working with the serial port.
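A minimal sketch of that pattern, assuming a hypothetical PollDevice() method on the Focuser class that does the actual serial reads (the field names here are invented for illustration):

    // requires: using System; using System.Threading; using System.Timers;
    private System.Timers.Timer _pollTimer;
    private int _polling;                          // 0 = idle, 1 = an Elapsed handler is still working

    private void StartPolling()
    {
        _pollTimer = new System.Timers.Timer(500); // poll every 500 ms
        _pollTimer.Elapsed += OnPollTimerElapsed;
        _pollTimer.Start();
    }

    private void OnPollTimerElapsed(object sender, ElapsedEventArgs e)
    {
        // if a previous handler is still talking to the serial port, skip this tick
        if (Interlocked.CompareExchange(ref _polling, 1, 0) != 0)
            return;
        try
        {
            PollDevice();                          // read CurrentPosition, CurrentTemperature, ...
        }
        finally
        {
            Interlocked.Exchange(ref _polling, 0);
        }
    }

    private void SwitchMode(Action modeWork)
    {
        _pollTimer.Stop();                         // pause polling
        modeWork();                                // perform the mode-specific action
        _pollTimer.Start();                        // resume polling
    }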

So it looks like you have considered two options:
1. Timer. The Timer is non-blocking while waiting (it uses another thread), so the rest of the program can continue running and stay responsive. When the timer event fires, you simply get/update the current values.
2. Timer + BackgroundWorker. The background worker is also simply a separate thread. It may take longer to start the thread than to simply get the current values. Unless getting the current values takes a long time and makes your program unresponsive, this is unnecessary complexity.
If getting values is fast enough, stick to #1 for simplicity.
If getting values is slow, #2 will work, but it unnecessarily has a thread start a thread. Instead, do it with only a BackgroundWorker (no Timer). Create the BackgroundWorker once and store it in a field; there is no need to recreate it every time. Make sure to set WorkerSupportsCancellation to true. Whenever you want to start checking values, call bgWorker.RunWorkerAsync() from your main program thread; when you want to stop, call bgWorker.CancelAsync(). Inside your DoWork method, have a loop that reads the values and then does a Thread.Sleep(500). Since it runs on a separate thread, it won't make your program unresponsive. In the loop condition, also check whether polling was cancelled and break out.
You'll probably need a way to get the values back to the main thread. You can use ReportProgress() if an integer is good enough; otherwise, create an object to hold the data, but make sure to lock it before reading or modifying it. This is a quick summary, but if you go this route I would recommend you read: http://www.albahari.com/threading/part3.aspx#_BackgroundWorker
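A minimal sketch of that BackgroundWorker-only approach, assuming hypothetical ReadPosition()/ReadTemperature() methods that wrap the serial I/O:

    // requires: using System.ComponentModel; using System.Threading;
    private BackgroundWorker _bgWorker;
    private readonly object _sync = new object();
    private double _position, _temperature;   // shared values, protected by _sync

    private void InitWorker()
    {
        _bgWorker = new BackgroundWorker();
        _bgWorker.WorkerSupportsCancellation = true;
        _bgWorker.DoWork += PollLoop;
        // start with _bgWorker.RunWorkerAsync(); stop with _bgWorker.CancelAsync();
    }

    private void PollLoop(object sender, DoWorkEventArgs e)
    {
        while (!_bgWorker.CancellationPending)
        {
            double pos = ReadPosition();       // slow serial call
            double temp = ReadTemperature();
            lock (_sync)
            {
                _position = pos;
                _temperature = temp;
            }
            Thread.Sleep(500);
        }
        e.Cancel = true;                       // record that we stopped because of CancelAsync()
    }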

Does the process of contacting the telescope and getting the current values actually take long enough to warrant polling? Have you tried dropping the multithreading and just blocking while you get the current value?
To answer your question, however, I would suggest not using a background worker but an actual Thread that updates the properties continuously.
If all these properties are read-only (can you set the temperature of the telescope?) and there are no dependencies between them (e.g., no transactions are required to update multiple properties at once), you can drop all the blocking code and let your thread update willy-nilly while other threads access the properties.
I suggest a real, dedicated Thread rather than the thread pool, simply because of a lack of knowledge of what might happen when mixing background threads and COM servers. Also, apartment state might play into this; with a Thread you can try STA, but you can't do that with a thread-pool thread.
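A rough sketch of that dedicated-thread approach, where UpdateProperties() is a placeholder for the actual serial reads and the STA call is the part to experiment with:

    // requires: using System.Threading;
    private Thread _pollThread;
    private volatile bool _keepPolling;

    private void StartPollThread()
    {
        _keepPolling = true;
        _pollThread = new Thread(PollLoop);
        _pollThread.IsBackground = true;                    // don't keep the process alive
        _pollThread.SetApartmentState(ApartmentState.STA);  // experiment: STA for COM friendliness
        _pollThread.Start();
    }

    private void PollLoop()
    {
        while (_keepPolling)
        {
            UpdateProperties();   // read position/temperature and assign to the backing fields
            Thread.Sleep(500);
        }
    }

    private void StopPollThread()
    {
        _keepPolling = false;
        _pollThread.Join();
    }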

You say the app freezes up in a release build?
To eliminate extra variables, I'd take all the timer/multi-threaded code out of the application (just comment it out) and try it with a straightforward blocking method.
I.e., you click a button, it calls a function, that function hits the COM object for data and then updates the UI, all in a blocking, synchronous fashion. This will tell you for sure whether it's the multi-threading code that's freezing you up, or the COM interaction itself.

How about starting a background thread with ThreadPool? Then enter a loop based on a bool (while (bContinue)) that does your work and calls Thread.Sleep at the end of each iteration. Exiting the program would include setting bContinue to false so the thread stops; in a Windows service you could hook that up to the OnStop event.
    private volatile bool bContinue;      // shared stop flag; volatile so the worker sees changes
    private int m_iWaitTime_ms = 500;     // polling interval

    // queue the polling loop onto a thread pool thread
    bool bRet = ThreadPool.QueueUserWorkItem(new WaitCallback(ThreadFunc));

    private void ThreadFunc(object objState)
    {
        // enter loop
        bContinue = true;
        while (bContinue)
        {
            // do stuff

            // sleep
            Thread.Sleep(m_iWaitTime_ms);
        }
    }

Related

Thread in C# Timer

I have a Windows service that does long operations every X minutes using System.Timers.Timer.
I need to know if there is a way to figure out, on the next tick, whether the previous execution is still running.
Is there also a way to know how much CPU and memory the previous thread is using?
A simple solution would be to use a shared volatile bool that is set at the start of the method and reset at the end. Or use Interlocked.Increment/Decrement on a shared counter to keep a running total of the number of threads currently in your method.
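For example, the counter version might look roughly like this, with DoLongOperation() standing in for your real work:

    // requires: using System.Threading; using System.Timers;
    private int _running;     // > 0 while a previous execution is still in progress

    private void OnElapsed(object sender, ElapsedEventArgs e)
    {
        if (Interlocked.Increment(ref _running) > 1)
        {
            // the previous run has not finished yet; back out and skip this tick
            Interlocked.Decrement(ref _running);
            return;
        }
        try
        {
            DoLongOperation();    // placeholder for the real work
        }
        finally
        {
            Interlocked.Decrement(ref _running);
        }
    }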
If you do not want invocations to overlap one another, one solution is to use a Threading.Timer, set the due time you want and an infinite period, and at the end of each event call Change to reset the due time. Another way to do essentially the same thing is a loop over Task.Delay:
    while (!cancellationToken.IsCancellationRequested)
    {
        // do work
        await Task.Delay(periodTime);
    }
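The Threading.Timer variant of the same idea could be sketched like this; DoWork() and the five-minute due time are placeholders:

    // requires: using System; using System.Threading;
    private Timer _timer;
    private readonly TimeSpan _dueTime = TimeSpan.FromMinutes(5);

    private void Start()
    {
        // infinite period: the timer fires once, and we re-arm it only after the work is done
        _timer = new Timer(_ =>
        {
            DoWork();
            _timer.Change(_dueTime, Timeout.InfiniteTimeSpan);
        }, null, _dueTime, Timeout.InfiniteTimeSpan);
    }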
Your question about CPU/memory does not make that much sense. A thread is either running, waiting to run, or blocked. The only memory that can be directly attributed to a thread is its stack, which is usually quite small (1 MB by default, if I'm remembering correctly). If you want to measure how much time a thread spends in the running state, you need to do periodic sampling or instrument the scheduler.

.NET Scheduling many operations: single timer vs one timer per operation

I am developing a Windows Service application, in .NET, which executes many functions (it is a WCF service host), and one of the targets is running scheduled tasks.
I chose to create a System.Threading.Timer for every operation, with a dueTime set to the next execution and no period to avoid reentrancy.
Every time the operation ends, it changes the dueTime to match the next scheduled execution.
Most of the operations are scheduled to run every minute, not all together but offset from each other by a few seconds.
Now, after adding about 30 operations, the timers seem to have become inaccurate, starting the operations many seconds or even minutes late.
I am running the operation logic directly in the callback method of the timer, so the running thread should be the timer's.
Should I create a Task to run the operation instead of running it in the callback method to improve accuracy?
Or should I use a single timer with a fixed (1 second) dueTime to check which operations need to be started?
I don't like this last option because it would be more difficult to handle reentrancy.
Timers fire on a thread pool thread, so you are probably finding that as you add lots of timers you are exhausting the thread pool.
You could increase the size of the thread pool, or alternatively ensure you have fewer timers than the thread pool size.
Firing off Tasks from the callback likely won't help, since you will be fighting for threads from the same thread pool, unless you use long-running tasks.
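Two hedged sketches of those mitigations; RunOperation() is a placeholder and the numbers are arbitrary:

    // requires: using System.Threading; using System.Threading.Tasks;

    // option A: raise the thread pool minimum so ~30 concurrent callbacks can start promptly
    ThreadPool.SetMinThreads(64, 64);    // worker threads, completion-port threads

    // option B: run the heavy work on a dedicated thread instead of a pool thread
    Task.Factory.StartNew(() => RunOperation(),
                          CancellationToken.None,
                          TaskCreationOptions.LongRunning,   // dedicated thread, not the pool
                          TaskScheduler.Default);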
We usually set up multiple timers to handle different actions within a single service. We set the intervals and start/stop the timers in the service's Start/Stop/Shutdown events (and keep a variable indicating the status of each one, i.e. bool Stopped).
When a timer ticks over, we stop it and run the processing, which may take a while depending on the process, i.e. it may take longer than the interval if the interval is short. This code needs to be in a try/catch so it keeps going on errors.
After the code has processed, we check the Stopped variable, and if it is not stopped we start the timer again. This handles the reentrancy you've mentioned and allows the code to stick to the interval as much as possible.
Timers are generally accurate for intervals above about 100 ms as far as I know, which should be close enough for what you want to do.
We have run this concept for years, and it hasn't let us down.
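A rough sketch of the pattern described above, where DoProcessing() and LogError() are placeholders:

    // requires: using System; using System.Timers;
    // wired up elsewhere with: _timer.Elapsed += OnElapsed; _timer.Start();
    private Timer _timer = new Timer(60000);   // e.g. one minute
    private volatile bool _stopped;            // set true in the service Stop/Shutdown events

    private void OnElapsed(object sender, ElapsedEventArgs e)
    {
        _timer.Stop();                         // no reentrancy while we work
        try
        {
            DoProcessing();                    // may take longer than the interval
        }
        catch (Exception ex)
        {
            LogError(ex);                      // keep going on errors
        }
        if (!_stopped)
            _timer.Start();                    // resume ticking
    }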
If you are running these tasks as a sub-system of an ASP.NET app, you should also look at HangFire, which can handle background processing and eliminates the need for the Windows service.
How accurate do the timers need to be? You could always use a single timer and run multiple processing threads at the same time, or queue the calls to the less critical operations.
OK, I came to a decision: since I am not able to easily reproduce the behavior, I chose to address the root problem and use the service process only to:
serve WCF requests done by clients
schedule operations (which was problematic)
Every operation that could eat CPU is executed by another process, which is controlled directly by the main process (with System.Diagnostics.Process and its events) and communicates with it through WCF.
When I start the secondary process, I pass it the PID of the main process on the command line. If the main process gets killed, the Process.Exited event fires and I can close the child process too.
This way the main service usually doesn't use much CPU time, and is free to schedule happily without delays.
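The child-process side of that arrangement might be sketched like this; the argument position and RunWcfServiceHost() are placeholders:

    // requires: using System; using System.Diagnostics;
    static void Main(string[] args)
    {
        int parentPid = int.Parse(args[0]);              // PID passed by the main service
        var parent = Process.GetProcessById(parentPid);
        parent.EnableRaisingEvents = true;
        parent.Exited += (s, e) => Environment.Exit(0);  // parent killed: shut down too

        RunWcfServiceHost();                             // placeholder for the real work
    }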
Thanks to all who gave me advice!

C# how should I go about creating this threading application?

Alright I will attempt to explain every aspect of why I need to do this a certain way. Basically, I need an application to execute a certain .exe multiple times asynchronously.
Specs:
I need to be able to restrict the amount of executions going at one time.
It has to use threading because my program has a GUI, and simply launching the .exe's and monitoring them would lock up the GUI and the console for other things.
How should I go about doing this? (examples help me a lot)
I've already told you multiple times how you should go about this. The launcher program has a single thread. It monitors the child processes. If a process ends and there is a free processor, it starts up a new process and affinitizes the process to that processor. When it's not doing any of those things it yields control back to its UI. Since each of those operations is of short duration, the UI never appears to block.
UPDATE
Actually, this wasn't a great answer. As Henk pointed out in the comments, Process.Start() is not a blocking call. You have to explicitly set Process.EnableRaisingEvents to true and handle the Exited event. I'm not sure whether the Exited event is fired on the calling thread (I doubt it, but you should check), but the point is that starting a process isn't a blocking call, so you don't need more threads doing the waiting.
See this similar answer for more details: Async process start and wait for it to finish
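A hedged sketch of that event-driven approach, assuming the code lives in the WinForms Form; the executable name, MaxConcurrent, and the argument queue are all invented for illustration:

    // requires: using System.Collections.Generic; using System.Diagnostics;
    private const int MaxConcurrent = 4;
    private readonly Queue<string> _pendingArgs = new Queue<string>();
    private int _runningCount;

    private void TryStartNext()
    {
        while (_runningCount < MaxConcurrent && _pendingArgs.Count > 0)
        {
            var p = new Process();
            p.StartInfo.FileName = "work.exe";
            p.StartInfo.Arguments = _pendingArgs.Dequeue();
            p.EnableRaisingEvents = true;            // needed for Exited to fire
            p.SynchronizingObject = this;            // raise Exited on the UI thread (this = Form)
            p.Exited += (s, e) => { _runningCount--; TryStartNext(); };
            _runningCount++;
            p.Start();
        }
    }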
PREVIOUS ANSWER
Fire off your threads (limited to your maximum number of threads) and have them run the external exe using Process.Start(). Make sure you set them to wait for the process to finish. When the processes finish, have the threads use something like Interlocked.Increment() to increment a counter variable that you can read from your main form code. Better still, have those threads call a callback delegate (e.g. Action<T>), which in turn checks this.InvokeRequired before doing the actual work.

System.Timers.Timer event from a completed thread

I have a C# service that I'm working on, which will load info via WMI on various servers on the network. I have set it up so that each server is represented by an instance of my ServerInfo class. In the main program, it initializes the configuration and loops through the ArrayList of servers, kicking off worker threads using ThreadPool.QueueUserWorkItem to init or update each server.
What I want to do is refresh each server in the background, after an amount of time specified in the config passes. My first thought was to set up a System.Timers.Timer object in each server instance, and once the threaded refresh method is completed, start the timer. The main program would wait for the elapsed event, and kick off the refresh method for that server instance again.
However, it looks like once the worker thread has completed, the timer is dead in the water and never sends an elapsed event (seems kind of obvious, since the object is no longer processing anything).
What should I do to allow for triggering of updates like this?
Well, my first thought is to have the ServerInfo class handle updating internally. The ServerInfo constructor could start a timer or use QueueUserWorkItem to kick off an update thread. That way, you would not have to manage all updating logic in the main program.
The ServerInfo class could expose an event that the main program subscribes to in order to receive updates.
It's not necessary to queue a worker thread to start a timer. Just do it in the main thread (i.e., the constructor for ServerInfo)
Of course, I'm not 100% clear on what you're trying to accomplish, so this may not fit into your situation.
System.Timers.Timer probably isn't working as you expect as it uses windows messages, so it requires a message pump, which you do not have on your threads. Instead, look into System.Threading.Timer.
But my main answer is the opposite of Chris Hogan's: don't do the threading inside the objects, but in the main application code, or at least in a layer of its own.
So you create all your ServerInfos, which at this point only contain metadata, e.g. the machine name.
Then you fire up a separate thread using System.Threading.Thread, lock the ArrayList of ServerInfos, grab a copy of it, then iterate through the copy and call an Update() method on each ServerInfo.
You can either fire an event after each Update, or after they all have updated.
Don't forget to marshal the call from the ServerInfo.Updated event handler to the main thread! Otherwise you'll hold up the next update cycle, and it's a requirement if you are going to alter UI properties.
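A rough sketch of that update loop, with ServerInfo.Update(), the _servers ArrayList, the _refreshInterval field and the OnServerUpdated event raiser all standing in for whatever the real code uses:

    // requires: using System.Collections; using System.Threading;
    private volatile bool _running = true;

    private void UpdateLoop()
    {
        while (_running)
        {
            ServerInfo[] snapshot;
            lock (_servers)                     // the shared ArrayList
            {
                snapshot = (ServerInfo[])_servers.ToArray(typeof(ServerInfo));
            }
            foreach (ServerInfo server in snapshot)
            {
                server.Update();                // WMI refresh
                OnServerUpdated(server);        // raise event; subscribers marshal to their own thread
            }
            Thread.Sleep(_refreshInterval);     // interval from config, in ms
        }
    }

    // started once with: new Thread(UpdateLoop) { IsBackground = true }.Start();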

How do I Yield to the UI thread to update the UI while doing batch processing in a WinForm app?

I have a WinForms app written in C# with .NET 3.5. It runs a lengthy batch process. I want the app to update status of what the batch process is doing. What is the best way to update the UI?
The BackgroundWorker sounds like the object you want.
The quick and dirty way is Application.DoEvents(), but this can cause problems with the order in which events are handled, so it's not recommended.
The problem is probably not that you have to yield to the UI thread, but that you are doing the processing on the UI thread, blocking it from handling messages. You can use the BackgroundWorker component to do the batch processing on a different thread without blocking the UI thread.
Run the lengthy process on a background thread. The background worker class is an easy way of doing this - it provides simple support for sending progress updates and completion events for which the event handlers are called on the correct thread for you. This keeps the code clean and concise.
To display the updates, progress bars or status bar text are two of the most common approaches.
The key thing to remember is if you are doing things on a background thread, you must switch to the UI thread in order to update windows controls etc.
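A minimal sketch of that approach in a WinForms form, where DoBatchStep(), progressBar1, and statusLabel are placeholders:

    // requires: using System.ComponentModel; using System.Windows.Forms;
    private BackgroundWorker _worker;

    private void StartBatch()
    {
        _worker = new BackgroundWorker { WorkerReportsProgress = true };
        _worker.DoWork += (s, e) =>
        {
            for (int i = 0; i < 100; i++)
            {
                DoBatchStep(i);                 // one unit of the lengthy batch process
                _worker.ReportProgress(i + 1);  // marshalled to the UI thread for us
            }
        };
        _worker.ProgressChanged += (s, e) =>
        {
            progressBar1.Value = e.ProgressPercentage;                    // runs on the UI thread
            statusLabel.Text = "Processed " + e.ProgressPercentage + "%";
        };
        _worker.RunWorkerCompleted += (s, e) => statusLabel.Text = "Done";
        _worker.RunWorkerAsync();
    }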
To flesh out what people are saying about DoEvents, here's a description of what can happen.
Say you have some form with data on it and your long running event is saving it to the database or generating a report based on it. You start saving or generating the report, and then periodically you call DoEvents so that the screen keeps painting.
Unfortunately the screen isn't just painting, it will also react to user actions. This is because DoEvents stops what you're doing now to process all the windows messages waiting to be processed by your Winforms app. These messages include requests to redraw, as well as any user typing, clicking, etc.
So for example, while you're saving the data, the user can do things like making the app show a modal dialog box that's completely unrelated to the long running task (eg Help->About). Now you're reacting to new user actions inside the already running long running task. DoEvents will return when all the events that were waiting when you called it are finished, and then your long running task will continue.
What if the user doesn't close the modal dialog? Your long running task waits forever until this dialog is closed. If you're committing to a database and holding a transaction, now you're holding a transaction open while the user is having a coffee. Either your transaction times out and you lose your persistence work, or the transaction doesn't time out and you potentially deadlock other users of the DB.
What's happening here is that Application.DoEvents makes your code reentrant. See the wikipedia definition here. Note some points from the top of the article, that for code to be reentrant, it:
Must hold no static (or global) non-constant data.
Must work only on the data provided to it by the caller.
Must not rely on locks to singleton resources.
Must not call non-reentrant computer programs or routines.
It's very unlikely that long running code in a WinForms app is working only on data passed to the method by the caller, doesn't hold static data, holds no locks, and calls only other reentrant methods.
As many people here are saying, DoEvents can lead to some very weird scenarios in code. The bugs it can lead to can be very hard to diagnose, and your user is not likely to tell you "Oh, this might have happened because I clicked this unrelated button while I was waiting for it to save".
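For illustration only, this is the kind of pattern being warned about; SaveRow() and _table are placeholders:

    // requires: using System; using System.Data; using System.Windows.Forms;
    private void saveButton_Click(object sender, EventArgs e)
    {
        foreach (DataRow row in _table.Rows)
        {
            SaveRow(row);               // long-running work on the UI thread
            Application.DoEvents();     // repaints, but ALSO runs any queued clicks, menu
                                        // commands, even another saveButton_Click (reentrancy)
        }
    }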
Use BackgroundWorker, and if you are also trying to update the GUI thread by handling the ProgressChanged event (e.g. for a ProgressBar), be sure to also set WorkerReportsProgress = true, or the thread that is reporting progress will die the first time it tries to call ReportProgress: an exception is thrown, but you might not see it unless you have break-on-thrown enabled, and the output will just show that the thread exited.
Use the BackgroundWorker component to run your batch processing on a separate thread; this will keep it from impacting the UI thread.
I want to restate what my previous commenters noted: please avoid DoEvents() whenever possible, as this is almost always a form of "hack" and causes maintenance nightmares.
If you go the BackgroundWorker road (which I suggest), you'll have to deal with cross-thread calls to the UI if you want to call any methods or properties of Controls, as these are thread-affine and must be called only from the thread they were created on. Use Control.Invoke() and/or Control.BeginInvoke() as appropriate.
If you are running in a background/worker thread, you can call Control.Invoke on one of your UI controls to run a delegate in the UI thread.
Control.Invoke is synchronous (it waits until the delegate returns). If you don't want to wait, use BeginInvoke() to just queue the call.
The return value of BeginInvoke() allows you to check whether the method has completed, or to wait until it has.
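A small sketch of that pattern, with statusLabel standing in for whichever control you need to update:

    // requires: using System; using System.Windows.Forms;
    private void ReportStatus(string text)
    {
        if (statusLabel.InvokeRequired)
        {
            // BeginInvoke queues the update and returns immediately; Invoke would block
            statusLabel.BeginInvoke(new Action<string>(ReportStatus), text);
            return;
        }
        statusLabel.Text = text;    // now running on the UI thread
    }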
Application.DoEvents() or possibly run the batch on a separate thread?
DoEvents() was what I was looking for but I've also voted up the backgroundworker answers because that looks like a good solution that I will investigate some more.
