C# Event polling and multiple Tasks

Let me briefly explain my problem. I want to read data from a sensor in my C# solution. To do that I use an event, which pulls the data from the sensor at a very fast rate. The data is then saved to a SQL database.
To achieve that and maximize performance, I registered the event in Task A, which now frequently polls the data from my sensor, let's say at 1000 samples per second to give you an idea.
The data is stored in a BlockingCollection backed by a queue.
What I want to do now is to start a second task, which saves my data to an SQL database, but only when there are more than 5000 samples in my BlockingCollection queue. How do I achieve that?
I tried to start the second task from my event, which runs on the first task, but ran into a few problems:
A) Sometimes the second task would not start from my event.
B) I got an exception that the second task was still running (because the event triggered so fast and it was not done yet, I guess).
Is there any good way to do this?
Best regards

Related

Using ThreadPool in a subscription-based Windows service in C#

I am implementing a Windows service which will be subscribing to an external data source. Whenever there is some update in the external data source, it will activate an event handler, let's say an OnDataReceived method. The number of updates received from the data source could be huge at a particular time of day. For example, in the evening, within 30-40 minutes I could get ~50K event handler calls. This event handler gets the data as XML and does some processing on the XML before saving it to the database.
I am trying to figure out whether so many event handler calls in a small time window will overwhelm my service by creating too many threads, or whether the .NET framework will manage the number of threads running in parallel. Since this could cause issues, I am planning to use the ThreadPool to put the request on a thread in the event handler, thus reducing the time the request spends in the event handler code.
So I am looking for suggestions from fellow coders on whether I need to worry about thread management at all in my scenario. If so, do you see any problems with the approach I am planning to take, or can you suggest a better approach to handle it?
Would appreciate any help on this.
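For illustration, a minimal sketch of the hand-off described above: the event handler only queues the payload onto the ThreadPool so it returns immediately. The OnDataReceived signature and the processing method are hypothetical stand-ins, not the actual service code.

    using System.Threading;

    class DataSubscriber
    {
        // Event handler called by the external data source; keep it as short as possible.
        public void OnDataReceived(string xmlPayload)
        {
            // Hand the payload off to a pool thread so the handler returns immediately.
            ThreadPool.QueueUserWorkItem(state => ProcessAndSave((string)state), xmlPayload);
        }

        private void ProcessAndSave(string xml)
        {
            // Parse the XML and save the result to the database (details omitted).
        }
    }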

Design thoughts required about concurrent processing

I have a series of calculations that need to be processed - the calculations and the order they run are all defined by the user on the UI.
If they just ran one after the other, it wouldn't be too hard. However, some of the calculations need to be processed concurrently, and all calculations must be able to be paused individually at any time. I also need to be able to re-arrange the order or add new calculations at any time, so whatever I do must be flexible enough to handle this.
On the UI, imagine a listbox (a queue, if you like) of usercontrols - with each usercontrol displaying the name of the calculation and a pause button. And I can add calculations to this list at any time during processing.
What is the best way to do this?
Should I be running each calculation in its own thread? If so, how should I store the list of running processes? How will I pass the queue to the calculation processor? How will I be able to ensure that every time the queue changes (new ordering or new calculation) the calculation processor will be made aware of this?
My initial thoughts were to have:
CalcProcessor class
CalcCalculation class
In CalcProcessor have 2 Lists of CalcCalculations. One being the "queue" as shown on the UI (perhaps a pointer to it? Or some other way to ensure it updates live), and the other being the list of currently running calculations.
Somehow I need to get the CalcCalculation to be running in its own thread to process the calculation, and be able to handle any pause events. So I need some way to transmit the info of the Pause button being pressed in the UI to the CalcProcessor object, and then into the correct CalcCalculation.
Edit in response to David Hope:
Thanks for your reply.
Yes, there are n calculations but this could change at any time due to being able to add more calculations to process on the UI.
They do not need to share data in anyway. There will be a setting in the application to specify how many should run concurrently (ie. 10 at any given time, the first 10 in the queue for example - and when 1 finishes the next calculation in the queue will start processing).
The calculation will involve taking data from some data source - it could be a database or a file - analysing it and performing some calculations on that data. When I say the calculation needs to be paused, I don't mean pausing the thread. I just mean (for example, as I haven't written this part of the application yet) if it is reading row by row from a database and doing some live calculations, pausing at the completion of the current row and continuing when the pause button is unclicked on the UI - which could be done with something as primitive as a while(notPaused) loop, provided I can get the pause information from the UI into the thread.
There are several questions here:
How to synchronize the UI and the model?
I think you got this one backwards. Your model shouldn't have a "pointer" to the queue you're showing in the UI. Instead, the queue should live in your model, and you should use data binding together with INotifyPropertyChanged and ObservableCollection to show the queue on the UI. (At least that's how it's done in WPF.)
This way, you can manipulate your queue directly from your model, and it will automatically show on the UI.
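A minimal sketch of that arrangement in WPF; the view model and the CalcCalculation item shown here are illustrative, not the asker's actual classes.

    using System.Collections.ObjectModel;
    using System.ComponentModel;

    class CalcCalculation
    {
        public string Name { get; set; }
    }

    class CalcProcessorViewModel : INotifyPropertyChanged
    {
        // The queue lives in the model; a ListBox bound to it updates automatically
        // whenever calculations are added, removed or reordered.
        public ObservableCollection<CalcCalculation> Queue { get; private set; }

        public CalcProcessorViewModel()
        {
            Queue = new ObservableCollection<CalcCalculation>();
        }

        public event PropertyChangedEventHandler PropertyChanged;
    }

    // XAML: <ListBox ItemsSource="{Binding Queue}" />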
How to start and monitor calculations?
I think Tasks are ideal for this. You can start a Task using Task.Factory.StartNew(). Since it seems your Tasks will take long to execute, you might consider using TaskCreationOptions.LongRunning. You can also use the Task to find out when the calculation is complete (or whether it failed with an exception).
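For example, a small console sketch; RunCalculation is a placeholder for the real calculation body.

    using System;
    using System.Threading.Tasks;

    class Example
    {
        static void RunCalculation() { /* long-running work goes here */ }

        static void Main()
        {
            Task calc = Task.Factory.StartNew(
                RunCalculation,
                TaskCreationOptions.LongRunning);   // hint: give this task its own thread

            // The Task also tells you when the calculation finished or failed.
            Task monitor = calc.ContinueWith(t =>
            {
                if (t.IsFaulted)
                    Console.WriteLine("Calculation failed: " + t.Exception.InnerException.Message);
                else
                    Console.WriteLine("Calculation finished.");
            });

            monitor.Wait();   // keep the console sample alive until the continuation has run
        }
    }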
How to pause running calculations?
You can use ManualResetEventSlim for that. Normally it would be set, but when you want to pause a running Task, you Reset() it. The calculation will need to periodically call Wait() on that event. It's not possible to reasonably pause a running thread without cooperation from the calculation on that thread.
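A rough sketch of that cooperative pause; the row loop is a stand-in for whatever the calculation actually iterates over.

    using System.Threading;

    class PausableCalculation
    {
        // Set = running, Reset = paused.
        private readonly ManualResetEventSlim _gate = new ManualResetEventSlim(true);

        public void Pause()  { _gate.Reset(); }
        public void Resume() { _gate.Set(); }

        public void Run()
        {
            for (int row = 0; row < 1000; row++)   // stand-in for "read row by row"
            {
                _gate.Wait();                      // blocks here while paused
                // process the current row...
            }
        }
    }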
If you were using C# 5.0, a better approach would be to use something like PauseToken.
In .NET Framework 4.5, the answer here is the async API, which removes the need to manage threads yourself. For details, look at the async/await keywords.
From a broader perspective, a "CalcProcessor" class is a good idea, but I think the Task object will suffice to replace your "CalcCalculation" class. The processor can simply hold an enumerable of Tasks, and it can expose methods for managing the queue, if needed, as well as returning information about its status. When your application finally reaches a state where it must have the results, you can use Task.WaitAll to block the CalcProcessor's thread until all of the tasks complete.
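A sketch of what that processor could look like; the Start and WaitForAll names are illustrative.

    using System;
    using System.Collections.Generic;
    using System.Threading.Tasks;

    class CalcProcessor
    {
        private readonly List<Task> _calculations = new List<Task>();

        // Queue a calculation; each one becomes a Task rather than a CalcCalculation object.
        public void Start(Action calculation)
        {
            _calculations.Add(Task.Factory.StartNew(calculation,
                TaskCreationOptions.LongRunning));
        }

        // Block until every started calculation has completed.
        public void WaitForAll()
        {
            Task.WaitAll(_calculations.ToArray());
        }
    }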
Without more information about the actual goal here, it's hard to give better advice.
You can use the Observer pattern to display results on the UI and to push ordering changes back into the processor. The State and Command patterns will help you start, pause and cancel the calculations. These patterns answer your questions at the design level. Concurrency is still a problem, though; they do not solve the multi-threading issues themselves, but they make the threads easier to manage.
I suggest that you haven't broken the problem down far enough, which is the reason you are frustrated.
You need to start small and build up from there. You mention, but don't define, your actual requirements; they seem to be...
Need to be able to run ?N? calculations
Some need to be run concurrently (does this imply that they share data, if so how are you going to share the data)
Must be able to pause the calculation (don't use Thread.Suspend, as it potentially leaves a thread in an unstable state, particularly bad if you are sharing data), so you will need to build pause points into each calculation. Also need to consider how you are going to communicate the pause/unpause to the calculation
As far as methods, there are several to consider...
Threads are an obvious choice, but require careful tending to (starting, pausing, stopping, etc...)
You could also use BackgroundWorker or possibly Parallel.ForEach.
BackgroundWorker contains the framework for cancelling the worker and reporting progress (which can be useful).
My recommendation to start would be to go with BackgroundWorker, potentially subclassing it to add the Pause/Resume functionality you need. Determine how you are going to manage data sharing (at least use lock to protect against simultaneous access).
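One possible shape for such a subclass, only as a sketch: pause is a ManualResetEventSlim gate that the DoWork handler checks between units of work. The class and method names are illustrative.

    using System.ComponentModel;
    using System.Threading;

    class PausableWorker : BackgroundWorker
    {
        private readonly ManualResetEventSlim _gate = new ManualResetEventSlim(true);

        public PausableWorker()
        {
            WorkerSupportsCancellation = true;   // built-in cancellation support
            WorkerReportsProgress = true;        // built-in progress reporting
        }

        public void Pause()  { _gate.Reset(); }
        public void Resume() { _gate.Set(); }

        // The DoWork handler should call this between units of work (pause points).
        public void WaitIfPaused()
        {
            _gate.Wait();
        }
    }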
You may find BackgroundWorker too restrictive and need to go with Threads, but I'm usually able to avoid that.
If you post more clear requirements, or samples of what you've tried and didn't work, I'll be happy to help more.
For the queue you can use a heap data structure (a priority queue). This will help you prioritize your tasks. You should also use the thread pool so the calculations run efficiently, and try to split your tasks into small parts.

C# - Multiple producer threads with single consumer thread

Working on a C# project at the minute - a general idea of what I'm trying to do is...
A user has one or more portfolios with symbols they are interested in,
Each portfolio downloads data on the symbols to CSV, parses it, then runs a set of rules on the data,
Alerts are generated based on the outcomes of these rules
The portfolios should download the data etc. whenever a set interval has passed
I was planning on having each portfolio run on its own thread; that way when the interval has elapsed, each portfolio can proceed to download the data, parse it and run the rules on it concurrently, rather than one by one. The alerts should then be pushed onto another thread (!) that contains a queue of alerts. As it receives alerts, it sends them off to a client.
A bit much, but what would be the best way to go about this in C# - using threads, or something like BackgroundWorker and just having the alert queue running on a separate thread?
Any advice is much appreciated, new to some of this stuff so feel free to tell me if I'm completely wrong :)
You can use BlockingCollection. This was added in .NET 4.0 to support the producer/consumer scenario.
It also supports one consumer and multiple producers.
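A minimal sketch of that shape, assuming string alerts and a console write standing in for the real client push:

    using System;
    using System.Collections.Concurrent;
    using System.Threading.Tasks;

    class AlertDispatcher
    {
        private readonly BlockingCollection<string> _alerts = new BlockingCollection<string>();

        // Called from any portfolio thread (multiple producers).
        public void Publish(string alert)
        {
            _alerts.Add(alert);
        }

        // Single consumer: drains alerts as they arrive and forwards them to the client.
        public Task StartConsumer()
        {
            return Task.Factory.StartNew(() =>
            {
                foreach (var alert in _alerts.GetConsumingEnumerable())
                {
                    Console.WriteLine("Sending alert: " + alert);   // stand-in for the real client push
                }
            }, TaskCreationOptions.LongRunning);
        }
    }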
If this is entirely interval-driven and you can upload alerts to the client at the end of each interval, then you can use a simplified approach:
When your interval timer fires:
Use TPL to launch simultaneous tasks to parse data, giving each task a callback method to pass alerts back to main.
The callback writes alerts to a List protected by a lock statement.
Wait for all tasks to complete. Then execute a continuation routine to forward the alerts to the client.
Restart the interval timer.
If you need to start processing the next interval before the upload is finished, you can use a double-buffered approach where you write to one List, give that to the upload method, and then start writing alerts to a new List, switching back and forth between lists as you collect and upload alerts.
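Sketching that interval handler under these assumptions - the portfolio parsing and the upload call are placeholder methods, and the alert type is just a string:

    using System.Collections.Generic;
    using System.Linq;
    using System.Threading.Tasks;

    class AlertCollector
    {
        private readonly List<string> _portfolios = new List<string> { "A", "B", "C" };

        // Called each time the interval timer fires.
        public void OnIntervalElapsed()
        {
            var alerts = new List<string>();
            var alertLock = new object();

            // One task per portfolio; each passes its alerts back through the locked list.
            Task[] tasks = _portfolios.Select(p => Task.Factory.StartNew(() =>
            {
                var found = ParsePortfolio(p);
                lock (alertLock) { alerts.AddRange(found); }
            })).ToArray();

            // The continuation runs after all tasks finish; then the timer can be restarted.
            Task.Factory.ContinueWhenAll(tasks, _ => UploadAlerts(alerts));
        }

        private IEnumerable<string> ParsePortfolio(string portfolio) { yield break; }   // placeholder
        private void UploadAlerts(List<string> alerts) { }                              // placeholder
    }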

Data handling with C# chart control

I have a quick question for you all. This is my first time dealing with serial I/O, and I was wondering what the most efficient method is. I'll be reading in from a laser through RS-232.
From what I can tell from researching it, I can accomplish this by using a background worker to create a buffer from the serial port, then have the main thread access that buffer to plot and analyze the points. Is there a more efficient/reliable way to do this, or is this approach going to be my best bet?
Thanks!
How you read from the device depends on how data is retrieved from it. I have an application that gets data from a strain gauge (attached to a meter). This meter autonomously spits out readings, so I don't have to poll it. Because of this I simply use the SerialPort's DataReceived event and add data to an array.
I use the Microsoft Chart Control (built into .NET 4.0) which lives on the UI thread, so it has to be updated from the UI thread. The data comes in pretty fast (~100 points per second), so what I do is build a 100 point array and BeginInvoke (Winforms) that array off to the UI thread while starting a new array. This way the UI is updated periodically (~1 update per second).
If, on the other hand, your device requires polling (you have to ask it for data each time), then you'll want a dedicated thread that sits in a tight loop polling.
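Roughly what that DataReceived batching can look like; the port name, baud rate, the assumption of newline-terminated readings, and the PlotPoints call are all placeholders:

    using System;
    using System.IO.Ports;
    using System.Windows.Forms;

    class LaserReader
    {
        private readonly SerialPort _port = new SerialPort("COM1", 9600);
        private readonly Form _chartForm;          // the form hosting the chart control
        private double[] _batch = new double[100];
        private int _count;

        public LaserReader(Form chartForm)
        {
            _chartForm = chartForm;
            _port.DataReceived += OnDataReceived;  // raised on a worker thread
            _port.Open();
        }

        private void OnDataReceived(object sender, SerialDataReceivedEventArgs e)
        {
            // Assumes the laser sends one newline-terminated reading at a time.
            double value = double.Parse(_port.ReadLine());
            _batch[_count++] = value;

            if (_count == _batch.Length)
            {
                double[] full = _batch;
                _batch = new double[100];
                _count = 0;
                // Marshal the full batch onto the UI thread for plotting.
                _chartForm.BeginInvoke((Action)(() => PlotPoints(full)));
            }
        }

        private void PlotPoints(double[] points) { /* add the points to the chart series */ }
    }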

How to manage worker threads that wait until they have work to do?

I have a program that does the following:
Call a web service (there are many calls to the same web service)
Process the result of 1.
Insert result of 2. in a DB
So I think it should be better to do some multithreading. I think I can do it like this:
one thread is the master (let's call it A)
it creates some thread which calls the webservices (let's call it W)
when W has some results it sends them to A (or A detects that W has some stuff)
A sends the results to some computing thread (let's call it C)
when C has some results it sends them to A (or A detects that C has some stuff)
A sends the results to some database thread (let's call it D)
So sometimes C or D will wait for work to do.
With this technique I'll be able to set the number of threads for each task.
Can you please tell me how I can do that, and whether there is a pattern for it.
EDIT: I wrote "some" instead of "a" because I'll create many threads for the time-consuming processes, and maybe only one for the fastest.
It sounds to me like you could use the producer/consumer pattern. With .NET 4 this has become really simple to implement. Start a number of Tasks and use the BlockingCollection<T> as a buffer between the tasks. Check out this post for details.
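A sketch of that pipeline with a BlockingCollection between each stage; the web-service call, the computation and the database insert are placeholder methods, matching the W/C/D roles in the question:

    using System.Collections.Concurrent;
    using System.Collections.Generic;
    using System.Threading.Tasks;

    class Pipeline
    {
        private readonly BlockingCollection<string> _rawResults = new BlockingCollection<string>();
        private readonly BlockingCollection<int> _computed = new BlockingCollection<int>();

        public void Run()
        {
            // W: call the web service and queue the raw results.
            var w = Task.Factory.StartNew(() =>
            {
                foreach (var result in CallWebService()) _rawResults.Add(result);
                _rawResults.CompleteAdding();
            });

            // C: process raw results as they become available.
            var c = Task.Factory.StartNew(() =>
            {
                foreach (var raw in _rawResults.GetConsumingEnumerable()) _computed.Add(Compute(raw));
                _computed.CompleteAdding();
            });

            // D: insert computed results into the database.
            var d = Task.Factory.StartNew(() =>
            {
                foreach (var value in _computed.GetConsumingEnumerable()) InsertIntoDb(value);
            });

            Task.WaitAll(w, c, d);
        }

        private IEnumerable<string> CallWebService() { yield break; }   // placeholder
        private int Compute(string raw) { return raw.Length; }          // placeholder
        private void InsertIntoDb(int value) { }                        // placeholder
    }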
In .NET you have a thread pool.
When you release a thread pool thread it does not actually get closed; it just goes back into the pool. When you open a new one, you get it from the pool.
If threads are not used for a long time, the thread pool will close them.
I would start two timers, which will fire their event handlers on separate ThreadPool threads. The event handler for the first timer will check the web service for data, write it to a Queue<T> if it finds some, and then go back to sleep until the next interval.
The event handler for the second timer reads the queue and updates the database, then sleeps until its next interval. Both event handlers should wrap access to the queue in a lock to manage concurrency.
Separate timers with independent intervals will let you decouple when data is available from how long it takes to insert it into the database, with the queue acting as a buffer. Since generic queues can grow dynamically, you get some breathing room even if the database is unavailable for a time. The second event handler could even spawn multiple threads to insert multiple records simultaneously or to mirrored databases. The event handlers can also post updates to a log file or user interface to help you monitor activity.
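A rough sketch of that two-timer arrangement; the intervals, the web-service call and the database insert are placeholders:

    using System.Collections.Generic;
    using System.Threading;

    class PollAndStore
    {
        private readonly Queue<string> _queue = new Queue<string>();
        private readonly object _lock = new object();
        private Timer _pollTimer;
        private Timer _dbTimer;

        public void Start()
        {
            // Timer 1: check the web service every 5 seconds and queue any new data.
            _pollTimer = new Timer(_ =>
            {
                string data = CallWebService();        // placeholder
                if (data != null)
                    lock (_lock) { _queue.Enqueue(data); }
            }, null, 0, 5000);

            // Timer 2: every 10 seconds, drain the queue into the database.
            _dbTimer = new Timer(_ =>
            {
                while (true)
                {
                    string next;
                    lock (_lock)
                    {
                        if (_queue.Count == 0) break;
                        next = _queue.Dequeue();
                    }
                    InsertIntoDatabase(next);          // placeholder
                }
            }, null, 0, 10000);
        }

        private string CallWebService() { return null; }
        private void InsertIntoDatabase(string record) { }
    }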
