Executing operations in background in Windows Store apps - C#

I am building a universal app targeting the 8.1 runtime, and I am also new to the whole .NET world.
I have some operations in my application which I want to perform in parallel, asynchronously, and in the background. These operations include file upload and download (both I/O- and compute-bound), so I wanted to execute them in threads. I would like to process them on 3 threads. Each thread will work on an operation queue and wait perpetually (in a while(true) loop) until an operation is available for execution. These threads would also pass task status or progress updates to the UI.
I would also like to keep these threads running when the application is locked or suspended. If my application terminates or the network connection is lost, these threads should stop. I would also like to cancel the operations running in the threads as my requirements dictate (like cancelling a file download).
My initial hunch was to use threads. But since the Thread class is not available to Windows Store apps, my exploration of threading in Windows Store apps suggested using tasks to perform this.
I read the Task-based Asynchronous Pattern (TAP) whitepaper published by Microsoft, watched Channel 9 videos from Bruce Kyle, and read a lot of other blogs. I am convinced to use tasks, as they run on a thread pool that can take advantage of multi-core processors and perform better.
My questions on tasks are:
Will a task be useful as a background thread? Can tasks be run as background threads perpetually?
I don't really need my tasks to return results; the results from the operations can be relayed to the UI through events.
Do I have to schedule my tasks via the background task API when the app suspends?
Does the Task API fit my scenario?
Are background tasks in Windows the same as services in Android?
Thanks a lot and Regards,
Saurav

Note that file uploads and downloads already happen on background threads in Windows Store apps, so you don't really need to worry about it impacting your UI at all. If you want to do computationally-expensive work also, I would just spin off a new Task whenever you want to do something asynchronously unless you have specific needs around a dedicated thread reading from a queue.
If you must have a perpetual thread, do not run a busy loop waiting for work; use a signalable object like an AutoResetEvent instead to wake it up when there is something to do. This will minimize wasted CPU power (and hence battery).
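A minimal sketch of such a signalled worker, assuming AutoResetEvent is available in your app's profile (the queue of Action delegates is just an illustration; your operations would carry their own state and progress events):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

class OperationWorker
{
    private readonly ConcurrentQueue<Action> _queue = new ConcurrentQueue<Action>();
    private readonly AutoResetEvent _signal = new AutoResetEvent(false);
    private readonly CancellationTokenSource _cts = new CancellationTokenSource();

    public OperationWorker()
    {
        // A long-running task instead of a raw Thread, which Store apps can't create.
        Task.Factory.StartNew(Run, TaskCreationOptions.LongRunning);
    }

    public void Enqueue(Action operation)
    {
        _queue.Enqueue(operation);
        _signal.Set(); // wake the worker only when there is work
    }

    public void Stop()
    {
        _cts.Cancel();
        _signal.Set(); // wake it so it can observe the cancellation
    }

    private void Run()
    {
        while (!_cts.IsCancellationRequested)
        {
            _signal.WaitOne(); // sleeps here instead of busy-looping
            Action op;
            while (_queue.TryDequeue(out op))
                op();
        }
    }
}
```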
Background execution is limited in Windows Store apps; you can read about it on MSDN.

Related

Is it a good idea to use parallel programming in pods?

I'm starting with k8s, and I have a little problem with parallel processing in my pods.
Currently I'm using the .NET Core platform with C# 7.2 for my application, which is running in pods.
I'm trying to use parallel tasks in the app, but it looks like the application is using only one core.
So I'm thinking that I should use only the async/await pattern for this application and handle parallel processing via the number of pods in the deployment settings.
Is this opinion correct?
Thanks for help.
When to use the Parallel API?
When you have a CPU-intensive task and want to ensure all the CPU cores are effectively utilized. Note that Parallel calls are always blocking operations for the main/UI thread.
When to use async/await?
When your aim is to do processing asynchronously (in the background), allowing the main/UI thread to remain responsive. The main use case is calling remote processing logic, like a database query, which should not block the server thread. Async/await used for in-memory processing is mainly meant to keep the UI thread responsive for the end user, but that still uses a thread-pool thread; this is not the case for I/O processing, where no pool threads are used.
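The distinction can be sketched like this (the method names and URL are illustrative, not from the question):

```csharp
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

class Examples
{
    // CPU-bound: Parallel spreads the loop across cores, but blocks the
    // calling thread until every iteration has finished.
    static long SumOfSquares(int[] numbers)
    {
        long total = 0;
        Parallel.ForEach(numbers,
            () => 0L,                                        // per-thread accumulator
            (n, state, local) => local + (long)n * n,        // body
            local => Interlocked.Add(ref total, local));     // merge
        return total;
    }

    // IO-bound: async/await; no thread is blocked while the request is in flight.
    static async Task<string> FetchAsync(string url)
    {
        using (var client = new HttpClient())
            return await client.GetStringAsync(url);
    }
}
```

If the container is limited to one CPU, `Parallel.ForEach` will effectively run on that single core no matter how the code is written, which matches the behaviour described in the question.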
Regarding the Kubernetes setup:
Since Kubernetes is an orchestration mechanism for Docker, which virtualizes OS resources when setting up containers, you may want to ensure there is no configuration setting restricting the total number of assigned CPU cores, as that would have an adverse impact on overall performance. This aspect is outside the scope of the .NET Parallel APIs.

How do I monitor and limit the number of QueueBackgroundWorkItem threads in ASP.net?

I have an ASP.NET application acting as an interface between systems that tend to send data in bursts via multiple requests, and I would like to kick off a background task to perform some processing, but I would really like to have only a single background task doing it. I can use HostingEnvironment.QueueBackgroundWorkItem, but that will indiscriminately launch a thread for each incoming request, which is a problem.
When I launch a background process I want it to queue up the work it has (in the connected database) and then check if there is another background process running. If there is then it should finish because the other process will process the queued work. If there is no other background process running then I want it to start processing the queued work until there is no more to do and then it will stop.
The process is not a heavy task or long running task but the main constraint is that everything is processed in a strict order making parallel threading risky. In a single process it's easy to ensure everything gets processed in order.
How do I achieve this without shifting to an external service?
Seems like a classic producer-consumer scenario. Create a BlockingCollection that producers enqueue to, and one permanent LongRunning Task that drains that collection.
You can drain in batches if you want.
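A minimal sketch of that pattern (the WorkItem type and Process method are placeholders for the poster's actual processing):

```csharp
using System.Collections.Concurrent;
using System.Threading.Tasks;

class WorkItem { /* payload */ }

static class WorkQueue
{
    static readonly BlockingCollection<WorkItem> _queue =
        new BlockingCollection<WorkItem>();

    // One permanent consumer; LongRunning hints the scheduler to give it
    // a dedicated thread rather than tying up a pool thread.
    static readonly Task _consumer = Task.Factory.StartNew(() =>
    {
        // GetConsumingEnumerable blocks until an item arrives and yields
        // items in FIFO order, preserving the strict processing order.
        foreach (var item in _queue.GetConsumingEnumerable())
            Process(item);
    }, TaskCreationOptions.LongRunning);

    // Request handlers (the producers) just call this and return.
    public static void Enqueue(WorkItem item) => _queue.Add(item);

    static void Process(WorkItem item) { /* strict in-order processing */ }
}
```

Because there is exactly one consumer, ordering is guaranteed without any extra locking.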
This would not work with QueueBackgroundWorkItem, because you need to eventually exit the work that you put into QueueBackgroundWorkItem so that the worker process can shut down gracefully.

Threading or Task

I'm currently picking up C# again and developing a simple application that sends broadcast messages and displays them on a Windows Form when they are received.
I have a discovery class with two threads, one that broadcasts every 30 seconds, the other thread listens on a socket. It is in a thread because of the blocking call:
if (listenSocket.Poll(-1, SelectMode.SelectRead))
The first thread works much like a timer in a class library, it broadcasts the packet and then sleeps for 30 seconds.
Now in principle it works fine: when a packet is received I raise an event and the WinForm places it in a list. The problems start with the form, though, because the main UI thread requires Invoke. Now I only have two threads, and this doesn't seem the most effective approach in the long run; it will become complex as the number of threads grows.
I have explored Tasks, but these seem to be more oriented toward one-off long-running operations (much like the BackgroundWorker for a form).
Most threading examples I find all report to the console and do not have the problems of Invoke and locking of variables.
As I'm using .NET 4.5, should I move to Tasks or stick with threads?
Async programming will still delegate some aspects of your application to a different thread (a thread-pool thread); if you try to update the GUI from such a thread you will have similar problems to the ones you have today with regular threads.
However, async/await offers techniques that let you delegate work to a background thread and yet place a kind of wait point saying "please continue here on the GUI thread when you are finished with that operation", which effectively allows you to update the GUI without Invoke and keep it responsive. I am talking about ConfigureAwait, but there are other techniques as well.
If you don't know the async/await mechanism yet, it will take some investment of your time to learn all these new things. But you'll find it very rewarding.
But it is up to you to decide if you are willing to spend a few days learning and experimenting with a technology that is new to you.
Google around a bit on async/await; there are some excellent articles from Stephen Cleary, for instance http://blog.stephencleary.com/2012/02/async-and-await.html
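The "continue on the GUI thread" behaviour the answer describes can be sketched like this; packetList and ReceivePacketAsync are hypothetical names standing in for the poster's ListBox and a Task-based wrapper around the socket read:

```csharp
// Inside the Form class. ReceivePacketAsync is an assumed wrapper that
// completes when a broadcast packet arrives.
private async void Form1_Load(object sender, EventArgs e)
{
    while (true)
    {
        // The await does not block the UI thread; the socket read
        // happens asynchronously in the background.
        string packet = await ReceivePacketAsync();

        // Execution resumes here on the UI thread (the captured
        // SynchronizationContext), so the list can be updated
        // directly, with no Invoke call.
        packetList.Items.Add(packet);
    }
}
```

This is the default behaviour of await in a WinForms context; `ConfigureAwait(false)` would opt *out* of returning to the UI thread, which is useful in library code that never touches the GUI.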
Firstly, if you're worried about scalability, you should probably start with an approach that scales easily. The ThreadPool would work nicely. Tasks are based on the ThreadPool as well, and they allow for more complex situations like tasks/threads firing in a sequence (also based on conditions), synchronization, etc. In your case (server and client) this seems unneeded.
Secondly, it looks to me like you are worried about a bottleneck scenario where more than one thread will try to access a common resource like the UI or a DB. With DBs, don't worry; they handle concurrent access well. But in the case of the UI or another not-multithread-friendly resource, you have to manage parallel access yourself. I would suggest something like BlockingCollection, which is a nice way to implement the "many producers, one consumer" pattern. This way you could have multiple threads adding items and just one thread reading from the collection and passing them on to the single-threaded resource like the UI.
BTW, Tasks can also be long running i.e. run loops. Check this documentation.

Threads, Task, async/await, Threadpool

I am getting really confused here about multithreading :(
I am reading about the C# async/await keywords. I often read that by using this async feature, the code executes "non-blocking". People put code examples into two categories, "IO-bound" and "CPU-bound", and say that I should not use a thread when I execute IO-bound things, because that thread will just wait.
I don't get it... If I do not want a user to have to wait for an operation, I have to execute that operation on another thread, right?
Whether I use the ThreadPool, an instance of the Thread class, delegate.BeginInvoke, or the TPL, every asynchronous execution is done on another thread (with or without a callback).
What you are missing is that not every asynchronous operation is done on another thread. Waiting on an IO operation or a web service call does not require the creation of a thread. On Windows this is done using OS I/O completion ports.
What happens when you call something like Stream.ReadAsync is that the OS issues a read command to the disk and then returns to the caller. Once the disk completes the read, it notifies the OS kernel, which then triggers a callback into your process. So there is no need to create a new thread-pool thread that will just sit and block.
What is meant is this:
Suppose you query some data from a database (on another server): you send a request and just wait for the answer. Instead of having a thread block and wait for the result, it's better to register a callback that gets called when the data comes back; this is (more or less) what async/await does.
It will free the thread to do other things (give it back to the pool), but once your data comes back asynchronously it will get another thread and continue your code at the point where you left off (it's really a kind of state machine that handles this).
If your computation is really CPU-intensive (let's say you are calculating prime numbers), things are different: you are not waiting for some external IO, you are doing heavy work on the CPU. Here it's a better idea to use a thread so that your UI will not block.
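Both halves of that advice can be sketched side by side (the URL and method names are illustrative):

```csharp
using System.Net.Http;
using System.Threading.Tasks;

class Worker
{
    // IO-bound: the thread is released back to the pool while the
    // request is in flight; completion ports do the waiting.
    public async Task<int> DownloadLengthAsync(string url)
    {
        using (var client = new HttpClient())
        {
            string body = await client.GetStringAsync(url); // no thread blocked here
            return body.Length;
        }
    }

    // CPU-bound: push the heavy loop onto a pool thread with Task.Run
    // so the UI thread stays responsive while it grinds.
    public Task<int> CountPrimesAsync(int limit)
    {
        return Task.Run(() =>
        {
            int count = 0;
            for (int n = 2; n <= limit; n++)
            {
                bool prime = true;
                for (int d = 2; d * d <= n; d++)
                    if (n % d == 0) { prime = false; break; }
                if (prime) count++;
            }
            return count;
        });
    }
}
```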
I don't get it... If I do not want a user to have to wait for an operation, I have to execute that operation on another thread, right?
Not exactly. An operation will take however long it is going to take. When you have a single-user application, running long-running things on a separate thread lets the user interface remain responsive. At the very least this allows the UI to have something like a "Cancel" button that can take user input and cancel processing on the other thread. For some single-user applications, it makes sense to allow the user to keep doing other things while a long-running task completes (for example let them work on one file while another file is uploading or downloading).
For web applications, you do not want to block a thread from the thread pool during lengthy(ish) IO, for example while reading from a database or calling another web service. This is because there are only a limited number of threads available in the thread pool, and if they are all in use, the web server will not be able to accept additional HTTP requests.

Threadpool, order of execution and long running operations

I need to create multiple processing threads in a new application. Each thread has the possibility of being "long running". Can someone comment on the viability of the built-in .NET ThreadPool or some existing custom thread pool for use in my application?
Requirements :
Works well within a Windows service (queued work can be removed from the queue; currently running threads can be told to halt).
Ability to spin up multiple threads.
Work needs to be started in sequential order, but multiple threads can be processing in parallel.
Hung threads can be detected and killed.
EDIT:
Comments seem to be leading towards manual threading. Unfortunately I am held to version 3.5 of the framework. The ThreadPool was appealing because it would let me queue up work and have threads created for me when resources were available. Is there a good 3.5-compatible pattern (producer/consumer perhaps) that would give me this aspect of the ThreadPool without actually using it?
Your requirements essentially rule out the use of the .NET ThreadPool:
It generally should not be used for long-running threads, due to the danger of exhausting the pool.
It does work well in Windows services, though, and you can spin up multiple threads, limited automatically by the pool's limits.
You cannot guarantee thread starting times with the thread pool; it may queue work items for execution until it has enough free threads, and it does not even guarantee they will be started in the sequence you submit them.
There are no easy ways to detect and kill running threads in the ThreadPool.
So essentially, you will want to look outside the ThreadPool. I might recommend 'full' System.Threading.Thread instances, given all of your requirements. As long as you handle concurrency issues (as you must with any threading mechanism), I don't find the Thread class all that difficult to manage myself, really.
Simple answer: the Task class (Fx4) meets most of your requirements.
Cancellation is cooperative, i.e. your Task code has to check for it.
Detecting hung threads is difficult, though; that is a very demanding requirement anyway.
But I can also read your requirements as calling for a job queue, where the 'work' consists of mostly similar jobs. You could roll your own system that consumes that queue and monitors execution on a few threads.
I've done essentially the same thing with .Net 3.5 by creating my own thread manager:
Instantiate worker classes that know how long they've been running.
Create threads that run a worker method and add them to a Queue<Thread>.
A supervisor thread reads threads from the Queue and adds them to a Dictionary<int, Worker> as it launches them until it hits its maximum running threads. Add the thread as a property of the Worker instance.
As each worker finishes it invokes a callback method from the supervisor that passes back its ManagedThreadId.
The supervisor removes the thread from the Dictionary and launches another waiting thread.
Poll the Dictionary of running workers to see if any have timed out, or put timers in the workers that invoke a callback if they take too long.
Signal a long-running worker to quit, or abort its thread.
The supervisor invokes callbacks to your main thread to inform of progress, etc.
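A very condensed, 3.5-compatible sketch of such a supervisor (Worker and the maximum of 3 threads are illustrative; timeout polling and progress callbacks are omitted for brevity):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

class Worker
{
    public DateTime Started;              // lets a polling loop detect timeouts
    public void Run() { Started = DateTime.UtcNow; /* actual work here */ }
}

class Supervisor
{
    readonly Queue<Worker> _pending = new Queue<Worker>();
    readonly object _gate = new object();
    int _running;
    const int MaxRunning = 3;

    // Work is submitted in order; launch order follows submission order.
    public void Submit(Worker worker)
    {
        lock (_gate)
        {
            _pending.Enqueue(worker);
            TryLaunch();
        }
    }

    // Must be called while holding _gate.
    void TryLaunch()
    {
        while (_running < MaxRunning && _pending.Count > 0)
        {
            Worker next = _pending.Dequeue();
            _running++;
            Thread t = new Thread(delegate()
            {
                try { next.Run(); }
                finally
                {
                    // The "callback" step: free the slot and start a waiter.
                    lock (_gate) { _running--; TryLaunch(); }
                }
            });
            t.IsBackground = true;
            t.Start();
        }
    }
}
```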
