Is it a good idea to use parallel programming in pods? - C#

I'm starting with k8s, and I have a little problem with parallel processing in my pods.
Currently I'm using the .NET Core platform with C# 7.2 for my application, which runs in pods.
I'm trying to use parallel tasks in the app, but it looks like the application is using only one core.
So I'm thinking that I should use only the async/await pattern in this application and solve parallel processing through the number of pods in the deployment settings.
Is this assessment correct?
Thanks for the help.

When to use the Parallel API?
Use it when you have a CPU-intensive task and want to ensure all CPU cores are effectively utilized. Parallel calls are always blocking operations for the main/UI thread.
When to use async/await?
Use it when your aim is to do processing asynchronously (in the background), allowing the main/UI thread to remain responsive. The main use case is calling remote processing logic, like a database query, which should not block the server thread. Async/await used for in-memory processing still uses thread-pool threads (its main benefit there is keeping the UI thread responsive for the end user); for I/O processing, no pool threads are used during the wait.
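As a rough sketch of the two styles (the method names here are illustrative, not from the question):

```csharp
using System;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

public static class Demo
{
    // CPU-bound: Parallel.For blocks the caller until all iterations
    // finish, spreading the loop across the available cores.
    public static long SumOfSquares(int n)
    {
        long total = 0;
        Parallel.For(0, n,
            () => 0L,                                   // per-thread subtotal
            (i, state, subtotal) => subtotal + (long)i * i,
            subtotal => Interlocked.Add(ref total, subtotal));
        return total;
    }

    // I/O-bound: await releases the thread while the request is in
    // flight; no thread is blocked during the network wait.
    public static async Task<int> FetchLengthAsync(HttpClient client, string url)
    {
        string body = await client.GetStringAsync(url);
        return body.Length;
    }
}
```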
Regarding the Kubernetes setup
Since Kubernetes is an orchestration mechanism for Docker, which virtualizes OS resources to set up containers, you should make sure there is no configuration setting that restricts the total number of CPU cores assigned to the container, since that would have an adverse impact on overall performance. This aspect is outside the scope of the .NET parallel APIs.
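One quick sanity check from inside the pod (a hypothetical console snippet, not from the question) is to print what the runtime actually sees:

```csharp
using System;

class CpuCheck
{
    static void Main()
    {
        // Under a Kubernetes CPU limit (e.g. resources.limits.cpu: "1"),
        // the runtime may report fewer logical processors than the node
        // physically has, and Parallel.* sizes its degree of parallelism
        // from this number.
        Console.WriteLine($"Logical processors visible: {Environment.ProcessorCount}");
    }
}
```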

Related

Are the Task Parallel Library, PLINQ or concurrent collections used in web applications built with ASP.NET Core MVC or Razor Pages?

I am a beginner in C# and .NET Core, so I have very limited knowledge of these advanced topics (the Task Parallel Library, PLINQ and concurrent collections). If my question seems naive, I apologize.
I have developed small web apps using ASP.NET Core MVC and Razor Pages. I have used async and await in those web apps but never had an opportunity to use the TPL, PLINQ or concurrent collections. At present I am learning multithreading and I am looking for opportunities to apply the TPL, PLINQ or concurrent collections in my web apps.
So here are my queries:
Are the Task Parallel Library, PLINQ or concurrent collections used in web applications built with ASP.NET Core MVC or Razor Pages?
If they are, in what situations are the TPL and PLINQ more suitable than async?
Are there any tutorials on this topic?
[Update]
If I use PLINQ, the TPL or concurrent collections in a utility library or repository services, will it be too much for my web apps, or will it create deadlocks or blocking in my web app (.NET Core)?
Or am I making a simple thing overly complex?
Thank you.
Are the Task Parallel Library, PLINQ or concurrent collections used in web applications built with ASP.NET Core MVC or Razor Pages?
The short answer is: no for PLINQ and manual multi-threading.
First, async/await is built around the Task class, which is part of the TPL.
Second, it's considered bad practice to do parallelism and spin up multiple worker threads in an ASP.NET (Core) application.
In ASP.NET Core you use the TPL and async/await for truly asynchronous operations (network calls, I/O such as file access, database calls, etc.), where the application has to wait for an external component to finish.
CPU-bound tasks (calculations running on one or multiple threads) aren't awaited in ASP.NET Core applications; they are simply done on the request thread, in a "blocking" fashion.
Legacy ASP.NET used to have its own thread pool for managing connections; ASP.NET Core just uses the default thread pool. In legacy ASP.NET you would still get single-threaded execution per request, even if you used PLINQ or fired off multiple tasks and awaited them. ASP.NET Core doesn't have that restriction.
Still, spinning up your own tasks and threads is a bad idea. It may make a single operation faster when the number of requests is low (lower latency), but it can make the application unresponsive in high-request scenarios due to thread starvation, with the application spending a lot of CPU time managing and switching between threads. It also interferes with the thread pool's heuristics.
On top of that, under thread starvation your application won't accept any new requests. For those reasons CPU-bound tasks are meant to be run synchronously. Running CPU-bound work via Task.Run alone gains you nothing: you free up the request thread while waiting for the task to finish, but since you spin up a new task, another thread is used to process it.
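A small sketch of that last point (naive Fibonacci stands in for any CPU-bound work; the names are illustrative):

```csharp
using System;
using System.Threading.Tasks;

public static class CpuBoundDemo
{
    public static int Fibonacci(int n) =>
        n < 2 ? n : Fibonacci(n - 1) + Fibonacci(n - 2);

    public static async Task Run()
    {
        // Recommended in ASP.NET Core: do the CPU work right on the
        // request thread ("blocking", but no extra thread is consumed).
        int direct = Fibonacci(25);

        // Task.Run frees the request thread, but another pool thread is
        // occupied for exactly the same duration -- no capacity is gained.
        int offloaded = await Task.Run(() => Fibonacci(25));

        Console.WriteLine(direct == offloaded); // same result, same total CPU cost
    }
}
```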
The same applies to parallel tasks. If you run too many per request, at some point there are too many tasks queued and too many requests in flight; the application stops accepting new requests and users get the dreaded "50x" HTTP errors. Not spinning up many parallel tasks would allow you to accept a much higher number of connections and queue them while the other requests complete.
Concurrency and parallel tasks are important for desktop applications, and awaiting them there makes sense (to avoid blocking the UI thread), even for a single lengthy operation. Also, in a desktop application you can be certain the application is only used by a single user, so using as many CPU cores as possible to finish a task quickly is a very good thing.
That just doesn't translate well to a shared application such as an ASP.NET Core application, where an unknown number of users interact with it.

Real-time AutoResetEvent after async/await continuation in C#

I am using C# to control a hardware device. The program is structured as
A hardware control thread (normal CPU priority)
while (notFinished)
{
    Prepare();
    await DeviceCommunication();
    autoResetEvent.WaitOne();
}
A UI thread (normal CPU priority)
A heavy computational thread (below normal CPU priority)
There is a device API layer written with C# Tasks. The AutoResetEvent delay after a Task continuation is sometimes as high as 500 ms depending on the state of the PC (even when the heavy computational thread is not running). This is generally fine, except during some critical hardware control moments, which require a 10 ms response time.
I tested setting the consumer thread to above-normal priority and mocking the asynchronous function to force it to run synchronously. That seemed to solve the problem. However, the real asynchronous functions contain awaits; they immediately release the thread, and the continuations run on thread-pool threads.
Question 1
Is the 500 ms delay normal? I am using VirtualBox with an i5 and 2 threads. I expect the target PC to perform similarly to mine.
Assuming my findings are valid, my choices for solving the problem are:
Use Task.GetAwaiter().GetResult() to turn async into sync. It should not cause deadlocks.
Rewrite the device API layer to support true sync operations. This is elegant and follows the general rules, but those are just nice to have.
Set Task scheduling priority and CPU priority.
Use third-party Task libraries.
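Choice 1 might look like the following sketch (DeviceCommunication is stubbed here with a delay; the real call would come from the device API layer in the question):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

public static class SyncOverAsyncSketch
{
    // Stand-in for the real device API call.
    static async Task DeviceCommunication() => await Task.Delay(10);

    public static void ControlLoopIteration(AutoResetEvent autoResetEvent)
    {
        // Block the control thread until the call completes, so the code
        // after it continues on this same (priority-boosted) thread
        // instead of on a thread-pool continuation. This is safe from
        // deadlocks here because a plain thread has no UI/ASP.NET
        // synchronization context to capture.
        DeviceCommunication().GetAwaiter().GetResult();
        autoResetEvent.WaitOne();
    }
}
```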
Question 2
Are there better choices?
Question 3
How do I do choice 3 (set Task scheduling priority and CPU priority)? Is a custom TaskScheduler the only way to do it?
Windows does not provide scheduling guarantees of any kind. Period. We frequently see extra delays of several seconds for C# code on a Windows 7 system while device drivers are busy. If you really have a hard real-time requirement, you need to be running on an RTOS.

Executing operations in Background in Windows Store apps

I am building a universal app targeting the 8.1 runtime and am also new to the whole .NET world.
I have some operations in my application that I want to perform in parallel, asynchronously and in the background. These operations include file upload and download (both I/O- and compute-bound), so I wanted to execute them in threads. I would like to process them on 3 threads. Each thread would work from an operation queue and wait perpetually (in a while(true) loop) until an operation is available for execution. These threads would also pass task status or progress updates to the UI.
I would also like these threads to keep running when the application is locked or suspended. If my application terminates or the network connection is lost, these threads should stop. I would also like to be able to cancel the operations running on the threads as needed (like cancelling a file download).
My initial hunch was to use threads, but since the Thread class is not available for Windows Store apps, my exploration of threading in Windows Store apps pointed to using tasks instead.
I read the Task-based Asynchronous Pattern (TAP) whitepaper published by Microsoft, Channel 9 videos from Bruce Kyle and a lot of other blogs. I am convinced to use tasks, since they run on a thread pool that can take advantage of multi-core processors and performs better.
My questions on tasks are:
Will they be useful as background threads? Can tasks run as background threads perpetually?
I don't really need my tasks to return results; the results of the operations can be relayed to the UI through events.
Do I have to schedule my tasks from a background task API when the app suspends?
Does the Task API fit my scenario?
Are background tasks in Windows the same as services in Android?
Thanks a lot and regards,
Saurav
Note that file uploads and downloads already happen on background threads in Windows Store apps, so you don't really need to worry about them impacting your UI at all. If you also want to do computationally expensive work, I would just spin off a new Task whenever you want to do something asynchronously, unless you have specific needs around a dedicated thread reading from a queue.
If you must have a perpetual thread, do not run a busy loop waiting for work; use a signalable object (like an AutoResetEvent) to wake it up when there is something to do. This will minimize wasted CPU power (and hence battery).
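A minimal sketch of such a signaled worker (the class and member names are made up for illustration):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

public class SignaledWorker
{
    private readonly ConcurrentQueue<Action> _queue = new ConcurrentQueue<Action>();
    private readonly AutoResetEvent _signal = new AutoResetEvent(false);
    private volatile bool _running = true;

    public Task Start() => Task.Run(() =>
    {
        while (_running)
        {
            _signal.WaitOne();                    // sleep until signaled; no busy loop
            while (_queue.TryDequeue(out Action work))
                work();                           // drain everything queued so far
        }
    });

    public void Enqueue(Action work)
    {
        _queue.Enqueue(work);
        _signal.Set();                            // wake the worker
    }

    public void Stop()
    {
        _running = false;
        _signal.Set();                            // wake it so the loop can exit
    }
}
```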
Background execution is limited in Windows Store apps; you can read about it on MSDN.

Are threads executed on multiple processors?

It appears that the Task class gives us the ability to use the system's multiple processors. Does the Thread class work on multiple processors as well, or does it use time slicing only on a single processor? (Assume a system with multiple cores.)
My question is: if threads will/could be executed on multiple cores, then what is so special about Task and parallelism?
When you create a thread it forms a kind of logical group of work. The .NET Framework acquires CPU time from the system. Most likely multiple threads will run on different cores (this is something the system handles; not even .NET has any influence on it).
But it is possible that the system will execute all your threads on the same core, or even move execution between several cores during the run. Keep in mind that you are creating managed threads, not real system threads.
(More precisely: the system could execute your managed threads within the same system thread, or use multiple system threads for multiple managed threads.)
You may want to have a look at this blog post: http://www.drdobbs.com/parallel/managed-threads-are-different-from-windo/228800359 The explanation there is pretty good in terms of details.
Not a bad first question. +1
I would suggest you read Threading in C# by Joseph Albahari. If you read through the post you will find:
How Threading Works
Multithreading is managed internally by a thread scheduler, a function the CLR typically delegates to the operating system. A thread scheduler ensures all active threads are allocated appropriate execution time, and that threads that are waiting or blocked (for instance, on an exclusive lock or on user input) do not consume CPU time.
So multithreading is handled by the operating system through a thread scheduler.
Further, the post says:
On a multi-processor computer, multithreading is implemented with a mixture of time-slicing and genuine concurrency, where different threads run code simultaneously on different CPUs. It's almost certain there will still be some time-slicing, because of the operating system's need to service its own threads, as well as those of other applications.
- It appears that the Task class provides us the ability to use multiple processors in the system.
- If threads will/could be executed on multiple cores, then what is so special about Task parallelism?
The Task class is just a small, but important, part of the TPL (Task Parallel Library). The TPL is a high-level abstraction, so you don't have to work with threads directly. It encapsulates and hides most of the churn you'd have to implement for any decent multi-threaded application.
Tasks don't introduce any new functionality that you couldn't implement on your own, per se (which is the core of your question, I believe). They can be synchronous or asynchronous; when they are async, they use either the Thread class internally or I/O completion ports.
Some of the points addressed by TPL are:
Rethrow exceptions from a child thread on the calling thread.
Asynchronous code (launch a thread, run arbitrary code while waiting for the child thread, resume when the child thread is done) reads as if it were synchronous, greatly improving readability and maintainability
Simpler thread cancellation (using CancellationTokenSource)
Parallel queries/data manipulation using PLINQ or the Parallel class
Asynchronous workflows using TPL Dataflow
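As a small sketch of two of those points (parallel queries and cancellation), assuming a plain console context:

```csharp
using System;
using System.Linq;
using System.Threading;

public static class TplSketch
{
    // PLINQ spreads the query across the available cores and supports
    // cooperative cancellation via a CancellationToken.
    public static long ParallelSumOfSquares(int n, CancellationToken token)
    {
        return Enumerable.Range(0, n)
            .AsParallel()
            .WithCancellation(token)
            .Sum(i => (long)i * i);
    }
}
```

Cancelling the token from a CancellationTokenSource mid-query surfaces an OperationCanceledException on the calling thread, which is the exception-marshaling behavior from the first point above.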

Using ThreadPool.QueueUserWorkItem in ASP.NET

In ASP.NET, to create a huge PDF report I am using ThreadPool.QueueUserWorkItem. My requirement is that the report has to be created asynchronously, and I do not want to wait for the response. I plan to achieve it with the code below:
protected void Button1_Click(object sender, EventArgs e)
{
    ThreadPool.QueueUserWorkItem(report => CreateReport());
}

public void CreateReport()
{
    // This method will take 30 seconds to finish its work
}
My question is: will ThreadPool.QueueUserWorkItem create a new thread from the ASP.NET worker process or some system thread? Is this a good approach? I may have hundreds of concurrent users accessing the web page.
The QueueUserWorkItem() method utilizes the process's ThreadPool, which automatically manages a number of worker threads. These threads are assigned tasks, run them to completion, and are then returned to the ThreadPool for reuse.
Since this is hosted in ASP.NET the ThreadPool will belong to the ASP.NET process.
The ThreadPool is a very good candidate for this type of work, as the alternative of spinning up a dedicated thread is relatively expensive. However, you should also consider the following limitations of the ThreadPool:
The ThreadPool is used by other parts of .NET and provides a limited number of threads. If you overuse it, your tasks may be blocked waiting for others to complete. This is especially a concern in terms of scalability; however, it shouldn't dissuade you from using the ThreadPool unless you have reason to believe it will be a bottleneck.
ThreadPool tasks must be carefully managed to ensure they are returned for reuse. Unhandled exceptions or aborted returns from a background thread can essentially "leak" that thread and prevent it from being reused. In such scenarios the ThreadPool may effectively lose its threads and cause a serious slowdown or halt of the process.
The tasks you assign to the ThreadPool should be short-lived. If your processing is intensive, it's a better idea to give it a dedicated thread.
All these points relate to the simple concept that the ThreadPool is intended for small tasks, where its threads provide a cost saving to the consuming code by being reused. Your scenario sounds like a reasonable case for using the ThreadPool; however, you will want to code carefully around it and run realistic load tests to determine whether it is the best approach.
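A defensive version of the question's handler might look like this sketch (the delegates stand in for the real CreateReport and a hypothetical logger):

```csharp
using System;
using System.Threading;

public static class ReportWorker
{
    public static void QueueReport(Action createReport, Action<Exception> log,
                                   ManualResetEventSlim done)
    {
        ThreadPool.QueueUserWorkItem(_ =>
        {
            try
            {
                createReport();      // the long-running work (~30 s in the question)
            }
            catch (Exception ex)
            {
                log(ex);             // never let an exception escape a pool thread
            }
            finally
            {
                done.Set();          // let interested code observe completion
            }
        });
    }
}
```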
The thread pool will manage the number of active threads as needed. Once a thread is done with a task, it continues with the next queued task. Using the thread pool is normally a good way to handle background processing.
When running in an ASP.NET application there are a couple of things to be aware of:
ASP.NET applications can be recycled for various reasons. When this happens all queued work items are lost.
There is no simple way to signal back to the client web browser that the operation completed.
A better approach in your case might be a WCF service with a REST/JSON binding that is called by AJAX code on the client web page to do the heavy work. This would give you the possibility of reporting progress and results back to the user.
In addition to what Anders Abel has already laid out, which I agree with entirely, you should consider that ASP.NET also uses the thread pool to respond to requests. So if you have long-running work like this using up a thread-pool thread, it is effectively stealing resources that ASP.NET could use to fulfill other requests.
If you were to ask me how best to architect it, I would say dispatch the work to a WCF service using one-way messaging over the MSMQ transport. That way it is fast to dispatch and resilient to failure, and processing of the requests on the WCF side can be tightly controlled, because the messages just sit on the queue waiting to be processed. If your server can only create 10 PDFs at a time, you would set maxConcurrentCalls for the WCF service to 10 and it would pull at most 10 messages off the queue at once. Also, if your service shuts down, it will simply resume processing when it starts up again.
