I want to call the same database procedure from a C# program on multiple database servers at the same time. How can I implement this using multithreading, or is there an alternative method?
e.g. server1 takes 2 min, server2 takes 3 min, server3 takes 1 min
total time taken = 6 min.
I want the C# program to run the database procedure in parallel on all servers, so that I can get the result within 3 min (the time of the slowest server).
Sounds reasonable. You could create a method for each task which performs the job you need, then use the ThreadPool to invoke these tasks in parallel.
If we have to use the same function, would I have to implement it multiple times? Like this:
using System;
using System.Threading;

public class MyThread
{
    public static void Main()
    {
        // Queue the task.
        Console.WriteLine(DateTime.Now);
        ThreadPool.QueueUserWorkItem(new WaitCallback(ThreadProc1));
        //ThreadPool.QueueUserWorkItem(new WaitCallback(ThreadProc2));
        Console.WriteLine("Main thread does some work, then sleeps.");
        // If you comment out the Sleep, the main thread exits before
        // the thread pool task runs. The thread pool uses background
        // threads, which do not keep the application running. (This
        // is a simple example of a race condition.)
        Thread.Sleep(1000);
        Console.WriteLine("Main thread exits.");
        Console.WriteLine(DateTime.Now);
        Console.Read();
    }

    // This thread procedure performs the task.
    static void ThreadProc1(object obj)
    {
        // No state object was passed to QueueUserWorkItem, so obj is null.
        Console.WriteLine("Hello world!!, this is ONE.");
    }

    static void ThreadProc2(object obj)
    {
        // No state object was passed to QueueUserWorkItem, so obj is null.
        Console.WriteLine("Hello world!!, this is TWO.");
    }
}
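You don't need a separate method per server: QueueUserWorkItem accepts a state object, so a single worker method can be queued once per server with the server name passed as state. A minimal sketch of that idea follows; the server names and the CallProcedure helper are placeholders, not from the original post.

```csharp
using System;
using System.Threading;

public class MultiServerCaller
{
    public static void Main()
    {
        string[] servers = { "server1", "server2", "server3" }; // placeholder names

        using (var done = new CountdownEvent(servers.Length))
        {
            foreach (string server in servers)
            {
                // The same worker is queued once per server; the server
                // name travels in the state argument.
                ThreadPool.QueueUserWorkItem(state =>
                {
                    CallProcedure((string)state);
                    done.Signal();
                }, server);
            }
            done.Wait(); // block until every server has finished
        }
        Console.WriteLine("All servers done.");
    }

    static void CallProcedure(string server)
    {
        // In the real program this would open a connection to `server`
        // and execute the stored procedure; here we just simulate work.
        Console.WriteLine("Calling procedure on " + server);
    }
}
```

Because the three calls run concurrently, the total time is roughly that of the slowest server rather than the sum of all three.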
I tested a console application like this:
class Program
{
    static void Main(string[] args)
    {
        Test();
        Console.WriteLine("C");
        Console.ReadLine();
    }

    static async void Test()
    {
        Console.WriteLine("A");
        await Task.Delay(2000);
        Console.WriteLine("B");
    }
}
This application printed A and C immediately, then B after 2 seconds. It looks fine. But I read an article about async/await, "There Is No Thread" (http://blog.stephencleary.com/2013/11/there-is-no-thread.html), which says async/await does not create an additional thread.
So back to my console application: I think the main thread is blocked on Console.ReadLine(), so the remaining code in Test() (Console.WriteLine("B")) should not execute until Console.ReadLine() completes. But the actual result is different: the remaining code executes regardless of the blocked main thread.
I want to know whether await works like a CPU interrupt: is the instruction pointer moved to the remaining code (Console.WriteLine("B");) and then moved back to the interrupted position (Console.ReadLine();) after execution?
Unlike a Windows Forms app, console applications don't have a single "blessed" thread (the UI thread). As such, no special synchronization context is installed by default in console applications, so the continuations that await makes use of are in fact scheduled using the thread pool.
So there's no need to "interrupt" the thread that's currently waiting on Console.ReadLine - another thread is used instead.
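You can observe this directly by printing the managed thread id before and after the await. This is a sketch based on the code in the question, with the id printing added:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class AwaitThreadDemo
{
    static void Main()
    {
        Console.WriteLine("Main thread: " + Thread.CurrentThread.ManagedThreadId);
        Test();
        Console.ReadLine(); // Main blocks here, yet B still prints
    }

    static async void Test()
    {
        Console.WriteLine("A on thread " + Thread.CurrentThread.ManagedThreadId);
        await Task.Delay(2000);
        // With no synchronization context, this continuation runs on a
        // thread-pool thread, not on the blocked main thread.
        Console.WriteLine("B on thread " + Thread.CurrentThread.ManagedThreadId);
    }
}
```

The two ids typically differ, showing that the continuation was scheduled on the thread pool rather than "interrupting" the main thread.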
I have some simple code where I'm trying to get a better understanding of how a method can be called asynchronously in C#.
I have a function called Function1 which I want to run asynchronously:
static void Function1(out int threadId)
{
    Console.WriteLine("I'm inside Function 1. Going to sleep for 7 seconds...");
    Thread.Sleep(7000);
    Console.WriteLine("I'm awake now");
    threadId = Thread.CurrentThread.ManagedThreadId;
}
Then I have a second function Function2 which I want to run normally (synchronously)
static void Function2()
{
    Thread.Sleep(3000);
    Console.WriteLine("I'm inside Function 2");
}
I also created a delegate with the same method signature as Function1
delegate void AsyncMethodCaller(out int threadId);
And here is my main calling method:
static void Main(string[] args)
{
    int threadId;
    AsyncMethodCaller caller = new AsyncMethodCaller(Function1);
    caller.BeginInvoke(out threadId, null, null);
    Function2();
}
In my main method, I expect Function1 to start running asynchronously, and then, without waiting for it to finish, Function2 to execute. So I expect the following output:

I'm inside Function 1. Going to sleep for 7 seconds...
I'm inside Function 2
I'm awake now
Instead I just get the following output:

I'm inside Function 1. Going to sleep for 7 seconds...
I'm inside Function 2

Why is my expectation different from reality? Why is the "I'm awake now" line never reached?
Thank you
The basic process lifetime rules for .NET are that a process exits when all foreground threads have exited. Any background threads are simply aborted; the process will not wait for them.
Furthermore, when you call a delegate's BeginInvoke() method, this implicitly causes the delegate to be invoked using the thread pool. The thread pool is made entirely of background threads.
In other words, when you call BeginInvoke(), you are telling .NET to invoke the delegate using a thread that does not in and of itself guarantee its own lifetime. That thread can (and in this case, is) aborted once the single main foreground thread of the process exits, which occurs immediately after you call Function2().
If you want the asynchronously invoked delegate to complete normally, you'll have to wait for it explicitly. E.g.:
IAsyncResult result = caller.BeginInvoke(out threadId, null, null);
Function2();
caller.EndInvoke(out threadId, result);
The EndInvoke() method will block until the asynchronously invoked delegate has completed, allowing the main thread to wait for that to happen and thus ensuring that the thread used to invoke the delegate is not aborted before the invocation has completed.
The structure of your code example suggests you've already looked at MSDN's Calling Synchronous Methods Asynchronously, but in case you haven't I'll mention that page, as it includes a lot of details that would help explain how to deal with this particular scenario.
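As a side note, delegate BeginInvoke/EndInvoke is no longer supported on modern .NET (it throws PlatformNotSupportedException on .NET Core and later), so the equivalent pattern today is Task.Run plus Wait (or await). A sketch of the same scenario rewritten that way, under that assumption:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        int threadId = 0;
        // Run Function1 on a thread-pool thread and keep the Task handle.
        Task task = Task.Run(() => Function1(out threadId));
        Function2();  // runs synchronously on the main thread meanwhile
        task.Wait();  // block until Function1 has completed
        Console.WriteLine("Function1 ran on thread " + threadId);
    }

    static void Function1(out int threadId)
    {
        Console.WriteLine("I'm inside Function 1. Going to sleep for 7 seconds...");
        Thread.Sleep(7000);
        Console.WriteLine("I'm awake now");
        threadId = Thread.CurrentThread.ManagedThreadId;
    }

    static void Function2()
    {
        Thread.Sleep(3000);
        Console.WriteLine("I'm inside Function 2");
    }
}
```

The Wait() call plays the same role EndInvoke() does above: it keeps the main foreground thread alive until the background work has finished.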
What is the best way to implement a queue for threads, so that I only ever have a maximum number of threads running, and if I already have that many, the code waits for a free slot before continuing?
Pseudo-code-ish example of what I mean; I'm sure this can be done in a better way...
(Please check the additional requirements below.)
private int _MaxThreads = 10;
private int _CurrentThreads = 0;

public void main(string[] args)
{
    List<object> listWithLotsOfItems = FillWithManyThings();
    while (listWithLotsOfItems.Count > 0)
    {
        // get next item that needs to be worked on
        var item = listWithLotsOfItems[0];
        listWithLotsOfItems.RemoveAt(0);

        // IMPORTANT! more items can be added as we go.
        listWithLotsOfItems.AddRange(AddMoreItemsToBeProcessed());

        // wait for free thread slot
        while (_CurrentThreads >= _MaxThreads)
            Thread.Sleep(100);

        Interlocked.Increment(ref _CurrentThreads); // risk of letting more than one thread through here...
        Thread t = new Thread(new ParameterizedThreadStart(WorkerThread));
        t.Start(item);
    }
}

public void WorkerThread(object bigheavyObject)
{
    // do heavy work here
    Interlocked.Decrement(ref _CurrentThreads);
}
I looked at Semaphore, but that seems to need to run inside the thread, not outside before the thread is created. In that example the semaphore is used inside the thread, after it has been created, to halt it; in my case there could be over 100k work items to process before the job is done, so I would rather not create a thread before a slot is available.
(link to semaphore example)
In the real application, data can be added to the list of items as the program progresses, so Parallel.ForEach won't really work either (I'm doing this in a script component in an SSIS package to send data to a very slow WCF service).
SSIS has .Net 4.0
So, let me first of all say that what you're trying to do is only going to give you a small enhancement to performance in a very specific arrangement. It can be a lot of work to try and tune at the thread-allocation level, so be sure you have a very good reason before proceeding.
Now, first of all, if you want to simply queue up the work, you can put it on the .NET thread pool. It will only allocate threads up to the maximum configured and any work that doesn't fit onto those (if all the threads are busy) will be queued up until a thread becomes available.
The simplest way to do this is to call:
Task.Factory.StartNew(() => { /* Your code */});
This creates a TPL task and schedules it to run on the default task scheduler, which should in turn allocate the task to the thread-pool.
If you need to wait for these tasks to complete before proceeding, you can add them to a collection and then use Task.WaitAll(...), which takes an array:

var tasks = new List<Task>();
tasks.Add(Task.Factory.StartNew(() => { /* Your code */ }));
// Before leaving the script.
Task.WaitAll(tasks.ToArray());
However, if you need to go deeper and control the scheduling of these tasks, you can look at creating a custom task scheduler that supports limited concurrency. This MSDN article goes into more details about it and suggests a possible implementation, but it isn't a trivial task.
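If you'd rather keep your own loop (so that items can keep arriving as you go), a SemaphoreSlim gives you the "wait for a free slot" behaviour without the Thread.Sleep busy-wait and without the Interlocked race in the original pseudocode. SemaphoreSlim is available in .NET 4.0, so it works in an SSIS script component. A sketch with placeholder item handling:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

class Throttled
{
    // At most 10 work items may run concurrently.
    private static readonly SemaphoreSlim Slots = new SemaphoreSlim(10);

    static void Main()
    {
        var pending = new List<Task>();
        for (int i = 0; i < 100; i++) // stand-in for the growing work list
        {
            Slots.Wait(); // blocks here until a slot is free
            int item = i;
            pending.Add(Task.Factory.StartNew(() =>
            {
                try { Process(item); }
                finally { Slots.Release(); } // free the slot even on failure
            }));
        }
        Task.WaitAll(pending.ToArray());
    }

    static void Process(int item)
    {
        Thread.Sleep(100); // stand-in for the slow WCF call
    }
}
```

Because the loop itself blocks on Slots.Wait(), no task object is even created until a slot is available, which addresses the "100k items" concern.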
The easiest way to do this is with the overload of Parallel.ForEach() which allows you to select MaxDegreeOfParallelism.
Here's a sample program:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

namespace Demo
{
    public static class Program
    {
        private static void Main()
        {
            List<int> items = Enumerable.Range(1, 100).ToList();
            Parallel.ForEach(items, new ParallelOptions { MaxDegreeOfParallelism = 5 }, process);
        }

        private static void process(int item)
        {
            Console.WriteLine("Processing " + item);
            Thread.Sleep(2000);
        }
    }
}
If you run this, you'll see that it processes 5 elements very quickly and then there's a delay (caused by Thread.Sleep(2000)) before the next block of elements is processed. This is because in this sample code no more than 5 threads are allowed to execute at once.
Note that if MaxDegreeOfParallelism exceeds the thread pool's minimum thread count, it may take a while for all the threads to be started.
The reason is that Parallel.ForEach() uses thread-pool threads, and the thread pool keeps a certain number of threads available by default. When creating threads beyond this limit, a delay is introduced between each new thread-pool thread creation.
You can set the minimum number of threadpool threads to a higher value using ThreadPool.SetMinThreads(), but I do NOT recommend this.
However, if you do want to do so, here's an example which sets the minimum thread count to 20:
int dummy, ioThreads;
ThreadPool.GetMinThreads(out dummy, out ioThreads);
ThreadPool.SetMinThreads(20, ioThreads);
If you do that and then run the previous code with MaxDegreeOfParallelism = 20 you'll see that there's no longer any delay when the initial threads are created.
Have you considered using a Wait handle? See this
Also you can use Parallel.Foreach to manage the thread creation for you.
Hope it helps ;)
Could anyone please give a sample, or a link that describes, how to spawn threads where each does different work at the same time?
Suppose I have job1 and job2. I want to run both jobs simultaneously, in parallel. How can I do that?
Well, fundamentally it's as simple as:
ThreadStart work = NameOfMethodToCall;
Thread thread = new Thread(work);
thread.Start();
...

private void NameOfMethodToCall()
{
    // This will be executed on another thread
}
However, there are other options such as the thread pool or (in .NET 4) using Parallel Extensions.
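For the specific "run job1 and job2 at the same time" case, Parallel.Invoke from the Parallel Extensions mentioned above is the shortest route. A sketch with placeholder job methods:

```csharp
using System;
using System.Threading.Tasks;

class TwoJobs
{
    static void Main()
    {
        // Both delegates run in parallel; Invoke returns when both finish.
        Parallel.Invoke(Job1, Job2);
        Console.WriteLine("Both jobs done.");
    }

    static void Job1() { Console.WriteLine("job1 running"); }
    static void Job2() { Console.WriteLine("job2 running"); }
}
```

Unlike starting raw threads, Parallel.Invoke also waits for both jobs to complete before returning, so no explicit Join is needed.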
I have a threading tutorial which is rather old, and Joe Albahari has one too.
Threading Tutorial from MSDN!
http://msdn.microsoft.com/en-us/library/aa645740(VS.71).aspx
Threads in C# are modelled by the Thread class. When a process starts (you run a program), you get a single thread (also known as the main thread) to run your application code. To explicitly start another thread, you have to create an instance of the Thread class and call its Start method. Let's see an example:
using System;
using System.Threading;

public class Example
{
    public static void Main()
    {
        // Initialize a Thread object, passing your custom
        // method name to the constructor parameter.
        Thread thread = new Thread(SomeMethod);
        // Start running your thread.
        thread.Start();
        Console.WriteLine("Press Enter to terminate!");
        Console.ReadLine();
    }

    private static void SomeMethod()
    {
        // Your code here that you want to run in parallel;
        // in most cases it will be a CPU-bound operation.
        Console.WriteLine("Hello World!");
    }
}
You can learn more in this tutorial, Multithreading in C#, where you will learn how to take advantage of the Thread class and the Task Parallel Library provided by C# and the .NET Framework to create robust applications that are responsive, parallel, and meet user expectations.
I have a server application which needs to schedule the deferred execution of methods; in other words, a mechanism to run a method on a ThreadPool thread after a certain period of time.
void ScheduleExecution(int delay, Action someMethod)
{
    // How to implement this???
}

// At some other place:
// MethodX will be executed on a thread in the ThreadPool after 5 seconds.
ScheduleExecution(5000, MethodX);
Please suggest an efficient mechanism to achieve the above. I would prefer to avoid frequently creating new objects, since this activity is likely to happen A LOT on the server. The accuracy of the call also matters: MethodX being executed after 5200 ms is fine, but after 6000 ms is a problem.
Thanks in advance...
You could use the RegisterWaitForSingleObject method. Here's an example:
using System;
using System.Threading;

public class Program
{
    static void Main()
    {
        var waitHandle = new AutoResetEvent(false);

        ThreadPool.RegisterWaitForSingleObject(
            waitHandle,
            // Method to execute
            (state, timedOut) =>
            {
                Console.WriteLine("Hello World");
            },
            // Optional state object to pass to the method
            null,
            // Execute the method after 2 seconds
            TimeSpan.FromSeconds(2),
            // Execute the method only once. You can set this to false
            // to execute it repeatedly every 2 seconds.
            true);

        Console.ReadLine();
    }
}
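Another option is System.Threading.Timer, which also fires its callback on a thread-pool thread. A single timer instance can be reused via Change(), which keeps allocations low for a server doing this a lot. A sketch of the ScheduleExecution signature from the question, implemented this way (the one-shot-per-call design is an assumption for illustration):

```csharp
using System;
using System.Threading;

class Scheduler
{
    // Keep a reference so the timer isn't garbage-collected before it fires.
    private static Timer _timer;

    static void ScheduleExecution(int delay, Action someMethod)
    {
        // dueTime = delay, period = Timeout.Infinite => fire exactly once.
        _timer = new Timer(_ => someMethod(), null, delay, Timeout.Infinite);
    }

    static void Main()
    {
        // MethodX stand-in: runs on a ThreadPool thread after 5 seconds.
        ScheduleExecution(5000, () => Console.WriteLine("MethodX executed"));
        Console.ReadLine();
    }
}
```

Timer accuracy is typically in the tens of milliseconds, comfortably inside the 5200 ms tolerance mentioned, though it is not a hard real-time guarantee.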