BlockingCollection multiple consumer - c#

I have the following code with one producer thread and multiple consumer threads. Are multiple consumers thread safe here? For example, is there any chance that while thread 1 is consuming an item, thread 2 consumes in parallel and changes the value of the item that thread 1 is using?
namespace BlockingColl
{
public partial class Form1 : Form
{
public Form1()
{
InitializeComponent();
}
private void button1_Click(object sender, EventArgs e)
{
try
{
for (int i = 0; i < 3; i++)
{
ThreadPool.QueueUserWorkItem((x) =>
{
foreach (var item in bc.GetConsumingEnumerable())
{
Console.WriteLine(Thread.CurrentThread.ManagedThreadId + " - " + item + " - " + DateTime.Now.ToString("MM/dd/yyyy hh:mm:ss.fff tt"));
}
});
}
}
catch (Exception)
{
throw;
}
}
private void button2_Click(object sender, EventArgs e)
{
for (int i = 0; i < 3; i++)
{
ThreadPool.QueueUserWorkItem((x) =>
{
Cache.Consume();
});
}
for (int i = 0; i < 50000; i++)
{
Cache.bc.TryAdd(new Client() { ClientId = i, ClientName = "Name" + i });
}
}
}
static class Cache
{
public static BlockingCollection<Client> bc = new BlockingCollection<Client>();
public static void Consume()
{
foreach (var item in bc.GetConsumingEnumerable())
{
Console.WriteLine(Thread.CurrentThread.ManagedThreadId + " - " + item + " - " + DateTime.Now.ToString("MM/dd/yyyy hh:mm:ss.fff tt"));
}
}
}
public class Client
{
public int ClientId { get; set; }
public string ClientName { get; set; }
}
}
Thanks in advance

Once you've consumed an element it is removed from the collection, so no other thread will be able to access it (at least through the collection).
That Cache looks more like a buffer to me. What does it add on top of the blocking collection anyway? It's weird that the cache would be able to consume its own elements.

A BlockingCollection only synchronizes access to the collection itself, not to the objects stored in it.
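To illustrate that point, here is a minimal sketch (not the form code above): every item added to the collection is taken by exactly one consumer, so two consumers never receive the same instance unless you share it through some other path.
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;
class ConsumeOnceDemo
{
    static void Main()
    {
        var bc = new BlockingCollection<int>();
        long consumed = 0;
        // Three consumers compete for items; each item is handed to exactly one of them.
        var consumers = new Task[3];
        for (int c = 0; c < consumers.Length; c++)
        {
            consumers[c] = Task.Run(() =>
            {
                foreach (var item in bc.GetConsumingEnumerable())
                    Interlocked.Increment(ref consumed);
            });
        }
        for (int i = 0; i < 10000; i++)
            bc.Add(i);
        bc.CompleteAdding();         // lets GetConsumingEnumerable finish
        Task.WaitAll(consumers);
        Console.WriteLine(consumed); // always 10000: no item is consumed twice
    }
}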

Related

Thread-safe queue mechanism

Is my queue mechanism thread safe? I wonder if I need concurrent collections. Do I need to lock the Enqueue method? The console displays the queue count in the wrong order; does that affect maxQueueCount in the Load method? Can I improve it in some way? I want a queue with a maximum size, and I don't want the same item to be enqueued twice.
I have many database sources with stored procedures that select documents. Each document has a unique Id but may be contained in many data sources, so I need to check whether a document with a given Id has already been processed in my data flow. I don't want to clog my queue, so if the queue count reaches 1000 I don't want to enqueue new documents.
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;
using System.Timers;
class Program
{
public class Document : IItem
{
public Guid Id { get; set; }
}
static void Main()
{
var queueProvider = new Provider();
var docs = new List<IItem>
{
new Document { Id = Guid.NewGuid() },
new Document { Id = Guid.NewGuid() },
new Document { Id = Guid.NewGuid() },
new Document { Id = Guid.NewGuid() },
new Document { Id = Guid.NewGuid() }
};
try
{
var tasks = new List<Task>();
var task1 = Task.Factory.StartNew(() =>
{
var timer1 = new Timer(1000) { Interval = 1000 };
timer1.Elapsed += (object sender, ElapsedEventArgs e) =>
{
queueProvider.Load(docs, 1);
};
timer1.Enabled = true;
timer1.Start();
});
tasks.Add(task1);
var task2 = Task.Factory.StartNew(() =>
{
var timer1 = new Timer(1000) { Interval = 1000 };
timer1.Elapsed += (object sender, ElapsedEventArgs e) =>
{
queueProvider.Load(docs, 2);
};
timer1.Enabled = true;
timer1.Start();
});
tasks.Add(task2);
//Dequeue
//var task3 = Task.Factory.StartNew(() =>
//{
// var timer1 = new Timer(3000) { Interval = 1000 };
// timer1.Elapsed += (object sender, ElapsedEventArgs e) =>
// {
// queueProvider.Dequeue();
// };
// timer1.Enabled = true;
// timer1.Start();
//});
//tasks.Add(task3);
Task.WaitAll(tasks.ToArray());
}
catch (Exception e)
{
Console.WriteLine(e);
}
Console.ReadKey();
}
}
public interface IItem
{
Guid Id { get; set; }
}
public interface IProvider
{
void Enqueue(IItem feedingItem, int id);
}
public class Provider : IProvider
{
private readonly ConcurrentQueue<IItem> queue;
private readonly ConcurrentDictionary<Guid, DateTime> inputBuffor;
private readonly object locker = new object();
private int maxQueueCount = 3;
public Provider()
{
queue = new ConcurrentQueue<IItem>();
inputBuffor = new ConcurrentDictionary<Guid, DateTime>();
}
public IItem Dequeue()
{
queue.TryDequeue(out var item);
Console.WriteLine("Dequeue: " + item.Id);
return item;
}
public void Enqueue(IItem item, int id)
{
//lock (locker)
//{
if (inputBuffor.TryAdd(item.Id, DateTime.Now))
{
queue.Enqueue(item);
Console.WriteLine("Enqueue: " + item.Id + "taskId: " + id);
Console.WriteLine("Count: " + queue.Count + " Buffor: " + inputBuffor.Count);
}
else
{
Console.WriteLine("Not Enqueue: " + item.Id + "taskId: " + id);
}
//}
}
public void Load(IEnumerable<IItem> data, int id)
{
foreach (var item in data)
{
if (queue.Count < maxQueueCount)
Enqueue(item, id);
}
}
}
Update
I renamed the Enqueue method to TryEnqueue and switched from ConcurrentQueue to BlockingCollection.
var task1 = Task.Factory.StartNew(() =>
{
var timer1 = new Timer(1000) { Interval = 1000 };
timer1.Elapsed += (object sender, ElapsedEventArgs e) =>
{
foreach(var doc in docs)
{
if (queueProvider.TryEnqueue(doc, 1))
{
Console.WriteLine("Enqueue: " + doc.Id + "taskId: 2");
Console.WriteLine("Count: " + queueProvider.QueueCount + " Buffor: " + queueProvider.BufforCount);
}
else
{
Console.WriteLine("Not Enqueue: " + doc.Id + "taskId: 2");
}
}
};
timer1.Enabled = true;
timer1.Start();
});
tasks.Add(task1);
public bool TryEnqueue(IItem item, int id)
{
if (inputBuffor.TryAdd(item.Id, DateTime.Now))
{
if (queue.TryAdd(item))
{
return true;
}
}
return false;
}
public IItem Dequeue()
{
queue.TryTake(out var item);
return item;
}
Multiple threads could both satisfy queue.Count < maxQueueCount (at the same time), and then each thread would run your Enqueue method and push past your maxQueueCount. That is definitely not thread safe. I'd move that check into your Enqueue method and surround it with a lock.
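A minimal sketch of that suggestion, reusing the queue, inputBuffor, locker and maxQueueCount fields from the Provider above (the TryEnqueue name is just illustrative):
// Sketch only: the count check and the enqueue happen under one lock,
// so two threads can no longer both pass the check and overfill the queue.
public bool TryEnqueue(IItem item, int id)
{
    lock (locker)
    {
        if (queue.Count >= maxQueueCount)
            return false;                          // full: reject the item

        if (!inputBuffor.TryAdd(item.Id, DateTime.Now))
            return false;                          // duplicate Id: already seen

        queue.Enqueue(item);
        return true;
    }
}
Alternatively, a BlockingCollection created with a bounded capacity (for example new BlockingCollection<IItem>(1000)) enforces the size limit itself, and its TryAdd returns false once the collection is full.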

Bubbling events from child to parent (overal progress)

I have a problem that I don't know how to solve.
I have some classes (processors) that fire an event with progress information (a percentage of how far along they are). Multiple processors do this, and the top-level processor (Engine), which calls them all, needs to report its overall progress to the end user.
If I don't know beforehand how many items each processor will process, how can I give the user good feedback on how far along the whole process is?
Take a look at the following simplified example
NetFiddle
class Program {
static void Main(string[] args) {
var p = new Program();
p.Run();
}
private void Run() {
var engine = new Engine();
engine.UpdateProgress += Engine_UpdateProgress;
engine.Process();
Console.ReadLine();
}
private void Engine_UpdateProgress(object sender, UpdateProgressEventArgs e) {
Console.WriteLine($"{e.UpdateDateTime} - Caller: {e.Caller}, Percentage: {e.Percentage}");
}
}
public class Engine {
private readonly ProcessorA _processorA;
private readonly ProcessorB _processorB;
private readonly ProcessorC _processorC;
private readonly ProcessorD _processorD;
public event EventHandler<UpdateProgressEventArgs> UpdateProgress;
public Engine() {
_processorA = new ProcessorA();
_processorB = new ProcessorB();
_processorC = new ProcessorC();
_processorD = new ProcessorD();
//Handle events
_processorA.UpdateProgress += ProcessorA_UpdateProgress;
_processorB.UpdateProgress += ProcessorA_UpdateProgress;
_processorC.UpdateProgress += ProcessorA_UpdateProgress;
_processorD.UpdateProgress += ProcessorA_UpdateProgress;
}
private void ProcessorA_UpdateProgress(object sender, UpdateProgressEventArgs e) {
OnUpdateProgress(e);
}
public void Process() {
_processorA.Process();
_processorB.Process();
_processorC.Process();
_processorD.Process();
}
protected virtual void OnUpdateProgress(UpdateProgressEventArgs e) {
UpdateProgress?.Invoke(this, e);
}
}
public class ProcessorA : Processor {
private readonly ProcessorA_A _processorA_A;
public ProcessorA() {
_processorA_A = new ProcessorA_A();
//Handle events
_processorA_A.UpdateProgress += ProcessorA_A_UpdateProgress;
}
public void Process() {
_processorA_A.Process();
}
private void ProcessorA_A_UpdateProgress(object sender, UpdateProgressEventArgs e) {
OnUpdateProgress(e);
}
}
public class ProcessorB : Processor {
public void Process() {
for (int i = 0; i <= 100; i++) {
var args = new UpdateProgressEventArgs() { Caller = nameof(ProcessorB), Percentage = i, UpdateDateTime = DateTime.Now};
//Do some work
Thread.Sleep(r.Next(50,250));
OnUpdateProgress(args);
}
}
}
public class ProcessorC : Processor {
public void Process() {
for (int i = 0; i <= 100; i++) {
var args = new UpdateProgressEventArgs() { Caller = nameof(ProcessorC), Percentage = i, UpdateDateTime = DateTime.Now };
//Do some work
Thread.Sleep(r.Next(50, 250));
OnUpdateProgress(args);
}
}
}
public class ProcessorD : Processor {
public void Process() {
for (int i = 0; i <= 100; i++) {
var args = new UpdateProgressEventArgs() { Caller = nameof(ProcessorD), Percentage = i, UpdateDateTime = DateTime.Now };
//Do some work
Thread.Sleep(r.Next(50, 250));
OnUpdateProgress(args);
}
}
}
public class ProcessorA_A : Processor {
public void Process() {
for (int i = 0; i <= 100; i++) {
var args = new UpdateProgressEventArgs() { Caller = nameof(ProcessorA_A), Percentage = i, UpdateDateTime = DateTime.Now };
//Do some work
Thread.Sleep(r.Next(50, 250));
OnUpdateProgress(args);
}
}
}
public class Processor : IProcessor {
protected Random r = new Random();
public event EventHandler<UpdateProgressEventArgs> UpdateProgress;
protected virtual void OnUpdateProgress(UpdateProgressEventArgs e) {
UpdateProgress?.Invoke(this, e);
}
}
public interface IProcessor {
event EventHandler<UpdateProgressEventArgs> UpdateProgress;
}
public class UpdateProgressEventArgs {
public int Percentage { get; set; }
public string Caller { get; set; }
public DateTime UpdateDateTime { get; set; }
}
Just forwarding the progress from child to parent obviously won't do the trick. I hope someone can help me find a solution for this, or suggest another approach :)
Thanks in advance
Engine could maintain a private record of the last reported completeness of each processor, most likely in a Dictionary.
If we extend your Engine class to have this:
private Dictionary<IProcessor, int> _lastReportedPercentage = new Dictionary<IProcessor, int>();
and in the constructor where all your child processors are defined, set them all to 0.
public Engine()
{
//your other stuff
_lastReportedPercentage[_processorA] = 0;
_lastReportedPercentage[_processorB] = 0;
_lastReportedPercentage[_processorC] = 0;
_lastReportedPercentage[_processorD] = 0;
}
in your Event handler for the child processes do something like this:
private void ProcessorA_UpdateProgress(object sender, UpdateProgressEventArgs e)
{
_lastReportedPercentage[(IProcessor)sender] = e.Percentage;
var totalCompleteness = (int)Math.Floor(_lastReportedPercentage.Values.Sum() / (double)_lastReportedPercentage.Values.Count);
OnUpdateProgress(new UpdateProgressEventArgs() { Caller = nameof(Engine), Percentage = totalCompleteness, UpdateDateTime = DateTime.Now });
}
You will then be reporting, from your Engine, the total completeness of all tasks.
There are some obvious design flaws to this, but you get the idea. It can be extended to a variable number of processors if they're loaded somewhere other than the constructor, etc., but this should give you the output you want.
I would suggest having each processor keep a count of how many items it has completed, and how many items it has remaining.
This way the engine receives updates whenever an item is completed, as well as when an item is added to the processor.
As in Skintkingle's answer, you would then store the last received update from each processor within the engine.
You can then let the engine decide on the best way to put this information across. Something as simple as total completed / total remaining would work well.
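A rough sketch of that idea, extending the Engine from the question (ItemsCompleted and ItemsTotal are hypothetical properties the processors would have to report; Sum requires using System.Linq):
// Sketch only: remember the latest counts per processor and derive one overall percentage.
private readonly Dictionary<IProcessor, int> _completedByProcessor = new Dictionary<IProcessor, int>();
private readonly Dictionary<IProcessor, int> _totalByProcessor = new Dictionary<IProcessor, int>();

private void Processor_UpdateProgress(object sender, UpdateProgressEventArgs e)
{
    var processor = (IProcessor)sender;
    _completedByProcessor[processor] = e.ItemsCompleted; // hypothetical property
    _totalByProcessor[processor] = e.ItemsTotal;         // hypothetical property

    int completed = _completedByProcessor.Values.Sum();
    int total = _totalByProcessor.Values.Sum();
    int percent = total == 0 ? 0 : (100 * completed) / total;

    OnUpdateProgress(new UpdateProgressEventArgs
    {
        Caller = nameof(Engine),
        Percentage = percent,
        UpdateDateTime = DateTime.Now
    });
}
Because the totals are re-summed on every update, the overall percentage stays meaningful even when a processor discovers more work while it is running.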

How can I keep updating my ListBox synchronously

I am calculating prime numbers between two numbers using the following code:
private static IEnumerable<int> GetPrimes(int from, int to)
{
for (int i = from; i <= to; i++)
{
bool isPrime = true;
int limit = (int)Math.Sqrt(i);
for (int j = 2; j <= limit; j++)
if (i % j == 0)
{
isPrime = false;
break;
}
if (isPrime)
{
yield return i;
}
}
}
I want to update my list box with all the prime numbers from the code above without blocking my UI thread. The approach I am using is shown below, but it is not working out.
public MainWindow()
{
InitializeComponent();
_worker = new BackgroundWorker();
_worker.DoWork += _worker_DoWork;
this.DataContext = this;
}
private void _worker_DoWork(object sender, DoWorkEventArgs e)
{
PrimeNumbers = new ObservableCollection<int>();
foreach (var item in GetPrimes(1, 10000000))
{
Dispatcher.BeginInvoke(new Action<int>(Test), item);
}
}
private void Test(int obj)
{
PrimeNumbers.Add(obj);
}
public ObservableCollection<int> PrimeNumbers
{
get
{
return primeNumbers;
}
set
{
primeNumbers = value;
OnPropertyChanged("PrimeNumbers");
}
}
private void Button_Click_1(object sender, RoutedEventArgs e)
{
_worker.RunWorkerAsync();
}
but this approach freezes my UI. I want results to keep coming continuously from the GetPrimes method and to keep being added to my list box.
You are just posting to the UI thread too often. This code works as expected:
public partial class MainWindow
{
public ObservableCollection<int> PrimeNumbers { get; set; }
public MainWindow()
{
InitializeComponent();
PrimeNumbers = new ObservableCollection<int>();
}
private void Button_Click(object sender, System.Windows.RoutedEventArgs e)
{
PrintPrimes(PrimeNumbers.Add, 1, 10000000, SynchronizationContext.Current);
}
private static void PrintPrimes(Action<int> action, int from, int to,
SynchronizationContext syncContext)
{
Task.Run(() =>
{
for (var i = from; i <= to; i++)
{
var isPrime = true;
var limit = (int) Math.Sqrt(i);
for (var j = 2; j <= limit; j++)
{
if (i%j == 0)
{
isPrime = false;
break;
}
}
if (isPrime)
{
syncContext.Post(state => action((int)state), i);
Thread.Sleep(1);
}
}
});
}
}
Consider avoiding the old BackgroundWorker class. Also, instead of using a platform-specific synchronization mechanism, try switching to the platform-independent SynchronizationContext.
Instead of sleeping the thread you can post your results in batches:
public partial class MainWindow
{
public ObservableCollection<int> PrimeNumbers { get; set; }
public MainWindow()
{
InitializeComponent();
PrimeNumbers = new ObservableCollection<int>();
}
private void Button_Click(object sender, System.Windows.RoutedEventArgs e)
{
PrintPrimes(items => items.ForEach(PrimeNumbers.Add),
1, 10000000, SynchronizationContext.Current);
}
private static void PrintPrimes(Action<List<int>> action, int from, int to,
SynchronizationContext syncContext)
{
Task.Run(() =>
{
var primesBuffer = new List<int>();
for (var i = from; i <= to; i++)
{
var isPrime = true;
var limit = (int) Math.Sqrt(i);
for (var j = 2; j <= limit; j++)
{
if (i%j == 0)
{
isPrime = false;
break;
}
}
if (isPrime)
{
primesBuffer.Add(i);
if (primesBuffer.Count >= 1000)
{
syncContext.Post(state => action((List<int>) state),
primesBuffer.ToList());
primesBuffer.Clear();
}
}
}
});
}
}
You can use Thread instead of Task.Run if you're stuck with older versions of a framework.
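A sketch of that variant (the loop body is the same prime-finding code as in PrintPrimes above):
// Rough Thread-based equivalent of Task.Run for older framework versions.
var worker = new Thread(() =>
{
    // ... same prime-finding loop as above, posting results via syncContext ...
})
{
    IsBackground = true // do not keep the process alive because of this thread
};
worker.Start();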
That code looks ok, you just forgot to start your background worker by calling BackgroundWorker.RunWorkerAsync.
PrimeNumbers = new ObservableCollection<int>();
This line needs to be outside your worker (or invoked on the UI thread as well).
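For example, a sketch that rearranges the question's own code so the collection is created once on the UI thread:
public MainWindow()
{
    InitializeComponent();

    // Create the collection on the UI thread, before the worker ever runs,
    // so the binding never sees it replaced from a background thread.
    PrimeNumbers = new ObservableCollection<int>();

    _worker = new BackgroundWorker();
    _worker.DoWork += _worker_DoWork;
    this.DataContext = this;
}

private void _worker_DoWork(object sender, DoWorkEventArgs e)
{
    // The worker only produces values; every collection update goes through the dispatcher.
    foreach (var item in GetPrimes(1, 10000000))
    {
        Dispatcher.BeginInvoke(new Action<int>(Test), item);
    }
}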
It seems my UI was getting updated too frequently, so I used a delay of one second on my background worker thread. That helped me achieve my functionality:
private void _worker_DoWork(object sender, DoWorkEventArgs e)
{
foreach (var item in GetPrimes(1, 1000000))
{
Thread.Sleep(1000);
Dispatcher.BeginInvoke(new Action<int>(Test), item);
}
}

How to clean the queue from all of the processed items?

There is a global queue of objects that have to be sent to customers. The queue is continually filled with new elements (one element per second), so sending has to happen constantly. Every client is served in a separate thread. After an object has been sent to all clients it must be removed from the queue. It seems easy, but how do you know that all the threads have already sent a particular object?
Everything is done over sockets.
Thread threadForClientSending = new Thread(delegate()
{
while (true)
{
try
{
List<SymbolsTable>[] localArrayList;
//main.que -- global queue
foreach (var eachlist in localArrayList = main.que.ToArray())
{
foreach (var item in eachlist)
{
byte[] message =
encoding.GetBytes((item.GetHashCode()%100).ToString() + " "+item.SDate +"\n\r");
client.Send(message);
}
Thread.Sleep(500);
}
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
break;
}
}
});
This code sends everything to everyone, but it doesn't clean the queue.
How can the queue be cleaned of all the processed items?
public static ConcurrentQueue<List<SymbolsTable>> que = new ConcurrentQueue<List<SymbolsTable>>();
public partial class SymbolsTable
{
public string SName { get; set; }
public Nullable<double> SPrice { get; set; }
public Nullable<int> SVolume { get; set; }
public System.DateTime SDate { get; set; }
}
NOTE: I highly recommend defining a local queue for each client (task on the server) to achieve maximum concurrency and cleaner code.
You can achieve what you need by using a CountdownEvent, which gates access to each item; set it to the number of worker threads that send data to clients.
Here is how we can do it:
Definitions:
public static ConcurrentQueue<List<SymbolsTable>> que = new ConcurrentQueue<List<SymbolsTable>>();
public static CountdownEvent counter = new CountdownEvent(NumberOfThreads);
private const int NumberOfThreads = 3; //for example we have 3 clients here
Thread:
Thread threadForClientSending = new Thread(delegate()
{
while (true)
{
try
{
List<SymbolsTable> list;
var peek = que.TryPeek(out list);
if (!peek)
{
Thread.Sleep(100); //nothing to pull
continue;
}
foreach (var item in list)
{
//main.que -- global queue
byte[] message =
encoding.GetBytes((item.GetHashCode() % 100).ToString() + " " + item.SDate + "\n\r");
client.Send(message);
Thread.Sleep(500);
}
counter.Signal(); //the thread would signal itself as finished, and wait for others to finish the task
lock (que)
{
List<SymbolsTable> lastList;
if (que.TryPeek(out lastList) && lastList.Equals(list))
{
//just one of the threads would dequeue the item
que.TryDequeue(out lastList);
counter.Reset(); //reset counter for next iteration
}
}
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
break;
}
}
});
Here we used TryPeek to just look at the item in the queue without removing it. At the end, in:
lock (que)
{
List<SymbolsTable> lastList;
if (que.TryPeek(out lastList) && lastList.Equals(list))
{
//just one of the threads would dequeue the item
que.TryDequeue(out lastList);
counter.Reset(); //reset counter for next iteration
}
}
we lock the que so only one thread at a time can access it, then check whether the processed item has already been removed from the queue; if not, we remove it here.
More elegant solution (in my humble opinion):
As you saw in the previous solution, we block the threads so that they finish the task for each item together. Adding a local queue to each thread removes this blocking mechanism, so we can achieve maximum concurrency.
I suggest something like:
class GlobalQueue
{
private readonly List<IMyTask> _subscribers=new List<IMyTask>();
public void Subscribe(IMyTask task)
{
_subscribers.Add(task);
}
public void Unsubscribe(IMyTask task)
{
_subscribers.Remove(task);
}
public void Enqueue(List<SymbolsTable> table)
{
foreach (var s in _subscribers)
s.Enqueue(table);
}
}
interface IMyTask
{
void Enqueue(List<SymbolsTable> table);
}
and each task would look roughly like this:
class MyTask : IMyTask
{
private readonly ConcurrentQueue<List<SymbolsTable>> _localQueue = new ConcurrentQueue<List<SymbolsTable>>();
private readonly Thread _thread;
private bool _started;
public void Enqueue(List<SymbolsTable> table)
{
_localQueue.Enqueue(table);
}
public MyTask()
{
_thread = new Thread(Execute);
}
public void Start()
{
_started = true;
_thread.Start();
}
public void Stop()
{
_started = false;
}
private void Execute()
{
while (_started)
{
try
{
List<SymbolsTable> list;
var peek = _localQueue.TryDequeue(out list);
if (!peek)
{
Thread.Sleep(100); //nothing to pull
continue;
}
foreach (var item in list)
{
byte[] message =
encoding.GetBytes((item.GetHashCode() % 100).ToString() + " " + item.SDate + "\n\r");
client.Send(message);
}
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
break;
}
}
}
}
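A possible way to wire those pieces together (a sketch; ReadNextSymbolsTable is a hypothetical data source):
// Sketch only: one GlobalQueue fans each incoming list out to per-client tasks,
// and every task drains its own local queue at its own pace, so there is no
// shared queue left to clean up afterwards.
var global = new GlobalQueue();
var clients = new List<MyTask> { new MyTask(), new MyTask(), new MyTask() };

foreach (var client in clients)
{
    global.Subscribe(client);
    client.Start();
}

// Producer loop: every new list is handed to all subscribers.
while (true)
{
    List<SymbolsTable> table = ReadNextSymbolsTable(); // hypothetical source
    global.Enqueue(table);
    Thread.Sleep(1000);
}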

Access List<T> elements in multiple threads and report progress

Inside my application I have a list of persons from my database.
For every person I must call 5 (for now) services to search for some information.
If a service returns info, I add it to that person (a list of orders for the specific person).
Because the services work independently, I thought I could run them in parallel.
I've written my code like this:
using System;
using System.Collections.Generic;
using System.Threading;
namespace Testy
{
internal class Program
{
internal class Person
{
public int Id { get; set; }
public string Name { get; set; }
public List<string> Orders { get; private set; }
public Person()
{
// thanks for tip #juharr
Orders = new List<string>();
}
public void AddOrder(string order)
{
lock (Orders) //access across threads
{
Orders.Add(order);
}
}
}
internal class Service
{
public int Id { get; private set; }
public Service(int id)
{
Id = id;
}
//I get error when I use IList instead of List
public void Search(ref List<Person> list)
{
foreach (Person p in list)
{
lock (p) //should I lock Person here? and like this???
{
Search(p);
}
}
}
private void Search(Person p)
{
Thread.Sleep(50);
p.AddOrder(string.Format("test order from {0,2}",
Thread.CurrentThread.ManagedThreadId));
Thread.Sleep(100);
}
}
private static void Main()
{
//here I load my services from external dll's
var services = new List<Service>();
for (int i = 1; i <= 5; i++)
{
services.Add(new Service(i));
}
//sample data load from db
var persons = new List<Person>();
for (int i = 1; i <= 10; i++)
{
persons.Add(
new Person {Id = i,
Name = string.Format("Test {0}", i)});
}
Console.WriteLine("Number of services: {0}", services.Count);
Console.WriteLine("Number of persons: {0}", persons.Count);
ManualResetEvent resetEvent = new ManualResetEvent(false);
int toProcess = services.Count;
foreach (Service service in services)
{
new Thread(() =>
{
service.Search(ref persons);
if (Interlocked.Decrement(ref toProcess) == 0)
resetEvent.Set();
}
).Start();
}
// Wait for workers.
resetEvent.WaitOne();
foreach (Person p in persons)
{
Console.WriteLine("{0,2} Person name: {1}",p.Id,p.Name);
if (null != p.Orders)
{
Console.WriteLine(" Orders:");
foreach (string order in p.Orders)
{
Console.WriteLine(" Order: {0}", order);
}
}
else
{
Console.WriteLine(" No orders!");
}
}
Console.ReadLine();
}
}
}
I have 2 problems with my code:
When I run my app I should get a list of 10 persons with 5 orders each, but from time to time (once every 3-5 runs) the first person gets only 4 orders. How can I prevent this behaviour? Solved! Thanks to #juharr.
How can I report progress from my threads? What I would like is one function in my Program class that is called every time an order is added by a service; I need that to show some kind of progress for every report. I tried the solution described here: https://stackoverflow.com/a/3874184/965722, but I'm wondering if there is an easier way. Ideally I would like to add a delegate to the Service class and place all the thread code there.
How should I add an event and delegate to the Service class, and how do I subscribe to it in the Main method?
I'm using .NET 3.5.
I've added this code to be able to get progress reports:
using System;
using System.Collections.Generic;
using System.Threading;
namespace Testy
{
internal class Program
{
public class ServiceEventArgs : EventArgs
{
public ServiceEventArgs(int sId, int progress)
{
SId = sId;
Progress = progress;
}
public int SId { get; private set; }
public int Progress { get; private set; }
}
internal class Person
{
private static readonly object ordersLock = new object();
public int Id { get; set; }
public string Name { get; set; }
public List<string> Orders { get; private set; }
public Person()
{
Orders = new List<string>();
}
public void AddOrder(string order)
{
lock (ordersLock) //access across threads
{
Orders.Add(order);
}
}
}
internal class Service
{
public event EventHandler<ServiceEventArgs> ReportProgress;
public int Id { get; private set; }
public string Name { get; private set; }
private int counter;
public Service(int id, string name)
{
Id = id;
Name = name;
}
public void Search(List<Person> list) //I get error when I use IList instead of List
{
counter = 0;
foreach (Person p in list)
{
counter++;
Search(p);
Thread.Sleep(3000);
}
}
private void Search(Person p)
{
p.AddOrder(string.Format("Order from {0,2}", Thread.CurrentThread.ManagedThreadId));
EventHandler<ServiceEventArgs> handler = ReportProgress;
if (handler != null)
{
var e = new ServiceEventArgs(Id, counter);
handler(this, e);
}
}
}
private static void Main()
{
const int count = 5;
var services = new List<Service>();
for (int i = 1; i <= count; i++)
{
services.Add(new Service(i, "Service " + i));
}
var persons = new List<Person>();
for (int i = 1; i <= 10; i++)
{
persons.Add(new Person {Id = i, Name = string.Format("Test {0}", i)});
}
Console.WriteLine("Number of services: {0}", services.Count);
Console.WriteLine("Number of persons: {0}", persons.Count);
Console.WriteLine("Press ENTER to start...");
Console.ReadLine();
ManualResetEvent resetEvent = new ManualResetEvent(false);
int toProcess = services.Count;
foreach (Service service in services)
{
new Thread(() =>
{
service.ReportProgress += service_ReportProgress;
service.Search(persons);
if (Interlocked.Decrement(ref toProcess) == 0)
resetEvent.Set();
}
).Start();
}
// Wait for workers.
resetEvent.WaitOne();
foreach (Person p in persons)
{
if (p.Orders.Count != count)
Console.WriteLine("{0,2} Person name: {1}, orders: {2}", p.Id, p.Name, p.Orders.Count);
}
Console.WriteLine("END :)");
Console.ReadLine();
}
private static void service_ReportProgress(object sender, ServiceEventArgs e)
{
Console.CursorLeft = 0;
Console.CursorTop = e.SId;
Console.WriteLine("Id: {0,2}, Name: {1,2} - Progress: {2,2}", e.SId, ((Service) sender).Name, e.Progress);
}
}
}
I've added custom EventArgs, event for Service class.
In this configuration I should have 5 services running, but only 3 of them report progress.
I expected that with 5 services I would get 5 events (5 lines showing progress).
This is probably because of the threads, but I have no idea how to solve it.
Sample output now looks like this:
Number of services: 5
Number of persons: 10
Press ENTER to start...
Id: 3, Name: Service 3 - Progress: 10
Id: 4, Name: Service 4 - Progress: 10
Id: 5, Name: Service 5 - Progress: 19
END :)
It should look like this:
Number of services: 5
Number of persons: 10
Press ENTER to start...
Id: 1, Name: Service 1 - Progress: 10
Id: 2, Name: Service 2 - Progress: 10
Id: 3, Name: Service 3 - Progress: 10
Id: 4, Name: Service 4 - Progress: 10
Id: 5, Name: Service 5 - Progress: 10
END :)
Last edit
I've moved all my thread creation to a separate ServiceManager class; now my code looks like this:
using System;
using System.Collections.Generic;
using System.Threading;
namespace Testy
{
internal class Program
{
public class ServiceEventArgs : EventArgs
{
public ServiceEventArgs(int sId, int progress)
{
SId = sId;
Progress = progress;
}
public int SId { get; private set; } // service id
public int Progress { get; private set; }
}
internal class Person
{
private static readonly object ordersLock = new object();
public int Id { get; set; }
public string Name { get; set; }
public List<string> Orders { get; private set; }
public Person()
{
Orders = new List<string>();
}
public void AddOrder(string order)
{
lock (ordersLock) //access across threads
{
Orders.Add(order);
}
}
}
internal class Service
{
public event EventHandler<ServiceEventArgs> ReportProgress;
public int Id { get; private set; }
public string Name { get; private set; }
public Service(int id, string name)
{
Id = id;
Name = name;
}
public void Search(List<Person> list)
{
int counter = 0;
foreach (Person p in list)
{
counter++;
Search(p);
var e = new ServiceEventArgs(Id, counter);
OnReportProgress(e);
}
}
private void Search(Person p)
{
p.AddOrder(string.Format("Order from {0,2}", Thread.CurrentThread.ManagedThreadId));
Thread.Sleep(50*Id);
}
protected virtual void OnReportProgress(ServiceEventArgs e)
{
var handler = ReportProgress;
if (handler != null)
{
handler(this, e);
}
}
}
internal static class ServiceManager
{
private static IList<Service> _services;
public static IList<Service> Services
{
get
{
if (null == _services)
Reload();
return _services;
}
}
public static void RunAll(List<Person> persons)
{
ManualResetEvent resetEvent = new ManualResetEvent(false);
int toProcess = _services.Count;
foreach (Service service in _services)
{
var local = service;
local.ReportProgress += ServiceReportProgress;
new Thread(() =>
{
local.Search(persons);
if (Interlocked.Decrement(ref toProcess) == 0)
resetEvent.Set();
}
).Start();
}
// Wait for workers.
resetEvent.WaitOne();
}
private static readonly object consoleLock = new object();
private static void ServiceReportProgress(object sender, ServiceEventArgs e)
{
lock (consoleLock)
{
Console.CursorTop = 1 + (e.SId - 1)*2;
int progress = (100*e.Progress)/100;
RenderConsoleProgress(progress, '■', ConsoleColor.Cyan, String.Format("{0} - {1,3}%", ((Service) sender).Name, progress));
}
}
private static void ConsoleMessage(string message)
{
Console.CursorLeft = 0;
int maxCharacterWidth = Console.WindowWidth - 1;
if (message.Length > maxCharacterWidth)
{
message = message.Substring(0, maxCharacterWidth - 3) + "...";
}
message = message + new string(' ', maxCharacterWidth - message.Length);
Console.Write(message);
}
private static void RenderConsoleProgress(int percentage, char progressBarCharacter,
ConsoleColor color, string message)
{
ConsoleColor originalColor = Console.ForegroundColor;
Console.ForegroundColor = color;
Console.CursorLeft = 0;
int width = Console.WindowWidth - 1;
var newWidth = (int) ((width*percentage)/100d);
string progBar = new string(progressBarCharacter, newWidth) + new string(' ', width - newWidth);
Console.Write(progBar);
if (!String.IsNullOrEmpty(message))
{
Console.CursorTop++;
ConsoleMessage(message);
Console.CursorTop--;
}
Console.ForegroundColor = originalColor;
}
private static void Reload()
{
if (null == _services)
_services = new List<Service>();
else
_services.Clear();
for (int i = 1; i <= 5; i++)
{
_services.Add(new Service(i, "Service " + i));
}
}
}
private static void Main()
{
var services = ServiceManager.Services;
int count = services.Count;
var persons = new List<Person>();
for (int i = 1; i <= 100; i++)
{
persons.Add(new Person {Id = i, Name = string.Format("Test {0}", i)});
}
Console.WriteLine("Services: {0}, Persons: {1}", services.Count, persons.Count);
Console.WriteLine("Press ENTER to start...");
Console.ReadLine();
Console.Clear();
Console.CursorVisible = false;
ServiceManager.RunAll(persons);
foreach (Person p in persons)
{
if (p.Orders.Count != count)
Console.WriteLine("{0,2} Person name: {1}, orders: {2}", p.Id, p.Name, p.Orders.Count);
}
Console.CursorTop = 12;
Console.CursorLeft = 0;
Console.WriteLine("END :)");
Console.CursorVisible = true;
Console.ReadLine();
}
}
}
Basically you have a race condition in the creation of the Orders list. Imagine the following execution of two threads.
Thread 1 checks if Orders is null and it is.
Thread 2 checks if Orders is null and it is.
Thread 1 sets Orders to a new list.
Thread 1 gets the lock.
Thread 1 adds to the Orders list.
Thread 2 sets Orders to a new list. (You just lost what Thread 1 added.)
You need to include the creation of the Orders inside the lock.
public void AddOrder(string order)
{
lock (Orders) //access across threads
{
if (null == Orders)
Orders = new List<string>();
Orders.Add(order);
}
}
Or you really should create the Orders list in the Person constructor:
public Person()
{
Orders = new List<Order>();
}
Also you should really create a separate object for locking.
private object ordersLock = new object();
public void AddOrder(string order)
{
lock (ordersLock) //access across threads
{
Orders.Add(order);
}
}
EDIT:
In the foreach where you create the threads, you need to make a local copy of the service to use inside the lambda expression. This is because the foreach updates the service variable, so the threads can end up capturing a value that has already changed. So something like this:
foreach (Service service in services)
{
Service local = service;
local.ReportProgress += service_ReportProgress;
new Thread(() =>
{
local.Search(persons);
if (Interlocked.Decrement(ref toProcess) == 0)
resetEvent.Set();
}
).Start();
}
Note the subscription doesn't need to be inside the thread.
Alternatively you could move the creation of the thread inside the Search method of your Service class.
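That alternative might look roughly like this (a sketch; SearchAsync and the callback are illustrative names, not part of the original code):
// Sketch only: the Service owns its worker thread and reports completion
// through a callback instead of Main creating the threads itself.
public Thread SearchAsync(List<Person> list, Action onCompleted)
{
    var thread = new Thread(() =>
    {
        Search(list);  // the existing synchronous Search
        onCompleted(); // e.g. decrement the pending-services counter and set the event
    });
    thread.Start();
    return thread;
}
Main would then call service.SearchAsync(persons, ...) for each service instead of newing up the threads, with the callback doing the Interlocked.Decrement and resetEvent.Set work.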
Additionally you might want to create an OnReportProgress method in the Service class like so:
protected virtual void OnReportProgress(ServiceEventArgs e)
{
EventHandler<ServiceEventArgs> handler = ReportProgress;
if (handler != null)
{
handler(this, e);
}
}
Then call that inside your Search method. Personally I'd call it in the public Search method and make the counter a local variable as well to allow for reuse of the Service object on another list.
Finally you will need an additional lock in the event handler when writing to the console to make sure one thread doesn't change the cursor position before another one writes its output.
private static object consoleLock = new object();
private static void service_ReportProgress(object sender, ServiceEventArgs e)
{
lock (consoleLock)
{
Console.CursorLeft = 0;
Console.CursorTop = e.SId;
Console.WriteLine("Id: {0}, Name: {1} - Progress: {2}", e.SId, ((Service)sender).Name, e.Progress);
}
}
Also you might want to use Console.Clear() in the following location:
...
Console.WriteLine("Number of services: {0}", services.Count);
Console.WriteLine("Number of persons: {0}", persons.Count);
Console.WriteLine("Press ENTER to start...");
Console.Clear();
Console.ReadLine();
...
And you'll need to update the cursor position before you write out your end statement.
Console.CursorTop = 6;
Console.WriteLine("END :)");
This might not totally answer your question (but I think you might have a race condition). When you start dealing with threads you need to implement proper synchronization when updating objects from different threads. You should make sure only one thread is able to update an instance of the Person class at any given time.
The p.AddOrder(...) call should be protected by a lock that ensures only one thread is updating the Person object at a time.
