My program works with a queue and a large file. Several threads share the queue, and at some point I get an OutOfMemoryException because the first thread enqueues objects faster than the second thread can dequeue them.
P.S. I can only use the basic synchronization primitives (Thread, Monitor).
I've already written code that works for smaller data sets. I know I can call Thread.Sleep once the queue holds a certain number of objects (this works, I tried it), but as far as I know it's not the best solution.
class SynchronizedQueue
{
protected readonly object locker = new object();
protected Queue<BlockData> queue = new Queue<BlockData>();
public int Counter { get; set; }
public bool IsClose { get; set; }
public bool TryDequeue(out BlockData blockData)
{
lock (locker)
{
while (queue.Count == 0)
{
if (IsClose)
{
blockData = new BlockData();
return false;
}
Monitor.Wait(locker);
}
blockData = queue.Dequeue();
return true;
}
}
public void Close()
{
lock (locker)
{
IsClose = true;
Monitor.PulseAll(locker);
}
}
public void Enqueue(BlockData blockData)
{
lock (locker)
{
//That's what I want to avoid
if (Counter == 1000)
{
Thread.Sleep(240);
}
if (IsClose)
throw new InvalidOperationException("Work was canceled!");
while (blockData.Id != Counter)
Monitor.Wait(locker);
queue.Enqueue(blockData);
Counter++;
Monitor.PulseAll(locker);
}
}
}
What can you recommend for synchronizing Enqueue/Dequeue and avoiding the OutOfMemoryException?
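One common approach, staying within the Thread/Monitor constraint, is to make the queue bounded so that Enqueue itself blocks while the queue is full, mirroring how TryDequeue blocks while it is empty. Below is a minimal sketch, not a drop-in replacement: it assumes the BlockData type from the question, the capacity value is illustrative, and the Id-ordering logic from the original Enqueue is omitted for brevity.
class BoundedQueue
{
    private readonly object locker = new object();
    private readonly Queue<BlockData> queue = new Queue<BlockData>();
    private readonly int capacity;
    private bool isClosed;

    public BoundedQueue(int capacity)
    {
        this.capacity = capacity; // e.g. 1000, as in the question
    }

    public void Enqueue(BlockData blockData)
    {
        lock (locker)
        {
            // Instead of Thread.Sleep, block until a consumer makes room.
            // Monitor.Wait releases the lock so consumers can drain the queue.
            while (queue.Count >= capacity && !isClosed)
                Monitor.Wait(locker);
            if (isClosed)
                throw new InvalidOperationException("Work was canceled!");
            queue.Enqueue(blockData);
            Monitor.PulseAll(locker); // wake any blocked consumers
        }
    }

    public bool TryDequeue(out BlockData blockData)
    {
        lock (locker)
        {
            while (queue.Count == 0)
            {
                if (isClosed)
                {
                    blockData = new BlockData();
                    return false;
                }
                Monitor.Wait(locker);
            }
            blockData = queue.Dequeue();
            Monitor.PulseAll(locker); // wake any blocked producers
            return true;
        }
    }

    public void Close()
    {
        lock (locker)
        {
            isClosed = true;
            Monitor.PulseAll(locker); // release everyone who is waiting
        }
    }
}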
For the sake of practice, I am trying to write a solution to the readers-writers problem.
The expected behavior is that multiple reads can run concurrently, but writes need to wait for all readers to finish.
My solution is in the Read() and Write() methods below; the book I am referencing suggests Write2() for the writers.
1) I don't entirely understand why they chose to implement it this way, specifically why the read lock is acquired again after the thread is awoken when numOfReaders == 0.
Is that to give readers priority, if one came right after Write acquired the read lock, and right before it actually wrote anything?
2) Are there any other issues with my suggested Write implementation?
Thanks!!
class ReaderWriter
{
private int numOfReaders = 0;
private readonly object readLock = new object();
private readonly object writeLock = new object();
public void Read()
{
lock (readLock)
{
this.numOfReaders++;
}
// Read stuff
lock (readLock)
{
this.numOfReaders--;
Monitor.Pulse(readLock);
}
}
// My solution
public void Write()
{
lock (writeLock)
{
lock (readLock)
{
while (this.numOfReaders > 0)
{
Monitor.Wait(readLock);
}
// Write stuff
}
}
}
// Alternative solution
public void Write2()
{
lock (writeLock)
{
bool done = false;
while (!done)
{
lock (readLock)
{
if (this.numOfReaders == 0)
{
// Write stuff
done = true;
}
else
{
while (this.numOfReaders > 0)
{
Monitor.Wait(readLock);
}
}
}
}
}
}
}
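For comparison, the framework's built-in ReaderWriterLockSlim provides the same semantics (concurrent readers, exclusive writers) without hand-rolled Monitor logic. A minimal sketch, not from the original post:
using System.Threading;

class ReaderWriterSlim
{
    private readonly ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim();

    public void Read()
    {
        rwLock.EnterReadLock(); // many readers may hold this at once
        try
        {
            // Read stuff
        }
        finally
        {
            rwLock.ExitReadLock();
        }
    }

    public void Write()
    {
        rwLock.EnterWriteLock(); // blocks until all readers have exited
        try
        {
            // Write stuff
        }
        finally
        {
            rwLock.ExitWriteLock();
        }
    }
}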
I would like a function that continuously checks a Queue for new additions on one thread.
Obviously there is the option of a continuous loop with sleeps, but I want something less wasteful.
I considered a wait handle of some type and then having the queue signal it, but I can't override Enqueue safely as it is not virtual.
Now I'm considering encapsulating a Queue<T> as my best option, but I wanted to ask you fine folks if there is a better one!
The idea is: I want many threads to access a socket connection while guaranteeing that each reads only the response to its own message, so I was going to have one thread dispatch requests and read responses, then execute a callback with the response data (in plain text).
Try the blocking queue: Creating a blocking Queue<T> in .NET?
The basic idea is that when you call TryDequeue it will block until there is something in the queue. As you can see, the "beauty" of the blocking queue is that you don't have to poll/sleep or do anything crazy like that... it's the fundamental backbone for a Producer/Consumer pattern.
My version of the blocking queue is:
public class BlockingQueue<T> where T : class
{
private bool closing;
private readonly Queue<T> queue = new Queue<T>();
public int Count
{
get
{
lock (queue)
{
return queue.Count;
}
}
}
public BlockingQueue()
{
lock (queue)
{
closing = false;
Monitor.PulseAll(queue);
}
}
public bool Enqueue(T item)
{
lock (queue)
{
if (closing || null == item)
{
return false;
}
queue.Enqueue(item);
if (queue.Count == 1)
{
// wake up any blocked dequeue
Monitor.PulseAll(queue);
}
return true;
}
}
public void Close()
{
lock (queue)
{
if (!closing)
{
closing = true;
queue.Clear();
Monitor.PulseAll(queue);
}
}
}
public bool TryDequeue(out T value, int timeout = Timeout.Infinite)
{
lock (queue)
{
while (queue.Count == 0)
{
if (closing || (timeout < Timeout.Infinite) || !Monitor.Wait(queue, timeout))
{
value = default(T);
return false;
}
}
value = queue.Dequeue();
return true;
}
}
public void Clear()
{
lock (queue)
{
queue.Clear();
Monitor.Pulse(queue);
}
}
}
Many thanks to Marc Gravell for this one!
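A typical pairing of a producer and a blocking consumer with this class might look like the following (illustrative usage, not part of the original answer; note that Close() as written above clears any items still pending and wakes the consumer):
var queue = new BlockingQueue<string>();

// Consumer thread: TryDequeue blocks until an item arrives
// or Close() is called, at which point it returns false.
var consumer = new Thread(() =>
{
    string item;
    while (queue.TryDequeue(out item))
    {
        Console.WriteLine("Processing " + item);
    }
});
consumer.Start();

// Producer
for (int i = 0; i < 10; i++)
{
    queue.Enqueue("job " + i);
}
queue.Close(); // wakes the consumer; TryDequeue then returns false
consumer.Join();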
I have a class running the Producer-Consumer model like this:
public class SyncEvents
{
public bool waiting;
public SyncEvents()
{
waiting = true;
}
}
public class Producer
{
private readonly Queue<Delegate> _queue;
private SyncEvents _sync;
private Object _waitAck;
public Producer(Queue<Delegate> q, SyncEvents sync, Object obj)
{
_queue = q;
_sync = sync;
_waitAck = obj;
}
public void ThreadRun()
{
lock (_sync)
{
while (true)
{
Monitor.Wait(_sync, 0);
if (_queue.Count > 0)
{
_sync.waiting = false;
}
else
{
_sync.waiting = true;
lock (_waitAck)
{
Monitor.Pulse(_waitAck);
}
}
Monitor.Pulse(_sync);
}
}
}
}
public class Consumer
{
private readonly Queue<Delegate> _queue;
private SyncEvents _sync;
private int count = 0;
public Consumer(Queue<Delegate> q, SyncEvents sync)
{
_queue = q;
_sync = sync;
}
public void ThreadRun()
{
lock (_sync)
{
while (true)
{
while (_queue.Count == 0)
{
Monitor.Wait(_sync);
}
Delegate query = _queue.Dequeue();
query.DynamicInvoke(null);
count++;
Monitor.Pulse(_sync);
}
}
}
}
/// <summary>
/// Act as a consumer to the queries produced by the DataGridViewCustomCell
/// </summary>
public class QueryThread
{
private SyncEvents _syncEvents = new SyncEvents();
private Object waitAck = new Object();
private Queue<Delegate> _queryQueue = new Queue<Delegate>();
Producer queryProducer;
Consumer queryConsumer;
public QueryThread()
{
queryProducer = new Producer(_queryQueue, _syncEvents, waitAck);
queryConsumer = new Consumer(_queryQueue, _syncEvents);
Thread producerThread = new Thread(queryProducer.ThreadRun);
Thread consumerThread = new Thread(queryConsumer.ThreadRun);
producerThread.IsBackground = true;
consumerThread.IsBackground = true;
producerThread.Start();
consumerThread.Start();
}
public bool isQueueEmpty()
{
return _syncEvents.waiting;
}
public void wait()
{
lock (waitAck)
{
while (_queryQueue.Count > 0)
{
Monitor.Wait(waitAck);
}
}
}
public void Enqueue(Delegate item)
{
_queryQueue.Enqueue(item);
}
}
The code runs smoothly except for the wait() function.
In some cases I want to wait until all the functions in the queue have finished running, so I made the wait() function.
The producer will fire the waitAck pulse at a suitable time.
However, when the line "Monitor.Wait(waitAck);" is run in the wait() function, all threads stop, including the producer and consumer threads.
Why does this happen, and how can I solve it? Thanks!
It seems very unlikely that all the threads will actually stop, although I should point out that to avoid false wake-ups you should probably have a while loop instead of an if statement:
lock (waitAck)
{
while(queryProducer.secondQueue.Count > 0)
{
Monitor.Wait(waitAck);
}
}
The fact that you're calling Monitor.Wait means that waitAck should be released so it shouldn't prevent the consumer threads from locking...
Could you give more information about the way in which the producer/consumer threads are "stopping"? Does it look like they've just deadlocked?
Is your producer using Pulse or PulseAll? You've got an extra waiting thread now, so if you only use Pulse it's only going to release a single thread... it's hard to see whether or not that's a problem without the details of your Producer and Consumer classes.
If you could show a short but complete program to demonstrate the problem, that would help.
EDIT: Okay, now you've posted the code I can see a number of issues:
Having so many public variables is a recipe for disaster. Your classes should encapsulate their functionality so that other code doesn't have to go poking around for implementation bits and pieces. (For example, your calling code here really shouldn't have access to the queue.)
You're adding items directly to the second queue, which means you can't efficiently wake up the producer to add them to the first queue. Why do you even have multiple queues?
You're always waiting on _sync in the producer thread... why? What's going to notify it to start with? Generally speaking, the producer thread shouldn't have to wait unless you have a bounded buffer.
You have a static variable (_waitAck) which is being overwritten every time you create a new instance. That's a bad idea.
Your SyncEvents class doesn't really seem to be doing anything - is it meant to be doing anything interesting?
To be honest, it seems like you've got quite a strange design - you may well be best starting again from scratch. Try to encapsulate the whole producer/consumer queue in a single class, which has Produce and Consume methods, as well as WaitForEmpty (or something like that). I think you'll find the synchronization logic a lot easier that way.
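A minimal sketch of that suggestion, using the Produce/Consume/WaitForEmpty shape described above (the class name and everything else here are illustrative, built only on Monitor):
public class WorkQueue
{
    private readonly object sync = new object();
    private readonly Queue<Delegate> queue = new Queue<Delegate>();

    public void Produce(Delegate item)
    {
        lock (sync)
        {
            queue.Enqueue(item);
            Monitor.PulseAll(sync); // wake any waiting consumers
        }
    }

    public Delegate Consume()
    {
        lock (sync)
        {
            while (queue.Count == 0)
                Monitor.Wait(sync);
            Delegate item = queue.Dequeue();
            Monitor.PulseAll(sync); // the queue may now be empty
            return item;
        }
    }

    public void WaitForEmpty()
    {
        lock (sync)
        {
            while (queue.Count > 0)
                Monitor.Wait(sync);
        }
    }
}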
Here is my take on your code:
public class ProducerConsumer : IDisposable
{
private ManualResetEvent _ready;
private Queue<Delegate> _queue;
private Thread _consumerService;
private readonly object _sync = new object();
public ProducerConsumer(Queue<Delegate> queue)
{
lock (_sync)
{
// Note: I would recommend that you don't even
// bother with taking in a queue. You should be able
// to just instantiate a new Queue<Delegate>()
// and use it when you Enqueue. There is nothing that
// you really need to pass into the constructor.
_queue = queue;
_ready = new ManualResetEvent(false);
_consumerService = new Thread(Run);
_consumerService.IsBackground = true;
_consumerService.Start();
}
}
public void Enqueue(Delegate value)
{
lock (_sync)
{
_queue.Enqueue(value);
_ready.Set();
}
}
// The consumer blocks until the producer puts something in the queue.
private void Run()
{
Delegate query;
try
{
while (true)
{
_ready.WaitOne();
lock (_sync)
{
if (_queue.Count > 0)
{
query = _queue.Dequeue();
query.DynamicInvoke(null);
}
else
{
_ready.Reset();
continue;
}
}
}
}
catch (ThreadInterruptedException)
{
_queue.Clear();
return;
}
}
public void Dispose()
{
lock (_sync)
{
if (_consumerService != null)
{
_consumerService.Interrupt();
}
}
}
}
I'm not exactly sure what you're trying to achieve with the wait function... I'm assuming you're trying to put some kind of limit on the number of items that can be queued. In that case, simply throw an exception or return a failure signal when there are too many items in the queue; the client calling Enqueue will keep retrying until the queue can take more items. Taking an optimistic approach like this will save you a LOT of headaches and gets rid of a lot of complex logic.
If you REALLY want to have the wait in there, then I can probably help you figure out a better approach. Let me know what you're trying to achieve with the wait and I'll help you out.
Note: I took this code from one of my projects, modified it a little and posted it here... there might be some minor syntax errors, but the logic should be correct.
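The optimistic approach amounts to a retry loop on the calling side; a minimal sketch (producerConsumer and myDelegate are hypothetical names, and this assumes an Enqueue that reports failure, as in the updated class below):
// Keep retrying until the queue accepts the item.
while (!producerConsumer.Enqueue(myDelegate))
{
    Thread.Sleep(50); // back off briefly before trying again
}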
UPDATE: Based on your comments I made some modifications. I added another ManualResetEvent to the class, so when you call BlockQueue() it returns an event you can wait on and sets a flag that stops the Enqueue function from queuing more elements. Once all the queries in the queue have been serviced, the flag is set back to true and the _wait event is set, so whoever is waiting on it gets the signal.
public class ProducerConsumer : IDisposable
{
private bool _canEnqueue;
private ManualResetEvent _ready;
private Queue<Delegate> _queue;
private Thread _consumerService;
private readonly object _sync = new object();
private readonly ManualResetEvent _wait = new ManualResetEvent(false);
public ProducerConsumer()
{
lock (_sync)
{
_queue = new Queue<Delegate>();
_canEnqueue = true;
_ready = new ManualResetEvent(false);
_consumerService = new Thread(Run);
_consumerService.IsBackground = true;
_consumerService.Start();
}
}
public bool Enqueue(Delegate value)
{
lock (_sync)
{
// Only queue the item while the queue is accepting new work
if( _canEnqueue )
{
_queue.Enqueue(value);
_ready.Set();
return true;
}
}
// Whoever is calling Enqueue should try again later.
return false;
}
// The consumer blocks until the producer puts something in the queue.
private void Run()
{
try
{
while (true)
{
// Wait for a query to be enqueued
_ready.WaitOne();
// Process the query
lock (_sync)
{
if (_queue.Count > 0)
{
Delegate query = _queue.Dequeue();
query.DynamicInvoke(null);
}
else
{
_canEnqueue = true;
_ready.Reset();
_wait.Set();
continue;
}
}
}
}
catch (ThreadInterruptedException)
{
_queue.Clear();
return;
}
}
// Block your queue from enqueuing, return null
// if the queue is already empty.
public ManualResetEvent BlockQueue()
{
lock(_sync)
{
if( _queue.Count > 0 )
{
_canEnqueue = false;
_wait.Reset();
}
else
{
// You need to tell the caller that they can't
// block your queue while it's empty. The caller
// should check if the result is null before calling
// WaitOne().
return null;
}
}
return _wait;
}
public void Dispose()
{
lock (_sync)
{
if (_consumerService != null)
{
_consumerService.Interrupt();
// Set _wait when you're disposing the queue
// so that nobody is left with a lingering wait.
_wait.Set();
}
}
}
}
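Calling code for BlockQueue would then look along these lines (illustrative; producerConsumer is a hypothetical instance of the class above):
ManualResetEvent drained = producerConsumer.BlockQueue();
if (drained != null)
{
    drained.WaitOne(); // blocks until the consumer has emptied the queue
}
// The queue has been drained; the consumer thread re-enables Enqueue itself.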
I wrote a multithreaded application for .NET and in a very important portion of code I have the following:
public class ContainerClass {
private object list_lock;
private ArrayList list;
private object init_lock = new object();
private ThreadedClass thread;
public void Start() {
lock(init_lock) {
if (thread == null) {
thread = new ThreadedClass(this);
...
}
}
}
public void Stop() {
lock(init_lock) {
if (thread != null) {
thread.processList(0);
thread.finish();
thread.waitUntilFinished();
thread = null;
} else {
throw new ApplicationException("Assertion failed - already stopped.");
}
...
}
}
private class ThreadedClass {
private ContainerClass container;
private Thread thread;
private bool finished;
private bool actually_finished;
public ThreadedClass(ContainerClass container) {
this.container = container;
thread = new Thread(run);
thread.IsBackground = true;
thread.Start();
}
private void run() {
bool local_finished = false;
while (!local_finished) {
ArrayList to_process = null;
lock (container.list_lock) {
if (container.list.Count > 0) {
to_process = new ArrayList();
to_process.AddRange(container.list);
}
}
if (to_process == null) {
// Nothing to process so wait
lock (this) {
if (!finished) {
try {
Monitor.Wait(this);
} catch (ThreadInterruptedException) {
}
}
}
} else if (to_process.Count > 0) {
// Something to process, so go ahead and process the journals,
int sz = to_process.Count;
// For all elements
for (int i = 0; i < sz; ++i) {
// Pick the lowest element to process
object obj = to_process[i];
try {
// process the element...
...
} catch (IOException e) {
...
// If there is an error processing the best thing to do is finish
lock (this) {
finished = true;
}
}
}
}
lock (this) {
local_finished = finished;
// Remove the elements that we have just processed.
if (to_process != null) {
lock (container.list_lock) {
int sz = to_process.Count;
for (int i = 0; i < sz; ++i) {
container.list.RemoveAt(0);
}
}
}
// Notify any threads waiting
Monitor.PulseAll(this);
}
}
lock (this) {
actually_finished = true;
Monitor.PulseAll(this);
}
}
public void waitUntilFinished() {
lock (this) {
try {
while (!actually_finished) {
Monitor.Wait(this);
}
} catch (ThreadInterruptedException e) {
throw new ApplicationException("Interrupted: " + e.Message);
}
}
}
public void processList(int until_size) {
lock (this) {
Monitor.PulseAll(this);
int sz;
lock (container.list_lock) {
sz = container.list.Count;
}
// Wait until the sz is smaller than 'until_size'
while (sz > until_size) {
try {
Monitor.Wait(this);
} catch (ThreadInterruptedException ) {
}
lock (container.list_lock) {
sz = container.list.Count;
}
}
}
}
}
}
As you can see, the thread waits until the collection is empty, but it seems that the synchronization clash prevents the thread from ever reaching the point (the only one in the whole code) where an element is removed from the collection list in the ContainerClass.
This clash causes the code to never return, and the application keeps running, whenever processList is called with an until_size of 0.
I beg any better developer than me (and I guess there are a lot out there) to help me fix this small piece of code, since I really can't understand why the list is never emptied...
Thank you very much from the bottom of my heart.
PS. I would like to underline that the code works perfectly almost all the time: the only situation in which it breaks is when thread.processList(0) is called from ContainerClass.Stop().
Could the problem be that you are locking the ThreadedClass object itself rather than a dedicated synchronization object?
Try adding another private variable to lock on:
private readonly object lockObject = new object();
and replace all the calls to lock(this) with lock(lockObject).
MSDN clearly advises against what you're doing:
In general, avoid locking on a public type, or instances beyond your code's control. The common constructs lock (this), lock (typeof (MyType)), and lock ("myLock") violate this guideline:
lock (this) is a problem if the instance can be accessed publicly.
Edit:
I think I see a deadlock condition. If run() executes when there are no objects to process, or reaches a point where there are none left, it takes lock(this) and then calls Monitor.Wait(this), and the thread waits:
if (to_process == null) {
// Nothing to process so wait
lock (this) { /* nothing's going to get this lock again until Monitor.PulseAll(this) is called from somewhere */
if (!finished) {
try {
Monitor.Wait(this); /* thread is waiting for Pulse(this) or PulseAll(this) */
} catch (ThreadInterruptedException) {
}
}
}
}
If you are in this condition when you call ContainerClass.Stop(), then when ThreadedClass.processList(int) is called, it calls lock(this) again and can't enter the section because the run() method still has the lock:
lock (this) { /* run still holds this lock, waiting for PulseAll(this) to be called */
Monitor.PulseAll(this); /* this isn't called so run() never continues */
int sz;
lock (container.list_lock) {
sz = container.list.Count;
}
So, Monitor.PulseAll() can't be called to free the waiting thread in the run() method to exit the lock(this) area, so they are deadlocked waiting on each other. Right?
I think you should try to explain better what you actually want to achieve.
public void processList(int until_size) {
lock (this) {
Monitor.PulseAll(this);
This looks very strange, as you would normally call Monitor.Pulse after changing the shared state, not immediately upon acquiring the lock.
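For reference, the usual Monitor shape is to wait in a loop that re-checks a condition, and to pulse at the point where that condition changes; a minimal generic sketch (the gate/ready names are illustrative):
private readonly object gate = new object();
private bool ready;

void WaitUntilReady()
{
    lock (gate)
    {
        while (!ready)          // re-check after every wake-up
            Monitor.Wait(gate); // releases the lock while waiting
    }
}

void MarkReady()
{
    lock (gate)
    {
        ready = true;           // change the shared state first...
        Monitor.PulseAll(gate); // ...then signal the waiters
    }
}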
Where are you creating the worker threads? This section is not clear, as I only see Thread.Start().
Btw I would advise you to look at PowerCollections - maybe you find what you need there.
The following C# class is used in a multithreaded environment. I removed most of the actual code. The problem occurs when calling MethodA and MethodB almost simultaneously. Changing the order of the locks in the IsDepleted property doesn't solve the problem. Removing lock(WaitingQueue) from the IsDepleted property solves the deadlock, but that solution causes a problem when another thread adds/removes an item from the WaitingQueue between the WaitingQueue.Count == 0 and Processing.Count == 0 statements.
using System.Collections.Generic;
class Example
{
bool IsDepleted
{
get
{
lock (Processing)
{
lock (WaitingQueue)
{
return WaitingQueue.Count == 0
&& Processing.Count == 0;
}
}
}
}
private readonly List<object> Processing = new List<object>();
private readonly Queue<object> WaitingQueue = new Queue<object>();
public void MethodA(object item)
{
lock (WaitingQueue)
{
if (WaitingQueue.Count > 0)
{
if (StartItem(WaitingQueue.Peek()))
{
WaitingQueue.Dequeue();
}
}
}
}
public void MethodB(object identifier)
{
lock (Processing)
{
Processing.Remove(identifier);
if (!IsDepleted)
{
return;
}
}
//Do something...
}
bool StartItem(object item)
{
//Do something and return a value
return true; // placeholder so the snippet compiles
}
}
It depends on whether you want a quick fix or a rigorous fix.
A quick fix would be just to use one lock object in all cases.
e.g. private readonly object _lock = new object();
And then just lock on that. However, depending on your situation, that may impact performance more than you can accept.
I.e. your code would become this:
using System.Collections.Generic;
class Example
{
private readonly object _lock = new object();
bool IsDepleted
{
get
{
lock (_lock)
{
return WaitingQueue.Count == 0
&& Processing.Count == 0;
}
}
}
private readonly List<object> Processing = new List<object>();
private readonly Queue<object> WaitingQueue = new Queue<object>();
public void MethodA(object item)
{
lock (_lock)
{
if (WaitingQueue.Count > 0)
{
if (StartItem(WaitingQueue.Peek()))
{
WaitingQueue.Dequeue();
}
}
}
}
public void MethodB(object identifier)
{
lock (_lock)
{
Processing.Remove(identifier);
if (!IsDepleted)
{
return;
}
}
//Do something...
}
bool StartItem(object item)
{
//Do something and return a value
return true; // placeholder so the snippet compiles
}
}
Take the Processing lock in method A and the WaitingQueue lock in method B (in other words, make it look like the first block of code). That way, you always take the locks in the same order and you'll never deadlock.
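Concretely, that means every code path acquires Processing before WaitingQueue, matching IsDepleted; an illustrative rewrite of MethodA from the question under that ordering:
public void MethodA(object item)
{
    lock (Processing)       // outer lock, same order as IsDepleted
    {
        lock (WaitingQueue) // inner lock
        {
            if (WaitingQueue.Count > 0 && StartItem(WaitingQueue.Peek()))
            {
                WaitingQueue.Dequeue();
            }
        }
    }
}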
Simplify your code and use only a single object to lock on. You could also replace your locks with:
Monitor.TryEnter(Processing, 1000)
This will give you a one-second timeout. So essentially:
if (Monitor.TryEnter(Processing, 1000))
{
try
{
//do x
}
finally
{
Monitor.Exit(Processing);
}
}
Now you won't prevent the deadlock, but you can handle the case where you don't get the lock.
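On newer framework versions (.NET 4 and up), the TryEnter overload that reports acquisition through a ref flag is the safer idiom, since the flag is set even if an exception interrupts the call; a minimal sketch:
bool lockTaken = false;
try
{
    Monitor.TryEnter(Processing, 1000, ref lockTaken);
    if (lockTaken)
    {
        //do x
    }
}
finally
{
    if (lockTaken)
    {
        Monitor.Exit(Processing);
    }
}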