I am trying to add a batch of nodes to a TreeView asynchronously, in 50ms time intervals, but I am getting an exception that says the collection has been modified. How can I prevent this exception from occurring?
Collection was modified; enumeration operation may not execute.
public partial class Form1 : Form
{
private SynchronizationContext synchronizationContext;
private DateTime previousTime = DateTime.Now;
private List<int> _batch = new List<int>();
public Form1()
{
InitializeComponent();
}
private async void button1_Click(object sender, EventArgs e)
{
synchronizationContext = SynchronizationContext.Current;
await Task.Run(() =>
{
for (var i = 0; i <= 5000000; i++)
{
_batch.Add(i);
UpdateUI(i);
}
});
}
public void UpdateUI(int value)
{
var timeNow = DateTime.Now;
if ((DateTime.Now - previousTime).Milliseconds <= 50) return;
synchronizationContext.Post(new SendOrPostCallback(o =>
{
foreach (var item in _batch)
{
TreeNode newNode = new TreeNode($"{item}");
treeView1.Nodes.Add(newNode);
}
}), null);
_batch.Clear();
previousTime = timeNow;
}
}
List<T> is not thread-safe. You are modifying the list on one thread while your foreach is reading from it on another.
There are a few ways to solve this.
Use ConcurrentQueue
Change:
private List<int> _batch = new List<int>();
To:
private ConcurrentQueue<int> _batch = new ConcurrentQueue<int>();
You can add items by calling:
_batch.Enqueue(1);
Then you can grab them all out and add them in a thread-safe manner:
while(_batch.TryDequeue(out var n))
{
// ...
}
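For example, here is a minimal sketch of how the posted UpdateUI could drain the queue on the UI thread (assuming _batch is now the ConcurrentQueue<int> and reusing synchronizationContext and treeView1 from the question):
// Drain everything currently queued and add it to the tree on the UI thread.
synchronizationContext.Post(o =>
{
    while (_batch.TryDequeue(out var item))
    {
        treeView1.Nodes.Add(new TreeNode($"{item}"));
    }
}, null);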
Use List, but with Thread-synchronization Objects
Create a new object like so:
private readonly object _listLock = new object();
Then add these helper methods:
private void AddToBatch(int value)
{
lock(_listLock)
{
_batch.Add(value);
}
}
private int[] GetAllItemsAndClear()
{
lock(_listLock)
{
try
{
return _batch.ToArray();
}
finally
{
_batch.Clear();
}
}
}
This ensures only one thread at a time is operating on the List object.
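As a rough illustration (a sketch, not a drop-in fix), UpdateUI could then post a private snapshot instead of enumerating the shared list:
// The snapshot is taken under the lock; the UI thread then enumerates only the snapshot.
var snapshot = GetAllItemsAndClear();
synchronizationContext.Post(o =>
{
    foreach (var item in snapshot)
    {
        treeView1.Nodes.Add(new TreeNode($"{item}"));
    }
}, null);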
There are other ways to do this as well, but this is the gist of the problem. Your code won't be as fast because of the overhead of synchronizing data, but you won't crash.
Related
I need a thread-safe cache to store instances of a disposable class.
It will be used with .NET 4.0.
The cache should be aware of whether a stored instance is currently being used or not.
When an instance is requested from the cache, it should look at the stored available instances and hand one out; if none is available, it should create a new instance and store it.
If the cache has not been used for a period of time, it should dispose the stored instances that are not in use and clear them.
This is the solution I wrote:
private class cache<T> where T : IDisposable
{
Func<T> _createFunc;
long livingTicks;
int livingMillisecs;
public cache(Func<T> createFunc, int livingTimeInSec)
{
this.livingTicks = livingTimeInSec * 10000000;
this.livingMillisecs = livingTimeInSec * 1000;
this._createFunc = createFunc;
}
Stack<T> st = new Stack<T>();
public IDisposable BeginUseBlock(out T item)
{
this.actionOccured();
if (st.Count == 0)
item = _createFunc();
else
lock (st)
if (st.Count == 0)
item = _createFunc();
else
item = st.Pop();
return new blockDisposer(this, item);
}
long _lastTicks;
bool _called;
private void actionOccured()
{
if (!_called)
lock (st)
if (!_called)
{
_called = true;
System.Threading.Timer timer = null;
timer = new System.Threading.Timer((obj) =>
{
if ((DateTime.UtcNow.Ticks - _lastTicks) > livingTicks)
{
timer.Dispose();
this.free();
}
},
null, livingMillisecs, livingMillisecs);
}
_lastTicks = DateTime.UtcNow.Ticks;
}
private void free()
{
lock (st)
{
while (st.Count > 0)
st.Pop().Dispose();
_called = false;
}
}
private class blockDisposer : IDisposable
{
T _item;
cache<T> _c;
public blockDisposer(cache<T> c, T item)
{
this._c = c;
this._item = item;
}
public void Dispose()
{
this._c.actionOccured();
lock (this._c.st)
this._c.st.Push(_item);
}
}
}
This is a sample use:
class MyClass:IDisposable
{
public MyClass()
{
//expensive work
}
public void Dispose()
{
//free
}
public void DoSomething(int i)
{
}
}
private static Lazy<cache<MyClass>> myCache = new Lazy<cache<MyClass>>(() => new cache<MyClass>(() => new MyClass(), 60), true);//free 60sec. after last call
private static void test()
{
Parallel.For(0, 100000, (i) =>
{
MyClass cl;
using (myCache.Value.BeginUseBlock(out cl))
cl.DoSomething(i);
});
}
My questions:
Is there a faster way of doing this? (I've searched for MemoryCache examples, but couldn't figure out how I could use it for my requirements. It also requires a key lookup; I assumed Stack.Pop would be faster than a key search, and for my problem performance is very important.)
In order to dispose the instances after a while (60 sec. in the example code) I had to use a Timer. I just need a delayed function call that is re-delayed on every action involving the cache. Is there a way to do that without using a timer?
Edit:
I've tried #mjwills's comment. The performance is better with this:
ConcurrentStack<T> st = new ConcurrentStack<T>();
public IDisposable BeginUseBlock(out T item)
{
this.actionOccured();
if (!st.TryPop(out item))
item = _createFunc();
return new blockDisposer(this, item);
}
Edit2:
In my case it's not required, but if we need to control the size of the stack and dispose the unused objects, using a separate counter that is incremented and decremented with Interlocked.Increment/Interlocked.Decrement will be faster (#mjwills).
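A rough sketch of that idea, assuming the same T : IDisposable constraint as the cache class (the names _pooledCount, MaxPooled, Rent, and Return are illustrative, not from the original code):
// Illustrative only: bound the pool size with an Interlocked counter.
private int _pooledCount;                                   // items currently stored
private const int MaxPooled = 100;                          // assumed cap
private readonly ConcurrentStack<T> _stack = new ConcurrentStack<T>();

private T Rent(Func<T> create)
{
    if (_stack.TryPop(out var item))
    {
        Interlocked.Decrement(ref _pooledCount);
        return item;
    }
    return create();
}

private void Return(T item)
{
    // Reserve a slot first; if the pool is already full, dispose the instance instead.
    if (Interlocked.Increment(ref _pooledCount) <= MaxPooled)
    {
        _stack.Push(item);
    }
    else
    {
        Interlocked.Decrement(ref _pooledCount);
        item.Dispose();
    }
}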
I've been searching for several hours now and have not been able to find a proper solution. I'm a C# beginner.
I have a WinForms app with a ListBox and a class that does some work and should run forever on a separate thread. I want to push a MyDataStruct to the ListBox each time one is created in WorkerClass.Work.
Later on, several WorkerClass instances should run simultaneously and I will have a ComboBox to pick which instance's data to feed to the ListBox. Is it better to have WorkerClass return only a single MyDataStruct and keep the queue in the Form1 class, or to have a queue in each WorkerClass and exchange the entire queue with Form1 every time it changes?
Is my QueueToLb method a good way to add queue data to the ListBox?
Thank you for your support.
public partial class Form1 : Form
{
Queue<MyDataStruct> qList;
MyDataStruct myDataStruct;
private void RunTask()
{
//how do I make MyLongTask to update either qList or myDataStuct
Task.Run(() =>
{
MyLongTask(0, 1000);
});
}
private void MyLongTask(int low, int high)
{
WorkerClass wc = new WorkerClass();
wc.Work(low,high);
}
private void QueueToLb()
{
//is this good way to update listbox from queue?
List<MyDataStruct> lstMds = qList.Reverse<MyDataStruct>().ToList<MyDataStruct>();
List<string> lstStr = new List<string>();
foreach (MyDataStruct m in lstMds)
{
lstStr.Add(m.ToString());
}
listBox1.DataSource = lstStr;
}
}
public class WorkerClass
{
Queue<MyDataStruct> qList; //not sure if its better to keep the queue here or in Form1
public WorkerClass()
{
qList = new Queue<MyDataStruct>();
}
public void Work(int low, int high) //does some work forever
{
while (true)
{
if (qList.Count > 11) qList.Dequeue();
MyDataStruct mds = new MyDataStruct();
Random random = new Random();
mds.dt = DateTime.Now;
mds.num = random.Next(low, high);
qList.Enqueue(mds);
Thread.Sleep(1000);
}
}
}
public class MyDataStruct
{
public DateTime dt;
public int num;
public override string ToString()
{
StringBuilder s = new StringBuilder();
s.Append(num.ToString());
s.Append(" - ");
s.Append(dt.ToShortDateString());
return s.ToString();
}
}
OK, I think I figured out how to use BackgroundWorker for this. I'd be happy if someone could verify that it is correct.
public partial class Form1 : Form
{
Queue<MyDataStruct> qList;
BackgroundWorker bw = new BackgroundWorker();
public Form1()
{
InitializeComponent();
bw.WorkerReportsProgress = true;
bw.DoWork += new DoWorkEventHandler(Bw_DoWork);
bw.ProgressChanged += new ProgressChangedEventHandler(bw_ProgressChanged);
}
private void Form1_Load(object sender, EventArgs e)
{
qList = new Queue<MyDataStruct>(12);
}
private void button1_Click(object sender, EventArgs e)
{
bw.RunWorkerAsync();
}
private void MyLongTask(int low = 0, int high = 1000)
{
WorkerClass wc = new WorkerClass(bw);
wc.Work(low,high);
}
private void BindToLbWithQueue()
{
MyDataStruct mds = new MyDataStruct();
Random random = new Random();
mds.dt = DateTime.Now;
mds.num = random.Next(0, 1000);
qList.Enqueue(mds);
QueueToLb();
}
private void QueueToLb()
{
//is this good way to update listbox from queue?
List<MyDataStruct> lstMds = qList.Reverse<MyDataStruct>().ToList<MyDataStruct>();
List<string> lstStr = new List<string>();
foreach (MyDataStruct m in lstMds)
{
lstStr.Add(m.ToString());
}
listBox1.DataSource = lstStr;
}
#region worker
private void Bw_DoWork(object sender, DoWorkEventArgs e)
{
MyLongTask();
}
private void bw_ProgressChanged(object sender, ProgressChangedEventArgs e)
{
qList = (Queue<MyDataStruct>)e.UserState;
QueueToLb();
}
#endregion
}
public class WorkerClass
{
Queue<MyDataStruct> qList; //not sure if its better to keep the queue here or in Form1
BackgroundWorker bw = null;
public WorkerClass(BackgroundWorker bw)
{
this.bw = bw;
qList = new Queue<MyDataStruct>();
}
public void Work(int low, int high) //does some work forever
{
while (true)
{
if (qList.Count > 11) qList.Dequeue();
MyDataStruct mds = new MyDataStruct();
Random random = new Random();
mds.dt = DateTime.Now;
mds.num = random.Next(low, high);
qList.Enqueue(mds);
bw.ReportProgress(0, qList);
Thread.Sleep(1000);
}
}
}
public class MyDataStruct
{
public DateTime dt;
public int num;
public override string ToString()
{
StringBuilder s = new StringBuilder();
s.Append(num.ToString());
s.Append(" - ");
s.Append(dt.ToShortDateString());
return s.ToString();
}
}
I am writing an infinite loop that pulls from a queue (RabbitMQ) and processes each pulled item on concurrent threads, with a limited number of running threads.
Now I want a way to limit the thread execution count. See an example of my loop:
public class ThreadWorker<T>
{
public List<T> _lst;
private int _threadCount;
private int _maxThreadCount;
public ThreadWorker(List<T> lst, int maxThreadCount)
{
_lst = lst;
_maxThreadCount = maxThreadCount;
}
public void Start()
{
var i = 0;
while (i < _lst.Count)
{
i++;
var pull = _lst[i];
Process(pull);
}
}
public void Process(T item)
{
if (_threadCount > _maxThreadCount)
{
//wait any opration be done
// How to wait for one thread?
Interlocked.Decrement(ref _threadCount);
}
var t = new Thread(() => Opration(item));
t.Start();
Interlocked.Increment(ref _threadCount);
}
public void Opration(T item)
{
Console.WriteLine(item.ToString());
}
}
Notice that when I use a semaphore for the limit, the Start() method doesn't wait for any running thread. Once _maxThreadCount threads are running, my loop should wait until a thread is released and only then start a new thread for concurrent processing.
I would use a SemaphoreSlim this way to control the number of threads:
public class ThreadWorker<T>
{
SemaphoreSlim _sem = null;
List<T> _lst;
public ThreadWorker(List<T> lst, int maxThreadCount)
{
_lst = lst;
_sem = new SemaphoreSlim(maxThreadCount);
}
public void Start()
{
for (var i = 0; i < _lst.Count; i++)
{
    var pull = _lst[i];
    _sem.Wait(); /*****/
    Process(pull);
}
}
public void Process(T item)
{
var t = new Thread(() => Opration(item));
t.Start();
}
public void Opration(T item)
{
Console.WriteLine(item.ToString());
_sem.Release(); /*****/
}
}
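For completeness, a small usage sketch of the class above (the element type, item count, and thread cap are made up for illustration):
using System.Linq;

// Process 100 integers with at most 4 threads running Opration at a time.
var items = Enumerable.Range(0, 100).ToList();
var worker = new ThreadWorker<int>(items, maxThreadCount: 4);
worker.Start();
One caveat: if Opration can throw, wrapping its body in try/finally and calling _sem.Release() in the finally block keeps the semaphore count from leaking.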
I am modifying the enumerated collection, but I put a lock around it...
and I don't understand why I get
"Collection was modified; enumeration operation may not execute."
I don't want to solve it with foreach (IObserver obs in _observers.ToList()).
The code is the Observer pattern:
class Program
{
static void Main(string[] args)
{
Subject sub = new Subject();
Obs1 obs1 = new Obs1(sub);
Obs2 obs2 = new Obs2(sub);
sub.Nodefiy();
sub.Nodefiy();
sub.Nodefiy();
sub.Nodefiy();
sub.Nodefiy();
sub.Nodefiy();
Console.ReadKey();
}
}
public interface IObserver
{
void Update(int data);
}
public interface ISubscrib
{
void Reg(IObserver obs);
void UnReg(IObserver obs);
void Nodefiy();
}
public class Subject : ISubscrib
{
private static Object _lock;
private List<IObserver> _observers;
private int data = 0;
public Subject()
{
_lock = new Object();
_observers = new List<IObserver>();
}
public void Reg(IObserver obs)
{
lock (_lock)
{
_observers.Add(obs);
}
}
public void UnReg(IObserver obs)
{
lock (_lock)
{
int ind = _observers.IndexOf(obs);
if (ind >= 0)
{
_observers.RemoveAt(ind);
}
}
}
public void Nodefiy()
{
data = data + 1;
lock (_lock)
{
int sentData = data;
//foreach (IObserver obs in _observers.ToList<IObserver>())
foreach (IObserver obs in _observers)
{
obs.Update(sentData);
}
}
}
}
public class Obs1 : IObserver
{
private ISubscrib _subj;
public Obs1(ISubscrib subj)
{
_subj = subj;
_subj.Reg(this);
}
public void Update(int data)
{
Console.WriteLine("Obs1: {0}", data);
}
}
public class Obs2 : IObserver
{
private ISubscrib _subj;
public Obs2(ISubscrib subj)
{
_subj = subj;
_subj.Reg(this);
}
public void Update(int data)
{
Console.WriteLine("Obs2: {0}", data);
if (data > 3)
{
_subj.UnReg(this);
}
}
}
Can anyone help me?
Thanks...
When your Obs2 invokes Update inside this foreach loop, it goes back to your Subject object and modifies the _observers collection on the same thread. That's why the lock is not helping: this is not a synchronization issue; your problem is happening on the same thread.
I am not sure what you are trying to do in this code, so I can't help further.
A lock has to do with thread synchronization; I'm not sure why you thought that would help you.
You can't enumerate over a collection while you are in the middle of modifying that collection (see all the related questions). Change your foreach to a reverse for loop instead; iterating backwards stays valid even when Update unregisters the current observer:
for (int i = _observers.Count-1; i >= 0; i--) {
_observers[i].Update(sentData);
}
The problem, I think, is that while you are iterating the list of _observers you also add to or modify the collection from another thread.
You can synchronize threads, and you should, because if you try to access an observer that was removed, not even a solution like using a for loop with an indexer will help.
Try looking into mutexes to implement a scheme where, while one thread is iterating through the collection, other threads wait until it is done before modifying the collection.
I was wondering if somebody could explain to me what is wrong in my code.
I run a timer on an extra thread. In the timer function I work with an EF context. I can see that the timer function has run 6 times, and I deliberately set the interval to 3 seconds and take only 100 rows, but in my DB I see the result of only one run.
So where is my error?
namespace WindowsFormsApplication2
{
public partial class Form1 : Form
{
private static int cnt = 0;
private Thread thread;
public Form1()
{
InitializeComponent();
}
private void StartThread()
{
var timer = new System.Timers.Timer();
timer.Elapsed += new System.Timers.ElapsedEventHandler(ProcessDb);
timer.Interval = 3000;
timer.Start();
}
public void ProcessDb(object sender, System.Timers.ElapsedEventArgs e)
{
cnt++;
ConnService serv = new ConnService();
serv.UpdateConnections(cnt);
}
private void button1_Click(object sender, EventArgs e)
{
thread = new Thread(StartThread);
thread.Start();
Thread.Sleep(20000);
}
}
public class MyqQueue
{
public static Stack<int> myStack = new Stack<int>();
public static Stack<int> myStack2 = new Stack<int>();
}
}
namespace WindowsFormsApplication2
{
class ConnService
{
public ConnService()
{
cnt++;
}
private static int cnt;
public void UpdateConnections(int second)
{
MyqQueue.myStack.Push(second);
DjEntities ctx = new DjEntities();
var entities = ctx.Connections.Take(100).Where(c => c.State == null);
foreach (var connection in entities)
{
connection.State = second;
}
if (second == 1)
Thread.Sleep(7000);
MyqQueue.myStack2.Push(second);
ctx.SaveChanges();
}
}
}
ctx.Connections.Take(100).Where(c => c.State == null)
Should be changed to
ctx.Connections.Where(c => c.State == null).Take(100)
Your first query translates to: first take 100 rows without any filtering, and then apply the filter to those.
The second query I have written filters the rows first and then takes the top 100.
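Applied to the posted UpdateConnections, the relevant part would look roughly like this (sketch only, reusing the DjEntities context from the question):
// Filter first, then take the first 100 unprocessed rows.
var entities = ctx.Connections.Where(c => c.State == null).Take(100);
foreach (var connection in entities)
{
    connection.State = second;
}
ctx.SaveChanges();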