I'm using Parallel LINQ to load a list of links from a text file. I'm checking each line to see whether it is a valid link (Uri) or not; if it is a valid Uri, it is added to a ListBox. I'm just wondering if I should lock ListBox.Items while adding a link to it.
Here is my code.
if (openFile.ShowDialog() == DialogResult.OK)
{
    File.ReadLines(openFile.FileName).AsParallel().AsOrdered().ForAll(x =>
    {
        if (x.IsValidUri())
        {
            //lock(siteList.Items) <- should I?
            siteList.Invoke(new Action<string>(s => siteList.Items.Add(s)), x);
        }
    });
}
There is no need to lock in this case. Using Invoke() already forces all changes to the Items collection to occur synchronously on the GUI thread.
Because of that though, you're not really gaining anything by using AsParallel(). You may want to consider using BeginInvoke() instead, which may speed things up a bit. That way, the calling thread isn't waiting for the invoke to complete.
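For illustration, a sketch of the same loop with BeginInvoke() swapped in (everything else is unchanged from the question's code):

File.ReadLines(openFile.FileName).AsParallel().AsOrdered().ForAll(x =>
{
    if (x.IsValidUri())
    {
        // BeginInvoke queues the add and returns immediately, so the
        // worker thread doesn't block waiting for the UI thread.
        siteList.BeginInvoke(new Action<string>(s => siteList.Items.Add(s)), x);
    }
});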
The following code gives me an exception:
var snapshot = BluetoothCapture.Instance.Snapshot();
var allowedDevice = snapshot.FirstOrDefault(_ => some_expression);
Collection was modified; enumeration operation may not execute.
I thought I could use a lock to freeze the collection so that I can iterate through it. However, I'm still getting the same exception.
The class definition below has a Snapshot method that attempts this:
public partial class BluetoothCapture
{
    ...

    public void Capture()
    {
        _watcher = DeviceInformation.CreateWatcher();
        _watcher.Added += (s, e) => { _devices.Add(e); };
        _watcher.Start();
    }

    public IEnumerable<DeviceInformation> Snapshot()
    {
        lock (_devices)
        {
            return _devices.AsReadOnly();
        }
    }
}
Any suggestions?
lock is used when you need to stop a block of code from executing on multiple threads at the same time (i.e., to suspend the parallel execution).
If Capture is called multiple times, then yes, a write can start before the previous one has finished.
You can use a ConcurrentBag.
ConcurrentBag is a list-like collection, but it is thread safe (no generic List is).
However, ConcurrentBag is an unordered collection, so it does not guarantee ordering.
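For illustration, a minimal sketch of what that change could look like (the field declaration is an assumption, since it isn't shown in the question):

private readonly ConcurrentBag<DeviceInformation> _devices =
    new ConcurrentBag<DeviceInformation>();

public IEnumerable<DeviceInformation> Snapshot()
{
    // ConcurrentBag.ToArray takes a moment-in-time copy, so later Adds
    // cannot invalidate enumeration of the returned snapshot.
    return _devices.ToArray();
}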
If you need an ordered collection, see this link:
Thread safe collections in .NET
You can also take the lock inside the Add handler (not just in the get):
_watcher.Added += (s, e) => { lock (_devices) { _devices.Add(e); } };
But if your app runs for a while, you can run into memory and performance problems (the add will no longer be asynchronous, even though the Capture is).
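Note that locking the Add alone isn't enough, because AsReadOnly() only wraps the live list. A sketch of a Snapshot that copies under the same lock (the copying step is an assumption; the original code returns the wrapper):

public IEnumerable<DeviceInformation> Snapshot()
{
    lock (_devices)
    {
        // ToList copies the contents, so later Adds can no longer
        // invalidate enumeration of the returned snapshot.
        return _devices.ToList();
    }
}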
lock is a really useful construct, but only when it is used wisely.
If the code that follows doesn't need to update the collection you get from BluetoothCapture.Instance.Snapshot(), but only runs a LINQ query over it to pull out filtered values, you can avoid using lock.
That is beneficial too: by not locking, you are not holding up other threads while they perform their logic. And we should not ignore the fact that careless use of lock can cause serious problems such as deadlocks.
You are most likely getting this exception because the collection you are running the LINQ query over is being updated by another thread (I ran into the same problem).
One thing you can do: instead of querying the shared collection directly (the one you get from BluetoothCapture.Instance.Snapshot()), create a local list. As it is local, it will not be updated by other threads.
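A minimal sketch of that idea, reusing the question's some_expression placeholder:

// Materialize the snapshot into a local list once; the local copy
// cannot be mutated by the capture thread while we query it.
// (The copy itself can still race with the capture thread, so ideally
// take it under the lock shown in the class above.)
var local = BluetoothCapture.Instance.Snapshot().ToList();
var allowedDevice = local.FirstOrDefault(_ => some_expression);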
Is it possible to prevent multiple executions of a ReactiveCommand?
Here is the 'simple' code I use:
The command is created:
this.LoadCommand = ReactiveCommand.CreateAsyncTask(
    async _ => await this._dataService.Load(),
    RxApp.TaskpoolScheduler);
Then I add the subscription to the command:
this.LoadCommand.Subscribe(assets => ...);
And finally, I execute the command:
this.LoadCommand.ExecuteAsyncTask();
If I call ExecuteAsyncTask multiple times from several locations, I would like any subsequent calls to wait for the first one to finish.
EDIT:
Here is the complete code for the Subscribe method:
this.LoadCommand.Subscribe(assets =>
{
    Application.Current.Dispatcher.Invoke(
        DispatcherPriority.Background,
        new Action(() => this.Assets.Clear()));

    foreach (Asset asset in assets)
    {
        Application.Current.Dispatcher.Invoke(
            DispatcherPriority.Background,
            new Action<Asset>(a =>
            {
                this.Assets.Add(a);
            }), asset);
    }
});
Thanks,
Adrien.
I downloaded your sample application, and was able to fix it. Here's my 2 cents:
1) I took out the RxApp.TaskpoolScheduler parameter in your command creation. That parameter tells the command to deliver its results using that scheduler, and I think you want to stick to delivering results on the UI thread.
2) Since by making this change you are now running your Subscribe logic on the UI thread, you don't need to deal with all that Invoking. You can access the collection directly:
this.LoadCommand.Subscribe(dataCollection =>
{
    DataCollection.Clear();
    DataCollection.AddRange(dataCollection);
});
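For reference, the corresponding command creation is just the question's code with the scheduler argument removed:

this.LoadCommand = ReactiveCommand.CreateAsyncTask(
    async _ => await this._dataService.Load());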
Making just those 2 changes caused it to "work".
I'm no expert, but what I think was happening is this: your ReactiveCommand LoadCommand was returning immediately and delivering results on various task-pool threads. The command itself would never allow concurrency within its own execution, which is by design. However, since each Subscribe callback was coming in on a different thread, the callbacks were happening concurrently (a race). So all the Clears occurred, then all the Adds.
By subscribing and handling all on the same thread you can avoid this, and if you can manage it on the UI thread, you won't need to involve Invoking to the Dispatcher.
Also, in this particular situation, using Invoke on the Dispatcher with DispatcherPriority.Background seems to execute things in a non-serial fashion. I'm not sure exactly what the order was, but it seemed to do all the Clears, then the Adds in reverse order (I incremented a counter so I could tell which invocation was which), so there is definitely something to be said for that. FWIW, changing the priority to DispatcherPriority.Send kept things serial and displayed the "expected" behavior. That said, I still prefer avoiding invoking to the Dispatcher altogether, if you can.
Thanks for the assistance. I've got a three-threaded process linked by a concurrent queue. Thread one processes information and returns it to the second thread, which places the data into a concurrent queue. The third thread is just looping like so:
while (true) {
    if (queue.TryDequeue(out info)) {
        doStuff(info);
    } else {
        Thread.Sleep(1);
    }
}
Is there a better way to handle this so that I'm not iterating over the loop so much? The application is extremely performance-sensitive, and currently just the TryDequeue call is taking ~8-9% of the application's runtime. I'm looking to decrease that as much as possible, but I'm not really sure what my options are.
You should consider using System.Collections.Concurrent.BlockingCollection and its Add() / Take() methods. With Take(), your third thread is simply suspended while waiting for a new item. Add() is thread safe and can be used by the second thread.
With that approach you should be able to simplify your code into something like this:
while (true) {
    var info = collection.Take();
    doStuff(info);
}
You can increase the sleep time. I would also use await Task.Delay instead of Thread.Sleep. That way you can wait longer without the extra CPU cycles that Thread.Sleep burns, and you can still cancel the delay by making use of a CancellationTokenSource.
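A sketch of that variant (it assumes the loop now lives inside an async method; the 50 ms delay and the token wiring are illustrative):

var cts = new CancellationTokenSource();
while (!cts.Token.IsCancellationRequested) {
    if (queue.TryDequeue(out info)) {
        doStuff(info);
    } else {
        // Waits without tying up a thread; cts.Cancel() aborts the wait
        // (surfacing as an OperationCanceledException).
        await Task.Delay(50, cts.Token);
    }
}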
On another note, there are better ways of queuing up jobs. Considering that you appear to want these jobs to run synchronously, one approach is a singleton class that takes your work items and queues them up: if the queue is empty when you add an item, it detects that and starts the job process; at the end of the job process it checks for more work and recurses to do it, or, if there are no more jobs, exits the job process, which will start again the next time an item is added to the empty queue. If my assumption is wrong and you can run these jobs in parallel, why use a queue at all?
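A rough sketch of that singleton idea (every name here is illustrative; none of it comes from the original answer):

public sealed class JobQueue
{
    public static JobQueue Instance { get; } = new JobQueue();

    private readonly ConcurrentQueue<Action> _jobs = new ConcurrentQueue<Action>();
    private int _running; // 0 = idle, 1 = a job process is active

    public void Enqueue(Action job)
    {
        _jobs.Enqueue(job);
        // If no job process is running, start one on the thread pool.
        if (Interlocked.CompareExchange(ref _running, 1, 0) == 0)
            Task.Run(() => ProcessJobs());
    }

    private void ProcessJobs()
    {
        Action job;
        while (_jobs.TryDequeue(out job))
            job(); // run each job to completion, one at a time

        Interlocked.Exchange(ref _running, 0);

        // An item may have been enqueued just as the loop exited; recurse if so.
        if (!_jobs.IsEmpty && Interlocked.CompareExchange(ref _running, 1, 0) == 0)
            ProcessJobs();
    }
}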
You may like to use a thread-safe implementation of ObservableCollection. Check out this SO question: ObservableCollection and threading
I don't have a recommendation that avoids looping; however, I would recommend you move away from
while (true)
and consider this instead:
MyThing thing;
while (queue.TryDequeue(out thing))
{
    doWork(thing);
}
Put this in a method that gets called each time the queue is modified; that ensures it runs when needed, but ends when it is not.
I've got some code which saves data from an object to XML. It locked the UI for a few seconds, so I changed it so it wouldn't:
foreach (Path path in m_canvasCompact.Children)
{
    Task.Run(() => WritePathDataToXML(false, path));
}

private void WritePathDataToXML(bool is32x32, Path path)
{
    //stuff going on...
    xmlDoc.Root.Descendants.......Add(iconToAdd);
    xmlDoc.Save(..);
}
The problem is (as expected) that the data is written to the XML in a random order, depending on the speed at which the tasks finish (I assume).
I could probably write some bodged code which looks at the XML and rearranges it once everything has completed, but that's not ideal. Is there any way to do this on a separate thread, but perhaps only one task at a time, so they get executed and saved in the correct order?
Thanks.
It sounds like you want a producer/consumer queue. You can rig that up fairly easily using BlockingCollection<T>.
Create the blocking collection
Start a task which will read from the collection until it's "finished" (simplest with GetConsumingEnumerable), writing to the file
Add all the relevant items to the collection - making sure you do everything that touches UI elements within the UI thread.
Tell the collection it's "finished" (CompleteAdding)
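A skeleton of just the queue mechanics (WriteItemToXml and ExtractPathData are placeholders, since we don't know the real per-Path work):

var queue = new BlockingCollection<string>();

// Consumer: a single task drains items, in order, until CompleteAdding is called.
var writer = Task.Run(() =>
{
    foreach (var item in queue.GetConsumingEnumerable())
    {
        WriteItemToXml(item); // hypothetical: writes one item's XML
    }
});

// Producer, on the UI thread: snapshot what you need from each element first.
foreach (Path path in m_canvasCompact.Children)
{
    queue.Add(ExtractPathData(path)); // hypothetical UI-thread extraction
}

queue.CompleteAdding(); // tell the collection it's "finished"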
Alternatively, as suggested in comments:
In the UI thread, create a collection with all the information you need from UI elements - basically you don't want to touch the UI elements within a non-UI thread.
Start a task to write that collection to disk; optionally await that task (which won't block the UI)
That's simpler, but it does mean building up the whole collection in memory before you start writing. With the first approach, you can add to the collection as you write - although it's entirely possible that if building the collection is much faster than writing to disk, you'll end up with the whole thing in memory anyway. If this is infeasible, you'll need some way of adding "gently" from the UI thread, without blocking it. It would be nice if BlockingCollection had an AddAsync method, but I can't see one.
We don't know enough about what you're doing with the Path elements to give you sample code for this, but hopefully that's enough of a starting point.
Run the whole loop in a Task:
Task.Run(() =>
{
    foreach (Path path in m_canvasCompact.Children)
    {
        WritePathDataToXML(false, path);
    }
});
This will still take the same time, but should not block the UI.
Edit: To simplify things, here is the paradigm: I have a list of items that is constantly being updated by a continuous stream. Every now and then I get a new data snapshot that re-initializes the stream. As a result, if any updates are happening while I want to re-initialize, I need to make sure those updates stop and the new snapshot is used.
I am dealing with a number of continuous streams of updates that need to be displayed in the UI.
The updates need to be displayed in reverse order, i.e. the most recent update goes on top of the list.
In order to display the result on top, I have to insert into the list.
The problem I have is that sometimes the list needs to be reset (i.e. List.Clear); however, if I am mid-insert, I need to stop the insert, because otherwise it will cause an exception.
I've put together a reactive method to help me with this; however, it seems to be ignoring my until stream.
public static IObservable<T> BufferAndDispatchUntil<T, TStopUnit>(
    this IObservable<T> source,
    Action<T> onNext,
    IScheduler scheduler,
    IObservable<TStopUnit> until,
    DispatcherPriority dispatcherPriority = DispatcherPriority.Background)
{
    if (source == null) throw new ArgumentNullException("source");
    if (onNext == null) throw new ArgumentNullException("onNext");

    if (Application.Current == null)
        return source.Do(onNext);

    var dispatcher = Application.Current.Dispatcher;

    return source
        .LazyBuffer(BufferTime, BufferCount, scheduler)
        .TakeUntil(until)
        .Do(b => dispatcher.BeginInvoke(() => b.ForEach(onNext), dispatcherPriority))
        .SelectMany(i => i);
}
LazyBuffer is a custom implementation of Buffer which only returns a result set when new items are available, rather than returning empty result sets on the specified interval.
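For context, a plausible shape for such an operator, assuming it simply wraps the built-in Buffer (the real implementation isn't shown in the question):

public static IObservable<IList<T>> LazyBuffer<T>(
    this IObservable<T> source, TimeSpan time, int count, IScheduler scheduler)
{
    // Buffer as usual, but drop the empty buffers emitted on quiet intervals.
    return source.Buffer(time, count, scheduler)
                 .Where(buffer => buffer.Count > 0);
}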
This is how I invoke it; it blows up as described above:
BufferAndDispatchUntil(p => Update.Insert(p.Item1, UpdateFactory.CreateView(p.Item2)), _config.DispatcherScheduler, _ignore);
This is my clear call, in a separate segment of the code running on a separate thread:
_ignore.OnNext(new Unit());
Update.Clear();
I would appreciate if you can help me figure it out.
You can't stop "mid insert". You need to lock your data structures when performing non-atomic operations to ensure data integrity.
lock (someObj)
{
    myList.Insert(i, obj);
}
Make sure to lock at the smallest resolution possible, i.e., don't lock an entire method if you need only protect against a single non-atomic operation.
That, or have you looked at C#'s thread-safe collections yet? They handle much of the locking for you.
You're undoing the guarantees that Rx provides you when you do this:
.Do(b => dispatcher.BeginInvoke(() => b.ForEach(onNext), dispatcherPriority))
This is why you're hitting the bug: you're queuing up work to run on the UI thread, but your Rx pipeline isn't aware of it, so the pipeline and the dispatched work end up split apart.
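One common alternative (a sketch, assuming WPF's DispatcherScheduler from System.Reactive.Windows.Threading) is to marshal inside the pipeline, so Rx keeps its ordering guarantees:

return source
    .LazyBuffer(BufferTime, BufferCount, scheduler)
    .TakeUntil(until)
    .ObserveOn(DispatcherScheduler.Current) // marshal to the UI thread inside the pipeline
    .Do(b => b.ForEach(onNext))             // now runs on the UI thread, in order
    .SelectMany(i => i);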