ConcurrentBag of strings and using .Contains in Parallel.ForEach - c#

I'm using a ConcurrentBag to hold a list of strings. Occasionally it contains a duplicate.
However, I'm checking the contents before adding the new entry, so it should never contain a duplicate.
ConcurrentDictionary<string, string> SystemFiles = PopulateSystemFiles();
ConcurrentBag<string> SystemNames = new ConcurrentBag<string>();
Parallel.ForEach(SystemFiles, file =>
{
string name = GetSystemName(file.Value);
if (!SystemNames.Contains(name))
{
SystemNames.Add(name);
}
});
My assumption is that the .Contains method is not thread-safe. Am I correct?

ConcurrentBag is thread-safe, but your code isn't:
if (!SystemNames.Contains(name))
{
SystemNames.Add(name);
}
Contains will execute in a thread-safe way, and Add will also execute in a thread-safe way, but you have no guarantee that an item hasn't been added in between the two calls.
For your needs, I recommend using a ConcurrentDictionary instead. Just ignore the value, as you won't need it.
var SystemNames = new ConcurrentDictionary<string, bool>();
Then use the TryAdd method to do the "if not contains then add" in a single atomic operation:
SystemNames.TryAdd(name, true);
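For reference, the original loop rewritten this way might look like the sketch below (reusing the PopulateSystemFiles and GetSystemName helpers from the question; the final Keys.ToList() call is just an assumption about how the names might be consumed afterwards):
ConcurrentDictionary<string, string> SystemFiles = PopulateSystemFiles();
var SystemNames = new ConcurrentDictionary<string, bool>();
Parallel.ForEach(SystemFiles, file =>
{
    string name = GetSystemName(file.Value);
    // TryAdd checks for the key and inserts it in a single atomic operation,
    // so two threads can never both add the same name.
    SystemNames.TryAdd(name, true);
});
// If a plain list of unique names is needed afterwards:
List<string> names = SystemNames.Keys.ToList();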

Related

Is HashSet<T> thread safe as a value of ConcurrentDictionary<TKey, HashSet<T>>?

If I have the following code:
var dictionary = new ConcurrentDictionary<int, HashSet<string>>();
foreach (var user in users)
{
if (!dictionary.ContainsKey(user.GroupId))
{
dictionary.TryAdd(user.GroupId, new HashSet<string>());
}
dictionary[user.GroupId].Add(user.Id.ToString());
}
Is the act of adding an item to the HashSet inherently thread-safe because the HashSet is a value inside the concurrent dictionary?
No. Putting a container in a thread-safe container does not make the inner container thread safe.
dictionary[user.GroupId].Add(user.Id.ToString());
calls HashSet's Add after retrieving the set from the ConcurrentDictionary. If the same GroupId is processed on two threads at once, this will break your code with strange failure modes. I saw the result of one of my teammates making the mistake of not locking his sets, and it wasn't pretty.
This is a plausible solution. I'd do something different myself but this is closer to your code.
if (!dictionary.ContainsKey(user.GroupId))
{
dictionary.TryAdd(user.GroupId, new HashSet<string>());
}
var groups = dictionary[user.GroupId];
lock(groups)
{
groups.Add(user.Id.ToString());
}
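A slightly tighter variant (my own sketch, not part of the original answer) lets GetOrAdd return the set atomically, removing the separate ContainsKey/TryAdd steps, and then locks the set for the write:
// GetOrAdd either returns the existing set or inserts the new one in a single call
var groups = dictionary.GetOrAdd(user.GroupId, _ => new HashSet<string>());
lock (groups)
{
    groups.Add(user.Id.ToString());
}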
No, the collection (the dictionary itself) is thread-safe, not whatever you put in it. You have a couple of options:
Use AddOrUpdate as @TheGeneral mentioned (the lambda must return the set, and note that the update delegate runs outside the dictionary's internal lock, so the HashSet itself is still unsynchronized):
dictionary.AddOrUpdate(user.GroupId,
    new HashSet<string> { user.Id.ToString() },
    (k, v) => { v.Add(user.Id.ToString()); return v; });
Use a concurrent collection, like the ConcurrentBag<T>:
ConcurrentDictionary<int, ConcurrentBag<string>>
When you're building the dictionary, as in your code, you're better off accessing it as little as possible. Think of something like this:
var dictionary = new ConcurrentDictionary<int, ConcurrentBag<string>>();
var groupedUsers = users.GroupBy(u => u.GroupId);
foreach (var group in groupedUsers)
{
// get the bag from the dictionary or create it if it doesn't exist
var currentBag = dictionary.GetOrAdd(group.Key, new ConcurrentBag<string>());
// load it with the users required
foreach (var user in group)
{
if (!currentBag.Contains(user.Id.ToString()))
{
currentBag.Add(user.Id.ToString());
}
}
}
If you actually want a built-in concurrent HashSet-like collection, you'd need to use a ConcurrentDictionary<int, ConcurrentDictionary<string, string>> and use only the keys (or only the values) of the inner one.
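A minimal sketch of that nested approach (using byte as a throwaway value type is my assumption, not part of the answer):
var dictionary = new ConcurrentDictionary<int, ConcurrentDictionary<string, byte>>();
foreach (var user in users)
{
    // The inner dictionary's keys act as the set; the byte values are ignored.
    var set = dictionary.GetOrAdd(user.GroupId, _ => new ConcurrentDictionary<string, byte>());
    set.TryAdd(user.Id.ToString(), 0);
}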

How to add a new element to a hashset that is value of a ConcurrentDictionary?

I have a ConcurrentDictionary keyed by long whose values are HashSets of int. If the key isn't in the dictionary, I want to add a new HashSet containing the first element. If the key exists, I want to add the new element to the existing HashSet.
I am trying something like this:
ConcurrentDictionary<long, HashSet<int>> myDic = new ConcurrentDictionary<long, HashSet<int>>();
int myFirstElement = 1;
myDic.AddOrUpdate(1, new HashSet<int>() { myFirstElement },
(key, actualValue) => actualValue.Add(myFirstElement));
The problem with this code is the third parameter: the .Add() method returns a bool while AddOrUpdate expects a HashSet. The first and second parameters are fine.
So my question is: how can I add a new element to the HashSet in a thread-safe way and avoid duplicates (which is the reason I am using a HashSet as the value)? The problem with the HashSet is that it is not thread-safe, and if I retrieve it first and add the new element afterwards, I am doing so outside of the dictionary and could run into problems.
Thanks.
To fix the compiler error you can do this:
myDic.AddOrUpdate(1, new HashSet<int>() { myFirstElement },
(key, actualValue) => {
actualValue.Add(myFirstElement);
return actualValue;
});
BUT this is not thread-safe, because the "update" function is not run inside any lock, so you are potentially adding to a non-thread-safe HashSet from multiple threads. This might result, for example, in losing values (you add 1000 items to the HashSet but end up with only 970 of them). The update function passed to AddOrUpdate should not have side effects, and here it does.
You can take a lock yourself around adding values to the HashSet:
myDic.AddOrUpdate(1, new HashSet<int>() { myFirstElement },
(key, actualValue) => {
lock (actualValue) {
actualValue.Add(myFirstElement);
return actualValue;
}
});
But then the question is why you are using a lock-free structure (ConcurrentDictionary) in the first place. Besides that, any other code might get the HashSet from your dictionary and add a value to it without any lock, making the whole thing useless. So if you decide to go that way for some reason, you have to ensure that all code takes the lock when accessing a HashSet from that dictionary.
Instead of all that, just use a concurrent collection instead of HashSet. There is no ConcurrentHashSet as far as I know, but you can use another ConcurrentDictionary with dummy values as a replacement (or look around the internet for custom implementations).
Side note. Here
myDic.AddOrUpdate(1, new HashSet<int>() { myFirstElement },
you create a new HashSet on every call to AddOrUpdate, even when it is not needed because the key is already there. Instead, use the overload that takes an add-value factory:
myDic.AddOrUpdate(1, (key) => new HashSet<int>() { myFirstElement },
Edit: sample usage of ConcurrentDictionary as hash set:
var myDic = new ConcurrentDictionary<long, ConcurrentDictionary<int, byte>>();
long key = 1;
int element = 1;
var hashSet = myDic.AddOrUpdate(key,
_ => new ConcurrentDictionary<int, byte>(new[] {new KeyValuePair<int, byte>(element, 0)}),
(_, oldValue) => {
oldValue.TryAdd(element, 0);
return oldValue;
});
If you wrap the anonymous function definition in curly braces, you can define multiple statements in the body of the function and thus specify the return value like this:
myDic.AddOrUpdate(1, new HashSet<int>() { myFirstElement },
(key, actualValue) => {
actualValue.Add(myFirstElement);
return actualValue;
});

Using the Concurrent Dictionary - Thread Safe Collection Modification

Recently I ran into the following exception when using a generic dictionary:
An InvalidOperationException has occurred. A collection was modified
I realized that this error was primarily because of thread safety issues on the static dictionary I was using.
A little background: I currently have an application which has 3 different methods that are related to this issue.
Method A iterates through the dictionary using foreach and returns a value.
Method B adds data to the dictionary.
Method C changes the value of the key in the dictionary.
Sometimes while iterating through the dictionary, data is also being added, which is the cause of this issue. I keep getting this exception in the foreach part of my code where I iterate over the contents of the dictionary. In order to resolve this issue, I replaced the generic dictionary with the ConcurrentDictionary and here are the details of what I did.
Aim : My main objective is to completely remove the exception
For method B (which adds a new key to the dictionary) I replaced .Add with TryAdd
For method C (which updates a value in the dictionary) I did not make any changes. A rough sketch of the code is as follows:
static public int ChangeContent(int para)
{
foreach (KeyValuePair<string, CustObject> pair in static_container)
{
if (pair.Value.propA != para ) //Pending cancel
{
pair.Value.data_id = prim_id; //I am updating the content
return 0;
}
}
return -2;
}
For method A, I am simply iterating over the dictionary, and this is where the running code stops (in debug mode) and Visual Studio informs me that the error occurred. The code I am using is similar to the following:
static public CustObject RetrieveOrderDetails(int para)
{
foreach (KeyValuePair<string, CustObject> pair in static_container)
{
if (pair.Value.cust_id.Equals(symbol))
{
if (pair.Value.OrderStatus != para)
{
return pair.Value; //Found
}
}
}
return null; //Not found
}
Are these changes going to resolve the exception that I am getting?
Edit:
It states on this page that the GetEnumerator method allows you to traverse the elements in parallel with writes (although it may be outdated). Isn't that the same as using foreach?
For modification of elements, one option is to manually iterate the dictionary using a for loop, e.g.:
Dictionary<string, string> test = new Dictionary<string, string>();
int dictionaryLength = test.Count();
for (int i = 0; i < dictionaryLength; i++)
{
test[test.ElementAt(i).Key] = "Some new content";
}
Be wary, though, that if you're also adding to the Dictionary you must increment dictionaryLength (or decrement it if you remove elements) appropriately.
Depending on what exactly you're doing, and if order matters, you may wish to use a SortedDictionary instead.
You could extend this by updating dictionaryLength explicitly (re-calling test.Count() on each iteration), and also keep an additional list of the keys you've already modified if there's a danger of missing any; it really depends on what you're doing and what your needs are.
You can also get a snapshot of the keys using test.Keys.ToList(); that option works as follows:
Dictionary<string, string> test = new Dictionary<string, string>();
List<string> keys = test.Keys.ToList();
foreach (string key in keys)
{
test[key] = "Some new content";
}
IEnumerable<string> newKeys = test.Keys.ToList().Except(keys);
if (newKeys.Count() > 0)
{
    // Do it again or whatever.
}
Note that I've also shown how to find out whether any new keys were added between getting the initial list of keys and completing the iteration, so that you can loop round again and handle the new keys.
Hopefully one of these options will suit (or you may even want to mix and match, e.g. a for loop over the keys, updating that as you go instead of the length); as I say, it depends on what precisely you're trying to do.
Before doing the foreach, try copying the container to a new snapshot:
var unboundContainer = static_container.ToList();
foreach (KeyValuePair<string, CustObject> pair in unboundContainer)
Also, I think updating the Value's properties in place is not right from a thread-safety perspective; refactor your code to use TryUpdate() instead.
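A rough sketch of what a TryUpdate-based ChangeContent could look like (the Clone copy helper on CustObject is hypothetical, and this assumes static_container is already a ConcurrentDictionary as described in the question):
static public int ChangeContent(int para)
{
    // ConcurrentDictionary's enumerator tolerates concurrent modification,
    // so iterating directly is safe here.
    foreach (KeyValuePair<string, CustObject> pair in static_container)
    {
        if (pair.Value.propA != para) // Pending cancel
        {
            // Build a replacement value instead of mutating the shared one,
            // then swap it in only if nobody else changed it in the meantime.
            CustObject updated = pair.Value.Clone(); // hypothetical copy helper
            updated.data_id = prim_id;
            if (static_container.TryUpdate(pair.Key, updated, pair.Value))
            {
                return 0;
            }
        }
    }
    return -2;
}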

synchronized collection thread safety

I have a shared System.Collections.Generic.SynchronizedCollection. Our code uses the .NET 4.0 Task library to spawn threads and passes the synchronized collection to them. So far the threads have not been adding or removing items from the collection, but a new requirement means that one thread has to remove items from the collection while another thread just reads it. Do I need to add a lock before removing items from the collection? If so, would the reader thread be thread-safe? Or can you suggest the best way to get thread safety?
No, it is not fully thread-safe. Try the following in a simple console application and see how it crashes with an exception:
var collection = new SynchronizedCollection<int>();
var n = 0;
Task.Run(
() =>
{
while (true)
{
collection.Add(n++);
Thread.Sleep(5);
}
});
Task.Run(
() =>
{
while (true)
{
Console.WriteLine("Elements in collection: " + collection.Count);
var x = 0;
if (collection.Count % 100 == 0)
{
foreach (var i in collection)
{
Console.WriteLine("They are: " + i);
x++;
if (x == 100)
{
break;
}
}
}
}
});
Console.ReadKey();
Note, that if you replace the SynchronizedCollection with a ConcurrentBag, you will get thread-safety:
var collection = new ConcurrentBag<int>();
SynchronizedCollection is simply not thread-safe in this application. Use Concurrent Collections instead.
As Alexander already pointed out, the SynchronizedCollection is not thread-safe for this scenario.
The SynchronizedCollection actually wraps a normal generic list and just delegates every call to the underlying list, with a lock surrounding the call. This is also done in GetEnumerator. So getting the enumerator is synchronized, but NOT the actual enumeration.
var collection = new SynchronizedCollection<string>();
collection.Add("Test1");
collection.Add("Test2");
collection.Add("Test3");
collection.Add("Test4");
var enumerator = collection.GetEnumerator();
enumerator.MoveNext();
collection.Add("Test5");
//The next call will throw a InvalidOperationException ("Collection was modified")
enumerator.MoveNext();
A foreach obtains and uses an enumerator in exactly this way. Adding a ToArray() before enumerating will not help either, because ToArray() itself has to enumerate the collection to build the array.
That enumeration may be faster than whatever you are doing inside your foreach, so it can reduce the probability of hitting the concurrency issue, but it does not remove it.
As Richard pointed out: for true thread safety go for the System.Collections.Concurrent classes.
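If you are stuck with SynchronizedCollection for some reason, one workaround (my own sketch, not taken from the answers above) is to hold its SyncRoot lock for the whole enumeration, since the collection's own methods lock on that same object internally:
// Writers: collection.Add/Remove already lock on collection.SyncRoot internally.
// Readers: hold the same lock for the entire foreach so no write can interleave.
lock (collection.SyncRoot)
{
    foreach (var item in collection)
    {
        Console.WriteLine(item);
    }
}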
Yes, SynchronizedCollection will do the locking for you.
If you have multiple readers and just one writer, you may want to look at using a ReaderWriterLock instead of SynchronizedCollection.
Also, if you are on .NET 4+, take a look at System.Collections.Concurrent. These classes have much better performance than SynchronizedCollection.

C# Locking mechanism - write only locking

Continuing my recent pondering about locks in C# and .NET, consider the following scenario:
I have a class which contains a collection (for this example I've used a Dictionary<string, int>) which is updated from a data source every few minutes by a method whose body you can see below:
DataTable dataTable = dbClient.ExecuteDataSet(i_Query).GetFirstTable();
lock (r_MappingLock)
{
i_MapObj.Clear();
foreach (DataRow currRow in dataTable.Rows)
{
i_MapObj.Add(Convert.ToString(currRow[i_Column1]), Convert.ToInt32(currRow[i_Column2]));
}
}
r_MappingLock is an object dedicated to locking the critical section which refreshes the dictionary's contents.
i_MapObj is the dictionary object
i_Column1 and i_Column2 are the datatable's column names which contain the desired data for the mapping.
Now, I also have a class method which receives a string and returns the correct mapped int based on the mentioned dictionary.
I want this method to wait until the refresh method completes its execution, so at first glance one would consider the following implementation:
lock (r_MappingLock)
{
int? retVal = null;
if (i_MapObj.ContainsKey(i_Key))
{
retVal = i_MapObj[i_Key];
}
return retVal;
}
This will prevent unexpected behaviour and return values while the dictionary is being updated, but another issue arises:
Since every thread which executes the above method tries to claim the lock, if multiple threads call it at the same time each one has to wait until the previous thread has finished before it can claim the lock. This is obviously undesirable, since the method is only reading.
I was thinking of adding a boolean member to the class which is set to true or false depending on whether the dictionary is being updated, and checking it within the "read only" method, but this raises other race-condition issues...
Any ideas how to solve this gracefully?
Thanks again,
Mikey
Have a look at the built-in ReaderWriterLock.
I would just switch to using a ConcurrentDictionary to avoid this situation altogether; manual locking is error-prone. Also, as I gather from "C#: The Curious ConcurrentDictionary", ConcurrentDictionary is already read-optimized.
Albin correctly pointed to ReaderWriterLock. I will add an even nicer one: ReaderWriterGate by Jeffrey Richter. Enjoy!
You might consider creating a new dictionary when updating, instead of locking. This way, you will always have consistent results, but reads during updates would return previous data:
private volatile Dictionary<string, int> i_MapObj = new Dictionary<string, int>();
private void Update()
{
DataTable dataTable = dbClient.ExecuteDataSet(i_Query).GetFirstTable();
var newData = new Dictionary<string, int>();
foreach (DataRow currRow in dataTable.Rows)
{
newData.Add(Convert.ToString(currRow[i_Column1]), Convert.ToInt32(currRow[i_Column2]));
}
// Start using new data - reference assignments are atomic
i_MapObj = newData;
}
private int? GetValue(string key)
{
int value;
if (i_MapObj.TryGetValue(key, out value))
return value;
return null;
}
Since .NET 3.5 there is the ReaderWriterLockSlim class, which is a lot faster!
Almost as fast as a lock().
Keep the policy set to disallow recursion (LockRecursionPolicy.NoRecursion) to keep performance that high.
Look at this page for more info.
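A minimal sketch of the mapping class using ReaderWriterLockSlim (the member names mirror the question's code; the DataTable is passed in here rather than queried via dbClient, just to keep the sketch self-contained):
private readonly ReaderWriterLockSlim r_MapLock =
    new ReaderWriterLockSlim(LockRecursionPolicy.NoRecursion);
private readonly Dictionary<string, int> i_MapObj = new Dictionary<string, int>();

public void Refresh(DataTable dataTable)
{
    r_MapLock.EnterWriteLock();   // exclusive: blocks all readers while refreshing
    try
    {
        i_MapObj.Clear();
        foreach (DataRow currRow in dataTable.Rows)
        {
            i_MapObj.Add(Convert.ToString(currRow[i_Column1]),
                         Convert.ToInt32(currRow[i_Column2]));
        }
    }
    finally
    {
        r_MapLock.ExitWriteLock();
    }
}

public int? GetValue(string i_Key)
{
    r_MapLock.EnterReadLock();    // shared: many readers can hold this at once
    try
    {
        int value;
        return i_MapObj.TryGetValue(i_Key, out value) ? (int?)value : null;
    }
    finally
    {
        r_MapLock.ExitReadLock();
    }
}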
