Exception when modifying a collection in a foreach loop - C#

I know the basic principle of not modifying a collection inside a foreach, which is why I did something like this:
public void UpdateCoverages(Dictionary<PlayerID, double> coverages)
{
    // TODO: temp
    var keys = coverages.Select(pair => pair.Key);
    foreach (var key in keys)
    {
        coverages[key] = 0.84;
    }
}
And:
class PlayerID : IEquatable<PlayerID>
{
    public PlayerID(byte value)
    {
        Value = value;
    }

    public byte Value { get; private set; }

    public bool Equals(PlayerID other)
    {
        return Value == other.Value;
    }
}
First I save all my keys so that I don't get the "collection was modified" exception, and then I go through them. But I still get the exception, which I cannot understand.
How do I correct this, and what is causing the problem?

First I save all my keys
No you don't; keys is a live (lazily evaluated) sequence that is still reading from the dictionary while the foreach iterates it. To create an isolated copy of the keys, you need to add .ToList() (or similar) to the end:
var keys = coverages.Select(pair => pair.Key).ToList();
Although personally I'd probably go for:
var keys = new PlayerID[coverages.Count];
coverages.Keys.CopyTo(keys, 0);
(which allows for correct-length allocation, and memory-copy)
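For completeness, a minimal sketch of the corrected method, here using the dictionary's Keys property rather than the Select (either works once the keys are copied into their own list):
public void UpdateCoverages(Dictionary<PlayerID, double> coverages)
{
    // Copy the keys first; the foreach then iterates the copy,
    // so writing back into the dictionary is safe.
    var keys = coverages.Keys.ToList();
    foreach (var key in keys)
    {
        coverages[key] = 0.84;
    }
}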
What is a live sequence actually?
The Select method creates a non-buffered, spooling iterator over another sequence. That is a complicated thing to understand, but basically: when you first start iterating var key in keys, it obtains the inner enumerator of coverages (i.e. coverages.GetEnumerator()), and then every time the foreach asks keys for the next item, keys in turn asks that inner enumerator for its next item. Yes, that sounds complicated. The good news is that the C# compiler has support for this built in, generating the state machines for you, mainly via the yield return syntax. Jon Skeet gives an excellent discussion of this in Chapter 6 of C# in Depth (IIRC it used to be the free sample chapter, but it no longer is).
However, consider the following:
static IEnumerable<int> OneTwoOneTwoForever()
{
    while (true)
    {
        yield return 1;
        yield return 2;
    }
}
It might surprise you to learn that you can consume the above, using the same non-buffered "when you ask for another value, it runs just enough code to give you the next value" approach:
var firstTwenty = OneTwoOneTwoForever().Take(20).ToList(); // works!
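To make the deferred, "spooling" behaviour concrete, here is a rough sketch of how a Select-style operator can be written with yield return (illustrative only, not the real LINQ source):
static IEnumerable<TResult> MySelect<TSource, TResult>(
    IEnumerable<TSource> source, Func<TSource, TResult> selector)
{
    // The source's enumerator is only obtained when iteration starts,
    // and each MoveNext() pulls exactly one item from the source.
    foreach (var item in source)
    {
        yield return selector(item);
    }
}
Because every step goes straight back to the dictionary's own enumerator, any write to the dictionary between steps invalidates that enumerator - which is exactly the exception in the question.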

Related

C# Recursive Search an array of Objects.parent_id for value, then search those and so on till none left

Looking for a solution to find an object.id, get all of the parent_id's in an array of objects, and then set object.missed = true.
Each object has an id and a parent_id. If the object doesn't have a parent, its parent_id equals its id.
I know how to do it for one level of parent_id's. How can I go unlimited levels deep? Below is the code I have for searching one level.
public class EPlan
{
    public int id;
    public int parent_id;
    public bool is_repeatable;
    public bool missed;
}

EPlan[] plans = Array.FindAll(eventsPlan, item => item.parent_id == event_id);

foreach (EPlan plan in plans)
{
    plan.missed = true;
    plan.is_repeatable = false;
}
I'm trying to search for event_id (an int). So I search all of the objects' ids for event_id. Once I find object.id == event_id, I need to set object.is_repeatable = false and object.missed = true.
Then I need to search all of the objects' parent_ids for the current object.id (event_id) and change all of those objects in the same way as above.
Then I need to check all of those objects' ids against all of the parent_ids and do the same to those. Like a tree effect: 1 event was missed, and any of the events that are parented to that event need to be set as missed as well.
So far, all I can do is get one level deep, or hard-code multiple foreach loops. But it could be 10 or more levels deep, so that doesn't make sense.
Any help is appreciated. There has to be a better way than the multiple loops.
I too was confused by the question, save for the one line you said:
1 event was missed, and any of the events that are parented to that event need to be set as missed as well.
With that in mind, I suggest the following code will do what you're looking for. Each time you call the method, it finds all of the objects in the array whose parent_id matches one of the given IDs and sets missed and is_repeatable appropriately.
It also keeps a running list of the IDs it found during this scan. Once the loop is finished, it calls itself with that list of IDs (which act as the parent IDs for the next level) instead of the list of event IDs it just used. That is the trick that makes the recursion work here.
To start the process off, you call the method with the single event ID you used for the one-level search:
findEvents(new List<int> { event_id }, eventsPlan);
private void findEvents(List<int> eventIDs, EPlan[] eventsPlan)
{
    foreach (int eventID in eventIDs)
    {
        EPlan[] plans = Array.FindAll(eventsPlan, item => item.parent_id == eventID);

        List<int> parentIDs = new List<int>();
        foreach (EPlan plan in plans)
        {
            plan.missed = true;
            plan.is_repeatable = false;
            parentIDs.Add(plan.id); // the found IDs become the parent IDs to search next
        }

        if (parentIDs.Count > 0)
            findEvents(parentIDs, eventsPlan);
    }
}
I also recommend that, if you have the chance, you reengineer this code to use a generic collection (like List<EPlan>) instead of arrays. That avoids the performance penalty this code has from building a new array in memory each time Array.FindAll is called. Using a generic collection, or even a plain old foreach loop over the original data, will work faster when processing a lot of data here.
Update 1:
To answer your question about how you might go about this using a Generic Collection instead:
private void findEventsAsList(List<int> eventIDs, List<EPlan> eventsPlans)
{
    List<int> parentIDs = new List<int>();
    foreach (EPlan plan in eventsPlans.Where(p => eventIDs.Contains(p.parent_id)))
    {
        plan.missed = true;
        plan.is_repeatable = false;
        parentIDs.Add(plan.id); // the found IDs become the parent IDs to search next
    }

    if (parentIDs.Count > 0)
        findEventsAsList(parentIDs, eventsPlans);
}
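A minimal usage sketch, assuming eventsPlan is the EPlan[] from the question and is converted to a list once up front:
List<EPlan> eventsPlans = eventsPlan.ToList(); // one-time conversion (requires System.Linq)
findEventsAsList(new List<int> { event_id }, eventsPlans);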

How to make an extension UnionWith for HashSet

I am trying to write an extension method for a custom type. This is my code. I don't know how my source ends up with zero items in this code. Even in the debug part, the HashSet temp gives me a list of 10 LogEvents, but in the end source is empty.
public static void UnionSpecialWith(this HashSet<LogEvent> source, List<LogEvent> given, IEqualityComparer<LogEvent> comparer)
{
    List<LogEvent> original = new List<LogEvent>(source);
    List<LogEvent> second = given.Condense(comparer);
    source = new HashSet<LogEvent>(original.Condense(comparer), comparer);
    foreach (LogEvent logEvent in second)
    {
        if (original.Contains(logEvent, comparer))
        {
            int index = original.FindIndex(x => comparer.Equals(x, logEvent));
            original[index].filesAndLineNos.MergeFilesAndLineNos(logEvent.filesAndLineNos);
        }
        else
            original.Add(logEvent);
    }
#if DEBUG
    String content = String.Join(Environment.NewLine, original.Select(x => x.GetContentAsEventsOnly()));
    HashSet<LogEvent> temp = new HashSet<LogEvent>(original, comparer);
#endif
    source = new HashSet<LogEvent>(original, comparer);
}
Can anybody point out what is wrong?
EDIT:
This is my custom type. Whenever I find a duplicate, I want to merge its filesAndLineNos with the original one. This is what I am trying to achieve with the above code.
public class LogEvent
{
    public String mainEventOriginal;
    public String subEventOriginal;
    public String mainEvent;
    public String subEvent;
    public int level;
    public Dictionary<String, HashSet<int>> filesAndLineNos;
}
The usage is something like
HashSet<LogEvent> required = new HashSet<LogEvent>(initialUniqueSet);
required.UnionSpecialWith(givenListOfLogEvents);
This is simply a matter of parameters being passed by value in .NET by default. You're changing the value of source to refer to a different HashSet, and that doesn't change the caller's variable at all. Assuming that Condense doesn't modify the list (I'm unaware of that method) your method is as pointless as:
public void TrimString(string text)
{
    // This changes the value of the *parameter*, but doesn't affect the original
    // *object* (strings are immutable). The caller won't see any effect!
    text = text.Trim();
}
If you call the above with:
string foo = " hello ";
TrimString(foo);
... then foo is still going to refer to a string with contents " hello ". Obviously your method is more complicated, but the cause of the problem is the same.
Either your extension method needs to modify the contents of the original HashSet passed in via the source parameter, or it should return the new set. Returning the new set is more idiomatically LINQ-like, but HashSet.UnionWith does modify the original set - it depends which model you want to be closer to.
EDIT: If you want to modify the set in place, but effectively need to replace the contents entirely due to the logic, then you might want to consider creating the new set, then clearing the old and adding all the contents back in:
public static void UnionSpecialWith(this HashSet<LogEvent> source,
                                    List<LogEvent> given,
                                    IEqualityComparer<LogEvent> comparer)
{
    List<LogEvent> original = new List<LogEvent>(source);
    List<LogEvent> second = given.Condense(comparer);
    foreach (LogEvent logEvent in second)
    {
        if (original.Contains(logEvent, comparer))
        {
            int index = original.FindIndex(x => comparer.Equals(x, logEvent));
            original[index].filesAndLineNos
                           .MergeFilesAndLineNos(logEvent.filesAndLineNos);
        }
        else
        {
            original.Add(logEvent);
        }
    }
    source.Clear();
    foreach (var item in original)
    {
        source.Add(item);
    }
}
However, note:
This does not replace the comparer in the existing set. You can't do that.
It's pretty inefficient in general. It feels like a Dictionary would be a better fit, to be honest.
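If you prefer the other option mentioned above - returning the new set instead of mutating the one passed in - a rough sketch could look like this (Condense and MergeFilesAndLineNos are the helpers from the question; the UnionSpecial name is just for illustration):
public static HashSet<LogEvent> UnionSpecial(this HashSet<LogEvent> source,
                                             List<LogEvent> given,
                                             IEqualityComparer<LogEvent> comparer)
{
    List<LogEvent> original = new List<LogEvent>(source);
    foreach (LogEvent logEvent in given.Condense(comparer))
    {
        int index = original.FindIndex(x => comparer.Equals(x, logEvent));
        if (index >= 0)
        {
            original[index].filesAndLineNos
                           .MergeFilesAndLineNos(logEvent.filesAndLineNos);
        }
        else
        {
            original.Add(logEvent);
        }
    }
    // The caller assigns the result:
    // required = required.UnionSpecial(givenListOfLogEvents, comparer);
    return new HashSet<LogEvent>(original, comparer);
}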

Using the Concurrent Dictionary - Thread Safe Collection Modification

Recently I ran into the following exception when using a generic dictionary:
An InvalidOperationException has occurred. A collection was modified
I realized that this error was primarily because of thread safety issues on the static dictionary I was using.
A little background: I currently have an application which has 3 different methods that are related to this issue.
Method A iterates through the dictionary using foreach and returns a value.
Method B adds data to the dictionary.
Method C changes the value of the key in the dictionary.
Sometimes while iterating through the dictionary, data is also being added, which is the cause of this issue. I keep getting this exception in the foreach part of my code where I iterate over the contents of the dictionary. In order to resolve this issue, I replaced the generic dictionary with the ConcurrentDictionary and here are the details of what I did.
Aim: My main objective is to completely remove the exception.
For method B (which adds a new key to the dictionary), I replaced .Add with TryAdd.
For method C (which updates the value of the dictionary), I did not make any changes. A rough sketch of the code is as follows:
static public int ChangeContent(int para)
{
    foreach (KeyValuePair<string, CustObject> pair in static_container)
    {
        if (pair.Value.propA != para) // Pending cancel
        {
            pair.Value.data_id = prim_id; // I am updating the content
            return 0;
        }
    }
    return -2;
}
For method A, I am simply iterating over the dictionary, and this is where the running code stops (in debug mode) and Visual Studio informs me that this is where the error occurred. The code I am using is similar to the following:
static public CustObject RetrieveOrderDetails(int para)
{
    foreach (KeyValuePair<string, CustObject> pair in static_container)
    {
        if (pair.Value.cust_id.Equals(symbol))
        {
            if (pair.Value.OrderStatus != para)
            {
                return pair.Value; // Found
            }
        }
    }
    return null; // Not found
}
Are these changes going to resolve the exception that I am getting?
Edit:
It states on this page that the GetEnumerator method allows you to traverse the elements in parallel with writes (although the enumerator's contents may be out of date). Isn't that the same as using foreach?
For modification of elements, one option is to manually iterate the dictionary using a for loop, e.g.:
Dictionary<string, string> test = new Dictionary<string, string>();

int dictionaryLength = test.Count();
for (int i = 0; i < dictionaryLength; i++)
{
    test[test.ElementAt(i).Key] = "Some new content";
}
Be wary, though, that if you're also adding to the Dictionary, you must increment dictionaryLength (or decrement it if you remove elements) appropriately.
Depending on what exactly you're doing, and if order matters, you may wish to use a SortedDictionary instead.
You could extend this by updating dictionaryLength explicitly (re-calling test.Count() at each iteration), and by keeping an additional list of the keys you've already modified, and so on, if there's a danger of missing any; it really depends on what you're doing and what your needs are.
You can also get a list of keys using test.Keys.ToList(); that option would work as follows:
Dictionary<string, string> test = new Dictionary<string, string>();

List<string> keys = test.Keys.ToList();
foreach (string key in keys)
{
    test[key] = "Some new content";
}

IEnumerable<string> newKeys = test.Keys.ToList().Except(keys);
if (newKeys.Count() > 0)
    // Do it again or whatever.
Note that I've also shown an example of how to find out whether any new keys were added between getting the initial list of keys and completing the iteration, so that you could then loop round and handle the new keys.
Hopefully one of these options will suit (or you may even want to mix and match - for example, a for loop over the keys, updating that list as you go instead of the length). As I say, it's as much about what precisely you're trying to do as anything.
Before doing the foreach(), try copying the container to a new instance:
var unboundContainer = static_container.ToList();
foreach (KeyValuePair<string, CustObject> pair in unboundContainer)
Also, I think updating the Value property directly is not right from a thread-safety perspective; refactor your code to use TryUpdate() instead.
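A hedged sketch of what that could look like with a ConcurrentDictionary, reusing the names from the question (static_container, CustObject, propA, prim_id); the WithDataId helper that clones the value with a new data_id is an assumption, since TryUpdate swaps the stored value reference rather than helping with in-place field mutation:
// using System.Collections.Concurrent;
static readonly ConcurrentDictionary<string, CustObject> static_container =
    new ConcurrentDictionary<string, CustObject>();

static public int ChangeContent(int para)
{
    // Enumerating a ConcurrentDictionary is safe alongside concurrent writers;
    // it does not throw, although it is not a strict snapshot.
    foreach (KeyValuePair<string, CustObject> pair in static_container)
    {
        if (pair.Value.propA != para) // Pending cancel
        {
            CustObject updated = pair.Value.WithDataId(prim_id); // assumed clone helper
            // Only replaces the entry if nobody else changed it in the meantime.
            if (static_container.TryUpdate(pair.Key, updated, pair.Value))
                return 0;
        }
    }
    return -2;
}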

"Possible multiple enumeration of IEnumerable" vs "Parameter can be declared with base type"

In Resharper 5, the following code led to the warning "Parameter can be declared with base type" for list:
public void DoSomething(List<string> list)
{
    if (list.Any())
    {
        // ...
    }
    foreach (var item in list)
    {
        // ...
    }
}
In Resharper 6, this is not the case. However, if I change the method to the following, I still get that warning:
public void DoSomething(List<string> list)
{
    foreach (var item in list)
    {
        // ...
    }
}
The reason is that in this version, list is only enumerated once, so changing it to IEnumerable<string> will not automatically introduce another warning.
Now, if I change the first version manually to use an IEnumerable<string> instead of a List<string>, I will get that warning ("Possible multiple enumeration of IEnumerable") on both occurrences of list in the body of the method:
public void DoSomething(IEnumerable<string> list)
{
    if (list.Any()) // <- here
    {
        // ...
    }
    foreach (var item in list) // <- and here
    {
        // ...
    }
}
I understand why, but I wonder how to solve this warning, assuming that the method really only needs an IEnumerable<T> and not a List<T>, because I just want to enumerate the items and I don't want to change the list.
Adding a list = list.ToList(); at the beginning of the method makes the warning go away:
public void DoSomething(IEnumerable<string> list)
{
    list = list.ToList();
    if (list.Any())
    {
        // ...
    }
    foreach (var item in list)
    {
        // ...
    }
}
I understand why that makes the warning go away, but it looks a bit like a hack to me...
Any suggestions on how to better resolve that warning while still using the most general type possible in the method signature?
The following problems should all be solved for a good solution:
No call to ToList() inside the method, because it has a performance impact
No usage of ICollection<T> or even more specialized interfaces/classes, because they change the semantics of the method as seen from the caller.
No multiple iterations over an IEnumerable<T> and thus risking accessing a database multiple times or similar.
Note: I am aware that this is not a Resharper issue, and thus, I don't want to suppress this warning, but fix the underlying cause as the warning is legit.
UPDATE:
Please don't focus on the Any and the foreach. I don't need help merging those statements so that there is only one enumeration of the enumerable.
It could really be anything in this method that enumerates the enumerable multiple times!
You should probably take an IEnumerable<T> and ignore the "multiple iterations" warning.
This message is warning you that if you pass a lazy enumerable (such as an iterator or a costly LINQ query) to your method, parts of the iterator will execute twice.
There is no perfect solution; choose one according to the situation.
enumerable.ToList(); you may optimize it by first trying "enumerable as List", as long as you don't modify the list (see the sketch after this list)
Iterate twice over the IEnumerable, but make that clear for the caller (document it)
Split it into two methods
Take a List<T> to avoid the cost of "as"/ToList and the potential cost of double enumeration
The first solution (ToList) is probably the most "correct" for a public method that could be working on any enumerable.
You can ignore Resharper issues; the warning is legitimate in the general case but may be wrong in your specific situation, especially if the method is intended for internal usage and you have full control over the callers.
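A minimal sketch of the first option with the "as" optimization mentioned in the list above (the local buffer is only built when the argument isn't already a materialized list; the body is just the Any/foreach pattern from the question):
public void DoSomething(IEnumerable<string> list)
{
    // Reuse the caller's list or array if there is one; otherwise
    // enumerate the source exactly once into a buffer.
    IList<string> buffered = list as IList<string> ?? list.ToList();

    if (buffered.Count > 0)
    {
        // ...
    }
    foreach (var item in buffered)
    {
        // ...
    }
}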
This class gives you a way to split the first item off of the enumeration and then have an IEnumerable for the rest of the enumeration, without a double enumeration, thus avoiding the potentially nasty performance hit. Its usage is like this (where T is whatever type you are enumerating):
var split = new SplitFirstEnumerable<T>(currentIEnumerable);
T firstItem = split.First;
IEnumerable<T> remaining = split.Remaining;
Here is the class itself:
/// <summary>
/// Use this class when you want to pull the first item off of an IEnumerable
/// and then enumerate over the remaining elements and you want to avoid the
/// warning about "possible double iteration of IEnumerable" AND without constructing
/// a list or other duplicate data structure of the enumerable. You construct
/// this class from your existing IEnumerable and then use its First and
/// Remaining properties for your algorithm.
/// </summary>
/// <typeparam name="T">The type of item you are iterating over; there are no
/// "where" restrictions on this type.</typeparam>
public class SplitFirstEnumerable<T>
{
    private readonly IEnumerator<T> _enumerator;

    /// <summary>
    /// Constructor
    /// </summary>
    /// <remarks>Will throw an exception if there are zero items in enumerable or
    /// if the enumerable is already advanced past the last element.</remarks>
    /// <param name="enumerable">The enumerable that you want to split</param>
    public SplitFirstEnumerable(IEnumerable<T> enumerable)
    {
        _enumerator = enumerable.GetEnumerator();
        if (_enumerator.MoveNext())
        {
            First = _enumerator.Current;
        }
        else
        {
            throw new ArgumentException("Parameter 'enumerable' must have at least 1 element to be split.");
        }
    }

    /// <summary>
    /// The first item of the original enumeration, equivalent to calling
    /// enumerable.First().
    /// </summary>
    public T First { get; private set; }

    /// <summary>
    /// The items of the original enumeration minus the first, equivalent to calling
    /// enumerable.Skip(1).
    /// </summary>
    public IEnumerable<T> Remaining
    {
        get
        {
            while (_enumerator.MoveNext())
            {
                yield return _enumerator.Current;
            }
        }
    }
}
This does presuppose that the IEnumerable has at least one element to start. If you want to do more of a FirstOrDefault type setup, you'll need to catch the exception that would otherwise be thrown in the constructor.
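For example, a rough FirstOrDefault-style wrapper over the class, as hinted at above (the TrySplit name is made up for illustration):
public static bool TrySplit<T>(IEnumerable<T> source, out SplitFirstEnumerable<T> split)
{
    try
    {
        split = new SplitFirstEnumerable<T>(source);
        return true;
    }
    catch (ArgumentException) // thrown by the constructor when source is empty
    {
        split = null;
        return false;
    }
}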
There exists a general solution that addresses both Resharper warnings: the lack of a guarantee of repeatability for IEnumerable, and the List base class (or the potentially expensive ToList() workaround).
Create a specialized class, e.g. "RepeatableEnumerable", implementing IEnumerable<T>, with GetEnumerator() implemented along the following outline:
1. Yield all items already collected so far from the inner list.
2. If the wrapped enumerator has more items:
   While the wrapped enumerator can move to the next item:
      Get the current item from the inner enumerator.
      Add the current item to the inner list.
      Yield the current item.
3. Mark the inner enumerator as having no more items.
Add extension methods and appropriate optimizations where the wrapped parameter is already repeatable. Resharper will no longer flag the indicated warnings on the following code:
public void DoSomething(IEnumerable<string> list)
{
    var repeatable = list.ToRepeatableEnumerable();
    if (repeatable.Any()) // <- no warning here anymore.
    // Further, this will read at most one item from list. A
    // query (SQL LINQ) with 10,000 items, returning one item per second,
    // will pass this block in 1 second, unlike the ToList() solution / hack.
    {
        // ...
    }
    foreach (var item in repeatable) // <- and no warning here anymore, either.
    // Further, this will read in lazy fashion. In the 10,000 item, one
    // per second, query scenario, this loop will process the first item immediately
    // (because it was read already for Any() above), and then proceed to
    // process one item every second.
    {
        // ...
    }
}
With a little work, you can also turn RepeatableEnumerable into LazyList, a full implementation of IList. That's beyond the scope of this particular problem though. :)
UPDATE: Code implementation requested in comments -- not sure why the original PDL wasn't enough, but in any case, the following faithfully implements the algorithm I suggested (My own implementation implements the full IList interface; that is a bit beyond the scope I want to release here... :) )
public class RepeatableEnumerable<T> : IEnumerable<T>
{
    readonly List<T> innerList;
    IEnumerator<T> innerEnumerator;

    public RepeatableEnumerable( IEnumerator<T> innerEnumerator )
    {
        this.innerList = new List<T>();
        this.innerEnumerator = innerEnumerator;
    }

    public IEnumerator<T> GetEnumerator()
    {
        // 1. Yield all items already collected so far from the inner list.
        foreach( var item in innerList ) yield return item;

        // 2. If the wrapped enumerator has more items
        if( innerEnumerator != null )
        {
            // 2A. while the wrapped enumerator can move to the next item
            while( innerEnumerator.MoveNext() )
            {
                // 1. Get the current item from the inner enumerator.
                var item = innerEnumerator.Current;

                // 2. Add the current item to the inner list.
                innerList.Add( item );

                // 3. Yield the current item
                yield return item;
            }

            // 3. Mark the inner enumerator as having no more items.
            innerEnumerator.Dispose();
            innerEnumerator = null;
        }
    }

    System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
    {
        return GetEnumerator();
    }
}

// Add extension methods and appropriate optimizations where the wrapped parameter is already repeatable.
public static class RepeatableEnumerableExtensions
{
    public static RepeatableEnumerable<T> ToRepeatableEnumerable<T>( this IEnumerable<T> items )
    {
        var result = ( items as RepeatableEnumerable<T> )
            ?? new RepeatableEnumerable<T>( items.GetEnumerator() );
        return result;
    }
}
I realize this question is old and already marked as answered, but I was surprised that nobody suggested manually iterating over the enumerator:
// NOTE: list is of type IEnumerable<T>.
// The name was taken from the OP's code.
using (var enumerator = list.GetEnumerator()) // dispose the enumerator, as foreach would
{
    if (enumerator.MoveNext())
    {
        // Run your list.Any() logic here
        ...

        do
        {
            var item = enumerator.Current;
            // Run your foreach (var item in list) logic here
            ...
        } while (enumerator.MoveNext());
    }
}
Seems a lot more straightforward than the other answers here.
Generally speaking, what you need is some state object into which you can PUSH the items (within a foreach loop), and out of which you then get your final result.
The downside of the enumerable LINQ operators is that they actively enumerate the source instead of accepting items being pushed to them, so they don't meet your requirements.
If, for example, you just need the minimum and maximum values of a sequence of 1'000'000 integers that costs $1'000 worth of processor time to retrieve, you end up writing something like this:
public class MinMaxAggregator
{
    private bool _any;
    private int _min;
    private int _max;

    public void OnNext(int value)
    {
        if (!_any)
        {
            _min = _max = value;
            _any = true;
        }
        else
        {
            if (value < _min) _min = value;
            if (value > _max) _max = value;
        }
    }

    public MinMax GetResult()
    {
        if (!_any) throw new InvalidOperationException("Sequence contains no elements.");
        return new MinMax(_min, _max);
    }
}

public static MinMax DoSomething(IEnumerable<int> source)
{
    var aggr = new MinMaxAggregator();
    foreach (var item in source) aggr.OnNext(item);
    return aggr.GetResult();
}
In fact, you have just re-implemented the logic of the Min() and Max() operators. Of course that's easy, but they are only examples of the arbitrarily complex logic you might otherwise easily express in a LINQish way.
The solution came to me on yesterday's night walk: we need to PUSH... that's REACTIVE! All the beloved operators also exist in a reactive version built for the push paradigm. They can be chained together at will to whatever complexity you need, just like their enumerable counterparts.
So the min/max example boils down to:
public static MinMax DoSomething(IEnumerable<int> source)
{
    // bridge over to the observable world
    var connectable = source.ToObservable(Scheduler.Immediate).Publish();

    // express the desired result there (note: connectable is observed by multiple observers)
    var combined = connectable.Min().CombineLatest(connectable.Max(), (min, max) => new MinMax(min, max));

    // subscribe
    var resultAsync = combined.GetAwaiter();

    // unload the enumerable into connectable
    connectable.Connect();

    // pick up the result
    return resultAsync.GetResult();
}
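The MinMax type used in both snippets isn't defined in the answer; a minimal assumed definition might be:
// Assumed for the examples above; not part of the original answer.
public struct MinMax
{
    public readonly int Min;
    public readonly int Max;

    public MinMax(int min, int max)
    {
        Min = min;
        Max = max;
    }
}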
Why not:
bool any = false;
foreach (var item in list)
{
    any = true;
    // ...
}

if (any)
{
    // ...
}
Update: Personally, I wouldn't drastically change the code just to get around a warning like this. I would just disable the warning and continue on. The warning is suggesting you change the general flow of the code to make it better; if you're not making the code better (and are arguably making it worse) to address the warning, then the point of the warning is missed.
For example:
// ReSharper disable PossibleMultipleEnumeration
public void DoSomething(IEnumerable<string> list)
{
    if (list.Any()) // <- here
    {
        // ...
    }
    foreach (var item in list) // <- and here
    {
        // ...
    }
}
// ReSharper restore PossibleMultipleEnumeration
UIMS* - Fundamentally, there is no great solve. IEnumerable<T> used to be the "very basic thing that represents a bunch of things of the same type, so using it in method sigs is Correct." It has now also become a "thing that might evaluate behind the scenes, and might take a while, so now you always have to worry about that."
It's as if IDictionary suddenly were extended to support lazy loading of values, via a LazyLoader property of type Func<TKey,TValue>. Actually that'd be neat to have, but not so neat to be added to IDictionary, because now every time we receive an IDictionary we have to worry about that. But that's where we are.
So it would seem that "if a method takes an IEnumerable and evals it twice, always force eval via ToList()" is the best you can do. And nice work by Jetbrains to give us this warning.
*(Unless I'm Missing Something . . . just made it up but it seems useful)
Be careful when accepting enumerables in your method. The "warning" for the base type is only a hint; the enumeration warning is a true warning.
However, your list will be enumerated at least twice, because you call Any and then foreach. If you add a ToList(), you add a third pass to build the list - so remove the ToList().
I would suggest setting ReSharper's warning for the base type to a hint. That way you still get a hint (green underline) and the ability to quick-fix it (Alt+Enter), with no "warnings" in your file.
You should take care if enumerating the IEnumerable is an expensive action, like loading something from a file or database, or if you have a method that calculates values and uses yield return. In that case, do a ToList() or ToArray() first to load/calculate all the data only ONCE.
You could use ICollection<T> (or IList<T>). It's less specific than List<T>, but doesn't suffer from the multiple-enumeration problem.
Still, I'd tend to use IEnumerable<T> in this case. You can also consider refactoring the code to enumerate only once.
Use an IList<T> as your parameter type rather than IEnumerable<T> - IEnumerable<T> has different semantics from List<T>, whereas IList<T> has the same semantics.
An IEnumerable<T> could be based on a non-seekable stream, which is why you get the warnings.
You can iterate only once:
public void DoSomething(IEnumerable<string> list)
{
    bool isFirstItem = true;
    foreach (var item in list)
    {
        if (isFirstItem)
        {
            isFirstItem = false;
            // ...
        }
        // ...
    }
}
There is something no one has said before (#Zebi): Any() already iterates, trying to find an element. If you call ToList(), it will iterate as well, in order to create a list. The initial idea of using IEnumerable is only to iterate; anything else provokes an extra iteration in order to execute. You should try to do everything inside a single loop, and include your .Any() check in it.
If you pass a list of Actions into your method, you get cleaner, iterate-once code:
public void DoSomething(IEnumerable<string> list, params Action<string>[] actions)
{
    foreach (var item in list)
    {
        for (int i = 0; i < actions.Length; i++)
        {
            actions[i](item);
        }
    }
}

Get Current Index for Removal in String Collection

I have a String Collection that is populated with IDs like so:
12345
23456
34567
and so on. What I need to do is, at the user's request and when certain parameters are met, go through that list, starting at the top, and perform a method using each ID. If successful, I would remove it from the list and move on.
I, embarrassingly, have never worked with a collection in this manner before. Can someone point me in the right direction? The examples I find all seem to be of the Console.WriteLine(""); variety.
My basic, ignorant attempt looks like this:
var driUps = Settings.Default.DRIUpdates.GetEnumerator();
while (driUps.MoveNext())
{
    var wasSuccessfull = PerformDRIUpdate(driUps.Current);
    if (wasSuccessfull)
    {
        driUps.Current.Remove(driUps.Current.IndexOf(driUps.Current));
    }
}
The part I am most concerned with is the Remove(). Isn't there a better way to get the current index? Any and all tips, hints, criticism, pointers, etc. are welcome. Thanks!
You are quite right to be concerned about the 'remove' during enumeration. How about something like this:
int idx = 0;
while (idx < strCol.Count)
{
    var wasSuccessful = PerformDRIUpdate(strCol[idx]);
    if (wasSuccessful)
        strCol.RemoveAt(idx);
    else
        ++idx;
}
As suggested by n8wrl, using RemoveAt solves the issue of trying to remove an item whilst enumerating the collection, but for large collections removing items from the front can cause performance issues as the underlying collection is re-built. Work your way from the end of the collection and remove items from that end:
// Loop backwards, as removing from the beginning
// causes the underlying collection to be re-built
int index = (strCol.Count - 1);
while (index >= 0)
{
    if (PerformDRIUpdate(strCol[index]))
    {
        strCol.RemoveAt(index);
    }
    --index;
}
Iterating is best done with foreach; under the covers it calls GetEnumerator() and creates a block similar to what you're writing by hand. The syntax is:
foreach (ObjectType objectInstance in objectInstanceCollection)
{
    // do something to the object instance
}
For you:
List<DRIUpdate> updatesToRemove = new List<DRIUpdate>();
foreach (DRIUpdate driUpdate in Settings.Default.DRIUpdates)
{
    if (PerformDRIUpdate(driUpdate))
    {
        updatesToRemove.Add(driUpdate);
    }
}
foreach (DRIUpdate driUpdate in updatesToRemove)
{
    Settings.Default.DRIUpdates.Remove(driUpdate);
}
If driUps is an IEnumerable<T>, try this:
driUps = driUps.Where(elem => !PerformDRIUpdate(elem));
Update:
From the example, it seems this is more appropriate:
Settings.Default.DRIUpdates =
Settings.Default.DRIUpdates.Where(elem => !PerformDRIUpdate(elem));
For a List<T>, it's simpler:
list.RemoveAll(PerformDRIUpdate);
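If, as the question suggests, the setting is a System.Collections.Specialized.StringCollection (an assumption), the LINQ result can't be assigned back to it directly; a hedged sketch of the same idea for that type:
// Keep only the IDs whose update failed (PerformDRIUpdate returns true on success).
var remaining = Settings.Default.DRIUpdates
    .Cast<string>()                      // StringCollection is non-generic
    .Where(id => !PerformDRIUpdate(id))
    .ToList();

Settings.Default.DRIUpdates.Clear();
Settings.Default.DRIUpdates.AddRange(remaining.ToArray());
Settings.Default.Save();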
