Consistency between C# lists

I am writing a small game engine for XNA. Right now my entity manager has a list of all entities it can update and render. Let's say, however, that I want to split this list up into 'sub-lists'. Maybe some entities do not require rendering, and so we would like to separate the rendering entities from the non-rendering entities, while maintaining another list which holds ALL entities for updating. So that's three lists.
Is there a way to maintain consistency between lists easily? To ensure that the exact same objects are being referenced in all lists (not copies of the objects)? In C++ we'd just have an array of pointers to entities that render, and another for entities that don't.
The problem isn't really putting entities into these lists; it's disposing of them that's throwing me off. To destroy an entity, I simply set a 'dispose' bool to true on that entity. The entity manager, when looping through all entities, removes its reference to that entity if 'dispose' is set to true, and the garbage collector destroys the entity because there are no more references left to it. Would I have to manually go through each and every list that has something to do with entities and remove that entity from those lists to keep things consistent?
Another example of this 'problem' is with quad trees. I have a 2D game and I generate a quad tree for all the entities. The entities that lie between two grid locations sit in two branches of the tree. How would I remove one from the tree? Search the entire tree for references to that entity? It seems a little pointless to have a quad tree if that's the case!

Consider, instead of three lists, one list and two queries of that list.
var list = GetYourMainList();
var renderableEntities = list.Where(item => item.IsRenderable);
var nonrenderableEntities = list.Where(item => !item.IsRenderable);

// work with the queries
foreach (var item in renderableEntities)
{
    // do whatever
}
Here, you can iterate over each query; the results stream from the main list, so any update to that list is reflected in the query results as they are iterated. The caveat is that the list cannot be modified (added to or deleted from) while the queries are being iterated.
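If you do need to add or remove entities while processing, one workaround (a small sketch, not from the answer above) is to snapshot a query first:

// ToList() snapshots the query so the main list can be mutated safely;
// the snapshot still holds references to the same objects, not copies.
var renderableSnapshot = list.Where(item => item.IsRenderable).ToList();
foreach (var item in renderableSnapshot)
{
    // 'list' may be modified here without invalidating this loop
}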

As for your disposing problem, you could implement IDisposable. For your lists, you could make a class that contains the two lists and, when you need the entire set, implement an enumerator that enumerates over both.
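A minimal sketch of such a container, assuming an Entity type with an IsRenderable flag (the names are illustrative, not from the question):

using System.Collections;
using System.Collections.Generic;

public class EntityCollection : IEnumerable<Entity>
{
    private readonly List<Entity> renderable = new List<Entity>();
    private readonly List<Entity> nonRenderable = new List<Entity>();

    public void Add(Entity e)
    {
        if (e.IsRenderable)
            renderable.Add(e);
        else
            nonRenderable.Add(e);
    }

    public void Remove(Entity e)
    {
        // Both lists hold references to the same object, so removing from
        // whichever list contains it keeps everything consistent.
        if (!renderable.Remove(e))
            nonRenderable.Remove(e);
    }

    // Enumerate the 'entire list' by walking both sub-lists in turn.
    public IEnumerator<Entity> GetEnumerator()
    {
        foreach (Entity e in renderable) yield return e;
        foreach (Entity e in nonRenderable) yield return e;
    }

    IEnumerator IEnumerable.GetEnumerator()
    {
        return GetEnumerator();
    }
}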

You could put a wrapper around your entity that knows what lists it is in so that when you want to remove it, you can remove it from all places at once relatively easily.
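A rough sketch of that idea (EntityHandle and the collection types are assumptions, not from the question):

using System.Collections.Generic;

// Wrapper that records every collection it was added to, so one call
// removes the entity from all of them.
public class EntityHandle
{
    private readonly List<ICollection<EntityHandle>> owners =
        new List<ICollection<EntityHandle>>();

    public Entity Entity { get; private set; }

    public EntityHandle(Entity entity)
    {
        Entity = entity;
    }

    public void AddTo(ICollection<EntityHandle> list)
    {
        list.Add(this);
        owners.Add(list);
    }

    public void RemoveFromAll()
    {
        foreach (ICollection<EntityHandle> list in owners)
            list.Remove(this);
        owners.Clear();
    }
}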

If you have items in multiple lists, then yes, if you want to remove them you must remove them from all lists in which they occur. A simple way to do this is to assign responsibility for removing items to the items themselves: for example, rendering items would remove themselves from the rendering items list when you call RemoveFromLists() on them.
The use of queries as described by Anthony Pegram eliminates this complexity but is inefficient if the lists are large, since they cannot be incrementally updated.
As for quadtrees, a straightforward approach is to have each item keep a list of the quadtree nodes in which it occurs (or, if you don't want to add a field to your items, use a Dictionary). It is then simple to remove the item from every node that contains it without searching the tree.
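For example (a sketch; QuadTreeNode and the Nodes field are assumed names):

using System.Collections.Generic;

public class QuadTreeNode
{
    public readonly List<Entity> Items = new List<Entity>();
}

public class Entity
{
    // Updated by the quadtree whenever this entity is inserted into a node.
    public readonly List<QuadTreeNode> Nodes = new List<QuadTreeNode>();

    public void RemoveFromTree()
    {
        foreach (QuadTreeNode node in Nodes)
            node.Items.Remove(this);   // no tree traversal needed
        Nodes.Clear();
    }
}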

You could have your entity expose a "Disposing" event that is fired before it is disposed. Then you could have your list's Add method register an event handler with the entity being added that will remove the entity from the list before it is disposed.
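A sketch of how that might look (the Disposing event and EntityList are assumptions for illustration):

using System;
using System.Collections.Generic;

public class Entity
{
    public event EventHandler Disposing;

    public void Dispose()
    {
        // Raise the event before the entity goes away so every interested
        // list can drop its reference.
        EventHandler handler = Disposing;
        if (handler != null)
            handler(this, EventArgs.Empty);
    }
}

public class EntityList
{
    private readonly List<Entity> items = new List<Entity>();

    public void Add(Entity e)
    {
        items.Add(e);
        EventHandler onDisposing = null;
        onDisposing = delegate(object sender, EventArgs args)
        {
            items.Remove((Entity)sender);
            ((Entity)sender).Disposing -= onDisposing;  // unhook so the list can be collected
        };
        e.Disposing += onDisposing;
    }
}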

Related

Behavior I can't explain with Entity Framework

I'm trying to find documentation about a behavior in Entity Framework. It works, but before relying on this behavior, I want to be sure it's normal behavior of EF and not an unexpected side effect that would be "fixed" in a future version. Here's the situation:
I have a pretty deep object hierarchy (which I will simplify here). The structure is a multi-level collection of objects (class A contains a collection of class B, which contains a collection of class C, which contains ...) going seven levels deep.
I have to filter elements on some properties of C. My first attempts, which loaded the complete hierarchy, produced a complicated LINQ query (a maintenance nightmare), and the generated SQL was far from efficient. To simplify all this, I decided to split the query into two steps: first, I load the collection of class C (and all its children) filtered as I want; then, I load classes A and B for all instances of B that contain an item of my filtered collection of C.
Here's the catch: using that technique, I expected to have to manually repopulate the collection of C in my B class, but the collection is already populated with the elements of the filtered collection. I verified the SQL query in IntelliTrace, and the data required to fill instances of C is not included in the second query, so the only logical conclusion is that EF did this from the information in the context. BTW, lazy loading is turned off for that context.
Is this behavior normal in EF? If so, can you give me a link to the documentation explaining how it works?
Here's a snippet to illustrate this :
using (var context = new MyContext())
{
    // Includes and where clauses are greatly simplified for the purpose of the sample
    var filteredC = context.C
        .Include(x => x.ListOfD)
        .Include(x => x.ListOfD.Select(y => y.ListOfE))
        .Where(c => c.Status == Status)
        .ToList();
    int[] bToLoad = filteredC.Select(c => c.IDofB).Distinct().ToArray();
    var listOfAAndB = context.A
        .Include(a => a.ListOfB)
        .Where(x => x.ListOfB.Any(y => bToLoad.Contains(y.ID)))
        .ToList();
    // At this step, I expected B.ListOfC to be empty, but it's somehow populated
}
Thanks
This is standard behavior for a DbContext life cycle. To be honest, I can't link you to any documentation for this feature, but I can explain how it works.
An EF context is stateful and keeps track of all the entities that have already been fetched. It also knows about the relations between entities in your database and in your entity model.
So if you fetch new objects that have a direct relation to an already-tracked object (in your case, C has a foreign key to B), the navigation property is populated by the context; this is often called relationship fixup. It is a feature, not a bug, as it explicitly avoids lazy-loading queries to the database for objects that have already been fetched.
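To illustrate the fixup with the classes from the question (a hypothetical snippet, not the OP's code; someId is an assumed variable):

using (var context = new MyContext())
{
    var b = context.B.First(x => x.ID == someId);              // b.ListOfC is empty here
    var cs = context.C.Where(c => c.IDofB == someId).ToList();
    // No Include and no lazy loading: the context recognizes that the tracked
    // C instances belong to the tracked B and populates b.ListOfC itself.
}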

Not able to remove items from database when user deletes a row in data grid

When a user hits the button, I'm executing the following code.
using (Context context = new Context())
{
    foreach (Thing thing in ViewModel.Things)
        context.Things.AddOrUpdate(thing);
    context.SaveChanges();
}
The updates are executed, except when the user selects a row and hits the delete button. Visually, that row is gone, but it is never actually removed from the database, because it's no longer in the view model. Hence, the loop only processes the remaining things and never touches the removed ones.
I can think of two ways to handle that. One is really bad: remove everything from the context, save it, and then recreate it based on the view model. It's an idiotic solution, so I'm only mentioning it for reference's sake.
The other is to store each removed row in an array. Then, when the user invokes the code above, I could additionally perform the deletion of the elements in that array. This solution requires me to build that logic myself, and I have a feeling it should be done automagically for me if I ask nicely.
Am I right in my expectation, and if so, how should I do it? If not, is there a smarter way to achieve my goal than maintaining this kill-squad array?
At the moment, I do a double loop: first adding and updating what's left in the data grid, then removing anything that isn't found there. It's going to be painful if the number of elements grows. Also, for some reason I couldn't use Where, because I need to rely on Contains and EF didn't let me do that. Not sure why.
using (Context context = new Context())
{
    foreach (Thing thing in ViewModel.Things)
        context.Things.AddOrUpdate(thing);
    // ToList() snapshots the set so it isn't modified while being enumerated
    foreach (Thing thing in context.Things.ToList())
        if (!ViewModel.Things.Contains(thing))
            context.Things.Remove(thing);
    context.SaveChanges();
}
The first piece of advice: you should use the AddOrUpdate extension method only for seeding migrations. The job of AddOrUpdate is to ensure that you don't create duplicates when you seed data during development.
The best way to achieve what you need can be found in this link.
First, in your ViewModel class, you should have an ObservableCollection<Thing> property:
public ObservableCollection<Thing> Things { get; set; }
Then in the ViewModel's constructor (or in another place), you should set the Things property this way:
context.Things.Load();            // Load() lives in the System.Data.Entity namespace
Things = context.Things.Local;
From the quoted link:
Load is a new extension method on IQueryable that will cause the
results of the query to be iterated, in EF this equates to
materializing the results as objects and adding them to the DbContext
in the Unchanged state
The Local property will give you an ObservableCollection<TEntity> that
contains all Unchanged, Modified and Added objects that are currently
tracked by the DbContext for the given DbSet. As new objects enter the
DbSet (through queries, DbSet.Add/Attach, etc.) they will appear in
the ObservableCollection. When an object is deleted from the DbSet it
will also be removed from the ObservableCollection. Adding or Removing
from the ObservableCollection will also perform the corresponding
Add/Remove on the DbSet. Because WPF natively supports binding to an
ObservableCollection there is no additional code required to have two
way data binding with full support for WPF sorting, filtering etc.
Now, to save changes, the only thing you need to do is create a command in your ViewModel class that calls the SaveThingsChanges method:
private void SaveThingsChanges()
{
    context.SaveChanges();
}
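Putting it together, the ViewModel might look like this (a sketch; note the context is a long-lived field rather than being wrapped in a using block, because Local only stays in sync while the context that produced it is alive):

using System.Collections.ObjectModel;
using System.Data.Entity;   // for Load() and Local

public class ThingsViewModel
{
    private readonly Context context = new Context();

    public ObservableCollection<Thing> Things { get; private set; }

    public ThingsViewModel()
    {
        context.Things.Load();           // materialize into the change tracker
        Things = context.Things.Local;   // two-way view over the tracked Things
    }

    private void SaveThingsChanges()
    {
        // Rows deleted from the grid were removed from Local, which marked the
        // entities Deleted, so SaveChanges issues the corresponding DELETEs.
        context.SaveChanges();
    }
}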

How to eagerly load several attributes (including parent/grandparent/great grandparent attributes) without having duplicate grandparents on a parent

I have an object that I want to eagerly load, where I want to eagerly load several parent elements, but also some grandparent elements. I've set up my select like so:
var events = (from ed in eventRepo._session.Query<EventData>()
              where idsAsList.Contains(ed.Id)
              select ed)
    .FetchMany(ed => ed.Parents)
    .ThenFetchMany(pa => pa.Grandparents)
    .ThenFetch(gp => gp.GreatGrandparents)
    // other fetches here for other attributes
    .ToList();
My problem is that if I just .FetchMany the parents, I get the right number of elements. Once I add the grandparents, I get way too many, and that grows even more with great grandparents.
It's clearly doing some kind of Cartesian product, so I had a look around and saw that some people use result transformers to solve this. I tried to implement that, but adding a .TransformUsing() causes a compiler error, since I don't seem to be able to call .TransformUsing() on this type of call.
What is the right way to get the right number of elements from such a call, without duplicates due to computing the cartesian product?
Here is a pretty popular post that uses Futures to do this type of loading and avoid Cartesian products. It isn't as elegant as doing it in a single query, but it gets the job done:
Fighting cartesian product (x-join) when using NHibernate 3.0.0
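The gist of that approach, sketched against the query above (this follows the pattern from the linked post, so treat it as a rough outline rather than a definitive implementation):

using NHibernate.Linq;

var query = eventRepo._session.Query<EventData>()
    .Where(ed => idsAsList.Contains(ed.Id));

// Each fetch path is queued as its own future query, so no two collection
// joins end up in the same SQL statement (no Cartesian product).
query.FetchMany(ed => ed.Parents).ToFuture();
query.FetchMany(ed => ed.Parents)
     .ThenFetchMany(pa => pa.Grandparents)
     .ThenFetch(gp => gp.GreatGrandparents)
     .ToFuture();

// Enumerating any future executes the whole batch in a single round trip;
// NHibernate stitches the collections together in the session cache.
var events = query.ToFuture().ToList();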
One other possible solution would be to define your collections as sets instead of bags, which also avoids Cartesian product issues. I don't really like this solution, since you have to use an NHibernate-specific collection type, but it is known to work.
There is not much you can do about it if you force NHibernate to join explicitly. The database will return the same entities multiple times (this is perfectly normal, since your query performs Cartesian joins), and NHibernate cannot tell whether you asked for the same item multiple times or not; it does not know your intention. There is a workaround, though. You can add the line below:
var eventsSet = new HashSet<Events>(events);
Assuming your entity overrides Equals and GetHashCode, you will end up with unique events.
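For instance, an identifier-based override might look like this (Events and Id are assumed from the answer above):

public class Events
{
    public virtual int Id { get; set; }

    public override bool Equals(object obj)
    {
        // Two Events instances are equal when they represent the same row.
        Events other = obj as Events;
        return other != null && other.Id == Id;
    }

    public override int GetHashCode()
    {
        return Id.GetHashCode();
    }
}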

Nice Rx way of subscribing to a collection within a collection

I'll describe my object model, then what I want to do.
It is a Silverlight application, and these are model objects that are bound to UI elements.
An Agreement has a collection of TradingBranch; branches can be added or removed. A branch has a collection of Product.
agreement.Branches
    .SelectMany(x => x.Products)
    .Distinct()
These collections are driven by a matrix of branches and products. The same product can be selected by more than one branch, hence the Distinct.
Essentially I want to let the user select from a list of all the products that have been selected as available for any of the branches. I want this list to be updated when there is a change in the matrix.
So rather than adding a CollectionChanged handler for the branches, then more handlers to listen to each Products collection, working out whether a product is already present, and then having to unsubscribe when branches are removed, etc., I was hoping there was some nice Rx syntax I could employ to simply say "listen to this piece of LINQ" and update the other observable collection, to which I bind my ListBox, when it changes.
Despite the name similarities, IObservable and ObservableCollection are completely unrelated and unfortunately also incompatible. They have two very different models of observing a collection.
Have a look at Bindable LINQ. It attempts to define LINQ-to-ObservableCollection, such that a LINQ query on an ObservableCollection results in an ObservableCollection again. The project seems dead, though, and I haven't used the recommended replacement (Obtics) yet.
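If you end up doing it by hand, plain Rx can at least replace the handler bookkeeping for the outer collection. A brute-force sketch (it re-runs the whole query on every Branches change, and does not watch the nested Products collections):

using System;
using System.Collections.Specialized;
using System.Linq;
using System.Reactive.Linq;

var branchChanges = Observable
    .FromEventPattern<NotifyCollectionChangedEventHandler, NotifyCollectionChangedEventArgs>(
        h => agreement.Branches.CollectionChanged += h,
        h => agreement.Branches.CollectionChanged -= h);

var subscription = branchChanges.Subscribe(_ =>
{
    // Recompute the product list and refresh the collection bound to the ListBox.
    var products = agreement.Branches.SelectMany(x => x.Products).Distinct().ToList();
    // ...sync 'products' into the bound ObservableCollection here...
});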
Try my ObservableComputations library:
agreement.Branches
    .SelectingMany(x => x.Products)
    .Distincting()
agreement.Branches and the Products collections should implement INotifyCollectionChanged (e.g. ObservableCollection).

.Remove(object) on a List<T> returned from a LINQ to SQL compiled query won't delete the object, right?

I am returning two lists from the database using a LINQ to SQL compiled query.
While looping over the first list, I remove duplicates from the second list, as I don't want to process already-existing objects again.
E.g.:
// oldCustomers is a List<Customer> returned by my compiled LINQ to SQL statement
// that I have added a .ToList() to at the end.
// Same goes for newCustomers.
foreach (Customer oC in oldCustomers)
{
    // Do some processing
    newCustomers.Remove(newCustomers.Find(nC => nC.CustomerID == oC.CustomerID));
}
foreach (Customer nC in newCustomers)
{
    // Do some processing
}
DataContext.SubmitChanges();
I expect this to only save the changes made to the customers in my processing, and not remove or delete any of my customers from the database.
Correct?
I have tried it and it works fine, but I want to know whether there is any rare case in which a customer might actually get removed.
Right. When you call the .ToList() extension method on any IEnumerable, a new in-memory list of the items is created, with no binding to the items' previous location.
You can add or remove items to/from such a list without fear of side effects.
But I have to add that your code is terrible performance-wise.
foreach (Customer nC in newCustomers.Except(oldCustomers)) is much faster and easier to write (note that Except uses the default equality comparer, so you'd want Customer to compare by CustomerID, or pass an IEqualityComparer<Customer>).
I'm not sure what the point is of re-inventing the wheel when it comes to unions. If you're using .NET 3.5 or later, just use the LINQ Union method.
Be sure to have:
using System.Linq;
Then use:
var customers = oldCustomers.Union(newCustomers).ToList();
Union combines the two lists and removes any duplicates.
Note: the .ToList() call is only needed if you want a concrete List<Customer>; without it, the result stays a lazily evaluated IEnumerable<Customer> (which var will happily infer).
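If your Customer class does not override Equals, Union compares by reference and nothing is deduplicated across the two queries. A hypothetical comparer keyed on CustomerID fixes that (requires using System.Collections.Generic):

class CustomerIdComparer : IEqualityComparer<Customer>
{
    public bool Equals(Customer a, Customer b)
    {
        return a != null && b != null && a.CustomerID == b.CustomerID;
    }

    public int GetHashCode(Customer c)
    {
        return c.CustomerID.GetHashCode();
    }
}

// usage:
var customers = oldCustomers.Union(newCustomers, new CustomerIdComparer()).ToList();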
Right. List<T>.Remove() only removes the item from the in-memory list.
You need to call DeleteOnSubmit() on the LINQ to SQL Table class in order to delete the item from your database.
As an aside, your duplicate-removal algorithm is pretty inefficient. One way to rewrite it is to use the HashSet class introduced in .NET 3.5, which gives you O(1) lookups instead of the O(n) that Find does.
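One way to apply that suggestion to the loop above (a sketch; CustomerID is assumed to be an int, as in the question):

// Collect the old IDs once, then strip matching customers in a single pass.
var oldIds = new HashSet<int>(oldCustomers.Select(c => c.CustomerID));
newCustomers.RemoveAll(nC => oldIds.Contains(nC.CustomerID));   // O(1) per lookup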
