NHibernate NonUniqueObjectException when updating an existing record - c#

I am maintaining an existing application where, for performance reasons, I needed to point NHibernate at a view to stop it from producing outer joins. This works, and I get an entity back populated with data.
This object is then modified in C# and passed to Update, a generic method in the C# code used by a number of other repository classes. When this Update method is called, I get the following error:
"NHibernate.NonUniqueObjectException: a different object with the same identifier value was already associated with the session"
It points to a nested object inside the entity object, but I am unclear as to how to resolve this. I don't want to change the update method in case this impacts classes that use it.
If I need to revert from using a view to get the data, is it possible to configure the mapping to force NHibernate to use equi-joins rather than left outer joins?
I am not that familiar with NHibernate, and so any guidance/help would be appreciated.

A session.Merge(entity) would be the solution here. Working with detached objects is described in:
9.4.2. Updating detached objects
A short quote:
...
The last case can be avoided by using Merge(Object o). This method
copies the state of the given object onto the persistent object with
the same identifier. If there is no persistent instance currently
associated with the session, it will be loaded. The method returns the
persistent instance. If the given instance is unsaved or does not
exist in the database, NHibernate will save it and return it as a
newly persistent instance. Otherwise, the given instance does not
become associated with the session. In most applications with detached
objects, you need both methods, SaveOrUpdate() and Merge().
In other words, calling Merge(entity) should solve this issue and properly resolve the conflict between the "passed object" and the "session (loaded) object".
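A minimal sketch of that call, assuming 'session' is the current ISession and 'entity' is the detached object loaded from the view (names are illustrative, not from the question):
// Merge the detached entity into the session instead of handing it straight
// to the shared Update method; Merge returns the instance the session tracks.
using (var tx = session.BeginTransaction())
{
    var persistent = session.Merge(entity);   // copies state onto the tracked instance
    tx.Commit();
}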

Related

How should one handle deep objects with EF+Web API?

This is my first question here, so be gentle.
I have a database model consisting of about 60 objects describing information and various features of an industrial process. The end result is an object graph approximately 10 levels deep.
My intention was to
send a top level object to the client in JSON
two-way bind said object (in angular, but nvm that) and manipulate it
make the client's AJAX calls refer to that top-level object
rebuild said object via one or two constructor calls in the Web API
alter the object and save the changes via a top-level object method
My solution was to create an additional layer of objects that are based on the EF ones to allow omitting/adding data from/in the objects being sent to the client at will, circumventing issues with circular references and other problems caused by eager/lazy loading. These objects are fed to the Web API.
Now here's where the trouble begins:
as a result of the additional layer, reconstruction of the EF objects from the additional layer ones is necessary whenever saving changes in the EF objects down the chain. It is getting increasingly arduous to keep up with all of this.
The objects are highly interconnected and constrained. Should I just write extensions for the EF objects to emulate the features of the additional layer?
If that's the case, won't the JavaScriptSerializer try to serialize all the objects in all of the relationships (where the serialized object's key is defined as a FK in another object)? Because that's what I've gathered from the error messages.
Or am I doing this all wrong?
In a disconnected application like yours, I'd remove all navigation properties. They may seem convenient at first, but will cause headaches along the way.
I believe that accessing all entities via Id is the way to go.
You can write a JavaScript class which is responsible for receiving entities per Id and can therefore cache them.
So each time you need an entity on the client, you get it through this class.
This would result in having one controller for each entity.
Another advantage is that you don't always have to send and receive the whole object graph, which seems like a lot of data (10 levels deep is a lot).
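As a rough illustration (not from the question), such a per-entity controller could be as small as this; MachineDto and IMachineRepository are made-up names standing in for your own flattened type and data access:
// Hypothetical ASP.NET Web API controller: the client requests each entity
// by Id instead of pulling the whole 10-level-deep graph in one call.
public class MachinesController : ApiController
{
    private readonly IMachineRepository _repository;   // assumed data-access abstraction

    public MachinesController(IMachineRepository repository)
    {
        _repository = repository;
    }

    // GET api/machines/5 - returns a flat DTO that references relatives by Id only
    public MachineDto Get(int id)
    {
        var machine = _repository.GetById(id);
        return new MachineDto
        {
            Id = machine.Id,
            Name = machine.Name,
            ProcessId = machine.ProcessId   // no navigation property, just the foreign key
        };
    }
}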
won't the JavaScriptSerializer try to serialize all the objects in all of the relationships
Yes it will. That's one reason why objects with navigation properties, especially circular ones, are very difficult to serialize.

How to clear Original Values of a detached DevForce entity

We are running into various StackOverflowException and OutOfMemoryExceptions when sending certain entities to the server as part of an InvokeServerMethod call. The problem seems to come up because DevForce ends up trying to serialize a ton more data than we are expecting it to. I tracked it down to data that is stored in the OriginalValuesMap.
The original values are for DataEntityProperties that we've added to the entity but that aren't marked with [DataMember], so they normally don't get sent to the server. But if we have an existing (previously saved) entity and then change one of those properties, the initial value of the property does end up getting serialized as part of the OriginalValuesMap. This is causing us big problems because it turns out the original value is an entity with a huge entity graph.
Adding to the problem, the entities we are dealing with are actually clones (via ((ICloneable)origEntity).Clone()) of existing (previously saved) entities so they have a state of detached and I haven't found a way to clear the OriginalValuesMap for detached entities. Usually I'd do myEntity.EntityAspect.AcceptChanges() but that doesn't do anything for detached entities. I couldn't find any other easy way to do this.
So far, the only way I've found to clear the original values is to attach the entity to an Entity Manager. This ends up clearing the original values but it is a major pain because I'm actually dealing with a large number of entities (so performance is a concern) and many of these entities don't have unique primary key values (in fact, they don't have any key values filled in because they are just 'in memory' objects that I don't plan to actually ever save) so I need to do extra work to avoid 'duplicate key exception' errors when adding them to an entity manager.
Is there some other way I can clear the original values for a detached entity? Or should detached entities even be tracking original values in the first place if things like AcceptChanges don't even work for detached entities? Or maybe a cloned entity shouldn't 'inherit' the original values of its source? I don't really have a strong opinion on any of these possibilities...I just want to be able to serialize my entities.
Our app is a Silverlight client running DevForce 2012 v7.2.4.0
Before diving into the correct behavior for detached entities, I'd like to back up and verify that it really is the OriginalValuesMap which is causing the exception. The contents of the OriginalValuesMap should follow the usual rules for the DataContractSerializer, so I'd think that non-DataMember items would not be serialized. Can you try serializing one of these problem entities to a text file to send to IdeaBlade support? You can use SerializationFns.Save(entity, filename, null, false) to quickly serialize an item. If it does look like the OriginalValuesMap contains things it shouldn't, I'll also need the type definition(s) involved.
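For reference, the call mentioned above would look something like this (the output path is made up; check your DevForce assemblies for the exact namespace and overload of SerializationFns.Save):
// Dump one problem entity to disk so the serialized payload (including the
// OriginalValuesMap contents) can be inspected or sent to IdeaBlade support.
SerializationFns.Save(problemEntity, @"C:\temp\problemEntity.xml", null, false);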

Intercepting NHibernate Lazy-Load behaviour to return null if not connected to a session?

This seems like it should be an obvious thing but I've been searching for the answer for hours now with no success.
I'm using NHibernate to persist a domain model, with a service layer that serves an ASP.NET MVC front end (the 'service layer' is currently just a standard class library but may be converted to WCF in the future). The web app asks for the data it wants and specifies the collections on the domain object that it needs, the service layer takes the request, loads the object and required collections (using lazy loading) and passes the object back where it is transformed using AutoMapper to a viewmodel friendly representation.
What I want to be able to do is load the required collections, detach the object from the session and pass it to the front end. However, when AutoMapper tries to map the object this causes an exception, because it's trying to access collections that haven't been initialized and the session is no longer available. I can leave the object connected, but in that case the AutoMapper transformation ends up causing all the properties on the object to be lazy-loaded anyway, and this won't be an option if we go down the WCF route.
What I want to do is alter this behaviour so that instead of throwing an exception, the collection returns null (or better yet empty) when it is not connected to a session. This was the default behaviour in Entity Framework V1 (which admittedly didn't do auto lazy loading), which I worked with previously but I can't find a way to do it in NH.
Any ideas? Am I on the wrong track here?
EDIT- To be a bit clearer on what I'm trying to achieve, when accessing a collection property I want this behaviour:
Connected to session: lazy-load collection as normal.
No session: property is null (rather than throw exception)
UPDATE - Following this post by Billy McCafferty, I've managed to implement a solution using IUserCollectionType that seems to work so far. Rather than use the provided PersistentGenericBag as he does though, I had to create new types that changed the behaviour when not connected to the session. It's not perfect and requires some very ugly mappings but at least I don't need to touch my domain objects or client mappings to get it working.
The most appropriate solution in this case is probably to check in AutoMapper whether lazy-loadable members were actually loaded, using NHibernateUtil.IsInitialized(). I'm not sure how, or if, it is possible to make AutoMapper apply this check to all implicit property mappings though.
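For an explicit member mapping, something along these lines might work (Customer/Orders are illustrative, and the exact Condition/PreCondition overload depends on your AutoMapper version):
// Skip mapping the collection when NHibernate hasn't initialized it, so a
// detached entity can be mapped without triggering lazy loading.
Mapper.CreateMap<Customer, CustomerViewModel>()
    .ForMember(dest => dest.Orders,
               opt => opt.Condition(src => NHibernateUtil.IsInitialized(src.Orders)));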
Old question, but this is what we did to solve the same issue; hopefully it helps set you on the correct path if you stumble upon this problem.

Entity Framework: Attaching related objects and other state management

I have an application which uses Entity Framework Code First. I am attempting to write my resource access layer. I have several objects which all have separate database tables and a lot of object relationships. Can someone point me to an up-to-date example of CRUD methods with related objects? Everything I have found uses an older version (I use DbContext, not ObjectContext, etc.) and I am having problems writing it myself.
For example, I am currently working on an object with a parent-child relationship with itself. I am attempting to write the Create method. If I use context.Objects.Add(newObject), then all the child objects also have their state changed to Added, which means that duplicate children are added. So I tried looping through all the children and attaching them to the context, but then any children that did not previously exist are not added to the database and a DbUpdateException is thrown.
Is there a generic way I can attach all related entities and have their states be correct? Any help you can give me would be appreciated. Thanks!
Edit:
I tried explicitly loading the children using Load() and then adding the initial object. Unfortunately, it caused an exception because the parent comment had the child in its list of children but the parentID of the existing child had not yet been updated.
No, there is no way to attach the whole graph and let EF automatically set the correct state - these problems haven't changed since the ObjectContext API. You must always set the state manually for each entity and relationship, or you must build the graph from attached entities. The only exception is self-tracking entities, but they are not supported with the DbContext API.
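For the parent-child example in the question, a sketch of setting the states by hand with the DbContext API might look like this (the Children collection and the Id != 0 check are assumptions about your model):
// Add marks the whole graph as Added; afterwards, downgrade children that
// already exist in the database so EF doesn't try to insert duplicates.
context.Objects.Add(newObject);

foreach (var child in newObject.Children)
{
    // Treat a non-default key as "already in the database"; adapt this check
    // to however your model distinguishes new rows from existing ones.
    if (child.Id != 0)
    {
        context.Entry(child).State = EntityState.Unchanged;
    }
}

context.SaveChanges();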

how can I save/keep-in-sync an in-memory graph of objects with the database?

Question - What is a good, best-practice approach to saving/keeping-in-sync an in-memory graph of objects with the database?
Background:
That is to say, I have the classes Node and Relationship, and the application builds up a graph of related objects using these classes. There might be 1000 nodes with various relationships between them. The application needs to query the structure, hence an in-memory approach is no doubt good for performance (e.g. traversing the graph from Node X to find the root parents).
The graph does need to be persisted however into a database with tables NODES and RELATIONSHIPS.
Therefore, what is a good, best-practice approach to saving/keeping-in-sync an in-memory graph of objects with the database?
Ideal requirements would include:
build up changes in-memory and then 'save' afterwards (mandatory)
when saving, apply updates to database in correct order to avoid hitting any database constraints (mandatory)
keep persistence mechanism separate from model, for ease in changing persistence layer if needed, e.g. don't just wrap an ADO.net DataRow in the Node and Relationship classes (desirable)
mechanism for doing optimistic locking (desirable)
Or is the overhead of all this for a smallish application just not worth it and I should just hit the database each time for everything? (assuming the response times were acceptable) [would still like to avoid if not too much extra overhead to remain somewhat scalable re performance]
I'm using the self-tracking entities in Entity Framework 4. After the entities are loaded into memory, StartTracking() MUST be called on every entity. Then you can modify your entity graph in memory without any DB operations. When you're done with the modifications, you call the context extension method ApplyChanges(rootOfEntityGraph) and then SaveChanges(), so your modifications are persisted. After that you have to start the tracking again on every entity in the graph. Two hints/ideas I'm using at the moment:
1.) call StartTracking() at the beginning on every entity
I'm using an interface, IWorkspace, to abstract the ObjectContext (this simplifies testing -> see the open-source implementation bbv.DomainDrivenDesign at SourceForge). They also use a QueryableContext. So I created a further concrete Workspace and QueryableContext implementation and intercept the loading process with my own IEnumerable implementation. When the workspace's consumer executes the query obtained from CreateQuery(), my intercepting IEnumerable object registers an event handler on the context's ChangeTracker. In this event handler I call StartTracking() for every entity loaded and added into the context (this doesn't work if you load the objects with NoTracking, because in that case the objects aren't added to the context and the event handler will not be fired). After the enumeration in the self-made iterator, the event handler on the ObjectStateManager is deregistered.
2.) call StartTracking() after ApplyChanges()/SaveChanges()
In the workspace implementation, I ask the context's ObjectStateManager for the modified entities, i.e.:
var addedEntities = this.context.ObjectStateManager.GetObjectStateEntries(EntityState.Added);
--> analogous for modified entities
Then I cast them to IObjectWithChangeTracker and call the AcceptChanges() method on the entity itself. This starts the object's change tracker again.
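Roughly, that second hint could be sketched like this (AcceptChanges() is the extension generated by the STE T4 template, and the exact placement around ApplyChanges()/SaveChanges() follows my reading of the description above):
// Capture the entities the context reports as added or modified, persist,
// then restart each entity's change tracker.
var changed = this.context.ObjectStateManager
    .GetObjectStateEntries(EntityState.Added | EntityState.Modified)
    .Where(e => !e.IsRelationship)
    .Select(e => e.Entity)
    .OfType<IObjectWithChangeTracker>()
    .ToList();                       // materialize before SaveChanges resets the states

this.context.SaveChanges();

foreach (var entity in changed)
{
    entity.AcceptChanges();          // re-enables change tracking on the STE itself
}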
For my project I have the same mandatory points as you. I played around with EF 3.5 and didn't find a satisfactory solution. But the new self-tracking entities capability in EF 4 seems to fit my requirements (as far as I have explored the functionality).
If you're interested, I'll send you my "spike"-project.
Does anyone have an alternative solution? My project is a server application which holds objects in memory for fast operations, while modifications should also be persisted (no round trip to the DB). At some points in the code the object graphs are marked as deleted/terminated and are removed from the in-memory container. With the solution explained above, I can reuse the generated model from EF and don't have to code and wrap all the objects myself again. The generated code for the self-tracking entities comes from T4 templates which can be adapted very easily.
Thanks a lot for any other ideas/criticism.
The short answer is that you can still keep a graph (a collection of linked objects) in memory and write the changes to the database as they occur. If this takes too long, you could put the changes onto a message queue (but that is probably overkill) or execute the updates and inserts on a separate thread.
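For example, a minimal sketch of the 'separate thread' option, using a BlockingCollection as an in-process write-behind queue (the class name and any repository calls inside the queued actions are illustrative):
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Graph mutations are captured as small persistence actions and drained by a
// single background task, so the in-memory graph never waits on the database.
public class WriteBehindQueue : IDisposable
{
    private readonly BlockingCollection<Action> _pending = new BlockingCollection<Action>();
    private readonly Task _consumer;

    public WriteBehindQueue()
    {
        // A single consumer preserves the order of writes, which helps keep
        // FK constraints happy (parents inserted before children).
        _consumer = Task.Factory.StartNew(() =>
        {
            foreach (var write in _pending.GetConsumingEnumerable())
            {
                write();   // e.g. () => repository.InsertNode(node)
            }
        }, TaskCreationOptions.LongRunning);
    }

    public void Enqueue(Action write)
    {
        _pending.Add(write);
    }

    public void Dispose()
    {
        _pending.CompleteAdding();   // let remaining writes flush, then stop
        _consumer.Wait();
        _pending.Dispose();
    }
}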
