Will NHibernate attempt to save the same object twice in this scenario? - c#

Let's say I have two entities: Stable and Pony.
Stable has an IList<Pony> with a HasMany mapping configured with Cascade.All().
That means that if I do session.Save(myStable), each Pony in the collection will be saved.
However, what happens if I do this?
Pony myLittlePony = new Pony(Color.Brown);
Stable myStable = new Stable(WoodTypes.RichMahogany);
myStable.Ponies.Add(myLittlePony);
session.Save(myStable);
session.Save(myLittlePony);
Will NHibernate try to save myLittlePony twice? Or is NHibernate "smart" enough to know that myLittlePony has already been persisted to the DB?
Are there any performance implications from doing something like this?

NHibernate is pretty smart, and AFAIK, since myLittlePony would still be in the session and have no changes (IsDirty returning false), it won't trigger another flush to the persistence medium.

It will probably only be saved once* when the transaction is flushed, but you need to mark one side of the relationship (typically the collection) as the inverse side to denote the non-owner of the relationship. This will ensure that the behavior is consistent. If you don't mark the collection as the inverse side of the relationship, NH will do an insert and then update the child object with the foreign key from the parent. This will obviously fail if there's a non-nullable FK constraint.
My guess is that internally NH will save the object twice but the second save won't generate a database operation because the object will not have pending changes at that point.
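For reference, a minimal Fluent NHibernate sketch of marking the collection as the inverse side (the Id properties, the Pony.Stable back-reference, and the map class names are assumptions not shown in the question):
using FluentNHibernate.Mapping;

public class StableMap : ClassMap<Stable>
{
    public StableMap()
    {
        Id(x => x.Id);
        HasMany(x => x.Ponies)
            .Inverse()        // Pony's foreign key column owns the relationship
            .Cascade.All();   // saving a Stable cascades to its Ponies
    }
}

public class PonyMap : ClassMap<Pony>
{
    public PonyMap()
    {
        Id(x => x.Id);
        References(x => x.Stable);   // the many-to-one side that writes the FK
    }
}
With Inverse() on the collection, you also need to set myLittlePony.Stable = myStable before saving; otherwise the FK column is left null.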

It should only get saved once.
And actually nothing usually makes it to the database until you commit the transaction or flush the session.

If you're using SQL Server, run SQL Profiler to see what happens, but NHibernate should persist that object only once.

.Save() doesn't do anything other than tell the session about your new object*. NH only works out what to do when the session flushes to the database, which is normally when the transaction commits. It will never write the same entity twice.
(* OK, it may go to the database if you're using a bad identity generator)
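To see the flush timing in one place, here is a minimal sketch reusing the question's classes (sessionFactory is assumed to be an ISessionFactory built elsewhere):
using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction())
{
    var myStable = new Stable(WoodTypes.RichMahogany);
    myStable.Ponies.Add(new Pony(Color.Brown));

    session.Save(myStable);   // queued in the session; usually no SQL yet
    tx.Commit();              // the flush happens here - each entity is written once
}
With identity-style key generators, Save itself has to hit the database to obtain the generated key, which is the caveat in the footnote above.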

I believe what you're doing there is actually what you need to do. For it to work with a single session.Save(myStable), you would need to correctly configure the cascade relationship for the Stable class.

Related

How to access the DbContext used in an EntityManager?

I insert/update a large number of entities (~5000) during a process and this is taking a huge amount of time (it times out on a 5-minute transaction).
I read that by default DbContext.AutoDetectChangesEnabled is set to ON and causes this kind of behavior (http://www.exsertus.be/2014/10/ef-bulk-performance/).
To my understanding, DevForce "kind of" encapsulates a DbContext within each EntityManager. DevForce uses its own implementation unless I define mine, which I did. I would like to know how I can access it to be able to "play" with the AutoDetectChangesEnabled property.
Or is there any other solution to insert/update/delete large numbers of entities with DevForce?
Regards
I have worked with the EF tool "https://www.nuget.org/packages/EFUtilities" and I got a big performance enhancement with large inserts, as it uses bulk copy instead of a normal insert per entity.
You can check the documentation on GitHub.
I have used it with a 17,000-entity insert transaction and it finished in a few seconds.
Check this to get a better understanding and comparison with EF.
http://blog.credera.com/technology-insights/microsoft-solutions/entity-framework-batch-operations-using-ef-utilities/
A sample of using the utility to insert a list of entities looks like this:
using (var db = new YourDbContext())
{
    // InsertAll uses SqlBulkCopy under the covers instead of one INSERT per entity
    EFBatchOperation.For(db, db.BlogPosts).InsertAll(list);
}
Hope this helps.
Since you've defined your own DbContext, you can alter this setting in the DbContext constructor with Configuration.AutoDetectChangesEnabled = false;
However, I'm not sure how much this change will help. If your application is n-tier and you're trying to save ~5000 entities across the wire this will always be slow, and you'll also run into communication timeouts. Generally if you need to do any bulk operations DevForce isn't the optimal approach.
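A minimal sketch of that constructor change (YourDbContext stands in for the custom DbContext mentioned in the question):
using System.Data.Entity;

public class YourDbContext : DbContext
{
    public YourDbContext()
    {
        // Skip the change-detection scan EF normally runs on every Add/Attach;
        // call ChangeTracker.DetectChanges() yourself before SaveChanges if you
        // modify already-tracked entities.
        Configuration.AutoDetectChangesEnabled = false;
    }
}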

How to clear Original Values of a detached DevForce entity

We are running into various StackOverflowExceptions and OutOfMemoryExceptions when sending certain entities to the server as part of an InvokeServerMethod call. The problem seems to come up because DevForce ends up trying to serialize far more data than we expect it to. I tracked it down to data that is stored in the OriginalValuesMap.
The original values are for DataEntityProperties that we've added to the entity but that aren't marked with [DataMember] so they normally don't get sent to the server. But if we have an existing (previously saved entity) and then change one of those properties, the initial value of the property does end up getting serialized as part of the OriginalValuesMap. This is causing us big problems because it turns out the original value is an entity that has a huge entity graph.
Adding to the problem, the entities we are dealing with are actually clones (via ((ICloneable)origEntity).Clone()) of existing (previously saved) entities so they have a state of detached and I haven't found a way to clear the OriginalValuesMap for detached entities. Usually I'd do myEntity.EntityAspect.AcceptChanges() but that doesn't do anything for detached entities. I couldn't find any other easy way to do this.
So far, the only way I've found to clear the original values is to attach the entity to an Entity Manager. This ends up clearing the original values but it is a major pain because I'm actually dealing with a large number of entities (so performance is a concern) and many of these entities don't have unique primary key values (in fact, they don't have any key values filled in because they are just 'in memory' objects that I don't plan to actually ever save) so I need to do extra work to avoid 'duplicate key exception' errors when adding them to an entity manager.
Is there some other way I can clear the original values for a detached entity? Or should detached entities even be tracking original values in the first place if things like AcceptChanges don't even work for detached entities? Or maybe a cloned entity shouldn't 'inherit' the original values of its source? I don't really have a strong opinion on either of these possibilities...I just want to be able to serialize my entities.
Our app is a Silverlight client running DevForce 2012 v7.2.4.0
Before diving into the correct behavior for detached entities, I'd like to back up and verify that it really is the OriginalValuesMap which is causing the exception. The contents of the OriginalValuesMap should follow the usual rules for the DataContractSerializer, so I'd think that non-DataMember items would not be serialized. Can you try serializing one of these problem entities to a text file to send to IdeaBlade support? You can use SerializationFns.Save(entity, filename, null, false) to quickly serialize an item. If it does look like the OriginalValuesMap contains things it shouldn't, I'll also need the type definition(s) involved.
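A sketch of that diagnostic step (the SerializationFns.Save signature is taken verbatim from the answer above, not independently verified; the file path is hypothetical):
// Serialize one problem entity to disk so its actual serialized payload can be inspected
SerializationFns.Save(problemEntity, @"C:\temp\problem-entity.xml", null, false);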

What is happening behind EF SaveChanges()

Assume that I have a domain object which has associations with a couple of other entities (of course mapped to multiple tables), and I made changes in the master and associated entities. Naturally EF has to update this in multiple tables on save.
Whether it be ObjectContext or DbContext, a call to SaveChanges() method will tell Entity Framework to "Saves all changes made in this context to the underlying database."
Could anyone tell me "What is happening behind SaveChanges()"?
Do all the resulting SQL statements (INSERT/UPDATE/DELETE) go to the database in one go, as a prepared statement?
Or does EF go back and forth with the database, executing the SQL statements one by one?
Is there any configuration in EF to switch between these behaviors?
At the moment, statements for CUD operations are not batched. We have a work item to fix this. Feel free to upvote it.
From what I understand, each modified entity will result in a roundtrip to the database (all bound by a single transaction). While I'm not aware of any configuration that will change this behavior, I won't say that there are not EF implementations out there that achieve this "batching" functionality. I just don't think they are available, out of the box.
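One way to watch this for yourself is EF6's Database.Log hook (a sketch; MyContext and BlogPost are placeholder names):
using (var context = new MyContext())
{
    // Echo every command EF sends; with several modified entities you will see
    // one INSERT/UPDATE/DELETE round trip per entity, wrapped in one transaction
    context.Database.Log = Console.Write;

    context.BlogPosts.Add(new BlogPost { Title = "a" });
    context.BlogPosts.Add(new BlogPost { Title = "b" });
    context.SaveChanges();
}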

Dangers of creating Foreign Keys. nHibernate and SQL Server

I have inherited an old, shabby database, and would like to put loads of foreign keys in on existing relationship columns, so that I can use things like NHibernate for relationships.
I am a little inexperienced in the realm of keys, and although I feel I understand how they work, some part of me is fearful of corrupting the database somehow.
For example, I've come across the concept of "cascade on delete". I don't think there are currently any foreign keys on the database, so I guess this won't affect me... but how can I check to be sure?
What other risks do I need to be aware of?
In a way I'd love to just use NHibernate without foreign keys, but from what I can see this wouldn't be possible?
The biggest problem of putting foreign keys on a database that was designed without them (which is an indication the original database designers were incompetent, so there will be many other problems to fix as well) is that there is close to a 100% chance that you have orphaned data without a parent key. You will need to figure out what to do with this data. Some of it can just be thrown out, as it is no longer usable in any fashion and is simply wasting space. However, if any of it relates to orders or anything financial, you need to keep the data, in which case you may need to define a parent record of "unknown" that you can relate the records to. Find and fix all bad data first, then add the foreign keys.
Use cascade update and cascade delete sparingly, as they can lock up your database if a large number of records need to be changed. Additionally, in many cases you want the delete to fail if related records exist; you never want to cascade deletes through financial records, for instance. If deleting a user would delete past orders, that is a very bad thing! On the other hand, if you don't use cascading, you are likely to run into the buggy code that let the data go bad, once you can no longer delete or change a record with the key in place. So test all deleting and updating functionality thoroughly once you have the keys in place.
NHibernate does not require foreign keys to be present on a database, but I would still recommend adding foreign keys whenever possible: foreign keys are a good thing because they make sure that your database's referential integrity is as it should be.
For example, if I had a User and a Comment table in my database and I were to delete user 1, who happens to have made two comments, without foreign keys I'd now have two comments without an owner! We obviously do not want this situation to ever occur.
This is where foreign keys come in: by declaring that User is a foreign key within the Comment table, our database server will make sure that we can't delete a user while there are still comments associated with him or her.
Introducing foreign keys into a database is a good thing. It will expose existing invalid data. It will keep existing valid data, valid. You might have to perform some data manipulation on tables that have already gone haywire (i.e. create an 'Unknown user' or something similar and update all non-existing keys to point at it, this is a decision that needs to be made after examining the meaning of the data).
It might even cause a few issues initially, where an existing application crashes if, for example, it doesn't delete all the data it should (such as not deleting all comments in my example). But this is a good thing in the long term, as it exposes where things are going wrong and allows you to fix them without the data and database getting into an even worse state in the meantime.
NHibernate cascades are separate from foreign keys; they are NHibernate's way of letting you, for example, make sure all child objects are deleted when you delete a parent. This in turn helps ensure that changes you make to your data model do not violate your foreign key relationships (which would cause a database exception and no changes to be applied). Personally I prefer to take care of this myself, but it's up to you whether and how you want to use them.
Foreign keys formalize relationships in a normalized database. The foreign key constraints you are talking about do things like preventing the deletion of a row that is still being used or referenced elsewhere. This is called "referential integrity."
I suggest using some kind of modelling tool to draw a so-called ERM or entity-relationship model diagram. This will help you to get an overview of how the data is stored and where changes would be useful.
After you have done this, then you should consider whether or not the data is at a reasonable (say second or third normal form) degree of normalization. Pay particular attention to every entity having a primary key, and that the data completely describes the key. You should also try to remove redundancy and split non-atomic fields into a new table. "Every non-key attribute must provide a fact about the key, the whole key, and nothing but the key so help you Codd." If you find the data is not normalized it would be a good time to fix any serious structural problems and/or refactor, if appropriate.
At this point, adding foreign keys and constraints is putting the cart before the horse. Ensure you have data integrity before you try to protect it. You need to do some preparation work first, then constraints will keep your not-so-shabby newly remodeled database in tip-top shape. The constraints will ensure that no one decides to make exceptions to the rules that turn the data into a mess. Take the time to give the data a better organized home now, and put locks on the doors after.

Remove entity in NHibernate only by primary key

I'm trying to implement a repository method for removing entities using only the primary key, mainly because in a web app I'm usually only aware of the primary key when a "delete request" is invoked from a web page.
Because of the ORM, the option today is to get the entity from the database and then delete it, which costs me an extra roundtrip.
I could use an HQL delete, but since I want to create a generic delete method for all entities, that won't fly unless I use reflection to find out which field is the primary key (doable, but doesn't feel correct).
Or is it in the nature of NHibernate to need the entity in order to correctly handle cascades?
I tried this approach, with the assumption that it would not load the entity unless explicitly necessary; however, I haven't had time to test it yet. Maybe someone can shed some light on how this will be handled?
var entity = session.Load<T>(primaryKey);
session.Delete(entity);
EDIT: I have now tested it, and it seems that it still does a full select on the entity before deletion.
Load may return a proxy object but it isn't guaranteed. Your mapping may contain cascade deletes that will force NHibernate to load the object from the database in order to determine how to delete the object and its graph.
I would implement this using Load as you are doing. For some objects NHibernate may not need to do a select first. In cases where it does, that's the [usually] trivial price you pay for using an o/r mapper.
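A minimal generic sketch of that approach, assuming a repository holding an ISession field named _session:
public void DeleteById<T>(object id) where T : class
{
    // Load can return an uninitialized proxy, so in the best case no SELECT
    // is issued; cascade mappings or versioning may still force one
    var entity = _session.Load<T>(id);
    _session.Delete(entity);
}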
This was already asked and answered before: How to delete an object by using PK in nhibernate?
I even have two blog posts about it:
http://sessionfactory.blogspot.com/2010/08/deleting-by-id.html
http://sessionfactory.blogspot.com/2010/08/delete-by-id-gotchas.html
NHibernate is an O(bject)RM. I agree with you that it probably needs the objects to resolve dependencies.
You can of course use direct ADO.NET calls to delete your objects. That presents its own problems, of course, since you'll have to take care of any cascading issues yourself. If you do go down this road, don't forget to evict from the NHibernate session whatever objects you delete using this method.
But if this delete is in a really performance-sensitive part of your system, that might be the way to go.
I'd make 100% sure, though, that this is the case. Throwing away everything NHibernate gives you because of this would not be wise.
I get the sense you know this, and you're looking for a strictly NHibernate answer, and I don't think it exists, sorry.
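For completeness, the HQL bulk delete mentioned in the question would look like this for a single known entity type (a sketch using the question's Pony class; note that ExecuteUpdate bypasses the session cache and cascade mappings entirely):
session.CreateQuery("delete from Pony where Id = :id")
       .SetParameter("id", primaryKey)
       .ExecuteUpdate();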
Disclaimer: I am not able to test it as of now, but won't the following help:
// Build a stub entity that carries only the primary key, then delete it
Person entity = new Person();
entity.Id = primaryKey;
session.Delete(entity);
Don't load the entity; build one having just the primary key. I would have loved to test it, but right now my environment is not working.
