Why Is FluentNHibernate Missing CascadeType.REPLICATE?

I must keep the same domain running in two places at the same time. One end must be able to run "offline", while still receiving and sending data to the other end from time to time when "online". Basically we have a central server which aggregates data coming from the clients and serves some updated data (like the latest price of a product, new products, etc.). I'm using NHibernate to take care of persistence.
I'm trying to use NHibernate's Replicate method
session.Replicate(detached, ReplicationMode.LatestVersion);
to take the object coming from the other end and incorporate/merge/attach it into the "local" database.
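(For illustration, a minimal sketch of how that call is used between the two ends; the entity and the two session factories are hypothetical, not from the original question:)
// load the entity on the remote end; it becomes detached when that session closes
Product detached;
using (var remoteSession = remoteFactory.OpenSession())
{
    detached = remoteSession.Get<Product>(productId);
}

// replicate it into the local database, keeping the latest version on conflicts
using (var localSession = localFactory.OpenSession())
using (var tx = localSession.BeginTransaction())
{
    localSession.Replicate(detached, ReplicationMode.LatestVersion);
    tx.Commit();
}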
It fails to execute because it can't cascade the references and collections. Reviewing the cascade options in FluentNHibernate (and even looking directly at the NHibernate source code), I could not find the REPLICATE cascade type. From Hibernate's documentation:
CascadeType.REPLICATE
My question is: does anybody know why FluentNHibernate is missing such an option? Is there a different/better way to set this kind of cascade behaviour?
I tried the Cascade.Merge() option together with session.Merge(detached), but although the cascade works just fine, it gives me some headaches, mainly because of id generation and optimistic locking (versioning).
EDIT: NHibernate's source code DOES have a ReplicateCascadeStyle class that maps to the string "replicate". The Cascade / CascadeConverter classes (from the Mapping.ByCode namespace) DO NOT have Replicate as an option. So NHibernate itself supports cascade on Replicate, but only through manual mapping I guess.
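(For reference, a mapping fragment like the following is what that string form looks like in hbm.xml; the class and collection names here are hypothetical:)
<bag name="Prices" cascade="replicate">
  <key column="ProductId" />
  <one-to-many class="Price" />
</bag>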

OK, as I'm using Fluent NHibernate to map 100+ classes, switching to XML mapping is not an option for me.
So I forked Fluent NHibernate on GitHub, added the missing Cascade.Replicate option and sent a pull request.
Hope it helps someone.

Related

ef core best practice to update complex objects

We are running into some issues with EF Core on SQL databases in a web API when trying to update complex objects in the database that are provided by a client.
A detailed example: when receiving an object "Blog" with 1-n "Posts" from a client and trying to update this existing object in the database, should we:
1. Make sure the primary keys are set and just use
dbContext.Update(blogFromClient)
2. Load and track the blog while including the posts from the database, then patch the changes from the client onto this object and use SaveChanges()
When using approach (1) we have issues with:
- Existing posts for the existing blog in the database are not deleted when the client does not post them any more, so we need to manually figure them out and delete them.
- Getting tracking issues ("is already being tracked") if dependencies of the blog (for example a "User" as "Creator") are already in the ChangeTracker.
- We cannot unit test our business logic without using a real DbContext while using a repository pattern (the tracking errors simply do not exist there).
- When using a real DbContext with an in-memory database for tests, we cannot rely on things like foreign-key exceptions or computed columns.
When using approach (2):
- We can easily manage updated relations and keep easy track of the object.
- It leads to a performance penalty because we load an object which we do not really need.
- We need to map many things manually, as tools like AutoMapper cannot be used to automatically map objects with n-n relations while keeping correct tracking by EF Core (we get primary key errors, as some objects are deleted from lists and added again with the same primary key, which is not allowed as the primary key cannot be set on insert).
- n-n relations can easily be damaged by this: in the database there could be an n-n relation from blog to post, while the post holds the same relation back to its blogs. If only one side (blog to post, but not post to blog, which is the same relation in SQL) is posted and the other side is deleted from the list, EF Core will track this entry as "deleted".
In vanilla SQL we would manage this by:
- deleting all existing relations for the blog's posts
- updating the post itself
- creating all new relations
In EF Core we cannot write such statements, like bulk-deleting relations, without loading them beforehand and then keeping detailed track of each relation.
Is there any best practice for handling an update of complex objects with deep relations while getting the "new" data from a client?
The correct approach is #2: "Load and track the blog while including the posts from the database, then patch the changes from the client onto this object and use SaveChanges()".
As to your concerns:
leads to a performance penalty because we load an object which we do not really need
You are incorrect in assuming you don't need this. You do in fact need this, because you absolutely shouldn't be posting every single property on every single entity and related entity, including things that should not be changed like audit props and such. If you don't post every property, then you will end up nulling stuff out when you save. As such, the only correct path is to always load the full dataset from the database and then modify that via what was posted. Doing it any other way will cause problems and is totally and completely 100% wrong.
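A minimal sketch of that load-and-patch flow, assuming Posts is a List<Post> (the Blog/Post shapes here are illustrative, not from the question):
// requires using Microsoft.EntityFrameworkCore; for Include()
var blog = dbContext.Blogs
    .Include(b => b.Posts)
    .Single(b => b.Id == blogFromClient.Id);

// patch scalar values from the client onto the tracked entity
blog.Title = blogFromClient.Title;

// remove posts the client no longer sends
blog.Posts.RemoveAll(p => blogFromClient.Posts.All(c => c.Id != p.Id));

// update existing posts, add new ones
foreach (var clientPost in blogFromClient.Posts)
{
    var post = blog.Posts.SingleOrDefault(p => p.Id == clientPost.Id);
    if (post == null)
        blog.Posts.Add(new Post { Content = clientPost.Content });
    else
        post.Content = clientPost.Content;
}

dbContext.SaveChanges();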
need to map many things manually as tools like AutoMapper cannot be used to automatically map objects with n-n relations while keeping correct tracking by EF Core
What you're describing here is a limitation of any automatic mapping. In order to map entity to entity in collections, the tool would have to somehow know what identifies each entity uniquely. That's usually going to be a PK, of course, but AutoMapper doesn't (and shouldn't) make assumptions about that. Instead, the default and naive behavior is to simply replace the collection on the destination with the collection on the source. To EF, though, that looks like you're deleting everything in the collection and then adding new items to the collection, which is the source of your issue.
There are two paths forward. First, you can simply ignore the collection props on the source, and then manually map these. You can still use AutoMapper for the mapping, but you'd simply need to iterate over each item in the collection individually, matching it with the appropriate item that should map to it, based on your knowledge of what identifies the entity (i.e. the part AutoMapper doesn't know).
Second, there's actually an additional library for AutoMapper to make this easier: AutoMapper.Collection. The entire point of this library is to provide the ability to tell AutoMapper how to identify your entities, so that it can then map collections correctly. If you utilize this library and add the additional necessary configuration, then you can map your entities as normal without worrying about collections getting messed up.
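The configuration looks roughly like this (a sketch assuming the AutoMapper.Collection package; the DTO and entity names are illustrative):
// requires the AutoMapper.Collection package (AutoMapper.EquivalencyExpression namespace)
var config = new MapperConfiguration(cfg =>
{
    cfg.AddCollectionMappers();
    cfg.CreateMap<PostDto, Post>()
        // tell AutoMapper what identifies an item, so collections are merged, not replaced
        .EqualityComparison((dto, post) => dto.Id == post.Id);
    cfg.CreateMap<BlogDto, Blog>();
});
var mapper = config.CreateMapper();

// mapping onto a tracked entity now updates/adds/removes collection items in place
mapper.Map(blogDto, trackedBlog);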

Is it possible to use Entity Framework and keep object relations in the code and out of the database

I'm having a hard time just defining my situation so please be patient. Either I have a situation that no one blogs about, or I've created a problem in my mind by lack of understanding the concepts.
I have a database which is something of a mess, and the DB owner wants to keep it that way. By mess I mean it is not normalized and has no relationships defined, although they do exist...
I want to use EF, and I want to optimize my code by reducing database calls.
As a simplified example I have two tables with no relationships set like so:
Table: Human
HumanId, HumanName, FavoriteFoodId, LeastFavoriteFoodId, LastFoodEatenId
Table: Food
FoodId, FoodName, FoodProperty1, FoodProperty2
I want to write a single EF database call that will return a human and a full object for each related food item.
First, is it possible to do this?
Second, how?
Boring background information: A super sql developer has written a query that returns 21 tables in 20 milliseconds which contain a total of 1401 columns. This is being turned into an xml document for our front end developer to bind to. I want to change our technique to use objects and thus reduce the amount of hand coding and mapping from fields to xml (not to mention the handling of nulls vs empty strings etc) and create a type safe compile time environment. Unfortunately we are not allowed to change the database or add relationships...
If I understand you correctly, it's better for you to use the Entity Framework Code First approach:
You can define your objects (entities) Human and Food
Make relations between them in code even if they don't have foreign keys in DB
Query them using LINQ
And yes, you can select all related information in one call.
You can define the relationships in the code with Entity Framework using the Fluent API. In your case you might be able to define your entities manually, or use a tool to reverse engineer your EF model from an existing database. There is some support for this built into Visual Studio, and there are VS extensions like EF Power Tools that offer this capability.
As for making a single call to the database with EF, you would probably need to create a stored procedure or a view that returns all of the information you need. Using the standard setup with lazy-loading enabled, EF will make calls to the database and populate the data as needed.
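A rough sketch of the Fluent API idea, shown here in EF Core style (EF6 uses HasRequired/WithMany instead); the entities mirror the question's tables:
public class Food
{
    public int FoodId { get; set; }
    public string FoodName { get; set; }
}

public class Human
{
    public int HumanId { get; set; }
    public string HumanName { get; set; }
    public int FavoriteFoodId { get; set; }
    public Food FavoriteFood { get; set; }
    public int LeastFavoriteFoodId { get; set; }
    public Food LeastFavoriteFood { get; set; }
    public int LastFoodEatenId { get; set; }
    public Food LastFoodEaten { get; set; }
}

public class HumanContext : DbContext
{
    public DbSet<Human> Humans { get; set; }
    public DbSet<Food> Foods { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // declare the relationships in code even though the database has no FK constraints
        modelBuilder.Entity<Human>().HasOne(h => h.FavoriteFood).WithMany()
            .HasForeignKey(h => h.FavoriteFoodId);
        modelBuilder.Entity<Human>().HasOne(h => h.LeastFavoriteFood).WithMany()
            .HasForeignKey(h => h.LeastFavoriteFoodId);
        modelBuilder.Entity<Human>().HasOne(h => h.LastFoodEaten).WithMany()
            .HasForeignKey(h => h.LastFoodEatenId);
    }
}

// one database call that returns the human plus all three food objects
var human = context.Humans
    .Include(h => h.FavoriteFood)
    .Include(h => h.LeastFavoriteFood)
    .Include(h => h.LastFoodEaten)
    .Single(h => h.HumanId == id);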

NHibernate using 2 schema's within the same application?

Imagine you are writing a large-scale application using NHibernate and you want to have 2 separate schemas (using SQL Server, by the way):
Application_System (all the tables relating to the system, config tables, user tables etc)
Application_Data (all the actual data that is stored/retrieved when the user interacts with the system)
Now I've been trying to find a simple, clean way to do this in NHibernate and thought I'd found a solution by using the Catalog and Schema properties, so for example:
Catalog("Application_System");
Schema("dbo");
Table("SystemSettings")
would generate SQL for Application_System.dbo.SystemSettings. And this kinda works, but if I have 2 catalogs defined then the create/delete tables functionality of hbm2ddl.auto stops working. Now I've come to the conclusion that I am probably abusing the Catalog and Schema properties for something they weren't intended for. However, I can't seem to find a simple way of achieving the same thing that doesn't involve some convoluted scaffolding.
Any help would be appreciated. I can't believe NHibernate wouldn't support this out of the box; I mean, it's a fairly basic requirement.
SchemaExport does not support creating the schema/catalog out of the box, but you can add the CREATE SCHEMA/CATALOG DDL yourself using auxiliary database objects in XML, FluentNHibernate or MappingByCode. Note that the auxiliary object has to be added first.
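In XML that is the database-object element; a minimal sketch (the schema name is taken from the question):
<hibernate-mapping xmlns="urn:nhibernate-mapping-2.2">
  <!-- must be added before the class mappings so SchemaExport emits it first -->
  <database-object>
    <create>CREATE SCHEMA Application_Data</create>
    <drop>DROP SCHEMA Application_Data</drop>
  </database-object>
</hibernate-mapping>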
Ok well I kind of found a half way house that I'm reasonably satisfied with. The ISession has a Connection property that exposes a ChangeDatabase(string databaseName) method that allowes you to change the database the session is pointing to.
My schema export is still knackered because ultimately it doesn't know which object is for which database so will attempt to save it all to the database defined in the configuration.
You win some you lose some.

How to guard against NHibernate incomplete mappings

I am new to NHibernate/FluentNHibernate. I use FNH for my coding now, as I find it easier to use. However, I am working with an existing code base written in plain NHibernate. Today I found a bug where the database wasn't getting updated as expected. After about 30 minutes I found out that I hadn't updated the mapping XML even though I had added a new class variable, so that row in the table wasn't getting updated. My question is: is there a way to easily identify such incomplete mappings with NHibernate, so that I don't have to manually check the mappings every time something goes wrong? I.e., a warning message if I am updating an object which has non-default data in any fields which aren't mapped?
Take a look at the PersistenceSpecification class in FluentNHibernate: http://wiki.fluentnhibernate.org/Persistence_specification_testing
You could wrap this up using reflection to test every property if that makes sense for your system.
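A typical use looks like this (a sketch; the entity and properties are hypothetical):
// saves the entity, reads it back and compares the values,
// failing the test if a property does not round-trip through the mapping
new PersistenceSpecification<Product>(session)
    .CheckProperty(p => p.Name, "Widget")
    .CheckProperty(p => p.Price, 9.99m)
    .VerifyTheMappings();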
You could also try to use the NHibernate mapping metadata and search for unmapped properties via reflection in a unit test.
By using the metadata, it is transparent to your application whether you use Fluent NHibernate or other means to create the NHibernate mapping.
If you test your mappings in unit tests, you will know at test time, not at application startup, whether your mappings are alright.
This question seems to be related and this shows how to query the metadata.
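A rough sketch of such a test (assuming NHibernate 3.x-era metadata APIs; newer versions expose the mapped class via a property instead, and the assert is whatever your test framework provides):
// for every mapped entity, compare its public properties
// against the property names NHibernate knows about
foreach (var entry in sessionFactory.GetAllClassMetadata())
{
    var metadata = entry.Value;
    var mapped = new HashSet<string>(metadata.PropertyNames)
    {
        metadata.IdentifierPropertyName
    };
    var entityType = metadata.GetMappedClass(EntityMode.Poco);

    foreach (var prop in entityType.GetProperties())
    {
        if (!mapped.Contains(prop.Name))
            Assert.Fail(entityType.Name + "." + prop.Name + " is not mapped");
    }
}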
The bug where the database did not get updated can be caused by issues other than an unmapped field/property. There may be other mapping mistakes that are impossible to catch using reflection. What if you used the wrong cascade or the wrong generator? Or forgot an association mapping?
If you want to catch the majority of mapping issues, you should create an integration test that executes against a real or in-memory database. A good overview of this approach is here.

Remove entity in NHibernate only by primary key

I'm trying to implement a repository method for removing entities using only the primary key, mainly because in a web app I usually only know the primary key when a "delete request" comes in from a web page.
Because of the ORM, the option today is to fetch the entity from the database and then delete it, which costs me an extra roundtrip.
I could use an HQL delete, but since I want to create a generic delete method for all entities, that won't fly unless I use reflection to find out which field is the primary key (doable, but doesn't feel right).
Or is it in the nature of NHibernate to need the entity in order to correctly handle cascades?
I tried the following approach, with the assumption that it would not load the entity unless explicitly necessary; however, I haven't had time to test it yet. Maybe someone can shed some light on how this is handled?
var entity = session.Load<T>( primaryKey );
session.Delete( entity );
EDIT: I have now tested it, and it seems that it still does a full select on the entity before deletion.
Load may return a proxy object but it isn't guaranteed. Your mapping may contain cascade deletes that will force NHibernate to load the object from the database in order to determine how to delete the object and its graph.
I would implement this using Load as you are doing. For some objects NHibernate may not need to do a select first. In cases where it does, that's the [usually] trivial price you pay for using an o/r mapper.
This was already asked and answered before: How to delete an object by using PK in nhibernate?
I even have two blog posts about it:
http://sessionfactory.blogspot.com/2010/08/deleting-by-id.html
http://sessionfactory.blogspot.com/2010/08/delete-by-id-gotchas.html
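In the spirit of those posts, a generic delete-by-id can also be written as an executable HQL delete, using NHibernate's metadata instead of reflection to find the identifier. A sketch (note that executable HQL bypasses the session cache and mapped cascades):
public static void DeleteById<T>(ISession session, object id)
{
    // ask NHibernate for the identifier property instead of using reflection
    var idName = session.SessionFactory.GetClassMetadata(typeof(T)).IdentifierPropertyName;

    // executable HQL: nothing is loaded, but cascades are NOT applied
    session.CreateQuery("delete from " + typeof(T).Name + " e where e." + idName + " = :id")
        .SetParameter("id", id)
        .ExecuteUpdate();
}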
NHibernate is an O(bject)RM. I agree with you that it probably needs the objects to resolve dependencies.
You can of course use direct ADO.NET calls to delete your objects. That presents its own problems, of course, since you'll have to take care of any cascading issues yourself. If you do go down this road, don't forget to evict from the NHibernate session whatever objects you delete this way.
But if this delete is in a really sensitive part of your system, that might be the way to go.
I'd make 100% sure first, though, that this is the case. Throwing away everything NHibernate gives you because of this would not be wise.
I get the sense you know this and are looking for a strictly NHibernate answer, and I don't think it exists, sorry.
Disclaimer: I am not able to test this right now, but won't the following work?
Person entity = new Person();
entity.Id = primaryKey;
session.Delete( entity );
Don't load the entity; just build one with only the primary key set. I would have loved to test it, but right now my environment is not working.
