NHibernate: how to postpone update or delete? - C#

I need some help. I'm a beginner with NHibernate.
I have created a WPF application that loads a DataGrid bound to an ObservableCollection.
This collection is loaded via the repository pattern, with NHibernate querying the database.
I want to modify this collection through the UI (edit, add, delete).
When I click my save button, I want to persist my changes to the database table.
I read the NHibernate documentation and learned that there are two levels of cache. My idea is to modify objects in the first-level cache, and when I am sure of my changes, persist them.
Are there any best practices for doing this?
How do I mark an object for deletion or update, and then delete or update it only after the "save changes" click?

This should be an interesting read: Building a Desktop To-Do Application with NHibernate
Basically, you should use the ISession object's methods and do your operations inside a transaction, i.e. ISession.BeginTransaction().
It depends on how you get your entities. If they are root entities (e.g. an employee), then when you delete one from the grid you should keep track of these deleted entities and later call Delete on all of them. You should also keep track of the added entities.
What you are left with, then, are the updated entities. NHibernate keeps track of their state and knows whether an entity was modified.
We have ISession.Save/Update/Delete.
When you have done this for every modified entity, call Commit on the transaction. This will save the changes to the database.
If your entities are not roots but, for example, an employee's addresses, then it is enough to call Save on the employee, provided your mappings are correct.
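A minimal sketch of that pattern, assuming a view model with an Employees ObservableCollection and an illustrative Employee entity (neither name is from the question):

    // Deleted rows are only remembered here; nothing hits the database yet.
    private readonly List<Employee> _deletedEmployees = new List<Employee>();

    // Called when the user removes a row from the grid.
    public void RemoveEmployee(Employee employee)
    {
        Employees.Remove(employee);       // update the bound collection only
        _deletedEmployees.Add(employee);  // mark for deletion
    }

    // Called when the user clicks "save changes": persist everything at once.
    public void SaveChanges(ISession session)
    {
        using (var tx = session.BeginTransaction())
        {
            foreach (var deleted in _deletedEmployees)
                session.Delete(deleted);

            foreach (var employee in Employees)
                session.SaveOrUpdate(employee); // inserts new rows, updates dirty ones

            tx.Commit(); // flushes the session and writes all changes
            _deletedEmployees.Clear();
        }
    }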

Related

EF Core best practice to update complex objects

We are running into some issues with EF Core on SQL databases in a Web API when trying to update complex objects supplied by a client.
A detailed example: when receiving a "Blog" object with 1-n "Posts" from a client and trying to update this existing object in the database, should we:
1. Make sure the primary keys are set and just use dbContext.Update(blogFromClient)
2. Load and track the blog, including its posts, from the database, then patch the changes from the client onto this object and use SaveChanges()
With approach (1) we ran into these issues:
- Existing posts for the blog are not deleted from the database when the client no longer sends them; we have to figure them out manually and delete them
- We get tracking issues ("is already being tracked") if dependencies of the blog (for example a "User" as "Creator") are already in the ChangeTracker
- We cannot unit test our business logic without a real DbContext when using a repository pattern (the tracking errors simply do not occur)
- When using a real DbContext with an in-memory database for tests, we cannot rely on things like foreign-key exceptions or computed columns
With approach (2):
- We can easily manage updated relations and keep easy track of the object
- It incurs a performance penalty, because we load an object we do not really need
- We need to map many things manually, as tools like AutoMapper cannot be used to automatically map objects with n-n relations while EF Core keeps correct track of them (we get primary-key errors, as some objects are deleted from lists and added again with the same primary key, which is not allowed since the primary key cannot be set on insert)
- n-n relations can easily be damaged by this: in the database there may be an n-n relation from blog to post, while the post holds the same relation back to its blogs. If only one side of the relation is sent (blog to post, but not post to blog, which is the same row in SQL) and the other side is deleted from its list, EF Core will track that entry as "deleted".
In vanilla SQL we would manage this by:
1. deleting all existing blog-to-post relations
2. updating the post itself
3. creating all new relations
In EF Core we cannot write such statements; bulk-deleting relations is impossible without loading them first and then keeping detailed track of each one.
Is there any best practice for handling an update of complex objects with deep relations when the "new" data comes from a client?
The correct approach is #2: "Load and track the blog, including its posts, from the database, then patch the changes from the client onto this object and use SaveChanges()".
As to your concerns:
It incurs a performance penalty, because we load an object we do not really need
You are incorrect in assuming you don't need this. You do in fact need it, because you absolutely shouldn't be posting every single property of every entity and related entity, including things that should not be changed, like audit properties. If you don't post every property, you will end up nulling things out when you save. As such, the only correct path is to always load the full dataset from the database and then modify it based on what was posted. Doing it any other way will cause problems and is simply wrong.
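A minimal sketch of that load-and-patch flow, assuming Blog/Post entities and a BlogDto posted by the client (all of these names are illustrative):

    // using Microsoft.EntityFrameworkCore; using System.Linq;
    public async Task UpdateBlogAsync(BlogDto dto, AppDbContext db)
    {
        // Load the current state, including related posts, so EF Core tracks it.
        var blog = await db.Blogs
            .Include(b => b.Posts)
            .SingleAsync(b => b.Id == dto.Id);

        // Patch scalar properties from the client payload.
        blog.Title = dto.Title;

        // Remove posts the client no longer sent (matched by primary key).
        // For a required relationship, EF Core deletes the orphaned rows.
        var removed = blog.Posts.Where(p => dto.Posts.All(d => d.Id != p.Id)).ToList();
        foreach (var post in removed)
            blog.Posts.Remove(post);

        // Add new posts and patch existing ones.
        foreach (var postDto in dto.Posts)
        {
            var post = blog.Posts.SingleOrDefault(p => p.Id == postDto.Id);
            if (post == null)
                blog.Posts.Add(new Post { Content = postDto.Content }); // added by client
            else
                post.Content = postDto.Content;                         // updated by client
        }

        await db.SaveChangesAsync(); // EF Core computes the INSERTs/UPDATEs/DELETEs
    }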
We need to map many things manually, as tools like AutoMapper cannot be used to automatically map objects with n-n relations while EF Core keeps correct track of them
What you're describing here is a limitation of any automatic mapping. In order to map entity to entity in collections, the tool would have to somehow know what identifies each entity uniquely. That's usually going to be a PK, of course, but AutoMapper doesn't (and shouldn't) make assumptions about that. Instead, the default and naive behavior is to simply replace the collection on the destination with the collection on the source. To EF, though, that looks like you're deleting everything in the collection and then adding new items to the collection, which is the source of your issue.
There are two paths forward. First, you can simply ignore the collection props on the source, and then map these manually. You can still use AutoMapper for the mapping, but you'd need to iterate over each item in the collection individually, matching it with the appropriate item it should map to, based on your knowledge of what identifies the entity (i.e. the part AutoMapper doesn't know).
Second, there's actually an additional library for AutoMapper to make this easier: AutoMapper.Collection. The entire point of this library is to provide the ability to tell AutoMapper how to identify your entities, so that it can then map collections correctly. If you utilize this library and add the additional necessary configuration, then you can map your entities as normal without worrying about collections getting messed up.
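The configuration looks roughly like this (a sketch; PostDto and Post are the placeholder types from above):

    // Requires the AutoMapper.Collection package.
    var config = new MapperConfiguration(cfg =>
    {
        cfg.AddCollectionMappers();
        // Tell AutoMapper which key identifies a Post, so collection items are
        // matched and updated in place instead of being cleared and re-added.
        cfg.CreateMap<PostDto, Post>()
           .EqualityComparison((dto, entity) => dto.Id == entity.Id);
    });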

How to update / synchronize DbContext after external changes to the database

I am using Entity Framework 6 where, for performance reasons, I load my entities into my DbContext up front and then use them locally. Up to now, all changes to the database have gone through the DbContext, so my local entities and the database have stayed in sync. However, I now have to call a stored procedure on the database, which has the side effect of making changes to tables (outside of the DbContext) that need to be reflected in my entities. By changes, I mean it adds new records and deletes/updates existing records.
I do not want to dispose of my DbContext and create a new one, as some of the entity instances are wrapped in ViewModel classes; discarding the DbContext in this way would lead to major problems in the UI.
It is my understanding that simply calling Load() on all the DbSets of my DbContext will just replace the existing instances, so any objects holding references to the old entity instances would break.
So, I thought I could use the Reload method like:
context.Entry(entity).Reload();
which would update my local entities, but I can only do this for entities the DbContext already knows about. It doesn't cover any NEW or DELETED entities that were created or deleted as a result of the stored procedure executing.
So, I am looking for a way to:
Load, from the database, entities that are NEW to my DbContext
Reload existing entities in my DbContext
Remove any deleted entities from my DbContext
Here is a good starting point on Entity Framework development approaches. Starting from an analysis of your database situation, it suggests ways to achieve what you want, detailing data-read strategies (like eager or lazy loading) where necessary and providing tutorials on code generation and the wizard GUI:
http://www.entityframeworktutorial.net/choosing-development-approach-with-entity-framework.aspx
Here is some more detailed info and a tutorial on data-read strategies:
https://www.c-sharpcorner.com/article/eager-loading-lazy-loading-and-explicit-loading-in-entity-framework/
As I already said in the comments, I would suggest a database-first approach with lazy loading, to avoid uncontrolled data behaviour (or reloading the whole database when running a stored procedure).
As for the stored procedure, it can simply be mapped through the wizard that comes with Entity Framework and wrapped in a method.
Hope you find these resources helpful!
In general, Entity Framework cannot detect changes made to the database from outside and update the DbContext; there is no optimized, built-in solution for this.
You could use CDC (Change Data Capture) in SQL Server to push changes to your application and update your DbContext, but that is not an acceptable solution for every business scenario.
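Short of CDC, a manual reconciliation pass can cover the three requirements in the question. A sketch, assuming an EF6 DbContext with a Widgets set keyed by an integer Id (all of these names are illustrative):

    using System.Collections.Generic;
    using System.Data.Entity; // EF6
    using System.Linq;

    public static class ContextSynchronizer
    {
        public static void SynchronizeWidgets(MyDbContext context)
        {
            // The authoritative set of keys currently in the database.
            var dbIds = new HashSet<int>(
                context.Widgets.AsNoTracking().Select(w => w.Id));

            var tracked = context.ChangeTracker.Entries<Widget>().ToList();
            var trackedIds = new HashSet<int>(tracked.Select(e => e.Entity.Id));

            foreach (var entry in tracked)
            {
                if (dbIds.Contains(entry.Entity.Id))
                    entry.Reload();                     // refresh surviving entities
                else
                    entry.State = EntityState.Detached; // drop entities the SP deleted
            }

            // Entities new to the context: querying them attaches them automatically.
            var newIds = dbIds.Except(trackedIds).ToList();
            if (newIds.Count > 0)
                context.Widgets.Where(w => newIds.Contains(w.Id)).Load();
        }
    }

Note that Reload() discards any unsaved local changes, so this should run only while the context has no pending modifications, and any ViewModel wrapping a detached (deleted) entity still needs to be told to let go of it.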

Linq updating existing record

I need to update an existing record with LINQ if the record exists, or else add a new one.
Will SaveChanges() work for both? If yes, how do I differentiate between an update and an insertion?
Thanks in advance.
SaveChanges() persists all the changes you have made to the database since the last call. This includes:
Adding new items to a collection
Deleting items from a collection
Changing properties
So you either add the record to a collection, or get the existing one and modify its properties; there is no single general-purpose method covering both cases. After you have made the changes, call SaveChanges() to save them.
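A minimal upsert sketch, assuming an EF context with a Customers set keyed by Id (all names are illustrative):

    var customer = context.Customers.SingleOrDefault(c => c.Id == id);
    if (customer == null)
    {
        customer = new Customer();        // record does not exist: this is an insertion
        context.Customers.Add(customer);  // entity is now tracked as Added
    }
    customer.Name = newName;              // an existing entity becomes Modified

    context.SaveChanges(); // issues an INSERT or an UPDATE accordingly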
LINQ in general is for querying, not for modification (it stands for Language Integrated Query, after all); ideally you do not want to create any side effects. Updating and insertion differ in that for an update you usually have to query for the existing record to, well, update it, whereas for an insertion you just add it. And yes, SaveChanges() will work for both, in that it commits your changes and additions to the underlying data store.
Your question is very broad; without the particular code you are struggling with, it is hard to give a more detailed answer.

How does NHibernate track changes made to the fields in my entity?

How does NHibernate track changes made to the fields in my entity? If I use the second-level cache and I change my entity, how does it apply my changes to the DB?
When you change an entity, the entity becomes "dirty", and NHibernate knows to update it in your database when the session is flushed. That said, it is sometimes possible for entities to be marked dirty even though you have made no change, which results in unnecessary update calls to your database.
It is best to isolate your entities from your views via view models. Once you pull an entity out of the database, convert it to a view model that you can mangle up.
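A small sketch of that flush-time dirty checking (Employee, sessionFactory and employeeId are placeholders):

    using (var session = sessionFactory.OpenSession())
    using (var tx = session.BeginTransaction())
    {
        var employee = session.Get<Employee>(employeeId);
        employee.Name = "New name"; // the entity is now "dirty"; no Update() call needed

        // On flush (triggered here by Commit), NHibernate compares the entity
        // against the snapshot taken at load time and issues the UPDATE.
        tx.Commit();
    }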

how can I save/keep-in-sync an in-memory graph of objects with the database?

Question - What is a good best practice approach for saving/keeping-in-sync an in-memory graph of objects with the database?
Background:
Say I have the classes Node and Relationship, and the application builds up a graph of related objects using these classes. There might be 1000 nodes with various relationships between them. The application needs to query the structure, hence an in-memory approach is no doubt good for performance (e.g. traversing the graph from node X to find the root parents).
The graph does need to be persisted however into a database with tables NODES and RELATIONSHIPS.
So, again: what is a good best practice approach for saving/keeping-in-sync an in-memory graph of objects with the database?
Ideal requirements would include:
build up changes in-memory and then 'save' afterwards (mandatory)
when saving, apply updates to database in correct order to avoid hitting any database constraints (mandatory)
keep persistence mechanism separate from model, for ease in changing persistence layer if needed, e.g. don't just wrap an ADO.net DataRow in the Node and Relationship classes (desirable)
mechanism for doing optimistic locking (desirable)
Or is the overhead of all this just not worth it for a smallish application, and should I simply hit the database each time for everything (assuming the response times are acceptable)? I would still like to avoid that if the extra overhead is not too great, to remain somewhat scalable performance-wise.
I'm using the self-tracking entities in Entity Framework 4. After the entities are loaded into memory, StartTracking() MUST be called on every entity. Then you can modify your entity graph in memory without any DB operations. When you're done with the modifications, you call the context extension method ApplyChanges(rootOfEntityGraph) and then SaveChanges(), and your modifications are persisted. After that you have to start the tracking again on every entity in the graph. Two hints/ideas I'm using at the moment:
1.) Call StartTracking() at the beginning on every entity
I'm using an IWorkspace interface to abstract the ObjectContext (this simplifies testing; see the open-source implementation bbv.DomainDrivenDesign on SourceForge). It also uses a QueryableContext. So I created a further concrete Workspace and QueryableContext implementation and intercept the loading process with my own IEnumerable implementation. When the workspace's consumer executes the query obtained from CreateQuery(), my intercepting IEnumerable registers an event handler on the context's ChangeTracker. In this event handler I call StartTracking() for every entity loaded and added to the context (this doesn't work if you load the objects with NoTracking, because in that case the objects aren't added to the context and the event handler will not fire). After the enumeration in the hand-made iterator, the event handler on the ObjectStateManager is deregistered.
2.) Call StartTracking() after ApplyChanges()/SaveChanges()
In the workspace implementation, I ask the context's ObjectStateManager for the modified entities, i.e.:
var addedEntities = this.context.ObjectStateManager.GetObjectStateEntries(EntityState.Added);
(and analogously for modified entities)
Then I cast them to IObjectWithChangeTracker and call the AcceptChanges() method on each entity. This starts the object's change tracker again.
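Roughly like this (a sketch; the self-tracking entity T4 templates generate AcceptChanges() for IObjectWithChangeTracker):

    var entries = this.context.ObjectStateManager
        .GetObjectStateEntries(EntityState.Added | EntityState.Modified);

    foreach (var entry in entries)
    {
        // AcceptChanges() resets the entity's change tracker and starts
        // tracking again, so subsequent modifications are recorded.
        var tracked = entry.Entity as IObjectWithChangeTracker;
        if (tracked != null)
            tracked.AcceptChanges();
    }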
For my project I have the same mandatory points as you. I played around with EF 3.5 and didn't find a satisfactory solution, but the new self-tracking entities in EF 4 seem to fit my requirements (as far as I have explored the functionality).
If you're interested, I'll send you my "spike" project.
Does anyone have an alternative solution? My project is a server application which holds objects in memory for fast operations, while modifications should also be persisted (without a round trip to the DB). At some points in the code, object graphs are marked as deleted/terminated and are removed from the in-memory container. With the solution explained above I can reuse the model generated by EF and do not have to code and wrap all the objects myself again. The generated code for the self-tracking entities comes from T4 templates, which can be adapted very easily.
Thanks a lot for any other ideas/criticism.
The short answer is that you can still keep a graph (a collection of linked objects) of the objects in memory and write the changes to the database as they occur. If this takes too long, you could put the changes onto a message queue (though that is probably overkill) or execute the updates and inserts on a separate thread.
