Assume that I have a domain object with associations to a couple of other entities (of course mapped to multiple tables), and that I have made changes to both the master and the associated entities. Naturally, EF has to update multiple tables on save.
Whether it is ObjectContext or DbContext, a call to the SaveChanges() method tells Entity Framework to "save all changes made in this context to the underlying database."
Could anyone tell me what is happening behind SaveChanges()?
Do all the resulting SQL statements (INSERT/UPDATE/DELETE) go to the database in one go, as a prepared statement?
Or does EF go back and forth with the database, executing the SQL statements one by one?
Is there any configuration in EF to switch between these behaviours?
At the moment, statements for CUD operations are not batched. We have a work item to fix this; feel free to upvote it.
From what I understand, each modified entity results in a round trip to the database (all bound by a single transaction). While I'm not aware of any configuration that will change this behaviour, I won't say there aren't EF implementations out there that achieve this "batching" functionality; I just don't think they are available out of the box.
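A quick way to see this for yourself is EF6's Database.Log hook, which writes every command EF sends to the server. A minimal sketch (MyContext, Customers and Customer are hypothetical names):

using System;

using (var context = new MyContext())
{
    // EF6: log every SQL command EF executes.
    context.Database.Log = Console.Write;

    context.Customers.Add(new Customer { Name = "Alice" });
    context.Customers.Add(new Customer { Name = "Bob" });

    // The log shows a separate INSERT executed for each added entity,
    // all wrapped in one transaction, rather than a single batched command.
    context.SaveChanges();
}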
I am using Entity Framework 6 where, for performance reasons, I load my entities into my DbContext up front and then use them locally. Up to now, all changes to the database have gone through the DbContext so my local entities and the database have been in sync. However, I now have to call a Stored Procedure on the database, which has the side effects of making changes to tables (outside of the DbContext) that need to be reflected in my entities. By changes, I mean it is adding new records and deleting / updating existing records.
I do not want to dispose of my DbContext and create a new one, as some of the entity instances are wrapped within ViewModel classes. So, deleting the DbContext in this way would lead to major problems in the UI.
It is my understanding that simply calling Load() on all the DbSets of the DbContext will just replace the existing instances, so any objects using the old entity instances won't work.
So, I thought I could use the Reload method like:
context.Entry(entity).Reload();
which would update my local entities, but I can only do this for the entities that the DbContext already knows about. It doesn't cover any NEW entities or DELETED entities that were created / deleted as a result of the Stored Procedure executing.
So, I am looking for a way to do the following (a rough sketch of one possible approach follows this list):
Load, from the database, entities that are NEW to my DbContext
Reload existing entities in my DbContext
Remove any deleted entities from my DbContext
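For what it's worth, here is a rough sketch of one way to cover all three points with plain EF6 APIs, assuming a single entity type Widget with an int key Id and no unsaved local changes (all names are illustrative; this is not an official refresh mechanism):

using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

public static class ContextRefresher
{
    // Re-sync the tracked Widget entities with the database after the
    // stored procedure has added, updated and deleted rows behind EF's back.
    public static void RefreshWidgets(MyContext context)
    {
        // Keys currently present in the database (one query).
        var dbIds = new HashSet<int>(context.Widgets.AsNoTracking().Select(w => w.Id));

        // 1. Existing entities: reload values for rows that still exist,
        //    detach the ones whose rows were deleted by the stored procedure.
        foreach (var entry in context.ChangeTracker.Entries<Widget>().ToList())
        {
            if (dbIds.Contains(entry.Entity.Id))
                entry.Reload();
            else
                entry.State = EntityState.Detached;
        }

        // 2. New rows: Load() attaches anything not already tracked while
        //    leaving the instances already handed to your view models alone.
        context.Widgets.Load();
    }
}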
Here is the official documentation for Entity Framework.
Starting from an analysis of your database situation, it suggests smart and quick ways to obtain what you want, detailing data-read strategies (like eager or lazy loading) where necessary, and providing tutorials on correctly using code generation and the wizard GUI.
http://www.entityframeworktutorial.net/choosing-development-approach-with-entity-framework.aspx
Here some more detailed info and tutorial on data-read strategies:
https://www.c-sharpcorner.com/article/eager-loading-lazy-loading-and-explicit-loading-in-entity-framework/
As I already told you in the comments, I would suggest a database-first approach with lazy loading, to avoid uncontrolled data behaviour (or reloading the whole DB when running a stored procedure).
As for the stored procedure, it can simply be mapped through the wizard that comes with Entity Framework and wrapped in a method.
Hope you will find these resources helpful!
In general, Entity Framework is not aware of changes made in the database and cannot update the DbContext accordingly; there is no optimized or built-in Entity Framework solution for this.
I think you could use CDC (Change Data Capture) in SQL Server to push changes to your application and update your DbContext, but that is not an acceptable solution for every business and scenario.
I am writing a service that will decide at some point it needs to store values and so needs to create tables inside an existing database. Later on, it will decide it no longer needs this storage and so should remove those tables. This cycle could repeat multiple times. I am using EF6 to simplify the database operations.
Creating the tables is easy, as the first time the DbContext is used it will automatically create the tables along with the initial migration record. To avoid having to write verbose migration classes, I just derive a class from DbMigrationsConfiguration and set AutomaticMigrationsEnabled to true. It then generates the tables from the code-first model classes.
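For reference, that setup looks roughly like this (a minimal sketch; StorageContext, StorageMigrationsConfiguration and StoredValue are placeholder names):

using System.Data.Entity;
using System.Data.Entity.Migrations;

public class StoredValue
{
    public int Id { get; set; }
    public string Data { get; set; }
}

// Automatic migrations: the tables are created from the code-first model
// without hand-written migration classes.
internal sealed class StorageMigrationsConfiguration : DbMigrationsConfiguration<StorageContext>
{
    public StorageMigrationsConfiguration()
    {
        AutomaticMigrationsEnabled = true;
        AutomaticMigrationDataLossAllowed = true; // allow destructive model changes
    }
}

public class StorageContext : DbContext
{
    static StorageContext()
    {
        // Run migrations for this context the first time it is used.
        Database.SetInitializer(
            new MigrateDatabaseToLatestVersion<StorageContext, StorageMigrationsConfiguration>());
    }

    public DbSet<StoredValue> Values { get; set; }
}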
The problem is that later on I need to remove these tables and the matching migration record. There does not seem to be any useful method for doing that. The closest I found was DbContext.Database.Delete(). This is not appropriate because it attempts to delete the entire database and this is not possible because the database has lots of other tables used for other purposes.
Is there no way to tell EF6 to remove all the tables and migration record for a DbContext? That seems a strange omission. I do not want lots of zombie tables lingering that are no longer needed. Not everyone can create and delete an entire database to support a single DbContext.
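I have not found a built-in API for this either. One manual workaround sketch is to drop the context's tables and the matching row in __MigrationHistory with raw SQL (the table name and ContextKey value below are placeholders that depend on your model and migrations configuration; by default EF6 uses the full name of the migrations configuration class as the ContextKey):

using System.Data.SqlClient;

public static class StorageTeardown
{
    // Removes the tables created for StorageContext plus its migration record,
    // leaving the rest of the database untouched.
    public static void DropStorageTables(StorageContext context)
    {
        context.Database.ExecuteSqlCommand("DROP TABLE dbo.StoredValues");

        context.Database.ExecuteSqlCommand(
            "DELETE FROM dbo.__MigrationHistory WHERE ContextKey = @contextKey",
            new SqlParameter("@contextKey", "MyApp.Migrations.StorageMigrationsConfiguration"));
    }
}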
I'm having a hard time just defining my situation so please be patient. Either I have a situation that no one blogs about, or I've created a problem in my mind by lack of understanding the concepts.
I have a database which is something of a mess, and the DB owner wants to keep it that way. By mess I mean it is not normalized and no relationships are defined, although they do exist...
I want to use EF, and I want to optimize my code by reducing database calls.
As a simplified example I have two tables with no relationships set like so:
Table: Human
HumanId, HumanName, FavoriteFoodId, LeastFavoriteFoodId, LastFoodEatenId
Table: Food
FoodId, FoodName, FoodProperty1, FoodProperty2
I want to write a single EF database call that will return a human and a full object for each related food item.
First, is it possible to do this?
Second, how?
Boring background information: a super SQL developer has written a query that returns 21 tables in 20 milliseconds, containing a total of 1401 columns. This is being turned into an XML document for our front-end developer to bind to. I want to change our technique to use objects, and thus reduce the amount of hand coding and mapping from fields to XML (not to mention the handling of nulls vs. empty strings, etc.) and create a type-safe, compile-time environment. Unfortunately we are not allowed to change the database or add relationships...
If I understand you correctly, it's better for you to use the Entity Framework Code First approach:
You can define your objects (entities) Human and Food
Define relationships between them in code, even if they don't have foreign keys in the DB
Query them using LINQ
And yes, you can select all related information in one call.
You can define the relationships in the code with Entity Framework using the Fluent API. In your case you might be able to define your entities manually, or use a tool to reverse engineer your EF model from an existing database. There is some support for this built into Visual Studio, and there are VS extensions like EF Power Tools that offer this capability.
As for making a single call to the database with EF, you would probably need to create a stored procedure or a view that returns all of the information you need. Using the standard setup with lazy-loading enabled, EF will make calls to the database and populate the data as needed.
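Pulling the two answers above together, a minimal code-first sketch might look like this. Only the table and column names come from the question; the navigation property names, the context name and the query parameter are illustrative:

using System.Data.Entity;
using System.Linq;

public class Human
{
    public int HumanId { get; set; }
    public string HumanName { get; set; }

    public int FavoriteFoodId { get; set; }
    public int LeastFavoriteFoodId { get; set; }
    public int LastFoodEatenId { get; set; }

    public virtual Food FavoriteFood { get; set; }
    public virtual Food LeastFavoriteFood { get; set; }
    public virtual Food LastFoodEaten { get; set; }
}

public class Food
{
    public int FoodId { get; set; }
    public string FoodName { get; set; }
    public string FoodProperty1 { get; set; }
    public string FoodProperty2 { get; set; }
}

public class FoodContext : DbContext
{
    public DbSet<Human> Humans { get; set; }
    public DbSet<Food> Foods { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Map to the existing, singular table names.
        modelBuilder.Entity<Human>().ToTable("Human");
        modelBuilder.Entity<Food>().ToTable("Food");

        // The relationships exist only in code; the database has no foreign keys.
        modelBuilder.Entity<Human>()
            .HasRequired(h => h.FavoriteFood).WithMany().HasForeignKey(h => h.FavoriteFoodId);
        modelBuilder.Entity<Human>()
            .HasRequired(h => h.LeastFavoriteFood).WithMany().HasForeignKey(h => h.LeastFavoriteFoodId);
        modelBuilder.Entity<Human>()
            .HasRequired(h => h.LastFoodEaten).WithMany().HasForeignKey(h => h.LastFoodEatenId);
    }
}

public static class Example
{
    public static Human LoadHumanWithFoods(int humanId)
    {
        using (var db = new FoodContext())
        {
            // Eager loading: one query returns the human and all three related foods.
            return db.Humans
                .Include(h => h.FavoriteFood)
                .Include(h => h.LeastFavoriteFood)
                .Include(h => h.LastFoodEaten)
                .Single(h => h.HumanId == humanId);
        }
    }
}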
We have a system built using Entity Framework 5 for creating, editing and deleting data, but the problem we have is that sometimes EF is too slow, or it simply isn't possible to use Entity Framework (views which build data for tables based on users participating in certain groups in the database, etc.), and we have to use a stored procedure to update the data.
However, we have gotten ourselves into a situation where we have to save the changes through EF first, so that the data is in the database, and then call the stored procedures. We can't use TransactionScope as it always escalates to a distributed transaction and/or locks the table(s) for SELECTs during the transaction.
We are also trying to introduce a DomainEvents pattern which queues events and raises them after SaveChanges, so that we have the data we need in the DB, but then we may end up with the first part succeeding and the second part failing.
Are there any good ways to handle this or do we need to move away from EF entirely for this scenario?
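One option worth looking at, assuming an upgrade from EF5 to EF6 is possible (the API below does not exist in EF5): DbContext.Database.BeginTransaction() gives you a plain local transaction that covers both SaveChanges() and the stored procedure call on the same connection, so nothing escalates to a distributed transaction. A rough sketch (context, procedure and parameter names are made up):

using System.Data.SqlClient;

int someGroupId = 1; // example value

using (var context = new MyContext())
using (var transaction = context.Database.BeginTransaction())
{
    try
    {
        // 1. Persist the EF changes so the stored procedure can see them...
        context.SaveChanges();

        // 2. ...then run the stored procedure on the same connection and transaction.
        context.Database.ExecuteSqlCommand(
            "EXEC dbo.RebuildGroupData @groupId",
            new SqlParameter("@groupId", someGroupId));

        // Both parts commit or roll back together.
        transaction.Commit();
    }
    catch
    {
        transaction.Rollback();
        throw;
    }
}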
I had a similar scenario. Later I broke the process into small ones, used EF only, and made each small process short. Even though the overall time is longer, the system is easier to maintain and scale. I also minimized joins, updated only the entity itself, and disabled EF's AutoDetectChangesEnabled and ValidateOnSaveEnabled.
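For reference, those two switches live on DbContext.Configuration and can be turned off per context instance (the context name is illustrative):

using (var context = new MyContext())
{
    // Skip automatic change detection (normally run on every Add/Attach/Find/query).
    context.Configuration.AutoDetectChangesEnabled = false;

    // Skip entity validation during SaveChanges().
    context.Configuration.ValidateOnSaveEnabled = false;

    // ... make the changes for this small, short process ...

    // With auto-detection off, detect modifications to tracked entities explicitly.
    context.ChangeTracker.DetectChanges();
    context.SaveChanges();
}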
Sometimes, if you look at your problem in a different way, you may find a better solution.
Good luck!
How does NHibernate track changes made to the fields in my entity? If I use the second-level cache and I change my entity, how does it apply my changes to the DB?
When you change an entity, the entity becomes "dirty", and NHibernate knows to update the entity in your database when the session is flushed. That said, sometimes it's possible for entities to get marked dirty even though you have made no change, which results in unnecessary update calls to your database.
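A minimal sketch of that flush-time dirty checking (the session factory, Customer entity and Name property are made-up names):

// Load, modify, and let NHibernate's dirty check issue the UPDATE at flush time.
using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction())
{
    var customer = session.Get<Customer>(42);

    // No explicit Update() call is needed: the session snapshots the state it
    // loaded and compares it with the current state when it flushes.
    customer.Name = "New name";

    // Committing flushes the session; the dirty entity produces one UPDATE.
    tx.Commit();
}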
It is best to isolate your entities from your views via view models. Once you pull an entity out of the database, convert it to a view model that you can mangle up.