This is a continuation of two ongoing problems I'm facing: Problems trying to attach a new EF4 entity to ObjectContext while its entity collection entities are already attached and EF4.0 - Is there a way to see what entities are attached to what ObjectContext during debugging? I'm using this space to ask another somewhat complicated question, because I don't want to turn my other threads into one huge, ultra-long question.
So, a quick rundown:
I have incoming form data which is bound to a DTO. I want to map the DTO to an entity (a Game entity). The wrinkle is that the Game contains an EntityCollection which I must create and Add() to the Game based on an int[] in the DTO (each integer represents the ID of a Platform). And, naturally, EF4 is choking, in part because it's a many-to-many relationship, but also, I think, because there are some shenanigans going on with how many ObjectContext objects are in play. I keep getting an exception claiming I can't add my retrieved Platform entities to my new Game entity because they belong to two different ObjectContexts. I can't see how that is possible given my current setup, but I'm not sure what else the problem could be.
Okay, so I have three repositories which I inject into my controller via Ninject interface injection. I create the ObjectContexts in each like so:
public class HGGameRepository : IGameRepository
{
    private HGEntities _siteDB = new HGEntities();

    // rest of repo
}
The other two repositories are built the same way.
My Ninject DI code is fairly simple:
private class HandiGamerServices : NinjectModule
{
    public override void Load()
    {
        Bind<IArticleRepository>().To<HGArticleRepository>().InRequestScope();
        Bind<IGameRepository>().To<HGGameRepository>().InRequestScope();
        Bind<INewsRepository>().To<HGNewsRepository>().InRequestScope();
        Bind<ErrorController>().ToSelf().InRequestScope();
    }
}
From what I've read, this should create these bindings once per HTTP request.
What I'd like to do is have one instance of my HGEntities object be shared among all repositories in order to ensure I have one and only one ObjectContext in play. I'm just unsure how to do it.
Is there a standard way to do this?
Here's one option:
Change your repositories to take an interface, IHGEntities, in their constructor, and hook HGEntities up in your NinjectModule the same way you did with your repositories. That way, when your controllers need an instance of IArticleRepository, Ninject will either instantiate an instance of HGEntities to pass into the repositories or use the instance that is already active in the current HTTP context.
Then, inside your repository classes you can simply cast IHGEntities to HGEntities.
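For example, the wiring could look roughly like this (a sketch: IHGEntities is assumed to be a thin marker interface that the generated HGEntities implements via a partial class, and only the relevant constructor and binding are shown):

public interface IHGEntities : IDisposable { }   // assumed marker interface over HGEntities

public class HGGameRepository : IGameRepository
{
    private readonly HGEntities _siteDB;

    // Ninject hands every repository the same per-request context instance.
    public HGGameRepository(IHGEntities context)
    {
        _siteDB = (HGEntities)context;   // cast back to the concrete context, as described above
    }

    // rest of repo
}

// In the NinjectModule, alongside the existing bindings:
// Bind<IHGEntities>().To<HGEntities>().InRequestScope();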
Related
I've run into a problem; I know why it's a problem, but I don't know the right way to fix it.
So my controller sends a Product to my Service class, along with 2 strings defining the category of the object.
Here's the service class.
public Product AddProduct(Product p, string cat, string subcat)
{
    var category = _categoryService.GetCategoryByName(cat, subcat);
    p.Categories.Add(category);
    return _productRepository.CreateProduct(p);
}
The first line gets an existing category (CategoryService -> CategoryRepository -> DbContext), so here I use an instance of my context.
Then I add that Category to the product's list of categories. Finally I hand the product to the repository, which persists it to my database (through EF, of course).
Then I get the error
An entity object cannot be referenced by multiple instances of IEntityChangeTracker.
I think it's because I first ask for a category and then try to add a product, so the category is already being tracked by one context while another context tries to track the product, and the change tracking gets confused. The thing is, I don't get why the first context, the one that loaded the category, isn't disposed once I have my category.
What should I do?
EDIT:
The _categoryService uses a different repository than the ProductRepository.
The problem is that you are using multiple instances of DbContext in one request (unit of work), as you have already mentioned.
When you develop a web application, the best way is to work with a separate DbContext instance for every web request, but exactly one instance per request, and to call SaveChanges only once, after you have made all the changes you need.
This is very easy to do if you use an IoC framework like Unity and have your DbContext injected into your repositories.
In this case you should use the following settings (if you use Unity):
container.RegisterType<DbContext, YourDbContext>(new PerRequestLifetimeManager(), ...);
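With that registration in place, each repository simply accepts the context in its constructor, for example (a sketch; ProductRepository, IProductRepository and the use of the base DbContext type are illustrative):

public class ProductRepository : IProductRepository
{
    private readonly DbContext _context;

    // Unity injects the same per-request DbContext here and into the
    // CategoryRepository, so a single change tracker sees both the loaded
    // Category and the new Product.
    public ProductRepository(DbContext context)
    {
        _context = context;
    }

    public Product CreateProduct(Product p)
    {
        _context.Set<Product>().Add(p);
        _context.SaveChanges();
        return p;
    }
}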
Getting deeper into Entity Framework and repositories in order to enable better testing. Wondering if this is wise, though?
public interface IRepository
{
    int SaveChanges();
    void Dispose();
}
using (MyContext context = new MyContext())
{
    TransactionRepository txns = new TransactionRepository(context); // TransactionRepository implements IRepository
    MappingRepository maps = new MappingRepository(context);         // MappingRepository implements IRepository

    SomeCommand command = new SomeCommand(txns, maps);
    command.Execute();
}
Each of the repositories is logically different, so in theory they could be backed by different data sources. For now, they use the same database, though. Each of the repository classes implements IRepository, notably SaveChanges(), along with some query methods I've left out for brevity.
What's a good practice for utilizing multiple repositories?
+1 gorilla, some good points made. I would add the following thoughts.
In a web/MVC scenario, I use dozens of repositories and inject the context into these repositories. I use a repository base class.
I also have UoW classes which take a context in their constructor.
The Unit of Work classes contain references to all supported repositories for the context. I also use bounded contexts. Here is a sample blog from Julie Lerman on the subject.
http://www.goodreads.com/author/show/1892325.Julia_Lerman/blog
So yes, it makes perfect sense to use multiple contexts and to use multiple repositories.
You may even have multiple Unit of Work classes, although concurrent use of UoW classes is another discussion.
Adding sample code as requested:
This sample is one of several Luw classes that inherit from a base Luw class.
The current state and the DbContext to be used are injected (or defaulted).
The repositories are interfaces from the CORE project. The Luw classes are in the DAL project.
The base Luw is something like...
public interface ILuw : ILuwEvent, IDisposable
{
    IBosCurrentState CurrentState { get; set; }
    OperationStatus Commit();
}
The Luw Class itself.
namespace XYZ.DAL
{
    public class LuwBosMaster : Luw, ILuwBosMaster
    {
        public LuwBosMaster(DbContext context, IBosCurrentState currentState)
        {
            base.Initialise(context, currentState);
        }

        public LuwBosMaster()
        {
            base.Initialise(GetDefaultContext(), BosGlobal.BGA.IBosCurrentState);
        }

        public static DbContextBosMaster GetDefaultContext()
        {
            return new DbContextBosMaster("BosMaster");
        }

        // MasterUser with its own repository class
        private IRepositoryMasterUser _repositoryMasterUser;
        public IRepositoryMasterUser RepMasterUser
        { get { return _repositoryMasterUser ?? (_repositoryMasterUser = new RepositoryMasterUser(Context, CurrentState)); } }

        // 20 other repositories declared and available within this Luw.
        // Some repositories might address several tables, others single tables only.
        // The repositories are based on a base class that provides common generic behavior for each MODEL object.
    }
}
I'm sure you get the basic idea...
This really comes down to design decisions on your part. If you're following the Unit of Work pattern, then each repository is probably going to have its own context; mainly because, according to UoW, each repository call should create its context, do its work, and then dispose of its context.
There are other good reasons to share a context though, one of which (IMHO) is that the context has to track the state of an entity: if you get an entity, dispose the context, make some modifications to the entity, and then attach it to a new context, this new context has to go hit the database so it can figure out the state of the entity. Likewise, if you're working with graphs of entities (Invoices and all their InvoiceItems), the new context would have to fetch all the entities in the graph to determine their state.
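For example, re-attaching a detached, modified entity to a fresh context typically ends up looking something like this (a sketch; MyDbContext, Invoices and invoice are illustrative names):

using (var context = new MyDbContext())
{
    // The new context knows nothing about this entity, so we attach it
    // and tell the change tracker explicitly what its state is.
    context.Invoices.Attach(invoice);
    context.Entry(invoice).State = EntityState.Modified;

    // For a graph (an Invoice and all its InvoiceItems) the state of every
    // related entity has to be set individually, or re-read from the database.
    context.SaveChanges();
}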
Now, if you're working with web pages or services where you do not or cannot maintain state, the UoW pattern is sort of implied, and it's a generally accepted "good practice".
The most important thing has been forgotten: the database connection is not shared between multiple DbContext instances. That means you have to use distributed transactions if you would like several repositories to participate in the same transaction, which is a large performance hit compared to a local transaction.
I have been learning DDD development for a few days, and I am starting to like it.
I (think I) understand the principles of DDD, where your main focus is on business objects: you have aggregates, aggregate roots, repositories just for aggregate roots, and so on.
I am trying to create a simple project where I combine DDD development with the Code First approach.
My questions are (I am using ASP.NET MVC):
Will DDD business objects be different from Code First objects?
Even if they will probably be the same: for example, I can have a Product business object which has all the rules and methods, and a Product Code First (POCO) object which just contains the properties I need to save in the database.
If the answer to question 1 is "yes", how do I notify the Product POCO object that a property of the Product business object has changed and that I have to update it? Do I use something like AutoMapper?
If the answer is "no", I am completely lost.
Can you show me the simplest (CRUD) example of how I can put those two together?
Thank you
Update: I no longer advocate the use of "domain objects" and instead advocate a messaging-based domain model. See here for an example.
The answer to #1 is it depends. In any enterprise application, you're going to find 2 major categories of stuff in the domain:
Straight CRUD
There's no need for a domain object here because the next state of the object doesn't depend on the previous state of the object. It's all data and no behavior. In this case, it's ok to use the same class (i.e. an EF POCO) everywhere: editing, persisting, displaying.
An example of this is saving a billing address on an order:
public class BillingAddress {
    public Guid OrderId;
    public string StreetLine1;
    // etc.
}
On the other hand, we have...
State Machines
You need to have separate objects for domain behavior and state persistence (and a repository to do the work). The public interface on the domain object should almost always be all void methods and no public getters. An example of this would be order status:
public class Order { // this is the domain object
    private Guid _id;
    private Status _status;

    // note the behavior here - we throw an exception if it's not a valid state transition
    public void Cancel() {
        if (_status == Status.Shipped)
            throw new InvalidOperationException("Can't cancel order after shipping.");
        _status = Status.Cancelled;
    }

    // etc...
}
namespace Data {
    public class Order { // this is the persistence (EF) class
        public Guid Id;
        public Status Status;
    }
}
public interface IOrderRepository {
    // The implementation of this will:
    // 1. Load the EF class if it exists or new it up with the ID if it doesn't
    // 2. Map the domain class to the EF class
    // 3. Save the EF class to the DbContext.
    void Save(Order order);
}
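A minimal sketch of what that Save implementation might look like, assuming a context exposing a DbSet<Data.Order> named Orders; ToSnapshot() is purely illustrative, standing in for whatever mechanism (memento, internal accessors) the domain object exposes for reading its private state:

public class OrderRepository : IOrderRepository
{
    private readonly MyDbContext _context;   // assumed context exposing DbSet<Data.Order> Orders

    public OrderRepository(MyDbContext context)
    {
        _context = context;
    }

    public void Save(Order order)
    {
        var snapshot = order.ToSnapshot();   // illustrative: read the domain object's state

        // 1. Load the EF class if it exists or new it up with the ID if it doesn't.
        var data = _context.Orders.Find(snapshot.Id)
                   ?? _context.Orders.Add(new Data.Order { Id = snapshot.Id });

        // 2. Map the domain class to the EF class.
        data.Status = snapshot.Status;

        // 3. Save the EF class to the DbContext.
        _context.SaveChanges();
    }
}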
The answer to #2 is that the DbContext will automatically track changes to EF classes.
The answer is no. One of the best things about EF Code First is that it fits nicely with DDD, since you have to create your business objects by hand, so do make your EF models equivalent to your DDD entities and value objects. There's no need to add an extra layer of complexity; I don't think DDD recommends that anywhere.
You could even have your entities implement an IEntity and your value objects implement IValue, and additionally follow the rest of the DDD patterns, namely repositories, to do the actual communication with the database. You can find more of these ideas in this very good sample .NET application; even though it doesn't use EF Code First, it's still very valuable: http://code.google.com/p/ndddsample/
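For illustration, the marker interfaces can be as thin as the following, with the Code First POCO doubling as the DDD entity (a sketch, not taken from the sample application):

public interface IEntity
{
    int Id { get; }
}

public interface IValue { }

// An EF Code First POCO that is also the DDD entity, carrying its own behavior.
public class Product : IEntity
{
    public int Id { get; private set; }
    public string Name { get; set; }
    public decimal Price { get; private set; }

    public void ChangePrice(decimal newPrice)
    {
        if (newPrice < 0)
            throw new ArgumentException("Price cannot be negative.", "newPrice");
        Price = newPrice;
    }
}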
Recently I've done a similar project. I was following this tutorial: link
And I've done it this way: I've created a blank solution and added three projects: Domain, Service and WebUI.
Simply put, in Domain I've put the model (for example, the classes for EF Code First, methods, etc.).
Service was used for the domain to communicate with the world (WebUI, MobileUI, other sites, etc.) using ASP.NET Web API.
WebUI was actually an MVC application (but the model was in Domain, so it was mostly VC).
Hope I've helped
The Pluralsight course: Entity Framework in the Enterprise goes into this exact scenario of Domain Driven Design incorporated with EF Code First.
For number 1, I believe you can do it either way. It's just a matter of style.
For number 2, the instructor in the video goes through a couple of ways to account for this. One way is to have a "State" property on every class that is set on the client side when a value is modified. The DbContext then knows what changes to persist.
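The "State" approach usually ends up looking something like this (a sketch; ObjectState and ApplyStateChanges are illustrative names, not necessarily what the course uses, and the disconnected graph is assumed to have been attached to the context already):

public enum ObjectState { Unchanged, Added, Modified, Deleted }

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }

    // Not mapped to the database; the client sets this when it adds,
    // edits or deletes the object.
    public ObjectState State { get; set; }
}

public static class StateHelper
{
    // Called just before SaveChanges to translate the client-side state
    // into the EF entry state for every tracked Product.
    public static void ApplyStateChanges(DbContext context)
    {
        foreach (var entry in context.ChangeTracker.Entries<Product>())
        {
            switch (entry.Entity.State)
            {
                case ObjectState.Added:    entry.State = EntityState.Added;    break;
                case ObjectState.Modified: entry.State = EntityState.Modified; break;
                case ObjectState.Deleted:  entry.State = EntityState.Deleted;  break;
                default:                   entry.State = EntityState.Unchanged; break;
            }
        }
    }
}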
Late question on this topic.
Reading Josh Kodroff's answer confirms my thoughts about implementing a repository against, for instance, an Entity Framework DAL.
You map the domain object to an EF persistence object and let EF handle it when saving.
When retrieving, you let EF fetch from the database, map the result to your domain object (aggregate root) and add it to your collection.
Is this the correct strategy for repository implementation?
Scenario:
Retrieve some entities
Update some properties on those entities
You perform some sort of business logic which dictates that you should no longer have those properties updated; instead you should insert some new entities documenting the results of your business logic.
Insert said new entities
SaveChanges();
Obviously, in the above example, calling SaveChanges() will not only insert the new entities but also update the properties on the original entities. Previously I have managed to rearrange my code so that changes to the context (and its entities) were only made once I knew for sure that I wanted all my changes saved; however, that's not always possible. So the question is: what is the best way to handle this scenario? I don't work with the context directly, but rather through repositories, if that matters. Is there a simple way to revert the entities to their original values? What is the best practice in this sort of scenario?
Update
Although I disagree with Ladislav that the business logic should be rearranged in such a way that the validation always comes before any modification to the entities, I agree that the solution should really be persisting the wanted changes on a different context. The reason I disagree is that my business transaction is fairly long, and validation or error checking that might happen at the end of the transaction is not always obvious upfront. Imagine a Christmas tree you're decorating with lights from the top down: you've already modified the tree by the time you're working on the lower branches. What happens if one of the lights breaks? You want to roll back all of your changes, but you also want to create some ERROR entities. As Ladislav suggested, the most straightforward way is to save the ERROR entities in a different context, allowing the original one (with the modified metaphorical tree) to expire without SaveChanges ever being called.
Now, in my situation I use Ninject for dependency injection, injecting one EF context into all of my repositories that are in the scope of the top-level service. What this means is that my business layer classes don't really have control over creating new EF contexts. Not only do they not have access to the EF context (remember, they work through repositories), but the injection has already occurred higher in the object hierarchy. The only solution I found is to create another class that uses Ninject to create a new UoW within it.
//business logic executing against repositories with an already injected and shared (unit of work) context
Tree = treeRepository.Get();
Lights = lightsRepository.Get();

//update the tree as you're decorating it with lights

if (errors.Count == 0)
{
    //no errors: calling SaveChanges() on any one repository will commit the entire UoW,
    //as they all share the same injected EF context
    repository1.SaveChanges();
}
else
{
    //oops, one of the lights broke; we need to insert some Error entities.
    //However, if we just add them to the errorRepository and call SaveChanges(),
    //the modifications that happened to the tree will also be committed.
    TreeDecoratorErrorHandler.Handle(errors);
}
internal class TreeDecoratorErrorHandler
{
    //declare repositories

    //constructor that takes repository instances

    public static void Handle(IList<Error> errors)
    {
        //create a new Ninject kernel
        using (Ninject... = new Ninject...)
        {
            //this will create an instance that gets injected with repositories sharing a new EF context,
            //completely separate from the one outside of this class
            TreeDecoratorErrorHandler errorHandler = ninjectKernel.Get<TreeDecoratorErrorHandler>();

            //this will insert the errors and call SaveChanges(); the only changes in this new context are the errors
            errorHandler.InsertErrors(errors);
        }
    }

    //other methods
}
You should definitely use a new context for this. The context is a unit of work, and once your business logic says "Hey, I don't want to update this entity", the entity is no longer part of the unit of work. You can either detach the entity or create a new context.
There is also the possibility of using the Refresh method, but that method is meant for scenarios where you have to deal with optimistic concurrency. Because of that, it refreshes only scalar and complex properties, and foreign keys if they are part of the entity; if you made changes to navigation properties, these can still be present after you refresh the entity.
Take a look at ObjectContext.Refresh with RefreshMode.StoreWins; I think that will do what you want. Starting a new context would achieve the same thing, I guess, but would not be as neat.
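If you go that route, the call against the shared context would look roughly like this (a sketch; objectContext, the Errors entity set and the variable names are illustrative, and RefreshMode lives in System.Data.Objects):

// Discard the pending changes on the already-modified tree entities and
// reload their current values from the database (store wins).
objectContext.Refresh(RefreshMode.StoreWins, modifiedTreeEntities);

// Only the new Error entities remain pending, so this persists just those.
foreach (var error in errors)
    objectContext.Errors.AddObject(error);
objectContext.SaveChanges();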
I'm new to the Entity Framework and am just starting to play around with it in my free time. One of the major questions I have is how to handle ObjectContexts.
Which of these is generally preferred/recommended:
This
public class DataAccess {
    MyDbContext m_Context;

    public DataAccess() {
        m_Context = new MyDbContext();
    }

    public IEnumerable<SomeItem> GetSomeItems() {
        return m_Context.SomeItems;
    }

    public void DeleteSomeItem(SomeItem item) {
        m_Context.DeleteObject(item);
        m_Context.SaveChanges();
    }
}
Or this?
public class DataAccess {
    public DataAccess() { }

    public IEnumerable<SomeItem> GetSomeItems() {
        MyDbContext context = new MyDbContext();
        return context.SomeItems;
    }

    public void DeleteSomeItem(SomeItem item) {
        MyDbContext context = new MyDbContext();
        context.DeleteObject(item);
        context.SaveChanges();
    }
}
The ObjectContext is meant to be the "Unit of Work".
Essentially what this means is that for each "operation" (e.g. each web page request) there should be a new ObjectContext instance. Within that operation, the same ObjectContext should be re-used.
This makes sense when you think about it, as transactions and change submission are all tied to the ObjectContext instance.
If you're not writing a web app, and are instead writing a WPF or Windows Forms application, it gets a bit more complex, as you don't have the tight "request" scope that a web page load gives you, but you get the idea.
PS: In either of your examples, the lifetime of the ObjectContext will be either global or transient. In both situations, it should NOT live inside the DataAccess class; it should be passed in as a dependency.
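In other words, something along these lines (a sketch of the dependency-passing version, reusing the names from the question):

public class DataAccess {
    private readonly MyDbContext m_Context;

    // The per-operation context (e.g. one per web request) is created by the
    // caller or an IoC container and handed in; DataAccess does not own it.
    public DataAccess(MyDbContext context) {
        m_Context = context;
    }

    public IEnumerable<SomeItem> GetSomeItems() {
        return m_Context.SomeItems;
    }

    public void DeleteSomeItem(SomeItem item) {
        m_Context.DeleteObject(item);
        m_Context.SaveChanges();
    }
}

// Per operation:
// using (var context = new MyDbContext())
// {
//     var dataAccess = new DataAccess(context);
//     // ... do all the work for this operation against dataAccess ...
// }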
If you keep the same context for a long-running process and run lots of queries against it, LINQ to SQL (I didn't test against LINQ to Entities, but I guess it's the same problem) gets VERY slow (1 query a second after some 1000 simple queries). Renewing the context on a regular basis fixes this issue and doesn't cost much.
What happens is that the context keeps track of everything you query through it, so if it's never reset, it gets really fat... The other issue is the memory it takes up.
So it mainly depends on the way your application works, and on whether you new up a DataAccess instance regularly or keep the same one all along.
Hope this helps.
Stéphane
Just a quick note: the two code pieces share roughly the same underlying problem. This is something that I have been looking at, because you don't want to keep opening and closing the context (see the second example), while at the same time you're not sure you can trust Microsoft to properly dispose of the context for you.
One of the things I did was create a common base class that lazily loads the Context and implements a destructor in the base class to dispose of things. This works well for something like the MVC framework, but unfortunately leads to the problem of having to pass the context around to the various layers so the business objects can share the call.
In the end I went with Ninject to inject this dependency into each layer and had it track usage.
While I'm not in favour of always creating what must be complicated objects each time I need them, I too have found that the DataContexts in LINQ to SQL and the ObjectContexts in EF are best created when required.
Both of these perform a lot of static initialisation based on the model that you run them against, which is cached for subsequent calls, so you'll find that the initial startup for a context will be longer than all subsequent instantiations.
The biggest hurdle you face with this is the fact that once you obtain an entity from the context, you can't simply pass it back into another context to perform update operations or add related entities back in. In EF you can reattach an entity to a new context. In L2S this process is nigh-on impossible.
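For the EF side, reattaching typically looks something like this (a sketch against the EF4 ObjectContext API; MyEntities, SomeItems and detachedItem are illustrative names):

// The entity was loaded by a context that has since been disposed and was
// modified while detached; a new context now takes over and saves it.
using (var context = new MyEntities())
{
    context.SomeItems.Attach(detachedItem);   // attached in the Unchanged state
    context.ObjectStateManager.ChangeObjectState(detachedItem, EntityState.Modified);
    context.SaveChanges();
}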