I create an instance of the context per form, and multiple forms may use the same entity.
How do I handle this when saving?
The DbContext is part of your "view state", intended to give you access to the database, hold a copy of the data the user is working on in the Change Tracker, and flush changes back to the database.
In a smart client app the natural scope and lifetime for the DbContext is either global or scoped to a UI element that represents a task or use-case. But in either case you have to prevent the DbContext's Change Tracker from growing too large or from holding on to copies of data too long, as even a single form can stay open all day. A possible rule-of-thumb for these apps is to clear the change tracker after every successful SaveChanges(). Other than that, of course you must avoid long-running transactions so the DbContext doesn't hold a DbConnection open for a long time, but you have to do that anyway. The point is that a DbContext with an empty change tracker can be long-lived.
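As a minimal sketch of that rule of thumb, assuming an EF Core 5+ context variable named context (in EF6 you would detach entries by hand instead):

// Flush pending changes, then empty the tracker so a long-lived
// context doesn't accumulate tracked entities all day.
context.SaveChanges();
context.ChangeTracker.Clear(); // available in EF Core 5 and later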
Using a short-lived DbContext is possible, but you lose the services of the Change Tracker and the .Local ObservableCollection, which make data binding really easy. So that's not a silver bullet.
So, on to the question at hand: if you have DbContext-per-form, and you have form-to-form communication of entities which end up getting saved by a different DbContext, you really have no choice but to disconnect the entities from their home DbContext before sending them over.
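A rough sketch of that hand-off (sourceContext, targetContext, and the Customer entity are placeholder names, not from the original):

// Disconnect the entity from the context that loaded it...
sourceContext.Entry(customer).State = EntityState.Detached;

// ...then attach it to the receiving form's context and mark it dirty.
targetContext.Set<Customer>().Attach(customer);
targetContext.Entry(customer).State = EntityState.Modified;
targetContext.SaveChanges();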
Or this might indicate that the two forms really should share a view state, in which case you should create an explicit view state object that includes the DbContext and the entities, and have both forms share it.
Take a look at the Model-View-ViewModel (MVVM) pattern, which introduces a distinction between the View (the form) and the ViewModel, the data that models the user's interaction with the app for a given use case. It's a useful pattern and is very widely used in XAML-based smart clients. Also, XAML is (still) the future of smart client apps, so it's useful to learn.
I am new to development in C#, so I am still figuring a few things out.
I am trying to stick to good design principles so as to keep my code maintainable. In doing so I am building a Line of Business application using the MVVM and Factory patterns. I am also using Dependency Injection (Unity).
My question is about refreshing the data cache created with Dependency Injection. When I create my View Model the data cache is loaded as follows:
_container.RegisterType<IRepo<Relation>, RelationRepo>(new TransientLifetimeManager());
My scenario is that I have data presented to the users in a GridView. I have a multi-user environment with dynamic information, so it is possible that from time to time the data in the cache becomes obsolete. This may result, as could be expected, in a DbConcurrency error from time to time. What I am grappling with is how to handle this in the correct manner. Should I abort the whole process and have the user reload the application, thereby recreating the DI, or is there an elegant way to refresh the cache and re-present the data after providing the necessary information to the user? Using the second option I could perhaps place a refresh button on the screen or use a timed event to refresh the data so the user can see any variations.
So basically what I'm asking is: how do I keep the cache in sync with the underlying database in real time?
Thanks in advance.
This is a very broad question that you might prefer to take to the Software Engineering Stack Exchange site. There are many approaches for handling this kind of problem.
In any case, you are going to have to handle concurrency exceptions. One approach you could take to reduce the chances of their occurrence is to distribute updates to clients via SignalR: a single update is broadcast via a SignalR hub to all other clients. Still, an update may be broadcast moments after a concurrent update, so you will need some UI feature to explain to the user that things have changed.
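As a rough sketch of the exception-handling side, a "database wins" refresh might look like this (EF6-style API; retry logic and user notification omitted):

try
{
    context.SaveChanges();
}
catch (DbUpdateConcurrencyException ex)
{
    // Another user changed the row first. Reload the database values
    // into the tracked entities, then re-present the fresh data.
    foreach (var entry in ex.Entries)
    {
        entry.Reload();
    }
    // Notify the user that the grid has been refreshed with current data.
}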
Although it's a while ago, I thought I'd post the solution to my problem in case someone else is grappling with the concepts as I was.
The issue was that I was not properly disposing of my ViewModel when I navigated away from the View (or when there was a DB error). Therefore, when I returned to the View, the EF DbContext was still using the local store, i.e. it had not been disposed.
I subsequently implemented the Prism Framework and implemented IRegionMemberLifetime on a base class from which all my ViewModels inherit. This causes the instance of the ViewModel placed in the Region Manager (IRegion) to be removed when it transitions from an activated to a deactivated state.
My (pseudo) base class looks like this:
using Prism.Mvvm;
using Prism.Regions;

public class MyBindableBase : BindableBase, IRegionMemberLifetime
{
    // Returning false tells Prism to discard this instance
    // when its view is deactivated.
    public virtual bool KeepAlive => false;
    ...
}
Working on a WPF application using MVVM and powered by Entity Framework. We were very keen to allow users to multi-window this app, for usability purposes. However, that has the potential to cause problems with EF. If we stick to the usual advice of creating one copy of the Repository per ViewModel and someone opens multiple windows of the same ViewModel, it could cause "multiple instances of IEntityChangeTracker" errors.
Rather than go with a Singleton, which has its own problems, we solved this by putting a Refresh method on the repository that gets a fresh data context. Then we do things like this all over the shop:
using (IRepository r = Rep.Refresh())
{
    r.Update(CurrentCampaign);
    r.SaveChanges();
}
Which is mostly fine. However, it causes problems with maintaining state. If the context is refreshed while a user is working with an object, their changes will be lost.
I can see two ways around this, both of which have their own drawbacks.
1. We call SaveChanges constantly. This has the potential to slow down the application with constant database calls, plus there are occasions when we don't want to store incomplete objects.
2. We copy the EF objects into memory when loaded, have the user work with those, and then add a "Save" button which copies all the objects back to the EF objects and saves. We could do this with an automapper, but it still seems unnecessary faff.
Is there another way?
I believe having the repository for accessing Entity Framework as a singleton may not always be wrong.
If you have a scenario where you have a client-side repository, i.e. a repository which is part of the executable of the client application and is used by a single client, then a singleton might be OK. Of course, I would not use a singleton on the server side.
I asked Brian Noyes (Microsoft MVP) a similar question in an "MVVM" course on the Pluralsight website.
I asked: "What is the correct way to dispose of client services which are used in the view model?"
And in his response he wrote: "...most of my client services are Singletons anyway and live for the life of the app."
In addition having a singleton does not prevent you from writing unit tests, as long as you have an interface for the singleton.
And if you use a dependency injection framework (see Mark's comment), which is a good idea in itself, changing to singleton instantiation is just a matter of a small change in the setup of the injection container for the respective class.
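With Unity, for instance, it's a one-line change from the transient registration shown in the earlier question (IRepo<Relation> and RelationRepo borrowed from there):

// Transient: a new RelationRepo per resolve.
_container.RegisterType<IRepo<Relation>, RelationRepo>(new TransientLifetimeManager());

// Singleton: one RelationRepo instance for the container's lifetime.
_container.RegisterType<IRepo<Relation>, RelationRepo>(new ContainerControlledLifetimeManager());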
I implemented the repository pattern in my software with the help of this article. One question boggles my mind: I think the Database class should implement the singleton pattern, because the user should not create more than one database context using the "var ourDatabase = new Database();" statement. Am I right, or is this not a critical issue for this implementation?
You should not have the database context as a singleton with Entity Framework.
For starters, each context instance tracks all changes made to it and "save changes" saves all the changes. So, if you have a web application and you made your context a singleton then all the users would be updating the same context and when one called "save changes" it would save changes for everyone. In a single-user Windows application this is less of an issue, unless you have different parts of the application working in parallel.
Also be mindful that the context caches data it has already loaded and tracks changes by default. In short, this can mean memory bloat and decreased performance as more and more objects are tracked - though in practice the impact of this varies.
From a performance point of view, Entity Framework implements connection pooling underneath the covers so don't worry about creating and disposing database context objects - it is very cheap.
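So the usual pattern is simply a fresh, short-lived context per operation; a minimal sketch (MyDbContext, Orders, and OrderStatus are placeholder names):

// A new context per unit of work: cheap to create thanks to
// connection pooling, and disposed as soon as the work is done.
using (var db = new MyDbContext())
{
    var order = db.Orders.Find(orderId);
    order.Status = OrderStatus.Shipped;
    db.SaveChanges();
}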
I have been reading a lot of articles explaining how to set up Entity Framework's DbContext so that only one is created and used per HTTP web request using various DI frameworks.
Why is this a good idea in the first place? What advantages do you gain by using this approach? Are there certain situations where this would be a good idea? Are there things that you can do using this technique that you can't do when instantiating DbContexts per repository method call?
NOTE: This answer talks about the Entity Framework's DbContext, but it is applicable to any sort of Unit of Work implementation, such as LINQ to SQL's DataContext and NHibernate's ISession.
Let's start by echoing Ian: having a single DbContext for the whole application is a Bad Idea. The only situation where this makes sense is when you have a single-threaded application and a database that is solely used by that single application instance. The DbContext is not thread-safe, and since the DbContext caches data, it gets stale pretty soon. This will get you in all sorts of trouble when multiple users/applications work on that database simultaneously (which is very common, of course). But I expect you already know that and just want to know why not to simply inject a new instance (i.e. with a transient lifestyle) of the DbContext into anyone who needs it. (For more information about why a single DbContext, or even one context per thread, is bad, read this answer.)
That said, registering a DbContext as transient could work, but typically you want to have a single instance of such a unit of work within a certain scope. In a web application, it can be practical to define such a scope on the boundaries of a web request; thus a Per Web Request lifestyle. This allows you to let a whole set of objects operate within the same context. In other words, they operate within the same business transaction.
If you have no goal of having a set of operations operate inside the same context, in that case the transient lifestyle is fine, but there are a few things to watch:
Since every object gets its own instance, every class that changes the state of the system, needs to call _context.SaveChanges() (otherwise changes would get lost). This can complicate your code, and adds a second responsibility to the code (the responsibility of controlling the context), and is a violation of the Single Responsibility Principle.
You need to make sure that entities [loaded and saved by a DbContext] never leave the scope of such a class, because they can't be used in the context instance of another class. This can complicate your code enormously, because when you need those entities, you need to load them again by id, which could also cause performance problems.
Since DbContext implements IDisposable, you probably still want to Dispose all created instances. If you want to do this, you basically have two options. The first is to dispose them in the same method right after calling context.SaveChanges(), but then the business logic takes ownership of an object it gets passed from the outside. The second option is to Dispose all created instances on the boundary of the HTTP request, but in that case you still need some sort of scoping to let the container know when those instances need to be disposed.
Another option is to not inject a DbContext at all. Instead, you inject a DbContextFactory that is able to create a new instance (I used to use this approach in the past). This way the business logic controls the context explicitly. It might look like this:
public void SomeOperation()
{
    using (var context = this.contextFactory.CreateNew())
    {
        var entities = this.otherDependency.Operate(
            context, "some value");

        context.Entities.AddRange(entities);
        context.SaveChanges();
    }
}
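The factory itself isn't shown above; a minimal sketch might look like this (IDbContextFactory and MyDbContext are assumed names matching the contextFactory.CreateNew() call):

public interface IDbContextFactory
{
    MyDbContext CreateNew();
}

public class DbContextFactory : IDbContextFactory
{
    // Each call hands the caller a fresh context; the caller
    // owns it and is responsible for disposing it.
    public MyDbContext CreateNew() => new MyDbContext();
}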
The plus side of this is that you manage the life of the DbContext explicitly and it is easy to set this up. It also allows you to use a single context in a certain scope, which has clear advantages, such as running code in a single business transaction, and being able to pass around entities, since they originate from the same DbContext.
The downside is that you will have to pass around the DbContext from method to method (which is termed Method Injection). Note that in a sense this solution is the same as the 'scoped' approach, but now the scope is controlled in the application code itself (and is possibly repeated many times). It is the application that is responsible for creating and disposing the unit of work. Since the DbContext is created after the dependency graph is constructed, Constructor Injection is out of the picture and you need to defer to Method Injection when you need to pass on the context from one class to the other.
Method Injection isn't that bad, but when the business logic gets more complex, and more classes get involved, you will have to pass it from method to method and class to class, which can complicate the code a lot (I've seen this in the past). For a simple application, this approach will do just fine though.
Because of the downsides this factory approach has for bigger systems, another approach can be useful: letting the container or the infrastructure code / Composition Root manage the unit of work. This is the style that your question is about.
By letting the container and/or the infrastructure handle this, your application code is not polluted by having to create, (optionally) commit and Dispose a UoW instance, which keeps the business logic simple and clean (just a Single Responsibility). There are some difficulties with this approach. For instance, where do you Commit and Dispose the instance?
Disposing a unit of work can be done at the end of the web request. Many people, however, incorrectly assume that this is also the place to Commit the unit of work. At that point in the application, you simply can't determine for sure that the unit of work should actually be committed. For example, if the business layer code threw an exception that was caught higher up the call stack, you definitely don't want to Commit.
The real solution is again to explicitly manage some sort of scope, but this time do it inside the Composition Root. By abstracting all business logic behind the command/handler pattern, you will be able to write a decorator that can be wrapped around each command handler to do this. Example:
class TransactionalCommandHandlerDecorator<TCommand>
    : ICommandHandler<TCommand>
{
    private readonly DbContext context;
    private readonly ICommandHandler<TCommand> decorated;

    public TransactionalCommandHandlerDecorator(
        DbContext context,
        ICommandHandler<TCommand> decorated)
    {
        this.context = context;
        this.decorated = decorated;
    }

    public void Handle(TCommand command)
    {
        this.decorated.Handle(command);

        // Commit only after the handler completed successfully;
        // an exception skips the SaveChanges call.
        this.context.SaveChanges();
    }
}
This ensures that you only need to write this infrastructure code once. Any solid DI container allows you to configure such a decorator to be wrapped around all ICommandHandler<T> implementations in a consistent manner.
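With Simple Injector, for instance, such a registration is roughly a one-liner (treat this as a sketch rather than a full container setup):

// Wrap every ICommandHandler<T> implementation in the
// transactional decorator shown above.
container.RegisterDecorator(
    typeof(ICommandHandler<>),
    typeof(TransactionalCommandHandlerDecorator<>));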
There are two contradicting recommendations by Microsoft, and many people use DbContexts in completely divergent ways.

One recommendation is to "Dispose DbContexts as soon as possible", because having a DbContext alive occupies valuable resources like DB connections, etc.

The other states that one DbContext per request is highly recommended.
Those contradict each other, because if your request is doing a lot of stuff unrelated to the DB, then your DbContext is kept around for no reason. Thus it is wasteful to keep your DbContext alive while your request is just waiting for random stuff to get done...
So many people who follow rule 1 have their DbContexts inside their "Repository pattern" and create a new instance per database query, so X DbContexts per request. They just get their data and dispose of the context ASAP.
This is considered by MANY people an acceptable practice.
While this has the benefit of occupying your DB resources for the minimum time, it clearly sacrifices all the unit-of-work and caching candy EF has to offer.
Keeping alive a single multipurpose instance of DbContext maximizes the benefits of caching, but since DbContext is not thread-safe and each web request runs on its own thread, a DbContext per request is the longest you can keep it.
So the EF team's recommendation of one DbContext per request is clearly based on the fact that in a web application a unit of work is most likely going to be within one request, and that request has one thread. So one DbContext per request gives the ideal benefit of both the unit of work and caching.
But in many cases this is not true.
I consider logging a separate unit of work, so having a new DbContext for post-request logging in async threads is completely acceptable.
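A rough sketch of that idea (LogDbContext, LogEntries, and LogEntry are hypothetical names):

// Post-request logging as its own unit of work, on its own thread,
// with its own short-lived context.
Task.Run(() =>
{
    using (var logContext = new LogDbContext())
    {
        logContext.LogEntries.Add(new LogEntry { Message = "Request completed" });
        logContext.SaveChanges();
    }
});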
So finally it turns out that a DbContext's lifetime is restricted by these two parameters: unit of work and thread.
Not a single answer here actually answers the question. The OP did not ask about a singleton/per-application DbContext design, he asked about a per-(web)request design and what potential benefits could exist.
I'll reference http://mehdi.me/ambient-dbcontext-in-ef6/ as Mehdi is a fantastic resource:
Possible performance gains.
Each DbContext instance maintains a first-level cache of all the entities it loads from the database. Whenever you query an entity by its primary key, the DbContext will first attempt to retrieve it from its first-level cache before defaulting to querying the database. Depending on your data query pattern, re-using the same DbContext across multiple sequential business transactions may result in fewer database queries being made, thanks to the DbContext first-level cache.
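DbSet.Find is the method that consults this cache first; a quick sketch (Users is an assumed DbSet):

var a = context.Users.Find(42); // hits the database
var b = context.Users.Find(42); // served from the first-level cache; no query
// a and b are the same tracked instance: ReferenceEquals(a, b) == true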
It enables lazy-loading.
If your services return persistent entities (as opposed to returning view models or other sorts of DTOs) and you'd like to take advantage of lazy-loading on those entities, the lifetime of the DbContext instance from which those entities were retrieved must extend beyond the scope of the business transaction. If the service method disposed the DbContext instance it used before returning, any attempt to lazy-load properties on the returned entities would fail (whether or not using lazy-loading is a good idea is a different debate altogether which we won't get into here). In our web application example, lazy-loading would typically be used in controller action methods on entities returned by a separate service layer. In that case, the DbContext instance that was used by the service method to load these entities would need to remain alive for the duration of the web request (or at the very least until the action method has completed).
Keep in mind there are cons as well. That link contains many other resources to read on the subject.
Just posting this in case someone else stumbles upon this question and doesn't get absorbed in answers that don't actually address the question.
I'm pretty certain it is because the DbContext is not at all thread safe. So sharing the thing is never a good idea.
One thing that's not really addressed in the question or the discussion is the fact that DbContext can't cancel changes. You can submit changes, but you can't clear out the change tree, so if you use a per-request context, you're out of luck if you need to throw changes away for whatever reason.
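A common workaround, sketched here, is to walk the change tracker and reset each dirty entry by hand (EF6-style entity states; EF Core 5+ offers ChangeTracker.Clear() instead):

// Manually "cancel" pending changes by resetting every dirty entry.
foreach (var entry in context.ChangeTracker.Entries()
    .Where(e => e.State != EntityState.Unchanged))
{
    switch (entry.State)
    {
        case EntityState.Added:
            entry.State = EntityState.Detached;   // forget new objects
            break;
        case EntityState.Modified:
            entry.CurrentValues.SetValues(entry.OriginalValues);
            entry.State = EntityState.Unchanged;  // undo edits
            break;
        case EntityState.Deleted:
            entry.State = EntityState.Unchanged;  // undelete
            break;
    }
}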
Personally, I create instances of DbContext when needed, usually attached to business components that have the ability to recreate the context if required. That way I have control over the process, rather than having a single instance forced onto me. I also don't have to create the DbContext at each controller startup regardless of whether it actually gets used. Then, if I still want per-request instances, I can create them in the constructor (via DI or manually) or create them as needed in each controller method. Personally I usually take the latter approach, so as to avoid creating DbContext instances when they are not actually needed.
It depends from which angle you look at it too. To me the per request instance has never made sense. Does the DbContext really belong into the Http Request? In terms of behavior that's the wrong place. Your business components should be creating your context, not the Http request. Then you can create or throw away your business components as needed and never worry about the lifetime of the context.
I agree with previous opinions. It is worth saying that if you are going to share a DbContext in a single-threaded app, you'll need more memory. For example, my web application on Azure (one extra-small instance) needs another 150 MB of memory, and I have about 30 users per hour.
[Image: memory-usage graph; the application was deployed at 12 PM.]
What I like about it is that it aligns the unit-of-work (as the user sees it - i.e. a page submit) with the unit-of-work in the ORM sense.
Therefore, you can make the entire page submission transactional, which you could not do if you were exposing CRUD methods with each creating a new context.
Another understated reason for not using a singleton DbContext, even in a single-threaded, single-user application, is the identity map pattern it uses. It means that every time you retrieve data with a query or by id, it will keep the retrieved entity instances in a cache. The next time you retrieve the same entity, it will give you the cached instance, if available, with any modifications you made in the same session. This is necessary so the SaveChanges method does not end up with multiple different entity instances of the same database record(s); otherwise, the context would have to somehow merge the data from all those entity instances.
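The identity map is easy to observe (Customers is an assumed DbSet):

var first = context.Customers.First(c => c.Id == 7);
first.Name = "Changed in memory";

// Retrieving the same record again returns the cached instance,
// in-memory edit included.
var second = context.Customers.Find(7);
// ReferenceEquals(first, second) == true; second.Name == "Changed in memory"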
The reason that is a problem is that a singleton DbContext can become a time bomb, eventually caching the whole database plus the overhead of .NET objects in memory.
There are ways around this behavior, such as using LINQ queries with the .AsNoTracking() extension method. Also, these days PCs have a lot of RAM. But usually that is not the desired behavior.
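For read-only data, the no-tracking form looks like this (Customers again assumed):

// Entities materialized here are never added to the identity map,
// so they can be garbage-collected as soon as they go out of scope.
var active = context.Customers
    .AsNoTracking()
    .Where(c => c.IsActive)
    .ToList();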
Another issue to watch out for with Entity Framework specifically is the combination of creating new entities, lazy loading, and then using those new entities (from the same context). If you don't use IDbSet.Create (vs. just new), lazy loading on an entity doesn't work when it's retrieved out of the context it was created in. Example:
public class Foo
{
    public string Id { get; set; }
    public string BarId { get; set; }

    // lazy-loaded relationship to Bar
    public virtual Bar Bar { get; set; }
}

var foo = new Foo
{
    Id = "foo id",
    BarId = "some existing bar id"
};
dbContext.Set<Foo>().Add(foo);
dbContext.SaveChanges();

// some other code, using the same context
var retrieved = dbContext.Set<Foo>().Find("foo id");
var barProp = retrieved.Bar.SomeBarProp; // fails with a null reference even though BarId is set
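The fix, sketched below, is to let EF6's DbSet<T>.Create() build a change-tracking proxy for the new entity, so lazy loading works even within the creating context:

// Create() returns a proxy subclass of Foo that can lazy-load
// its virtual navigation properties.
var foo = dbContext.Set<Foo>().Create();
foo.Id = "foo id";
foo.BarId = "some existing bar id";

dbContext.Set<Foo>().Add(foo);
dbContext.SaveChanges();

var retrieved = dbContext.Set<Foo>().Find("foo id");
var barProp = retrieved.Bar.SomeBarProp; // lazy load succeeds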