We're working on a WPF application using MVVM, powered by Entity Framework. We were keen to let users multi-window this app for usability. However, that has the potential to cause problems with EF. If we stick to the usual advice of creating one repository per ViewModel and someone opens multiple windows of the same ViewModel, it can cause "multiple instances of IEntityChangeTracker" errors.
Rather than go with a Singleton, which has its own problems, we solved this by putting a Refresh method on the repository that gets a fresh data context. Then we do things like this all over the shop:
using (IRepository r = Rep.Refresh())
{
    r.Update(CurrentCampaign);
    r.SaveChanges();
}
Which is mostly fine. However, it causes problems with maintaining state. If the context is refreshed while a user is working with an object, their changes will be lost.
I can see two ways round this, both of which have their own drawbacks.
We call SaveChanges constantly. This has the potential to slow down the application with constant database calls. Plus there are occasions when we don't want to store incomplete objects.
We copy EF objects into memory when they're loaded, have the user work with those, and then add a "Save" button which copies the objects back onto the EF entities and saves. We could do this with AutoMapper, but it still seems unnecessary faff.
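For illustration, that second option might look roughly like this (Campaign, CampaignEditModel and MyDbContext are hypothetical names; in EF, Entry(...).CurrentValues.SetValues copies matching scalar properties by name):

```csharp
using Microsoft.EntityFrameworkCore;

// Detached copy the user edits in the view; hypothetical shape.
public class CampaignEditModel
{
    public string Name { get; set; }
    public decimal Budget { get; set; }
}

public static class CampaignSaver
{
    // Called from the "Save" button: copy the edited values back
    // onto the EF-tracked entity, then persist in one go.
    public static void Save(MyDbContext context, Campaign entity, CampaignEditModel edit)
    {
        context.Entry(entity).CurrentValues.SetValues(edit); // copies properties matching by name
        context.SaveChanges();
    }
}
```

For a handful of properties this avoids pulling in a full mapping library; AutoMapper does the same copying with less boilerplate once the models grow.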
Is there another way?
I believe having the repository that accesses Entity Framework as a singleton may not always be wrong.
If you have a scenario with a client-side repository, i.e. a repository that is part of the executable of the client application and is used by a single client, then a singleton might be OK. Of course I would not use a singleton on the server side.
I asked Brian Noyes (a Microsoft MVP) a similar question in an MVVM course on the Pluralsight website.
I asked: "What is the correct way to dispose of client services which are used in the view model?"
And in his response he wrote: "...most of my client services are Singletons anyway and live for the life of the app."
In addition having a singleton does not prevent you from writing unit tests, as long as you have an interface for the singleton.
And if you use a dependency injection framework (see Mark's comment), which is a good idea in itself, switching to singleton instantiation is just a matter of a small change in the setup of the injection container for the respective class.
Related:
I create an instance of the context per form, and multiple forms may use the same entity. How should I handle this when saving?
The DbContext is part of your "view state", intended to give you access to the database, hold a copy of the data the user is working on in the Change Tracker, and flush changes back to the database.
In a smart client app the natural scope and lifetime for the DbContext is either global or scoped to a UI element that represents a task or use-case. But in either case you have to prevent the DbContext's Change Tracker from growing too large or from holding on to copies of data too long, as even a single form can stay open all day. A possible rule-of-thumb for these apps is to clear the change tracker after every successful SaveChanges(). Other than that, of course you must avoid long-running transactions so the DbContext doesn't hold a DbConnection open for a long time, but you have to do that anyway. The point is that a DbContext with an empty change tracker can be long-lived.
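As a concrete illustration of that rule of thumb: EF Core 5 and later expose ChangeTracker.Clear(); on earlier versions you would have to detach the tracked entries one by one. A minimal sketch:

```csharp
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

public static class ContextExtensions
{
    // Save, then empty the tracker so a long-lived context doesn't
    // accumulate stale entries over a form that stays open all day.
    public static async Task SaveAndResetAsync(this DbContext context)
    {
        await context.SaveChangesAsync();
        context.ChangeTracker.Clear(); // EF Core 5+
    }
}
```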
Using a short-lived DbContext is possible, but you lose the services of the Change Tracker and the .Local ObservableCollection, which make data binding really easy. So that's not a silver bullet.
So on to the question at-hand, if you have DbContext-per-form, and you have form-to-form communication of entities which end up getting saved by a different DbContext, you really have no choice but to disconnect the entities from their home DbContext before sending them over.
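A minimal sketch of that hand-off, assuming EF Core (the Campaign entity and the two context parameters are placeholders):

```csharp
using Microsoft.EntityFrameworkCore;

public static class EntityHandOff
{
    // Disconnect the entity from its home context, then attach it
    // to the receiving form's context before saving there.
    public static void HandOff(DbContext sourceContext, DbContext targetContext, Campaign campaign)
    {
        sourceContext.Entry(campaign).State = EntityState.Detached; // leave the home tracker
        targetContext.Update(campaign);   // attach to the other context, marked Modified
        targetContext.SaveChanges();
    }
}
```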
Or this might indicate that the two forms really should share a view state, in which case you should create an explicit view-state object that includes the DbContext and the entities, and have both forms share that.
Take a look at the Model-View-ViewModel (MVVM) pattern, which introduces a distinction between the View (form) and the ViewModel, the data that models the user's interaction with the app for a given use case. It's a useful pattern and is very widely used in XAML-based smart clients. Also, XAML is (still) the future of smart client apps, so it's useful to learn.
I am new to development in C#, so I'm still figuring a few things out.
I am trying to stick to good design principles so as to keep my code maintainable. To that end, I am building a line-of-business application using the MVVM and Factory patterns. I am also using dependency injection (Unity).
My question is about refreshing the data cache created with Dependency Injection. When I create my View Model the data cache is loaded as follows:
_container.RegisterType<IRepo<Relation>, RelationRepo>(new TransientLifetimeManager());
My scenario is that I have data presented to the users in a GridView. It's a multi-user environment with dynamic information, so it is possible that from time to time the data in the cache becomes obsolete. This may result, as you'd expect, in an occasional DbConcurrency error. What I am grappling with is how to handle this in the correct manner. Should I abort the whole process and have the user reload the application, thereby recreating the DI container, or is there an elegant way to refresh the cache and re-present the data after informing the user? With the second option I could perhaps place a refresh button on the screen, or use a timed event to refresh the data so the user can see any variations in it.
So basically what I'm asking is how do I keep the Cache in sync with the underlying Data Base in real-time?
Thanks in advance.
This is a very broad question that you might prefer to take to the Software Engineering Stack Exchange site. There are many approaches to handling this kind of problem.
In any case you are going to have to handle concurrency exceptions. One approach you could take to reduce the chances of their occurrence is distributing updates to clients via SignalR: a single update is pushed through a SignalR hub to all other clients. Still, an update may be broadcast moments after a concurrent one, so you will need some UI feature to explain that things have changed.
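Independently of how updates are distributed, the concurrency exception itself can be handled by reloading the conflicting entities and letting the user retry. A database-wins sketch, using the EF Core namespace (EF 6 has the same exception type in System.Data.Entity.Infrastructure):

```csharp
using Microsoft.EntityFrameworkCore;

public static class ConcurrencySaver
{
    // Returns true if the save succeeded; false if a conflict occurred
    // and the entities were refreshed with the current database values.
    public static bool TrySave(DbContext context)
    {
        try
        {
            context.SaveChanges();
            return true;
        }
        catch (DbUpdateConcurrencyException ex)
        {
            // Database-wins: discard our stale values, reload current ones,
            // then let the UI re-present the data so the user can decide.
            foreach (var entry in ex.Entries)
                entry.Reload();
            return false;
        }
    }
}
```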
Although it's a while ago, I thought I'd post the solution to my problem in case someone else is grappling with these concepts as I was.
The issue was that I was not properly disposing my ViewModel when I navigated away from the View (or when there was a DB error). Therefore, when I returned to the View, the EF DbContext was still using the local store, i.e. it had not been disposed.
I subsequently implemented the Prism Framework and implemented IRegionMemberLifetime on a base class from which all my ViewModels inherit. This causes the instance of the ViewModel placed in the Region Manager (IRegion) to be removed when it transitions from an activated to deactivated state.
My (pseudo) base class looks like this:
using Prism.Mvvm;
using Prism.Regions;
public class MyBindableBase : BindableBase, IRegionMemberLifetime
{
public virtual bool KeepAlive => false;
...
}
I'm creating my first application using EntityFramework. I'm using Entity Framework Core and MVVMLight.
I created a DbContext descendant class, and I would like to know when to instantiate this DbContext.
My first thought was to create 1 instance for each View.
Imagine the following scenario:
I have an Item class
I create an ItemsViewModel to manage the list of items. In this view model, I add a property for the DbContext
When the user double-clicks an item, it's displayed in a detail view, associated with an ItemViewModel. This view model also has an instance of my DbContext.
When the user quits the detail view:
If he saved, I update the DbContext of the list
If he canceled, the list doesn't have to be updated
Is this a correct way of doing things? I've read somewhere that one should have only one instance of the DbContext, but in that case every modification made in the detail view is propagated to the list view, even if the detail view was cancelled.
Many thanks for listening
Since you're developing a WPF application, you can use a context instance per form.
Here it is from the EF team:
When working with Windows Presentation Foundation (WPF) or Windows Forms, use a context instance per form. This lets you use the change-tracking functionality that the context provides.
I would also recommend using the repository pattern with dependency injection (DI). Then you don't need to worry about the instantiation and disposal of the DbContext; those are automatic.
Since you're using EF Core, you can use Autofac as a DI container.
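A rough Autofac sketch of context-per-view (ItemsContext and ItemsViewModel are placeholder names from the question's scenario):

```csharp
using Autofac;

var builder = new ContainerBuilder();
// One DbContext instance per lifetime scope, shared by everything
// resolved from that scope.
builder.RegisterType<ItemsContext>().InstancePerLifetimeScope();
builder.RegisterType<ItemsViewModel>();
var container = builder.Build();

// One lifetime scope per view: the context is created when the view
// opens and disposed automatically when the scope is disposed.
using (var scope = container.BeginLifetimeScope())
{
    var vm = scope.Resolve<ItemsViewModel>(); // gets the scope's ItemsContext injected
} // disposing the scope also disposes the ItemsContext
```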
Good Article for you : How to work with DbContext
Another good article explains how to implement a decoupled, unit-testable, N-tier architecture based on the generic repository pattern with Entity Framework, an IoC container, and dependency injection. Yes, that article is for MVC, but you can still get a good understanding of the pattern from it.
Generic Repository and Unit of Work Pattern, Entity Framework,Autofac
There are tons of articles and SO questions on this; google for "DbContext lifetime desktop application". Also, this MSDN Magazine article might be helpful; although it discusses the case of NHibernate, the rules are exactly the same.
Data Access - Building a Desktop To-Do Application with NHibernate
The recommended practice for desktop applications is to use a session per form, so that each form in the application has its own session. Each form usually represents a distinct piece of work that the user would like to perform, so matching session lifetime to the form lifetime works quite well in practice. The added benefit is that you no longer have a problem with memory leaks, because when you close a form in the application, you also dispose of the session. This would make all the entities that were loaded by the session eligible for reclamation by the garbage collector (GC).
There are additional reasons for preferring a single session per form. You can take advantage of NHibernate’s change tracking, so it will flush all changes to the database when you commit the transaction. It also creates an isolation barrier between the different forms, so you can commit changes to a single entity without worrying about changes to other entities that are shown on other forms.
While this style of managing the session lifetime is described as a session per form, in practice you usually manage the session per presenter.
As for the "1 instance of DbContext", this also is commented there:
A common bad practice with [...] desktop applications is to have a single global session for the entire application.
and reasons are discussed below.
Not sure if there's an "official" name, but by DataContext I mean an object which transparently maintains objects' state, providing change tracking, unit-of-work functionality, concurrency handling, and probably many other useful features. (In Entity Framework it's ObjectContext; in NHibernate, ISession.)
Eventually I've come to the idea that something like this should be implemented in my application (it uses MongoDB as a back end, and MongoDB's partial updates work well when we're able to track individual property changes).
So actually, I've got several questions on this subject
Could anyone formulate the requirements for a DataContext? What's your understanding of its tasks and responsibilities? (The most relevant treatment I've managed to find is Esposito's book, but unfortunately that's at about MSDN-samples level.)
What would you suggest for the change-tracking implementation? (In the simplest case it's possible to track changes "manually" in the entities, but that requires extra coding and mixes the DAL with business logic, so I'm mostly interested in an "automatic" way that keeps the entities closer to plain POCOs.)
Is there a way to reuse an existing solution? (I hoped the NHibernate infrastructure would allow plugging in a custom module to work with Mongo behind the scenes, but I'm not sure it supports non-SQL databases at all.)
The DataContext (ObjectContext or DbContext in EF) is nothing other than an implementation of the Unit of Work (UoW) and Repository patterns.
I suggest Fowler's book Patterns of Enterprise Application Architecture, in which he outlines the implementation of several persistence patterns. That might be a help in implementing your own solution.
A DataContext basically needs to fulfill the job of a UoW. It needs to handle the reading and management of objects involved in a given lifecycle (e.g. an HTTP request), such that no two objects in memory represent the same record in the DB. Moreover, it needs to provide change tracking for performing partial updates to the DB (as you already mentioned).
As for change tracking, I fully agree that polluting properties with change events etc. is bad. One of the templates introduced in EF 4.1 uses proxies to handle that while still allowing plain POCOs.
Answer to question 2: to make POCO classes work, you will need to generate code at run time, possibly using System.Reflection.
If you analyse Entity Framework, you will see that it requires virtual properties to do change tracking. That is because it needs to create a generated class at run time that overrides each property and adds code to tell the DataContext when someone changes that property.
Entity Framework also generates code to initialize collections, so that when someone tries operations like Add and Remove, the collection object itself knows what to do.
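A stripped-down, hand-written illustration of what such a generated proxy effectively does (EF emits the equivalent at run time, which is why the properties must be virtual; the callback here stands in for the real tracker plumbing):

```csharp
using System;

public class Item
{
    public virtual string Name { get; set; } // virtual so a proxy can override it
}

// What the run-time-generated class boils down to: intercept the
// setter and report the change back to the tracking context.
public class ItemChangeProxy : Item
{
    private readonly Action<string> _reportChange;

    public ItemChangeProxy(Action<string> reportChange) => _reportChange = reportChange;

    public override string Name
    {
        get => base.Name;
        set
        {
            base.Name = value;
            _reportChange(nameof(Name)); // the DataContext marks the property dirty
        }
    }
}
```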
I have a Windows Forms app and server-side services based on an ADO.NET Data Service.
Is it bad practice to create and initialize one static data service client in the Windows app and use it across the whole program? For example, I could use it in all open forms (which have bindings to the service data context's objects) to call SaveChanges() and not lose tracking. Or is creating a service client instance for every new form better (because I think that after some time with one static client there will be huge memory growth)? But when I create a new client for every form, I assume I create a new connection to the service every time.
Maybe I'm wrong and a bit confused about using services in a client application. Please help me understand the right way this works.
Actually the DataServiceContext class doesn't create a connection to the service. The OData protocol it uses is based on REST, and as such it's stateless. So creating the context alone doesn't even touch the service. Each operation (query, save changes) issues a separate, standalone request to the service. From the point of view of the service, it's just a number of unrelated requests.
As noted above, it's usually a good idea to have a separate context for each "section" of your application. What that means exactly depends on your app. If you are not going to load/track a huge number of entities (thousands, at least), then one context might be fine. On the other hand, several contexts give you the ability to "cancel" update operations by simply dropping the context and not calling SaveChanges, which can be handy in some applications.
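The "cancel by dropping the context" idea in miniature (the service URI is made up, and the confirmation flag stands in for an OK/Cancel dialog result):

```csharp
using System;
using Microsoft.OData.Client;

// Each form/section gets its own context; cancelling an edit is
// simply discarding the context without calling SaveChanges.
var ctx = new DataServiceContext(new Uri("http://example.com/odata/"));
// ... load entities, let the user edit them, call ctx.UpdateObject(entity) ...

bool userConfirmed = false; // e.g. set from the dialog result
if (userConfirmed)
    ctx.SaveChanges();      // only now are the pending changes sent to the service
// else: just let ctx go out of scope — nothing is ever sent
```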
I would say: it depends. ;) Your problem is similar to the decision you have to make when using Entity Framework directly, so I recommend searching for articles on that subject and extracting their main points.
My own experience with EF tells me, that an application with several workflows should have a context for every workflow. Especially, when more than one workflow can be started at the same time and the user can switch between them.
If the application is simple, using only one context is a perfectly proper approach.