I am developing a multi-tier desktop application (Onion architecture) with a WinForms project as the UI, and I use EF Code First to access my DB. For my domain models I want to use POCOs, so I have two choices:
Connected POCOs
Disconnected POCOs
If I use disconnected POCOs, I have to do a lot of work outside of EF and lose the use of many EF features, because:
I have to save and manage each entity's state on the client side.
When I want to save changes to the DB, I have to sync the client-side POCO's state with the DbContext entity's state (see the sketch after this list).
When adding POCOs to a newly created DbContext, I have to take care not to attach two entities with the same key.
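For illustration, the state-syncing described above is often done with a self-tracking pattern along these lines - a minimal sketch, where ObjectState and IObjectWithState are illustrative names rather than an EF API:

```csharp
using System.Data.Entity; // EF6

// Each POCO carries its own state while detached; that state is translated
// into EF's EntityState when the graph is attached to a fresh DbContext.
public enum ObjectState { Unchanged, Added, Modified, Deleted }

public interface IObjectWithState
{
    ObjectState State { get; set; }
}

public static class StateHelper
{
    public static void ApplyStateChanges(DbContext context)
    {
        // Walk every tracked self-tracking entity and apply its client-side state.
        foreach (var entry in context.ChangeTracker.Entries<IObjectWithState>())
        {
            entry.State = ToEntityState(entry.Entity.State);
        }
    }

    private static EntityState ToEntityState(ObjectState state)
    {
        switch (state)
        {
            case ObjectState.Added:    return EntityState.Added;
            case ObjectState.Modified: return EntityState.Modified;
            case ObjectState.Deleted:  return EntityState.Deleted;
            default:                   return EntityState.Unchanged;
        }
    }
}
```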
So it seems natural to use connected POCOs, but in that case I think I face the following challenges:
If I manage the lifetime of my DbContext with an IoC container in a multi-user environment, and keep the DbContexts alive for all users from the time they fetch their POCOs to the time they save their changes back to the DB, I think it takes a large amount of server memory and isn't efficient.
I want to save a related graph of my entities in one transaction.
I have these questions:
How can I use connected POCOs in my multi-tier application?
Which configuration should I use to manage the lifetime of my DbContexts with an IoC container in my case?
Is there any way to build (or change) the whole graph on the client side and then pass it to a DbContext to be saved?
Are there any samples that implement this scenario?
Related
I create an instance of the context per form, and multiple forms may use the same entity.
How do I handle this when saving?
The DbContext is part of your "view state", intended to give you access to the database, hold a copy of the data the user is working on in the Change Tracker, and flush changes back to the database.
In a smart client app the natural scope and lifetime for the DbContext is either global or scoped to a UI element that represents a task or use-case. But in either case you have to prevent the DbContext's Change Tracker from growing too large or from holding on to copies of data too long, as even a single form can stay open all day. A possible rule-of-thumb for these apps is to clear the change tracker after every successful SaveChanges(). Other than that, of course you must avoid long-running transactions so the DbContext doesn't hold a DbConnection open for a long time, but you have to do that anyway. The point is that a DbContext with an empty change tracker can be long-lived.
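As a rough sketch of that rule-of-thumb (note: ChangeTracker.Clear() only exists in EF Core 5+; on EF6 the entries have to be detached manually, as below):

```csharp
using System.Data.Entity; // EF6
using System.Linq;

public static class DbContextExtensions
{
    // Keep the context long-lived, but empty the change tracker after every
    // successful save so it never grows without bound.
    public static void SaveAndReset(this DbContext context)
    {
        context.SaveChanges();

        // EF Core 5+ has a one-liner for this: context.ChangeTracker.Clear();
        // EF6 fallback: detach everything that is still tracked.
        foreach (var entry in context.ChangeTracker.Entries().ToList())
            entry.State = EntityState.Detached;
    }
}
```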
Using a short-lived DbContext is possible, but you lose the services of the Change Tracker and the .Local ObservableCollection, which make data binding really easy. So that's not a silver bullet.
So, on to the question at hand: if you have DbContext-per-form, and you have form-to-form communication of entities which end up getting saved by a different DbContext, you really have no choice but to disconnect the entities from their home DbContext before sending them over.
Or this might indicate that the two forms really should share a view state: create an explicit view-state object that includes the DbContext and the entities, and have the forms share it.
Take a look at the Model-View-ViewModel (MVVM) pattern, which introduces a distinction between the View (the form) and the ViewModel, the data that models the user's interaction with the app for a given use case. It's a useful pattern and is very widely used in XAML-based smart clients. Also, XAML is (still) the future of smart client apps, so it's useful to learn.
I had a hard time naming and wording this question, as there's a lot to unpack, so I apologize in advance. For anyone who spends the time to review and respond to this, I very much appreciate you.
Background:
I have a relatively large ASP.NET MVC5 application using Entity Framework 6 with a SQL Server database. Currently, the solution is split into a few projects, mostly by layer (business, data, etc). There is a single .edmx file and DbContext for the application, and it points to a single database at the moment.
The code/solution above represents the "core" of the system being built. However, this application is customized per client, so each client can have their own modules, pages, logic, etc. Due to this, we have a project in the solution for each client (only a couple right now, but eventually 50+ - is that an issue? Should the solution be split up?). The intention is to be able to deploy just that client's code along with the core, or just the core on its own.
In addition to the custom modules in the code, each client may also have their own custom database, again derived from a Core database. The custom database will always be kept up to date with the core DB, but may have additional objects (tables, stored procedures, etc.). One thing to note: I do not have the option of veering away from this approach - each client will definitely have their own copy of the "core", kept up to date using a push tool developed in-house.
Problem/Question:
With that, each client's database will essentially be the Core database, with the potential for extra objects added in for that client's implementation.
The issue I'm struggling with is how to implement this in Entity Framework in a way that does not require me to add all of those custom DB objects to the Core database, or that at the very least keeps them logically separated, relegated to the client projects. What would be the best way to go about this?
My Idea For Implementation
This is definitely where I am struggling at the moment. I am not really sure if my current idea will work, but I am still investigating and trying to come up with better options.
My current idea is as follows: since I can target a specific schema when generating an EDMX, I could place client-specific objects in a schema for their project and use those to generate a DbContext in each client project/database, which inherits from the Core's DbContext implementation (containing all the "core" objects). ClientA's project would then have an EDMX file with just their custom tables/objects, inheriting all of the core's objects while keeping them separate from other clients' objects.
I'm not completely certain this approach will work (I'm playing with it now); my initial concern is that Entity Framework doesn't appear to generate foreign keys between the contexts. For example, if ClientA's table has a foreign key pointing to a core table, the generation tool doesn't appear to generate that relationship. That said, could I implement this manually and effectively? The core code is database-first, but I could implement the smaller, client-specific items code-first, which I believe would give me far more flexibility. Would this be an effective approach? If not, is there a better approach I could use?
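For illustration, a minimal code-first sketch of the inheritance idea, with hypothetical entity names and the client objects mapped to their own schema:

```csharp
using System.Data.Entity; // EF6

// Hypothetical core entity and context shared by all clients.
public class Company
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class CoreContext : DbContext
{
    public DbSet<Company> Companies { get; set; }
}

// A hypothetical client-specific entity; CompanyId points at a core table,
// though the relationship would have to be configured manually.
public class ClientAWidget
{
    public int Id { get; set; }
    public int CompanyId { get; set; }
}

// ClientA's context inherits everything from the core and adds its own
// objects, mapped to a client-specific schema to keep them separated.
public class ClientAContext : CoreContext
{
    public DbSet<ClientAWidget> Widgets { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        base.OnModelCreating(modelBuilder);
        modelBuilder.Entity<ClientAWidget>().ToTable("Widgets", "ClientA");
    }
}
```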
As a developer in a very similar situation (6 years on a project for multiple clients), I can say that your approach is full of pain. Customising your code per client is a road to hell.
You need to deploy the same code to every client. The core stays the same. Satellite modules developed for a specific client should be written as generically as possible (so you can re-sell them multiple times) and also deployed to everyone. The trick is to have a good toggle system that enables only the right functionality per client.
I.e. there is a controller that saves, for example, company information. Everyone gets the same code, but if a customer BobTheBuilder Ltd. requires special validation for companies, then that code goes into the MyApp.BobTheBuilder.* namespace, and your configuration code should know that it should be executed instead of your general code. Needless to say, this should be done via a DI container, swapping implementations by injecting objects that implement the common interface.
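A minimal sketch of that wiring, assuming an Autofac-style container (the toggle flag and all type names are illustrative):

```csharp
using Autofac;

public class Company { public string Name { get; set; } }

public interface ICompanyValidator
{
    void Validate(Company company);
}

// Shipped to every tenant as part of the core.
public class DefaultCompanyValidator : ICompanyValidator
{
    public void Validate(Company company) { /* general rules */ }
}

// Lives in the MyApp.BobTheBuilder.* namespace; deployed to everyone,
// but only wired up for the tenant that needs it.
public class BobTheBuilderCompanyValidator : ICompanyValidator
{
    public void Validate(Company company) { /* special rules */ }
}

public static class TenantWiring
{
    // The toggle decides which implementation gets injected everywhere
    // an ICompanyValidator is requested.
    public static void Register(ContainerBuilder builder, bool bobValidationEnabled)
    {
        if (bobValidationEnabled)
            builder.RegisterType<BobTheBuilderCompanyValidator>().As<ICompanyValidator>();
        else
            builder.RegisterType<DefaultCompanyValidator>().As<ICompanyValidator>();
    }
}
```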
As for the database - you can have multiple DbContexts that represent your database modules. They can live in the same database, but it is best to separate the modules by schema name. So yes, all those objects go into your codebase. Only, not every tenant will get all the tables - only enabled modules should be activated and create their tenant tables.
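For instance, a hypothetical module context that keeps all of its tables under its own schema using EF6's HasDefaultSchema:

```csharp
using System.Data.Entity; // EF6

// Hypothetical "Billing" module: its own context and its own schema, so the
// module's tables stay cleanly separated inside the shared tenant database.
public class Invoice
{
    public int Id { get; set; }
    public decimal Amount { get; set; }
}

public class BillingContext : DbContext
{
    public DbSet<Invoice> Invoices { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // All tables mapped by this context default to the Billing schema.
        modelBuilder.HasDefaultSchema("Billing");
    }
}
```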
As for a project per customer - that's also a big pain. Imagine you have more than 10 customers and need to update the Newtonsoft.Json package - that usually takes a bit more than forever! We tried that and fell back to namespace-per-customer overrides.
Generally, here is our scheme:
Tenants all get the same codebase deployed to them, but functionality is disabled by toggles
Tenants each get their own database with all the tables and enabled schemas (modules)
Do not customise your core per tenant. All customisations go into modules.
CQRS is recommended, but you can live without it. Though life is a lot easier when you have only a handful of interfaces to think about.
DI is a must. Can't make all that happen without a good container that supports multi-tenancy.
There are modules that do some specific stuff, developed per customer. Each module has its own toggles and is very configurable - so multiple tenants can get the same module but re-configure it independently.
You can implement inheritance with the Entity Framework in an ASP.NET MVC Application:
https://learn.microsoft.com/en-us/aspnet/mvc/overview/getting-started/getting-started-with-ef-using-mvc/implementing-inheritance-with-the-entity-framework-in-an-asp-net-mvc-application
There are a few approaches: Table-per-Hierarchy (TPH) inheritance, Table-per-Type (TPT) inheritance, and Table-per-Concrete-Class (TPC) inheritance.
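For example, a minimal TPT sketch, using the Person/Instructor/Student model from the linked tutorial - each derived type gets its own table, joined on the shared primary key:

```csharp
using System;
using System.ComponentModel.DataAnnotations.Schema;

// Table-per-Type (TPT): the base type and each derived type are stored in
// separate tables; the derived tables share the base table's primary key.
public abstract class Person
{
    public int Id { get; set; }
    public string Name { get; set; }
}

[Table("Instructors")]
public class Instructor : Person
{
    public DateTime HireDate { get; set; }
}

[Table("Students")]
public class Student : Person
{
    public DateTime EnrollmentDate { get; set; }
}
```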
You might also consider a microservice architecture if you're concerned about how the different schemas will integrate.
Entity Framework doesn't appear to generate foreign keys between the contexts.
That approach sounds painful. Using microservices to encapsulate the Core and client DBs as their own entities, you could then use message queues to broker communication between them.
I am currently struggling to find a way to migrate to a new database schema for a database shared by multiple applications, while keeping applications working with the old schema intact.
There are multiple applications performing CRUD operations on the shared database, using a self-written ORM-like library. The main problems I see with this architecture are:
Each application implements its own business logic, with a lot of redundant code, or code that should do the same thing in every application but is implemented differently and is therefore hard to maintain.
Since each application works directly with the ORM library, the other applications cannot know when data was changed by another application without monitoring/polling the database for changes.
The ORM library implements only limited concurrency, supports no transactions, and is relatively slow.
To solve the redundancy/inconsistency problems I am thinking about implementing a layered architecture.
Service Layer
Business Layer
Data Access Layer
Database
The applications then communicate with a SOAP web service on the service layer.
The service layer uses the business layer to perform validation and apply business logic. The business layer uses the data access layer's repositories.
I am hoping to also be able to use the business layer in the client applications, with another repository implementation that does not access the database directly but goes through the SOAP web service.
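A rough sketch of that split, with illustrative names throughout (the SOAP proxy is assumed to be a generated client):

```csharp
using System.Data.Entity; // EF6

// One repository interface, two implementations: the server talks to the
// database through EF, while the client applications go through the SOAP
// web service - so the business layer is reused unchanged on both sides.
public class Machine
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class FacilityContext : DbContext
{
    public DbSet<Machine> Machines { get; set; }
}

public interface IMachineRepository
{
    Machine GetById(int id);
    void Save(Machine machine);
}

// Server side: backed by Entity Framework.
public class EfMachineRepository : IMachineRepository
{
    private readonly FacilityContext _context;
    public EfMachineRepository(FacilityContext context) { _context = context; }

    public Machine GetById(int id) => _context.Machines.Find(id);

    public void Save(Machine machine)
    {
        // Attaching with State = Modified covers the disconnected case.
        _context.Entry(machine).State = EntityState.Modified;
        _context.SaveChanges();
    }
}

// Client side: same interface, but every call is forwarded to the SOAP
// service via a generated proxy (MachineServiceClient is hypothetical).
public class SoapMachineRepository : IMachineRepository
{
    private readonly MachineServiceClient _client;
    public SoapMachineRepository(MachineServiceClient client) { _client = client; }

    public Machine GetById(int id) => _client.GetMachine(id);
    public void Save(Machine machine) => _client.SaveMachine(machine);
}
```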
To solve the other problems, I was hoping to use Entity Framework instead of the self-made ORM library. But the schema of the database is designed in a kind of generic way: for each machine added to the database (the database stores facility data), several machine-specific tables are added. This results in redundant tables named [machinename]_[tablename]. As far as I know, Entity Framework - or any other ORM - cannot deal with that (it's poor design anyway, probably meant to speed up queries).
The plan would be to migrate to another database schema, but the problem is that all the applications using the database would then need to be changed to use the new schema/SOAP web service. This cannot happen overnight, so it would be best if I could keep some of the applications unchanged but still working against the single database, and then later deal with reimplementing the other applications to use the web service.
I have already thought about using views to simulate the old schema, so that the old applications could keep working with the changed schema, but unfortunately the self-made ORM does not support working with views.
I don't expect anyone to present me with a finished solution, but rather some basic approaches and/or ideas to improve the overall architecture of the system.
I have a WPF application and am using Entity Framework. I was reading the tutorial, and the author was talking about disconnected entities, which is the first time I have heard of them. I am a little confused as to when disconnected entities are actually needed, as I have been using EF just fine to do CRUD operations on my business objects. If I am using the same context while doing CRUD operations on a business object, why do I ever need to manually track entity state changes? Thanks for any help.
If you are always keeping around the originating context instance, then you probably do not need to worry about disconnected entities. Disconnected entities often come up in the context of web services or web sites, where the context from which an entity was originally retrieved (and, for example, placed into a Session) is no longer available some time down the road when that entity has been modified and needs to be saved back to the database.
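For completeness, saving a disconnected entity through a brand-new context typically looks something like this (MyContext and Blog are illustrative):

```csharp
using System.Data.Entity; // EF6

public class Blog
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public class MyContext : DbContext
{
    public DbSet<Blog> Blogs { get; set; }
}

public static class DisconnectedSave
{
    // The entity was loaded by an earlier, now-disposed context (e.g. in a
    // previous web request), edited while detached, and is saved here
    // through a fresh context.
    public static void Save(Blog modifiedBlog)
    {
        using (var context = new MyContext())
        {
            // Attaching with State = Modified tells EF to UPDATE all columns,
            // since no original values were tracked.
            context.Entry(modifiedBlog).State = EntityState.Modified;
            context.SaveChanges();
        }
    }
}
```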
We have two web applications, and each one creates its own data context for EF. When I make changes to an entity in one app, I see the changes on the page and in the database when I view the data in SQL Server Management Studio. However, I don't immediately see the changes in the other application.
Both apps use dependency injection, and both use the same business layer and data layer. So the UI in both apps goes through a common controller class (not to be confused with MVC controllers), and the controller goes through the repository for the entity it is retrieving. Because they are different apps, they each have their own instance of the Entity Framework data context.
If there is some kind of caching, how might I turn that off?
Thanks in advance.
EDIT - Maybe the caching is occurring somewhere above EF? Clearing my browser cache doesn't seem to fix the issue. After some time goes by I will suddenly see the update to the record in the other app, but for a while no amount of refreshing will show me the updates.
If the caching is occurring in the context, the cached values should go away when you dispose of the context at the end of the current HTTP request. You do have a per-request context, right?
Here are some context lifetime best practices.
Yes, contexts cache their results.
You should be creating a new context with every query (or implement something like NHibernate's session-per-request).
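For example, with an Autofac-style container, a per-request context is a single registration (MyDbContext is illustrative, and the ASP.NET integration package must provide the request lifetime scope):

```csharp
using Autofac;

public static class ContainerConfig
{
    // Each HTTP request gets its own context instance, disposed automatically
    // when the request ends, so cached entities never outlive the request.
    public static IContainer Build()
    {
        var builder = new ContainerBuilder();
        builder.RegisterType<MyDbContext>() // MyDbContext is illustrative
               .AsSelf()
               .InstancePerRequest();
        return builder.Build();
    }
}
```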