I'm struggling to find the right solution for my application architecture. My application has a single class for customers, but the data for filling my customer objects is spread over multiple different types of data sources. The main part is exposed in a read-only Oracle database, other parts are exposed through web services, and I need to save some extra data to another data source (for instance an MS SQL database using Entity Framework), since I only have read-only rights for most data sources (they are managed somewhere else).
For this reason I want to build some kind of central library with connectors to all of my data sources, for creating a centralized Customer object to work with. So far so good for this idea (I think), but I can't find any documentation or examples with best practices for how to achieve such a solution.
EXAMPLE:
* Main Application (multiple applications)
- Central Business Logic Layer (Business-API)
* Webservice Connector
* Oracle Connector
* EntityFramework Connector
Does anyone know if there is some good reading material on this specific subject?
Kind regards
The specific problem you describe with customer objects sounds a lot like the one solved by the Data Mapper pattern, which is technically a kind of Mediator. Quoting from the Wikipedia page for Data Mapper:
A Data Mapper is a Data Access Layer that performs bidirectional transfer of data between a persistent data store (often a relational database) and an in memory data representation (the domain layer). The goal of the pattern is to keep the in memory representation and the persistent data store independent of each other and the data mapper itself. The layer is composed of one or more mappers (or Data Access Objects), performing the data transfer. Mapper implementations vary in scope. Generic mappers will handle many different domain entity types, dedicated mappers will handle one or a few.
Although the description above speaks of a single persistent data store, there's no reason why it couldn't be several data locations (the Mediator pattern hides the details from the collaborators).
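To make that concrete for the customer case, here is a minimal sketch of such a mapper sitting in front of the three sources. Every name below (the connector interfaces, the flat CustomerModel class and its properties) is invented for illustration, not taken from the question:

public interface IOracleCustomerSource { string GetCustomerName(int customerId); }
public interface ICustomerWebService { string GetCustomerEmail(int customerId); }
public interface IExtraCustomerStore
{
    string GetNotes(int customerId);
    void SaveNotes(int customerId, string notes);
}

public class CustomerModel
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
    public string Notes { get; set; }
}

public class CustomerMapper
{
    private readonly IOracleCustomerSource _oracle;   // read-only master data
    private readonly ICustomerWebService _webService; // read-only contact data
    private readonly IExtraCustomerStore _extraStore; // the only writable store

    public CustomerMapper(IOracleCustomerSource oracle, ICustomerWebService webService, IExtraCustomerStore extraStore)
    {
        _oracle = oracle;
        _webService = webService;
        _extraStore = extraStore;
    }

    public CustomerModel Find(int customerId)
    {
        // Each source contributes part of the single in-memory customer.
        return new CustomerModel
        {
            Id = customerId,
            Name = _oracle.GetCustomerName(customerId),
            Email = _webService.GetCustomerEmail(customerId),
            Notes = _extraStore.GetNotes(customerId)
        };
    }

    public void Save(CustomerModel customer)
    {
        // Only the writable store is updated; the other sources stay read-only.
        _extraStore.SaveNotes(customer.Id, customer.Notes);
    }
}

The calling code only ever sees CustomerMapper and CustomerModel; where each piece of data physically lives stays hidden behind the mapper.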
There is also an extension of the Data Mapper pattern, known as the Repository pattern.
I suggest the DAO pattern to abstract away any data access. The business logic should not be aware of any data sources; this is the most important aim. Everything else is subordinate to it.
You can create a constructor that accepts the data sources, like:
public class Customer
{
    private readonly OracleConnector oracle;
    private readonly WebServiceConnector webservice;
    private readonly EntityConnector entity;

    public string Name { get; private set; }

    public Customer(OracleConnector oracle, WebServiceConnector webservice, EntityConnector entity)
    {
        this.oracle = oracle;
        this.webservice = webservice;
        this.entity = entity;
    }

    public void Fetch()
    {
        // Fetch data from Oracle, the web service, and Entity Framework.
        this.Name = oracle.GetCustomerName();
        // ... fill the remaining properties from the other connectors.
    }
}
This way only Customer knows how to get the data; all the logic is in one place. You can make it even more testable and loosely coupled by creating interfaces for the connectors.
public interface IOracleConnector
{
    // Add the connector's operations here.
    string GetCustomerName();
}

public class OracleConnector : IOracleConnector
{
    // Add the implementation here, e.g. query the read-only Oracle database.
    public string GetCustomerName()
    {
        throw new NotImplementedException();
    }
}
Then change the Customer constructor to accept IOracleConnector, like:
public Customer(IOracleConnector oracle, WebServiceConnector webservice, EntityConnector entity)
{
    // your code here.
}
Now, you can create a mock to test Customer without actually connecting to the database.
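For example, a minimal hand-rolled fake (FakeOracleConnector, the test class and the NUnit attributes are just for illustration):

using NUnit.Framework;

public class FakeOracleConnector : IOracleConnector
{
    public string GetCustomerName()
    {
        return "Test Customer";
    }
}

public class CustomerTests
{
    [Test] // NUnit shown here; any test framework works the same way
    public void Fetch_ReadsNameFromOracleConnector()
    {
        var customer = new Customer(new FakeOracleConnector(), null, null);
        customer.Fetch();
        Assert.AreEqual("Test Customer", customer.Name);
    }
}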
Related
I'm a beginner with repositories and layered applications, and I don't understand well what the interaction and relationship between the repositories and the business layer classes should be.
Here is an example of a purchase order in 3 layers; I would like you to review whether it is correct or not and give your corrections.
For the data access layer, the OrderRepository:
namespace Dal
{
    public class RepositoryOrder
    {
        POrderContext context = new POrderContext();

        public IEnumerable<Order> GetAll()
        {
            return context.Orders;
        }

        // Following code
    }
}
For the order item repository:
namespace Dal
{
    public class RepositoryOrderItem
    {
        POrderContext context = new POrderContext();

        public IEnumerable<OrderItem> GetAllItemById(Order o)
        {
            return context.OrderItems.Where(i => i.OrderId == o.Id);
        }

        public OrderItem GetItemById(int id)
        {
            return context.OrderItems.Find(id);
        }

        // Following code
    }
}
For the business layer, here is the OrderBLL class:
namespace BLL
{
    public class OrderBLL
    {
        RepositoryOrder repoOrder = new RepositoryOrder();
        RepositoryOrderItem repoItem = new RepositoryOrderItem();

        public IList<Order> GetAll()
        {
            return repoOrder.GetAll().ToList();
        }

        public decimal GetPrixTotal(Order order)
        {
            return repoItem.GetAllItemById(order)
                           .Sum(item => item.Prix * item.Quantite);
        }
    }
}
Should the total price calculation be done at the repository level or at the BLL level (we could write this LINQ query with the context directly in the repository)?
The CRUD methods are implemented in the repository and called from the BLL; is that right?
Does the Where method in the LINQ query belong to the business logic or to the repository (data access layer), given that it encodes certain business rules?
I'm sure this question will be voted down as "primarily opinion based" but before that happens I'll jump in to give my "primarily opinion based" answer :-)
There are two ways to partition a database application, and the choice depends on how complex and large it will be. Entity Framework examples tend to present a very simplistic model, where the EF data classes are exposed to the business layer, which then exposes them to the view model or other layers. This may be fine for simple applications, but for more complex ones, ones where the data store is not an RDBMS (e.g. NoSQL), or ones where you want separation between business and repository data structures, it is too simple.
The repository layer should have a set of classes which describe how the data is accessed from the repository. If you have an RDBMS these might be EF POCO classes, but if you have a web-service endpoint as your repository they may be SOAP documents, REST structures, or other data transfer objects. For an RDBMS like SQL Server that uses exclusively stored procedures for accessing its data, the repository layer might simply be a set of classes which mirror the naming, parameters, and data sets returned by the stored procedures. Note that the data structures returned by anything other than an RDBMS might not be coherent - i.e. a "Customer" concept returned by one method call in the repository might be a different data structure to a "Customer" returned by a different call. In this case the repository classes would not suit EF.
Moving to the business object layer - this is where you create a model of the business domain, using data classes, validation classes and process class models. For instance, a process class for recording a sales order might combine a business Customer, a business Sales Order and a business Product Catalog data concept, and tie in a number of validation classes to form a single atomic business process. These classes might (if you are doing a very lightweight application) be similar to the data at the repository layer, but they should be defined separately. It's in this layer that you hold calculated concepts such as "Sales Order Total", "VAT Calculation" or "Shipping Cost". They might, or might not, get stored in your repository, but the definition of what they mean is modelled in the business layer.
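As a rough illustration of that separation (all class and property names below are invented for this sketch), the business class owns the calculated concepts even if they are never persisted:

using System;
using System.Collections.Generic;
using System.Linq;

public class BusinessSalesOrder
{
    private readonly List<BusinessOrderLine> _lines = new List<BusinessOrderLine>();

    public decimal VatRate { get; set; }

    public void AddLine(BusinessOrderLine line)
    {
        // Validation lives with the business model, not with the repository classes.
        if (line.Quantity <= 0)
            throw new ArgumentException("Quantity must be positive.");
        _lines.Add(line);
    }

    // Calculated concepts are defined here, whether or not they are ever persisted.
    public decimal SalesOrderTotal { get { return _lines.Sum(l => l.UnitPrice * l.Quantity); } }
    public decimal VatAmount { get { return SalesOrderTotal * VatRate; } }
}

public class BusinessOrderLine
{
    public decimal UnitPrice { get; set; }
    public int Quantity { get; set; }
}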
The business layer provides the classes whose data is copied across into a view model. These classes again can be very similar to (and in the simplest of cases, identical to) the repository classes, but their job is to model the user interface and user interaction. They might contain only some of the data from the business data classes, depending on the requirements of the UI. These classes should carry out user-interface-based validation, which they might delegate to the business tier, or to which they might add additional validation. The job of these classes is to manage the state machine that is the user interface.
My summary is that in a large-scale system you have three sets of classes: data repository interaction, business model interaction, and user interface interaction. Only in the simplest of systems are they modelled as a single set of physical POCO classes.
I'm trying to implement the Repository pattern with UoW for our small application.
As far as I have read, the main idea of the UoW is to expose the repositories via one context and to save everything in one step, e.g. within a transaction, to make sure the operations are atomic.
Everything is fine so far, and I checked examples like http://www.asp.net/mvc/tutorials/getting-started-with-ef-5-using-mvc-4/implementing-the-repository-and-unit-of-work-patterns-in-an-asp-net-mvc-application
The problem: we have a logical business model which gets data from the TFS API and from a SQL database using Entity Framework. So we have separate systems and no shared context.
A UoW still seems a good idea to implement (rollback if one system could not save properly, etc.), but does it need a repository as well? If I create a repository which just holds a list of the logical business model objects and gets the data from EF/the TFS API, I could do everything I'd like to in it.
Did I understand these two concepts wrong? Or is our environment just not suited for these patterns?
Thanks in advance
Matthias
Edit: To show what I mean:
We have a DataLayer.TFS project, from which we can query TFS-relevant data like this:
IEnumerable<WorkItem> tfsItems = TFS.QueryByParameter(criterias);
And we have a DataLayer.SQL project, where we use EF and query data like this:
using (var context = new SimpleContextContainer())
{
    // Some tables to lists etc.
}
Then we have an ugly procedure called Concat, which merges the two lists:
private static List<NewItem> Concat(IEnumerable<WorkItem> tfsItems, IEnumerable<OldDBItem> dbItems)
{
    List<NewItem> result = new List<NewItem>();

    foreach (var tfsItem in tfsItems)
    {
        var tmpItem = new NewItem();
        tmpItem.PbiId = tfsItem.Id;
        // Do this for all properties.
        result.Add(tmpItem);
    }

    return result;
}
And we have an opposite procedure for splitting the data.
It is not easy to tell what your architecture is here. But I think you can go with something like this:
Business-Layer: Work with your domain objects, your "business".
Domain-Layer: Your domain objects, accessing data from repositories to build them, merge them, provide constraints, etc
Data-Access-Layer: Repositories, one for each source
Persistence-Layer: Unit of Work
Data-Layer: SQL Database, TFS
The Data-Access-Layer and the persistence layer can be merged in the case of TFS. For the SQL database you get a context from EF, which in that case should be used in the data access layer. So you have methods like AddEntities or DeleteEntities there.
You can have constraints on the different layers of course.
The repository usually gives back fully functional domain objects. So if you achieve that, you are on the right track, I think. The domain layer that I described above is now in place, as far as I understood your comments. Perhaps you can and should get rid of it and replace it with a single assembly that defines your business objects and is used by both the repository and your business layer.
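To make the idea a bit more concrete, here is a minimal sketch of a unit of work spanning both sources. ITfsConnector, IRepository<T> and NewItemRepository are names invented for this sketch (they are not in your code), and since TFS has no ambient transaction a real implementation would need compensation logic for rollback:

using System;
using System.Collections.Generic;

public interface IRepository<T>
{
    IEnumerable<T> GetAll();
    void Add(T item);
}

public class MixedUnitOfWork : IDisposable
{
    private readonly SimpleContextContainer _context = new SimpleContextContainer();
    private readonly ITfsConnector _tfs; // assumed abstraction over the TFS API

    public MixedUnitOfWork(ITfsConnector tfs)
    {
        _tfs = tfs;
        Items = new NewItemRepository(_context, _tfs); // merges both sources into NewItem objects
    }

    public IRepository<NewItem> Items { get; private set; }

    public void Commit()
    {
        _context.SaveChanges(); // save the EF side first
        _tfs.Save();            // assumed method; if it throws, compensation logic would be needed
    }

    public void Dispose()
    {
        _context.Dispose();
    }
}

The business layer then only ever talks to Items and Commit(), never to EF or the TFS API directly.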
I am trying to grasp the idea of the repository pattern and to implement it in database structures I've already set up in the past. I'm now trying to find the best practice for working with my lookup tables. I've created a test project to play around with, and this is my database model:
You can see that I have three tables for the lookups: Lookup, Language and LookupLanguage. The Language table simply contains the languages.
The Lookup table holds the different types used throughout the models.
And LookupLanguage links both tables together:
I've created a new project with all the models mapped 1-to-1 to the database tables:
I also created a generic repository and a generic CrudService interface:
public interface ICrudService<T> where T : IsActiveEntity, new()
{
    int Create(T item);
    void Save();
    void Delete(int id);
    T Get(int id);
    IEnumerable<T> GetAll();
    IEnumerable<T> Where(Expression<Func<T, bool>> func, bool showDeleted = false);
    void Restore(int id);
}
Now, according to the following post: When implementing the repository pattern should lookup value / tables get their own Repository?, the repository should hide the underlying database layer. So I think I need a new implementation of a service and/or repository to get the lookups, but then where do I specify which language I need for the lookup?
Let's take the status (new, accepted, refused) from the company as an example.
The company model is as follows:
public partial class Company : IsActiveEntity
{
    [Required]
    [MaxLength(50)]
    public string CompanyName { get; set; }

    public System.Guid StatusGuid { get; set; }

    [ForeignKey("StatusGuid")]
    public virtual Lookup Status { get; set; }
}
I guess I don't need a separate implementation of a repository?
But I do need a separate CompanyService implementation:
interface ICompanyService : ICrudService<Company>
{
    IQueryable<LookupLanguage> GetStatuses(Guid languageguid);
    LookupLanguage GetStatus(Guid statusguid, Guid languageguid);
}
Is this the correct approach, or do I miss something here?
Creating a generic LookupRepository is, in your case, the better option because of your table schema and from a maintenance perspective.
I'm not sure whether you are using both the Service Locator and Repository patterns or just Repository, given the name ICompanyService. But regardless, I agree that repositories should not always map 1-to-1 to tables, even though they do most of the time.
The SO link you provided has a different table structure than yours: you have a generic lookup table, whereas the link has a separate table for each lookup. In the case where you have separate tables, it makes sense for the lookup method to live alongside the entity repository, since you will have separate code to fetch the data for each lookup (as they have separate tables with different schemas).
But in your case you have a single table that stores all the lookup types for each language, so it makes sense to have a single LookupRepository that returns the various types of lookups based on Language and LookupType. If you create each lookup method in a separate entity repository (like GetStatuses in CompanyRepository and GetStatuses in ContactRepository), you will have to repeat the logic in each repository.
Think about it: if you change the schema of the lookup table (say, add a column) and you want to test all the places the lookups are used, it will be a nightmare if you have lookup methods all over the place, and pretty easy if you have one method in a LookupRepository.
interface ILookupService : ICrudService<Lookup>
{
    IQueryable<Lookup> GetStatuses(Guid languageguid, LookupType lookupType);
    Lookup GetStatus(Guid statusguid, Guid languageguid, LookupType lookupType);
}
As regards your question, "Is this the correct approach" - this entirely depends on your specific needs.
What you have done doesn't seem to have any real issues. You have implemented the repository pattern using generics which is great. You are using interfaces for your repositories which allows for easier unit testing, also great!
One of your tags seems to indicate you are interested in Entity Framework. You do not seem to be using it. Entity Framework would simplify your code by creating the boilerplate classes for you. You can still use your repository pattern code with the classes created by Entity Framework.
It seems that you are confusing the idea of a service and a repository. A repository is a general object which allows you to get data from a store without caring about the implementation. In your example, ICompanyService is a repository.
It is a really controversial topic and there are different approaches to this problem. In our data logic we are not using the repository pattern, because we do not want to abstract away most of the benefits of Entity Framework. Instead, we pass the context to the business logic, since it is already a combination of the UoW and Repository patterns. Your approach is okay if you follow it consistently across all of your company services. However, from what I have seen so far, putting methods into the related services according to their return values is the best way to remember where they are. For instance, if you want the company lookups, create an ILookupService and put a GetLookUpsByCompany(int companyId) method there to retrieve them.
I would argue with the linked response. Repositories ARE linked to database entities; considering Entity Framework itself as a UoW/repository implementation is the best example of that. On the other hand, services are for domain concerns, and if there is a mismatch between your database entities and domain entities (you have two separate layers), services can help glue the two together.
In your specific case, you have repositories, although you call them services. And you need a repository per database entity; that's just easier to implement and maintain. It also helps answer your question: yes, you need the extra repository for the linking table.
A small suggestion: you seem to have a generic query function that only accepts where clauses:
IEnumerable<T> Where(Expression<Func<T, bool>> func, bool showDeleted = false);
If you already follow this route, which allows arbitrary filtering expressions (itself a little arguable, as someone will point out that you can't possibly guarantee that every technically possible filter can be executed by the database engine), why not allow all possible queries, including ordering, paging, etc.:
IQueryable<T> Query { get; }
This is as easy to implement as your version (you just expose the DbSet), but it allows clients to perform more complicated queries, with the same possible concern that such a contract may be too broad.
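For example, a caller could then compose filtering, ordering and paging on top of that single member (a hedged usage sketch; companyRepository, pageIndex and pageSize are assumed, CompanyName comes from your model):

var page = companyRepository.Query
    .Where(c => c.CompanyName.StartsWith("A"))
    .OrderBy(c => c.CompanyName)
    .Skip(pageIndex * pageSize)
    .Take(pageSize)
    .ToList();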
Localization is a presentation layer thing. The lower layers of your application should bother with it as little as possible.
I see two different kinds of lookups: translations of coded concepts (Mr/Miss/Mrs) and translations of entity properties (company names maybe, or job titles or product names).
Coded concepts
I would not use lookup tables for coded concepts. There is no need to bother the lower layers at all with this. You will only need to translate them once for the entire application and create simple resource files that contain the translations.
But if you do wish to keep the translations in the database, a separate lookup repository for the codes or even per code system will sort of replace the resource file and serve you fine.
Entity properties
I can imagine different/nastier localization issues when certain entities have one or more properties that are translated into different languages. Then the translation becomes part of the entity. I'd want the repository to cough up entity objects that contain all translations of the description, in a dictionary or so, because the business layer should not worry about language when querying, caching and updating relations. It should not ask the company repository for the Dutch version of company X. It should simply ask for company X and be served a Company object that contains its name in Dutch, English and French.
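A small sketch of what that could look like (LocalizedCompany and its properties are invented for illustration; this is not the Company class from the question):

using System;
using System.Collections.Generic;

public class LocalizedCompany
{
    public Guid Id { get; set; }

    // All translations travel with the entity, keyed by culture code, e.g. "nl-NL", "en-GB", "fr-FR".
    public IDictionary<string, string> NameTranslations { get; set; }
}

// The presentation layer picks the language; the repository and business layer never do:
// string displayName = company.NameTranslations["nl-NL"];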
I have one more remark about the actual database implementation:
I think the lookup tables are distracting from the actual entities, to the point where you have forgotten to create a relation between person and person company. ;) I'd suggest putting all translations of entity properties in a single XML type column instead.
This illustrates why the repository should handle entities plus translations. If you were to make this storage layer level implementation change at some point, i.e. go from lookup tables to xml columns, the repository interfaces should remain the same.
I am writing a proof of concept application.
When it comes to the data layer, we need the ability to connect to different databases, and different technologies might be used:
ADO.NET (SqlCommand etc.)
Entity Framework
NHibernate
What I am saying is that whatever calls our RepositoryService class is ignorant of the provider used, e.g. Entity Framework, raw ADO.NET, NHibernate, etc.
Is there an example out there, an empty shell I can look at, or a code snippet from you?
Just to give an idea of how you would go about it.
A noddy implementation to give you an idea (possible IoC etc. omitted):
public class BusinessService
{
    public List<CustomerDto> GetCustomers()
    {
        RepositoryService repositoryService = new RepositoryService();
        List<CustomerDto> customers = repositoryService.GetCustomers().ToList();
        return customers;
    }
}

public class RepositoryService : IRepository
{
    private string dbProvider;

    public RepositoryService()
    {
        // In here, determine the provider from the config file (e.g. SQL, EF, ...)
        // and call the appropriate repository.
        // dbProvider = ???
    }

    public IEnumerable<CustomerDto> GetCustomers()
    {
        // Get the customers from the chosen repository.
        throw new NotImplementedException();
    }
}

public interface IRepository
{
    IEnumerable<CustomerDto> GetCustomers();
}

public class SqlRepository : IRepository
{
    public IEnumerable<CustomerDto> GetCustomers()
    {
        throw new NotImplementedException();
    }
}

public class EFRepository : IRepository
{
    public IEnumerable<CustomerDto> GetCustomers()
    {
        throw new NotImplementedException();
    }
}

public class CustomerDto
{
    public string Name { get; set; }
    public string Surname { get; set; }
}
Many thanks
You should be clearer about your objectives (and those of your manager). Accessing your data through some repository interfaces is a first step. The second step is to have a shared object representation of your data table rows (or of your entities, if you want to refine the table mappings).
The idea behind the scenes may be:
a) We don't know ORM technologies well and want to try them without taking the risk of poor performance.
b) Our database is huge and we manipulate huge amounts of data.
c) Our database contains many thousands of tables.
d) ...
The general answer may be:
1) Use the chosen ORM when possible.
2) Downgrade to ADO.NET or even to stored procedures when performance is poor.
Entity Framework and NHibernate use a high-level entity mapping abstraction. Do you want to use this? If not, you may use a lightweight object mapper like Dapper or PetaPoco.
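To give a feel for what "lightweight" means, here is a minimal Dapper-style sketch reusing the CustomerDto and IRepository from your code; the connection string and the Customers table are assumptions:

using System.Collections.Generic;
using System.Data.SqlClient;
using System.Linq;
using Dapper;

public class DapperCustomerRepository : IRepository
{
    public IEnumerable<CustomerDto> GetCustomers()
    {
        using (var connection = new SqlConnection("Server=.;Database=Shop;Trusted_Connection=True;"))
        {
            // Dapper maps the returned columns to CustomerDto properties by name.
            return connection.Query<CustomerDto>("SELECT Name, Surname FROM Customers").ToList();
        }
    }
}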
ORMs are a good way to cut the development cost of your database access code by 70% to 80% (95% if you only read data). Choosing to be able to use all of them at once will ensure that those potential gains are lost.
PetaPoco is very interesting for a first experiment because it includes the very light mapper source code in your C# project and generates table objects with an easy-to-understand T4 transform file (all the source code is small and included in your data access layer). Its major drawback is that its author has not had time to work on it in recent years.
While ORM technologies can make programs easier to write and scale, they have drawbacks:
1) Because you work outside the database, operations between in-memory (or not yet persisted) objects and database data can easily become very costly: if fetching the data for one object generates one request, an operation on a collection of objects will generate as many requests as there are items in the collection.
2) Because of the complex change-tracking mechanisms in high-level ORMs, saving data can become very slow if you don't take care of this.
3) The more functionality the ORM offers, the longer your learning curve is.
The way that I generally accomplish this task is to have different concrete implementations of your repository interfaces, so you can have an EFRepository or an NHibernateRepository or an AdoNetRepository or an InMemoryDatabaseRepository implementation.
As long as you encapsulate the construction of your repository (through a factory or dependency injection or whatever), the types that consume your repository don't have to know exactly what kind of repository they are working with.
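A minimal sketch of such a factory (the "DataProvider" app setting and the NHibernateRepository class are assumptions; IRepository, SqlRepository and EFRepository come from your code):

using System;
using System.Configuration;

public static class RepositoryFactory
{
    public static IRepository Create()
    {
        // e.g. <add key="DataProvider" value="EF" /> in app.config (the key name is an assumption)
        string provider = ConfigurationManager.AppSettings["DataProvider"];

        switch (provider)
        {
            case "Sql": return new SqlRepository();
            case "EF": return new EFRepository();
            case "NHibernate": return new NHibernateRepository(); // assumed third implementation
            default: throw new NotSupportedException("Unknown provider: " + provider);
        }
    }
}

BusinessService would then ask the factory (or an IoC container) for an IRepository instead of newing up RepositoryService itself.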
I have been learning DDD development for a few days, and I am starting to like it.
I (think I) understand the principle of DDD, where your main focus is on business objects, where you have aggregates, aggregate roots, repositories just for aggregate roots, and so on.
I am trying to create a simple project where I combine DDD development with the Code First approach.
My questions are (I am using ASP.NET MVC):
Will DDD business objects be different from the Code First objects?
Even if they will probably be the same, for example I can have a Product business object which has all the rules and methods, and I can have a Product Code First (POCO) object which will just contain the properties I need to save in the database.
If the answer to question 1 is "true", then how do I notify the Product POCO object that a property of the business object Product has been changed and that I have to update it? Do I use something like AutoMapper?
If the answer is "no", I am completely lost.
Can you show me the simplest (CRUD) example of how I can put those two together?
Thank you
Update: I no longer advocate the use of "domain objects" and instead advocate a messaging-based domain model. See here for an example.
The answer to #1 is: it depends. In any enterprise application, you're going to find two major categories of stuff in the domain:
Straight CRUD
There's no need for a domain object here because the next state of the object doesn't depend on the previous state of the object. It's all data and no behavior. In this case, it's ok to use the same class (i.e. an EF POCO) everywhere: editing, persisting, displaying.
An example of this is saving a billing address on an order:
public class BillingAddress {
    public Guid OrderId;
    public string StreetLine1;
    // etc.
}
On the other hand, we have...
State Machines
You need to have separate objects for domain behavior and state persistence (and a repository to do the work). The public interface on the domain object should almost always be all void methods and no public getters. An example of this would be order status:
public class Order { // this is the domain object
    private Guid _id;
    private Status _status;

    // Note the behavior here - we throw an exception if it's not a valid state transition.
    public void Cancel() {
        if (_status == Status.Shipped)
            throw new InvalidOperationException("Can't cancel order after shipping.");
        _status = Status.Cancelled;
    }
    // etc...
}

namespace Data {
    public class Order { // this is the persistence (EF) class
        public Guid Id;
        public Status Status;
    }
}
public interface IOrderRepository {
    // The implementation of this will:
    // 1. Load the EF class if it exists or new it up with the ID if it doesn't.
    // 2. Map the domain class to the EF class.
    // 3. Save the EF class to the DbContext.
    void Save(Order order);
}
The answer to #2 is that the DbContext will automatically track changes to EF classes.
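For completeness, a possible implementation of IOrderRepository along the lines of the numbered comments above. MyDbContext and its Orders DbSet are assumptions, and the sketch also assumes the domain Order exposes Id and Status internally (not shown above) so the repository can map it:

using System.Data.Entity;

public class OrderRepository : IOrderRepository
{
    private readonly MyDbContext _db; // assumed EF context with a DbSet<Data.Order> named Orders

    public OrderRepository(MyDbContext db)
    {
        _db = db;
    }

    public void Save(Order order)
    {
        // 1. Load the EF class if it exists or new it up with the ID if it doesn't.
        var data = _db.Orders.Find(order.Id);
        if (data == null)
        {
            data = new Data.Order { Id = order.Id };
            _db.Orders.Add(data);
        }

        // 2. Map the domain class to the EF class.
        data.Status = order.Status;

        // 3. Save the EF class to the DbContext.
        _db.SaveChanges();
    }
}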
The answer is no. One of the best things about EF Code First is that it fits nicely with DDD, since you have to create your business objects by hand, so do use your EF models as the equivalent of DDD entities and value objects. No need to add an extra layer of complexity; I don't think DDD recommends that anywhere.
You could even have your entities implement an IEntity and your value objects implement IValue, and additionally follow the rest of the DDD patterns, namely Repositories, to do the actual communication to the database. You can find more of these ideas in this very good sample application in .NET; even though it doesn't use EF Code First, it's still very valuable: http://code.google.com/p/ndddsample/
Recently I've done a similar project. I was following this tutorial: link
And I've done it this way: I created a blank solution and added the projects Domain, Service and WebUI.
Simply put, in Domain I put the model (for example the classes for EF Code First, methods etc.).
Service was used for the domain to communicate with the world (WebUI, MobileUI, other sites etc.) using ASP.NET Web API.
WebUI was the actual MVC application (but the model was in Domain, so it was mostly just the V and C).
Hope I've helped.
The Pluralsight course Entity Framework in the Enterprise goes into this exact scenario of Domain-Driven Design incorporated with EF Code First.
For number 1, I believe you can do it either way. It's just a matter of style.
For number 2, the instructor in the video goes through a couple of ways to account for this. One way is to have a "State" property on every class that is set on the client side when a value is modified. The DbContext then knows which changes to persist.
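A rough sketch of that idea, assuming EF 6 (the class names, the ObjectState enum and the mapping in SaveChanges are my assumptions, not the course's actual code):

using System.ComponentModel.DataAnnotations.Schema;
using System.Data.Entity;

public enum ObjectState { Unchanged, Added, Modified, Deleted }

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }

    [NotMapped] // client-side bookkeeping only, never stored
    public ObjectState State { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Product> Products { get; set; }

    public override int SaveChanges()
    {
        // Translate the client-side State into EF's change tracking before saving.
        foreach (var entry in ChangeTracker.Entries<Product>())
        {
            switch (entry.Entity.State)
            {
                case ObjectState.Added: entry.State = EntityState.Added; break;
                case ObjectState.Modified: entry.State = EntityState.Modified; break;
                case ObjectState.Deleted: entry.State = EntityState.Deleted; break;
                default: entry.State = EntityState.Unchanged; break;
            }
        }
        return base.SaveChanges();
    }
}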
A late question on this topic.
Reading Josh Kodroff's answer confirms my thoughts about the implementation of a repository against, for instance, an Entity Framework DAL.
You map the domain object to an EF persistence object and let EF handle it when saving.
When retrieving, you let EF fetch from the database, map the result to your domain object (aggregate root) and add it to your collection.
Is this the correct strategy for repository implementation?