My experience has made me more accustomed to the following structure in my programs. Let's say it is a .NET WPF application. You create a WPF project, a Business Logic project (Class Library), a Data Access project (Class Library), and an Entities project (Class Library). The WPF project goes through the Business Logic Layer to the Data Access Layer. The Entities are lightweight DTOs and can flow freely from layer to layer.
My question is this: I like LINQ to SQL entities, but if I use LINQ to SQL to create my entities, I not only wind up with a straight table-to-entity mapping, it also puts my entities in my Data Access project and forces my UI project to reference my Data Access project. Granted, I can make my DataContext internal (which I think it should be by default anyhow), but my DTOs are still in my Data Access project, and my UI project is still forced to reference it.
Am I missing something here, is there another way to extricate my DTOs from LINQ to SQL, or should I even care?
If we follow the Dependency Inversion Principle:
High-level modules should not depend upon low-level modules. Both should depend upon abstractions.
Abstractions should never depend upon details. Details should depend upon abstractions.
So in your case, the UI and business logic should not depend upon data access, and the abstracted entities should never depend upon the details of LINQ to SQL.
We then design the application starting from the high-level layers.
1 Create a project of entity abstractions
public interface ICustomer
{
int Id { get; set; }
}
2 Create a project of business logic abstractions used by the UI project
public interface ICustomerService
{
List<ICustomer> LoadTop50();
}
3 Create the UI project
3.1 Create UI logic which uses ICustomer and ICustomerService for showing customer information
Notice: the UI depends only on abstractions and has no knowledge of the other layers.
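As a sketch of step 3.1, a hypothetical view model that depends only on the abstractions (the class name and members are invented for the example; ICustomer and ICustomerService are repeated from steps 1 and 2 so the sketch is self-contained):

```csharp
using System.Collections.Generic;

// Abstractions from steps 1 and 2, repeated here for completeness.
public interface ICustomer
{
    int Id { get; set; }
}

public interface ICustomerService
{
    List<ICustomer> LoadTop50();
}

// Hypothetical WPF view model: it sees only the abstractions above
// and holds no reference to the Business or DataAccess projects.
public class CustomerListViewModel
{
    private readonly ICustomerService _customerService;

    public CustomerListViewModel(ICustomerService customerService)
    {
        _customerService = customerService;
    }

    // The view binds to this; the UI only ever sees ICustomer.
    public List<ICustomer> Customers { get; private set; }

    public void Load()
    {
        Customers = _customerService.LoadTop50();
    }
}
```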
4 Create the Business project
4.1 Create a DataAccess abstraction for fetching data
namespace Business.DataAccessAbstractions
{
public interface ICustomerDataAccess
{
List<ICustomer> Load(int topAmount);
}
}
4.2 Implement ICustomerService using the ICustomerDataAccess abstraction
public class CustomerService : ICustomerService
{
private DataAccessAbstractions.ICustomerDataAccess _dataAccess;
public CustomerService(DataAccessAbstractions.ICustomerDataAccess dataAccess)
{
_dataAccess = dataAccess;
}
public List<ICustomer> LoadTop50()
{
const int TOP_NUMBER = 50;
return _dataAccess.Load(TOP_NUMBER);
}
}
Notice: the Business project creates the abstractions for data access, and implements the abstractions which the UI uses for showing data.
5 Create DataAccess project
5.1 Create entities with LINQ to SQL.
5.2 Implement Business.DataAccessAbstractions.ICustomerDataAccess interface.
5.2.1 Make entities, generated by LINQ to SQL, implement ICustomer
[Table(Name="dbo.Customer")]
public partial class Customer : INotifyPropertyChanging,
INotifyPropertyChanged,
ICustomer
{
private int _Id;
[Column(Storage="_Id",
AutoSync=AutoSync.OnInsert,
DbType="Int NOT NULL IDENTITY",
IsPrimaryKey=true,
IsDbGenerated=true)]
public int Id
{
get
{
return this._Id;
}
set
{
if ((this._Id != value))
{
this.OnIDChanging(value);
this.SendPropertyChanging();
this._Id = value;
this.SendPropertyChanged("Id");
this.OnIDChanged();
}
}
}
}
You need only add ICustomer to the list of implemented interfaces, or create/generate some "mapping logic" which converts the entities generated by LINQ to SQL into instances which implement ICustomer. I found that adding ICustomer was the easiest way for this sample.
Notice: the DataAccess project depends only on the abstractions, which it implements using LINQ to SQL.
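If you prefer the "mapping logic" route instead of having the generated entity implement ICustomer, a minimal hand-written mapper might look like this (ICustomer from step 1 and a minimal stand-in for the generated Customer class are repeated so the sketch is self-contained; the DTO and mapper names are invented):

```csharp
// ICustomer from step 1 and a stand-in for the LINQ to SQL
// generated Customer class.
public interface ICustomer { int Id { get; set; } }
public class Customer { public int Id { get; set; } }

// A plain DTO that lives with the entity abstractions.
public class CustomerDto : ICustomer
{
    public int Id { get; set; }
}

// Inside the DataAccess project: convert the LINQ to SQL entity
// into an abstraction-friendly DTO before it leaves the layer.
internal static class CustomerMapper
{
    public static ICustomer ToDto(Customer entity)
    {
        return new CustomerDto { Id = entity.Id };
    }
}
```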
6 Create a Main project which will compose all the dependencies together and launch your UI.
Notice: this project will have all the references needed for your application to work properly.
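A sketch of what step 6's composition root could look like; the wiring is manual here, but a DI container would do the same job. The CustomerDataAccess and MainWindow class names are assumptions based on the steps above:

```csharp
// Hypothetical composition root in the Main project: the only
// assembly that references UI, Business, and DataAccess at once.
public static class Program
{
    [System.STAThread]
    public static void Main()
    {
        // Concrete types come from the low-level projects...
        Business.DataAccessAbstractions.ICustomerDataAccess dataAccess =
            new DataAccess.CustomerDataAccess();          // assumed class name
        ICustomerService service = new CustomerService(dataAccess);

        // ...and are handed to the UI, which only sees abstractions.
        var app = new System.Windows.Application();
        app.Run(new MainWindow(service));                 // assumed WPF window
    }
}
```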
Summary
With this approach your UI will not depend on the details of LINQ to SQL.
With this approach you can freely modify your DataAccess implementation, as long as the modifications do not break the high-level abstractions.
Of course, if you decide to add a new data field to Customer which you want to use in the UI, then you need to modify the whole chain of dependencies.
Related
I have a typical application with Controllers, Services, Repositories. So, there are 2 projects:
ASP.NET Core WebAPI with controllers
Core with all business logic
The WebAPI should know only about services from Core. In Core I have public classes (services) that return DTOs, but these services depend on a DbContext that I want to mark as internal. Of course I can't:
Error CS0051 Inconsistent accessibility: parameter type
'DevicesDbContext' is less accessible than method
'DeviceService.DeviceService(DevicesDbContext, IMapper)'
I'm using EF Core, and instead of my own repositories I use the DbContext directly. I have an entity model that I must use only in the Core project. Could you please advise how I can achieve that?
For example my model is:
internal class Device
{
public int Id {get;set;}
}
DbContext:
internal class DevicesDbContext : DbContext
{
public DbSet<Device> Devices {get;set;}
}
Service:
public class DeviceService : IDeviceService
{
public DeviceService(DevicesDbContext dbContext, IMapper mapper)
{
}
..
}
I got that error in the constructor of DeviceService. This is not a duplicate: I know what the error means and how to fix it. I'm asking about the design or architecture of this approach, because I need to avoid using the models and DbContext in the WebAPI directly.
If you don't want to use Repositories to guard data access (which generally still return Entities, not DTOs, so Entities need to be public) then the real question is:
"Why do you want to avoid using the DbContext & Entities in your Web API?"
Entity Framework is a framework. Its purpose is to facilitate data access, to make your code easier to write and easier to understand. Just as you chose to use the .NET Framework and leverage things like LINQ, generics, etc., by choosing EF you should seek to leverage everything it offers.
If you absolutely must keep the context and entities out of the API assembly references, or want to centralize business logic involving entities between a Web API and another set of MVC controllers then you're looking at building an anemic API. In this case:
Services.DLL -- References DbContext, entities..
public interface ISomethingService
{
IEnumerable<SomeDto> GetSome(/*params*/);
}
public class SomethingService : ISomethingService
{
public SomethingService(SomeDbContext context)
{ // Init stuff.
}
IEnumerable<SomeDto> ISomethingService.GetSome()
{
// Get some stuff and return DTOs.
}
}
Web API DLL -- Only references DTOs.
public class SomeAPI : ISomethingService
{
private ISomethingService Service { get; set; }
public SomeAPI(ISomethingService service)
{
Service = service;
}
public IEnumerable<SomeDto> GetSome()
{
return Service.GetSome();
}
}
The API is anemic in that it just passes requests through to a common service and forwards the response. The API doesn't need to implement the same interface, it can simply accept a reference to the service and consume it, passing through whatever parameters to get the DTOs that it will pass back.
The downside of this approach is that to modify the service you're flipping between the API and the Services layer vs. just working within the API. I don't favor using approaches like this because APIs and such often need to consider details like filtering, paging, etc. so I want to leverage the excellent LINQ capabilities that EF offers. I also leverage EF's IQueryable support heavily to keep my data access layer simple and compact, letting consuming services decide how to fetch the detail they need. Masking this with an extra service boundary adds complexity and inefficiencies as it either results in complex code, lots of very similar functions, and/or wasted memory/processing to return data that is not needed.
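To illustrate the IQueryable point: a thin data access method can return an IQueryable and let the consuming service compose filtering and paging on top, so only the needed rows are materialized. The entity, context, and method names below are invented for the sketch:

```csharp
using System;
using System.Linq;
using Microsoft.EntityFrameworkCore;

// Hypothetical entity and context, just to show the shape.
public class Order
{
    public int Id { get; set; }
    public int CustomerId { get; set; }
    public DateTime CreatedOn { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Order> Orders => Set<Order>();
}

public class OrderQueries
{
    private readonly ShopContext _context;
    public OrderQueries(ShopContext context) => _context = context;

    // The consumer composes ordering/paging onto this IQueryable;
    // EF translates the final expression into a single SQL query.
    public IQueryable<Order> ForCustomer(int customerId) =>
        _context.Orders.Where(o => o.CustomerId == customerId);
}

// e.g. queries.ForCustomer(42).OrderByDescending(o => o.CreatedOn)
//             .Skip(20).Take(10).ToList();
```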
My solution in its current (sad) state:
I want my business layer to be data-provider agnostic (isn't that a good thing?), with just an interface so I can swap out EF for NHibernate or LINQ to XML or whatever type of persistence provider I or my boss want to use (or the new superior one that will inevitably be invented 2 seconds after this project is all done).
IPersistenceProvider is that interface, and I can just inject it with Unity (not the gaming platform, the DI container). To me, IPersistenceProvider belongs in the Data Layer, and we can just keep adding folders (like EntityFramework) as new persistence paradigms need to be added to my resume (or the project).
Therefore, my business dll depends on my data dll. Here's some code in the business dll, depending on the data dll:
using System;
using Atlas.Data.Kernel;
namespace Atlas.Business.Kernel
{
public abstract class BusinessObject
{
public BusinessObject(IPersistenceProvider p)
{
}
public Guid Id;
}
}
I also feel like my DatabaseContext belongs in the Data Layer. But EF makes you reference the concrete types for its DbSets, which means the AtlasDataKernel dll would need to depend on the AtlasBusinessKernel dll, which would make a circular dll reference. En plus (that's French for moreover), a Data Layer thingy pointing to the Business Layer concrete types smells to me. DatabaseContext wants to go live in the business dll, but that's coupling my business layer with a particular persistence strategy artifact.
How to resolve this? I can collapse it into one dll (and indeed, I did that very thing on a previous project), but that kinda sucks and I won't be able to get into the .Net Architects club. They will mock me for my "1 N too few" architecture and laugh me out of the meeting. WWDED? (What would Dino Esposito Do?)
Split declaration from implementation.
The EntityFramework subdirectory should be a separate assembly (e.g. AtlasDataKernelEF) containing the EF stuff and the implementation of IPersistenceProvider, thus resolving the circular reference.
Also, if you really are required to use a different ORM some day, you can rid your production executables of all the EF libraries.
You don't sketch how you instantiate EF data access, but you certainly need to wrap that in some kind of factory class.
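As a sketch of the factory idea: the business assembly sees only the interfaces, and the separate EF assembly supplies the concrete provider. IPersistenceProvider comes from the question; the factory types and the connection-string constructor are assumptions:

```csharp
// In AtlasDataKernel: abstractions only.
public interface IPersistenceProvider { /* CRUD members elided */ }

public interface IPersistenceProviderFactory
{
    IPersistenceProvider Create();
}

// In the separate AtlasDataKernelEF assembly: the EF-specific parts.
public class EfPersistenceProvider : IPersistenceProvider
{
    public EfPersistenceProvider(string connectionString) { /* set up DbContext here */ }
}

public class EfPersistenceProviderFactory : IPersistenceProviderFactory
{
    private readonly string _connectionString;

    public EfPersistenceProviderFactory(string connectionString)
        => _connectionString = connectionString;

    // Swapping ORMs later means shipping a different factory
    // assembly; nothing consuming IPersistenceProvider changes.
    public IPersistenceProvider Create()
        => new EfPersistenceProvider(_connectionString);
}
```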
Your AtlasBusinessKernel project shouldn't reference any resources in the AtlasDataKernel assembly. Any resources in AtlasDataKernel that AtlasBusinessKernel needs to use should be represented as an interface in the AtlasBusinessKernel project; that could be an IDataContext interface or repository interfaces.
This only works if you have a third project which actually uses the AtlasBusinessKernel project, perhaps a web application or a console application that represents the UI. That project is responsible for instantiating the DatabaseContext, preferably using DI.
// In your AtlasDataKernel
public class DatabaseContext : IDataContext
{
// implementation
}
// In your AtlasBusinessKernel
public class MyBusinessLogic
{
private IDataContext dataContext;
public MyBusinessLogic(IDataContext context)
{
this.dataContext = context;
}
}
// In your web application or whatever project type it might be
public class MyWebApp
{
public void DoSomeThing()
{
IDataContext context = new DatabaseContext();
MyBusinessLogic logic = new MyBusinessLogic(context);
}
}
I have a design problem with my project that I don't know how to fix. I have a DAL layer which holds repositories and a service layer which holds "processors". The role of the processors is to provide access to the DAL data and perform some validation and formatting logic.
My domain objects all have a reference to at least one object from the service layer (to retrieve the values of their properties from the repositories). However, I face two cyclical dependencies. The first "cyclical dependency" comes from my design, since I want my DAL to return domain objects - I mean that it is conceptual - and the second comes from my code.
A domain object is always dependent on at least one service object
The domain object retrieves its properties from the repositories by calling methods on the service
The methods of the service call the DAL
However - and here is the problem - when the DAL has finished its job, it has to return domain objects. But to create these objects it has to inject the required service object dependencies (as those dependencies are required by the domain objects).
Therefore, my DAL repositories have dependencies on service objects.
And this results in a very clear cyclical dependency. I am confused about how I should handle this situation. Lately I was thinking about letting my DAL return DTOs, but that doesn't seem compatible with the onion architecture, because the DTOs are defined in the Infrastructure, but the Core and the service layer should not know about the Infrastructure.
Also, I'm not excited about changing the return types of all the methods of my repositories since I have hundreds of lines of code...
I would appreciate any kind of help, thanks !
UPDATE
Here is my code to make the situation more clear :
My Object (In the Core):
public class MyComplexClass1
{
MyComplexClass1 Property1 {get; set;}
MyComplexClass2 Property2 {get; set;}
private readonly IService MyService;
public MyComplexClass1(IService MyService)
{
this.MyService = MyService;
this.Property1 = MyService.GetMyComplexClassList1();
.....
}
This is my Service Interface (In the Core)
public interface IService
{
MyComplexClass1 GetMyComplexClassList1();
...
}
This is my Repository Interface (In the Core)
public interface IRepoComplexClass1
{
MyComplexClass1 GetMyComplexClassObject();
...
}
Now the Service Layer implements IService, and the DAL Layer Implements IRepoComplexClass1.
But my point is that in my repo, I need to construct my Domain Object
This is the Infrastructure Layer
using Core;
public class Repo : IRepoComplexClass1
{
public MyComplexClass1 GetMyComplexClassObject()
{
//Retrieve all the stuff...
//... And now it's time to convert the DTOs to Domain Objects
//I need to write
//DomainObject.Property1 = new MyComplexClass1(ID, Service);
//So my Repository has a dependency with my service and my service has a dependency with my repository, (Because my Service Methods, make use of the Repository). Then, Ninject is completely messed up.
}
I hope it's clearer now.
First of all, typical architectural guidance like the Onion Architecture and Domain Driven Design (DDD) does not fit all cases when designing a system. In fact, using these techniques is discouraged unless the domain has significant complexity to warrant the cost. So, assume the domain you are modelling is complex enough that it will not fit into a simpler pattern.
IMHO, both the Onion Architecture and DDD try to achieve the same thing. Namely, the ability to have a programmable (and perhaps easily portable) domain for complex logic that is devoid of all other concerns. That is why in Onion, for example, application, infrastructure, configuration and persistence concerns are at the edges.
So, in summary, the domain is just code. It can then utilize those cool design patterns to solve the complex problems at hand without worrying about anything else.
I really like the Onion articles because the picture of concentric barriers is different to the idea of a layered architecture.
In a layered architecture, it is easy to think vertically, up and down, through the layers. For example, you have a service on top which speaks the outside world (through DTOs or ViewModels), then the service calls the business logic, finally, the business logic calls down to some persistence layer to keep the state of the system.
However, the Onion Architecture describes a different way to think about it. You may still have a service at the top, but this is an application service. For example, a Controller in ASP.NET MVC knows about HTTP, application configuration settings and security sessions. But the job of the controller isn't just to defer work to lower (smarter) layers. The job is to as quickly as possible map from the application side to the domain side. So simply speaking, the Controller calls into the domain asking for a piece of complex logic to be executed, gets the result back, and then persists. The Controller is the glue that is holding things together (not the domain).
So, the domain is the centre of the business domain. And nothing else.
This is why some complain about ORM tools that need attributes on the domain entities. We want our domain completely clean of all concerns other than the problem at hand. So, plain old objects.
So, the domain does not speak directly to application services or repositories. In fact, nothing that the domain calls speaks to these things. The domain is the core, and therefore, the end of the execution stack.
So, for a very simple code example (adapted from the OP):
Repository:
// it is only infrastructure if it doesn't know about specific types directly
public class Repository<T>
{
public T Find(int id)
{
// resolve the entity
return default(T);
}
}
Domain Entity:
public class MyComplexClass1
{
MyComplexClass1 Property1 {get; } // required because it cannot be set from outside
MyComplexClass2 Property2 {get; set;}
// no dependency injection frameworks!
public MyComplexClass1(MyComplexClass1 property1)
{
// actually using the constructor to define the required properties
// MyComplexClass1 is required and MyComplexClass2 is optional
this.Property1 = property1;
.....
}
public ComplexCalculationResult CrazyComplexCalculation(MyComplexClass3 complexity)
{
var theAnswer = 42;
return new ComplexCalculationResult(theAnswer);
}
}
Controller (Application Service):
public class TheController : Controller
{
private readonly IRepository<MyComplexClass1> complexClassRepository;
private readonly IRepository<ComplexResult> complexResultRepository;
// this can use IoC if needed, no probs
public TheController(IRepository<MyComplexClass1> complexClassRepository, IRepository<ComplexResult> complexResultRepository)
{
this.complexClassRepository = complexClassRepository;
this.complexResultRepository = complexResultRepository;
}
// I know about HTTP
public void Post(int id, int value)
{
var entity = this.complexClassRepository.Find(id);
var complex3 = new MyComplexClass3(value);
var result = entity.CrazyComplexCalculation(complex3);
this.complexResultRepository.Save(result);
}
}
Now, very quickly you will be thinking, "Woah, that Controller is doing too much". For example, how about if we need 50 values to construct MyComplexClass3. This is where the Onion Architecture is brilliant. There is a design pattern for that called Factory or Builder and without the constraints of application concerns or persistence concerns, you can implement it easily. So, you refactor into the domain these patterns (and they become your domain services).
In summary, nothing the domain calls knows about application or persistence concerns. It is the end, the core of the system.
Hope this makes sense, I wrote a little bit more than I intended. :)
We normally use abstract functions/interfaces in our projects. Why is this really needed? Why can't we just go for a Business Logic Layer, Data Access Layer, and Presentation Layer only?
Function in Presentation Layer:
abc();
Function in Business Logic Layer:
public void abc()
{
//Preparing the list
}
Function in Data Access Layer:
public abstract void abc();
Function in Data Access SQLServer Layer:
public override void abc()
{
//Connection with database
}
Question is: Why is the Data Access Layer required ?
The easiest way to understand this, IMO, is as an abstraction over the data layer.
You have a set of functions to retrieve data from an XML file. But one day your product scales out and XML is no longer enough as a data storage, so you move to an embedded database: SQLite. Then one day you need to reuse your library in some enterprise context, so now you need to develop access to SQL Server, Oracle, a web service... With every one of these changes you will need to change not only the code that actually accesses the data, but also the code that consumes it. And what about the client code that has already used your first XML data access for years and is happy with it? What about backward compatibility?
Having the abstraction, even if it does not directly solve most of these problems, definitely makes your application scalable and more resistant to changes that, in our world, sometimes happen all too frequently.
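A minimal sketch of that abstraction; the interface and class names are invented for the example:

```csharp
using System.Collections.Generic;

// The consuming code programs against this interface only.
public interface ICustomerStore
{
    IEnumerable<string> LoadCustomerNames();
}

// Today: an XML file...
public class XmlCustomerStore : ICustomerStore
{
    public IEnumerable<string> LoadCustomerNames()
    {
        // parse the XML file here
        yield break;
    }
}

// ...tomorrow: SQLite, SQL Server, or a web service, and the
// consumers never change.
public class SqliteCustomerStore : ICustomerStore
{
    public IEnumerable<string> LoadCustomerNames()
    {
        // query the embedded database here
        yield break;
    }
}
```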
Generally, if you use interfaces in your code, you gain maneuverability in the form of dependency injection.
This will help you replace parts of your implementation in certain situations for example providing Mock objects during Unit Testing.
The abstract class or interface is not really a separate layer - it should be part of your business logic layer and it defines the interface that the actual data access layer (SQL data repository, for example) needs to implement to provide the data access service to your business layer.
Without this interface your business layer would be directly dependent on the SQL layer, while the interface removes this dependency: You put the abstract class or the interface into the business logic layer. Then the SQL layer (a separate assembly, for example) implements the abstract class/interface. This way the SQL layer is dependent on the business layer, not the other way around.
The result is a flexible app with an independent business layer that can work with multiple data repositories - all it needs is a layer that implements the interface the business layer defines. And it is not really only about data repositories - your business layer shouldn't be dependent on the context (asp.net vs. console app vs. service etc.), it shouldn't be dependent on the user interface classes, modules interfacing with your business app, etc.
Why interfaces:
Have you ever used using in C#?
using (Form f = new Form())
{
}
Here you will see that you can only use classes inside using which implement the IDisposable interface.
Two things which do not know each other can interact with each other through interfaces only.
An interface guarantees that "some" functionality has surely been implemented by the type.
Why layers:
So that you can have separate DLLs which let you reuse the code in different applications.
Basically, it is all for code reuse and performance gain.
I think you are talking about Facade layer.
It is an optional layer which simplifies the functions of the Business Layer. Let's imagine you have a ProductManager and a CategoryManager, and you want to perform a particular action which involves using both (for example, "get me the top 5 products in all categories"); then you could use a facade layer that uses both the ProductManager and the CategoryManager.
It is inspired by Facade Pattern.
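A sketch of such a facade; ProductManager and CategoryManager are taken from the answer, but their member shapes (and the Category/Product stand-ins) are assumptions:

```csharp
using System.Collections.Generic;
using System.Linq;

// Minimal stand-ins for the managers mentioned above.
public class Category { public int Id; public string Name; }
public class Product { public string Name; }

public class CategoryManager
{
    public List<Category> GetAll() => new List<Category>();
}

public class ProductManager
{
    public List<Product> GetTopByCategory(int categoryId, int count)
        => new List<Product>();
}

// The facade: one call for the caller, with the coordination of
// both managers hidden behind it.
public class CatalogFacade
{
    private readonly ProductManager _products;
    private readonly CategoryManager _categories;

    public CatalogFacade(ProductManager products, CategoryManager categories)
    {
        _products = products;
        _categories = categories;
    }

    // "Top 5 products in all categories" as a single operation.
    public Dictionary<string, List<Product>> GetTop5PerCategory() =>
        _categories.GetAll().ToDictionary(
            c => c.Name,
            c => _products.GetTopByCategory(c.Id, 5));
}
```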
The abstraction helps create functionality, be it through a base class, an interface, or composition which, when used properly, does wonders for maintenance, readability, and reusability of code.
In regard to the code posted in the question, the code marked "Data Access Layer" acts as a common abstraction for the business layer to use. By doing so, the specific implementations of the DAL (such as what's under "Data Access SQLServer Layer" in the sample) are decoupled from the business layer. Now you can create DAL implementations that access different databases, or perhaps automatically feed data for testing, etc.
The repository pattern is a fantastic example of this at work in a DAL (example is simplified):
public interface IProductRepository
{
Product Get(int id);
...
}
public class SqlProductRepository : IProductRepository
{
public Product Get(int id) { ... }
...
}
public class MockProductRepository : IProductRepository
{
private IDictionary<int, Product> _products = new Dictionary<int, Product>()
{
{ 1, new Product() { Name = "MyItem" } }
};
public Product Get(int id) { return _products[id]; }
...
}
public class AwesomeBusinessLogic
{
private IProductRepository _repository;
public AwesomeBusinessLogic(IProductRepository repository)
{
_repository = repository;
}
public Product GetOneProduct()
{
return _repository.Get(1);
}
}
Even though this example uses interfaces, the same applies to the use of base classes. The beauty is that now I can feed either SqlProductRepository or MockProductRepository into AwesomeBusinessLogic and not have to change anything about AwesomeBusinessLogic. If another case comes along, all that's needed is a new implementation of IProductRepository and AwesomeBusinessLogic will still handle it without change because it only accesses the repository through the interface.
All of the previous answers explain the need for abstraction layers, but I still want to add some thoughts of my own.
Let's say that in our project we have just one implementation of a service in each layer. For instance, I have a contact DAL service and a contact BLL service, and we could do something like this:
namespace Stackoverflow
{
public class ContactDbService
{
public Contact GetContactByID(Guid contactID)
{
//Fetch a contact from DB
}
}
}
Contact BLL service:
namespace Stackoverflow
{
public class ContactBLLService
{
private ContactDbService _dbService;
public ContactBLLService()
{
_dbService = new ContactDbService();
}
public bool CheckValidContact(Guid contactID)
{
var contact = _dbService.GetContactByID(contactID);
return contact.Age > 50;
}
}
}
without defining interfaces or abstract classes.
If we do it like that, there are some obvious drawbacks.
Code communication:
Imagine that as your project evolves, your services may have many different methods. How could a maintainer (apart from you) know what your services do? Would he have to read your entire service in order to fix a small bug like an InvalidCastException?
By looking at the interface, people get immediate knowledge of the capabilities of the service (at least).
Unit testing
You could test your logic using a fake/mock service to detect bugs in advance as well as prevent regression bugs from happening later.
Easier to change:
By referencing only interfaces/abstract classes from other classes, you can easily replace those interface implementations later without too much effort.
Abstraction enables you to refactor quickly. Suppose that instead of using SQL Server, you decide to use some other provider. If you do not have a data access layer, you have to do a huge refactoring, because you are calling data access methods directly. But if you have a data access layer, you only write a new data access layer, inheriting from your abstract data access layer, and you do not change anything in the business layer.
Say I have a 3-tier architecture (UI, Business, and Data). Usually, I create a 4th project called "Model" or "Common" to keep my data access objects and each of the other projects would then use this project.
Now I'm working on a project where some of my data access objects have methods like Save() etc that need access to the Data project. So, I would have a circular reference if I attempted to use the Model/Common project in the Data project.
In this scenario, where is the best place to keep the data access objects? I could keep it within the Data project itself, but then my UI project which needs to know about the data access objects, would need to access the Data layer, which is not good.
I don't think you have your n-tier quite right. It sounds like you're building more 2-tier systems.
In a real 3-tier project, only your data tier is allowed to talk to the database. You have that with your "Model" or "Common" projects. Those projects are your data tier. But where you veer off is that only the business tier should be allowed to talk to them. Your presentation code should not be allowed to talk to the data tier projects at all.
n-tier comes in when you have more than 3 "tiers", but the same principle applies: each tier only knows how to use (and only needs a reference to) the one below it, and then provides an API for the tier above it. In my own projects, I take the typical presentation, business, and data tiers and provide a 4th "translation" tier between business and data. This way the data tier can return generic types like DataSet, DataTable, and DataRow, and the business tier only has to work in terms of strongly-typed business objects. The translation tier only converts between the generic data objects and the strongly-typed objects. This way a change to one of the traditional tiers is less likely to require a change in another.
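A tiny sketch of that translation tier: it is the only code that knows about both DataRow and the strongly-typed business object. The Customer shape and column names are invented for the example:

```csharp
using System.Data;

// Business-tier type: strongly typed, no knowledge of ADO.NET.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Translation tier: converts the data tier's generic objects
// into business objects.
public static class CustomerTranslator
{
    public static Customer FromRow(DataRow row)
    {
        return new Customer
        {
            Id = (int)row["Id"],
            Name = (string)row["Name"]
        };
    }
}
```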
This is what I have in my project.
1.) Application.Infrastructure
Base classes for all business objects, business object collections, and data-access classes, plus my custom attributes and utilities as extension methods, and a generic validation framework. This determines the overall behavior and organization of my final .NET application.
2.) Application.DataModel
Typed Dataset for the Database.
TableAdapters extended to incorporate Transactions and other features I may need.
3.) Application.DataAccess
Data access classes.
Actual place where Database actions are queried using underlying Typed Dataset.
4.) Application.DomainObjects
Business objects and Business object collections.
Enums.
5.) Application.BusinessLayer
Provides manager classes accessible from Presentation layer.
HttpHandlers.
My own Page base class.
More things go here..
6.) Application.WebClient or Application.WindowsClient
My presentation layer
Takes references from Application.BusinessLayer and Application.BusinessObjects.
Application.BusinessObjects are used across the application and travel across all layers whenever needed [except Application.DataModel and Application.Infrastructure].
All my queries are defined only in Application.DataModel.
Application.DataAccess returns or takes business objects as part of any data-access operation. Business objects are created with the help of reflection attributes. Each business object is marked with an attribute mapping it to the target table in the database, and properties within the business object are marked with attributes mapping them to the target column in the respective database table.
My validation framework lets me validate each field with the help of a designated ValidationAttribute.
My framework heavily uses attributes to automate most of the tedious tasks like mapping and validation. I can also add a new feature as a new aspect in the framework.
A sample business object would look like this in my application.
User.cs
[TableMapping("Users")]
public class User : EntityBase
{
#region Constructor(s)
public User()
{
Books = new BookCollection();
}
#endregion
#region Properties
#region Default Properties - Direct Field Mapping using DataFieldMappingAttribute
private System.Int32 _UserId;
private System.String _FirstName;
private System.String _LastName;
private System.String _UserName;
private System.Boolean _IsActive;
[DataFieldMapping("UserID")]
[DataObjectFieldAttribute(true, true, false)]
[NotNullOrEmpty(Message = "UserID From Users Table Is Required.")]
public override int Id
{
get
{
return _UserId;
}
set
{
_UserId = value;
}
}
[DataFieldMapping("UserName")]
[Searchable]
[NotNullOrEmpty(Message = "Username Is Required.")]
public string UserName
{
get
{
return _UserName;
}
set
{
_UserName = value;
}
}
[DataFieldMapping("FirstName")]
[Searchable]
public string FirstName
{
get
{
return _FirstName;
}
set
{
_FirstName = value;
}
}
[DataFieldMapping("LastName")]
[Searchable]
public string LastName
{
get
{
return _LastName;
}
set
{
_LastName = value;
}
}
[DataFieldMapping("IsActive")]
public bool IsActive
{
get
{
return _IsActive;
}
set
{
_IsActive = value;
}
}
#region One-To-Many Mappings
public BookCollection Books { get; set; }
#endregion
#region Derived Properties
public string FullName { get { return this.FirstName + " " + this.LastName; } }
#endregion
#endregion
public override bool Validate()
{
bool baseValid = base.Validate();
bool localValid = Books.Validate();
return baseValid && localValid;
}
}
BookCollection.cs
/// <summary>
/// The BookCollection class is designed to work with lists of instances of Book.
/// </summary>
public class BookCollection : EntityCollectionBase<Book>
{
/// <summary>
/// Initializes a new instance of the BookCollection class.
/// </summary>
public BookCollection()
{
}
/// <summary>
/// Initializes a new instance of the BookCollection class.
/// </summary>
public BookCollection (IList<Book> initialList)
: base(initialList)
{
}
}
The Data layer should store information in terms of rows and columns (maybe using typed DataSets, if you like), if you are using a relational backend. No "business objects".
The Business layer should use your "business objects". It can have a reference to the BusinessObjects project.
In summary:
UI has references to Business and BusinessObjects
Business has references to BusinessObjects and Data
Hope this helps.
I have a BusinessObjects project on the server side storing the mappings (ORM), and a corresponding DataAccess service exposing CRUD operations on them (and others like GetAll), etc.
I would suggest creating an interface for what you want in the model project, and implementing that definition in the data layer. That way all three (four?) projects can use the definition without knowing how it's implemented.
In my opinion, only the business layer should have knowledge of the data access objects. It should use them for data operations while applying its own business rules and logic, then return dumb objects (e.g. data transfer objects) to the UI layer above.
You could use something like AutoMapper to automatically map between your data and business objects.
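Hand-rolled, that mapping can be as simple as the sketch below (the entity and DTO shapes are invented); AutoMapper automates exactly this kind of property-by-property projection:

```csharp
// Data-layer entity (stays internal to the lower layers) and the
// dumb DTO handed up to the UI layer.
internal class ProductEntity
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ProductDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// The business layer applies its rules, then returns only DTOs.
internal static class ProductMappings
{
    public static ProductDto ToDto(ProductEntity entity) =>
        new ProductDto { Id = entity.Id, Name = entity.Name };
}
```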
It really depends on the pattern. If you are using MVC (Front Controller pattern), the model is the domain-specific representation of the data upon which the application operates (generally an ORM helps with this); we use a DATA project for these classes.
Models are not data access objects, so the data access takes the form of repositories in a different project, services hold the business rules, and finally there is the web project. In this approach the Data.dll is referenced in all projects.
The Model is, so to speak, omnipresent.
DATA(Domain Model) -> REPOSITORY(Data Access) -> SERVICE(Business Rules) -> WEB