Exposing an object through a 'view' interface - c#

I've been trying to find a flexible way of exposing an object through a 'view'. I'm probably better off explaining by way of example.
I have an Entity Framework entity model, and a web service that can be used to query it. I am able to return the entity classes themselves, but this would include some fields I might not want to share - IDs, for example, or *Reference properties from any associations in the entity model.
I figure what I need is a view of the data, but I don't particularly want to write a view wrapper class for every return type. I'm hoping I'll be able to define an interface and somehow make use of that. For example:
interface IPersonView
{
string FirstName { get; }
string LastName { get; }
}
// (Web service method)
IPersonView GetPerson(int id)
{
var personEntity = [...];
return GetView<IPersonView>(personEntity);
}
However, in order to do something like this, I'd have to have my entities implement the view interfaces. I was hoping for a more flexible 'duck-typed' approach, as there may be many views of an object, and I don't really want to have to implement them all.
I've had some success building a dynamic type by reflecting the interface and copying fields and properties across, but I'm not able to cast this back to the interface type in order to get strong typing on the web service.
Just looking for some comments and advice, both would be welcome. Thanks.
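(A possible direction for the "cast back to the interface" part of this question: on modern .NET, System.Reflection.DispatchProxy generates a runtime type that actually implements the interface, so the result is strongly typed without the entity ever implementing the view. A sketch only - the entity and interface names are illustrative, not from any real API:)

```csharp
using System;
using System.Reflection;

public interface IPersonView
{
    string FirstName { get; }
    string LastName { get; }
}

public class PersonEntity
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

// Duck-typed view: property reads on the interface are forwarded by
// reflection to a same-named property on the wrapped object.
public class ViewProxy<TView> : DispatchProxy where TView : class
{
    private object _target;

    public static TView Wrap(object target)
    {
        var view = Create<TView, ViewProxy<TView>>();
        ((ViewProxy<TView>)(object)view)._target = target;
        return view;
    }

    protected override object Invoke(MethodInfo targetMethod, object[] args)
    {
        // Interface property getters arrive as e.g. get_FirstName.
        var impl = _target.GetType().GetMethod(targetMethod.Name);
        if (impl == null)
            throw new MissingMethodException(_target.GetType().Name, targetMethod.Name);
        return impl.Invoke(_target, args);
    }
}
```

Usage would then be `IPersonView view = ViewProxy<IPersonView>.Wrap(personEntity);` - the web service can declare `IPersonView` as its return type, and the entity's Id never leaks through the interface.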

You shouldn't ever really be passing entities directly out to a client; they should be used for persistence only. You should introduce DTOs/POCOs tailored to whatever data your API wants to return, e.g.
public class PersonDto
{
public string FirstName { get; set; }
public string LastName { get; set; }
}
// public API method
public PersonDto GetPersonApi(int id)
{
var personEntity = // pull entity from db
return new PersonDto()
{
FirstName = personEntity.FirstName,
LastName = personEntity.LastName
};
}
This keeps a clean separation between your persistence layer and public interface. You can use a tool like AutoMapper to do the legwork of mapping the data across. Just set up the mapping once, e.g. in your Global.asax:
protected void Application_Start()
{
Mapper.CreateMap<Person, PersonDto>();
}
...
// public API method
public PersonDto GetPersonApi(int id)
{
var personEntity = // pull entity from db
return Mapper.Map<Person, PersonDto>(personEntity);
}
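Note that the static Mapper API shown above (Mapper.CreateMap / Mapper.Map) was removed in AutoMapper 9.0. On a newer version, the instance-based equivalent of this setup looks roughly like the following sketch (same Person/PersonDto pair assumed):

```csharp
using AutoMapper;

// Configure once at startup (e.g. in DI registration)...
var config = new MapperConfiguration(cfg => cfg.CreateMap<Person, PersonDto>());
IMapper mapper = config.CreateMapper();

// ...then in the API method:
PersonDto dto = mapper.Map<PersonDto>(personEntity); // personEntity pulled from db
```

In an ASP.NET Core app you'd normally register `IMapper` in the container and inject it rather than creating the configuration inline.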

I typically see this done with AutoMapper or a similar tool. It makes mapping between similar classes much simpler. You still have to create the views (which in an MVC context would be models), but the most tedious part (the mapping) is taken care of for you, as long as you use the same field names.
As a side note, sharing IDs and other reference data will be necessary if you want to update the data, since you'll need to know the keys in order to know which record(s) to update.

Related

Mapping one property in JSON using Automapper or Dapper? On which layer?

TL;DR:
Which approach with deserializing one JSON property would be the purest?
While fetching data from DB, using dapper multimap
While manual assignment, in service (or in some separate factory)
With automapper and MapFrom, in service
Long story:
Considering the following design in DB:

Id | Name  | Type    | JSONAttributes
---+-------+---------+--------------------------------------
1  | "Foo" | "Red"   | {"Attr11":"Val11", "Attr12":"Val12"}
2  | "Bar" | "Green" | {"Attr21":"Val21", "Attr22":"Val22"}
3  | "Baz" | "Blue"  | {"Attr31":"Val31", "Attr32":"Val32"}
Every type has its own attributes, kept in the last column as a JSON string. The type is not returned by the procedure, by the way.
I need to transfer this data all the way up to the controller. I have an onion (I believe) solution structure:
MyProject.Web (with controllers)
MyProject.Core (with services, DTOs, entities, interfaces)
MyProject.Data (with repositories, using Dapper)
... and others, not significant to this topic.
Also, I have created the models to map these rows to - one abstract base class and several derived:
namespace MyProject.Core.DTO.Details
{
public abstract class DtoItem
{
public int Id { get; set; }
public string Name { get; set; }
}
}
namespace MyProject.Core.DTO.Details
{
public class DtoTypeRed : DtoItem
{
public DtoTypeRedAttributes AttributesJson { get; set; }
}
}
namespace MyProject.Core.DTO.Details
{
public class DtoTypeGreen : DtoItem
{
public DtoTypeGreenAttributes AttributesJson { get; set; }
}
}
namespace MyProject.Core.DTO.Details
{
public class DtoTypeRedAttributes
{
[JsonPropertyName("Attr11")]
public string AttributeOneOne { get; set; }
[JsonPropertyName("Attr12")]
public string AttributeOneTwo { get; set; }
}
}
Also created entity, but only used in Option 2 (described later):
namespace MyProject.Core.Entities
{
public class Item
{
public int Id { get; set; }
public string Name { get; set; }
public string AttributesJson { get; set; }
}
}
My question is, what would be a better approach:
Option 1 - mapping to the DTO directly, in ItemRepository, in MyProject.Data, when fetching the data from the DB using Dapper multimap, like:
namespace MyProject.Data
{
public class DetailsRepository : IDetailsRepository
{
public async Task<DtoItem> GetDetails(int itemId, int itemTypeId)
{
using (var connection = _dataAccess.GetConnection())
{
switch (itemTypeId)
{
case 1:
return (await connection.QueryAsync<DtoTypeRed, string, DtoTypeRed>("[spGetDetails]", (redDetails, redDetailsAttributesJson) =>
{
redDetails.AttributesJson = JsonSerializer.Deserialize<List<DtoTypeRedAttributes>>(redDetailsAttributesJson).FirstOrDefault();
return redDetails;
},
splitOn: "AttributesJson",
param: new { itemId, itemTypeId },
commandType: CommandType.StoredProcedure)).FirstOrDefault();
case 2:
return (await connection.QueryAsync<DtoTypeGreen, string, DtoTypeGreen>("[spGetDetails]", (greenDetails, greenDetailsAttributesJson) =>
{
greenDetails.AttributesJson = JsonSerializer.Deserialize<List<DtoTypeGreenAttributes>>(greenDetailsAttributesJson).FirstOrDefault();
return greenDetails;
},
splitOn: "AttributesJson",
param: new { itemId, itemTypeId },
commandType: CommandType.StoredProcedure)).FirstOrDefault();
case ...
default: return null;
}
}
}
}
}
My colleague is suggesting that doing this kind of business logic in repository is not a good approach.
So, there is one alternative (at least one I'm aware of) - fetching the data into an Item entity (leaving the JSON as a flat string) and mapping it to the DTOs in the service layer (MyProject.Core), either with simple assignment (Option 2.1), which I don't find very elegant, or using AutoMapper (Option 2.2):
namespace MyProject.Data
{
public class DetailsRepository : IDetailsRepository
{
public async Task<Item> GetDetails(int itemId, int itemTypeId)
{
using (var connection = _dataAccess.GetConnection())
{
return await connection.QuerySingleAsync<Item>("[spGetDetails]", param: new { itemId ,itemTypeId}, commandType: CommandType.StoredProcedure);
}
}
}
}
Option 2.1:
namespace MyProject.Core.Services
{
public class DetailsService : IDetailsService
{
public async Task<DtoItem> GetDetails(int itemId, int itemTypeId)
{
var itemDetailsEntity = await _detailsRepo.GetDetails(itemId, itemTypeId);
DtoItem result = null;
switch (itemTypeId)
{
case 1:
var red = new DtoTypeRed();
red.Id = itemDetailsEntity.Id;
red.Name = itemDetailsEntity.Name;
red.AttributesJson = JsonSerializer.Deserialize<List<DtoTypeRedAttributes>>(itemDetailsEntity.AttributesJson).FirstOrDefault();
result = red;
break;
case ...
}
return result;
}
}
}
If this is the way, maybe some factory would be better? (I don't have any yet)
Option 2.2 with Automapper:
The thing is, I read somewhere that AutoMapper is not really meant to include JSON deserialization logic - it should be more "auto":
namespace MyProject.Core.Mappings
{
public class MapperProfiles : Profile
{
public MapperProfiles()
{
CreateMap<Entities.Item, DTO.Details.DtoTypeRed>()
.ForMember(dest => dest.AttributesJson, opts => opts.MapFrom(src =>JsonSerializer.Deserialize<List<DtoTypeRedAttributes>>(src.AttributesJson, null).FirstOrDefault()));
(...)
}
}
}
namespace MyProject.Core.Services
{
public class DetailsService : IDetailsService
{
public async Task<DtoItem> GetDetails(int itemId, int itemTypeId)
{
var itemDetailsEntity = await _detailsRepo.GetDetails(itemId, itemTypeId);
switch (itemTypeId)
{
case 1:
return _mapper.Map<DtoTypeRed>(itemDetailsEntity);
case 2:
return _mapper.Map<DtoTypeGreen>(itemDetailsEntity);
case ...
default: return null;
}
}
}
}
I really lack this "architectural" knowledge and experience, so any suggestions would be really appreciated. Maybe some other way I do not see here?
Considering that your software design is rather unusual, I think there are two questions you should ask yourself first:
You're using a relational DB, but are also using it to store JSON (as in a NoSQL DB). Is this a preexisting schema that you have no influence to change? If not, why are you using this design rather than separate tables for the different data structures, as you normally would in a relational schema? That would also give you the advantages of a relational DB, like querying, foreign keys and indexing.
If you have a controller where you hand out your object anyway, you will probably do this as a JSON, right? Then what would be the point of deserializing it? Can't you just leave the JSON as is?
Apart from that, if you want to stick to your options, I would go with a solution similar to your option 1. Your colleague is right that you don't want to have business logic in a repository, since the responsibility of a repository is only to store and query data. Business logic in an onion architecture belongs to your service layer. However, if you are only deserializing data from the DB into a structure that is usable to you in your program, then this logic should be in the repository. All you do in this case is fetch data and transform it into objects, the same thing an ORM tool would do.
One thing I would change in option 1 though, is to move the switch-case to your service layer, since this is business logic. So:
public class DetailsService : IDetailService {
public async Task<Item> GetDetails(int itemId, int itemTypeId) {
Item myItem;
switch (itemTypeId) {
case 1:
myItem = await _repo.GetRedItem(itemId);
break;
// and so on
}
}
}
Your repository would then have methods for the individual item types. So again, all it does is querying and deserializing data.
On your other two options: the point of DTOs usually is to have separate objects which you can share with other processes (like when sending them out in a controller). This way, changes in the DTO then don't influence changes in your entity and vice versa, as well as you can choose which properties you want to include in your DTO or entity and which not. In your case it seems you are only really using the DTOs in your program and the entities are just there as an intermediate step, which makes it pointless to even have any entity.
Also, using AutoMapper in your scenario only makes the code more complicated, which you can see in your code in MapperProfiles, where you probably agree with me that it isn't easy to understand and maintain.
Jakob's answer is good and I agree with a lot of what he says. To add to that...
The challenge you have is that you want to be as pure to Clean Architecture as possible, but you're constrained by a system where there are two conflicting architectural patterns:
The "traditional" approach where the data structures are explicit.
Alternative approach where the data structures are mixed: some data structures are defined by the database (table design) whilst others are structured in the code (JSON) - of which the database has no visibility (to the database it's just a string).
A further complication is at the code level where you have a classic layered architecture with explicitly defined objects and layers, and then you have this mutant JSON thing.
Based on the information so far, I wonder if you could use a generic list for the JSON attributes, because your code is taking the JSON and converting it into what are essentially key-value pairs - at least that's what's suggested by your code:
MyProject.Core.DTO.Details.DtoTypeRedAttributes.AttributeOneOne{ get; set; }
This says to me that the Controller is getting AttributeOneOne which will have a name/key and value inside it. Whatever consumes this will care about the name/key, but to the rest of your code you just have generic attributes.
I'm also suspicious/curious about the type field in the database, which you say is 'not returned by the procedure', yet you have type-specific code, e.g. DtoTypeRedAttributes.
Based on all of that, I would simplify things (which still keeps you within Clean):
Deserializing the JSON into a generic key/value pair:
This could happen either in the data access layer or within the DTO itself.
A straight JSON to key/value pair conversion is purely technical and does not affect any business rule, so there are no business rule/logic drivers that influence where it can happen.
Q: How expensive is the deserialization? And are those attributes always accessed? If it's expensive, and the attributes aren't always used, then you can use lazy loading to only trigger the deserialization when it's needed - this would suggest doing it in the DTO, because if you do it in the data access layer you'll pay that deserialization penalty every time, regardless.
A potential downside, if you care about it, is that you'll have a DTO with a private string RawJSONAttributes property and a public Dictionary<string, string> key/value pair property. I.e. a larger memory footprint, and you'll have to consider consistency (do you care about the private string if someone changes the public values, etc.).
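To make the lazy-load idea concrete, here is a minimal sketch of such a DTO (names hypothetical), using System.Text.Json and Lazy<T> so the raw JSON string is only parsed on first access to the attributes:

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

public class ItemDto
{
    private readonly string _rawJsonAttributes;               // kept private
    private readonly Lazy<Dictionary<string, string>> _attributes;

    public ItemDto(int id, string name, string rawJsonAttributes)
    {
        Id = id;
        Name = name;
        _rawJsonAttributes = rawJsonAttributes;
        // Deserialization runs at most once, on first access to Attributes.
        _attributes = new Lazy<Dictionary<string, string>>(
            () => JsonSerializer.Deserialize<Dictionary<string, string>>(_rawJsonAttributes));
    }

    public int Id { get; }
    public string Name { get; }
    public IReadOnlyDictionary<string, string> Attributes => _attributes.Value;
}
```

Because the attributes are exposed read-only and the raw string is immutable, the consistency concern mentioned above mostly goes away.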
DTO Consolidation
If the DTOs can get away with only providing a generic list of key/value pairs, then based on what else you've said so far, I don't see why you have separate red/green/blue code. One DTO would do it.
If you do need separate types then yes, some sort of factory would probably be a good approach.
The Data Access code could return "generic" DTOs (like WidgetInfo, in my diagram) up to the consuming logic.
That logic (which did the DTO conversion) would probably shield the consumers / Controllers above, so that they only saw BlueWidget, RedWidget (etc) DTOs and not the generic one.
Summary
The diagram shows how I would typically do this (consuming Controllers not shown), assuming generic DTOs were OK. Let me know if I need to diagram out the other approach (conversion of generic WidgetInfo to BlueWidget/RedWidget DTOs).
Note that the DTO has a public property for the attributes (list of key/value pairs); it would also have a private string for the RawJSONAttributes if you wanted to take the Lazy Load approach in the DTO.

Understanding rich domain models and dependencies

I'm trying to get my head around rich domain models and how to build semantic functionality into domain entities, without the domain entities being tightly coupled to the objects that implement that behaviour.
For example, I want to build a User entity into my domain model, but I want its implementation to be driven by Identity Framework:
class User
{
public string Email { get; set; }
... All of the other IdentityUser properties...
public void DisableUser()
{
...behaviour to disable a user, most likely requires UserManager
}
public void AddToRole(Role role)
{
... most likely requires RoleManager
}
}
So now I have a domain model that behaves according to the business rules and is ignorant of persistence and implementation.
But how exactly are DisableUser() and AddToRole() supposed to work when they have no dependencies and aren't in any way coupled to UserManager and RoleManager?
Generally, what am I missing?
Should domain entities have dependencies on objects that provide behavior?
How should I decouple my domain model from implementation providers?
What I do is that I have each of my rich domain model entities receive a reference to the central domain object as a constructor parameter, and store it as a readonly member.
This is easy because the domain acts as the factory of its entities, so whenever it news one of them, it passes this as the first constructor parameter. (Entities are supposed to have assembly-internal constructors so that they cannot be instantiated by anyone but the domain itself.)
And if you really dig into the documentation of ORM frameworks you will usually find that they tend to allow you to supply a factory for your entities, so you can do things like that.
So, since each entity has a reference to the domain, it can obtain from it whatever it needs to do its job. (Presumably, your domain object will contain a reference to a UserManager and to a RoleManager, no?) This is essentially taking a pragmatic step back from dependency injection: you inject the domain object with its dependencies, but you have each entity of the domain fetch its dependencies from the domain object.
Here is an example in java:
package ...
import ...
public final class StarWarsDomain extends Domain
{
private static final Schema SCHEMA = ...
public StarWarsDomain( LogicDomain logicDomain, S2Domain delegeeDomain )
{
super( logicDomain, SCHEMA, delegeeDomain ); //these get stored in final members of 'Domain'
}
public UnmodifiableEnumerable<Film> getAllFilms()
{
return getAllEntitys( Film.COLONNADE ); //method of 'Domain'
}
public Film newFilm( String name )
{
assert !StringHelpers.isNullOrEmptyOrWhitespace( name );
Film film = addEntity( Film.COLONNADE ); //method of 'Domain'
film.setName( name );
return film;
}
}
A well-crafted domain model should have no dependencies on any other architectural layers or services. In this respect, domain model objects should be (in my case) POCOs (Plain Old CLR Objects). Services and layers such as business logic or persistence layers should then depend on these objects and return instances of them.
There are several keys to building a domain model that respects low coupling, high cohesion and persistence ignorance. In one statement, the secret to this is "write the code you wish you had".
Domain Model Example
public class Student
{
// Collections should be encapsulated!
private readonly ICollection<Course> courses;
// Expose constructors that express how students can be created.
// Notice that this constructor calls the default constructor in order to initialize the courses collection.
public Student(string firstName, string lastName, int studentNumber) : this()
{
FirstName = firstName;
LastName = lastName;
StudentNumber = studentNumber;
}
// Don't allow this constructor to be called from code.
// Your persistence layer should however be able to call this via reflection.
private Student()
{
courses = new List<Course>();
}
// This will be used as a primary key.
// We should therefore not have the ability to change this value.
// Leave that responsibility to the persistence layer.
public int Id { get; private set; }
// It's likely that students names or numbers won't change,
// so set these values in the constructor, and let the persistence
// layer populate these fields from the database.
public string FirstName { get; private set; }
public string LastName {get; private set; }
public int StudentNumber { get; private set; }
// Only expose courses via something that is read-only and can only be iterated over.
// You don't want someone overwriting your entire collection.
// You don't want someone clearing, adding or removing things from your collection.
public IEnumerable<Course> Courses => courses;
// Define methods that describe semantic behaviour for what a student can do.
public void Subscribe(Course course)
{
if(courses.Contains(course))
{
throw new Exception("Student is already subscribed to this course");
}
courses.Add(course);
}
public void Unsubscribe(Course course)
{
courses.Remove(course);
}
}
Granted, this domain model object was written with Entity Framework in mind, but it's a far cry from the usual Entity Framework examples (which are anemic domain models by contrast). There are a few caveats that need to be considered when crafting domain model objects in this way, but Entity Framework will persist them (with a little jiggery-pokery), and you get a domain model object that defines a clean, semantic contract to layers that depend on it.

Best way to project ViewModel back into Model

Consider having a ViewModel:
public class ViewModel
{
public int id { get; set; }
public int a { get; set; }
public int b { get; set; }
}
and an original Model like this:
public class Model
{
public int id { get; set; }
public int a { get; set; }
public int b { get; set; }
public int c { get; set; }
public virtual Object d { get; set; }
}
Each time I get the view model I have to put all ViewModel properties one by one into Model. Something like:
var model = Db.Models.Find(viewModel.Id);
model.a = viewModel.a;
model.b = viewModel.b;
Db.SaveChanges();
Which always cause lots of problems. I even sometimes forget to mention some properties and then disaster happens!
I was looking for something like:
Mapper.Map(model, viewModel);
BTW: I use AutoMapper only to convert Model to ViewModel, but going the other way I always face errors.
Overall that might be not the answer, that you are looking for, but here's a quote from AutoMapper author:
I can’t for the life of me understand why I’d want to dump a DTO
straight back in to a model object.
I believe the best way to map from ViewModel to Entity is not to use AutoMapper at all. AutoMapper is a great tool for mapping objects without involving any classes beyond static ones. Otherwise the code gets messier and messier with each added service, and at some point you won't be able to track what caused your field update, collection update, etc.
Specific issues often faced:
Need for non-static classes to do mapping for your entities
You might need to use DbContext to load and reference entities, and you might also need other classes - a tool that uploads images to your file storage, a non-static class that does hashing/salting for passwords, and so on. You either have to pass these to AutoMapper somehow, or inject or create them inside the AutoMapper profile, and both practices cause trouble.
Possible need for multiple mappings over same ViewModel(Dto) -> Entity Pair
You might need different mappings for same viewmodel-entity pair, based on if this entity is an aggregate, or not + based on if you need to reference this entity or reference and update. Overall this is solvable, but causes a lot of not-needed noise in code and is even harder to maintain.
Really dirty code that's hard to maintain.
This one is about automatic mapping for primitives (strings, integers, etc) and manual mapping references, transformed values, etc. Code will look really weird for automapper, you would have to define maps for properties (or not, if you prefer implicit automapper mapping - which is also destructive when paired with ORM) AND use AfterMap, BeforeMap, Conventions, ConstructUsing, etc.. for mapping other properties, which complicates stuff even more.
Complex mappings
When you have to do complex mappings, like mapping from 2+ source classes to 1 destination class, you will have to overcomplicate things even more, probably calling code like:
var target = new Target();
Mapper.Map(source1, target);
Mapper.Map(source2, target);
//etc..
That code causes errors, because you cannot map source1 and source2 together, and the result might depend on the order in which the source classes are mapped onto the target. And that's before you forget to do one of the mappings, or your maps have conflicting mappings over one property, overwriting each other.
These issues might seem small, but on several projects where I faced usage of automapping library for mapping ViewModel/Dto to Entity, it caused much more pain than if it was never used.
Here are some links for you:
Jimmy Bogard, author of AutoMapper about 2-way mapping for your entities
A small article with comments about problems faced when mapping ViewModel->Entity with code examples
Similar question in SO: Best Practices For Mapping DTO to Domain Object?
For this purpose we have written a simple mapper. It maps by name and ignores virtual properties (so it works with Entity Framework). If you want to ignore certain properties, add a PropertyCopyIgnoreAttribute.
Usage:
PropertyCopy.Copy<ViewModel, Model>(vm, dbmodel);
PropertyCopy.Copy<Model, ViewModel>(dbmodel, vm);
Code:
public static class PropertyCopy
{
public static void Copy<TDest, TSource>(TDest destination, TSource source)
where TSource : class
where TDest : class
{
var destProperties = destination.GetType().GetProperties()
.Where(x => !x.CustomAttributes.Any(y => y.AttributeType.Name == nameof(PropertyCopyIgnoreAttribute)) && x.CanRead && x.CanWrite && !x.GetGetMethod().IsVirtual);
var sourceProperties = source.GetType().GetProperties()
.Where(x => !x.CustomAttributes.Any(y => y.AttributeType.Name == nameof(PropertyCopyIgnoreAttribute)) && x.CanRead && x.CanWrite && !x.GetGetMethod().IsVirtual);
var copyProperties = sourceProperties.Join(destProperties, x => x.Name, y => y.Name, (x, y) => x);
foreach (var sourceProperty in copyProperties)
{
var prop = destProperties.FirstOrDefault(x => x.Name == sourceProperty.Name);
prop.SetValue(destination, sourceProperty.GetValue(source));
}
}
}
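The PropertyCopyIgnoreAttribute referenced above isn't shown in the answer; a minimal definition would be something like this sketch:

```csharp
using System;

// Marker attribute: properties carrying it are skipped by PropertyCopy.Copy.
[AttributeUsage(AttributeTargets.Property)]
public sealed class PropertyCopyIgnoreAttribute : Attribute { }
```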
I want to address a specific point in your question, regarding "forgetting some properties and then disaster happens". The reason this happens is that you do not have a constructor on your model; you just have setters that can be set (or not) from anywhere. This is not a good approach for defensive coding.
I use constructors on all my Models like so:
public User(Person person, string email, string username, string password, bool isActive)
{
Person = person;
Email = email;
Username = username;
Password = password;
IsActive = isActive;
}
public Person Person { get; }
public string Email { get; }
public string Username { get; }
public string Password { get; }
public bool IsActive { get; }
As you can see I have no setters, so object construction must be done via constructor. If you try to create an object without all the required parameters the compiler will complain.
With this approach it becomes clear, that tools like AutoMapper don't make sense when going from ViewModel to Model, as Model construction using this pattern is no longer about simple mapping, its about constructing your object.
Also as your Models become more sophisticated you will find that they differ significantly from your ViewModels. ViewModels tend to be flat with simple properties like string, int, bool etc. Models on the other hand often include custom objects. You will notice in my example there is a Person object, but UserViewModel would use primitives instead like so:
public class UserViewModel
{
public int Id { get; set; }
public string LastName { get; set; }
public string FirstName { get; set; }
public string Email { get; set; }
public string Username { get; set; }
public string Password { get; set; }
public bool IsActive { get; set;}
}
So mapping from primitives to complex objects limits AutoMapper's usefulness.
My approach is always manual construction for the ViewModel to Model direction. In the other direction, Model to ViewModel, I often use a hybrid approach: I'd manually map Person to FirstName/LastName, but use a mapper for the simple properties.
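As a sketch of that manual ViewModel-to-Model direction (types simplified from the ones above, and the factory name is my own, hypothetical choice):

```csharp
// Complex member constructed by hand; flat primitives copied directly.
public record Person(string FirstName, string LastName);

public class UserViewModel
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string Email { get; set; }
}

public class User
{
    public User(Person person, string email)
    {
        Person = person;
        Email = email;
    }

    public Person Person { get; }
    public string Email { get; }
}

public static class UserFactory
{
    // ViewModel -> Model: explicit construction, no mapper involved,
    // so the compiler flags any missing constructor argument.
    public static User FromViewModel(UserViewModel vm) =>
        new User(new Person(vm.FirstName, vm.LastName), vm.Email);
}
```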
Edit: Based on the discussion below, AutoMapper is better at unflattening than I believed. Though I will refrain from recommending it one way or the other, if you do use it, take advantage of features like construction and configuration validation to help prevent silent failures.
Use Newtonsoft.Json to serialize the ViewModel first and then deserialize it into the Model.
First we need to Serialize the viewmodel:
var viewmodel = JsonConvert.SerializeObject(companyInfoViewModel);
Then Deserialize it to model:
var model = JsonConvert.DeserializeObject<CompanyInfo>(viewmodel);
Hence, all the data is passed from viewmodel to model easily.
One Line Code:
var company = JsonConvert.DeserializeObject<CompanyInfo>(JsonConvert.SerializeObject(companyInfoViewModel));

How to map to internal properties using Automapper?

We've been using AutoMapper for some time and we think it is a great utility, thanks for creating it!
However, we have a question:
Question
"How do you configure AutoMapper to map a source property to an internal destination property?"
Background
In our layered architecture, DTO objects never leave the Data Access layer; only Domain objects are allowed to pass in and out of it. Thus, from a domain POV, domain objects shouldn't contain any database knowledge. In reality, however, database IDs are very useful to carry around - except the 'business-layer' developer shouldn't know about them.
Solution: add the database IDs to the domain object but mark them as internal so that they aren't exposed to the 'business layer'. Next, expose the internals of the Common layer (which owns the domain objects) to the Data Access layer. Problem solved. Except we can't figure out how to get AutoMapper (> v3.3.0) to work with our internal properties.
In version 3.3.0, BindingFlags were exposed, which used to solve the problem.
Example
Common.Dll
public class Person
{
public Parent Father { get; set; }
internal int FatherId { get; private set; }
}
DataAccess.dll
internal class PersonDto
{
public ParentDto Father { get; set; }
public int FatherId { get; private set; }
}
In our Profile class we have CreateMap<PersonDto, Person>();
Edit 1 - Fixed a typo in the return type of Father.
Edit 2 - Added more info..
In the Common.Dll, we have Services something like this:
public class ParentService
{
public Parent GetFather(Person person)
{
return repo.Parents.FirstOrDefault(parent => parent.Id == person.Father.Id);
}
}
And in the Business.Dll we have developer's using the Services something like this:
var father = parentService.GetFather(son);
// use father separately or assign it to the son. Like so:
// son.Father = father;
The whole point is, we don't want the business developers to have access to son.FatherId from Business.dll, nor do they have access to the DTO object that created the domain object.
Thus, all the 'database' knowledge is encapsulated within in the various Common.dll Services or in the DataAccess.dll.
Thanks.
This question is answered here.
I quote the answer for your convenience:
Just set the ShouldMapProperty property of your configuration object
in the initialize method.
Here is an example using the static API, however, you should be able
to achieve the same in a similar fashion by using the non-static API.
Mapper.Initialize(i =>
{
i.ShouldMapProperty = p => p.GetMethod.IsPublic || p.GetMethod.IsAssembly;
i.CreateMap<Source, Target>();
});
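Worth noting: the static Mapper.Initialize API quoted above was removed in AutoMapper 9.0. With the instance-based API, the same idea would look roughly like this sketch (not verified against every version; Source/Target stand in for your PersonDto/Person pair):

```csharp
using AutoMapper;

var config = new MapperConfiguration(cfg =>
{
    // Map public getters as usual, plus internal (IsAssembly) ones.
    cfg.ShouldMapProperty = p => p.GetMethod.IsPublic || p.GetMethod.IsAssembly;
    cfg.CreateMap<Source, Target>();
});
IMapper mapper = config.CreateMapper();
```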

Rich domain model with behaviours and ORM

After watching the NDC12 presentation "Crafting Wicked Domain Models" by Jimmy Bogard (http://ndcoslo.oktaset.com/Agenda), I was wondering how to persist that kind of domain model.
This is sample class from presentation:
public class Member
{
List<Offer> _offers;
public Member(string firstName, string lastName)
{
FirstName = firstName;
LastName = lastName;
_offers = new List<Offer>();
}
public string FirstName { get; set; }
public string LastName { get; set; }
public IEnumerable<Offer> AssignedOffers {
get { return _offers; }
}
public int NumberOfOffers { get; private set; }
public Offer AssignOffer(OfferType offerType, IOfferValueCalc valueCalc)
{
var value = valueCalc.CalculateValue(this, offerType);
var expiration = offerType.CalculateExpiration();
var offer = new Offer(this, offerType, expiration, value);
_offers.Add(offer);
NumberOfOffers++;
return offer;
}
}
so there are some rules contained in this domain model:
- Member must have first and last name
- Number of offers can't be changed outside
- Member is responsible for creating new offer, calculating its value and assignment
If I try to map this to some ORM like Entity Framework or NHibernate, it will not work.
So, what's best approach for mapping this kind of model to database with ORM?
For example, how do I load AssignedOffers from DB if there's no setter?
The only approach that makes sense to me is a command/query architecture: queries always return DTOs rather than domain entities, and commands are executed against domain models. Also, event sourcing is a perfect fit for behaviours on a domain model. But this kind of CQS architecture may not be suitable for every project, especially a brownfield one. Or is it?
I'm aware of similar questions here, but couldn't find concrete example and solution.
This is actually a very good question and something I have contemplated. It is potentially difficult to create proper domain objects that are fully encapsulated (i.e. no property setters) and use an ORM to build the domain objects directly.
In my experience there are 3 ways of solving this issue:
As already mentioned by Luka, NHibernate supports mapping to private fields rather than property setters.
If using EF (which I don't think supports the above) you could use the memento pattern to restore state to your domain objects, e.g. use Entity Framework to populate 'memento' objects which your domain entities accept in order to set their private fields.
As you have pointed out, using CQRS with event sourcing eliminates this problem. This is my preferred method of crafting perfectly encapsulated domain objects, that also have all the added benefits of event sourcing.
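Option 2 can be sketched roughly as follows; the MemberMemento type and the FromMemento factory are assumptions for illustration, not anything EF prescribes:

```csharp
using System.Collections.Generic;

// Flat 'memento' object: this is what EF actually maps and materializes.
// It is all public get/set, so the ORM has no trouble with it.
public class MemberMemento
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public List<Offer> Offers { get; set; }
}

public class Member
{
    private List<Offer> _offers;

    public string FirstName { get; private set; }
    public string LastName { get; private set; }

    public Member(string firstName, string lastName)
    {
        FirstName = firstName;
        LastName = lastName;
        _offers = new List<Offer>();
    }

    // Internal factory: only the persistence layer can restore state,
    // so the domain object's setters stay private.
    internal static Member FromMemento(MemberMemento memento)
    {
        var member = new Member(memento.FirstName, memento.LastName);
        member._offers = memento.Offers ?? new List<Offer>();
        return member;
    }
}
```

The repository queries MemberMemento through EF and hands the result to Member.FromMemento, keeping the domain model fully encapsulated.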
Old thread, but there's a more recent post (late 2014) by Vaughn Vernon that addresses just this scenario, with particular reference to Entity Framework. Given that I somehow struggled to find such information, maybe it is helpful to post it here as well.
Basically the post advocates for the Product domain (aggregate) object to wrap the ProductState EF POCO data object for what concerns the "data bag" side of things. Of course the domain object would still add all its rich domain behaviour through domain-specific methods/accessors, but it would resort to inner data object when it has to get/set its properties.
Copying snippet straight from post:
public class Product
{
    public Product(
        TenantId tenantId,
        ProductId productId,
        ProductOwnerId productOwnerId,
        string name,
        string description)
    {
        State = new ProductState();
        State.ProductKey = tenantId.Id + ":" + productId.Id;
        State.ProductOwnerId = productOwnerId;
        State.Name = name;
        State.Description = description;
        State.BacklogItems = new List<ProductBacklogItem>();
    }

    internal Product(ProductState state)
    {
        State = state;
    }

    //...

    private readonly ProductState State;
}

public class ProductState
{
    [Key]
    public string ProductKey { get; set; }

    public ProductOwnerId ProductOwnerId { get; set; }

    public string Name { get; set; }

    public string Description { get; set; }

    public List<ProductBacklogItemState> BacklogItems { get; set; }

    ...
}
Repository would use internal constructor in order to instantiate (load) an entity instance from its DB-persisted version.
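As a rough sketch of that loading path (repository and context names are assumed, and the exact EF query API may differ by version):

```csharp
using System.Data.Entity;
using System.Linq;

// Hypothetical repository: EF materializes the flat ProductState object,
// and the internal constructor wraps it in the rich Product domain object.
public class ProductRepository
{
    private readonly DbContext dbContext;

    public ProductRepository(DbContext dbContext)
    {
        this.dbContext = dbContext;
    }

    public Product Get(string productKey)
    {
        var state = dbContext.Set<ProductState>()
            .Include(s => s.BacklogItems)
            .Single(s => s.ProductKey == productKey);

        // internal ctor: only the persistence layer can rehydrate a Product
        return new Product(state);
    }
}
```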
The one bit I can add myself is that the Product domain object should probably be dirtied with one more accessor, just for the purpose of persistence through EF: in the same way as new Product(productState) allows a domain entity to be loaded from the database, the opposite direction should be allowed through something like:
public class Product
{
    // ...

    internal ProductState State
    {
        get
        {
            // return this.State as is, if you trust the caller (repository),
            // or deep clone it and return it
        }
    }
}

// inside repository.Add(Product product):
dbContext.Add(product.State);
For AssignedOffers: if you look at the code, you'll see that AssignedOffers returns the value of a field. NHibernate can populate that field like this: Map(x => x.AssignedOffers).Access.Field().
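In a full Fluent NHibernate class map this might look like the following sketch; the Id property and field-naming convention are assumptions, and since AssignedOffers is a collection, the HasMany variant of field access applies rather than Map:

```csharp
using FluentNHibernate.Mapping;

// Hypothetical mapping: assumes Member exposes an Id property and that the
// backing field follows the _camelCase convention expected by
// Access.CamelCaseField(Prefix.Underscore).
public class MemberMap : ClassMap<Member>
{
    public MemberMap()
    {
        Id(x => x.Id);
        Map(x => x.FirstName);
        Map(x => x.LastName);
        Map(x => x.NumberOfOffers);

        // NHibernate writes straight to the backing field, so the read-only
        // AssignedOffers property needs no setter:
        HasMany(x => x.AssignedOffers)
            .Access.CamelCaseField(Prefix.Underscore)
            .Cascade.AllDeleteOrphan();
    }
}
```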
Agree with using CQS.
When doing DDD, the first thing you do is ignore persistence concerns. The ORM is tightly coupled to an RDBMS, so it's a persistence concern.
An ORM models the persistence structure, NOT the domain. Basically, the repository must 'convert' the received Aggregate Root to one or more persistence entities. The Bounded Context matters a lot, since the Aggregate Root also changes according to what you are trying to accomplish.
Let's say you want to save the Member in the context of a new offer being assigned. Then you'll have something like this (of course, this is only one possible scenario):
public interface IAssignOffer
{
    int OwnerId { get; }
    Offer AssignOffer(OfferType offerType, IOfferValueCalc valueCalc);
    IEnumerable<Offer> NewOffers { get; }
}

public class Member : IAssignOffer
{
    /* implementation */
}

public interface IDomainRepository
{
    void Save(IAssignOffer member);
}
Next, the repo will get only the data required in order to change the NH entities, and that's all.
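One possible implementation sketch of that repository; the NHibernate session usage is real API, but OfferEntity and the fields copied across are hypothetical, invented purely to show the aggregate-to-persistence-entity conversion:

```csharp
using NHibernate;

// Hypothetical persistence entity, shaped for the database rather than the domain.
public class OfferEntity
{
    public virtual int Id { get; set; }
    public virtual int MemberId { get; set; }
    public virtual System.DateTime ExpiresOn { get; set; }
    public virtual decimal Value { get; set; }
}

public class DomainRepository : IDomainRepository
{
    private readonly ISession session;

    public DomainRepository(ISession session)
    {
        this.session = session;
    }

    public void Save(IAssignOffer member)
    {
        // Persist only what this use case changed: the newly assigned offers.
        foreach (var offer in member.NewOffers)
        {
            var entity = new OfferEntity
            {
                MemberId = member.OwnerId,
                // ... copy the remaining offer fields across
            };
            session.Save(entity);
        }
    }
}
```

Note how the repository, not the domain object, knows about the persistence shape; the domain model only exposes the behaviour-centric IAssignOffer view.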
About Event Sourcing, I think you have to see if it fits your domain, and I don't see any problem with using Event Sourcing only for storing domain Aggregate Roots while the rest (mainly infrastructure) is stored in the ordinary way (relational tables). I think CQRS gives you great flexibility in this matter.
