Consider having a ViewModel:
public class ViewModel
{
    public int Id { get; set; }
    public int A { get; set; }
    public int B { get; set; }
}
and an original Model like this:
public class Model
{
    public int Id { get; set; }
    public int A { get; set; }
    public int B { get; set; }
    public int C { get; set; }
    public virtual object D { get; set; }
}
Each time I get the view model I have to copy all the ViewModel properties one by one into the Model. Something like:
var model = Db.Models.Find(viewModel.Id);
model.A = viewModel.A;
model.B = viewModel.B;
Db.SaveChanges();
This always causes lots of problems. I sometimes even forget to assign some properties, and then disaster happens!
I was looking for something like:
Mapper.Map(model, viewModel);
BTW: I use AutoMapper only to convert Model to ViewModel; in the other direction I always run into errors.
Overall, this might not be the answer you are looking for, but here's a quote from the AutoMapper author:
I can’t for the life of me understand why I’d want to dump a DTO
straight back in to a model object.
I believe the best way to map from ViewModel to Entity is not to use AutoMapper at all. AutoMapper is a great tool for mapping objects without involving any classes other than static ones. Otherwise, the code gets messier with each added service, and at some point you won't be able to track what caused a field update, a collection update, etc.
Specific issues often faced:
Need for non-static classes to do mapping for your entities
You might need to use a DbContext to load and reference entities, and you might need other classes as well: a tool that uploads images to your file storage, a non-static class that does hashing/salting for passwords, and so on. You either have to pass these to AutoMapper somehow, or inject/create them inside an AutoMapper profile, and both practices cause trouble.
Possible need for multiple mappings over the same ViewModel (DTO) -> Entity pair
You might need different mappings for the same ViewModel-Entity pair, depending on whether the entity is an aggregate or not, and on whether you need to merely reference the entity or to reference and update it. Overall this is solvable, but it adds a lot of unneeded noise to the code and makes it even harder to maintain.
Really dirty code that's hard to maintain.
This one is about mapping primitives (strings, integers, etc.) automatically while mapping references, transformed values, etc. manually. The AutoMapper code ends up looking really odd: you have to define maps for some properties (or rely on implicit mapping, which is also destructive when paired with an ORM) AND use AfterMap, BeforeMap, conventions, ConstructUsing, etc. for the remaining properties, which complicates things even more.
Complex mappings
When you have to do complex mappings, like mapping from two or more source classes to one destination class, you have to overcomplicate things even further, probably with code like:
var target = new Target();
Mapper.Map(source1, target);
Mapper.Map(source2, target);
//etc..
That code causes errors, because you cannot map source1 and source2 together, and the result may depend on the order in which the source classes are mapped onto the target. And that's not counting the cases where you forget to do one of the mappings, or where your maps have conflicting mappings over one property and overwrite each other.
These issues might seem small, but on several projects where I saw an auto-mapping library used for mapping ViewModels/DTOs to entities, it caused much more pain than if it had never been used.
Here are some links for you:
Jimmy Bogard, the author of AutoMapper, on 2-way mapping for your entities
A small article with comments about problems faced when mapping ViewModel->Entity with code examples
Similar question in SO: Best Practices For Mapping DTO to Domain Object?
For this purpose we have written a simple mapper. It maps by name and ignores virtual properties (so it works with Entity Framework). To skip certain properties, mark them with a PropertyCopyIgnoreAttribute.
Usage:
PropertyCopy.Copy<ViewModel, Model>(vm, dbmodel); // copies dbmodel into vm
PropertyCopy.Copy<Model, ViewModel>(dbmodel, vm); // copies vm into dbmodel
Code:
public static class PropertyCopy
{
    // Copies all matching, non-virtual, read/write properties from source to destination.
    public static void Copy<TDest, TSource>(TDest destination, TSource source)
        where TSource : class
        where TDest : class
    {
        // Eligible properties: readable, writable, not virtual (so EF navigation
        // properties and proxies are skipped) and not marked [PropertyCopyIgnore].
        var destProperties = destination.GetType().GetProperties()
            .Where(x => !x.CustomAttributes.Any(y => y.AttributeType == typeof(PropertyCopyIgnoreAttribute))
                        && x.CanRead && x.CanWrite && !x.GetGetMethod().IsVirtual);
        var sourceProperties = source.GetType().GetProperties()
            .Where(x => !x.CustomAttributes.Any(y => y.AttributeType == typeof(PropertyCopyIgnoreAttribute))
                        && x.CanRead && x.CanWrite && !x.GetGetMethod().IsVirtual);

        // Only copy properties whose names exist on both sides.
        var copyProperties = sourceProperties.Join(destProperties, x => x.Name, y => y.Name, (x, y) => x);
        foreach (var sourceProperty in copyProperties)
        {
            var prop = destProperties.First(x => x.Name == sourceProperty.Name);
            prop.SetValue(destination, sourceProperty.GetValue(source));
        }
    }
}
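The post doesn't show the ignore attribute itself, but it only needs to be an empty marker; a minimal definition could look like this:
// Marker attribute checked by PropertyCopy.Copy; properties carrying it are skipped.
[AttributeUsage(AttributeTargets.Property)]
public class PropertyCopyIgnoreAttribute : Attribute
{
}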
I want to address a specific point in your question, regarding "forgetting some properties and disaster happens". The reason this happens is that your model has no constructor; it just has setters that can be set (or not) from anywhere. That is not a good approach for defensive coding.
I use constructors on all my Models like so:
public User(Person person, string email, string username, string password, bool isActive)
{
    Person = person;
    Email = email;
    Username = username;
    Password = password;
    IsActive = isActive;
}

public Person Person { get; }
public string Email { get; }
public string Username { get; }
public string Password { get; }
public bool IsActive { get; }
As you can see I have no setters, so object construction must be done via constructor. If you try to create an object without all the required parameters the compiler will complain.
With this approach it becomes clear that tools like AutoMapper don't make sense when going from ViewModel to Model: with this pattern, Model construction is no longer simple mapping, it's about constructing your object.
Also, as your Models become more sophisticated you will find that they differ significantly from your ViewModels. ViewModels tend to be flat, with simple properties like string, int, bool, etc. Models, on the other hand, often include custom objects. You will notice that in my example there is a Person object, while UserViewModel uses primitives instead, like so:
public class UserViewModel
{
    public int Id { get; set; }
    public string LastName { get; set; }
    public string FirstName { get; set; }
    public string Email { get; set; }
    public string Username { get; set; }
    public string Password { get; set; }
    public bool IsActive { get; set; }
}
So mapping from primitives to complex objects limits AutoMapper's usefulness.
My approach is always manual construction for the ViewModel-to-Model direction. In the other direction, Models to ViewModels, I often use a hybrid approach: I would manually map Person to FirstName and LastName, but use a mapper for the simple properties.
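To make the contrast concrete, here is a minimal sketch of that manual construction (the Person constructor signature is an assumption for illustration):
// ViewModel -> Model direction: the compiler forces every required value
// to be supplied, so nothing can be forgotten silently.
var user = new User(
    new Person(vm.FirstName, vm.LastName), // assumed Person constructor
    vm.Email,
    vm.Username,
    vm.Password,
    vm.IsActive);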
Edit: Based on the discussion below, AutoMapper is better at unflattening than I believed. Though I will refrain from recommending it one way or the other, if you do use it, take advantage of features like construction and configuration validation to help prevent silent failures.
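For example, with the static API used elsewhere in this thread, configuration validation is a one-liner you call once at startup (a sketch):
// Throws an AutoMapperConfigurationException at startup if any destination
// member of a registered map is left unmapped, instead of failing silently at runtime.
Mapper.AssertConfigurationIsValid();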
Use Newtonsoft.Json to serialize the viewmodel first, then deserialize it into the model.
First we serialize the viewmodel:
var json = JsonConvert.SerializeObject(companyInfoViewModel);
Then deserialize it into the model:
var model = JsonConvert.DeserializeObject<CompanyInfo>(json);
This way, all the data is passed from the viewmodel to the model easily.
As a one-liner:
var company = JsonConvert.DeserializeObject<CompanyInfo>(JsonConvert.SerializeObject(companyInfoViewModel));
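One caveat worth adding (my note, not part of the original answer): if the object graph contains reference loops, e.g. EF navigation properties pointing back at their parent, SerializeObject will throw. Json.NET can be told to skip such loops:
// Skip circular references instead of throwing during serialization.
var settings = new JsonSerializerSettings
{
    ReferenceLoopHandling = ReferenceLoopHandling.Ignore
};
var company = JsonConvert.DeserializeObject<CompanyInfo>(
    JsonConvert.SerializeObject(companyInfoViewModel, settings));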
I faced a problem in a project and have successfully reproduced it in a bare test project.
I have the following DTOs:
public class AppUserDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class IssueDto
{
    public int Id { get; set; }
    public AppUserDto Owner { get; set; }
    public AppUserDto Creator { get; set; }
}
The corresponding models are absolutely the same except that there are model relationships instead of DTOs (obviously).
AutoMapper config:
Mapper.CreateMap<AppUser, AppUserDto>().MaxDepth(1);
Mapper.CreateMap<Issue, IssueDto>().MaxDepth(1);
The simplest of queries:
var i = context.Issues.ProjectTo<IssueDto>().FirstOrDefault();
This always throws a NotSupportedException:
The type 'AppUserDto' appears in two structurally incompatible initializations within a single LINQ to Entities query. A type can be initialized in two places in the same query, but only if the same properties are set in both places and those properties are set in the same order.
This is, of course, a problem coming from AutoMapper.
Now I tried the following:
Mapper.CreateMap<AppUser, AppUserDto>().MaxDepth(1)
    .ProjectUsing(u => new AppUserDto
    {
        Id = u == null ? -1 : u.Id,
        Name = u == null ? null : u.Name,
    });
This makes queries like context.Issues.ProjectTo<IssueDto>()... succeed. But it in turn makes direct mappings for AppUser result in null values (or 0 for Id). So context.Users.ProjectTo<AppUserDto>().FirstOrDefault() (or even Mapper.Map<AppUserDto>(context.Users.FirstOrDefault())) always returns an AppUserDto with default values for its properties.
So, how can I make multiple nested DTO objects of the same type within the same base DTO work, without sacrificing direct mappings for that DTO?
Solving the problem with ProjectUsing (if we can make direct mappings work at the same time) is less than ideal, but if it's the only way to go I can manage.
EDIT:
There is most likely a bug; this is the GitHub issue for anyone who's interested.
The culprit was actually the MaxDepth call itself. It may not seem so, but sticking MaxDepth on every mapping can produce side effects, as I've seen.
As it turns out, I don't have recursion in my DTOs at all (and that's what MaxDepth is for), so simply removing all the MaxDepth calls solves this without needing ProjectUsing.
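In other words, the working configuration is just the two plain maps (sketch):
// Without MaxDepth(1), ProjectTo<IssueDto>() generates a single, structurally
// consistent initialization for AppUserDto and the exception disappears.
Mapper.CreateMap<AppUser, AppUserDto>();
Mapper.CreateMap<Issue, IssueDto>();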
This has been cleared up here.
This is a long one.
So, I have a model and a viewmodel that I'm updating from an AJAX request. A Web API controller receives the viewmodel, and I then update the existing model using AutoMapper, like below:
private User updateUser(UserViewModel entityVm)
{
    User existingEntity = db.Users.Find(entityVm.Id);
    db.Entry(existingEntity).Collection(x => x.UserPreferences).Load();

    Mapper.Map<UserViewModel, User>(entityVm, existingEntity);
    db.Entry(existingEntity).State = EntityState.Modified;

    try
    {
        db.SaveChanges();
    }
    catch
    {
        throw new DbUpdateException();
    }

    return existingEntity;
}
I have AutoMapper configured like so for the User -> UserViewModel (and back) mapping:
Mapper.CreateMap<User, UserViewModel>().ReverseMap();
(Note that explicitly creating the opposite map instead of using ReverseMap exhibits the same behavior.)
I'm having an issue with a member of the Model/ViewModel that is an ICollection of a different object:
[DataContract]
public class UserViewModel
{
    ...
    [DataMember]
    public virtual ICollection<UserPreferenceViewModel> UserPreferences { get; set; }
}
The corresponding model looks like this:
public class User
{
    ...
    public virtual ICollection<UserPreference> UserPreferences { get; set; }
}
The Problem:
Every property of the User and UserViewModel classes maps correctly, except for the ICollections of UserPreferences/UserPreferenceViewModels shown above. When these collections are mapped from the ViewModel to the Model, instead of the existing UserPreference objects being updated with the ViewModel properties, new UserPreference instances are created from the ViewModel.
Model:
public class UserPreference
{
    [Key]
    public int Id { get; set; }

    public DateTime DateCreated { get; set; }

    [ForeignKey("CreatedBy")]
    public int? CreatedBy_Id { get; set; }
    public User CreatedBy { get; set; }

    [ForeignKey("User")]
    public int User_Id { get; set; }
    public User User { get; set; }

    [MaxLength(50)]
    public string Key { get; set; }

    public string Value { get; set; }
}
And the corresponding ViewModel
public class UserPreferenceViewModel
{
    [DataMember]
    public int Id { get; set; }

    [DataMember]
    [MaxLength(50)]
    public string Key { get; set; }

    [DataMember]
    public string Value { get; set; }
}
And the AutoMapper configuration:
Mapper.CreateMap<UserPreference, UserPreferenceViewModel>().ReverseMap();
// Also tried explicitly stating the map with Ignore options, like so (to no avail):
Mapper.CreateMap<UserPreferenceViewModel, UserPreference>().ForMember(dest => dest.DateCreated, opts => opts.Ignore());
When mapping a UserViewModel entity to a User, the ICollection of UserPreferenceViewModels is mapped to the User's ICollection of UserPreferences, as it should be.
However, when this happens, the individual UserPreference objects' properties such as DateCreated, CreatedBy_Id, and User_Id get nulled, as if a new object were created rather than the individual properties being copied.
Further evidence: when mapping a UserViewModel that has only one UserPreference object in its collection, inspecting the DbContext after the map statement shows two local UserPreference objects: one that appears to be a new object created from the ViewModel, and one that is the original from the existing model.
How can I make AutoMapper update the members of an existing Model's collection, rather than instantiate new members from the ViewModel's collection? What am I doing wrong here?
Screenshots to demonstrate before/after Mapper.Map()
This is a limitation of AutoMapper as far as I'm aware. It's helpful to keep in mind that, while the library is popularly used to map to and from view models and entities, it's a generic library for mapping any class to any other class, and as such it doesn't take into account all the eccentricities of an ORM like Entity Framework.
So, here's the explanation of what's happening. When you map a collection to another collection with AutoMapper, you are literally mapping the collection, not the values of the items in that collection onto items in a similar collection. In retrospect this makes sense, because AutoMapper has no reliable, independent way to work out how to line up one item in a collection with another: by id? Which property is the id? Maybe the names should match?
So what's happening is that the original collection on your entity is entirely replaced with a brand new collection composed of brand new item instances. In many situations this wouldn't be a problem, but when you combine it with Entity Framework's change tracking, you've now signaled that the entire original collection should be removed and replaced with a brand new set of entities. Obviously, that's not what you want.
So, how do you solve this? Well, unfortunately it's a bit of a pain. The first step is to tell AutoMapper to ignore the collection completely when mapping to the entity:
Mapper.CreateMap<User, UserViewModel>();
Mapper.CreateMap<UserViewModel, User>()
    .ForMember(dest => dest.UserPreferences, opts => opts.Ignore());
Notice that I broke this up into two maps. You don't need to ignore the collection when mapping to your view model. That won't cause any problems because EF isn't tracking that. It only matters when you're mapping back to your entity class.
But now you're not mapping that collection at all, so how do you get the values back onto the items? Unfortunately, it's a manual process:
foreach (var pref in model.UserPreferences)
{
    var existingPref = user.UserPreferences.SingleOrDefault(m => m.Id == pref.Id);
    if (existingPref == null) // new item
    {
        user.UserPreferences.Add(Mapper.Map<UserPreference>(pref));
    }
    else // existing item
    {
        Mapper.Map(pref, existingPref);
    }
}
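Note that this loop only adds and updates. If the client may also remove preferences, you'd need one more pass; a sketch under that assumption (not part of the original answer):
// Remove entities whose ids no longer appear in the incoming view model.
// ToList() materializes the matches so the collection isn't modified while enumerating.
var removed = user.UserPreferences
    .Where(p => !model.UserPreferences.Any(vm => vm.Id == p.Id))
    .ToList();
foreach (var pref in removed)
{
    user.UserPreferences.Remove(pref);
}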
In the meantime, an AutoMapper extension exists for this particular problem:
cfg.AddCollectionMappers();
cfg.CreateMap<S, D>().EqualityComparison((s, d) => s.ID == d.ID);
With AutoMapper.EF6/EFCore you can also auto-generate all the equality comparisons. Please see AutoMapper.Collection, AutoMapper.EF6, or AutoMapper.Collection.EFCore.
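Applied to the types in this question, the configuration might look roughly like this (a sketch assuming the AutoMapper.Collection package is installed):
Mapper.Initialize(cfg =>
{
    // Collection-aware mapping: items are matched up by the declared equality
    // instead of the destination collection being cleared and rebuilt.
    cfg.AddCollectionMappers();
    cfg.CreateMap<UserPreferenceViewModel, UserPreference>()
       .EqualityComparison((vm, entity) => vm.Id == entity.Id);
});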
According to the AutoMapper source file that handles ICollection (among other things) and the ICollection mapper: the destination collection is cleared by a call to Clear() and then repopulated, so as far as I can see there is no way AutoMapper will be able to do this mapping automagically.
I would implement some logic to loop over the collections and Mapper.Map the items that are the same.
I am having a problem with how to work with an entity (say, an EF entity) and a surrogate type that will be bound to the UI.
Suppose that I have the following classes:
// Db entity
public class Car
{
    public virtual int Id { get; set; }
    public string ChassisNumber { get; set; }
    public virtual string Brand { get; set; }
    public virtual string Name { get; set; }
}

// Surrogate type that reflects some properties of the Car entity;
// this class will be bound to the UI
public class SurrogateCar
{
    public string Brand { get; set; }
    public string Name { get; set; }
}
Now I will be getting a List<Car> from the db and want to create a List<SurrogateCar> that represents my entities. I can do this easily in many ways, one of which is this:
List<Car> cars = CarTable.GetMyCars(); // Dummy method; suppose it returns all entities from the Db.
List<SurrogateCar> surrogates = new List<SurrogateCar>();
foreach (var car in cars)
{
    surrogates.Add(new SurrogateCar { Brand = car.Brand, Name = car.Name });
}
or I can write a custom cast method. But what I worry about is performance. This method will be called frequently, so creating a list and populating it one by one seems like a potential problem to me.
Do you have any better ways to do this, or is it okay to use it like this?
Thanks.
If you have a web service, and that service is always going to return the SurrogateCar class, then you can write your entity query to return the class you want, rather than fetching the class you don't want:
var cars = from c in context.Cars
           where {your condition}
           select new SurrogateCar
           {
               Brand = c.Brand,
               Name = c.Name
           };
If, on the other hand, you need the list of cars all the time, then as Roger pointed out, AutoMapper is great! You just call:
Mapper.CreateMap<Car, SurrogateCar>();
and then use AutoMapper to populate your new list:
surrogates.AddRange(Mapper.Map<IEnumerable<Car>, IEnumerable<SurrogateCar>>(cars));
Don't worry about the performance until you've actually measured that it's your bottleneck! Most probably these mappings between different types aren't that slow.
There are tools out there, e.g. AutoMapper:
http://automapper.org/
Its main purpose isn't performance, though, but letting you write less and simpler code.
I believe what you are really looking for is AutoMapper; it allows for seamless, easy code around this situation. I would not worry too much about the performance unless you actually need to.
Here is an SO question about mapping lists using AutoMapper as well.
After watching the NDC12 presentation "Crafting Wicked Domain Models" by Jimmy Bogard (http://ndcoslo.oktaset.com/Agenda), I was wondering how to persist that kind of domain model.
This is a sample class from the presentation:
public class Member
{
    List<Offer> _offers;

    public Member(string firstName, string lastName)
    {
        FirstName = firstName;
        LastName = lastName;
        _offers = new List<Offer>();
    }

    public string FirstName { get; set; }
    public string LastName { get; set; }

    public IEnumerable<Offer> AssignedOffers
    {
        get { return _offers; }
    }

    public int NumberOfOffers { get; private set; }

    public Offer AssignOffer(OfferType offerType, IOfferValueCalc valueCalc)
    {
        var value = valueCalc.CalculateValue(this, offerType);
        var expiration = offerType.CalculateExpiration();
        var offer = new Offer(this, offerType, expiration, value);
        _offers.Add(offer);
        NumberOfOffers++;
        return offer;
    }
}
so there are some rules contained in this domain model:
- Member must have a first and last name
- The number of offers can't be changed from outside
- Member is responsible for creating a new offer, calculating its value, and assigning it
If I try to map this to an ORM like Entity Framework or NHibernate, it will not work.
So, what's the best approach for mapping this kind of model to a database with an ORM?
For example, how do I load AssignedOffers from the DB if there's no setter?
The only thing that makes sense to me is a command/query architecture: queries always return DTOs rather than domain entities, and commands operate on domain models. Also, event sourcing is a perfect fit for behaviours on a domain model. But maybe this kind of CQS architecture isn't suitable for every project, especially brownfield ones. Or is it?
I'm aware of similar questions here, but couldn't find a concrete example and solution.
This is actually a very good question and something I have contemplated. It is genuinely difficult to create proper domain objects that are fully encapsulated (i.e. no property setters) and have an ORM build them directly.
In my experience there are three ways of solving this issue:
As already mentioned by Luka, NHibernate supports mapping to private fields rather than property setters.
If using EF (which I don't think supports the above), you could use the memento pattern to restore state to your domain objects: you use Entity Framework to populate 'memento' objects, which your domain entities accept in order to set their private fields; see the sketch after this list.
As you have pointed out, using CQRS with event sourcing eliminates this problem. This is my preferred method of crafting perfectly encapsulated domain objects, which also brings all the added benefits of event sourcing.
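A minimal sketch of option 2 (every name here is hypothetical; the point is only that the domain object exposes no public setters yet can still be rehydrated from a plain EF-materialized object):
// EF-materialized 'memento': a dumb data bag that EF can populate freely.
public class MemberMemento
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

public class Member
{
    private int _id;
    private string _firstName;
    private string _lastName;

    public Member(string firstName, string lastName)
    {
        _firstName = firstName;
        _lastName = lastName;
    }

    // Rehydration path used by the repository: state comes straight from the
    // memento, bypassing the public constructor.
    internal Member(MemberMemento memento)
    {
        _id = memento.Id;
        _firstName = memento.FirstName;
        _lastName = memento.LastName;
    }
}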
Old thread, but there's a more recent post (late 2014) by Vaughn Vernon that addresses exactly this scenario, with particular reference to Entity Framework. Given that I somehow struggled to find this information, maybe it's helpful to post it here as well.
Basically, the post advocates having the Product domain (aggregate) object wrap a ProductState EF POCO data object for the "data bag" side of things. The domain object still adds all its rich domain behaviour through domain-specific methods/accessors, but delegates to the inner data object whenever it has to get or set its properties.
Copying the snippet straight from the post:
public class Product
{
    public Product(
        TenantId tenantId,
        ProductId productId,
        ProductOwnerId productOwnerId,
        string name,
        string description)
    {
        State = new ProductState();
        State.ProductKey = tenantId.Id + ":" + productId.Id;
        State.ProductOwnerId = productOwnerId;
        State.Name = name;
        State.Description = description;
        State.BacklogItems = new List<ProductBacklogItemState>();
    }

    internal Product(ProductState state)
    {
        State = state;
    }

    //...

    private readonly ProductState State;
}

public class ProductState
{
    [Key]
    public string ProductKey { get; set; }

    public ProductOwnerId ProductOwnerId { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public List<ProductBacklogItemState> BacklogItems { get; set; }
    ...
}
The repository would use the internal constructor in order to instantiate (load) an entity instance from its DB-persisted version.
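For example, the repository's load path might look roughly like this (a sketch; the DbSet name is an assumption):
// EF materializes the plain state object; the internal constructor
// then wraps it in the rich domain entity.
public Product Get(string productKey)
{
    ProductState state = dbContext.ProductStates.Find(productKey); // hypothetical DbSet
    return state == null ? null : new Product(state);
}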
The one bit I can add myself is that the Product domain object should probably be dirtied with one more accessor, purely for the purpose of persistence through EF: in the same way as new Product(productState) allows a domain entity to be loaded from the database, the opposite direction should be allowed through something like:
public class Product
{
    // ...

    internal ProductState State
    {
        get
        {
            // return this.State as is, if you trust the caller (repository),
            // or deep clone it and return the copy
        }
    }
}

// inside repository.Add(Product product):
dbContext.Add(product.State);
For AssignedOffers: if you look at the code, you'll see that AssignedOffers returns the value of a field. NHibernate can populate that field like this: Map(x => x.AssignedOffers).Access.Field().
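For reference, a rough Fluent NHibernate sketch of such a mapping (hypothetical; it assumes the backing field is named _assignedOffers so the CamelCaseField convention can locate it, and omits the Id mapping a real ClassMap would need):
public class MemberMap : ClassMap<Member>
{
    public MemberMap()
    {
        Map(x => x.FirstName);
        Map(x => x.LastName);
        // Persist the collection through its backing field, so the entity
        // keeps its setter-less public surface.
        HasMany(x => x.AssignedOffers)
            .Access.CamelCaseField(Prefix.Underscore);
    }
}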
Agree with using CQS.
When doing DDD, the first thing you do is ignore persistence concerns. The ORM is tightly coupled to an RDBMS, so it's a persistence concern.
An ORM models the persistence structure, NOT the domain. Basically, the repository must 'convert' the received Aggregate Root into one or many persistence entities. The Bounded Context matters a lot here, since the Aggregate Root also changes according to what you are trying to accomplish.
Let's say you want to save the Member in the context of a new offer being assigned. Then you'll have something like this (of course, this is only one possible scenario):
public interface IAssignOffer
{
    int OwnerId { get; }
    Offer AssignOffer(OfferType offerType, IOfferValueCalc valueCalc);
    IEnumerable<Offer> NewOffers { get; }
}

public class Member : IAssignOffer
{
    /* implementation */
}

public interface IDomainRepository
{
    void Save(IAssignOffer member);
}
Next, the repository will fetch only the data required to change the NH entities, and that's all.
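A sketch of what such a repository might do internally (every persistence-side name here is hypothetical):
public class DomainRepository : IDomainRepository
{
    private readonly ISession _session;

    public DomainRepository(ISession session)
    {
        _session = session;
    }

    public void Save(IAssignOffer member)
    {
        // Touch only the data exposed by IAssignOffer, never the whole aggregate.
        var entity = _session.Get<MemberEntity>(member.OwnerId); // hypothetical NH entity
        foreach (var offer in member.NewOffers)
        {
            entity.Offers.Add(new OfferEntity
            {
                Value = offer.Value,          // assumed Offer members
                Expiration = offer.Expiration
            });
        }
        _session.Update(entity);
    }
}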
Regarding Event Sourcing, I think you have to see whether it fits your domain. I don't see any problem with using Event Sourcing only for storing domain Aggregate Roots, while the rest (mainly infrastructure) is stored in the ordinary way (relational tables). I think CQRS gives you great flexibility in this matter.
In some cases there is a need to return composite DTOs from our repository, where the DTO just has a few properties that are Model properties and its only function is to be a simple composite object (returning an IQueryable<T> is not enough, because there is more information than T).
For example:
Model:
public class Job
{
    public int Id { get; set; }
    //more properties
}

public class JobApplication
{
    public int Id { get; set; }
    //more properties
}
Repository:
IQueryable<JobAndUserApplication> GetJobAndMatchingUserApplication(int userId);
public class JobAndUserApplication
{
    public Job Job { get; set; }
    public JobApplication JobApplication { get; set; }
}
Now I'd like to simply do the following (Project and To are AutoMapper functionality):
// this allows one efficient query to bring in the subproperties of the composite DTO
var jobVmList = jobRepository.GetAllJobsAndMatchingJobApplicationByUser(userId)
    .Project()
    .To<JobVM>()
    .ToList();
So I need a mapping kind of like this:
Mapper.CreateMap<JobAndUserApplication, JobVM>()
    .ForMember(jvm => jvm, opt => opt.ResolveUsing(src => src.Job));
    // many other .ForMembers that are not relevant right now
I am attempting to map the Job property of the DTO directly onto the JobVM (which shares many of the same properties).
My mapping throws the following exception:
Custom configuration for members is only supported for top-level individual members on a type.
What am I doing wrong, and how can I accomplish the mapping from the Job property of the DTO onto the JobVM itself?
Thanks
AutoMapper is telling you that you can only define custom actions on a member (property) of a class, not on the class itself. What you need to do is first create a Job to JobVM map:
Mapper.CreateMap<Job, JobVM>()
and
Mapper.CreateMap<JobAndUserApplication, JobVM>()
being sure to ignore or explicitly set any duplicate properties across the two types. Then run AutoMapper twice, first from the child object:
var jobVM = Mapper.Map<Job, JobVM>(jobAndUserApplication.job);
then from the parent object:
Mapper.Map<JobAndUserApplication, JobVM>(jobAndUserApplication, jobVM);
Or the other way around, depending on how your properties are laid out.
A quick side note: I have a feeling you might be mixing concerns, and my code-smell alarm is going off. I'd take a second look at either your viewmodel or your domain model, as this is not a typical issue I see come up. (Just my $0.02. :-)