Mapping entities to DTOs without duplicated code - C#

I'm trying to get my head around an issue I'm having using Entity Framework (6) in an N-tier application. Since data from the repository (which contains all communication with the database) should be used in a higher tier (the UI, services, etc.), I need to map it to DTOs.
In the database there are quite a few many-to-many relationships, so the data structure can/will get complex somewhere along the line of the application's lifetime. What I stumbled upon is that I am repeating the exact same code when writing the repository methods. An example is my FirmRepository, which contains a GetAll() method and a GetById(int firmId) method.
In the GetById(int firmId) method, I have the following code (incomplete, since there are a lot more relations that need to be mapped to DTOs):
public DTO.Firm GetById(int id)
{
    // Return result
    var result = new DTO.Firm();
    try
    {
        // Database connection
        using (var ctx = new MyEntities())
        {
            // Get the firm from the database
            var firm = (from f in ctx.Firms
                        where f.ID == id
                        select f).FirstOrDefault();

            // If a firm was found, start mapping to DTO object
            if (firm != null)
            {
                result.Address = firm.Address;
                result.Address2 = firm.Address2;
                result.VAT = firm.VAT;
                result.Email = firm.Email;

                // Map zip code and city
                result.City = new DTO.City()
                {
                    CityName = firm.City.City1,
                    ZipCode = firm.City.ZipCode
                };

                // Map ISO code and country
                result.Country = new DTO.Country()
                {
                    CountryName = firm.Country.Country1,
                    ISO = firm.Country.ISO
                };

                // Check if this firm has any exclusive parameters
                if (firm.ExclusiveParameterType_Product_Firm.Any())
                {
                    var exclusiveParamsList = new List<DTO.ExclusiveParameterType>();

                    // Map exclusive parameter types
                    foreach (var param in firm.ExclusiveParameterType_Product_Firm)
                    {
                        // Check that the exclusive parameter type isn't null before proceeding
                        if (param.ExclusiveParameterType != null)
                        {
                            // Create a new exclusive parameter type DTO
                            var exclusiveParameter = new DTO.ExclusiveParameterType()
                            {
                                ID = param.ExclusiveParameterType.ID,
                                Description = param.ExclusiveParameterType.Description,
                                Name = param.ExclusiveParameterType.Name
                            };

                            // Add the new DTO to the list
                            exclusiveParamsList.Add(exclusiveParameter);
                        }
                    }

                    // A lot more objects to map....

                    // Set the list on the result object
                    result.ExclusiveParameterTypes = exclusiveParamsList;
                }
            }
        }

        // Return DTO
        return result;
    }
    catch (Exception e)
    {
        // Log exception
        Logging.Instance.Error(e);
        // Simply return null
        return null;
    }
}
This is just one method. The GetAll() method will then have the exact same mapping logic, which results in duplicated code. Also, when more methods get added, i.e. a Find or Search method, the same mapping needs to be copied again. This is, of course, not ideal.
I have read a lot about the famous AutoMapper framework that can map entities to/from DTOs, but since I have these many-to-many relations it quickly feels bloated with AutoMapper config code. I've also read this article, which makes sense in my eyes: http://rogeralsing.com/2013/12/01/why-mapping-dtos-to-entities-using-automapper-and-entityframework-is-horrible/
Is there any other way of doing this without copy/pasting the same code over and over again?
Thanks in advance!

You can make an extension method on the firm entity (DB.Firm), like this:
public static class Extensions
{
    public static DTO.Firm ToDto(this DB.Firm firm)
    {
        var result = new DTO.Firm();
        result.Address = firm.Address;
        result.Address2 = firm.Address2;
        //...
        return result;
    }
}
Then you can convert a DB.Firm object anywhere in your code with firm.ToDto();
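With the mapping centralized like this, other repository methods can reuse it instead of repeating it. A GetAll() might then look like the following sketch (MyEntities and Firms come from the question; the rest assumes the ToDto extension above):

```csharp
public List<DTO.Firm> GetAll()
{
    using (var ctx = new MyEntities())
    {
        // Materialize the entities first, then map each one through the
        // shared extension method instead of duplicating the mapping code.
        return ctx.Firms
                  .ToList()
                  .Select(f => f.ToDto())
                  .ToList();
    }
}
```

Note the ToList() before Select: the extension method can't be translated to SQL, so the entities have to be materialized before mapping.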

An alternative strategy is to use a combination of the class constructor and explicit and/or implicit conversion operators. These allow you to cast one user-defined entity to another. The feature also has the added benefit of abstracting the process so you aren't repeating yourself.
In your DTO.Firm class, define either an explicit or implicit operator (note: I am making assumptions about the names of your classes):
public class Firm {
    public Firm(DB.Firm firm) {
        Address = firm.Address;
        Email = firm.Email;
        // Note: object initializers use commas, not semicolons
        City = new DTO.City() {
            CityName = firm.City.City1,
            ZipCode = firm.City.ZipCode
        };
        // etc.
    }

    public string Address { get; set; }
    public string Email { get; set; }
    public DTO.City City { get; set; }
    // etc.

    public static explicit operator Firm(DB.Firm f) {
        return new Firm(f);
    }
}
You can then use it in your repository code like this:
public DTO.Firm GetById(int id) {
    using (var ctx = new MyEntities()) {
        var firm = (from f in ctx.Firms
                    where f.ID == id
                    select f).FirstOrDefault();
        return (DTO.Firm)firm;
    }
}

public List<DTO.Firm> GetAll() {
    using (var ctx = new MyEntities()) {
        // Note: Cast<T>() does not invoke user-defined conversion operators,
        // so materialize the entities and cast each one explicitly instead.
        return ctx.Firms
                  .AsEnumerable()
                  .Select(f => (DTO.Firm)f)
                  .ToList();
    }
}
Here's the reference in MSDN.

About mapping: it does not really matter whether you use AutoMapper or write your mappings completely manually in some method (an extension method, or an explicit casting operator, as mentioned in the other answers) - the point is to have the mapping in one place for reusability.
Just remember: you used the FirstOrDefault method, so you actually called the database for a Firm entity. Now, when you use properties of this entity, especially collections, they will be lazy loaded. If you have a lot of them (as you suggest in your question), you may trigger a huge number of additional calls, which can be a problem, especially in a foreach loop. You may end up with dozens of queries and heavy performance issues just to retrieve one DTO. Rethink whether you really need to get such a big object with all its relations.
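If you do need all the related data, eager loading avoids the N+1 query problem. A sketch against the question's model (the navigation property paths are taken from the question's mapping code; in EF6, string-based Include accepts dotted paths):

```csharp
public DB.Firm GetFirmWithRelations(int id)
{
    using (var ctx = new MyEntities())
    {
        // Load the firm and everything the mapping needs in one round trip,
        // so the mapping code triggers no further lazy-load queries.
        return ctx.Firms
                  .Include("City")
                  .Include("Country")
                  .Include("ExclusiveParameterType_Product_Firm.ExclusiveParameterType")
                  .FirstOrDefault(f => f.ID == id);
    }
}
```

Since the entity is returned after the context is disposed, anything not eager-loaded here will be unavailable, which is exactly why the Include list must match what the mapping touches.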
For me, your problem is much deeper and concerns application architecture. I must say, I personally do not like the repository pattern with Entity Framework, in combination with the Unit of Work pattern. It seems to be very popular (at least if you take a look at the Google results for the query), but for me it does not fit very well with EF. Of course, it's just my opinion; you may not agree with me. For me it's just building another abstraction over an already implemented Unit of Work (DbContext) and repositories (DbSet objects). I found this article very interesting on this topic. A command/query separation way of doing things seems much more elegant to me, and it also fits the SOLID rules much better.
As I said, it's just my opinion and you may or may not agree with me. But I hope it gives you some perspective here.

Related

EF 6 get entity after adding

I'm using EF 6 in my .NET MVC app. I have these classes:
public class Member
{
    public int ID { get; set; }
    public string Name { get; set; }
    public int FactoryID { get; set; }
    public Factory Factory { get; set; }
}

public class Factory
{
    public int ID { get; set; }
    public string Name { get; set; }
    public virtual ICollection<Member> Members { get; set; }
}
code to add:
var newMember = new Member();
newMember.Name = "1";
newMember.FactoryID = 2;
context.Members.Add(newMember);
context.SaveChanges();
code to get:
var member = context.Members.SingleOrDefault(x => x.ID == id);
var factory = member.Factory;
So, when I add a Member in one API call and then get the Member in another API call, Member.Factory is defined.
When I try to get the Member right away after adding it, Member.Factory is NULL.
What's the reason for this? And how can it be resolved?
The reason it works sometimes but not others is that EF will provide related entities it knows about when you reference entities by ID. If lazy loading is enabled, EF will go to the DB to pull back any related entities that it doesn't know about. When it comes to serializing responses, though, lazy loading can result in performance issues or cyclic reference errors.
For example, with lazy loading turned off:
If I do something like:
using (var context = new MyContext())
{
    var member = new Member
    {
        FactoryId = 3,
        // ...
    };
    context.Members.Add(member);
    context.SaveChanges();
    return member;
}
the returned Member's "Factory" reference will be null, because that EF context had no notion of what Factory ID 3 actually was. The insert will succeed provided that a data record for Factory ID 3 exists, but the context does not know about it.
If in another example I do something like this:
using (var context = new MyContext())
{
    // Call some code using this context that results in the following running...
    var factory = context.Factories.Single(x => x.FactoryId == 3);
    // more code...
    var member = new Member
    {
        FactoryId = 3,
        // ...
    };
    context.Members.Add(member);
    context.SaveChanges();
    return member;
}
In this case, EF will return Factory #3 along with the member because the context instance knew about Factory #3. When member was saved, the known reference was automatically associated.
The above example uses a DbContext in a using block, which makes the scenario seem obvious. However, with code that uses an IoC container to scope a DbContext to a request, for example, it can be a lot less clear-cut, across the various methods that may be called, which entities the DbContext may or may not know about.
When dealing with references where you will want to return details about entities and their references, or where subsequent code would benefit from accessing those references, my typical advice is to set references, not FKs, in your entities. This way you ensure that the entity you are creating is in a complete, fit-for-purpose state.
For instance:
using (var context = new MyContext())
{
    var factory = context.Factories.Single(x => x.FactoryId == factoryId);
    var member = new Member
    {
        Factory = factory,
        // ...
    };
    context.Members.Add(member);
    context.SaveChanges();
    return member;
}
I avoid exposing FKs entirely within my entities to enforce using the references, and use Shadow Properties (EF Core) and mapping (EF6) to ensure FKs are not accessible in my entities. The trouble with FKs is that when editing entities that have both a reference and a FK column, there are 2 sources of truth. Does updating the Factory reference change the FK, or does updating FactoryId change the reference? What if I have a Factory reference pointing at Factory ID 3, but I change FactoryId on the Member to 4? Some code may depend on the FK, while other code may go to the Factory reference.
Explicitly working with references means that related entities are asserted at that point (rather than waiting for any number of FK violations on SaveChanges). EF will use any references the context has already loaded, or go to the DB if needed.
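In EF Core, hiding the FK behind a shadow property could be configured like this (a sketch; it assumes the question's Member/Factory classes with the FactoryID property removed from the Member class):

```csharp
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Metadata.Builders;

public class MemberConfiguration : IEntityTypeConfiguration<Member>
{
    public void Configure(EntityTypeBuilder<Member> builder)
    {
        // "FactoryID" exists only in the model/database (a shadow property),
        // not on the Member class, so calling code must set the Factory
        // reference rather than assign a FK value directly.
        builder.Property<int>("FactoryID");
        builder.HasOne(m => m.Factory)
               .WithMany(f => f.Members)
               .HasForeignKey("FactoryID");
    }
}
```

This leaves a single source of truth: the Factory navigation reference.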
Where I do use FKs over references is for bulk operations where I just want to update or insert a large amount of data as quickly as possible. In those cases I use a bounded context with bare-bones entity definitions that have FKs and no references: create, set FKs, and save. There is no need to return complete data and references there.

Add() method adding duplicate rows for linked models in Code-First Entity Framework

Following is the action that is adding a Loan request to the database:
[HttpPost]
public ActionResult Add(Models.ViewModels.Loans.LoanEditorViewModel loanEditorViewModel)
{
    if (!ModelState.IsValid)
        return View(loanEditorViewModel);

    var loanViewModel = loanEditorViewModel.LoanViewModel;
    loanViewModel.LoanProduct = LoanProductService.GetLoanProductById(loanViewModel.LoanProductId); // <-- don't want to add to this table in database
    loanViewModel.Borrower = BorrowerService.GetBorrowerById(loanViewModel.BorrowerId); // <-- don't want to add to this table in database

    Models.Loans.Loan loan = AutoMapper.Mapper.Map<Models.Loans.Loan>(loanEditorViewModel.LoanViewModel);
    loanService.AddNewLoan(loan);
    return RedirectToAction("Index");
}
Following is the AddNewLoan() method:
public int AddNewLoan(Models.Loans.Loan loan)
{
    loan.LoanStatus = Models.Loans.LoanStatus.PENDING;
    _LoanService.Insert(loan);
    return 0;
}
And here is the code for Insert()
public virtual void Insert(TEntity entity)
{
    if (entity == null)
        throw new ArgumentNullException(nameof(entity));
    try
    {
        entity.DateCreated = entity.DateUpdated = DateTime.Now;
        entity.CreatedBy = entity.UpdatedBy = GetCurrentUser();
        Entities.Add(entity);
        context.SaveChanges();
    }
    catch (DbUpdateException exception)
    {
        throw new Exception(GetFullErrorTextAndRollbackEntityChanges(exception), exception);
    }
}
It adds one row to the Loans table successfully, but it also adds rows to the LoanProduct and Borrower tables, as I noted in the comments in the first code block.
I checked for the possibility of multiple calls to this action and the Insert method, but each is called once.
UPDATE
I am facing a similar, but opposite, problem here: Entity not updating using Code-First approach
I think these two problems have the same root cause: change tracking. But in one case entities are being added, in the other an entity is not being updated.
The following code seems a bit odd:
var loanViewModel = loanEditorViewModel.LoanViewModel;
loanViewModel.LoanProduct = LoanProductService.GetLoanProductById(loanViewModel.LoanProductId); // <-- don't want to add to this table in database
loanViewModel.Borrower = BorrowerService.GetBorrowerById(loanViewModel.BorrowerId); //<-- don't want to add to this table in database
Models.Loans.Loan loan = AutoMapper.Mapper.Map<Models.Loans.Loan>(loanEditorViewModel.LoanViewModel);
You are setting entity references on the view model, then calling AutoMapper. ViewModels should not hold entity references, and AutoMapper should effectively ignore any referenced entities and only map the entity structure being created. AutoMapper will create new instances based on the data being passed in.
Instead, something like this should work as expected:
// Assuming these will throw if not found? Otherwise assert that these were returned.
var loanProduct = LoanProductService.GetLoanProductById(loanViewModel.LoanProductId);
var borrower = BorrowerService.GetBorrowerById(loanViewModel.BorrowerId);
Models.Loans.Loan loan = AutoMapper.Mapper.Map<Models.Loans.Loan>(loanEditorViewModel.LoanViewModel);
loan.LoanProduct = loanProduct;
loan.Borrower = borrower;
Edit:
The next thing to check is that your services are using the exact same DbContext reference. Are you using dependency injection with an IoC container such as Autofac or Unity? If so, make sure that the DbContext is registered as Instance Per Request or a similar lifetime scope. If the services effectively new up their own DbContext, then the LoanService's DbContext will not know about the instances of the Product and Borrower that were fetched by another service's DbContext.
If you are not using a DI library, then you should consider adding one. Otherwise you will need to update your services to accept a single DbContext with each call, or leverage a Unit of Work pattern such as Mehdime's DbContextScope to let the services resolve their DbContext from the unit of work.
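With Autofac, for example, scoping a single context per web request could look like this sketch (MyDbContext and the service names here are stand-ins for your actual types; InstancePerRequest comes from the Autofac MVC integration):

```csharp
using Autofac;
using Autofac.Integration.Mvc;

var builder = new ContainerBuilder();

// One DbContext instance shared by all services resolved
// within the same HTTP request.
builder.RegisterType<MyDbContext>()
       .AsSelf()
       .InstancePerRequest();

// Services receive that shared context via constructor injection.
builder.RegisterType<LoanProductService>().AsSelf();
builder.RegisterType<BorrowerService>().AsSelf();
builder.RegisterType<LoanService>().AsSelf();

var container = builder.Build();
```

Because all three services resolve the same context instance, the Product and Borrower fetched by one service are tracked by the context the LoanService saves with.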
For example to ensure the same DbContext:
using (var context = new MyDbContext())
{
    var loanProduct = LoanProductService.GetLoanProductById(context, loanViewModel.LoanProductId);
    var borrower = BorrowerService.GetBorrowerById(context, loanViewModel.BorrowerId);

    Models.Loans.Loan loan = AutoMapper.Mapper.Map<Models.Loans.Loan>(loanEditorViewModel.LoanViewModel);
    loan.LoanProduct = loanProduct;
    loan.Borrower = borrower;
    LoanService.AddNewLoan(context, loan);
}
If you are sure that the services are all provided the same DbContext instance, then there may be something odd happening in your Entities.Add() method. Honestly, your solution looks to have far too much abstraction around something as simple as a CRUD create-and-associate operation. This looks like a case of premature optimization for DRY without starting from the simplest solution. The code can more simply scope a DbContext, fetch the applicable entities, create the new instance, associate, add to the DbSet, and SaveChanges. There's no benefit to abstracting out calls for rudimentary operations such as fetching a reference by ID.
public ActionResult Add(Models.ViewModels.Loans.LoanEditorViewModel loanEditorViewModel)
{
    if (!ModelState.IsValid)
        return View(loanEditorViewModel);

    var loanViewModel = loanEditorViewModel.LoanViewModel;
    using (var context = new AppContext())
    {
        var loanProduct = context.LoanProducts.Single(x => x.LoanProductId == loanViewModel.LoanProductId);
        var borrower = context.Borrowers.Single(x => x.BorrowerId == loanViewModel.BorrowerId);

        var loan = AutoMapper.Mapper.Map<Loan>(loanEditorViewModel.LoanViewModel);
        loan.LoanProduct = loanProduct;
        loan.Borrower = borrower;

        context.Loans.Add(loan); // don't forget to add the new loan to the context
        context.SaveChanges();
    }
    return RedirectToAction("Index");
}
Sprinkle in some exception handling and it's done and dusted. No layered service abstractions. From there you can aim to make the action testable by using an IoC container like Autofac to manage the context, and/or by introducing a repository/service layer with a UoW pattern. The above serves as a minimum viable solution for the action; any abstraction should be applied afterwards. Sketch with pencil before cracking out the oils. :)
Using Mehdime's DbContextScope it would look like:
public ActionResult Add(Models.ViewModels.Loans.LoanEditorViewModel loanEditorViewModel)
{
    if (!ModelState.IsValid)
        return View(loanEditorViewModel);

    var loanViewModel = loanEditorViewModel.LoanViewModel;
    using (var contextScope = ContextScopeFactory.Create())
    {
        var loanProduct = LoanRepository.GetLoanProductById(loanViewModel.LoanProductId).Single();
        var borrower = LoanRepository.GetBorrowerById(loanViewModel.BorrowerId);
        var loan = LoanRepository.CreateLoan(loanViewModel, loanProduct, borrower).Single();
        contextScope.SaveChanges();
    }
    return RedirectToAction("Index");
}
In my case I leverage a repository pattern that uses the DbContextScopeLocator to resolve its ContextScope to get a DbContext. The repo manages fetching data and ensures that entities are created with all the data necessary to be complete and valid. I opt for a repository-per-controller rather than something like a generic pattern or a repository/service per entity, because IMO this better supports the Single Responsibility Principle: the code has only one reason to change (it serves one controller, rather than being shared between many controllers with potentially different concerns). Unit tests can mock out the repository to serve the expected data state. Repo get methods return IQueryable so that the consuming logic can decide how it wants to consume the data.
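A repository in that style might be sketched as follows (the entity and context names are hypothetical; IAmbientDbContextLocator is from Mehdime's DbContextScope library and resolves the context created by the ambient ContextScopeFactory.Create() in the controller):

```csharp
public class LoanRepository
{
    private readonly IAmbientDbContextLocator _contextLocator;

    public LoanRepository(IAmbientDbContextLocator contextLocator)
    {
        _contextLocator = contextLocator;
    }

    // Resolves the DbContext from the ambient scope opened by the caller,
    // rather than creating or owning a context itself.
    private AppContext Context
    {
        get { return _contextLocator.Get<AppContext>(); }
    }

    // Returns IQueryable so the consumer decides how to materialize it.
    public IQueryable<LoanProduct> GetLoanProductById(int id)
    {
        return Context.LoanProducts.Where(x => x.LoanProductId == id);
    }
}
```

The controller owns the unit of work (the scope and SaveChanges); the repository only reads and creates entities within it.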
Finally, with the help of the link shared by @GertArnold: Duplicate DataType is being created on every Product Creation
Since all my models inherit a BaseModel class, I modified my Insert method like this:
public virtual void Insert(TEntity entity, params BaseModel[] unchangedModels)
{
    if (entity == null)
        throw new ArgumentNullException(nameof(entity));
    try
    {
        entity.DateCreated = entity.DateUpdated = DateTime.Now;
        entity.CreatedBy = entity.UpdatedBy = GetCurrentUser();
        Entities.Add(entity);

        if (unchangedModels != null)
        {
            foreach (var model in unchangedModels)
            {
                _context.Entry(model).State = EntityState.Unchanged;
            }
        }

        _context.SaveChanges();
    }
    catch (DbUpdateException exception)
    {
        throw new Exception(GetFullErrorTextAndRollbackEntityChanges(exception), exception);
    }
}
And called it like this:
_LoanService.Insert(loan, loan.LoanProduct, loan.Borrower);
By far the simplest way to tackle this is to add the two primitive foreign key properties to the Loan class, i.e. LoanProductId and BorrowerId. For example like this (I obviously have to guess the types of LoanProduct and Borrower):
public int LoanProductId { get; set; }
[ForeignKey("LoanProductId")]
public Product LoanProduct { get; set; }
public int BorrowerId { get; set; }
[ForeignKey("BorrowerId")]
public User Borrower { get; set; }
Without the primitive FK properties you have so-called independent associations, which can only be set by assigning objects whose state must be managed carefully. Adding the FK properties turns them into foreign key associations, which are much easier to set. AutoMapper will simply set these properties when the names match and you're done.
Check Models.Loans.Loan - is it a joined model of the Loans, LoanProduct and Borrower tables?
You have to add the entity like this:
var lentity = new Loans { FirstName = "William", LastName = "Shakespeare" };
context.Loans.Add(lentity);
context.SaveChanges();

How do I simplify the access of a has-many relationship with the entity framework?

Here is what I want to do:
var user = db.User.First(conditions);
user.Book.First();
Here is currently how I have to do that.
var user = db.User.Include("Book").First(conditionsForUser);
user.Book.First();
The reason I want to simplify this is that I don't want to have to specify what is included every time I want to access a relationship. It seems very cumbersome.
e.g.: I would like to just be able to do the following, given I have previously retrieved a user:
user.Book.First()
user.Blog.First()
user.SomeOtherHasManyRelationship.Where(conditions)
Here is what I have so far:
public object RelationshipFor(string relationship)
{
    using (var db = User.DbContext())
    {
        var relationshipType = TypeRepresentedBy(relationship); // unused for now, not sure if I need the type of the relationship
        var myTable = ((ICollection)db.Send(RelationshipName)); // RelationshipName is "User" in this instance.
        var meWithRelationship = myTable.Where(i => i.Send(IdColumn) == Id).Include(relationship); // currently, myTable doesn't know about 'Where' for some reason.
        return meWithRelationship.Send(relationship);
    }
}
And then how that would be used would be the following:
user.RelationshipFor("Book") // returns a list of books
I have some other logic in my code which abstracts that further, which would allow me to do user.Book.First().
Hopefully I can get permission to open-source a lot of this, as I'm modelling a lot of the API after ActiveRecord-style CRUD.
Note that I'm using a set of extensions I made to make dealing with dynamicness less painful: https://github.com/NullVoxPopuli/csharp-extensions
UPDATE 1:
public object RelationshipFor(string relationship)
{
    using (var db = User.DbContext())
    {
        var myTable = (DbSet<DatabaseModels.User>)db.Send(RelationshipName);
        var myInclude = myTable.Include(i => i.Send(relationship));
        var meWithRelationship = myInclude.First(i => (long)i.Send(IdColumn) == Id);
        return meWithRelationship.Send(relationship);
    }
}
For now, I've hard coded the cast of the user in an attempt to just get something working.
My error now is:
Unable to cast object of type 'System.Linq.Expressions.MethodCallExpressionN' to type 'System.Linq.Expressions.MemberExpression'.
This is not a trivial problem, and there's no "one size fits all" approach. What you actually seem to be after is lazy loading, which was not included in EF7 for many reasons.
I don't know what the code you show is supposed to do, but one option would be to introduce a repository pattern, where you specify the "entities to include" at the collection level:
public class UserRepository
{
    private readonly IQueryable<User> _dataSet;

    public UserRepository(IQueryable<User> userDataSet)
    {
        _dataSet = userDataSet;
    }

    public IQueryable<User> Include()
    {
        return _dataSet.Include(u => u.Book)
                       .Include(u => u.Blog);
    }
}
And you can move lots of the logic to a generic base class, leaving you with just the Include() method. You can for example work with strings as you show (or enums, or ...), to only select related entities to include:
public class GenericRepository
{
    // ...
    public IQueryable<User> Include(string includeGroup = null)
    {
        return IncludeGroup(includeGroup);
    }

    protected virtual IQueryable<User> IncludeGroup(string includeGroup)
    {
        return _dataSet;
    }
}
And then in UserRepository:
protected override IQueryable<User> IncludeGroup(string includeGroup)
{
    switch (includeGroup.ToUpperInvariant())
    {
        case "BOOK":
            return _dataSet.Include(u => u.Book)
                           .Include(u => u.Book.Author);
        case "BLOG":
            return _dataSet.Include(u => u.Blog);
        default:
            // Fall back to the base implementation (calling base.Include here
            // would recurse back into this override).
            return base.IncludeGroup(includeGroup);
    }
}
And then use it like this:
var userRepo = new UserRepository(db.User);
var userWithBooks = userRepo.Include("Book");
var firstUser = userWithBooks.FirstOrDefault(u => u.Name == "Foo");
var firstUserFirstBook = firstUser.Book.FirstOrDefault();
One alternative would be to always include all navigation properties (recursively), but that would be a horrible approach in terms of query efficiency, as every query would become one massive join across all related tables, whether that is necessary or not.

C# Generics / Interfaces - Returning different types based on the class

For work, we have specific types of records that come in, but each project has its own implementation. The columns and the like are different, but in general the process is the same (records are broken into batches, batches are assigned, batches are completed, batches are returned, batches are sent out, etc.). Many of the columns are common, too, but sometimes there are name changes (BatchId in one vs Id in another. [Column("name")] takes care of this issue).
Currently this is what I have for the implementation of the batch assignment functionality with the common components given in the interface:
public interface IAssignment
{
    // properties
    ...
    // methods
    T GetAssignmentRecord<T>(int UserId, int BatchId) where T : IAssignment;
    List<T> GetAssignmentRecords<T>(int UserId) where T : IAssignment;
}
Now I currently have two projects that have batch assignment. Due to these being done in EntityFramework, Assignment in Namespace1 and Assignment in Namespace2 are completely different things but are bound by certain common components (an ID, an assigned user, checked in, etc.) which drive all of the methods for returning them.
I think my main question is if I'm doing this incorrectly and if there is a better way to achieve this such that I can pipe data into my Controllers and have the controllers look somewhat similar project to project while having as much of the method work being handled automatically (primarily so that a "fix one, fix all" scenario occurs when I need to do updates).
Here's an example of how I'm doing the implementation for namespace1:
public class Assignment
{
    ...
    public T GetAssignmentRecord<T>(int UserId, int BatchId) where T : IAssignment
    {
        var db = new Database1Context();
        return (T)Convert.ChangeType(db.Assignment.Where(c => c.UserId == UserId && c.BatchId == BatchId && c.Assigned).First(), typeof(T));
    }
}
In the Controller:
Assignment assignment = new Assignment();
var record = assignment.GetAssignmentRecord<Assignment>(userid, batchid);
// do stuff
The controller code is actually how I'm assuming it would work. I've completed the Assignment class and now I'm perplexed as to whether I'm doing it the proper way. The reason I feel this may be incorrect is that I'm basically saying: "The interface is looking for a generic; I'm getting a strongly typed object from the database using Entity Framework; I'm casting it to a generic; and when I make the request, I'm asking for the same strongly typed object that I converted to a generic initially."
Is there a better way of doing this? Or a completely different direction I should be going?
Provided I understood your goal correctly, I'd do it e.g. this way...
interface IAssignment
{
}

interface IRepo<out T> where T : IAssignment
{
    T GetAssignmentRecord(int UserId, int BatchId);
    IEnumerable<T> GetAssignmentRecords(int UserId);
}

class AssignmentRecord : IAssignment
{
}

class AssignmentWeb : IAssignment
{
}

class RepoDb : IRepo<AssignmentRecord>
{
    public AssignmentRecord GetAssignmentRecord(int UserId, int BatchId)
    {
        //using(var db = new MyDbContext())
        //{
        //    return db.Assignment.Where(c => c.UserId == UserId && c.BatchId == BatchId && c.Assigned).First();
        //}
        return new AssignmentRecord();
    }

    public IEnumerable<AssignmentRecord> GetAssignmentRecords(int UserId)
    {
        //using(var db = new MyDbContext())
        //{
        //    return db.Assignment.Where(c => c.UserId == UserId && c.BatchId == BatchId && c.Assigned);
        //}
        return new List<AssignmentRecord>
        {
            new AssignmentRecord(),
            new AssignmentRecord(),
            new AssignmentRecord(),
            new AssignmentRecord(),
            new AssignmentRecord(),
            new AssignmentRecord(),
            new AssignmentRecord(),
            new AssignmentRecord(),
        };
    }
}

class RepoWeb : IRepo<AssignmentWeb>
{
    public AssignmentWeb GetAssignmentRecord(int UserId, int BatchId)
    {
        // fetch it from some web service...
        return new AssignmentWeb();
    }

    public IEnumerable<AssignmentWeb> GetAssignmentRecords(int UserId)
    {
        //using(var db = new MyDbContext())
        //{
        //    return db.Assignment.Where(c => c.UserId == UserId && c.BatchId == BatchId && c.Assigned);
        //}
        return new List<AssignmentWeb>
        {
            new AssignmentWeb(),
            new AssignmentWeb(),
            new AssignmentWeb(),
        };
    }
}

class MYController
{
    public IRepo<IAssignment> Repository { get; set; } // you can inject this e.g. via DI

    public IAssignment GetAssignment(int userid, int batchid)
    {
        return Repository.GetAssignmentRecord(userid, batchid);
    }

    public IEnumerable<IAssignment> GetAllAssignments(int userid)
    {
        return Repository.GetAssignmentRecords(userid);
    }
}

class ProgramAssignment
{
    static void Main(string[] args)
    {
        try
        {
            var controller = new MYController();

            controller.Repository = new RepoDb();
            IAssignment assignment = controller.GetAssignment(0, 0);
            IEnumerable<IAssignment> all = controller.GetAllAssignments(0);

            controller.Repository = new RepoWeb();
            assignment = controller.GetAssignment(0, 0);
            all = controller.GetAllAssignments(0);
        }
        catch
        {
            Console.WriteLine("");
        }
    }
}
As to why the out modifier - there is some more in my other post:
How to make generic class that contains a Set of only its own type or subtypes as Children?
Assuming that the two Assignment classes have different properties (maybe some additional ones), but share some properties, and come from different databases, there are many ways of doing this. But "the best" (for me) is dependency injection.
The activities (methods) in your Assignment class should be moved to a separate "service" class. This increases the modularity of Assignment, as it becomes just a POCO.
For data access, create a separate class (a repository) to retrieve/insert/update/delete your data. It will look something like:
public class AssignmentRepository : IAssignmentRepository
{
    public Assignment GetAssignmentRecord(int userId, int batchId)
    {
    }
}

public class BatchAssignmentRepository : IAssignmentRepository
{
    public Assignment GetAssignmentRecord(int userId, int batchId)
    {
    }
}
If you ask why there are 2 repositories instead of 1 - won't that make the code redundant? Yes it will, but you must also consider that it increases modularity. If you change something in BatchAssignment (maybe a column name, an additional column, etc.), then you do not need to apply the same change in Assignment, and it saves you from "if batchAssignment else" logic inside.
Usage from the caller will look like this:
IAssignmentService service = new AssignmentService();
IAssignmentRepository repository = new AssignmentRepository();
Assignment a = repository.GetAssignmentRecord(userId, batchId);
service.DoSomething(a);
Think about an adapter layer. That layer should transform the incoming data to a common structure/class, which can then be handled consistently, generics notwithstanding. Of course it also re-transforms on the "outbound" side to the shape expected by the particular database. This assumes that no data source has data that is undefined in the others, or that you can define valid default values for the missing data.
I imagine you need different adapters for the different projects. Perhaps this is a job for dependency injection: at runtime you fetch the particular code (adapter class) needed.
Introduction to Unity.
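A minimal sketch of such an adapter layer (all names here are hypothetical illustrations, not from the question; the common fields follow the batch properties the question mentions):

```csharp
// Common structure every project-specific record is transformed into,
// so downstream code handles one shape regardless of the source project.
public class CommonBatchRecord
{
    public int BatchId { get; set; }
    public int AssignedUserId { get; set; }
    public bool CheckedIn { get; set; }
}

// Each project supplies its own adapter for its own record type.
public interface IBatchAdapter<TRecord>
{
    CommonBatchRecord ToCommon(TRecord record);   // inbound transform
    TRecord FromCommon(CommonBatchRecord record); // outbound transform
}
```

A DI container can then resolve the right IBatchAdapter&lt;TRecord&gt; implementation per project, keeping the controllers identical across projects.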

How to get from specific Model to ViewModel and vice versa, using nested classes?

I want to know a good way of converting a Model to a ViewModel and a ViewModel to a Model without AutoMapper or something similar, because I want to understand what happens behind the scenes and learn how to do it myself. Of course, by Model I mean the classes generated by EF.
I have made something like this so far, but have some issues when nested classes are involved:
// to VM
public static Author ToViewModel(EntityAuthor author)
{
    if (author == null)
        return null;

    Author result = new Author();
    result.Id = author.ATH_ID;
    result.FirstName = author.ATH_FirstName;
    result.LastName = author.ATH_LastName;
    return result;
}

public static BlogPost ToViewModel(EntityBlogPost post)
{
    if (post == null)
        return null;

    BlogPost result = new BlogPost();
    result.Id = post.BP_ID;
    result.Title = post.BP_Title;
    result.Url = post.BP_Url;
    result.Description = post.BP_Description;
    result.Author = ToViewModel(post.Author);
    return result;
}
// from VM
public static EntityAuthor ToModel(Author author)
{
    if (author == null)
        return null;

    EntityAuthor result = new EntityAuthor();
    result.ATH_ID = author.Id;
    result.ATH_FirstName = author.FirstName;
    result.ATH_LastName = author.LastName;
    return result;
}

public static EntityBlogPost ToModel(BlogPost post)
{
    if (post == null)
        return null;

    EntityBlogPost result = new EntityBlogPost();
    result.BP_ID = post.Id;
    result.BP_Title = post.Title;
    result.BP_Url = post.Url;
    result.BP_Description = post.Description;
    result.Author = ToModel(post.Author);
    return result;
}
Note: EntityBlogPost holds the foreign key to EntityAuthor. One issue I face now: when I want to edit a BlogPost, its corresponding entity requires the author's foreign key BP_ATH_ID to be set, but it is 0, because the author of the edited post is null, since I don't want to HTTP-POST the author. Still, the author needs to be in the view model because I want to display it (during HTTP-GET). Here is my controller, to understand better (the view is not important):
// GET: I make use of Author for this
public ActionResult Edit(int id)
{
    return View(VMConverter.ToViewModel(new BlogPostService().GetByID(id)));
}

//
// POST: I don't make use of Author for this
[HttpPost]
public ActionResult Edit(BlogPost input)
{
    if (ModelState.IsValid)
    {
        new BlogPostService().Update(VMConverter.ToModel(input));
        return RedirectToAction("List");
    }
    return View(input);
}
At the moment I have some Services behind my controller which work only over the Model (as you can see in my code). The intent was to reuse this "service layer" for other applications as well.
public void Update(EntityBlogPost post)
{
    // let's keep it simple for now
    this.dbContext.Entry(post).State = EntityState.Modified;
    this.dbContext.SaveChanges();
}
Ok, so back to my question. What would be a nice way to handle this transition Model->ViewModel and back?
In my opinion the approach is problematic in both directions.
Model to ViewModel (GET requests)
If you are using a method like this...
public static Author ToViewModel(EntityAuthor author)
...the question is: Where do you get the EntityAuthor author from? Of course you load it from the database using Find or Single or something. This materializes the whole EntityAuthor entity with all properties. Do you need them all in the view? Maybe yes, in this simple example. But imagine a big Order entity with a lot of references to other entities - customer, delivery address, order items, contact person, invoice address, etc., etc. - and you want to display a view with only some properties: due date, customer name, contact person email address.
To apply the ToViewModel method you have to load the EntityOrder with a whole bunch of properties you don't need for the view and you even have to apply Include for the related entities. This again will load all properties of those entities but you need only a selection of them in the view.
The usual way to load only the properties you need for the view is a projection, for example:
var dto = context.Orders
    .Where(o => o.Id == someOrderId)
    .Select(o => new MyViewDto
    {
        DueDate = o.DueDate,
        CustomerName = o.Customer.Name,
        ContactPersonEmailAddress = o.ContactPerson.EmailAddress
    })
    .Single();
As you can see I have introduced a new helper class MyViewDto. Now you could create specific ToViewModel methods:
public static OrderViewModel ToMyViewModel(MyViewDto dto)
The mapping between dto and viewModel is a good candidate for AutoMapper. (You cannot use AutoMapper for the projection step above.)
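With AutoMapper's classic static API, that dto-to-viewModel step could look roughly like this (a sketch assuming OrderViewModel has properties matching MyViewDto by name):

```csharp
// One-time configuration, e.g. at application startup; properties with
// matching names are mapped automatically.
Mapper.CreateMap<MyViewDto, OrderViewModel>();

// Per request, after the projection has produced the dto:
OrderViewModel viewModel = Mapper.Map<MyViewDto, OrderViewModel>(dto);
```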
An alternative is to project directly into the ViewModel, i.e. replace MyViewDto above by OrderViewModel. You then have to expose IQueryable<Order> to the layer where the ViewModels live, though. Some people don't like that; personally I use this approach.
The downside is that you need a lot of different methods of type ToMyViewModel, basically for every view another method.
ViewModel to Model (POST requests)
This is the bigger problem as you already have noticed in your example: Many views don't show full entities or show entity data that are supposed to be "view only" and don't get posted back to the server.
If you use the method (with AutoMapper or not)...
public static EntityAuthor ToModel(Author author)
...you obviously don't create a full EntityAuthor object in most cases because the view represented by the view model Author author doesn't show all properties and at least doesn't post them all back. Using an Update method like this:
this.dbContext.Entry(post).State = EntityState.Modified;
...would destroy the entity partially in the database (or, in the best case, throw an exception because some required FKs or properties are not set correctly). To achieve a correct Update you actually have to merge the values which are stored in the database and are left unchanged with the changed values posted back from the view.
You could use specific Update methods tailored to the view:
public void UpdateForMyView1(EntityBlogPost post)
{
    this.dbContext.EntityBlogPosts.Attach(post);
    this.dbContext.Entry(post).Property(p => p.Title).IsModified = true;
    this.dbContext.Entry(post).Property(p => p.Description).IsModified = true;
    this.dbContext.SaveChanges();
}
This would be a method for a view which only allows to edit Title and Description of an EntityBlogPost. By marking specific properties as Modified EF will only update those columns in the database.
The alternative is to introduce DTOs again and mapping methods between view model and those DTOs:
public static MyUpdateAuthorDto ToMyUpdateAuthorDto(Author author)
This is only property copying or AutoMapper. The Update could be done by:
public void UpdateForMyView1(MyUpdateAuthorDto dto)
{
    var entityAuthor = this.dbContext.EntityAuthors.Find(dto.AuthorId);
    this.dbContext.Entry(entityAuthor).CurrentValues.SetValues(dto);
    this.dbContext.SaveChanges();
}
This updates only the properties which match in EntityAuthor and in dto and marks them as Modified if they did change. This would solve your problem of the missing foreign key because it is not part of the dto and won't be updated. The original value in the database remains unchanged.
Note that SetValues takes an object as parameter, so you could use some kind of reusable Update method:
public void UpdateScalarAuthorProperties(int authorId, object dto)
{
    var entityAuthor = this.dbContext.EntityAuthors.Find(authorId);
    this.dbContext.Entry(entityAuthor).CurrentValues.SetValues(dto);
    this.dbContext.SaveChanges();
}
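Because SetValues matches by property name, the caller of the reusable method above could even pass an anonymous type; the hosting service object (`authorService` here) is a hypothetical name for illustration:

```csharp
// Only the properties present on the anonymous object (and matching
// EntityAuthor property names by assumption) get copied and, if their
// values differ, marked Modified; all other columns stay untouched.
authorService.UpdateScalarAuthorProperties(5, new
{
    FirstName = "John",
    LastName = "Doe"
});
```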
This approach only works for updates of scalar and complex properties. If your view is allowed to change related entities or relationships between entities the procedure is not that easy. For this case I don't know another way than writing specific methods for every kind of Update.
A nice way to handle this transition would be to use AutoMapper. That is what AutoMapper was created for, really.
If you want to learn how it works, use an assembly decompiler (ILSpy is one of them) on AutoMapper.dll.
The magic word here is Reflection.
Start with:
foreach (PropertyInfo prop in typeof(EntityAuthor).GetProperties())
{
    // ...
}
Reflection allows you to list all properties of the source and destination classes and compare their names; when the names match, you can set the property of the destination object via SetValue, using the value read from the source object.
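Put together, a minimal reflection-based mapper along those lines might look like this — a sketch, not production code (no caching of PropertyInfo, no handling of nested classes like the Author inside BlogPost):

```csharp
using System.Reflection;

public static class SimpleMapper
{
    // Copies values between properties that share a name and a compatible type.
    public static TDest Map<TSource, TDest>(TSource source) where TDest : new()
    {
        var dest = new TDest();
        foreach (PropertyInfo destProp in typeof(TDest).GetProperties())
        {
            PropertyInfo sourceProp = typeof(TSource).GetProperty(destProp.Name);
            if (sourceProp != null && sourceProp.CanRead && destProp.CanWrite &&
                destProp.PropertyType.IsAssignableFrom(sourceProp.PropertyType))
            {
                destProp.SetValue(dest, sourceProp.GetValue(source, null), null);
            }
        }
        return dest;
    }
}
```

Note that this only copies properties whose names match exactly, so it would not map `ATH_ID` to `Id` — supporting prefixed entity names like those in the question would need an extra name-translation step.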
