TL;DR:
Which approach to deserializing a single JSON property would be the purest?
While fetching the data from the DB, using Dapper multimap
With manual assignment in the service (or in some separate factory)
With AutoMapper and MapFrom, in the service
Long story:
Consider the following design in the DB:
Id | Name  | Type    | JSONAttributes
---|-------|---------|--------------------------------------
1  | "Foo" | "Red"   | {"Attr11":"Val11", "Attr12":"Val12"}
2  | "Bar" | "Green" | {"Attr21":"Val21", "Attr22":"Val22"}
3  | "Baz" | "Blue"  | {"Attr31":"Val31", "Attr32":"Val32"}
Every type has its own attributes, kept in the last column as a JSON string. (The Type column is not returned by the stored procedure, by the way.)
I need to transfer this data all the way up to the controller. I have an onion (I believe) solution structure:
MyProject.Web (with controllers)
MyProject.Core (with services, DTOs, entities, interfaces)
MyProject.Data (with repositories, using Dapper)
... and others, not significant to this topic.
I have also created models to map these rows to - one abstract base class and several derived ones:
namespace MyProject.Core.DTO.Details
{
    public abstract class DtoItem
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }
}

namespace MyProject.Core.DTO.Details
{
    public class DtoTypeRed : DtoItem
    {
        public DtoTypeRedAttributes AttributesJson { get; set; }
    }
}

namespace MyProject.Core.DTO.Details
{
    public class DtoTypeGreen : DtoItem
    {
        public DtoTypeGreenAttributes AttributesJson { get; set; }
    }
}

namespace MyProject.Core.DTO.Details
{
    public class DtoTypeRedAttributes
    {
        [JsonPropertyName("Attr11")]
        public string AttributeOneOne { get; set; }

        [JsonPropertyName("Attr12")]
        public string AttributeOneTwo { get; set; }
    }
}
I have also created an entity, but it is only used in Option 2 (described later):
namespace MyProject.Core.Entities
{
    public class Item
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public string AttributesJson { get; set; }
    }
}
My question is: which would be the better approach?
Option 1 - mapping to the DTO directly in DetailsRepository in MyProject.Data, when fetching the data from the DB using Dapper multimap, like:
namespace MyProject.Data
{
    public class DetailsRepository : IDetailsRepository
    {
        public async Task<DtoItem> GetDetails(int itemId, int itemTypeId)
        {
            using (var connection = _dataAccess.GetConnection())
            {
                switch (itemTypeId)
                {
                    case 1:
                        return (await connection.QueryAsync<DtoTypeRed, string, DtoTypeRed>("[spGetDetails]",
                            (redDetails, redDetailsAttributesJson) =>
                            {
                                redDetails.AttributesJson = JsonSerializer.Deserialize<DtoTypeRedAttributes>(redDetailsAttributesJson);
                                return redDetails;
                            },
                            splitOn: "AttributesJson",
                            param: new { itemId, itemTypeId },
                            commandType: CommandType.StoredProcedure)).FirstOrDefault();
                    case 2:
                        return (await connection.QueryAsync<DtoTypeGreen, string, DtoTypeGreen>("[spGetDetails]",
                            (greenDetails, greenDetailsAttributesJson) =>
                            {
                                greenDetails.AttributesJson = JsonSerializer.Deserialize<DtoTypeGreenAttributes>(greenDetailsAttributesJson);
                                return greenDetails;
                            },
                            splitOn: "AttributesJson",
                            param: new { itemId, itemTypeId },
                            commandType: CommandType.StoredProcedure)).FirstOrDefault();
                    case ...
                    default: return null;
                }
            }
        }
    }
}
My colleague suggests that doing this kind of business logic in a repository is not a good approach.
So there is one alternative (at least one I'm aware of) - fetching the data into the Item entity (leaving the JSON as a flat string) and mapping it to the DTOs in the service layer (MyProject.Core), either with simple assignment (Option 2.1), which I don't find very elegant, or using AutoMapper (Option 2.2):
namespace MyProject.Data
{
    public class DetailsRepository : IDetailsRepository
    {
        public async Task<Item> GetDetails(int itemId, int itemTypeId)
        {
            using (var connection = _dataAccess.GetConnection())
            {
                return await connection.QuerySingleAsync<Item>("[spGetDetails]", param: new { itemId, itemTypeId }, commandType: CommandType.StoredProcedure);
            }
        }
    }
}
Option 2.1:
namespace MyProject.Core.Services
{
    public class DetailsService : IDetailsService
    {
        public async Task<DtoItem> GetDetails(int itemId, int itemTypeId)
        {
            var itemDetailsEntity = await _detailsRepo.GetDetails(itemId, itemTypeId);
            DtoItem result = null;
            switch (itemTypeId)
            {
                case 1:
                    result = new DtoTypeRed
                    {
                        Id = itemDetailsEntity.Id,
                        Name = itemDetailsEntity.Name,
                        AttributesJson = JsonSerializer.Deserialize<DtoTypeRedAttributes>(itemDetailsEntity.AttributesJson)
                    };
                    break;
                case ...
            }
            return result;
        }
    }
}
If this is the way to go, maybe some factory would be better? (I don't have any yet.)
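For what it's worth, a minimal sketch of such a factory (the class name, method name, and switch-on-type-id shape are my assumptions, not existing code) might look like:

```csharp
using System.Text.Json;

// Hypothetical factory: turns the flat Item entity into the matching DTO.
// DtoItemFactory and Create are illustrative names, not existing code.
public static class DtoItemFactory
{
    public static DtoItem Create(Item entity, int itemTypeId)
    {
        switch (itemTypeId)
        {
            case 1:
                return new DtoTypeRed
                {
                    Id = entity.Id,
                    Name = entity.Name,
                    AttributesJson = JsonSerializer.Deserialize<DtoTypeRedAttributes>(entity.AttributesJson)
                };
            case 2:
                return new DtoTypeGreen
                {
                    Id = entity.Id,
                    Name = entity.Name,
                    AttributesJson = JsonSerializer.Deserialize<DtoTypeGreenAttributes>(entity.AttributesJson)
                };
            default:
                return null;
        }
    }
}
```

The service then reduces to a single call, `return DtoItemFactory.Create(itemDetailsEntity, itemTypeId);`, keeping the type switch in exactly one place.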
Option 2.2, with AutoMapper:
The thing is, I have read somewhere that AutoMapper is not really meant to contain JSON deserialization logic - it should be more "auto".
namespace MyProject.Core.Mappings
{
    public class MapperProfiles : Profile
    {
        public MapperProfiles()
        {
            CreateMap<Entities.Item, DTO.Details.DtoTypeRed>()
                .ForMember(dest => dest.AttributesJson,
                    opts => opts.MapFrom(src => JsonSerializer.Deserialize<DtoTypeRedAttributes>(src.AttributesJson, null)));
            (...)
        }
    }
}
namespace MyProject.Core.Services
{
    public class DetailsService : IDetailsService
    {
        public async Task<DtoItem> GetDetails(int itemId, int itemTypeId)
        {
            var itemDetailsEntity = await _detailsRepo.GetDetails(itemId, itemTypeId);
            switch (itemTypeId)
            {
                case 1:
                    return _mapper.Map<DtoTypeRed>(itemDetailsEntity);
                case 2:
                    return _mapper.Map<DtoTypeGreen>(itemDetailsEntity);
                case ...
            }
        }
    }
}
I really lack this "architectural" knowledge and experience, so any suggestions would be much appreciated. Maybe there is some other way I am not seeing?
Considering your software design is rather unusual, I think there are two questions you should ask yourself first:
You're using a relational DB, but you are also using it to store JSON (as in a NoSQL DB). Is this a preexisting schema that you have no influence on changing? If not, why are you using this design instead of separate tables for the different data structures, as you normally would in a relational schema? That would also give you the advantages of a relational DB, like querying, foreign keys, and indexing.
If you have a controller where you hand out your object anyway, you will probably hand it out as JSON, right? Then what would be the point of deserializing it? Can't you just leave the JSON as it is?
Apart from that, if you want to stick to your options, I would go with a solution similar to your Option 1. Your colleague is right that you don't want business logic in a repository, since the responsibility of a repository is only to store and query data. Business logic in an onion architecture belongs in your service layer. However, if all you are doing is deserializing data from the DB into a structure your program can use, then that logic belongs in the repository. All you do in that case is fetch data and transform it into objects - the same thing an ORM tool would do.
One thing I would change in Option 1, though, is to move the switch-case to your service layer, since that is business logic. So:
public class DetailsService : IDetailsService
{
    public async Task<DtoItem> GetDetails(int itemId, int itemTypeId)
    {
        DtoItem myItem = null;
        switch (itemTypeId)
        {
            case 1:
                myItem = await _repo.GetRedItem(itemId);
                break;
            // and so on
        }
        return myItem;
    }
}
Your repository would then have methods for the individual item types. So again, all it does is query and deserialize data.
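To illustrate, one such per-type method might look like the following sketch (the method name GetRedItem and pinning itemTypeId = 1 in the parameters are my assumptions; the Dapper call mirrors the question's Option 1 code):

```csharp
// Sketch: a per-type repository method - all it does is query and deserialize.
public async Task<DtoTypeRed> GetRedItem(int itemId)
{
    using (var connection = _dataAccess.GetConnection())
    {
        return (await connection.QueryAsync<DtoTypeRed, string, DtoTypeRed>(
            "[spGetDetails]",
            (details, attributesJson) =>
            {
                // Pure data transformation, no business decisions here.
                details.AttributesJson = JsonSerializer.Deserialize<DtoTypeRedAttributes>(attributesJson);
                return details;
            },
            splitOn: "AttributesJson",
            param: new { itemId, itemTypeId = 1 },
            commandType: CommandType.StoredProcedure)).FirstOrDefault();
    }
}
```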
On your other two options: the point of DTOs is usually to have separate objects you can share with other processes (such as sending them out from a controller). That way, changes to the DTO don't force changes to your entity and vice versa, and you can choose which properties to include in the DTO or entity and which not. In your case it seems you are only really using the DTOs in your program, and the entities are just there as an intermediate step, which makes it pointless to even have an entity.
Also, using AutoMapper in your scenario only makes the code more complicated, as you can see in your MapperProfiles code - you will probably agree with me that it isn't easy to understand and maintain.
Jakob's answer is good and I agree with a lot of what he says. To add to that...
The challenge you have is that you want to be as pure to Clean Architecture as possible, but you're constrained by a system with two conflicting architectural patterns:
The "traditional" approach, where the data structures are explicit.
An alternative approach, where the data structures are mixed: some are defined by the database (table design) whilst others are structured in the code (JSON) - of which the database has no visibility (to the database it's just a string).
A further complication is at the code level, where you have a classic layered architecture with explicitly defined objects and layers, and then you have this mutant JSON thing.
Based on the information so far, I wonder if you can use a generic list for the JSON attributes - because your code is taking the JSON and converting it into what are essentially key/value pairs - at least that's what's suggested by your code:
MyProject.Core.DTO.Details.DtoTypeRedAttributes.AttributeOneOne { get; set; }
This says to me that the controller is getting AttributeOneOne, which will have a name/key and a value inside it. Whatever consumes this will care about the name/key, but to the rest of your code these are just generic attributes.
I'm also suspicious/curious about the Type field in the database, which you say is "not returned by the procedure" - yet you have type-specific code, e.g. DtoTypeRedAttributes.
Based on all of that, I would simplify things (which still keeps you within Clean):
Deserializing the JSON into generic key/value pairs:
This could happen either in the data access layer or within the DTO itself.
A straight JSON-to-key/value-pair conversion is purely technical and does not affect any business rule, so there are no business rule/logic drivers that dictate where it has to happen.
Q: How expensive is the deserialization, and are those attributes always accessed? If it's expensive and the attributes aren't always used, you can use Lazy Load to trigger the deserialization only when it's needed - which would suggest doing it in the DTO, because if you do it in the data access layer you pay the deserialization penalty every time, regardless.
A potential downside, if you care about it, is that you'll have a DTO with a private string RawJSONAttributes property and a public Dictionary<string, string> JSONAttributes key/value-pair property. I.e. a larger memory footprint, and you'll have to consider consistency (do you care about the private string if someone changes the public values, etc.).
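A minimal sketch of the Lazy Load idea (the class and member names here are invented for illustration):

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

// Hypothetical generic DTO: keeps the raw JSON string and deserializes it
// only on first access to the Attributes property.
public class WidgetInfo
{
    private readonly string _rawJsonAttributes;
    private readonly Lazy<Dictionary<string, string>> _attributes;

    public WidgetInfo(int id, string name, string rawJsonAttributes)
    {
        Id = id;
        Name = name;
        _rawJsonAttributes = rawJsonAttributes;
        // The deserialization delegate runs only the first time Attributes is read.
        _attributes = new Lazy<Dictionary<string, string>>(
            () => JsonSerializer.Deserialize<Dictionary<string, string>>(_rawJsonAttributes));
    }

    public int Id { get; }
    public string Name { get; }
    public IReadOnlyDictionary<string, string> Attributes => _attributes.Value;
}
```

Exposing the pairs as a read-only dictionary also sidesteps the consistency concern above: consumers cannot change the public values, so the private string never goes stale.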
DTO Consolidation
If the DTOs can get away with only providing a generic list of key/value pairs, then based on what else you've said so far, I don't see why you have separate red/green/blue code. One DTO would do it.
If you do need separate types, then yes, some sort of factory would probably be a good approach:
The data access code could return "generic" DTOs (like WidgetInfo, in my diagram) up to the consuming logic.
That logic (which does the DTO conversion) would shield the consumers/controllers above, so that they only see BlueWidget, RedWidget (etc.) DTOs and not the generic one.
Summary
The diagram shows how I would typically do this (consuming controllers not shown), assuming generic DTOs are OK. Let me know if I need to diagram out the other approach (conversion of the generic WidgetInfo to BlueWidget/RedWidget DTOs).
Note that the DTO has a public property for the attributes (a list of key/value pairs); it would also have a private string for the RawJSONAttributes if you wanted to take the Lazy Load approach in the DTO.
I have a search model class that searches different entity sets, with the entities themselves implementing an IAssignable interface. The code looks like this:
public void Search()
{
    List<T> lessons = new List<T>();
    List<T> courses = new List<T>();

    if (ShowLessons)
        lessons = db.Set<Lesson>()
            .Where(IAssignableExtensions.SearchPredicate(q))
            .Select(LessonMapping).ToList();

    if (ShowCourses)
        courses = db.Set<Course>()
            .Where(IAssignableExtensions.SearchPredicate(q))
            .Select(CourseMapping).ToList();

    Results = lessons.Union(courses).ToList<T>();
}
The static extension is irrelevant; it just searches based on the query. I would prefer to break this out into its own class rather than a static extension, but eh. Now, this works as expected: I am pulling two datasets into memory, lessons and courses, and unioning them into an IEnumerable of a generic type based on the CourseMapping or LessonMapping expressions.
public Expression<Func<IAssignable, T>> LessonMapping { get; set; }
public Expression<Func<IAssignable, T>> CourseMapping { get; set; }
The problem is when I want to do any kind of paging. As you can see, the lessons and courses are searched and brought into memory, then unioned and returned. If I do any paging, using an IPagedList for example, it brings back ALL lessons and courses and then uses only a subset of the total data for the pages.
If Entity Framework supported interfaces, I would just cast on the interface and union right at the DB call. I haven't changed this code yet, but I feel I might have to create a custom stored procedure or use the Query call on the data context. But if I use a stored procedure I have to make sure to update it on any change to the domain, and if I use Query I have to re-jig the selects and interfaces and still worry about inline SQL...
Anyone have any ideas?
UPDATE
The solution I ended up using, after thinking about Erik's solution, was to just use a projected object that implements IAssignable:
public class SomeProjection : IAssignable
{
    public int ID { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public string Privacy { get; set; }
}
And then used it within the union call, keeping everything queryable:
Results = db.Set<Lesson>().Select(p => new SomeProjection() { Privacy = p.Privacy, ID = p.ID, Name = p.Name, Description = p.Description })
    .Union(db.Set<Course>().Select(p => new SomeProjection() { Privacy = p.Privacy, ID = p.ID, Name = p.Name, Description = p.Description }))
    .Where(IAssignableExtensions.SearchPredicate(q))
    .Select(Mapping).ToList<T>();
If Entity Framework supported interfaces I would just do a cast on the interface and union right at the db call.
It has nothing to do with what Entity Framework supports. If you create an interface, it is independent of the SQL technology in the back end - and you want EF to somehow magically select properties based on an interface, with no mappings or configuration? Not going to happen.
Instead, you could simply use inheritance if some properties are the same between the objects; then you don't even need to union them, unless you are where-ing on properties that don't exist on both.
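As a sketch of the inheritance idea (the base-class name is invented, and the EF inheritance mapping - TPH or TPC - would still need to be configured):

```csharp
// Hypothetical shared base class instead of the IAssignable interface.
// Once EF maps the hierarchy, one query covers both types - no Union needed.
public abstract class AssignableItem
{
    public int ID { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public string Privacy { get; set; }
}

public class Lesson : AssignableItem { /* lesson-specific properties */ }
public class Course : AssignableItem { /* course-specific properties */ }

// Paging can then happen in the database rather than in memory:
// var page = db.Set<AssignableItem>()
//     .Where(IAssignableExtensions.SearchPredicate(q))
//     .OrderBy(x => x.Name)
//     .Skip(pageIndex * pageSize)
//     .Take(pageSize)
//     .ToList();
```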
Okay, I've seen some questions similar to this one, but the answers either confused me or seemed completely over-engineered, so I'd like to ask my own question.
I have a class called Tree, which has an object property of class Plot, which has an object property of class Year, which has an object property of class Series, which has a string property called Id. This is summarized below.
public class Tree {
    public virtual Plot Plot { get; set; }
    // other properties...
}

public class Plot {
    public virtual Year Year { get; set; }
    // other properties...
}

public class Year {
    public virtual Series Series { get; set; }
    // other properties...
}

public class Series {
    public virtual string Id { get; set; }
    // other properties...
}
Each of these classes corresponds to a table in the database, and the properties correspond to foreign key fields (for example, the Trees table has a field called PlotKey, which refers to a record in the Plots table). All I want to do is load all trees from the database whose corresponding Series has the Id "Adrian_2012" or "IPED Sample". I thought this would be a pretty easy task using the following code:
IList<Tree> trees = session.CreateCriteria<Tree>()
    .Add(Expression.Or(
        Expression.Eq("Plot.Year.Series.Id", "Adrian_2012"),
        Expression.Eq("Plot.Year.Series.Id", "IPED Sample")
    ))
    .List<Tree>();
But this throws: "NHibernate.Exceptions.GenericADOException: could not execute query". I have tried using Expression.Disjunction, I have tried using aliases, Restrictions, and SimpleExpressions, and I know that nothing silly like unmapped properties or misspelled criteria is going on. The only other thing I've seen that might help is the ISession.QueryOver<>() method, but I get very confused by lambda expressions. Does anyone have a solution that uses just a simple CreateCriteria<> statement like the one above?
Thanks in advance!
One not-so-nice side of Criteria queries is that we have to define the association chain explicitly, i.e. we have to introduce a JOIN:
15.4. Associations (quote):
"You may easily specify constraints upon related entities by navigating associations using CreateCriteria()."
So to get a JOIN we need syntax like this:
var trees = session
    .CreateCriteria<Tree>()
    .CreateCriteria("Plot", "p")
    .CreateCriteria("Year", "y")
    .CreateCriteria("Series", "s")
    .Add(Expression.Or(
        Expression.Eq("s.Id", "Adrian_2012"),
        Expression.Eq("s.Id", "IPED Sample")
    ))
    .List<Tree>();
Also, check this:
NHibernate - CreateCriteria vs CreateAlias
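For comparison, a sketch of the same query written with CreateAlias (which keeps the chain rooted on the Tree criteria instead of nesting):

```csharp
// Sketch: one alias per association hop, restrictions applied via the last alias.
var trees = session
    .CreateCriteria<Tree>()
    .CreateAlias("Plot", "p")
    .CreateAlias("p.Year", "y")
    .CreateAlias("y.Series", "s")
    .Add(Restrictions.Or(
        Restrictions.Eq("s.Id", "Adrian_2012"),
        Restrictions.Eq("s.Id", "IPED Sample")))
    .List<Tree>();
```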
In an application I'm working on, I have what are essentially a bunch of lookup tables in a database, each containing two things: an ID (int) and a Value (string).
There are only a handful of them, but I want to map all of them to a single context that depends on the table name. Something like:
class LookupContext : DbContext
{
    public DbSet<Lookup> Lookups { get; set; }

    public LookupContext(String table)
    {
        // Pseudo code:
        // Bind Lookups based on what table is
        Lookups = MyDatabase.BindTo(table);
    }
}
So if I create new LookupContext("foo"), it binds against the foo table. If I do new LookupContext("bar"), it uses the bar table, and so forth.
Is there any way to do this, or do I have to create a separate context + model for every table I have?
This is more or less my first time doing this, so I'm not really sure whether what I'm doing is right.
The answer we should be able to give you is "use enums", but that's not quite available yet - it's coming in the next version of EF. See here for details: http://blogs.msdn.com/b/adonet/archive/2011/06/30/walkthrough-enums-june-ctp.aspx
With earlier versions of EF, you can simply create a class per lookup value (taking State as an example) and have code that looks something like the following:
public class State
{
    public int StateId { get; set; }
    public string StateName { get; set; }
}

public class LookupContext : DbContext
{
    public DbSet<State> States { get; set; }
    // ... more lookups as DbSets
}
This will allow you to use one context, but it still requires one class per table. You can also use the fluent API if you want your table/column names to differ from your class/property names. Hope that helps!
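For example, a fluent-API mapping along these lines (the table and column names here are invented for illustration) lets the class and property names diverge from the schema:

```csharp
using System.Data.Entity; // EF 4.1+ DbContext API

public class LookupContext : DbContext
{
    public DbSet<State> States { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Map the State class to a differently named table...
        modelBuilder.Entity<State>().ToTable("tbl_states");
        // ...and a property to a differently named column.
        modelBuilder.Entity<State>()
            .Property(s => s.StateName)
            .HasColumnName("state_nm");
    }
}
```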
I actually realized I was completely overcomplicating things beyond reason. There was no reason to store multiple tables of two columns each.
I'm better off storing my data as:
public class LookupValue
{
    public string LookupValueId { get; set; }
    public string Value { get; set; }
    public string LookupType { get; set; }
}
where the third field is simply the name of the table I was previously storing in the database.
I'm still interested in the idea of mapping a single context class to multiple tables, but I believe what I described above is the least convoluted way of accomplishing what I need.
I call a repository to retrieve the root object of a complex object graph, using Fluent NHibernate. But for some lower-level objects I don't want to retrieve all elements, only those where a date parameter matches a certain condition. In the code below, I want the lower-level Order objects to be filtered this way on the OrderTime field.
Meaning: I want to retrieve all UserGroups, with all Users, but the Orders collection of each User shall only contain orders from a specific date or date range.
So what are my options for retrieving this object graph? I don't want lazy loading; I just want to specify a handful of different retrieval conditions, which will never change. So they can be separate functions of the repository, as suggested at the end. But how would I go about coding those methods - how do I specify these conditions?
Objects:
public class UserGroup
{
    public int Id;
    public IList<User> Users;
}

public class User
{
    public int Id;
    public string Name;
    public IList<Order> Orders;
}

public class Order
{
    public int Id;
    public decimal Price;
    public System.DateTime OrderTime;
}
Repository:
public class UserGroupRepository
{
    public List<UserGroup> GetAll()
    {
        using (ISession session = FNH_Manager.OpenSession())
        {
            return session.CreateCriteria(typeof(UserGroup)).List<UserGroup>().ToList();
        }
    }
}
Potential new repository methods?
public List<UserGroup> GetAll_FilterOrderDate(System.DateTime _date)
{
}

public List<UserGroup> GetAll_FilterOrderDate(List<System.DateTime> _dates)
{
}
It really depends on what you want to do with the orders.
Is there a reason you need to query on the aggregate root? Would it make sense to query over the actual orders by date instead? You'd end up with:
session.QueryOver<Order>().Where(t => t.OrderTime > ...);
If your associations are set up correctly, you'll still be able to navigate to the user.
Personally, I find the repository pattern a bit restrictive and would rather use query objects, so you'd end up with something like:
queryService.FindAll<UserGroup>(new GetAllByFilterOrderDate(DateTime.Now));
However, if the concept of a repository works for you then by all means stick with it - just be aware that it tends to force your object model into this UserGroup-centric view.
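To make the query-object suggestion concrete, here is a sketch (the interface shape and names are my assumptions; the criteria chain mirrors the UserGroup -> Users -> Orders graph from the question):

```csharp
using System;
using System.Collections.Generic;
using NHibernate;
using NHibernate.Criterion;

// Hypothetical query-object shape; names are illustrative.
public interface IQueryObject<TResult>
{
    IList<TResult> Execute(ISession session);
}

public class GetAllByFilterOrderDate : IQueryObject<UserGroup>
{
    private readonly DateTime _date;

    public GetAllByFilterOrderDate(DateTime date)
    {
        _date = date;
    }

    public IList<UserGroup> Execute(ISession session)
    {
        // Join down the association chain and restrict by order date.
        return session.CreateCriteria<UserGroup>()
            .CreateCriteria("Users")
            .CreateCriteria("Orders")
            .Add(Restrictions.Ge("OrderTime", _date))
            .List<UserGroup>();
    }
}
```

One caveat: a restriction like this decides which groups match, but it does not by itself trim the contents of each User's Orders collection; filtering the collection contents would additionally need something like an NHibernate filter.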