ADO.NET: build complex objects using DataReader - C#

I'm trying to run a single query against the database and build the right objects from the rows it returns. That is, I use a SQL query that returns many rows, and then I build my collections based on those rows. E.g.:
We have a table called People and another table called Phones.
Let's suppose this is my SQL query, which returns the results shown below:
SELECT
P.[Id], P.[Name], PH.[PhoneNumber]
FROM
[dbo].[People] P
INNER JOIN
[dbo].[Phones] PH ON PH.[Person] = P.[Id]
And that's the results returned:
1 NICOLAS (123)123-1234
1 NICOLAS (235)235-2356
So, my class will be:
public interface IModel {
    void CastFromReader(IDataReader reader);
}

public class PhoneModel : IModel {
    public string PhoneNumber { get; set; }

    public PhoneModel() { }

    public PhoneModel(IDataReader reader) : this() {
        CastFromReader(reader);
    }

    public void CastFromReader(IDataReader reader) {
        PhoneNumber = (string) reader["PhoneNumber"];
    }
}

public class PersonModel : IModel {
    public int Id { get; set; }
    public string Name { get; set; }
    public IList<PhoneModel> Phones { get; set; }

    public PersonModel() {
        Phones = new List<PhoneModel>();
    }

    public PersonModel(IDataReader reader) : this() {
        CastFromReader(reader);
    }

    public void CastFromReader(IDataReader reader) {
        Id = Convert.ToInt32(reader["Id"]);
        Name = (string) reader["Name"];

        var phone = new PhoneModel();
        phone.CastFromReader(reader);
        Phones.Add(phone);

        // or
        Phones.Add(new PhoneModel {
            PhoneNumber = (string) reader["PhoneNumber"]
        });
    }
}
This code will generate a PersonModel object with two phone numbers. That's good so far.
However, I'm struggling to find a good way to handle more tables with this process.
Let's suppose, then, that I have a new table called Appointments. It stores the user's scheduled appointments.
So, adding this table to the query, the result will be:
1 NICOLAS (123)123-1234 17/09/2014
1 NICOLAS (123)123-1234 19/09/2014
1 NICOLAS (123)123-1234 27/09/2014
1 NICOLAS (235)235-2356 17/09/2014
1 NICOLAS (235)235-2356 19/09/2014
1 NICOLAS (235)235-2356 27/09/2014
As you guys can see, the problem is managing the phones and the appointments this way. Can you think of anything that could solve this issue?
Thank you all for the opinions!

You cannot transfer your query result to strongly typed objects without first defining these objects' types. If you want to keep query data in memory, I recommend that you transfer it into objects of a previously defined type at some point.
What follows is therefore not something that I would actually recommend doing. But I want to demonstrate to you a possibility. Judge for yourself.
As I suggested in a previous comment, you can mimic strongly typed DTOs using the Dynamic Language Runtime (DLR), which became available with .NET 4.
Here is an example of a custom DynamicObject type that provides a seemingly strongly typed façade for an IDataReader.
using System;
using System.Data;
using System.Dynamic; // needs assembly references to System.Core & Microsoft.CSharp

public static class DataReaderExtensions
{
    public static dynamic AsDynamic(this IDataReader reader)
    {
        return new DynamicDataReader(reader);
    }

    private sealed class DynamicDataReader : DynamicObject
    {
        private readonly IDataReader reader;

        public DynamicDataReader(IDataReader reader)
        {
            this.reader = reader;
        }

        // this method gets called for late-bound member (e.g. property) access
        public override bool TryGetMember(GetMemberBinder binder, out object result)
        {
            // note: IDataRecord.GetOrdinal throws IndexOutOfRangeException for
            // unknown column names instead of returning -1, so probe manually
            for (int i = 0; i < reader.FieldCount; i++)
            {
                if (string.Equals(reader.GetName(i), binder.Name, StringComparison.OrdinalIgnoreCase))
                {
                    result = reader.GetValue(i);
                    return true;
                }
            }
            result = null;
            return false;
        }
    }
}
Then you can use it like this:
using (IDataReader reader = someSqlCommand.ExecuteReader(…))
{
    dynamic current = reader.AsDynamic(); // façade representing the current record
    while (reader.Read())
    {
        // the magic happens in the following two lines:
        int id = current.Id;        // = reader.GetInt32(reader.GetOrdinal("Id"))
        string name = current.Name; // = reader.GetString(reader.GetOrdinal("Name"))
        …
    }
}
But beware, with this implementation, all you get is a façade for the current record. If you want to keep data of several records in memory, this implementation won't help a lot. For that purpose, you could look into several further possibilities:
Use anonymous objects: cachedRecords.Add(new { current.Id, current.Name });. This is only useful if you access cachedRecords in the same method where you build it, because the anonymous type will not be usable outside of that method.
Cache current's data in an ExpandoObject.
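For that second option, here is a minimal sketch (the helper name RecordCache is mine, not from the answer above): copying the current record's columns into an ExpandoObject detaches the data from the reader, so it survives the next reader.Read().

```csharp
using System.Collections.Generic;
using System.Data;
using System.Dynamic;

public static class RecordCache
{
    // Copies every column of the current record into an ExpandoObject,
    // whose members can later be accessed via dynamic (e.g. cached.Name).
    public static ExpandoObject Capture(IDataRecord record)
    {
        var expando = new ExpandoObject();
        // ExpandoObject exposes its members as a string/object dictionary
        var members = (IDictionary<string, object>)expando;
        for (int i = 0; i < record.FieldCount; i++)
            members[record.GetName(i)] = record.GetValue(i);
        return expando;
    }
}
```

Inside the read loop, cachedRecords.Add(RecordCache.Capture(reader)) then keeps one detached snapshot per row.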

If you want to manually write a data type for each combination of columns resulting from your queries, then you have a lot of work to do, and you will end up with lots of very similar, but slightly different classes that are hard to name. Note also that these data types should not be treated as something more than what they are: Data Transfer Objects (DTOs). They are not real domain objects with domain-specific behaviour; they should just contain and transport data, nothing else.
What follows are two suggestions, or ideas. I will only scratch at the surface here and not go into too many details; since you haven't asked a very specific question, I won't provide a very specific answer.
1. A better approach might be to determine what domain entity types you've got (e.g. Person, Appointment) and what domain value types you have (e.g. Phone Number), and then build an object model from that:
struct PhoneNumber { … }

partial interface Person
{
    int Id { get; }
    string Name { get; }
    PhoneNumber PhoneNumber { get; }
}

partial interface Appointment
{
    DateTime Date { get; }
    Person[] Participants { get; }
}
and then have your database code map to these. If, for example, some query returns a Person Id, Person Name, Phone Number, and an Appointment Date, then each attribute will have to be put into the correct entity type, and they will have to be linked together (e.g. via Participants) correctly. Quite a bit of work. Look into LINQ to SQL, Entity Framework, NHibernate or any other ORM if you don't want to do this manually. If your database model and your domain model are too different, even these tools might not be able to make the translation.
2. If you want to hand-code your data query layer that transforms data into a domain model, you might want to set up your queries in such a way that if they return one attribute A of entity X, and entity X has other attributes B, C, and D, then the query should also return these, such that you can always build a complete domain object from the query result. For example, if a query returned a Person Id and a Person Phone Number, but not the Person Name, you could not build Person objects (as defined above) from the query because the name is missing.
This second suggestion will at least partially save you from having to define lots of very similar DTO types (one per attribute combination). This way, you can have a DTO for a Person record, another for a Phone Number record, another for an Appointment record, perhaps (if needed) another for a combination of Person and Phone Number; but you won't need to distinguish between types such as PersonWithAllAttributes, PersonWithIdButWithoutNameOrPhoneNumber, PersonWithoutIdButWithPhoneNumber, etc. You'll just have Person containing all attributes.
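Coming back to the original join example, one way to avoid a DTO type per attribute combination and still cope with the duplicated rows is to fold the flat result into one DTO per person, keyed on Id, while iterating the reader. A rough sketch (the DTO and method names here are illustrative, not from the question's code):

```csharp
using System;
using System.Collections.Generic;
using System.Data;

// Illustrative DTO mirroring the question's PersonModel/PhoneModel pair.
public class PersonDto
{
    public int Id { get; set; }
    public string Name { get; set; }
    public List<string> PhoneNumbers { get; } = new List<string>();
}

public static class PersonReader
{
    // Folds the flat JOIN result (one row per person/phone pair) into
    // one PersonDto per distinct Id, accumulating the phone numbers.
    public static IList<PersonDto> ReadPeople(IDataReader reader)
    {
        var byId = new Dictionary<int, PersonDto>();
        while (reader.Read())
        {
            int id = Convert.ToInt32(reader["Id"]);
            if (!byId.TryGetValue(id, out var person))
            {
                person = new PersonDto { Id = id, Name = (string)reader["Name"] };
                byId.Add(id, person);
            }
            person.PhoneNumbers.Add((string)reader["PhoneNumber"]);
        }
        return new List<PersonDto>(byId.Values);
    }
}
```

The same idea extends to the Appointments join: track each nested collection by its own identity (e.g. a HashSet keyed on phone number or appointment date) so the Cartesian product of the join doesn't insert duplicates.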


Mapping one property in JSON using Automapper or Dapper? On which layer?

TL;DR:
Which approach to deserializing one JSON property would be the purest?
1. While fetching data from the DB, using Dapper multimap
2. With manual assignment, in the service (or in some separate factory)
3. With AutoMapper and MapFrom, in the service
Long story:
Considering the following design in the DB:

Id | Name  | Type    | JSONAttributes
---+-------+---------+--------------------------------------
1  | "Foo" | "Red"   | {"Attr11":"Val11", "Attr12":"Val12"}
2  | "Bar" | "Green" | {"Attr21":"Val21", "Attr22":"Val22"}
3  | "Baz" | "Blue"  | {"Attr31":"Val31", "Attr32":"Val32"}
Every type has its own attributes, kept in the last column as a JSON string. (The Type column, by the way, is not returned by the procedure.)
I need to transfer this data all the way up to the controller. I have an onion (I believe) solution structure:
MyProject.Web (with controllers)
MyProject.Core (with services, DTOs, entities, interfaces)
MyProject.Data (with repositories, using Dapper)
... and others, not significant to this topic.
Also I have created the models to map these rows to. One abstract, generic and several derived:
namespace MyProject.Core.DTO.Details
{
    public abstract class DtoItem
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }
}

namespace MyProject.Core.DTO.Details
{
    public class DtoTypeRed : DtoItem
    {
        public DtoTypeRedAttributes AttributesJson { get; set; }
    }
}

namespace MyProject.Core.DTO.Details
{
    public class DtoTypeGreen : DtoItem
    {
        public DtoTypeGreenAttributes AttributesJson { get; set; }
    }
}

namespace MyProject.Core.DTO.Details
{
    public class DtoTypeRedAttributes
    {
        [JsonPropertyName("Attr11")]
        public string AttributeOneOne { get; set; }

        [JsonPropertyName("Attr12")]
        public string AttributeOneTwo { get; set; }
    }
}
Also created entity, but only used in Option 2 (described later):
namespace MyProject.Core.Entities
{
    public class Item
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public string AttributesJson { get; set; }
    }
}
My question is, what would be a better approach:
Option 1 - mapping to the DTO directly, in ItemRepository, in MyProject.Data, when fetching the data from the DB using Dapper multimap, like:
namespace MyProject.Data
{
    public class DetailsRepository : IDetailsRepository
    {
        public async Task<DtoItem> GetDetails(int itemId, int itemTypeId)
        {
            using (var connection = _dataAccess.GetConnection())
            {
                switch (itemTypeId)
                {
                    case 1:
                        return (await connection.QueryAsync<DtoTypeRed, string, DtoTypeRed>("[spGetDetails]",
                            (redDetails, redDetailsAttributesJson) =>
                            {
                                redDetails.AttributesJson = JsonSerializer.Deserialize<List<DtoTypeRedAttributes>>(redDetailsAttributesJson).FirstOrDefault();
                                return redDetails;
                            },
                            splitOn: "AttributesJson",
                            param: new { itemId, itemTypeId },
                            commandType: CommandType.StoredProcedure)).FirstOrDefault();
                    case 2:
                        return (await connection.QueryAsync<DtoTypeGreen, string, DtoTypeGreen>("[spGetDetails]",
                            (greenDetails, greenDetailsAttributesJson) =>
                            {
                                greenDetails.AttributesJson = JsonSerializer.Deserialize<List<DtoTypeGreenAttributes>>(greenDetailsAttributesJson).FirstOrDefault();
                                return greenDetails;
                            },
                            splitOn: "AttributesJson",
                            param: new { itemId, itemTypeId },
                            commandType: CommandType.StoredProcedure)).FirstOrDefault();
                    case ...
                    default: return null;
                }
            }
        }
    }
}
My colleague is suggesting that doing this kind of business logic in repository is not a good approach.
So there is one alternative (at least the only one I'm aware of): fetching the data into an Item entity (leaving the JSON as a flat string) and mapping it to the DTOs on the service layer (MyProject.Core), either with simple assignment (Option 2.1), which I don't find very elegant, or using AutoMapper (Option 2.2):
namespace MyProject.Data
{
    public class DetailsRepository : IDetailsRepository
    {
        public async Task<Item> GetDetails(int itemId, int itemTypeId)
        {
            using (var connection = _dataAccess.GetConnection())
            {
                return await connection.QuerySingleAsync<Item>("[spGetDetails]", param: new { itemId, itemTypeId }, commandType: CommandType.StoredProcedure);
            }
        }
    }
}
Option 2.1:
namespace MyProject.Core.Services
{
    public class DetailsService : IDetailsService
    {
        public async Task<DtoItem> GetDetails(int itemId, int itemTypeId)
        {
            var itemDetailsEntity = await _detailsRepo.GetDetails(itemId, itemTypeId);
            DtoItem result = null;
            switch (itemTypeId)
            {
                case 1:
                    var red = new DtoTypeRed();
                    red.Id = itemDetailsEntity.Id;
                    red.Name = itemDetailsEntity.Name;
                    red.AttributesJson = JsonSerializer.Deserialize<List<DtoTypeRedAttributes>>(itemDetailsEntity.AttributesJson).FirstOrDefault();
                    result = red;
                    break;
                case ...
            }
            return result;
        }
    }
}
If this is the way, maybe some factory would be better? (I don't have any yet)
Option 2.2 with Automapper:
The thing is, I read somewhere that AutoMapper is not really meant to include JSON deserialization logic - it should be more "auto":
namespace MyProject.Core.Mappings
{
    public class MapperProfiles : Profile
    {
        public MapperProfiles()
        {
            CreateMap<Entities.Item, DTO.Details.DtoTypeRed>()
                .ForMember(dest => dest.AttributesJson, opts => opts.MapFrom(src => JsonSerializer.Deserialize<List<DtoTypeRedAttributes>>(src.AttributesJson, null).FirstOrDefault()));
            (...)
        }
    }
}
namespace MyProject.Core.Services
{
    public class DetailsService : IDetailsService
    {
        public async Task<DtoItem> GetDetails(int itemId, int itemTypeId)
        {
            var itemDetailsEntity = await _detailsRepo.GetDetails(itemId, itemTypeId);
            switch (itemTypeId)
            {
                case 1:
                    return _mapper.Map<DtoTypeRed>(itemDetailsEntity);
                case 2:
                    return _mapper.Map<DtoTypeGreen>(itemDetailsEntity);
                case ...
            }
        }
    }
}
I really lack this "architectural" knowledge and experience, so any suggestions would be really appreciated. Maybe some other way I do not see here?
Considering your software design is rather unusual, I think there are two questions you should ask yourself first:
You're using a relational DB, but are also using it to store JSON (as in a NoSQL DB). Is this a preexisting schema that you have no influence on changing? If not, why are you using this design and not separate tables for the different data structures, as you normally would in a relational schema? This would also give you the advantages of a relational DB, like querying, foreign keys, and indexing.
If you have a controller where you hand out your object anyway, you will probably do this as JSON, right? Then what would be the point of deserializing it? Can't you just leave the JSON as is?
Apart from that, if you want to stick to your options, I would go with a solution similar to your option 1. Your colleague is right that you don't want to have business logic in a repository, since the responsibility of a repository is only to store and query data. Business logic in an onion architecture belongs to your service layer. However, if you are only deserializing data from the DB into a structure that is usable to you in your program, then this logic should be in the repository. All you do in this case is fetch data and transform it into objects, the same thing an ORM tool would do.
One thing I would change in option 1 though, is to move the switch-case to your service layer, since this is business logic. So:
public class DetailsService : IDetailService {
    public async Task<Item> GetDetails(int itemId, int itemTypeId) {
        Item myItem;
        switch (itemTypeId) {
            case 1:
                myItem = await _repo.GetRedItem(itemId);
                break;
            // and so on
        }
        return myItem;
    }
}
Your repository would then have methods for the individual item types. So again, all it does is querying and deserializing data.
On your other two options: the point of DTOs usually is to have separate objects that you can share with other processes (like when sending them out in a controller). This way, changes in the DTO don't force changes in your entity and vice versa, and you can choose which properties to include in your DTO or entity and which not. In your case it seems you are only really using the DTOs in your program and the entities are just there as an intermediate step, which makes it pointless to even have an entity.
Also, using AutoMapper in your scenario only makes the code more complicated, which you can see in your code in MapperProfiles, where you probably agree with me that it isn't easy to understand and maintain.
Jakob's answer is good and I agree with a lot that he says. To add to that...
The challenge you have is that you want to be as pure to Clean Architecture as possible, but you're constrained by a system where there are two conflicting architectural patterns:
The "traditional" approach where the data structures are explicit.
An alternative approach where the data structures are mixed: some are defined by the database (table design) whilst others are structured in the code (JSON) - of which the database has no visibility (to the database it's just a string).
A further complication is at the code level where you have a classic layered architecture with explicitly defined objects and layers, and then you have this mutant JSON thing.
Based on the information so far, I wonder if you can use a generic list for the JSON attributes - because your code is taking the JSON and converting it into essentially key/value pairs - at least, that's what's suggested by your code:
MyProject.Core.DTO.Details.DtoTypeRedAttributes.AttributeOneOne { get; set; }
This says to me that the Controller is getting AttributeOneOne, which will have a name/key and a value inside it. Whatever consumes this will care about the name/key, but to the rest of your code these are just generic attributes.
I'm also suspicious/curious about the Type field in the database, which you say is 'not returned by the procedure', yet you have type-specific code, e.g. DtoTypeRedAttributes.
Based on all of that, I would simplify things (which still keeps you within Clean):
Deserializing the JSON into a generic key/value pair:
This could happen either in the data access layer or within the DTO itself.
A straight JSON to Key/Value pair conversion is purely technical and does not affect any business rule, so there's no business rule/logic drivers that influence where that can happen.
Q: How expensive is the deserialization? And are those attributes always accessed? If it's expensive, and the attributes aren't always used, then you can use Lazy Load to trigger the deserialization only when it's needed - this would suggest doing it in the DTO, because if you do it in the data access layer you'll pay the deserialization penalty every time, regardless.
A potential downside, if you care about it, is that you'll have a DTO with a private string RawJSONAttributes property and a public key/value pair property (e.g. a Dictionary<string, string> JSONAttributes). I.e. a larger memory footprint, and you'll have to consider consistency (do you care about the private string if someone changes the public values, etc.).
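A sketch of that Lazy Load idea (assuming System.Text.Json; the DTO shape and names are hypothetical): the raw JSON stays private and is deserialized at most once, on first access to the public property.

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

// Hypothetical DTO: the raw JSON string is captured privately and
// deserialized lazily, so the cost is paid only when (and if) needed.
public class ItemDto
{
    private readonly Lazy<Dictionary<string, string>> _attributes;

    public int Id { get; }
    public string Name { get; }

    // Deserialization runs on first access, and at most once.
    public Dictionary<string, string> Attributes => _attributes.Value;

    public ItemDto(int id, string name, string rawJsonAttributes)
    {
        Id = id;
        Name = name;
        _attributes = new Lazy<Dictionary<string, string>>(
            () => JsonSerializer.Deserialize<Dictionary<string, string>>(rawJsonAttributes));
    }
}
```

Note the consistency caveat above still applies: the dictionary and the captured raw string can drift apart if callers mutate the dictionary.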
DTO Consolidation
If the DTOs can get away with only providing a generic list of key/value pairs, then based on what else you've said so far, I don't see why you have separate red/green/blue code. One DTO would do it.
If you do need separate types then yes, some sort of factory would probably be a good approach.
The Data Access code could return "generic" DTOs (like WidgetInfo, in my diagram) up to the consuming logic.
That logic (which did the DTO conversion) would probably shield the consumers / Controllers above, so that they only saw BlueWidget, RedWidget (etc) DTOs and not the generic one.
Summary
The diagram shows how I would typically do this (consuming Controllers not shown), assuming generic DTO's were ok. Let me know if I need to diagram out the other approach (conversion of generic WidgetInfo to BlueWidget/RedWidget DTOs).
Note that the DTO has a public property for the attributes (list of key/value pairs); it would also have a private string for the RawJSONAttributes if you wanted to take the Lazy Load approach in the DTO.

Is there a workaround for unioning two entities of the same interface using Entity Framework?

I have a search model class that searches different entity sets with the entity itself implementing a IAssignable interface. The code looks like this.
public void Search()
{
    List<T> lessons = new List<T>();
    List<T> courses = new List<T>();

    if (ShowLessons)
        lessons = db.Set<Lesson>()
            .Where(IAssignableExtensions.SearchPredicate(q))
            .Select(LessonMapping).ToList();

    if (ShowCourses)
        courses = db.Set<Course>()
            .Where(IAssignableExtensions.SearchPredicate(q))
            .Select(CourseMapping).ToList();

    Results = lessons.Union(courses).ToList<T>();
}
The static extension is irrelevant; it just searches based on the query. I would prefer to break this out into its own class rather than a static extension, but eh. Now, this works as expected: I am pulling two datasets into memory, lessons and courses, and unioning them into an IEnumerable of a generic type based on the CourseMapping or LessonMapping expressions.
public Expression<Func<IAssignable, T>> LessonMapping { get; set; }
public Expression<Func<IAssignable, T>> CourseMapping { get; set; }
The problem is when I want to do any type of paging. As you can see the lessons and courses are searched, brought into memory and then unioned and returned. If I do any paging using an IPagedList for example, it is bringing back ALL lessons and courses then it is only using a subset of the total data in the list for the pages.
If Entity Framework supported interfaces I would just do a cast on the interface and union right at the db call. I haven't changed this code yet but I feel I might have to create a custom stored procedure or use the Query call on the datacontext, but if I use a stored procedure I have to make sure to update it on any changes to the domain, and if I use the Query I have to re-jig the selects, interfaces and still have to worry about inline sql...
Anyone have any ideas?
UPDATE
The solution that I ended up using after thinking about Erik's solution was to just use a projected object that implemented IAssignable.
public class SomeProjection : IAssignable
{
    public int ID { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public string Privacy { get; set; }
}
And then used it within the unioned queryable:
Results = db.Set<Lesson>().Select(p => new SomeProjection() { Privacy = p.Privacy, ID = p.ID, Name = p.Name, Description = p.Description })
    .Union(db.Set<Course>().Select(p => new SomeProjection() { Privacy = p.Privacy, ID = p.ID, Name = p.Name, Description = p.Description }))
    .Where(IAssignableExtensions.SearchPredicate(q))
    .Select(Mapping).ToList<T>();
If Entity Framework supported interfaces I would just do a cast on the interface and union right at the db call.
It has nothing to do with what Entity Framework supports. If you create an interface, it is independent of the SQL technology in the back end and you want EF to somehow magically select properties based on an interface with no mappings or configuration? Not going to happen.
Instead, you could simply use inheritance if some properties are the same between the objects; then you don't even need to union them, unless you are where-ing on properties that don't exist on both.
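A minimal sketch of that inheritance idea (Lesson/Course are from the question; the base class name is mine): the shared properties move into a common base class, which code needing only the shared shape can target directly.

```csharp
// Shared properties live on the base class; Lesson and Course inherit them,
// so both can be handled through the base type without a union of interfaces.
public abstract class AssignableBase
{
    public int ID { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public string Privacy { get; set; }
}

public class Lesson : AssignableBase { }
public class Course : AssignableBase { }
```

With one of EF's inheritance mappings (TPH/TPT/TPC), both sets could then come back from a single base-type set, so paging and filtering run once in the database rather than over two unioned in-memory lists.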

How Do I Query Objects With Nested Properties Using NHibernate?

Okay, I've seen some similar questions to this, but the answers either confused me or seemed completely over-engineered, so I'd like to ask my own question.
I have a class called Tree, which has an object property from the class Plot, which has an object property from the class Year, which has an object property from the class Series, which has a string property called Id. This is summarized below.
public class Tree {
    public virtual Plot Plot { get; set; }
    // other properties...
}
public class Plot {
    public virtual Year Year { get; set; }
    // other properties...
}
public class Year {
    public virtual Series Series { get; set; }
    // other properties...
}
public class Series {
    public virtual string Id { get; set; }
    // other properties...
}
Each of these classes corresponds to a table in the database, and the properties correspond to foreign key fields (for example, the Trees table has a field called PlotKey, which refers to a record in the Plots table). All I want to do is load all trees from the database whose corresponding Series have the Id "Adrian_2012" or "IPED Sample". I thought this would be a pretty easy task using the following code:
IList<Tree> trees = session.CreateCriteria<Tree>()
    .Add(Expression.Or(
        Expression.Eq("Plot.Year.Series.Id", "Adrian_2012"),
        Expression.Eq("Plot.Year.Series.Id", "IPED Sample")
    ))
    .List<Tree>();
But this is throwing: "NHibernate.Exceptions.GenericADOException : could not execute query". I have tried using Expression.Disjunction, I have tried using Aliases, Restrictions, and SimpleExpressions, and I know that nothing stupid like unmapped properties or misspelled criteria is occurring. The only other thing I've seen that might help is the ISession.QueryOver<>() function, but I get very confused by lambda expressions. Does anyone have a solution for me that would use just a simple CreateCriteria<> statement like that above?
Thanks in advance!
One not-so-nice side of Criteria queries is that we have to define the association chain explicitly, i.e. we have to introduce a JOIN:
15.4. Associations (quote):
You may easily specify constraints upon related entities by navigating associations using CreateCriteria().
So to get a JOIN we need syntax like this:
var trees = session
    .CreateCriteria<Tree>()
    .CreateCriteria("Plot", "p")
    .CreateCriteria("Year", "y")
    .CreateCriteria("Series", "s")
    .Add(Expression.Or(
        Expression.Eq("s.Id", "Adrian_2012"),
        Expression.Eq("s.Id", "IPED Sample")
    ))
    .List<Tree>();
Also, check this:
NHibernate - CreateCriteria vs CreateAlias

Setting what table a DbContext maps to

In an application I'm working on, I have what are essentially a bunch of lookup tables in a database which all contain two things: The ID (int) and a Value (string).
There's only a handful of them, but I want to map all of them to a single Context which depends on the table name. Something like:
class LookupContext : DbContext
{
    public DbSet<Lookup> Lookups { get; set; }

    public LookupContext(String table)
    {
        // Pseudo code:
        // Bind Lookups based on what table is
        Lookups = MyDatabase.BindTo(table);
    }
}
So if I create a new LookupContext("foo"), it binds against the foo table. If I do new LookupContext("bar") it uses the bar table, and so forth.
Is there any way to do this? Or do I have to create a separate context + model for every table I have?
This is more or less my first time doing this, so I'm not really sure if what I'm doing is right.
The answer we should be able to give you is to use enums, but that's not available quite yet - it's in the next version of EF. See here for details: http://blogs.msdn.com/b/adonet/archive/2011/06/30/walkthrough-enums-june-ctp.aspx
With earlier versions of EF, you can simply create a class per lookup value (assuming state as an example) and have code that looks something like the following:
public class State
{
    public int StateId { get; set; }
    public string StateName { get; set; }
}

public class LookupContext : DbContext
{
    public DbSet<State> States { get; set; }
    // ... more lookups as DbSets
}
This will allow you to use one context but will still require one class per table. You can also use the fluent API if you want your table/column names to differ from your class/property names respectively. Hope that helps!
I actually realized I was completely overcomplicating things beyond reason. There was no reason to store multiple tables with two columns.
I'm better off storing my data as:
public class LookupValue
{
    public string LookupValueId { get; set; }
    public string Value { get; set; }
    public string LookupType { get; set; }
}
Where the third field is simply the name of the table I was previously storing in the database.
I'm still interested in the idea of mapping a single Context class to multiple tables, but I believe what I described above is the least convoluted way of accomplishing what I need.
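With the consolidated table, each of the former lookup tables becomes a filter on the LookupType discriminator; against a DbSet&lt;LookupValue&gt; it would be the same Where clause. An in-memory sketch of the idea (the query helper is mine; only LookupValue comes from the post above):

```csharp
using System.Collections.Generic;
using System.Linq;

public class LookupValue
{
    public string LookupValueId { get; set; }
    public string Value { get; set; }
    public string LookupType { get; set; }
}

public static class LookupQueries
{
    // The old "one context/table per lookup" collapses into one query shape:
    // filter the single set by its LookupType discriminator column.
    public static List<LookupValue> OfType(IEnumerable<LookupValue> all, string lookupType) =>
        all.Where(v => v.LookupType == lookupType).ToList();
}
```

Against Entity Framework the same Where clause would run on the database side, so "foo" and "bar" lookups never need separate contexts.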

FluentNhibernate, retrieve partial Object Graph

So I will call a repository to retrieve the root object of a complex object graph, using FluentNHibernate. But for some sub-level objects I don't want to retrieve all elements, only those where a date property meets a certain condition. In the code below, I want the lower-level Order objects to be filtered in this way on the OrderTime field.
Meaning I want to retrieve all UserGroups, with all Users, but the Orders object of each User shall only contain orders from a specific date or date range.
So what are my options on how to retrieve this object graph? I don't want lazy loading, I just want to specify a handful of different retrieval conditions, which will never change. So they can be separate functions of the repository, like suggested at the end. But how would I go about coding those methods, how to specify these conditions?
Objects:
public class UserGroup
{
    public int Id;
    public IList<User> Users;
}

public class User
{
    public int Id;
    public string Name;
    public IList<Order> Orders;
}

public class Order
{
    public int Id;
    public decimal Price;
    public System.DateTime OrderTime;
}
Repository:
public class UserGroupRepository
{
    public List<UserGroup> GetAll()
    {
        using (ISession session = FNH_Manager.OpenSession())
        {
            dynamic obj = session.CreateCriteria(typeof(UserGroup)).List<UserGroup>();
            return obj;
        }
    }
}
Potential new Repository methods: ?
public List<UserGroup> GetAll_FilterOrderDate(System.DateTime _date)
{
}

public List<UserGroup> GetAll_FilterOrderDate(List<System.DateTime> _dates)
{
}
It really depends on what you want to do with the orders.
Is there a reason you need to query on the aggregate root? Would it make sense to query over the actual orders by date instead? So you'd end up with:
session.QueryOver<Order>().Where(t => t.OrderTime > ...);
If your associations are set up correctly you'll still be able to navigate to the user.
Personally I find the repository pattern to be a bit restrictive and would rather use query objects, so you'd end up with something like:
queryService.FindAll<UserGroup>(new GetAllByFilterOrderDate(DateTime.Now));
However, if the concept of a repository works for you then by all means stick to it, but it means you'll be forcing your object model into this 'UserGroup'-centric view.
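The query-object idea mentioned above might look roughly like this (all names here are illustrative, not an established API):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Each retrieval condition becomes its own small class, instead of
// yet another method on the repository.
public interface IQueryObject<T>
{
    IList<T> Execute(IQueryable<T> source);
}

public class Order
{
    public int Id { get; set; }
    public DateTime OrderTime { get; set; }
}

// "All orders on or after a given date" as a reusable query object.
public class OrdersOnOrAfter : IQueryObject<Order>
{
    private readonly DateTime _from;

    public OrdersOnOrAfter(DateTime from) { _from = from; }

    public IList<Order> Execute(IQueryable<Order> source) =>
        source.Where(o => o.OrderTime >= _from).ToList();
}
```

A query service would hand the session's queryable (e.g. from LINQ to NHibernate) to Execute; new retrieval conditions then become new query classes rather than new repository methods.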
