Repository pattern to query multiple databases - C#

This is a very weird architecture. Please bear with me.
We have an existing tiered application (data, logic/service, client).
The latest requirement is that the service layer must access two data sources (there is no way around this).
These two data sources have the same DB schema.
As with most tiered architectures, we have read and write methods like:
IEnumerable<Product> GetAllProducts(),
Product GetProductById(ProductKey id),
IEnumerable<Product> FindProductsByName(string name)
The product DTOs are:
class Product
{
    public ProductKey Key { get; set; }
    ...
}

class ProductKey
{
    public long ID { get; }
}
We narrowed it down to two possible solutions:
Alternative 1:
Add a parameter to the read methods so that the service knows which DB to use, like so:
Product GetProductById(ProductKey id, DataSource dataSource)
DataSource is an enumeration.
Alternative 2 (my solution):
Add a DataSource property to the key classes. This will be set by Entity Framework when the object is retrieved, and it will not be persisted to the DB.
class ProductKey
{
    public long ID { get; }
    public DataSource Source { get; } // enum
}
The advantage is that the change will have minimal impact on the client.
However, people don't like this solution because:
The DataSource doesn't add business value. (My response is that the ID doesn't add business value either; it's a surrogate key whose purpose is tracking persistence.)
The children in the object graph will also contain a DataSource, which is redundant.
Which solution is more sound? Do you have other alternatives?
Note: these services are used everywhere.

What I would suggest is door number 3:
[||||||||||||||]
[|||||||||s! ]
[||||nerics! ]
[ Generics! ]
I use a "dynamic repository" (or at least that is what I have called it). It is setup to be able to connect to any datacontext or dbset while still being in the same using block (i.e. without re-instantiation).
Here is a snippet of how I use it:
using (var dr = new DynamicRepo())
{
    dr.Add<House>(model.House);
    foreach (var rs in model.Rooms)
    {
        rs.HouseId = model.House.HouseId;
        dr.Add<Room>(rs);
    }
}
This uses the "default" DbContext that is defined. Each context must be defined in the repository, but not instantiated. Here is the constructor I use:
public DynamicRepo(bool main = true, bool archive = false)
{
    if (main)
    {
        this.context = new MainDbContext();
    }
    if (archive)
    {
        // If both flags are set, the archive context wins.
        this.context = new ArchiveDbContext();
    }
}
This is a simplified version where there are only two contexts. A more in-depth selection method can be implemented to choose which context to use.
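For instance, a minimal sketch of such a selection method, keyed off a DataSource enum like the one in the question (the enum members and mapping here are assumptions):

public enum DataSource { Main, Archive }

public DynamicRepo(DataSource source = DataSource.Main)
{
    // Map each enum member to its backing context; extend as sources are added.
    switch (source)
    {
        case DataSource.Archive:
            this.context = new ArchiveDbContext();
            break;
        default:
            this.context = new MainDbContext();
            break;
    }
}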
And then once initialized, here would be how the Add works:
public void Add<T>(T entity) where T : class
{
    // Resolve the DbSet for T on whichever context was selected above.
    DbSet<T> dbSet = context.Set<T>();
    dbSet.Add(entity);
    context.SaveChanges();
}
A nice advantage of this is that there is only one spot to maintain the code for interacting with the database. All the other logic can be abstracted away into different classes. It definitely saved me a lot of time to use a generic repository in this fashion - even if I spent some time modifying it at first.
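Read methods can live in that same single spot. A minimal sketch, assuming the same context field as above:

// Generic reads against whichever context the repository was opened on.
public IEnumerable<T> GetAll<T>() where T : class
{
    return context.Set<T>().ToList();
}

public T Find<T>(params object[] keyValues) where T : class
{
    return context.Set<T>().Find(keyValues);
}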
I hope I didn't misunderstand what you were looking for, but if you are trying to have one repository for multiple data sources, I believe this is a good approach.

How to add to an intermediary table in Entity Framework

I have the following entity:
class Car
{
    public string make;
    public string model;
    public string registration;
}
We have multiple dealerships and we want to ensure that dealer1 can only see cars that belong to them, and dealer2 can only see their cars.
We don't want to implement this check in the everyday business logic of our applications since it could lead to inconsistent enforcement of the rules, so I'm creating a thin wrapper around Entity Framework which does that.
I have another entity:
class Dealer
{
    public Guid id;
}
I don't want Car to reference Dealer, so instead I plan to have my wrapper code look like this:
void AddCar(Car car, Dealer dealer)
{
    // Some authorization logic goes here
    // Add dealer if not already added
    context.Add(car);
    // Add link between car and dealer to third table
}
Is there any way to add data to a third link table without defining a new class to represent that link for every type of entity? E.g. can I just do like a dumb table insert or something like that?
I've tried to simplify my example as much as possible for clarity, but the reality is that I'm trying to make the wrapper generic, as I have no idea what entities exist across all the microservices it will be used in (nor should I).
You can execute SQL statements in Entity Framework Core by using ExecuteSql:
int carId = 1;
int dealerId = 1;
using (var context = new AppDbContext())
{
    // ExecuteSql takes a FormattableString and turns the interpolated values
    // into SQL parameters, which guards against SQL injection.
    var rowsModified = context.Database.ExecuteSql(
        $"INSERT INTO [CarDealer] ([CarId], [DealerId]) VALUES ({carId}, {dealerId})");
}
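If you'd rather keep the insert inside the change tracker, and to address the "without defining a new class" part directly: EF Core 5+ supports shared-type "property bag" entities mapped to a Dictionary<string, object>. A sketch, with the table and column names assumed to match the example above:

// In OnModelCreating: map a Dictionary<string, object> to the link table.
modelBuilder.SharedTypeEntity<Dictionary<string, object>>("CarDealer", b =>
{
    b.IndexerProperty<int>("CarId");
    b.IndexerProperty<int>("DealerId");
    b.HasKey("CarId", "DealerId");
});

// In the wrapper: insert a link row without a dedicated link class.
context.Set<Dictionary<string, object>>("CarDealer").Add(
    new Dictionary<string, object> { ["CarId"] = carId, ["DealerId"] = dealerId });
context.SaveChanges();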

Mapping one property in JSON using Automapper or Dapper? On which layer?

TL;DR:
Which approach to deserializing one JSON property would be the purest?
While fetching the data from the DB, using Dapper multimap (Option 1)
By manual assignment in the service, or in some separate factory (Option 2.1)
With AutoMapper and MapFrom, in the service (Option 2.2)
Long story:
Considering the following design in the DB:
Id | Name  | Type    | JSONAttributes
---|-------|---------|--------------------------------------
1  | "Foo" | "Red"   | {"Attr11":"Val11", "Attr12":"Val12"}
2  | "Bar" | "Green" | {"Attr21":"Val21", "Attr22":"Val22"}
3  | "Baz" | "Blue"  | {"Attr31":"Val31", "Attr32":"Val32"}
Every type has its own attributes, kept in the last column as a JSON string. (The type is not returned by the procedure, by the way.)
I need to transfer this data all the way up to the controller. I have an onion (I believe) solution structure:
MyProject.Web (with controllers)
MyProject.Core (with services, DTOs, entities, interfaces)
MyProject.Data (with repositories, using Dapper)
... and others, not significant to this topic.
I have also created models to map these rows to - one abstract base and several derived classes:
namespace MyProject.Core.DTO.Details
{
    public abstract class DtoItem
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }
}

namespace MyProject.Core.DTO.Details
{
    public class DtoTypeRed : DtoItem
    {
        public DtoTypeRedAttributes AttributesJson { get; set; }
    }
}

namespace MyProject.Core.DTO.Details
{
    public class DtoTypeGreen : DtoItem
    {
        public DtoTypeGreenAttributes AttributesJson { get; set; }
    }
}

namespace MyProject.Core.DTO.Details
{
    public class DtoTypeRedAttributes
    {
        [JsonPropertyName("Attr11")]
        public string AttributeOneOne { get; set; }

        [JsonPropertyName("Attr12")]
        public string AttributeOneTwo { get; set; }
    }
}
I have also created an entity, but it is only used in Option 2 (described later):
namespace MyProject.Core.Entities
{
    public class Item
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public string AttributesJson { get; set; }
    }
}
My question is, what would be a better approach:
Option 1 - mapping to the DTOs directly in ItemRepository (in MyProject.Data) when fetching the data from the DB, using Dapper multimap, like:
namespace MyProject.Data
{
    public class DetailsRepository : IDetailsRepository
    {
        public async Task<DtoItem> GetDetails(int itemId, int itemTypeId)
        {
            using (var connection = _dataAccess.GetConnection())
            {
                switch (itemTypeId)
                {
                    case 1:
                        return (await connection.QueryAsync<DtoTypeRed, string, DtoTypeRed>("[spGetDetails]",
                            (redDetails, redDetailsAttributesJson) =>
                            {
                                // Deserialize the JSON column into the red attributes DTO.
                                redDetails.AttributesJson = JsonSerializer.Deserialize<List<DtoTypeRedAttributes>>(redDetailsAttributesJson).FirstOrDefault();
                                return redDetails;
                            },
                            splitOn: "AttributesJson",
                            param: new { itemId, itemTypeId },
                            commandType: CommandType.StoredProcedure)).FirstOrDefault();
                    case 2:
                        return (await connection.QueryAsync<DtoTypeGreen, string, DtoTypeGreen>("[spGetDetails]",
                            (greenDetails, greenDetailsAttributesJson) =>
                            {
                                greenDetails.AttributesJson = JsonSerializer.Deserialize<List<DtoTypeGreenAttributes>>(greenDetailsAttributesJson).FirstOrDefault();
                                return greenDetails;
                            },
                            splitOn: "AttributesJson",
                            param: new { itemId, itemTypeId },
                            commandType: CommandType.StoredProcedure)).FirstOrDefault();
                    case ...
                    default: return null;
                }
            }
        }
    }
}
My colleague suggests that doing this kind of business logic in the repository is not a good approach.
So there is one alternative (at least one I'm aware of): fetching the data into the Item entity (leaving the JSON as a flat string) and mapping it to the DTOs in the service layer (MyProject.Core), either by simple assignment (Option 2.1), which I find inelegant, or with AutoMapper (Option 2.2):
namespace MyProject.Data
{
    public class DetailsRepository : IDetailsRepository
    {
        public async Task<Item> GetDetails(int itemId, int itemTypeId)
        {
            using (var connection = _dataAccess.GetConnection())
            {
                return await connection.QuerySingleAsync<Item>("[spGetDetails]",
                    param: new { itemId, itemTypeId },
                    commandType: CommandType.StoredProcedure);
            }
        }
    }
}
Option 2.1:
namespace MyProject.Core.Services
{
    public class DetailsService : IDetailsService
    {
        public async Task<DtoItem> GetDetails(int itemId, int itemTypeId)
        {
            var itemDetailsEntity = await _detailsRepo.GetDetails(itemId, itemTypeId);
            DtoItem result = null;
            switch (itemTypeId)
            {
                case 1:
                    var red = new DtoTypeRed();
                    red.Id = itemDetailsEntity.Id;
                    red.Name = itemDetailsEntity.Name;
                    red.AttributesJson = JsonSerializer.Deserialize<List<DtoTypeRedAttributes>>(itemDetailsEntity.AttributesJson).FirstOrDefault();
                    result = red;
                    break;
                case ...
            }
            return result;
        }
    }
}
If this is the way to go, maybe some factory would be better? (I don't have one yet.)
Option 2.2, with AutoMapper:
The thing is, I have read that AutoMapper is not really meant to contain JSON deserialization logic - it should be more "auto":
namespace MyProject.Core.Mappings
{
    public class MapperProfiles : Profile
    {
        public MapperProfiles()
        {
            CreateMap<Entities.Item, DTO.Details.DtoTypeRed>()
                .ForMember(dest => dest.AttributesJson,
                    opts => opts.MapFrom(src => JsonSerializer.Deserialize<List<DtoTypeRedAttributes>>(src.AttributesJson, null).FirstOrDefault()));
            (...)
        }
    }
}
namespace MyProject.Core.Services
{
    public class DetailsService : IDetailsService
    {
        public async Task<DtoItem> GetDetails(int itemId, int itemTypeId)
        {
            var itemDetailsEntity = await _detailsRepo.GetDetails(itemId, itemTypeId);
            switch (itemTypeId)
            {
                case 1:
                    return _mapper.Map<DtoTypeRed>(itemDetailsEntity);
                case 2:
                    return _mapper.Map<DtoTypeGreen>(itemDetailsEntity);
                case ...
            }
        }
    }
}
I really lack this "architectural" knowledge and experience, so any suggestions would be really appreciated. Maybe there is some other way that I'm not seeing here?
Considering your software design is rather unusual, I think there are two questions you should ask yourself first:
1) You're using a relational DB, but you are also using it to store JSON (as in a NoSQL DB). Is this a preexisting schema that you have no influence to change? If not, why are you using this design and not separate tables for the different data structures, as you normally would in a relational schema? That would also give you the advantages of a relational DB: querying, foreign keys, indexing.
2) If you have a controller where you hand the object out anyway, you will probably hand it out as JSON, right? Then what would be the point of deserializing it? Can't you just leave the JSON as is?
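To illustrate that second point, a minimal pass-through sketch (DtoItemRaw and the Attributes property are assumptions, not the asker's types): the repository parses the column once into a JsonElement, and System.Text.Json re-emits it unchanged when the controller serializes the DTO.

using System.Text.Json;

public class DtoItemRaw
{
    public int Id { get; set; }
    public string Name { get; set; }

    // Holds the attribute JSON as-is; no per-type attribute classes needed.
    public JsonElement Attributes { get; set; }
}

// In the repository, after fetching the flat row:
// dto.Attributes = JsonDocument.Parse(row.AttributesJson).RootElement;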
Apart from that, if you want to stick to your options, I would go with a solution similar to your Option 1. Your colleague is right that you don't want business logic in a repository, since the responsibility of a repository is only to store and query data; business logic in an onion architecture belongs in your service layer. However, if you are only deserializing data from the DB into a structure that is usable in your program, then that logic should be in the repository. All you do in this case is fetch data and transform it into objects, the same thing an ORM tool would do.
One thing I would change in Option 1, though, is to move the switch-case to your service layer, since that is business logic. So:
public class DetailsService : IDetailsService
{
    public async Task<DtoItem> GetDetails(int itemId, int itemTypeId)
    {
        DtoItem myItem;
        switch (itemTypeId)
        {
            case 1:
                myItem = await _repo.GetRedItem(itemId);
                break;
            // and so on
        }
        return myItem;
    }
}
Your repository would then have methods for the individual item types; so again, all it does is query and deserialize data.
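A sketch of one such repository method, reusing the asker's types and stored procedure (the exact shape of the spGetDetails call is assumed):

public async Task<DtoTypeRed> GetRedItem(int itemId)
{
    using (var connection = _dataAccess.GetConnection())
    {
        // Fetch the flat row, then deserialize the JSON column.
        // No business decisions here - only data access and shaping.
        var row = await connection.QuerySingleAsync<Item>("[spGetDetails]",
            param: new { itemId, itemTypeId = 1 },
            commandType: CommandType.StoredProcedure);

        return new DtoTypeRed
        {
            Id = row.Id,
            Name = row.Name,
            AttributesJson = JsonSerializer.Deserialize<List<DtoTypeRedAttributes>>(row.AttributesJson).FirstOrDefault()
        };
    }
}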
On your other two options: the point of DTOs is usually to have separate objects that you can share with other processes (such as sending them out from a controller). That way, changes to the DTO don't force changes to your entity and vice versa, and you can choose which properties to include in the DTO or entity and which not. In your case it seems you only really use the DTOs in your program, and the entities are just there as an intermediate step, which makes it pointless to even have them.
Also, using AutoMapper in your scenario only makes the code more complicated, as you can see in your MapperProfiles code, where you would probably agree with me that it isn't easy to understand and maintain.
Jakob's answer is good and I agree with a lot of what he says. To add to that...
The challenge you have is that you want to be as pure to Clean Architecture as possible, but you're constrained by a system where there are two conflicting architectural patterns:
The "traditional" approach where the data structures are explicit.
An alternative approach where the data structures are mixed: some are defined by the database (table design) whilst others are structured in the code (JSON), which the database has no visibility of (to the database it's just a string).
A further complication is at the code level where you have a classic layered architecture with explicitly defined objects and layers, and then you have this mutant JSON thing.
Based on the information so far, I wonder if you could use a generic list for the JSON attributes, because your code is taking the JSON and converting it into what are essentially key-value pairs - at least, that's what's suggested by your code:
MyProject.Core.DTO.Details.DtoTypeRedAttributes.AttributeOneOne { get; set; }
This says to me that the controller is getting AttributeOneOne, which will have a name/key and a value inside it. Whatever consumes this will care about the name/key, but to the rest of your code these are just generic attributes.
I'm also suspicious/curious about the Type field in the database, which you say is 'never returned by the procedure', yet you have type-specific code, e.g. DtoTypeRedAttributes.
Based on all of that, I would simplify things (which still keeps you within Clean):
Deserializing the JSON into generic key/value pairs:
This could happen either in the data access layer or within the DTO itself.
A straight JSON-to-key/value-pair conversion is purely technical and does not affect any business rule, so there are no business rule/logic drivers that constrain where it can happen.
Q: How expensive is the deserialization, and are those attributes always accessed? If it's expensive and the attributes aren't always used, you can lazy-load, triggering the deserialization only when it's needed - which would suggest doing it in the DTO, because if you do it in the data access layer you pay the deserialization penalty every time, regardless. (A sketch of this follows the list.)
A potential downside, if you care about it, is that you'll have a DTO with a private string RawJSONAttributes property and a public Dictionary<string, string> JSONAttributes property, i.e. a larger memory footprint, and you'll have to consider consistency (do you care about the private string if someone changes the public values, etc.).
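A minimal sketch of that lazy-in-the-DTO approach (all names assumed, and not thread-safe):

using System.Collections.Generic;
using System.Text.Json;

public class WidgetInfo
{
    private readonly string _rawJsonAttributes; // as fetched by the data access layer
    private Dictionary<string, string> _attributes;

    public WidgetInfo(int id, string name, string rawJsonAttributes)
    {
        Id = id;
        Name = name;
        _rawJsonAttributes = rawJsonAttributes;
    }

    public int Id { get; }
    public string Name { get; }

    // The deserialization cost is only paid on first access.
    public IReadOnlyDictionary<string, string> Attributes =>
        _attributes ??= JsonSerializer.Deserialize<Dictionary<string, string>>(_rawJsonAttributes);
}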
DTO Consolidation
If the DTOs can get away with only providing a generic list of key/value pairs, then based on what else you've said so far, I don't see why you would have separate red/green/blue code. One DTO would do it.
If you do need separate types, then yes, some sort of factory would probably be a good approach.
The data access code could return "generic" DTOs (like WidgetInfo, in my diagram) up to the consuming logic.
That logic (which does the DTO conversion) would shield the consumers/Controllers above, so that they only see BlueWidget, RedWidget (etc.) DTOs and not the generic one.
Summary
The diagram shows how I would typically do this (consuming Controllers not shown), assuming generic DTOs were OK. Let me know if I need to diagram out the other approach (conversion of the generic WidgetInfo to BlueWidget/RedWidget DTOs).
Note that the DTO has a public property for the attributes (a list of key/value pairs); it would also have a private string for the RawJSONAttributes if you wanted to take the lazy-load approach in the DTO.

EF: manual (non autogenerated) keys require ad hoc handling of new entities

In an MVVM application with EF Core as the ORM, I decided to model a table with a manually inserted, textual primary key.
This is because, in this specific application, I'd rather use meaningful keys than meaningless integer IDs, at least for simple key-value tables like the table of the countries of the world.
I have something like:
Id | Description
-----|--------------------------
USA | United States of America
ITA | Italy
etc. etc.
So the entity is:
public class Country
{
    [Key]
    [DatabaseGenerated(DatabaseGeneratedOption.None)]
    public string Id { get; set; }

    public string Description { get; set; }
}
Here is my viewmodel. It's little more than a container for the ObservableCollection of Countries. Actually it gets loaded from a repository. It's trivial and I included the entire code at the end; it's not really relevant, and I could do with just the DbContext as well. But I wanted to show all the layers, to make clear where the solution belongs. Oh yes, and it contains the synchronizing code that actually offends EF Core.
public class CountriesViewModel
{
    // CountryRepository would normally be injected
    public CountryRepository CountryRepository { get; set; } = new CountryRepository(new AppDbContext());

    public ObservableCollection<Country> Countries { get; set; }

    public CountriesViewModel()
    {
        Countries = new ObservableCollection<Country>();
        Countries.CollectionChanged += Countries_CollectionChanged;
    }

    private void Countries_CollectionChanged(object sender, System.Collections.Specialized.NotifyCollectionChangedEventArgs e)
    {
        foreach (Country c in e.NewItems)
        {
            CountryRepository.Add(c);
        }
    }
}
In my MainWindow I just have:
<Window.DataContext>
    <local:CountriesViewModel/>
</Window.DataContext>

<DockPanel>
    <DataGrid ItemsSource="{Binding Countries}"/>
</DockPanel>
Problem and question
Now this doesn't work. When I try to insert a new record (in this case using the automatic new-row feature of DataGrid), I get a:
System.InvalidOperationException: 'Unable to track an entity of type 'Country'
because primary key property 'Id' is null.'
Each time I add a new record to the ObservableCollection, I also add it to the repository, which in turn adds it to the EF DbContext, which doesn't accept entities with a null key.
So what are my options here?
One is postponing the addition of the new record until the Id has been entered. This is not trivial with the collection handling I've shown, but that is not the problem. The worst part is that this way I would have some records tracked by EF (the updated, the deleted, and the new ones with a PK assigned) and some tracked by the view model (the new ones whose key has not yet been assigned).
Another is using alternate keys: I would have an integer, autogenerated primary key, and the ITA/USA codes would be an alternate key, also used in relations. It's not so bad in terms of simplicity, but I'd like an application-only solution.
What I'm looking for
I'm looking for a neat solution here: a pattern to be used whenever this problem arises, and one that plays well in the context of an MVVM/EF application.
Of course I could also look at view events, i.e. force the user to enter the key before some event that triggers the insertion. I would consider that a second-class solution, because it is view-dependent.
Remaining code
Just for completeness, in case you want to run the code, here is the remaining code.
DbContext
(Configured for postgres)
public class AppDbContext : DbContext
{
    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        optionsBuilder.UseNpgsql("Host=localhost;Database=WpfApp1;Username=postgres;Password=postgres");
    }

    public DbSet<Country> Countries { get; set; }
}
Repository
The reason I implemented a repository for such a simple example is that I think a possible solution may be to include the management of new-without-key records in the repository instead of in the viewmodel. I still hope that someone comes up with a simpler solution.
public class CountryRepository
{
    private AppDbContext AppDbContext { get; set; }

    public CountryRepository(AppDbContext appDbContext) => AppDbContext = appDbContext;

    public IEnumerable<Country> All() => AppDbContext.Countries.ToList();

    public void Add(Country country) => AppDbContext.Add(country);

    // Usually we don't have a Save here, it belongs in a Unit of Work;
    // for the example's purpose it's ok
    public int Save() => AppDbContext.SaveChanges();
}
Probably the cleanest way to address the aforementioned issue in EF Core is to utilize temporary value generation on add. In order to do that, you would need a custom ValueGenerator like this:
using Microsoft.EntityFrameworkCore.ChangeTracking;
using Microsoft.EntityFrameworkCore.ValueGeneration;
public class TemporaryStringValueGenerator : ValueGenerator<string>
{
    public override bool GeneratesTemporaryValues => true; // <-- essential

    public override string Next(EntityEntry entry) => Guid.NewGuid().ToString();
}
and fluent configuration similar to this:
modelBuilder.Entity<Country>().Property(e => e.Id)
    .HasValueGenerator<TemporaryStringValueGenerator>()
    .ValueGeneratedOnAdd();
The potential drawbacks are:
In pre-3.0 EF Core, the generated temporary value is set onto the entity instance and would thus be visible in the UI. This has been fixed in EF Core 3.0, so temporary key values are no longer set onto entity instances.
Even though the property looks empty (null) and is required (the default for primary/alternate keys), if you don't provide an explicit value, EF Core will issue the INSERT command and try to read the "actual" value back from the database (as with identity and other database-generated values), which in this case leads to a non-user-friendly, database-generated runtime exception. But EF Core in general does not do validation, so this isn't so different: you have to add and validate the property-required rule in the corresponding layer.
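With that configuration in place, the flow from the question becomes possible. A small usage sketch (assuming EF Core 3.0+ semantics, where the temporary value lives only in the change tracker):

var country = new Country { Description = "Italy" };

// Add() no longer throws: the change tracker assigns a temporary key value.
context.Countries.Add(country);

// The user then types the real key into the DataGrid row...
country.Id = "ITA";

// ...and the explicit value replaces the temporary one on save.
context.SaveChanges();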

Lazy loading without using Lazy<T>

I have:
Category class
public partial class Category : BaseEntity
{
    ...
    public string Name { get; set; }

    private ICollection<Discount> _appliedDiscounts;
    public virtual ICollection<Discount> AppliedDiscounts
    {
        get { return _appliedDiscounts ?? (_appliedDiscounts = new List<Discount>()); }
        protected set { _appliedDiscounts = value; }
    }
}
Service:
public IList<Category> GetCategories()
{
    // ADO.NET to return category entities.
}

public ICollection<Discount> GetDiscount(int categoryId)
{
    // ADO.NET.
}
I don't want to use an ORM like EF, just plain ADO.NET, and I don't want to put an ugly Lazy<T> in my domain definition (e.g. public Lazy<...>).
So how, in this case, could I get AppliedDiscounts to be lazily and automatically bound to GetDiscount, without explicitly declaring Lazy<T> on the Category class?
I don't know why you don't like the Lazy<T> type - it is simple and useful, and you don't have to worry about anything.
And no one forces you to expose public Lazy<IEnumerable<Discount>> GetDiscounts();
you could use it internally:
Lazy<IEnumerable<Discount>> discounts = new Lazy<IEnumerable<Discount>>(() => new DiscountCollection());

public IEnumerable<Discount> GetDiscounts()
{
    return discounts.Value;
}
It operates as intended: until someone asks for the discounts, they won't be created.
If you really want, you could create your own implementation - something like the Singleton class in Richter's "CLR via C#" book (because Lazy<T> has all the 'properties' of a proper singleton container: thread safety, and only one instance of the inner 'singleton' value can ever be evaluated).
But do you really want to create and test it? You would just be replacing a well-designed standard component with a fishy custom one.
AFTER ACTUALLY READING YOUR QUESTION WITH ATTENTION
1) If your lazy loading does not need any thread safety, you can accomplish similar behaviour without Lazy<T> or any complex constructs - just use a Func delegate:
public partial class Category : BaseEntity
{
    ...
    private Func<ICollection<Discount>> getDiscounts;

    public Category(Func<ICollection<Discount>> getDiscounts) { ... }

    public string Name { get; set; }

    private ICollection<Discount> _appliedDiscounts;
    public virtual ICollection<Discount> AppliedDiscounts
    {
        get
        {
            return _appliedDiscounts ??
                (_appliedDiscounts = new List<Discount>(this.getDiscounts()));
        }
        protected set { _appliedDiscounts = value; }
    }
}
public IList<Category> GetCategories()
{
    // ADO.NET to return category entities.
    ... = new Category(() => this.GetDiscount((Int32)curDataObject["categoryId"]))
}

public ICollection<Discount> GetDiscount(int categoryId)
{
    // ADO.NET.
}
If you inject your service, it becomes even simpler:
public virtual ICollection<Discount> AppliedDiscounts
{
    get
    {
        return _appliedDiscounts ??
            (_appliedDiscounts = new List<Discount>(this.service.GetDiscounts(this.CategoryId)));
    }
    protected set { _appliedDiscounts = value; }
}
2) If you need to use these objects from multiple threads, then you will have to redesign your classes - they don't look thread-safe.
AFTER THE COMMENT
What I want to do is exactly like this guy: stackoverflow.com/questions/8188546/… . I want to know how an ORM like EF handles the domain, keeping it clean and separated from injected service classes but still able to handle lazy loading. I know I can use Reflection to get all the object properties and its object variables (like AppliedDiscounts), but I don't know how to transform these dynamically to a lazy type so that they can be loaded later when needed.
It is a universal principle that you can't get something for nothing. You can't make your entities both clean and separated from any services (even through some proxy) and still have them load lazily - if they don't know anything about services, and the services don't know anything about them, then how would the lazy loading work? There is no way to achieve such absolute decoupling: for two components to interact, they have to either know about each other, know about some third communicator module, or some module has to know about them. But such coupling can be partially or completely hidden.
Technologies that provide entity object models usually use some of the following techniques:
Code generation to create wrappers (or proxies) over your simple data objects, or solid instances of your interfaces. It could be C# code or IL weaving; it could even be an in-memory assembly created dynamically at runtime using something like Reflection.Emit. This is not the easiest or most direct approach, but it gives you enormous code-tuning capabilities. A lot of modern frameworks use it.
Implementation of all those capabilities in Context classes - you won't have lazy loading on your end objects; you have to use it explicitly with the Context classes: context.Orders.With("OrderDetails"). The positive side is that the entities stay clean.
Injection of the service (or only the needed subset of its operations) - that's what you'd prefer to avoid.
Use of events or some other observer-like pattern - your entities will be clean of service logic and dependencies (at least in some sense), but will contain some hookup infrastructure that won't be very straightforward or easy to manage.
For your custom object model, 2 or 3 are the best bets, but you could try 1 with Roslyn.
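For illustration, a minimal sketch of option 2, reusing the question's GetCategories/GetDiscount service (the CategoryService name and the Id property on BaseEntity are assumptions):

// The load-or-don't-load decision lives in a context class; Category stays clean.
public class CatalogContext
{
    private readonly CategoryService service;

    public CatalogContext(CategoryService service) => this.service = service;

    public IList<Category> CategoriesWith(params string[] includes)
    {
        var categories = service.GetCategories();
        if (Array.IndexOf(includes, "AppliedDiscounts") >= 0)
        {
            // Attach discounts only when the caller explicitly asks for them.
            foreach (var category in categories)
                foreach (var discount in service.GetDiscount(category.Id))
                    category.AppliedDiscounts.Add(discount);
        }
        return categories;
    }
}

// Usage: var categories = catalog.CategoriesWith("AppliedDiscounts");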

Exposing an object through a 'view' interface

I've been trying to find a flexible way of exposing an object through a 'view'. I'm probably better off explaining by way of example.
I have an Entity Framework entity model, and a web service that can be used to query it. I am able to return the entity classes themselves, but these would include some fields I might not want to share - IDs, for example, or the *Reference properties from any associations in the entity model.
I figure what I need is a view of the data, but I don't particularly want to write a view wrapper class for every return type. I'm hoping I'll be able to define an interface and somehow make use of that. For example:
interface IPersonView
{
    string FirstName { get; }
    string LastName { get; }
}
-
// (Web service method)
IPersonView GetPerson(int id)
{
    var personEntity = [...];
    return GetView<IPersonView>(personEntity);
}
However, to do something like this, I'd have to have my entities implement the view interfaces. I was hoping for a more flexible, 'duck-typed' approach, as there may be many views of an object and I don't really want to have to implement them all.
I've had some success building a dynamic type by reflecting the interface and copying fields and properties across, but I'm not able to cast it back to the interface type in order to get strong typing on the web service.
Just looking for some comments and advice, both would be welcome. Thanks.
You shouldn't ever really be passing entities directly out to a client; they should be used for persistence only. You should introduce DTOs/POCOs tailored to whatever data your API wants to return, e.g.:
public class PersonDto
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

// public API method
public PersonDto GetPersonApi(int id)
{
    var personEntity = // pull entity from db
    return new PersonDto()
    {
        FirstName = personEntity.FirstName,
        LastName = personEntity.LastName
    };
}
This keeps a clean separation between your persistence layer and your public interface. You can use a tool like AutoMapper to do the legwork of mapping the data across. Just set up a mapping once, e.g. in your Global.asax:
protected void Application_Start()
{
    Mapper.CreateMap<Person, PersonDto>();
}

...

// public API method
public PersonDto GetPersonApi(int id)
{
    var personEntity = // pull entity from db
    return Mapper.Map<Person, PersonDto>(personEntity);
}
I typically see this done with AutoMapper or a similar tool; it makes mapping between similar classes much simpler. You still have to create the views (which in an MVC context would be models), but the most tedious part (the mapping) is taken care of for you, as long as you use the same field names.
As a side note, sharing IDs and other reference data will be necessary if you want to update the data, since you'll need the keys to know which record(s) to update.
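As for the asker's dynamic-type experiment: the cast back to the interface works if the generated type actually implements it. On .NET Core and later, System.Reflection.DispatchProxy does exactly that; a sketch (ViewProxy and its name-matching convention are assumptions, not part of either post):

using System.Reflection;

// A duck-typing proxy: implements TView and forwards property reads by name
// to any wrapped object that happens to have matching properties.
public class ViewProxy<TView> : DispatchProxy where TView : class
{
    private object _target;

    public static TView Wrap(object target)
    {
        var proxy = Create<TView, ViewProxy<TView>>() as ViewProxy<TView>;
        proxy._target = target;
        return proxy as TView;
    }

    protected override object Invoke(MethodInfo targetMethod, object[] args)
    {
        // Interface property getters arrive here as "get_<Name>" methods.
        var name = targetMethod.Name.StartsWith("get_") ? targetMethod.Name.Substring(4) : targetMethod.Name;
        return _target.GetType().GetProperty(name)?.GetValue(_target);
    }
}

// Usage: IPersonView view = ViewProxy<IPersonView>.Wrap(personEntity);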
