C# Dependency Injection with 3 projects in the same Solution

I have a solution with separate projects. The structure is like this:
1 - the web project
2 - the Aplicacao, which is called by the controllers of 1 to handle the logic
3 - the Dominio, which holds the domain classes of my solution
4 - the Dados, which has the Entity Framework stuff...
So, when I have a POST to register a new user, the UsuarioController (UC) is called. The UC has to instantiate a UsuarioAplicacao (UA), passing a Usuario. The UA has to instantiate a UsuarioRepositorio (UR), passing the Usuario as a parameter, and there it will be saved to the database.
I'm used to instantiating the class I need inside the method that uses it, like so:
public ActionResult Registrar(Usuario usuario)
{
    if (ModelState.IsValid)
    {
        var _usuarioApp = new UsuarioAplicacao();
        _usuarioApp.Salvar(usuario);
        // rest of the code...
    }
}
But while studying, I've learned that this isn't quite right: I should use dependency injection (DI)...
So, using DI through the constructor, I can't figure out how to wire it up...
I've tried this in my controller:
private UsuarioAplicacao _usuarioAplicacao;

public UsuarioController(UsuarioAplicacao usuarioAplicacao)
{
    this._usuarioAplicacao = usuarioAplicacao;
}
Then, in my UsuarioAplicacao (the class that handles the logic and calls UsuarioRepositorio to save the object to the DB):
private readonly UsuarioRepositorio _usuarioRepositorio;

public UsuarioAplicacao(UsuarioRepositorio usuarioRepositorio)
{
    this._usuarioRepositorio = usuarioRepositorio;
}
And finally, inside my UsuarioRepositorio (the class responsible for saving the data to the DB via Entity Framework):
private readonly Contexto _contexto;

public UsuarioRepositorio(Contexto contexto)
{
    this._contexto = contexto;
}
(_contexto is my EF context class)
These are my constructors, but I'm getting NullReferenceExceptions...
Can you guys help me with dependency injection?
I couldn't understand how to use Ninject either...
Thanks in advance.

You are getting null references because you didn't configure any dependency injection container. Simply adding a parameter to the controller constructor is not enough; you have to actually tell the dependency injector how to construct your dependencies.
Using Unity, you do something like this in your WebApiConfig (or Global.asax if on MVC):
public static void UnityContext_OnRegisterComponents(Microsoft.Practices.Unity.UnityContainer container)
{
    container.RegisterType<ICarRepository, CarRepository>(new HierarchicalLifetimeManager());
}
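Applied to the types in the question, the registrations inside that same method could look like the sketch below (it assumes the concrete classes are injected directly, as in the question's constructors, and that Unity is also hooked up as the MVC dependency resolver, e.g. via the Unity bootstrapper package for MVC):

// One EF context per resolution chain, shared by UsuarioAplicacao and UsuarioRepositorio.
container.RegisterType<Contexto>(new HierarchicalLifetimeManager());

// Concrete classes resolve on their own, but registering them keeps the intent explicit.
container.RegisterType<UsuarioAplicacao>();
container.RegisterType<UsuarioRepositorio>();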
Now, about the concept of dependency injection: it is used to keep coupling low among projects. Your controller must not know your business rules, nor how to create objects that belong to another assembly.
Example: when you create your repositories and instantiate your entities directly in the controller, you are creating a dependency between those assemblies, making it really hard to test your code later and also making it a lot more complex.
In terms of architecture, I use something like this:
Web - Front end
Business - where your business rules live
Contracts - where you declare your data transfer objects
Data - where you declare your entities and your repositories, and where Entity Framework does the low-level stuff, like opening connections, saving things to the database, etc.
In that architecture:
Web accesses Contracts and Business
Business accesses Contracts and Data
Data doesn't access anything
Business will ask Data for entities, Data will answer with the entity models from your database, and any manipulation you need to do will be done in Business, returning a data transfer object (from the Contracts assembly), which will then be used in your front end, either as it is returned or by defining a new model to fit the transfer object into your front end.
In this scenario, keeping dependency injection in mind, it would create coupling if the business layer knew how to create data-layer objects. So instead of doing this, you configure a dependency injection container, and that object will be responsible for instantiating everything that every layer needs, keeping all your projects decoupled.
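Since Ninject came up in the question, a roughly equivalent sketch with Ninject could look like this (assuming the Ninject.Web.Common / MVC integration package, which provides InRequestScope and builds controllers through the kernel):

using Ninject.Modules;
using Ninject.Web.Common; // InRequestScope()

public class AppModule : NinjectModule
{
    public override void Load()
    {
        // One EF context per web request, shared by the aplicacao and the repositorio.
        Bind<Contexto>().ToSelf().InRequestScope();

        // Concrete classes: Ninject builds them as long as their constructor parameters are resolvable.
        Bind<UsuarioAplicacao>().ToSelf();
        Bind<UsuarioRepositorio>().ToSelf();
    }
}

With a container configured like this, the constructors in the question work as written: the container builds UsuarioController, sees it needs a UsuarioAplicacao, builds that with a UsuarioRepositorio, and so on down to the Contexto.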

Related

CRUD operation for .net-core web API with interfaces - saveChanges() doesn't exist

Overview
I am making a .net-core web API with simple CRUD operations.
I have made the GET methods and got them up and running, however when I try to implement the Create method using dbContext.Add(myItem) I am not able to use dbContext.SaveChanges() afterwards.
My current work is available at:
https://github.com/petermefrandsen/TKD-theory-API
So far
I have tried adding an overriding method to my database context.
Additionally, I have tried adding the Entity Framework reference to the project.
As I use interfaces for loose coupling, I am at a loss when it comes to comparing with tutorials and other people's similar problems (read: I am fairly new to C#).
Code
Controller:
[Route("dan/")]
[HttpPost]
public ActionResult<DanTheoryItem> PostDanTheoryItem(DanTheoryItem danTheoryItem)
{
_context.PostDanTheoryItem(danTheoryItem);
return new ActionResult<DanTheoryItem>(danTheoryItem);
}
IContext:
DanTheoryItem PostDanTheoryItem(DanTheoryItem danTheoryItem);
Context:
public DanTheoryItem PostDanTheoryItem(DanTheoryItem danTheoryItem)
{
    var theoryItem = new DbDanTheoryItems
    {
        Id = danTheoryItem.Id,
        KoreanTheoryItemId = danTheoryItem.KoreanTheoryItemId,
        MainCategory = danTheoryItem.MainCategory,
        SubCategory = danTheoryItem.SubCategory,
        SubToSubCategory = danTheoryItem.SubToSubCategory,
        NameLatin = danTheoryItem.NameLatin,
        NamePhonetic = danTheoryItem.NamePhonetic,
        NameAudio = danTheoryItem.NameAudio
    };
    _dbContext.DanTheoryItems.Add(theoryItem);
    //_dbContext.SaveChanges();
    return danTheoryItem;
}
Desired result
I'd like to have the controller call the context methods that will write the desired data to the database.
Your interface doesn't contain a SaveChanges method. Since you are using dependency injection, only the methods declared on your interface will be available to your controller.
If you declare a SaveChanges method on your custom interface (mirroring the one on DbContext) and implement it by delegating to the context, the method will be exposed to your controller.
Instantiating your DbLokisaurTKDTheoryAppContext class directly in your controller would expose the SaveChanges method as well.
It's hard to say exactly because you've neglected to post certain key portions of your code. However, my best guess is that your "context" is being provided as an IDbContext (of your own creation) instead of DbContext, and your interface doesn't define a SaveChanges method.
Honestly, just get rid of IDbContext. The point of an interface is to define a contract between multiple implementations. Here, there can be only one implementation: DbContext, and DbContext isn't even aware of this interface. Just inject your derived DbContext directly.
Using an interface isn't a magic wand. You have a hard dependency on Entity Framework here, so interface or not, you're tightly coupled. That's not necessarily a bad thing, though. EF is serving as your data layer, and this application is data driven; it's going to be tightly coupled to the data layer no matter what.
The API itself serves as your abstraction. Other layers will presumably just use the API, so no further abstraction is necessary, and in fact just adds more maintenance concern with no added benefit.
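A minimal sketch of the "inject your derived DbContext directly" suggestion in ASP.NET Core (the controller name, registration call, and connection string name here are assumptions; adapt them to the linked repository):

// Startup.ConfigureServices: register the concrete context for constructor injection.
services.AddDbContext<DbLokisaurTKDTheoryAppContext>(options =>
    options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")));

// Controller: take the context directly; SaveChanges is then available.
public class DanTheoryController : ControllerBase
{
    private readonly DbLokisaurTKDTheoryAppContext _dbContext;

    public DanTheoryController(DbLokisaurTKDTheoryAppContext dbContext)
    {
        _dbContext = dbContext;
    }

    [Route("dan/")]
    [HttpPost]
    public ActionResult<DanTheoryItem> PostDanTheoryItem(DanTheoryItem danTheoryItem)
    {
        // Map DanTheoryItem to DbDanTheoryItems and Add it, as in the question, then:
        _dbContext.SaveChanges();
        return danTheoryItem;
    }
}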

Entity Framework circular dll reference

My solution in its current (sad) state:
I want my business layer to be data-provider agnostic (isn't that a good thing?), with just an interface so I can switch out EF for NHibernate or Linq to Xml or whatever type of persistence provider I or my boss want to use (or the new, superior one that will inevitably be invented 2 seconds after this project is all done).
IPersistenceProvider is that interface, and I can just inject it with Unity (not the gaming platform, the DI container). To me, IPersistenceProvider belongs in the Data Layer, and we can just keep adding folders (like EntityFramework) as new persistence paradigms need to be added to my resume (or the project).
Therefore, my business dll depends on my data dll. Here's some code in the business dll, depending on the data dll:
using System;
using Atlas.Data.Kernel;
namespace Atlas.Business.Kernel
{
public abstract class BusinessObject
{
public BusinessObject(IPersistenceProvider p)
{
}
public Guid Id;
}
}
I also feel like my DatabaseContext belongs in the Data Layer. But EF makes you reference the concrete types for its DbSets, which means the AtlasDataKernel dll would need to depend on the AtlasBusinessKernel dll, which would make a circular dll reference. En plus (that's French for moreover), a Data Layer thingy pointing to the Business Layer concrete types smells to me. DatabaseContext wants to go live in the business dll, but that's coupling my business layer with a particular persistence strategy artifact.
How to resolve this? I can collapse it into one dll (and indeed, I did that very thing on a previous project), but that kinda sucks and I won't be able to get into the .Net Architects club. They will mock me for my "1 N too few" architecture and laugh me out of the meeting. WWDED? (What would Dino Esposito Do?)
Split declaration from implementation.
The EntityFramework subdirectory should be a separate assembly (e.g. AtlasDataKernelEF) containing the EF stuff and the implementation of IPersistenceProvider, thus resolving the circular reference.
Also, if you ever really are required to use a different ORM, you can rid your production executables of all the EF libraries.
You don't sketch how you instantiate EF data access, but you certainly need to wrap that in some kind of factory class.
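A rough sketch of that split, using the assembly names already in play (the members of IPersistenceProvider are invented here, since the question doesn't show them):

using System;

// Atlas.Data.Kernel - declarations only, no EF reference
namespace Atlas.Data.Kernel
{
    public interface IPersistenceProvider
    {
        T GetById<T>(Guid id) where T : class;
        void Save<T>(T item) where T : class;
    }
}

// AtlasDataKernelEF - references EF, Atlas.Data.Kernel, and the business entities
namespace Atlas.Data.Kernel.EntityFramework
{
    public class EfPersistenceProvider : IPersistenceProvider
    {
        private readonly DatabaseContext context;

        public EfPersistenceProvider(DatabaseContext context)
        {
            this.context = context;
        }

        public T GetById<T>(Guid id) where T : class
        {
            return context.Set<T>().Find(id);
        }

        public void Save<T>(T item) where T : class
        {
            context.Set<T>().Add(item);
            context.SaveChanges();
        }
    }
}

The business assembly keeps referencing only Atlas.Data.Kernel; only the composition root (the web or console project) references AtlasDataKernelEF and maps IPersistenceProvider to EfPersistenceProvider in Unity.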
Your AtlasBusinessKernel project shouldn't reference anything in the AtlasDataKernel project. Any resources in AtlasDataKernel that AtlasBusinessKernel needs to use should be represented as an interface in the AtlasBusinessKernel project; that could be an IDataContext interface or repository interfaces.
This only works if you have a third project which actually uses the AtlasBusinessKernel project, perhaps a web application or a console application that represents the UI. That project would be responsible for instantiating the DatabaseContext, preferably using DI.
// In your AtlasDataKernel
public class DatabaseContext : IDataContext
{
    // implementation
}

// In your AtlasBusinessKernel
public class MyBusinessLogic
{
    private IDataContext dataContext;

    public MyBusinessLogic(IDataContext context)
    {
        this.dataContext = context;
    }
}

// In your web application or whatever project type it might be
public class MyWebApp
{
    public void DoSomeThing()
    {
        IDataContext context = new DatabaseContext();
        MyBusinessLogic logic = new MyBusinessLogic(context);
    }
}

Relationship between EF-Generated classes and model?

I'm using ASP .NET MVC (C#) and EntityFramework (database first) for a project.
Let's say I'm on a "Movie detail" page which shows the detail of one movie of my database. I can click on each movie and edit each one.
Therefore, I have a Movie class, and a Database.Movie class generated with EF.
My detail action looks like this:
public ActionResult MovieDetail(int id)
{
Movie movie = Movie.GetInstance(id);
return View("MovieDetail", movie);
}
The GetInstance method is supposed to return an instance of the Movie class, which looks like this for the moment:
public static Movie GetInstance(int dbId)
{
    using (var db = new MoviesEntities())
    {
        Database.Movie dbObject = db.Movies.SingleOrDefault(r => r.Id == dbId);
        if (dbObject != null)
        {
            Movie m = new Movie(dbObject.Id, dbObject.Name, dbObject.Description);
            return m;
        }
        return null;
    }
}
It works fine, but is this a good way to implement it? Is there another, cleaner way to get my instance of the Movie class?
Thanks
is this a good way to implement it?
That's a very subjective question. It's valid, and there's nothing technically wrong with this implementation. For my small-size home projects, I've used similar things.
But for business applications, it's better to keep your entities unrelated to your MVC application. This means that your data context + EF + generated entities should be kept in a separate project (let's call it the 'Data' project), and the actual data is passed in the form of a DTO.
So if your entity resembles this:
public class Person {
public int Id { get; set; }
public string Name { get; set; }
}
You'd expect there to be an equivalent DTO class that is able to pass that data:
public class PersonDTO {
public int Id { get; set; }
public string Name { get; set; }
}
This means that your 'Data' project only replies with DTO classes, not entities.
public static MovieDTO GetInstance(int dbId)
{
...
}
It makes the most sense that your DTOs are also in a separate project. The reason for all this abstraction is that when you have to change your datacontext (e.g. the application will start using a different data source), you only need to make sure that the new data project also communicates with the same DTOs. How it works internally, and which entities it uses, is only relevant inside the project. From the outside (e.g. from your MVC application), it doesn't matter how you get the data, only that you pass it in a form that your MVC projects already understand (the DTO classes).
All your MVC controller logic will not have to change, because the DTO objects haven't changed. This could save you hours. If you link the entity to your Controller AND View, you'll have to rewrite both if you suddenly decide to change the entity.
If you're worried about the amount of code you'll have to write for converting entities to DTOs and vice versa, you can look into tools like Automapper.
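For the Person example above, the hand-written version of that mapping is just a few lines (a trivial sketch; AutoMapper merely automates this):

// In the 'Data' project: convert the EF entity to the DTO before returning it.
public static PersonDTO ToDto(Person entity)
{
    return new PersonDTO
    {
        Id = entity.Id,
        Name = entity.Name
    };
}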
The main question: Is this needed?
That, again, is a very subjective question. It's relative to the scope of the project, but also the expected lifetime of the application. If it's supposed to be used for a long time, it might be worth it to keep everything interchangeable. If this is a small scale, short lifetime project, the added time to implement this might not be worth it.
I can't give you a definitive answer on this. Evaluate how well you want the application to adapt to changes, but also how likely it is that the application will change in the future.
Disclaimer: This is how we do it at the company where I work. This is not the only solution to this type of problem, but it's the one I'm familiar with. Personally, I don't like making abstractions unless there's a functional reason for it.
A few things:
The naming you're using is a little awkward and confusing. Generally, you don't ever want to have two classes in your project named the same, even if they're in different namespaces. There's nothing technically wrong with it, but it creates confusion. Which Movie do I need here? And if I'm dealing with a Movie instance, is it Movie or Database.Movie? If you stick to names like Movie and MovieDTO or Movie and MovieViewModel, the class names clearly indicate the purpose (lack of suffix indicates a database-backed entity).
Especially if you're coming from another MVC framework like Rails or Django, ASP.NET's particular flavor of MVC can be a little disorienting. Most other MVC frameworks have a true Model, a single class that functions as the container for all the business logic and also acts as a repository (which could be considered business logic, in a sense). ASP.NET MVC doesn't work that way. Your entities (classes that represent database tables) are and should be dumb. They're just a place for Entity Framework to stuff data it pulls from the database. Your Model (the M in MVC) is really more a combination of your view models and your service/DAL layer. Your Movie class (not to be confused with Database.Movie... see why that naming bit is important) on the other hand is trying to do triple duty, acting as the entity, view model and repository. That's simply too much. Your classes should do one thing and do it well.
Again, if you have a class that's going to act as a service or repository, it should be an actual service or repository, with everything those patterns imply. Even then, you should not instantiate your context in a method. The easiest correct way to handle it is to simply have your context be a class instance variable. Something like:
public class MovieRepository
{
private readonly MovieEntities context;
public MovieRepository()
{
this.context = new MovieEntities();
}
}
Even better, though is to use inversion of control and pass in the context:
public class MovieRepository
{
private readonly MovieEntities context;
public MovieRepository(MovieEntities context)
{
this.context = context;
}
}
Then, you can employ a dependency injection framework, like Ninject or Unity to satisfy the dependency for you (preferably with a request-scoped object) whenever you need an instance of MovieRepository. That's a bit high-level if you're just starting out, though, so it's understandable if you hold off on going the whole mile for now. However, you should still design your classes with this in mind. The point of inversion of control is to abstract dependencies (things like the context for a class that needs to pull entities from the database), so that you can switch out these dependencies if the need should arise (say perhaps if there comes a time when you're going to retrieve the entities from an Web API instead of through Entity Framework, or even if you just decide to switch to a different ORM, such as NHibernate). In your code's current iteration, you would have to touch every method (and make changes to your class in general, violating open-closed).
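For reference, a hedged sketch of what that wiring could look like with Ninject (assuming the Ninject MVC integration package, which provides InRequestScope and resolves controllers through the kernel; MovieEntities follows the naming in the snippets above):

using Ninject.Modules;
using Ninject.Web.Common; // InRequestScope()

public class RepositoryModule : NinjectModule
{
    public override void Load()
    {
        // One context per HTTP request, shared by every repository built for that request.
        Bind<MovieEntities>().ToSelf().InRequestScope();
        Bind<MovieRepository>().ToSelf();
    }
}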
An entity model should never act as a view model. Offering data to the views is the essential role of the view model. A view model can easily be recognized because it doesn't have any other role or responsibility than holding data and the business rules that act solely upon that data. It thus has all the advantages of any other pure model, such as unit-testability.
A good explanation of this can be found in Dino Esposito’s The Three Models of ASP.NET MVC Apps.
You can use AutoMapper
What is AutoMapper?
AutoMapper is a simple little library built to solve a deceptively complex problem - getting rid of code that mapped one object to another. This type of code is rather dreary and boring to write, so why not invent a tool to do it for us?
How do I get started?
Check out the getting started guide.
Where can I get it?
First, install NuGet. Then, install AutoMapper from the package manager console:
PM> Install-Package AutoMapper
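A minimal usage sketch with the Movie classes from the question (this uses the MapperConfiguration API of more recent AutoMapper versions; older releases used static Mapper.CreateMap calls):

using AutoMapper;

var config = new MapperConfiguration(cfg =>
{
    // Map the EF-generated entity to the presentation class.
    cfg.CreateMap<Database.Movie, Movie>();
});

IMapper mapper = config.CreateMapper();
Movie movie = mapper.Map<Movie>(dbObject);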

Trying to simplify our repository pattern

Currently we have implemented a repository pattern at work. All our repositories sit behind their own interfaces and are mapped via Ninject. Our project is quite large and there are a couple quirks with this pattern I'm trying to solve.
First, there are some controllers where we need upwards of 10 to 15 repositories all in the same controller. The constructor gets rather ugly when asking for so many repositories. The second quirk reveals itself after you call methods on multiple repositories. After doing work with multiple repositories we need to call the SaveChanges method, but which repository should we call it on? Every repository has one. All repositories have the same instance of the Entity Framework data context injected so picking any random repository to call save on will work. It just seems so messy.
I looked up the "Unit Of Work" pattern and came up with a solution that I think solves both problems, but I'm not 100% confident in this solution. I created a class called DataBucket.
// Slimmed down for readability
public class DataBucket
{
private DataContext _dataContext;
public IReportsRepository ReportRepository { get; set; }
public IEmployeeRepository EmployeeRepository { get; set; }
public IDashboardRepository DashboardRepository { get; set; }
public DataBucket(DataContext dataContext,
IReportsRepository reportsRepository,
IEmployeeRepository employeeRepository,
IDashboardRepository dashboardRepository)
{
_dataContext = dataContext;
this.ReportRepository = reportsRepository;
this.EmployeeRepository = employeeRepository;
this.DashboardRepository = dashboardRepository;
}
public void SaveChanges()
{
_dataContext.SaveChanges();
}
}
This appears to solve both issues. There is now only one SaveChanges method on the data bucket itself and you only inject one object, the data bucket. You then access all the repositories as properties. The data bucket would be a little messy looking since it would be accepting ALL (easily 50 or more) of our repositories in its constructor.
The process of adding a new repository would now include: creating the interface, creating the repository, mapping the interface and repository in Ninject, and adding a property to the data bucket and populating it.
I did think of an alternative to this that would eliminate a step from above.
public class DataBucket
{
private DataContext _dataContext;
public IReportsRepository ReportRepository { get; set; }
public IEmployeeRepository EmployeeRepository { get; set; }
public IDashboardRepository DashboardRepository { get; set; }
public DataBucket(DataContext dataContext)
{
_dataContext = dataContext;
this.ReportRepository = new ReportsRepository(dataContext);
this.EmployeeRepository = new EmployeeRepository(dataContext);
this.DashboardRepository = new DashboardRepository(dataContext);
}
public void SaveChanges()
{
_dataContext.SaveChanges();
}
}
This one pretty much eliminates all the repository mappings in Ninject because they are all instantiated in the data bucket. So now the steps to adding a new repository include: Create interface, create repository, add property to data bucket and instantiate.
Can you see any flaws with this model? On the surface it seems much more convenient to consume our repositories in this way. Is this a problem that has been addressed before? If so, what is the most common and/or most efficient approach to this issue?
First, there are some controllers where we need upwards of 10 to 15 repositories all in the same controller.
Say hello to the Abstract Factory pattern. Instead of registering all repositories in Ninject and injecting them into controllers, register just a single factory implementation that can provide any repository you need - you can even create them lazily, only if the controller really needs them. Then inject the factory into the controller.
Yes, it also has some disadvantages - you are giving the controller permission to get any repository. Is that a problem for you? You can always create multiple factories for some subsystems if you need to, or simply expose multiple factory interfaces on a single implementation. It still doesn't cover all cases, but it is better than passing 15 parameters to the constructor. By the way, are you sure those controllers shouldn't be split?
Note: this is not the Service Locator anti-pattern.
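A rough sketch of such a factory, reusing the repository names from the DataBucket example (the lazy-creation detail is illustrative):

public interface IRepositoryFactory
{
    IReportsRepository Reports { get; }
    IEmployeeRepository Employees { get; }
    IDashboardRepository Dashboards { get; }
}

public class RepositoryFactory : IRepositoryFactory
{
    private readonly Lazy<IReportsRepository> _reports;
    private readonly Lazy<IEmployeeRepository> _employees;
    private readonly Lazy<IDashboardRepository> _dashboards;

    public RepositoryFactory(DataContext dataContext)
    {
        // Each repository is created only if a controller actually asks for it.
        _reports = new Lazy<IReportsRepository>(() => new ReportsRepository(dataContext));
        _employees = new Lazy<IEmployeeRepository>(() => new EmployeeRepository(dataContext));
        _dashboards = new Lazy<IDashboardRepository>(() => new DashboardRepository(dataContext));
    }

    public IReportsRepository Reports { get { return _reports.Value; } }
    public IEmployeeRepository Employees { get { return _employees.Value; } }
    public IDashboardRepository Dashboards { get { return _dashboards.Value; } }
}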
After doing work with multiple repositories we need to call the SaveChanges method, but which repository should we call it on?
Say hello to the Unit of Work pattern. A Unit of Work is a logical transaction in your application: it persists all changes from the logical transaction together. A repository should not be responsible for persisting changes - the unit of work should be. Somebody mentioned that DbContext is an implementation of the Repository pattern. It is not; it is an implementation of the Unit of Work pattern, and DbSet is an implementation of the Repository pattern.
What you need is a central class holding the instance of the context. The context will also be passed to repositories because they need it to retrieve data, but only the central class (the unit of work) will offer saving changes. It can also handle the database transaction if you, for example, need to change the isolation level.
Where should the unit of work be handled? That depends on where your logical operation is orchestrated. If the operation is orchestrated directly in the controller's actions, you need to have the unit of work in the action as well and call SaveChanges once all modifications are done.
If you don't care too much about separation of concerns, you can even combine the unit of work and the factory into a single class. That brings us to your DataBucket.
I think you are absolutely right to use the Unit of Work pattern in this case. Not only does this prevent you from needing a SaveChanges method on every repository, it provides you a nice way to handle transactions from within code rather than in your database itself. I included a Rollback method with my UOW so that if there was an exception I could undo any of the changes the operation had already made on my DataContext.
One thing you could do to prevent weird dependency issues would be to group related repositories on their own Unit of Work, rather than having one big DataBucket that holds every Repository you have (if that was your intent). Each UOW would only need to be accessible at the same level as the repositories it contained, and other repositories should probably not depend on other UOWs themselves (your repositories shouldn't need to use other repositories).
If you wanted to be an even bigger purist about the pattern, you could also structure your UOWs to represent just that: a single unit of work. You define them to represent a specific operation in your domain and provide each with the repositories required to complete that operation. Individual repositories could exist on more than one UOW, if it made sense for them to be used by more than one operation in your domain.
For example, a PlaceCustomerOrderUnitOfWork may need a CustomerRepository, OrderRepository, BillingRepository, and a ShippingRepository
A CreateCustomerUnitOfWork may need just a CustomerRepository. Either way, you can easily pass that dependency around to its consumers; more fine-grained interfaces for your UOWs can help target your testing and reduce the effort to create a mock.
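A sketch of that operation-scoped shape, assuming the repositories above follow the same I-prefixed interface convention as the rest of the question:

public class PlaceCustomerOrderUnitOfWork
{
    private readonly DataContext _dataContext;

    public ICustomerRepository Customers { get; private set; }
    public IOrderRepository Orders { get; private set; }
    public IBillingRepository Billing { get; private set; }
    public IShippingRepository Shipping { get; private set; }

    public PlaceCustomerOrderUnitOfWork(DataContext dataContext,
        ICustomerRepository customers,
        IOrderRepository orders,
        IBillingRepository billing,
        IShippingRepository shipping)
    {
        _dataContext = dataContext;
        Customers = customers;
        Orders = orders;
        Billing = billing;
        Shipping = shipping;
    }

    public void SaveChanges()
    {
        // One logical transaction: everything this operation touched is persisted together.
        _dataContext.SaveChanges();
    }
}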
The notion of every repository having a SaveChanges is flawed because calling it saves everything. It is not possible to save only part of a DataContext's changes; you always save everything. So a central DataContext holder class is a good idea.
Alternatively, you could have a repository with generic methods that can operate on any entity type (GetTable<T>, Query<T>, ...). That would get rid of all those classes and merge them into one (basically, only DataBucket remains).
It might even be the case that you don't need repositories at all: you can inject the DataContext itself! The DataContext by itself is a repository and a full-fledged data access layer. It doesn't lend itself to mocking, though.
Whether you can do this depends on what you need the "repository" to provide.
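A minimal sketch of that single generic repository, assuming an EF DbContext underneath (GetTable<T>/Query<T> simply forward to Set<T>()):

using System.Linq;
using System.Data.Entity; // EF 6; adjust for your EF version

public class GenericRepository
{
    private readonly DbContext _context;

    public GenericRepository(DbContext context)
    {
        _context = context;
    }

    public IQueryable<T> Query<T>() where T : class
    {
        return _context.Set<T>();
    }

    public void Add<T>(T entity) where T : class
    {
        _context.Set<T>().Add(entity);
    }

    public void SaveChanges()
    {
        _context.SaveChanges();
    }
}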
The only issue with having that DataBucket class would be that this class needs to know about all entities and all repositories. So it sits very high in the software stack (at the top). At the same time it is being used by basically everything so it sits at the bottom, too. Wait! That is a dependency cycle over the whole codebase.
This means that everything using it and everything being used by it must sit in the same assembly.
What I have done in the past was to create child injection containers (I was using Unity) and register the data context with a ContainerControlledLifetimeManager, so that when the repositories are instantiated, they always have the same data context injected into them. I then hang on to that data context, and when my "Unit of Work" is complete, I call DataContext.SaveChanges(), flushing all the changes out to the database.
This has some other advantages such as (with EF) some local caching, such that if more than one repository needs to get the same entity, only the first repository actually causes a database round trip.
It's also a nice way to "batch up" the changes and make sure they execute as a single atomic transaction.
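A rough sketch of that child-container approach with Unity (assuming the repository interfaces are registered on the parent container and take the DataContext as a constructor parameter):

using (var child = container.CreateChildContainer())
{
    // One DataContext per child container, i.e. per "unit of work".
    child.RegisterType<DataContext>(new ContainerControlledLifetimeManager());

    var reports = child.Resolve<IReportsRepository>();
    var employees = child.Resolve<IEmployeeRepository>();

    // ...do work with the repositories...

    // All repositories share the same context, so one call flushes everything atomically.
    child.Resolve<DataContext>().SaveChanges();
}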

How to expose the DataContext from within a DataContext class?

Is it possible to expose the DataContext when extending a class in the DataContext? Consider this:
public partial class SomeClass {
    public object SomeExtraProperty {
        get {
            return this.DataContext.ExecuteQuery<T>("{SOME_REALLY_COMPLEX_QUERY_THAT_HAS_TO_BE_IN_RAW_SQL_BECAUSE_LINQ_GENERATES_CRAP_IN_THIS INSTANCE}");
        }
    }
}
How can I go about doing this? I have a sloppy version working now, where I pass the DataContext to the view model and from there pass it to the method I have set up in the partial class. I'd like to avoid passing the DataContext around and just have a property that I can reference.
UPDATE FOR @Aaronaught
So, how would I go about writing the code? I know that's a vague question, but from what I've seen online so far, all the tutorials seem to assume I know where to place the code, how to use it, etc.
Say I have a very simple application structured as (in folders):
Controllers
Models
Views
Where do the repository files go? In the Models folder or can I create a "Repositories" folder just for them?
Past that, how is the repository aware of the DataContext? Do I have to create a new instance of it in each method of the repository? (If so, that seems inefficient... and wouldn't that cause problems with pulling an object out of one instance and using it in a controller that's using a different instance...?)
For example I currently have this setup:
public class BaseController : Controller {
protected DataContext dc = new DataContext();
}
public class XController : BaseController {
// stuff
}
This way I have a "global" DataContext available to all controllers who inherit from BaseController. It is my understanding that that is efficient (I could be wrong...).
In my Models folder I have a "Collections" folder, which really serve as the ViewModels:
public class BaseCollection {
// Common properties for the Master page
}
public class XCollection : BaseCollection {
// X View specific properties
}
So, taking all of this, where and how would the repository plug in? Would it be something like this (using the real objects of my app):
public interface IJobRepository {
    Job GetById(int JobId);
}

public class JobRepository : IJobRepository {
    public Job GetById(int JobId) {
        using (DataContext dc = new DataContext()) {
            return dc.Jobs.Single(j => (j.JobId == JobId));
        }
    }
}
Also, what's the point of the interface? Is it so other services can hook up to my app? What if I don't plan on having any such capabilities?
Moving on, would it be better to have an abstraction object that collects all the information for the real object? For example, an IJob object that would have all of the properties of the Job plus the additional properties I may want to add, such as the Name? So would the repository change to:
public interface IJobRepository {
    IJob GetById(int JobId);
}

public class JobRepository : IJobRepository {
    public IJob GetById(int JobId) {
        using (DataContext dc = new DataContext()) {
            return dc.Jobs.Single(j => new IJob {
                Name = dc.SP(JobId) // of course, the projection here is wrong,
                                    // but you get the point...
            });
        }
    }
}
My head is so confused now. I would love to see a tutorial from start to finish, i.e., "File -> New -> Do this -> Do that".
Anyway, @Aaronaught, sorry for slamming such a huge question at you, but you obviously have substantially more knowledge of this than I do, so I want to pick your brain as much as I can.
Honestly, this isn't the kind of scenario that Linq to SQL is designed for. Linq to SQL is essentially a thin veneer over the database; your entity model is supposed to closely mirror your data model, and oftentimes your Linq to SQL "entity model" simply isn't appropriate to use as your domain model (which is the "model" in MVC).
Your controller should be making use of a repository or service of some kind. It should be that object's responsibility to load the specific entities along with any additional data that's necessary for the view model. If you don't have a repository/service, you can embed this logic directly into the controller, but if you do this a lot then you're going to end up with a brittle design that's difficult to maintain - better to start with a good design from the get-go.
Do not try to design your entity classes to reference the DataContext. That's exactly the kind of situation that ORMs such as Linq to SQL attempt to avoid. If your entities are actually aware of the DataContext then they're violating the encapsulation provided by Linq to SQL and leaking the implementation to public callers.
You need to have one class responsible for assembling the view models, and that class should either be aware of the DataContext itself, or various other classes that reference the DataContext. Normally the class in question is, as stated above, a domain repository of some kind that abstracts away all the database access.
P.S. Some people will insist that a repository should exclusively deal with domain objects and not presentation (view) objects, and refer to the latter as services or builders; call it what you like, the principle is essentially the same, a class that wraps your data-access classes and is responsible for loading one specific type of object (view model).
Let's say you're building an auto trading site and need to display information about the domain model (the actual car/listing) as well as some related-but-not-linked information that has to be obtained separately (let's say the price range for that particular model). So you'd have a view model like this:
public class CarViewModel
{
public Car Car { get; set; }
public decimal LowestModelPrice { get; set; }
public decimal HighestModelPrice { get; set; }
}
Your view model builder could be as simple as this:
public class CarViewModelService
{
private readonly CarRepository carRepository;
private readonly PriceService priceService;
public CarViewModelService(CarRepository cr, PriceService ps) { ... }
public CarViewModel GetCarData(int carID)
{
var car = carRepository.GetCar(carID);
decimal lowestPrice = priceService.GetLowestPrice(car.ModelNumber);
decimal highestPrice = priceService.GetHighestPrice(car.ModelNumber);
return new CarViewModel { Car = car, LowestModelPrice = lowestPrice,
    HighestModelPrice = highestPrice };
}
}
That's it. CarRepository is a repository that wraps your DataContext and loads/saves Cars, and PriceService essentially wraps a bunch of stored procedures set up in the same DataContext.
It may seem like a lot of effort to create all these classes, but once you get into the swing of it, it's really not that time-consuming, and you'll ultimately find it way easier to maintain.
Update: Answers to New Questions
Where do the repository files go? In the Models folder or can I create a "Repositories" folder just for them?
Repositories are part of your model if they are responsible for persisting model classes. If they deal with view models (AKA they are "services" or "view model builders") then they are part of your presentation logic; technically they are somewhere between the Controller and Model, which is why in my MVC apps I normally have both a Model namespace (containing actual domain classes) and a ViewModel namespace (containing presentation classes).
how is the repository aware of the DataContext?
In most instances you're going to want to pass it in through the constructor. This allows you to share the same DataContext instance across multiple repositories, which becomes important when you need to write back a View Model that comprises multiple domain objects.
Also, if you later decide to start using a Dependency Injection (DI) Framework then it can handle all of the dependency resolution automatically (by binding the DataContext as HTTP-request-scoped). Normally your controllers shouldn't be creating DataContext instances, they should actually be injected (again, through the constructor) with the pre-existing individual repositories, but this can get a little complicated without a DI framework in place, so if you don't have one, it's OK (not great) to have your controllers actually go and create these objects.
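Concretely, for the JobRepository sketched in the question, constructor injection would look roughly like this:

public class JobRepository : IJobRepository
{
    private readonly DataContext dc;

    // The same context instance can be shared with other repositories for the request.
    public JobRepository(DataContext dc)
    {
        this.dc = dc;
    }

    public Job GetById(int jobId)
    {
        return dc.Jobs.Single(j => j.JobId == jobId);
    }
}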
In my Models folder I have a "Collections" folder, which really serve as the ViewModels
This is wrong. Your View Model is not your Model. View Models belong to the View, which is separate from your Domain Model (which is what the "M" or "Model" refers to). As mentioned above, I would suggest actually creating a ViewModel namespace to avoid bloating the Views namespace.
So, taking all of this where and how would the repository plug-in?
See a few paragraphs above - the repository should be injected with the DataContext and the controller should be injected with the repository. If you're not using a DI framework, you can get away with having your controller create the DataContext and repositories, but try not to cement the latter design too much, you'll want to clean it up later.
Also, what's the point of the interface?
Primarily it's so that you can change your persistence model if need be. Perhaps you decide that Linq to SQL is too data-oriented and you want to switch to something more flexible like Entity Framework or NHibernate. Perhaps you need to implement support for Oracle, MySQL, or some other non-Microsoft database. Or perhaps you fully intend to continue using Linq to SQL but want to be able to write unit tests for your controllers; the only way to do that is to inject mock/fake repositories into the controllers, and for that to work, they need to be abstract types.
Moving on, would it be better to have an abstraction object that collects all the information for the real object? For example an IJob object which would have all of the properties of the Job + the additional properties I may want to add such as the Name?
This is more or less what I recommended in the first place, although you've done it with a projection which is going to be harder to debug. Better to just call the SP on a separate line of code and combine the results afterward.
Also, you can't use an interface type for your Domain or View Model. Not only is it the wrong metaphor (models represent the immutable laws of your application, they are not supposed to change unless the real-world requirements change), but it's actually not possible; interfaces can't be databound because there's nothing to instantiate when posting.
So yeah, you've sort of got the right idea here, except (a) instead of an IJob it should be your JobViewModel, (b) instead of an IJobRepository it should be a JobViewModelService, and (c) instead of directly instantiating the DataContext it should accept one through the constructor.
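Putting (a), (b), and (c) together, a hedged sketch (dc.SP stands in for whatever stored-procedure call the question's placeholder was meant to be):

public class JobViewModel
{
    public Job Job { get; set; }
    public string Name { get; set; }
}

public class JobViewModelService
{
    private readonly DataContext dc;

    public JobViewModelService(DataContext dc)
    {
        this.dc = dc;
    }

    public JobViewModel GetById(int jobId)
    {
        Job job = dc.Jobs.Single(j => j.JobId == jobId);

        // Call the stored procedure on a separate line and combine afterward.
        string name = dc.SP(jobId); // placeholder from the question
        return new JobViewModel { Job = job, Name = name };
    }
}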
Keep in mind that the purpose of all of this is to keep a clean, maintainable design. If you have a 24-hour deadline to meet then you can still get it to work by just shoving all of this logic directly into the controller. Just don't leave it that way for long, otherwise your controllers will (d)evolve into God-Object abominations.
Replace {SOME_REALLY_COMPLEX_QUERY_THAT_HAS_TO_BE_IN_RAW_SQL_BECAUSE_LINQ_GENERATES_CRAP_IN_THIS INSTANCE} with a stored procedure, then have Linq to SQL import that function.
You can then call the function directly from the data context, get the results, and pass them to the view model.
I would avoid making a property that calls the data context. You should just get the value from a service or repository layer whenever you need it instead of embedding it into one of the objects created by Linq to SQL.
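A rough sketch of that flow, assuming a stored procedure has been dragged into the .dbml designer so Linq to SQL exposes it as a method on the DataContext (every name below is hypothetical):

using (var dc = new MyDataContext())
{
    var someClass = dc.SomeClasses.Single(s => s.Id == id);

    // Imported stored procedures return ISingleResult<T>.
    var extra = dc.GetSomeClassExtraData(id).SingleOrDefault();

    var viewModel = new SomeClassViewModel
    {
        SomeClass = someClass,
        SomeExtraProperty = extra
    };
}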
