Create service layer with mocked data from LINQ DataContext - c#

I'm using LINQ-to-SQL with ASP.NET MVC 4, and at the moment I have a repository layer that talks to a real database. This is not very good when I want to unit test.
Also, there should never be any logic in the repository layer, which is why I want to mock the LINQ DataContext so I can create a service layer that talks either to the mocked DataContext or to the real one.
I see that my LINQ DataContext class inherits from DataContext, but there is no interface, so I can't really mock that. I also see that DataContext uses the Table<> class and that an ITable interface exists, so I could probably mock that. My LINQ DataContext is also a partial class, so maybe I could take advantage of that somehow?
When I google this, all the articles are from 2008 and outdated.
Can anyone point me in the right direction?
Here is an example of what I want to do. I will have a separate service class for each controller.
public class MyServiceClass
{
    IDataContext _context;

    // Constructors with dependency injection
    public MyServiceClass()
    {
        _context = new MyRealDataContext();
    }

    public MyServiceClass(IDataContext ctx)
    {
        _context = ctx;
    }

    // Service functions
    public IEnumerable<ModelClass> GetAll()
    {
        return _context.ModelClass;
    }

    public ModelClass GetOne(int id)
    {
        return _context.ModelClass.Where(s => s.ID == id).SingleOrDefault();
    }
}

Although LINQ-to-SQL is still supported in .NET 4+, it has taken a back seat to Entity Framework. That's probably why you're mostly finding older articles.
Anyway, the best way to go is to write your own data-access interface and use it throughout your application. You can then have one implementation of that interface that uses LINQ-to-SQL for production and a mocked implementation for your unit tests.
Use dependency injection to instantiate the actual implementation class.
To create a mock implementation you can either do it manually (create a class in your test project that implements the IDataContext interface but returns hard-coded data) or use one of the mocking frameworks out there.
I have not used all of them, but Moq is quite nice. Microsoft now also ships its own framework, called Fakes, with Visual Studio 2012; it is worth looking at.
Example using Moq:
var expectedResultList = new List<ModelClass>() { ... };

var mockDataContext = new Mock<IDataContext>();
mockDataContext.Setup(c => c.GetAll()).Returns(expectedResultList);

MyServiceClass service = new MyServiceClass(mockDataContext.Object);
var list = service.GetAll();

Assert.AreEqual(expectedResultList, list);
In this code you set up the mock object so that it returns your expected list when the GetAll method is called.
This way you can easily test your business logic against different results coming back from your data access layer.
Example of IDataContext:
public interface IDataContext<T>
{
    IEnumerable<T> GetAll();
    T GetById(int id);
    int Save(T model);
}

public class LinqToSqlDataContext<T> : IDataContext<T>
{
    private DataContext context = new DataContext(); // your generated LINQ-to-SQL context

    public IEnumerable<T> GetAll()
    {
        // query the datacontext and return an enumerable
        throw new NotImplementedException();
    }

    public T GetById(int id)
    {
        // query the datacontext and return the object
        throw new NotImplementedException();
    }

    public int Save(T model)
    {
        // save the object in the datacontext
        throw new NotImplementedException();
    }
}
public class MyFirstServiceClass
{
    private IDataContext<MyClass> context;

    public MyFirstServiceClass(IDataContext<MyClass> ctx)
    {
        this.context = ctx;
    }

    ....
}

public class MySecondServiceClass
{
    private IDataContext<MySecondClass> context;

    public MySecondServiceClass(IDataContext<MySecondClass> ctx)
    {
        this.context = ctx;
    }

    ....
}
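If you prefer the hand-rolled option mentioned earlier, a fake can be as simple as an in-memory list behind the same IDataContext<T> interface. This is a minimal sketch; FakeDataContext is a made-up name, and the key-selector parameter is an assumption, since the interface itself doesn't say how to read a key:
public class FakeDataContext<T> : IDataContext<T>
{
    // In-memory data used instead of a real database (requires System.Linq)
    private readonly List<T> _items;
    private readonly Func<T, int> _idSelector; // how to read the key from a model

    public FakeDataContext(IEnumerable<T> seed, Func<T, int> idSelector)
    {
        _items = new List<T>(seed);
        _idSelector = idSelector;
    }

    public IEnumerable<T> GetAll()
    {
        return _items;
    }

    public T GetById(int id)
    {
        return _items.SingleOrDefault(x => _idSelector(x) == id);
    }

    public int Save(T model)
    {
        _items.Add(model);
        return _items.Count;
    }
}
A test can then new up MyFirstServiceClass with a FakeDataContext<MyClass> seeded with hard-coded data, no mocking framework required.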

Related

Should we use the entities generated by entity framework

Hello, I use Entity Framework with a unit of work pattern, and I would like to know whether, in my application layer, I should work directly with the entities generated by Entity Framework or create POCO objects in my application layer and map my entities to those POCOs.
I would like my application layer not to reference my entities at all. For example, I would like to create another project in my solution that maps my entities to my POCOs, but I don't know if this is good practice, and above all I don't know how to do it.
Thank you in advance!
In my UnitOfWork I have used a generic repository pattern that uses the models generated by EF directly. The IRepository<T> interface looks a bit like this:
public interface IRepository<T> where T : class
{
    void Add(T entity);
    T GetById(long Id);
    //etc - all the stuff you probably have
}
I have an implementation of IRepository called Repository:
public class Repository<T> : IRepository<T> where T : class
{
    private readonly Infomaster _dbContext;

    public Repository(Infomaster dbContext)
    {
        _dbContext = dbContext;
    }

    public void Add(T entity)
    {
        _dbContext.Set<T>().Add(entity);
    }
}
Using Set<T>() with the type parameter lets me access the DbSet for that particular type, which is what makes the generic pattern possible. You can use specific classes instead, but it's a lot more work.
This means that in my UnitOfWork I only need to do the following:
public class UnitOfWork : IUnitOfWork
{
    //Db context
    Infomaster _dbContext;

    //User is a model from my EF
    public IRepository<User> UserRepository { get; private set; }

    public UnitOfWork()
    {
        _dbContext = new Infomaster();
        UserRepository = new Repository<User>(_dbContext);
    }

    public int Commit()
    {
        return _dbContext.SaveChanges();
    }
}
I find this is the best way, and it does mean using the model classes directly. I am using a code-first database, but I have also done this with database-first.
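To give an idea of how this is consumed, an application service using the unit of work could look roughly like this. This is only a sketch: UserService, Register and the Name property are made up for illustration, and IUnitOfWork is assumed to expose UserRepository and Commit() just like the class above:
public class UserService
{
    private readonly IUnitOfWork _unitOfWork;

    public UserService(IUnitOfWork unitOfWork)
    {
        _unitOfWork = unitOfWork;
    }

    public void Register(string name)
    {
        // Works directly against the EF-generated User entity
        _unitOfWork.UserRepository.Add(new User { Name = name });
        _unitOfWork.Commit();
    }
}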

Entity Framework - infrastructure in WPF/MVVM application

I'm using EF for the first time in a WPF application, using the MVVM pattern. I have read a lot, but I couldn't come up with a solution. My problem is how to integrate EF into my app.
The most common approach I found is to build your own Repository + UnitOfWork. I don't like it, basically because I already have DbContext and DbSet, which already act as a unit of work and repositories, so why reinvent the wheel?
So I tried to use the DbContext directly from my view models, like this:
public class BookCollectionViewModel : ViewModelBase
{
    public void LoadCollection()
    {
        Books.Clear();
        var books = new List<Book>();
        using (var ctx = new DbContext())
        {
            books = ctx.Set<Book>().ToList();
        }
        books.ForEach(b => Books.Add(b));
    }

    public ObservableCollection<Book> Books { get; } = new ObservableCollection<Book>();
}
But I don't like using the DbContext directly from my view models, so I built a service layer:
public class DbServices
{
    public TReturn Execute<TEntity, TReturn>(Func<IDbSet<TEntity>, TReturn> func)
        where TEntity : class
    {
        TReturn retVal = default(TReturn);
        using (var ctx = new DbContext())
        {
            retVal = func(ctx.Set<TEntity>());
        }
        return retVal;
    }
}
public class BookCollectionViewModel : ViewModelBase
{
    private DbServices mDbServices = new DbServices();

    public void LoadCollection()
    {
        Books.Clear();
        var books = mDbServices.Execute<Book, List<Book>>(dbSet => dbSet.ToList());
        books.ForEach(b => Books.Add(b));
    }

    public ObservableCollection<Book> Books { get; } = new ObservableCollection<Book>();
}
But this way every action is atomic, so when I modify an entity I have to call SaveChanges() every time or lose the changes, because the DbContext is always disposed. So why not create a class-wide DbContext?
public class DbServices
{
    private Lazy<DbContext> mContext;

    public DbServices()
    {
        mContext = new Lazy<DbContext>(() => new DbContext());
    }

    public DbContext Context { get { return mContext.Value; } }

    public TReturn Execute<TEntity, TReturn>(Func<IDbSet<TEntity>, TReturn> func)
        where TEntity : class
    {
        return func(Context.Set<TEntity>());
    }
}
Unfortunately, this doesn't work either, because once the DbContext is created it is never disposed... So how about explicitly opening/closing the DbContext?
The question is: where and how should I create/dispose the DbContext? The only thing I'm sure of is that I don't want to rebuild repository and unit of work, since they already exist as DbContext and DbSet...
I'm in the same position. I find that any persistent repository on the client side causes users not to see each other's changes.
Here's a good video explaining why EF is not the same as a repository
https://www.youtube.com/watch?v=rtXpYpZdOzM
Also I found an excellent end-to-end tutorial on WPF,MVVM & EF
http://www.software-architects.com/devblog/2010/09/10/MVVM-Tutorial-from-Start-to-Finish.
In this tutorial he exposes the data through a WCF Data Service and detaches the entities from the DbContext straight away.
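The detaching idea itself is roughly this (a sketch, not the tutorial's exact code; BookEntities is a placeholder for your own context type):
public List<Book> LoadDetachedBooks()
{
    using (var ctx = new BookEntities())
    {
        // AsNoTracking returns entities the context never tracks
        var books = ctx.Books.AsNoTracking().ToList();

        // Alternatively, detach an already-tracked entity explicitly:
        // ctx.Entry(someBook).State = EntityState.Detached;

        return books;
    }
}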
Hope it helps

How to implement FIND method of EF in Unit Test?

I have a Web API 2.0 project that I am unit testing. My controllers have a Unit of Work. The Unit of Work contains numerous repositories for various DbSets. I have a Unity container in the Web API project and I am using Moq in the test project. Within the various repositories I use Entity Framework's Find method to locate an entity based on its key. I am using Entity Framework 6.0.
Here is a very general example of the Unit of Work:
public class UnitOfWork
{
    private IUnityContainer _container;
    public IUnityContainer Container
    {
        get
        {
            return _container ?? UnityConfig.GetConfiguredContainer();
        }
    }

    private ApplicationDbContext _context;
    public ApplicationDbContext Context
    {
        get { return _context ?? Container.Resolve<ApplicationDbContext>(); }
    }

    private GenericRepository<ExampleModel> _exampleModelRepository;
    public GenericRepository<ExampleModel> ExampleModelRepository
    {
        get
        {
            return _exampleModelRepository ??
                Container.Resolve<GenericRepository<ExampleModel>>();
        }
    }

    //Numerous other repositories and some additional methods for saving
}
The problem I am running into is that I use the Find method in some of my LINQ queries in the repositories. Based on this article, MSDN: Testing with your own test doubles (EF6 onwards), I have to create a TestDbSet<ExampleModel> to test the Find method. I was thinking about customizing the code to something like this:
namespace TestingDemo
{
    // Derives from the MSDN TestDbSet<TEntity> test double
    class TestDbSetWithFind<TEntity> : TestDbSet<TEntity>
        where TEntity : BaseModel // some base class that exposes a string Id
    {
        public override TEntity Find(params object[] keyValues)
        {
            var id = (string)keyValues.Single();
            return this.SingleOrDefault(b => b.Id == id);
        }
    }
}
I figured I would have to constrain TEntity to some base class that has an Id property. That's my theory, but I'm not sure it is the best way to handle this.
So I have two questions. Is the approach listed above valid? If not, what would be a better approach for overriding the Find method in the DbSet with the SingleOrDefault method? Also, this approach only really works if there is only one primary key. What if my model has a compound key of different types? I assume I would have to handle those individually. Okay, that was three questions?
To expand on my comment earlier, I'll start with my proposed solution, and then explain why.
Your problem is this: your repositories have a dependency on DbSet<T>. You are unable to test your repositories effectively because they depend on DbSet<T>.Find(int[]), so you have decided to substitute your own variant of DbSet<T> called TestDbSet<T>. This is unnecessary; DbSet<T> implements IDbSet<T>. Using Moq, we can very cleanly create a stub implementation of this interface that returns a hard coded value.
class MyRepository
{
    private readonly IDbSet<MyType> dbSet;

    public MyRepository(IDbSet<MyType> dbSet)
    {
        this.dbSet = dbSet;
    }

    public MyType FindEntity(int id)
    {
        return this.dbSet.Find(id);
    }
}
By switching the dependency from DbSet<T> to IDbSet<T>, the test now looks like this:
public void MyRepository_FindEntity_ReturnsExpectedEntity()
{
    var id = 5;
    var expectedEntity = new MyType();
    var dbSet = Mock.Of<IDbSet<MyType>>(set => set.Find(id) == expectedEntity);
    var repository = new MyRepository(dbSet);

    var result = repository.FindEntity(id);

    Assert.AreSame(expectedEntity, result);
}
There - a clean test that doesn't expose any implementation details or deal with nasty mocking of concrete classes and lets you substitute out your own version of IDbSet<MyType>.
On a side note, if you find yourself testing DbContext - don't. If you have to do that, your DbContext is too far up the stack and it will hurt if you ever try and move away from Entity Framework. Create an interface that exposes the functionality you need from DbContext and use that instead.
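For example, such an interface can be as small as the members your code actually touches (a sketch; the names here are placeholders, not an existing API):
public interface IMyDataContext
{
    IDbSet<MyType> MyTypes { get; }
    int SaveChanges();
}
Your production DbContext can implement it as-is, and your tests get a trivially mockable seam.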
Note: I used Moq above. You can use any mocking framework, I just prefer Moq.
If your model has a compound key (or has the capability to have different types of keys), then things get a bit trickier. The way to solve that is to introduce your own interface. This interface should be consumed by your repositories, and the implementation should be an adapter to transform the key from your composite type into something that EF can deal with. You'd probably go with something like this:
interface IGenericDbSet<TKeyType, TObjectType>
{
    TObjectType Find(TKeyType keyType);
}
This would then translate under the hood in an implementation to something like:
class GenericDbSet<TKeyType, TObjectType> : IGenericDbSet<TKeyType, TObjectType>
    where TObjectType : class
{
    private readonly IDbSet<TObjectType> dbset;

    public GenericDbSet(IDbSet<TObjectType> dbset)
    {
        this.dbset = dbset;
    }

    public TObjectType Find(TKeyType key)
    {
        // TODO: Convert key into something a regular dbset can understand
        return this.dbset.Find(key);
    }
}
I realise this is an old question, but after coming up against this issue myself when mocking data for unit tests, I wrote this generic version of the Find method that can be used in the TestDbSet implementation explained on MSDN.
Using this method means you don't have to create concrete types for each of your DbSets. One point to note is that this implementation works if your entities have primary keys in one of the following forms (I'm sure you could easily modify it to suit other forms):
'Id'
'ID'
'id'
classname + 'id'
classname + 'Id'
classname + 'ID'
public override T Find(params object[] keyValues)
{
    ParameterExpression _ParamExp = Expression.Parameter(typeof(T), "a");
    Expression _BodyExp = null;
    Expression _Prop = null;
    Expression _Cons = null;

    PropertyInfo[] props = typeof(T).GetProperties();
    var typeName = typeof(T).Name.ToLower() + "id";
    var key = props.Where(p => p.Name.ToLower().Equals("id") || p.Name.ToLower().Equals(typeName)).Single();

    _Prop = Expression.Property(_ParamExp, key.Name);
    _Cons = Expression.Constant(keyValues.Single(), key.PropertyType);
    _BodyExp = Expression.Equal(_Prop, _Cons);

    var _Lambda = Expression.Lambda<Func<T, Boolean>>(_BodyExp, new ParameterExpression[] { _ParamExp });

    return this.SingleOrDefault(_Lambda);
}
Also, from a performance point of view it's not going to be as quick as the recommended approach, but for my purposes it's fine.
So, based on the example, I did the following to be able to unit test my UnitOfWork.
I had to make sure my UnitOfWork was implementing IApplicationDbContext. (Also, when I say UnitOfWork, my controller's UnitOfWork is of type IUnitOfWork.)
I left all of the DbSets in my IApplicationDbContext alone (a rough sketch of that interface follows the TestContext code below). I chose this pattern once I noticed IDbSet didn't include RemoveRange and FindAsync, which I use throughout my code. Also, with EF6 the DbSet properties can be declared virtual, and that is what MSDN recommends, so it made sense.
I followed the Creating the in-memory test doubles example to create the TestDbContext and all of the recommended classes (e.g. TestDbAsyncQueryProvider, TestDbAsyncEnumerable, TestDbAsyncEnumerator). Here is the code:
public class TestContext : DbContext, IApplicationDbContext
{
    public TestContext()
    {
        this.ExampleModels = new TestBaseClassDbSet<ExampleModel>();
        //New up the rest of the TestBaseClassDbSets that are needed for testing

        //Created an internal method to load the data
        _loadDbSets();
    }

    public virtual DbSet<ExampleModel> ExampleModels { get; set; }
    //....List of remaining DbSets

    //Local property to see if the save method was called
    public int SaveChangesCount { get; private set; }

    //Override the SaveChanges method for testing
    public override int SaveChanges()
    {
        this.SaveChangesCount++;
        return 1;
    }

    //...Override more of the DbContext methods (e.g. SaveChangesAsync)

    private void _loadDbSets()
    {
        _loadExampleModels();
    }

    private void _loadExampleModels()
    {
        //ExpectedGlobal is a static class of the expected models
        //that should be returned for some calls (e.g. GetById)
        this.ExampleModels.Add(ExpectedGlobal.Expected_ExampleModel);
    }
}
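For reference, the IApplicationDbContext mentioned above is not shown in the post; based on how TestContext uses it, it might look roughly like this (a sketch, so treat the exact member list as an assumption):
public interface IApplicationDbContext
{
    // DbSet rather than IDbSet so that RemoveRange and FindAsync stay available;
    // the implementing contexts declare these properties virtual
    DbSet<ExampleModel> ExampleModels { get; set; }
    //....remaining DbSets

    int SaveChanges();
    Task<int> SaveChangesAsync();
}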
As I mentioned in my post, I needed to implement the FindAsync method, so I added a class called TestBaseClassDbSet, which is an alteration of the TestDbSet class in the example. Here is the modification:
//BaseModel is a class that has a key called Id that is of type string
public class TestBaseClassDbSet<TEntity> :
    DbSet<TEntity>,
    IQueryable, IEnumerable<TEntity>,
    IDbAsyncEnumerable<TEntity>
    where TEntity : BaseModel
{
    //....copied all the code from the TestDbSet class that was provided

    //Added the missing functions
    public override TEntity Find(params object[] keyValues)
    {
        var id = (string)keyValues.Single();
        return this.SingleOrDefault(b => b.Id == id);
    }

    public override Task<TEntity> FindAsync(params object[] keyValues)
    {
        var id = (string)keyValues.Single();
        return this.SingleOrDefaultAsync(b => b.Id == id);
    }
}
I created an instance of TestContext and passed that into my mocks:
var context = new TestContext();

var userStore = new Mock<IUserStore>();
//ExpectedGlobal contains a static variable called Expected_User
//that is used to populate the principal
//when mocking the HttpRequestContext
userStore
    .Setup(m => m.FindByIdAsync(ExpectedGlobal.Expected_User.Id))
    .Returns(Task.FromResult(ExpectedGlobal.Expected_User));

var mockUserManager = new Mock(userStore.Object);

var mockUnitOfWork =
    new Mock(mockUserManager.Object, context) { CallBase = false };
I then inject the mockUnitOfWork into the controller, and voila. This implementation seems to be working perfectly. That said, based on some posts I have read online, it will probably be scrutinized by some developers, but I hope others find it useful.
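The last step then looks something like this in a test (a sketch; ExampleController and GetAll are stand-ins for your own controller and action):
var controller = new ExampleController(mockUnitOfWork.Object);

// The action runs against the in-memory data loaded by TestContext
var result = controller.GetAll();

Assert.IsNotNull(result);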

Is it a good design to inject services as factories?

I have been reading Mark Seemann's excellent book on DI and hope to apply it in my next WPF project. However, I have a question regarding object lifetime. So far, most examples seem to explain the repository pattern per request for MVC applications. In WPF there isn't really an equivalent to this (I think). Seeing as the object graph of the entire application is constructed in the composition root, how can I make sure that my unit-of-work stuff works properly? For example:
public class ContextFactory : IContextFactory
{
    DBContext context;

    public ContextFactory()
    {
        context = new MyDBContext();
    }

    public DBContext GetContext()
    {
        return context;
    }
}

public class ItemOneRepository : IItemOneRepository
{
    DBContext context;

    public ItemOneRepository(IContextFactory contextFactory)
    {
        this.context = contextFactory.GetContext();
    }

    public IEnumerable<ItemOne> GetItems()
    {
        return context.ItemOnes;
    }
}

public class ItemTwoRepository : IItemTwoRepository
{
    DBContext context;

    public ItemTwoRepository(IContextFactory contextFactory)
    {
        this.context = contextFactory.GetContext();
    }

    public IEnumerable<ItemTwo> GetItemsByItemOneID(int itemOneID)
    {
        return context.ItemTwos.Where(i => i.itemOneID == itemOneID);
    }
}

public class ThingService : IThingService
{
    IItemOneRepository itemOneRepo;
    IItemTwoRepository itemTwoRepo;

    public ThingService(
        IItemOneRepository itemOneRepository,
        IItemTwoRepository itemTwoRepository)
    {
        itemOneRepo = itemOneRepository;
        itemTwoRepo = itemTwoRepository;
    }

    public IEnumerable<Thing> GetThings()
    {
        var itemOnes = itemOneRepo.GetItems();
        return itemOnes.Select(i =>
            new Thing(
                i.FieldOne,
                i.FieldFour,
                itemTwoRepo.GetItemsByItemOneID(i.ID)));
    }
}
In this case the MyDBContext instance is created through ContextFactory in the composition root. ItemOneRepository and ItemTwoRepository are using the same unit of work (MyDBContext), but so is the rest of the application, which is plainly wrong. What if I changed the repositories to accept a DBContext instead of a ContextFactory and added a ThingServiceFactory class like this:
public class ThingServiceFactory : IThingServiceFactory
{
    IContextFactory contextFactory;

    public ThingServiceFactory(IContextFactory factory)
    {
        contextFactory = factory;
    }

    public IThingService Create()
    {
        MyDBContext context = contextFactory.Create();
        ItemOneRepository itemOneRepo = new ItemOneRepository(context);
        ItemTwoRepository itemTwoRepo = new ItemTwoRepository(context);
        return new ThingService(itemOneRepo, itemTwoRepo);
    }
}
This is better, as I can now pass the ThingServiceFactory to my ViewModels instead of an instance of ThingService (complete with DBContext). I can then create a unit of work whenever I need one and dispose of it as soon as I'm finished. However, is this really the correct approach? Do I really need to write a factory for every unit-of-work operation? Surely there is a better way...
There's IMO only one good solution to this problem and that is to apply a command-based and query-based application design.
When you define a single ICommandHandler<TCommand> abstraction to define business transactions, you can inject closed versions of that interface into any form that needs this. Say for instance you have a "move customer" 'command' operation:
public class MoveCustomer
{
    public Guid CustomerId;
    public Address NewAddress;
}
And you can create a class that will be able to execute this command:
public class MoveCustomerHandler : ICommandHandler<MoveCustomer>
{
    private readonly DbContext context;

    // Here we simply inject the DbContext, not a factory.
    public MoveCustomerHandler(DbContext context)
    {
        this.context = context;
    }

    public void Handle(MoveCustomer command)
    {
        // write the business transaction here.
    }
}
Now your WPF Windows class can depend on ICommandHandler<MoveCustomer> as follows:
public class MoveCustomerWindow : Window
{
    private readonly ICommandHandler<MoveCustomer> handler;

    public MoveCustomerWindow(ICommandHandler<MoveCustomer> handler)
    {
        this.handler = handler;
    }

    public void Button1Click(object sender, EventArgs e)
    {
        // Here we call the command handler and pass in a newly created command.
        this.handler.Handle(new MoveCustomer
        {
            CustomerId = this.CustomerDropDown.SelectedValue,
            NewAddress = this.AddressDropDown.SelectedValue,
        });
    }
}
Since MoveCustomerWindow lives for quite some time, it will hold on to its dependencies for as long as it lives. If those dependencies shouldn't live that long (for instance your DbContext), you will be in trouble; Mark Seemann calls this problem Captive Dependency.
But since we now have a single ICommandHandler<TCommand> abstraction between our presentation layer and our business layer, it becomes very easy to define a single decorator that allows postponing the creation of the real MoveCustomerHandler. For instance:
public class ScopedCommandHandlerProxy<TCommand> : ICommandHandler<TCommand>
{
    private readonly Func<ICommandHandler<TCommand>> decorateeFactory;
    private readonly Container container;

    // We inject a Func<T> that is able to create the command handler decoratee
    // when needed.
    public ScopedCommandHandlerProxy(
        Func<ICommandHandler<TCommand>> decorateeFactory,
        Container container)
    {
        this.decorateeFactory = decorateeFactory;
        this.container = container;
    }

    public void Handle(TCommand command)
    {
        // Start some sort of 'scope' here that allows you to have a single
        // instance of DbContext during that scope. How to do this depends
        // on your DI library (if you use any).
        using (container.BeginLifetimeScope())
        {
            // Create a wrapped handler inside the scope. This way it will get
            // a fresh DbContext.
            ICommandHandler<TCommand> decoratee = this.decorateeFactory.Invoke();

            // Pass the command on to this handler.
            decoratee.Handle(command);
        }
    }
}
This sounds a bit complex, but it completely hides from the client Window the fact that a new DbContext is needed, and it hides this complexity from your business layer as well; you can simply inject a DbContext into your handler. Neither side knows anything about this little piece of infrastructure.
Of course you still have to wire this up. Without a DI library you would do something like this:
var handler = new ScopedCommandHandlerProxy<MoveCustomer>(
    () => new MoveCustomerHandler(new DbContext()),
    container);
How to register this in a DI library is completely depending on the library of choice, but with Simple Injector you do it as follows:
// Register all command handler implementations at once.
container.Register(
    typeof(ICommandHandler<>),
    typeof(ICommandHandler<>).Assembly);

// Tell Simple Injector to wrap each ICommandHandler<T> implementation with a
// ScopedCommandHandlerProxy<T>. Simple Injector will take care of the rest and
// will inject the Func<ICommandHandler<T>> for you. The proxy can be a
// singleton, since it creates the decoratee on each call to Handle.
container.RegisterDecorator(
    typeof(ICommandHandler<>),
    typeof(ScopedCommandHandlerProxy<>),
    Lifestyle.Singleton);
This is just one of the many advantages that this type of design gives you. Others are that it becomes much easier to apply all sorts of cross-cutting concerns such as audit trailing, logging, security, validation, de-duplication, caching, deadlock prevention or retry mechanisms, and so on. The possibilities are endless.
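To illustrate, a cross-cutting concern like logging becomes just another decorator around the same abstraction (a sketch; ILogger is a placeholder for whatever logging abstraction you use):
public class LoggingCommandHandlerDecorator<TCommand> : ICommandHandler<TCommand>
{
    private readonly ICommandHandler<TCommand> decoratee;
    private readonly ILogger logger;

    public LoggingCommandHandlerDecorator(
        ICommandHandler<TCommand> decoratee, ILogger logger)
    {
        this.decoratee = decoratee;
        this.logger = logger;
    }

    public void Handle(TCommand command)
    {
        // Log around the wrapped handler without it knowing anything about logging
        this.logger.Log("Handling " + typeof(TCommand).Name);
        this.decoratee.Handle(command);
        this.logger.Log("Handled " + typeof(TCommand).Name);
    }
}
Registered with RegisterDecorator just like the proxy above, it applies to every command handler without touching any of them.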
"ItemOneRepository and ItemTwoRepository are using the same unit-of-work (MyDBContext), but so is the rest of the application, which is plainly wrong."
If your factory is registered with a transient lifecycle, you will get a new instance every time it's injected, which will be a new DBContext each time.
However, I would recommend a more explicit unit of work implementation:
public DBContext GetContext() //I would rename this "Create()"
{
    return new MyDBContext();
}
And:
public IEnumerable<ItemTwo> GetItemsByItemOneID(int itemOneID)
{
    using (var context = contextFactory.Create())
    {
        return context.ItemTwos.Where(i => i.itemOneID == itemOneID).ToList();
    }
}
This gives you fine-grained control over the unit of work and transaction.
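The same pattern gives you a natural transaction boundary on the write side too (a sketch; AddItemTwo is a hypothetical method, not part of the original repository, and it assumes the factory method renamed to Create() as suggested above):
public void AddItemTwo(ItemTwo item)
{
    using (var context = contextFactory.Create())
    {
        context.ItemTwos.Add(item);
        context.SaveChanges(); // the unit of work begins and ends inside this method
    }
}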
You might also ask yourself if the repositories are gaining you anything vs. just using the context directly via the factory. Depending on the complexity of your application, the repositories may be unnecessary overhead.

test Entity Framework Models

I'm going to test my EF models. In order to do this I've created an IDbContext interface, but I don't know how to rewrite my Save and Delete methods because I don't know how to express db.Partner.AddObject(obj); against the interface. How do I rewrite these methods?
public interface IDbContext
{
    int SaveChanges();
    DbSet<Partner> Partner { get; set; }
}
public class PartnerRepository : IPartnerRepository
{
    readonly IDbContext _context;

    public PartnerRepository()
    {
        _context = (IDbContext)new VostokPortalEntities();
    }

    public PartnerRepository(IDbContext context)
    {
        _context = context;
    }

    public void Save(Partner obj)
    {
        using (var db = new VostokPortalEntities())
        {
            if (obj.PartnerID == 0)
            {
                db.Partner.AddObject(obj);
            }
            else
            {
                db.Partner.Attach(obj);
                db.ObjectStateManager.ChangeObjectState(obj, System.Data.EntityState.Modified);
            }
            db.SaveChanges();
        }
    }

    public void Delete(Partner obj)
    {
        using (var db = new VostokPortalEntities())
        {
            db.Partner.Attach(obj);
            db.ObjectStateManager.ChangeObjectState(obj, System.Data.EntityState.Deleted);
            db.SaveChanges();
        }
    }

    public List<Partner> GetAll()
    {
        using (var db = new VostokPortalEntities())
        {
            return db.Partner.OrderByDescending(i => i.PartnerID).ToList();
        }
    }
}
Is this the proper way to test EF models?
Unit testing repositories takes a lot of time and does not give you many benefits. Why? Because repositories don't have complex business logic; usually they are pretty simple calls to the underlying data-access API (i.e. the ORM). I think it's much better to spend the time writing full-stack acceptance tests, which will also show whether your repository does its job.
BTW, there is an interesting rule, Don't Mock What You Don't Own:
"By testing interactions with a mocked version of a type we don't own, we really are not using our test to check for the correct behavior, nor to drive out a collaborator's design. All our test is doing is reiterating our guess as to how the other type works. Sure, it's better than no test, but not necessarily by much."
