I have just started a new project using NHibernate with Fluent NHibernate for mapping. The architect has sent me a database from which I have generated several hundred entity classes and the corresponding Fluent mapping files. I know this is not the ideal DDD way of doing things, but life is rarely ideal.
What I want to do is test that all the mappings are correct: columns mapped correctly, OneToMany and ManyToMany relationships, etc. Is there some automated or easy way to do this? I have considered just writing a simple repository that loads a record from every entity and makes sure no exception is raised, but most of the tables have no data in them yet.
Have a look at the PersistenceSpecification in Fluent NHibernate. It's hardly perfect, but it handles a lot of simple cases well.
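For illustration, here is a minimal sketch of what such a test can look like with NUnit; the Customer entity, its properties and the test session are hypothetical stand-ins for your own generated classes and setup.

    // A minimal sketch; Customer and the test session setup are hypothetical.
    using FluentNHibernate.Testing;
    using NHibernate;
    using NUnit.Framework;

    [TestFixture]
    public class CustomerMappingTests
    {
        private ISession session; // assume this is opened against a test database in [SetUp]

        [Test]
        public void Customer_mapping_round_trips()
        {
            new PersistenceSpecification<Customer>(session)
                .CheckProperty(c => c.Name, "John Doe")
                .CheckProperty(c => c.IsActive, true)
                .VerifyTheMappings(); // saves the entity, reloads it and compares the values
        }
    }

There are also CheckReference and CheckList methods for exercising mapped associations, which helps with the OneToMany/ManyToMany cases mentioned in the question.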
To test ORM mappings, one strategy I've used that has saved hours of work is a row-test approach in unit tests, like the RowTest attribute in MbUnit or NUnit. This saves you from writing an individual unit test for each distinct row of values. Take a look at this webcast for a quick start.
Regarding the database, you could follow two strategies:
If you need to test against a specific DB instance or engine, you can use transactions and make sure all writes to the database are rolled back after the asserts (a sketch combining this with row tests follows below).
If you can use your own instance and engine, you can use SQLite or SQL CE as a unit-test-only database. Since this database is only used in unit tests, you can create a brand new one every time the tests run.
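As a rough illustration of the first strategy combined with the row-test idea above, here is a hedged NUnit/NHibernate sketch; TestSessionFactory and the Product entity are hypothetical, and NUnit's [TestCase] plays the role of MbUnit's RowTest.

    // A hedged sketch, assuming NHibernate and NUnit; TestSessionFactory and
    // Product are hypothetical placeholders for your own setup and entities.
    using NHibernate;
    using NUnit.Framework;

    [TestFixture]
    public class ProductPersistenceTests
    {
        private ISession session;
        private ITransaction transaction;

        [SetUp]
        public void OpenSessionAndTransaction()
        {
            session = TestSessionFactory.OpenSession(); // hypothetical helper
            transaction = session.BeginTransaction();
        }

        [TearDown]
        public void RollBackEverything()
        {
            transaction.Rollback(); // nothing written by the test survives
            session.Dispose();
        }

        [TestCase("Widget", 5)]
        [TestCase("Gadget", 0)]
        public void Product_round_trips(string name, int stock)
        {
            var id = session.Save(new Product { Name = name, Stock = stock });
            session.Flush();
            session.Clear(); // force a reload from the database rather than the session cache

            var reloaded = session.Get<Product>(id);
            Assert.That(reloaded.Name, Is.EqualTo(name));
            Assert.That(reloaded.Stock, Is.EqualTo(stock));
        }
    }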
I'm trying to reduce the startup time for tests against an EF 6.x datastore. The tests run within a transaction and the database is rolled back once they are done. I would appreciate any suggestions on how to retain an instance of the DbContext between test sessions so that EF doesn't have to go through the whole view-generation process again.
I don't want to use mocks/fakes; a non-Microsoft branch of EF and interactive views are already in place. Thank you.
There are different options. Since you did not mention your testing goal and there is no code, the options are:
1. If you are inserting many records into your tables, you can do a bulk insert. The best library for doing this is EntityFramework.BulkInsert-ef6; you can install it through the NuGet console (a hedged sketch follows after this list).
2. If you see slowness while working with data and you have many load/manipulate/save operations, you should work with in-memory data, as Sampath recommends.
3. If you are loading data, load only the columns that you need. You should also use the lazy-loading option (which, from your post, I think you already know well).
4. Some portion of the slowness could be due to the architecture of your database. The key column types have a considerable effect on Where operations!
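For option 1, a hedged sketch of what the bulk insert could look like; MyDbContext and Customer are hypothetical, and the exact namespace and extension-method name should be checked against the EntityFramework.BulkInsert-ef6 package documentation.

    // A hedged sketch; MyDbContext and Customer are hypothetical, and the
    // BulkInsert extension is assumed to come from EntityFramework.BulkInsert-ef6.
    using System.Linq;
    using EntityFramework.BulkInsert.Extensions;

    public static class SeedData
    {
        public static void InsertManyCustomers()
        {
            var customers = Enumerable.Range(0, 10000)
                .Select(i => new Customer { Name = "Customer " + i })
                .ToList();

            using (var context = new MyDbContext())
            {
                // One bulk operation instead of thousands of tracked inserts.
                context.BulkInsert(customers);
            }
        }
    }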
I would recommend using in-memory data for that. I have used this pattern as well and it works really well and is very fast. This is the pattern the industry recommends, and it is trouble-free in the long run. Always try to use best practices when you develop a software app.
When writing tests for your application it is often desirable to avoid hitting the database. Entity Framework allows you to achieve this by creating a context – with behavior defined by your tests – that makes use of in-memory data.
Here is the article about how to do that: Testing with a mocking framework.
Another article for you: Unit testing in C# using xUnit, Entity Framework, Effort and ASP.NET Boilerplate.
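The first article's approach boils down to backing a mocked DbSet with an in-memory IQueryable. A minimal sketch with Moq follows; Blog and BloggingContext are illustrative names, and the context's Blogs property has to be virtual for the setup to work.

    // A minimal sketch of the pattern from the article, using Moq and NUnit;
    // Blog and BloggingContext are illustrative names.
    using System.Collections.Generic;
    using System.Data.Entity;
    using System.Linq;
    using Moq;
    using NUnit.Framework;

    [Test]
    public void Can_query_blogs_without_hitting_the_database()
    {
        var data = new List<Blog>
        {
            new Blog { Name = "AAA" },
            new Blog { Name = "BBB" }
        }.AsQueryable();

        var mockSet = new Mock<DbSet<Blog>>();
        mockSet.As<IQueryable<Blog>>().Setup(m => m.Provider).Returns(data.Provider);
        mockSet.As<IQueryable<Blog>>().Setup(m => m.Expression).Returns(data.Expression);
        mockSet.As<IQueryable<Blog>>().Setup(m => m.ElementType).Returns(data.ElementType);
        mockSet.As<IQueryable<Blog>>().Setup(m => m.GetEnumerator()).Returns(() => data.GetEnumerator());

        var mockContext = new Mock<BloggingContext>();
        mockContext.Setup(c => c.Blogs).Returns(mockSet.Object); // Blogs must be virtual

        var names = mockContext.Object.Blogs.Select(b => b.Name).ToList();
        Assert.That(names, Is.EquivalentTo(new[] { "AAA", "BBB" }));
    }

Keep in mind that queries then run against LINQ to Objects, which can behave differently from the SQL provider; that caveat is discussed further down this page.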
I am trying to write a unit test for a method that saves multiple records to a database. The method is passed a collection argument and loops through it; for each object in the collection, the user ID associated with the object is set and the record is updated in the database. Any ideas on how to create a unit test for this, besides having it write to the database?
Thanks,
As mentioned in the comments, you have the option to abstract the database operations behind some interface. If you use an ORM, you can implement generic repositories; if you use plain ADO.NET, you can implement something like a transaction script.
Another option would be to use a SQLite in-memory database. It is not clear which database interface you are using, but SQLite is supported by the majority of database access methods in .NET, including Entity Framework. This would not exactly be a unit test, but it does the job.
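For the first option, the abstraction can be as small as the operations your method actually needs; a hedged sketch, with all names hypothetical:

    // A minimal, hypothetical abstraction; adjust it to your real data access.
    public interface IRecordRepository
    {
        void Update(Record record);
    }

    // The production implementation contains the ORM / ADO.NET code and is
    // covered by integration tests; unit tests substitute a mock or a fake.
    public class SqlRecordRepository : IRecordRepository
    {
        public void Update(Record record)
        {
            // ... the real database write goes here ...
        }
    }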
As has been suggested in the comments, you have two choices:
Create an abstraction for the actual writing to the database and verify that the interactions with that abstraction are what you would expect for your method. This gives you fast unit tests, but you will still have to write integration tests for the implementation of the abstraction that actually puts data in the database. To verify the interactions you can either use a mocking library or create an implementation of the interface just for testing.
Create an integration test that writes to the database and verify that the data is inserted as you would expect. These tests will be slower, but will give you confidence that the data will actually be placed in the database.
My preference is for the second option, as this tests that the data will actually be persisted correctly, something you are going to have to do eventually, but not everyone likes to do this.
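For completeness, here is a hedged sketch of the first choice using Moq to verify the interactions; RecordService, SaveAll, IRecordRepository and Record are hypothetical stand-ins for the types described in the question.

    // A hedged sketch with Moq and NUnit; IRecordRepository, Record, RecordService
    // and SaveAll are hypothetical stand-ins for the code in the question.
    using System.Collections.Generic;
    using Moq;
    using NUnit.Framework;

    [TestFixture]
    public class RecordServiceTests
    {
        [Test]
        public void SaveAll_sets_the_user_id_and_updates_every_record()
        {
            var repository = new Mock<IRecordRepository>();
            var service = new RecordService(repository.Object);
            var records = new List<Record> { new Record(), new Record() };

            service.SaveAll(records, userId: 42);

            // Verify the interaction with the abstraction instead of hitting the database.
            repository.Verify(r => r.Update(It.Is<Record>(rec => rec.UserId == 42)),
                              Times.Exactly(records.Count));
        }
    }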
I'm trying to test my code using EntityFramework code first. In order to make it testable and to allow isolation testing, I created an interface which my DbContext implements. I'm not testing the DbContext class - I'm going to assume EF code works as expected.
Now, consider the following method:
    public IEnumerable<User> GetOddId()
    {
        return context_.Users.Where((u, i) => i % 2 == 1).AsEnumerable();
    }
This method will pass with my mock FakeDbSet (because it would use the in-memory LINQ provider) whereas it would fail with an Exception when using the EF/LINQ to SQL drivers.
Would you leave it as it is and hope people know enough not to write such queries? Would you give up isolation testing and test on an actual db?
Would the LocalDb with DataMigrations (perhaps with appropriate seeds) help with testing on an actual db?
Please justify the answer(s).
TLDR: How to test EntityFramework code, considering the differences between in-memory LINQ and SQL LINQ?
Much later edit: I've since found a very good framework that does exactly what I need; I wrote a blog post about unit testing with Effort. Also note that all of this might not be needed in the upcoming EF6, which promises some unit-testing features.
We use SQLite's in-memory databases for this purpose. They are extremely quick to create, query and tear down and barely have any impact on overall test speed. Once you've set yourself up a test framework to create a database and inject data, tests are quick to write.
Of course, SQLite is a much simpler database than most, so complex queries may fail to translate to its version of SQL, but for testing 90% of cases, it works well.
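To give a feel for how cheap this is, here is a minimal sketch using the System.Data.SQLite ADO.NET provider directly; in practice you would let your ORM's schema export create the tables, and the table shown here is purely illustrative.

    // A minimal sketch; the in-memory database exists only while the
    // connection is open, so keep a single connection per test.
    using System.Data.SQLite;
    using NUnit.Framework;

    [Test]
    public void In_memory_database_lives_and_dies_with_its_connection()
    {
        using (var connection = new SQLiteConnection("Data Source=:memory:"))
        {
            connection.Open();

            using (var create = new SQLiteCommand(
                "CREATE TABLE Users (Id INTEGER PRIMARY KEY, Name TEXT)", connection))
            {
                create.ExecuteNonQuery();
            }

            // ... run the ORM schema export and the test queries here ...
        } // disposing the connection throws the whole database away
    }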
Do these tests constitute integration tests? I don't think so. They are still only testing one unit of your code, namely the bit that generates a LINQ query. You're testing for two things: 1) that the query returns the correct data (but you could check this using an in-memory collection as you stated), and 2) that the query can be translated into valid SQL by Entity Framework. The only real way to test the latter is to fire the query at a real Entity Framework but with a stubbed database.
Whilst you could argue that a true unit test should test just the output of your code (i.e. parse and check the expression tree that has been generated), as well as being harder to write, it doesn't really prove anything. If, for example, you modify the code to generate an inner join instead of a subquery, would you want the test to break? Only if it returns different results, I would have thought.
Where I work, we have a dev/beta/production SQL server. Sometimes we'll create tests against beta and seed test data (e.g. insert before testing specific selects and such) before executing tests on an actual database. People will draw a distinction between unit and integration testing, but it at least lets us test our logic, which is better than just crossing fingers and hoping for the best at the data-access layer.
What good is an in-memory provider that's easy to test for but doesn't actually represent the live system in some important cases?
EDIT: We don't use EF, btw...
You might want to use a mocking framework like Telerik's JustMock (or choose from many others).
This would give you lots of control over what happens in your test code. (Short introduction here.)
Instead of implementing a query to a real database you could 'simulate' the query and return a pre-defined collection of objects.
You could, for example, create multiple unit tests that call your GetOddId() method, and define different return collections that cover all the test cases you need (an empty list, correct content, wrong contents, throwing an exception, whatever, ...).
There is also a free 'Lite' version here or via NuGet.
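As a hedged sketch of the arrange/act/assert style with JustMock (Lite): IUserContext, User and UserService are hypothetical, and note that the arranged data is still queried with LINQ to Objects, so this does not address the provider differences discussed elsewhere on this page.

    // A hedged sketch with JustMock Lite and NUnit; IUserContext, User and
    // UserService are hypothetical.
    using System.Linq;
    using NUnit.Framework;
    using Telerik.JustMock;

    [Test]
    public void GetOddId_returns_only_elements_at_odd_positions()
    {
        var context = Mock.Create<IUserContext>();
        Mock.Arrange(() => context.Users)
            .Returns(new[] { new User(), new User(), new User() }.AsQueryable());

        var service = new UserService(context);

        Assert.That(service.GetOddId().Count(), Is.EqualTo(1));
    }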
I have been doing a lot of unit testing lately with mocking. The one thing that strikes me as a bit of a problem is the difference between querying against an in-memory list (via a mock of my repository) and querying directly against the database via Entity Framework.
Some of these situations might be:
Testing a filter parameter which would be case-insensitive against a database but case-sensitive against an in-memory collection, leading to a false failure.
LINQ statements that might pass against an in-memory collection but would fail against Entity Framework because they aren't supported, leading to a false pass.
What is the correct way to handle or account for these differences so that there are no false passes or failures in tests? I really like mocking, as it makes things so much quicker and easier to test, but it seems to me that the only way to get a really accurate test would be to test against the actual Entity Framework/database environment.
Besides the unit tests, you should also create integration tests which run against a real database set up as encountered in production.
I'm not an expert on EF, but with NHibernate, for example, you can create a configuration that points to an in-memory instance of SQLite which you then run your quick tests against (e.g. during a development cycle where you want to get through the test suite as fast as possible). When you want to run your integration tests against a real database, you simply change the NHibernate config to point to a real database setup and run the same tests again.
It would be surprising if you could not achieve something similar with EF.
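With Fluent NHibernate, that configuration switch looks roughly like the sketch below; UserMap is a hypothetical mapping class, and the key detail is exporting the schema onto the session's own connection, because the in-memory database disappears when that connection closes.

    // A minimal sketch, assuming Fluent NHibernate; UserMap is a hypothetical
    // mapping class from the assembly that contains your mappings.
    using FluentNHibernate.Cfg;
    using FluentNHibernate.Cfg.Db;
    using NHibernate;
    using NHibernate.Cfg;
    using NHibernate.Tool.hbm2ddl;

    public static class InMemoryDatabase
    {
        public static ISession OpenSession()
        {
            Configuration configuration = null;

            var sessionFactory = Fluently.Configure()
                .Database(SQLiteConfiguration.Standard.InMemory().ShowSql())
                .Mappings(m => m.FluentMappings.AddFromAssemblyOf<UserMap>())
                .ExposeConfiguration(cfg => configuration = cfg)
                .BuildSessionFactory();

            var session = sessionFactory.OpenSession();

            // The in-memory database only lives as long as this connection,
            // so create the schema on the session's own connection.
            new SchemaExport(configuration).Execute(false, true, false, session.Connection, null);

            return session;
        }
    }

Switching to the real database is then just a different database configurer (for example MsSqlConfiguration with a connection string) while the tests themselves stay the same.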
You can use DevMagicFake. This framework will fake the DB for you and can also generate data, so you can test your application without testing the DB.
First and most important, you can define any behavior and data within your mock. Second is speed: from a unit-testing perspective, testing speed counts, and database connections are the bottleneck most of the time, which is why you mock them out in tests.
To implement testing properly you need to work on your overall architecture first.
For instance, to access the data layer I sometimes use the repository pattern. It is described really well in Eric Evans' DDD book.
So, let's say your repository is defined as below:

    public interface IRepository<T> : IQueryable<T>, ICollection<T> { }

Then you can handle LINQ queries against it pretty straightforwardly.
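A test double for it can then just wrap a List<T>; a minimal sketch follows, with the caveat (discussed elsewhere on this page) that queries against it run through LINQ to Objects rather than your database's LINQ provider.

    // A minimal in-memory implementation for unit tests.
    using System;
    using System.Collections;
    using System.Collections.Generic;
    using System.Linq;
    using System.Linq.Expressions;

    public class InMemoryRepository<T> : IRepository<T>
    {
        private readonly List<T> items = new List<T>();
        private IQueryable<T> Query => items.AsQueryable();

        // IQueryable<T>
        public Expression Expression => Query.Expression;
        public Type ElementType => Query.ElementType;
        public IQueryProvider Provider => Query.Provider;

        // IEnumerable<T>
        public IEnumerator<T> GetEnumerator() => items.GetEnumerator();
        IEnumerator IEnumerable.GetEnumerator() => GetEnumerator();

        // ICollection<T>
        public int Count => items.Count;
        public bool IsReadOnly => false;
        public void Add(T item) => items.Add(item);
        public void Clear() => items.Clear();
        public bool Contains(T item) => items.Contains(item);
        public void CopyTo(T[] array, int arrayIndex) => items.CopyTo(array, arrayIndex);
        public bool Remove(T item) => items.Remove(item);
    }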
Further reading: Repository.
I would make my mocks more granular, so that you don't actually query against a larger set in a mock repository.
I typically have setters on my mock repository that I set in each test to control the output of the mocked repository.
This way you don't have to rely on writing queries against a generic mock, and your focus can be on testing the logic in the method under test.
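Something as simple as this is often enough; IUserRepository and User are hypothetical names.

    // A minimal, hypothetical fake whose output each test controls directly.
    using System.Collections.Generic;

    public class FakeUserRepository : IUserRepository
    {
        public IList<User> UsersToReturn { get; set; } = new List<User>();

        public IList<User> GetUsers()
        {
            return UsersToReturn;
        }
    }

Each test sets UsersToReturn to exactly the handful of objects it needs, injects the fake into the class under test, and asserts on the result.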
I'm trying to avoid using an in-memory database for testing (though I might have to if the following is impossible). I'm using NHibernate 3.0 with LINQ. I'd like to be able to mock session.Query<T>() to return some dummy values, but I can't, since it's an extension method and extension methods are pretty much impossible to mock.
Does anyone have any suggestions (other than using an in memory database) for testing session queries with LINQ?
I've tried this before with previous versions of NH without much luck. I eventually used another class to wrap the query and mocked that instead.
I do think it's also worth writing an integration test against a real SQL Server, to make sure that the repository behaves as expected.
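A hedged sketch of the wrapper idea: the LINQ query lives in a small class whose interface is easy to mock. IActiveUserQuery and the User entity with its IsActive flag are hypothetical.

    // A hedged sketch; IActiveUserQuery, User and IsActive are hypothetical.
    using System.Collections.Generic;
    using System.Linq;
    using NHibernate;
    using NHibernate.Linq;

    public interface IActiveUserQuery
    {
        IList<User> Execute();
    }

    public class ActiveUserQuery : IActiveUserQuery
    {
        private readonly ISession session;

        public ActiveUserQuery(ISession session)
        {
            this.session = session;
        }

        public IList<User> Execute()
        {
            // The only place that touches session.Query<T>(); covered by an
            // integration test against a real database.
            return session.Query<User>()
                          .Where(u => u.IsActive)
                          .ToList();
        }
    }

Consumers depend on IActiveUserQuery and can be unit tested with a mock that returns whatever dummy users the test needs.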
A better approach is to mock the concept of what you are trying to do, not the inner API of an external system.
For instance:
Write the query in a separate artifact, like IQuerySomething / QuerySomething.
Test your query against a database. Try to make this test database as close to the real one as possible.
When testing something that depends on IQuerySomething, mock IQuerySomething.
Fabio Maulo wrote about this pattern as the EQO (Enhanced Query Object); I recommend his post.
This is the approach we use in .NET for almost everything.
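A minimal sketch of the last point, mocking the query object in the consumer's test (Moq and NUnit here; IQueryOverdueInvoices, Invoice and ReminderService are hypothetical):

    // A hedged sketch with Moq and NUnit; IQueryOverdueInvoices, Invoice and
    // ReminderService are hypothetical.
    using System.Collections.Generic;
    using Moq;
    using NUnit.Framework;

    [Test]
    public void Sends_one_reminder_per_overdue_invoice()
    {
        var query = new Mock<IQueryOverdueInvoices>();
        query.Setup(q => q.Execute())
             .Returns(new List<Invoice> { new Invoice(), new Invoice() });

        var service = new ReminderService(query.Object);

        Assert.That(service.SendReminders(), Is.EqualTo(2));
    }

The QuerySomething implementation itself is exercised separately against a database that is as close to the real one as possible, as described in the second point above.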
It looks like you are going to overcomplicate things, so I will try to save you some time =)
First of all, let's note that there are two types of testing for the typical project (I am sure you know this, but it is better to mention it): integration tests and unit tests. And typically (I will assume you have a typical application, so as not to add "typically" to every sentence) you need both of them.
Integration tests run against a real database, and some of them against an in-memory one for better test performance.
So you probably have mappings in your application and want to test them. This is better done with integration tests on a real DB, and if you are using Fluent NHibernate (if you aren't, it is better to start) this will be a piece of cake.
Then you probably have a kind of repository or other data access layer (where you are using LINQ) that you want to test too. And you probably want tests like:
When I submit a get-customer-by-name query, my data access component should return the customer with the specified name.
This is better achieved using an in-memory database, because it is cheaper. It will save you some time in the typical scenario.
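That kind of test could look roughly like the sketch below; Customer, CustomerRepository and the in-memory session helper are hypothetical stand-ins for your own data access code and test setup.

    // A hedged sketch; Customer, CustomerRepository and the in-memory session
    // helper are hypothetical.
    using NHibernate;
    using NUnit.Framework;

    [Test]
    public void Returns_the_customer_with_the_specified_name()
    {
        using (ISession session = TestDatabase.OpenInMemorySession()) // hypothetical helper
        {
            session.Save(new Customer { Name = "Alice" });
            session.Flush();

            var repository = new CustomerRepository(session);

            var found = repository.GetByName("Alice");

            Assert.That(found.Name, Is.EqualTo("Alice"));
        }
    }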
But if you have a lot of complex queries, then I would agree with José F. Romaniello that it is better to use an Enhanced Query Object and test it separately.
You can also look at the Sharp Architecture framework, which targets a lot of the issues around using NHibernate and testing the persistence layer.