Best way to Unit test non-existing Database - c#

We have a build server which does not have a database running on its machine.
Now we want to write unit tests which also cover SQL-related tasks, for example writing mock data and reading it back to verify the output (the data gets manipulated by C# code in the process).
So the common approach would be to provide a connection-string to the remote server which actually runs a database.
This works, but it is undesirable because the database server could be inaccessible when the build is triggered, and the automated tests would then fail.
What is the best way to create a "pseudo-database" while running Unit tests?
Is this even possible without an actual database? Does it even make sense?

Use a mocking framework like Moq, check here. That is the exact purpose of mocking frameworks: a given component can be unit tested in isolation, and any integration point such as a database can be mocked to return valid, pre-defined results at run time, so you can verify the behaviour of the given code/unit.
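For illustration, a minimal Moq sketch; the IUserRepository, User and UserService types below are hypothetical stand-ins, not taken from the question:

using Moq;
using NUnit.Framework;

public class User { public string Name { get; set; } }

// Hypothetical abstraction over the database; a real implementation would use ADO.NET or an ORM.
public interface IUserRepository
{
    void Save(User user);
}

// The unit under test depends only on the interface, never on a connection string.
public class UserService
{
    private readonly IUserRepository _repository;
    public UserService(IUserRepository repository) { _repository = repository; }
    public void Register(string name) => _repository.Save(new User { Name = name });
}

[TestFixture]
public class UserServiceTests
{
    [Test]
    public void Register_SavesTheUser_WithoutTouchingADatabase()
    {
        var repository = new Mock<IUserRepository>();

        new UserService(repository.Object).Register("alice");

        repository.Verify(r => r.Save(It.Is<User>(u => u.Name == "alice")), Times.Once);
    }
}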

The commonly accepted answer to "how do I unit test something with external dependencies" is to use a mocking layer.
However, that very quickly becomes unwieldy - you end up writing and maintaining a lot of code just to mimic your database, and it's not clear that this code will pay for itself. For instance, if one of your SQL statements has a typo, your unit tests won't catch it.
Instead, it is much better to factor the code which manipulates the data out into a layer which can be unit-tested without database access.
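For example, once the manipulation lives in its own class, it can be unit tested with plain in-memory data, and only a thin integration test needs the real database. A small sketch with made-up names:

using System.Collections.Generic;
using System.Linq;
using NUnit.Framework;

// Pure logic: no connection string, no SqlCommand - just data in, data out.
public static class PriceCalculator
{
    public static decimal TotalWithTax(IEnumerable<decimal> netPrices, decimal taxRate) =>
        netPrices.Sum() * (1 + taxRate);
}

[TestFixture]
public class PriceCalculatorTests
{
    [Test]
    public void TotalWithTax_AppliesTheRate()
    {
        var prices = new[] { 10m, 20m };   // stands in for rows that production code reads from the DB

        Assert.AreEqual(33m, PriceCalculator.TotalWithTax(prices, 0.10m));
    }
}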

Related

Unit testing C# database dependent code

I would like to create unit tests for data dependent code. For example:
A user class that has the usual create, update & delete.
If I wanted to create a test for the "user already exists" scenario, or an update or delete test, I would need to know that a specific user already exists in my database.
In such cases, what would be the best approach to have stand alone tests for these operations that can run in any order?
When you have dependencies like this think about whether you want to be Integration Testing as opposed to Unit Testing. If you do want to do Unit tests take a look at using Mock Data.
Integration Testing:
Tests how your code integrates with different parts of a system. This can mean making sure your code connects to a database properly or has created a file on the filesystem. These tests are usually very straightforward and do not have the same constraint of "being able to run in any order." However, they require a specific configuration in order to pass, which means they do not always carry over well from developer to developer.
Unit Testing: Tests your code's ability to perform a function. For example: "Does my function AddTwoNumbers(int one, int two) actually add two numbers?" Unit tests are there to ensure that any changes in code do not affect the expected results.
When getting into areas like "Does my code call the database and enter the result correctly?" you need to consider that unit tests are not meant to interact with the system. This is where "mock data" comes in. Mock classes and mock data take the place of the actual system, just to ensure that your code "called out in the way we were expecting." The difficult part is that, while this can be done, most of the .NET Framework classes do not provide the interfaces needed to do it easily.
See the MSDN page on Testing for more info. Also, consider this MSDN article on Mock Data.
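As a sketch of the mock-data idea, assuming a hypothetical IUserStore abstraction in front of the database, a hand-rolled in-memory fake lets the "user already exists", update and delete scenarios run in any order without touching real rows:

using System;
using System.Collections.Generic;
using NUnit.Framework;

// Hypothetical abstraction the production code talks to instead of the database directly.
public interface IUserStore
{
    bool Exists(string userName);
    void Add(string userName);
}

// In-memory fake used only by tests; each test builds its own, so the tests can run in any order.
public class FakeUserStore : IUserStore
{
    private readonly HashSet<string> _users = new HashSet<string>();
    public bool Exists(string userName) => _users.Contains(userName);
    public void Add(string userName) => _users.Add(userName);
}

public class UserManager
{
    private readonly IUserStore _store;
    public UserManager(IUserStore store) { _store = store; }

    public void Create(string userName)
    {
        if (_store.Exists(userName))
            throw new InvalidOperationException("User already exists.");
        _store.Add(userName);
    }
}

[TestFixture]
public class UserManagerTests
{
    [Test]
    public void Create_WhenUserAlreadyExists_Throws()
    {
        var store = new FakeUserStore();
        store.Add("bob");                      // arrange the "already exists" state in memory

        var manager = new UserManager(store);

        Assert.Throws<InvalidOperationException>(() => manager.Create("bob"));
    }
}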

Writing Unit Tests for method that queries database

I am learning TDD and I currently have a method that is working but I thought I'd have a go at rebuilding it using TDD.
The method essentially takes 6 parameters, queries a database, does some logic and returns a List<T>
My initial tests included checking for empty/zero values in the string and int method parameters, but now I'm not sure what to do. If I wasn't using TDD, I would just write code to find the DB connection string, open a DB connection, query the database, read the values, and so on.
Obviously we can't do that in unit testing, so I was after some advice on how to proceed.
Remember that TDD is as much about good design as it is about testing. This method has too much going on; it violates the Separation of Concerns principle.
You've already identified several areas that will need to be tested:
The method essentially takes 6 parameters, queries a database, does some logic and returns a List<T>
You have several discrete steps there, and there are probably a few more hiding in the code. Breaking those up is the name of the game when it comes to TDD.
For starters, it might be a good idea to factor out the piece that performs the logic.
Is your method building a query dynamically? Break that piece out as well and test it to make sure the query is written properly.
You can put the execution of the query into a standalone repository or something similar, and write integration tests against that. That way you only have a simple test hitting the database instead of the current complex method.
If you try to test this as is, you'll likely end up with a monster test that requires a lot of setup and duplicates all of your business logic, and when it breaks it'll be unclear as to what went wrong.
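For instance, if the query text is assembled dynamically, pulling the builder into its own class lets a plain unit test catch typos in the SQL; the builder below is an illustrative assumption, not the asker's code:

using NUnit.Framework;

// Hypothetical query builder, factored out of the big six-parameter method.
public static class OrderQueryBuilder
{
    public static string Build(bool includeArchived)
    {
        var sql = "SELECT Id, Total FROM Orders WHERE CustomerId = @customerId";
        if (!includeArchived)
            sql += " AND Archived = 0";
        return sql;
    }
}

[TestFixture]
public class OrderQueryBuilderTests
{
    [Test]
    public void Build_ExcludesArchivedOrders_WhenAskedTo()
    {
        var sql = OrderQueryBuilder.Build(includeArchived: false);

        // The typo-prone SQL text is now cheap to verify without any database.
        StringAssert.Contains("Archived = 0", sql);
    }
}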
In general, there's nothing "wrong" about using TDD to test database code. However, you might try abstracting out the database code, then mocking it out.
The method essentially takes 6 parameters, queries a database, does some logic and returns a List<T>
That is too much for a single piece of unit-testable code!
Unit-testable code should do very specific things in small modules. So, in your case, you need to refactor and break your method into (at least) the following:
Database query: wrap it inside a DataProvider with a backing interface, and have your unit test mock this interface.
Does some logic: this is the best candidate for a unit test. It should be a module that just takes the data provider interface, does the logic, and returns the modified list, which you then validate in your unit test (a sketch follows below).
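A minimal sketch of that split, with a hypothetical IDataProvider interface, a logic class, and a Moq-based test (all names are illustrative):

using System.Collections.Generic;
using System.Linq;
using Moq;
using NUnit.Framework;

public class Order { public decimal Total { get; set; } }

// 1. The database query, hidden behind an interface the tests can mock.
public interface IDataProvider
{
    IEnumerable<Order> GetOrders(int customerId);
}

// 2. The logic, which only sees the interface and is trivially unit testable.
public class OrderReport
{
    private readonly IDataProvider _provider;
    public OrderReport(IDataProvider provider) { _provider = provider; }

    public List<Order> LargeOrders(int customerId, decimal threshold) =>
        _provider.GetOrders(customerId).Where(o => o.Total >= threshold).ToList();
}

[TestFixture]
public class OrderReportTests
{
    [Test]
    public void LargeOrders_FiltersByThreshold()
    {
        var provider = new Mock<IDataProvider>();
        provider.Setup(p => p.GetOrders(1))
                .Returns(new[] { new Order { Total = 5m }, new Order { Total = 50m } });

        var result = new OrderReport(provider.Object).LargeOrders(1, threshold: 10m);

        Assert.AreEqual(1, result.Count);
    }
}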
Also, remember a unit test should cover at least three scenarios for each testable module:
a positive test
a negative test
a test that a meaningful exception is thrown for invalid values.
Hope this is helpful.
Another option is to start a transaction before the test and do a rollback afterwards. This way tests are independent so can still, according to some definitions, be considered unit tests.
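A common way to implement that, assuming the tests run against a real connection that enlists in the ambient transaction, is to open a TransactionScope in the test setup and dispose it without completing it:

using System.Transactions;
using NUnit.Framework;

[TestFixture]
public class CustomerRepositoryTests
{
    private TransactionScope _scope;

    [SetUp]
    public void OpenTransaction() => _scope = new TransactionScope();

    [TearDown]
    public void RollBack() => _scope.Dispose();   // Complete() is never called, so everything rolls back

    [Test]
    public void Insert_ThenRead_RoundTrips()
    {
        // Any rows written here (through a connection that enlists in the ambient
        // transaction) disappear again when the scope is disposed, so the tests
        // stay independent of one another and of their execution order.
    }
}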
Contrary to what's mentioned in other answers, you should refactor the code to get to a better design after the test passes. Then you can verify that your refactoring didn't break anything just by rerunning the test.
You might want to try looking at DbUnit for running unit tests on your data access layer. It puts your database in a known state between test runs preventing corruption of your test database.
You can:
Use the class/test init to spin up a blank DB, or a copy of a small DB with a known set of data.
In the test method, enter test data (if the DB is empty), then perform the query, then compare the result with the expected result.
In the test/class cleanup, remove the DB.
This tests your unit but is considered an "integration test" by some.
- The term "unit test" has some disagreement due to the ambiguity of the term "unit".
You could also use an in-memory DB or an in-process DB to make the test environment simpler.

Unit Testing Database Methods

I am currently working on a C# project which makes use of an SQLite database. For the project I am required to do unit testing, but I was told that unit testing shouldn't involve external files, like database files, and that the tests should instead emulate the database.
If I have a function that tests whether something exists in a database, how could that sort of method be tested with unit testing?
In general, it makes life easier if external files are avoided and everything is done in code. There is no rule that says "shouldn't", and sometimes it just makes more sense to have the external dependency - but only after you have considered how not to have it, and realized what the tradeoffs are.
Secondly, what Bryan said is a good option and the one I've used before.
In an application that uses a database, there will be at least one component whose responsibility is to communicate with that database. The unit test for that component could involve a mocked database, but it is perfectly valid (and often desirable) to test the component using a real database. After all, the component is supposed to encapsulate and broker communication with that database -- the unit test should test that. There are numerous strategies to perform such unit tests conveniently -- see the list of related SO questions in the sidebar for examples.
The general prescription to avoid accessing databases in unit tests applies to non-database components. Since non-database components typically outnumber database-related components by a wide margin, the vast majority of unit tests should not involve a database. Indeed, if such non-database components required a database to be tested effectively, there is likely a design problem present -- probably improper separation of concerns.
Thus, the principle that unit tests should avoid databases is generally true, but it is not an absolute rule. It is just a (strong) guideline that aids in structuring complex systems. Following the rule too rigidly makes it very difficult to adequately test "boundary" components that encapsulate external systems -- places in which bugs find it very easy to hide! So, the question that one should really be asking oneself when a unit test demands a database is this: is the component under test legitimately accessing the database directly or should it instead collaborate with another that has that responsibility?
This same reasoning applies to the use of external files and other resources in unit tests as well.
With SQLite, you could use an in-memory database. You can stage your database by inserting data and then run your tests against it.
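A small sketch using the System.Data.SQLite provider's in-memory mode (table and data below are illustrative):

using System.Data.SQLite;
using NUnit.Framework;

[TestFixture]
public class InMemorySqliteTests
{
    [Test]
    public void Stage_InMemoryDatabase_AndQueryIt()
    {
        // The database exists only for the lifetime of this connection - nothing touches the disk.
        using (var connection = new SQLiteConnection("Data Source=:memory:"))
        {
            connection.Open();

            new SQLiteCommand("CREATE TABLE Users (Name TEXT)", connection).ExecuteNonQuery();
            new SQLiteCommand("INSERT INTO Users (Name) VALUES ('alice')", connection).ExecuteNonQuery();

            var count = (long)new SQLiteCommand(
                "SELECT COUNT(*) FROM Users WHERE Name = 'alice'", connection).ExecuteScalar();

            Assert.AreEqual(1L, count);
        }
    }
}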
Once databases get involved, the line between unit testing and integration testing always blurs. Having said that, it is always a nice and very useful kind of test to be able to put something in a database (setup), do your test, and remove it at the end (cleanup). This lets you test one part of your application end to end.
Personally I like to do this in an attribute-driven fashion, by specifying the SQL scripts to run for each test as attributes, like so:
[PreSqlExecute("SetupTestUserDetails.sql")]
[PostSqlExecute("CleanupTestUserDetails.sql")]
public void UpdateUserNameTest()
{
    // ... exercise the update and assert against the data seeded by the setup script ...
}
The connection strings come from app.config as usual, and the file can even be a symbolic link to the app.config in your main project.
Unfortunately this isn't a standard feature of the MS test runner that ships with Visual Studio. If you are using PostSharp as your AOP framework, this is easy to do. If not, you can still get the same functionality with the standard MS test runner by using a .NET feature called "Context Bound Objects". This lets you inject custom code into the object creation chain to do AOP-like things, as long as your objects inherit from ContextBoundObject.
I did a blog post with more details and a small, complete code sample here.
http://www.chaitanyaonline.net/2011/09/25/improving-integration-tests-in-net-by-using-attributes-to-execute-sql-scripts/
I think it is a really bad idea to have unit tests that depend on database information.
I also think it is a bad idea to use SQLite for unit tests.
You need to test the objects' protocol, so if you need something in your tests you should create it somewhere in the tests (usually in setUp).
Since it is difficult to remove persistence entirely, the popular way to do it is to use SQLite, but always create what you need within the unit tests.
Check this link: Unit Tests And Databases. I think it will be more helpful.
It's best to use a mocking framework to mimic the database; in C#, for example, the Entity Framework data access can be hidden behind an interface and mocked. Even the use of SQLite is an outside dependency for your code.

Applying CQRS - Is unit testing the thin read layer necessary?

Given that some of the advice for implementing CQRS advocates fairly close-to-the-metal query implementation, such as ADO.NET queries directly against the database (or perhaps a LINQ-based ORM), is it a mistake to try and unit test them?
I wonder if it's really even necessary?
My thoughts on the matter:
The additional architectural complexity to provide a mockable "Thin Read Layer" seems opposite to the very nature of the advice to keep the architectural ceremony to a minimum.
The number of unit tests to effectively cover every angle of query that a user might compose is horrendous.
Specifically I'm trying CQRS out in an ASP.NET MVC application and am wondering whether to bother unit testing my controller action methods, or just test the Domain Model instead.
Many thanks in advance.
In my experience 90%-99% of the reads you will be doing if you are creating a nice de-normalized read model DO NOT warrant having unit tests around them.
I have found that the most effective and efficient way to TDD a CQRS application is to write integration tests that push commands into your domain, then use the Queries to get the data back out of the DB for your assertions.
I would tend to agree with you that unit testing this kind of code is not so beneficial. But there is still some scope for some useful tests.
You must be performing some validation of the user's read query parameters; if so, test that invalid request parameters throw a suitable exception and that valid parameters are accepted.
If you're using an ORM, I find the cost/benefit ratio too high to justify testing the mapping code. Assume your ORM is already tested; there could be errors in your mapping, but you'll soon find and fix them.
I also find it useful to write some Integration tests (using the same Testing framework), just to be sure I can make a connection to the database, and the ORM configuration doesn't throw any mapping exceptions. You certainly don't want to be writing unit tests that query the actual db.
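As an example of the first point, parameter validation on the read side can be covered without any database at all; the query class below is a hypothetical illustration:

using System;
using NUnit.Framework;

// Hypothetical query object from the thin read layer.
public class OrdersByDateRangeQuery
{
    public OrdersByDateRangeQuery(DateTime from, DateTime to)
    {
        if (to < from)
            throw new ArgumentException("'to' must not be earlier than 'from'.");
        From = from;
        To = to;
    }

    public DateTime From { get; }
    public DateTime To { get; }
}

[TestFixture]
public class OrdersByDateRangeQueryTests
{
    [Test]
    public void Constructing_WithReversedRange_Throws()
    {
        Assert.Throws<ArgumentException>(
            () => new OrdersByDateRangeQuery(new DateTime(2020, 1, 2), new DateTime(2020, 1, 1)));
    }
}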
As you probably already know unit testing is less about code coverage and preventing bugs than it is about good design. While I often skip testing the read-model event handlers when I'm in a hurry, there can be no doubt that it probably should be done for all the reasons code should be TDD'd.
I also have not been unit testing my HTTP actions (I don't have controllers per se, since I'm using Nancy rather than ASP.NET MVC).
These are integration points and don't tend to contain much logic since most of it is encapsulated in the command handlers and domain model.
The reason I think it is fairly easy not to test these is that they are very simple and very repetitive; there is almost no deep thinking in the denormalization of events into the read model. The same is true for my HTTP handlers, which pretty much just process the request and issue a command to the domain, with some basic logic for returning a response to the client.
When developing, I often make mistakes in this code and I probably would make far fewer of these mistakes if I was using TDD, but it would also take much longer and these mistakes tend to be very easy to spot and fix.
My gut tells me I should still apply TDD here though, it is still all very loosely coupled and it shouldn't be hard to write the tests. If you are finding it hard to write the tests that could indicate a code smell in your controllers.
The way that I have seen something like this unit tested is to have the test create a set of things in the database, run the tests against them, and then clean out the created things.
In one past job I saw this set up very nicely using a data structure to describe the objects and their relationships. This was run through the ORM to create those objects with those relationships; data from that was used for the queries, and then the ORM was used to delete the objects. To make unit tests easier to set up, every class specified default values to use in tests that didn't override them. The data structure in each unit test then only needed to specify the non-default values, which made the setup of the tests much more compact.
These unit tests were very useful, and caught a number of bugs in the database interaction.
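A sketch of that idea, using a hypothetical test data builder whose defaults are overridden only where a test cares about the value:

using NUnit.Framework;

public class Customer
{
    public string Name { get; set; }
    public string Country { get; set; }
}

// Test data builder: defaults cover the uninteresting fields, tests override only what they care about.
public class CustomerBuilder
{
    private string _name = "Default Name";
    private string _country = "US";

    public CustomerBuilder WithName(string name) { _name = name; return this; }
    public CustomerBuilder WithCountry(string country) { _country = country; return this; }

    public Customer Build() => new Customer { Name = _name, Country = _country };
}

[TestFixture]
public class CustomerBuilderTests
{
    [Test]
    public void OnlyTheInterestingValue_NeedsToBeSpecified()
    {
        var customer = new CustomerBuilder().WithCountry("DE").Build();

        Assert.AreEqual("DE", customer.Country);
        Assert.AreEqual("Default Name", customer.Name);   // the default filled the gap
    }
}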
In one of my ASP.NET MVC applications I've also applied CQRS. But instead of SQL with ADO.NET queries or LINQ, we use a document database (MongoDB), and each command or event handler updates the database directly.
I've created one test per command/event handler, and with 100% unit test coverage I know that my domain works about 95% correctly.
For actions/controllers/UI I've applied UI tests (using Selenium).
So it seems that both the unit tests for the domain (command/event handlers with direct updates to the database) and the UI tests are effectively your "integration tests".
I think you should test the domain part at least, because all your logic is encapsulated in command/event handlers.
FYI: I started developing the domain part with Entity Framework first, then moved to direct database updates through stored procedures, but was really happy with the document database in the end. I tried a few different document databases, but MongoDB looks like the best one for me.

Unit testing rules

My question may appear really stupid to some of you, but I have to ask. Sorry.
I don't really understand the principles of unit testing. How can you test your business classes or data access layer without modifying your database?
Let me explain: I have a feature that updates a field in a database. Nothing amazing.
The business layer class is instantiated, and the method BLL.Update() performs some checks and finally instantiates a DAL class which launches a stored procedure in the database with the correct parameters.
It works, but my question is:
To write unit tests that exercise the DAL class, I have to affect the database in the tests! To test, for example, that the value 5 is correctly passed to the database, I have to actually do it, and the database field will be 5 after the test!
I know that normally the system is not supposed to be impacted by tests, so I don't understand how you can test this without executing the methods.
Thanks for your answers, and excuse my poor English.
I will divide your question into several sub questions because it is hard to answer them together.
Unit testing x Integration testing
When you write a unit test you are testing a single unit. That means you are testing a single execution path through the tested method. You need to avoid exercising its dependencies, like the database you mentioned. You usually write a simple unit test for each execution path so that your tests give you good code coverage.
When you write an integration test you are testing all layers together, to see whether the integration and configuration work. You usually don't write an integration test for each execution path, because there are too many combinations across all the layers.
Testing business classes - unit testing
You need to test your business classes without any dependency on the DAL and DB. To do that, you have to design your BL class so that those dependencies are injected from outside. First define an abstract class or interface for the DAL and pass that DAL interface as a parameter to the constructor (another way is to expose a settable property on the BL class). When you test your BL class, you will use another implementation of the DAL interface which does not depend on the DB. There are well-known test patterns - Mock, Stub and Fake - which define how to create and use those dummy implementations. Mocking is also supported by many testing frameworks.
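A minimal sketch of that design, with the DAL hidden behind an interface, injected through the constructor, and mocked with Moq in the test (class and member names are illustrative; the real DAL implementation would call your stored procedure):

using System;
using Moq;
using NUnit.Framework;

// Abstraction over the DAL; the production implementation would call the stored procedure.
public interface IUserDal
{
    void UpdateField(int userId, int value);
}

public class UserBll
{
    private readonly IUserDal _dal;
    public UserBll(IUserDal dal) { _dal = dal; }   // the dependency is injected from outside

    public void Update(int userId, int value)
    {
        if (value < 0)
            throw new ArgumentOutOfRangeException(nameof(value));   // the "checks" live here
        _dal.UpdateField(userId, value);
    }
}

[TestFixture]
public class UserBllTests
{
    [Test]
    public void Update_PassesTheValueToTheDal_WithoutTouchingTheDatabase()
    {
        var dal = new Mock<IUserDal>();

        new UserBll(dal.Object).Update(userId: 1, value: 5);

        dal.Verify(d => d.UpdateField(1, 5), Times.Once);
    }
}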
Testing data access layer - integration testing
You need to test your DAL against a real DB. You will prepare a test DB with a known set of data and write your tests to work with that data. Each test will run in its own transaction, which will be rolled back at the end, so it does not modify the initial data set.
Best regards, Ladislav
For the scenario you describe in terms of db interaction, mocking is useful. If you have a chance, take a look at Rhino Mocks
You use Inversion of Control together with a mocking framework, e.g. Rhino Mocks as someone already mentioned
If you are not resorting to mocking and are using an actual DB in your tests, then in layman's terms it is integration testing, not unit testing anymore. I worked on a project where a dedicated SQL .mdf file was kept in source control; it was attached to the database server using NUnit in the [Setup] part of a [SetupFixture] and similarly detached in [TearDown]. This was done every time the NUnit tests were run and could be very time consuming, depending on the SQL code you have; a large data set makes it even worse.
Now the catch is the maintenance overhead: you may change the DB schema during your sprint cycle, and at release the DB change script has to be run on all databases used in your development and testing, including the one used for integration testing as mentioned above. Not just that - new test data (as someone mentioned above) has to be populated for new tables/columns, and likewise the existing data may need to be cleansed owing to requirements changes or bug fixes.
This is a task in itself; someone competent on the team can take ownership of it, or, if time permits, you can integrate the execution of change scripts into Continuous Integration if you have already implemented it. Adding and cleansing the test data still has to be taken care of manually.
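For reference, a rough sketch of that attach/detach pattern; this uses NUnit 3 attribute names ([OneTimeSetUp]/[OneTimeTearDown]) rather than the older [SetUp]/[TearDown] mentioned above, and the database name, file path and connection string are placeholders:

using System.Data.SqlClient;
using NUnit.Framework;

[SetUpFixture]
public class TestDatabaseFixture
{
    // Placeholder connection string - adjust to your environment.
    private const string MasterConnectionString =
        @"Server=(local);Database=master;Integrated Security=true";

    [OneTimeSetUp]
    public void AttachTestDatabase()
    {
        // Attach the .mdf kept in source control before any tests run.
        Execute(@"CREATE DATABASE UnitTestDb
                  ON (FILENAME = 'C:\Tests\UnitTestDb.mdf')
                  FOR ATTACH");
    }

    [OneTimeTearDown]
    public void DetachTestDatabase()
    {
        // Detach again so the file is left in a clean, reusable state.
        Execute("ALTER DATABASE UnitTestDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE; " +
                "EXEC sp_detach_db 'UnitTestDb'");
    }

    private static void Execute(string sql)
    {
        using (var connection = new SqlConnection(MasterConnectionString))
        {
            connection.Open();
            new SqlCommand(sql, connection).ExecuteNonQuery();
        }
    }
}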
