Given that some of the advice for implementing CQRS advocates fairly close-to-the-metal query implementation, such as ADO.NET queries directly against the database (or perhaps a LINQ-based ORM), is it a mistake to try and unit test them?
I wonder if it's really even necessary?
My thoughts on the matter:
The additional architectural complexity of providing a mockable "Thin Read Layer" seems to run counter to the advice to keep architectural ceremony to a minimum.
The number of unit tests to effectively cover every angle of query that a user might compose is horrendous.
Specifically I'm trying CQRS out in an ASP.NET MVC application and am wondering whether to bother unit testing my controller action methods, or just test the Domain Model instead.
Many thanks in advance.
In my experience, if you are creating a nice de-normalized read model, 90%-99% of the reads you will be doing do not warrant unit tests around them.
I have found that the most effective and efficient way to TDD a CQRS application is to write integration tests that push commands into your domain, then use the Queries to get the data back out of the DB for your assertions.
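A minimal sketch of that style of test, assuming NUnit and a hypothetical command bus, CreateCustomerCommand, and CustomerByIdQuery (none of these names come from a specific framework):

using System;
using NUnit.Framework;

[TestFixture]
public class CustomerReadModelTests
{
    [Test]
    public void Creating_a_customer_makes_it_visible_in_the_read_model()
    {
        // Arrange: push a command through the real write side.
        var bus = TestBootstrapper.CreateCommandBus();   // hypothetical test helper
        var customerId = Guid.NewGuid();
        bus.Send(new CreateCustomerCommand(customerId, "Alice"));

        // Act: use the real query layer to read the projection back out of the DB.
        var query = new CustomerByIdQuery(TestBootstrapper.ReadConnectionString);
        var dto = query.Execute(customerId);

        // Assert against the denormalized read model.
        Assert.That(dto, Is.Not.Null);
        Assert.That(dto.Name, Is.EqualTo("Alice"));
    }
}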
I would tend to agree with you that unit testing this kind of code is not so beneficial. But there is still some scope for some useful tests.
You are presumably performing some validation of the user's read query parameters; if so, test that invalid request parameters throw a suitable exception and that valid parameters are accepted.
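For example, a test along these lines (the CustomerQuery class and its argument checks are hypothetical) covers that validation without ever touching the database:

using System;
using NUnit.Framework;

[TestFixture]
public class CustomerQueryValidationTests
{
    [Test]
    public void Negative_page_size_is_rejected()
    {
        // The query object validates its parameters before hitting the database.
        Assert.Throws<ArgumentOutOfRangeException>(
            () => new CustomerQuery(pageSize: -1, pageIndex: 0));
    }

    [Test]
    public void Sensible_parameters_are_accepted()
    {
        Assert.DoesNotThrow(() => new CustomerQuery(pageSize: 25, pageIndex: 0));
    }
}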
If you're using an ORM, I find the cost of testing the mapping code outweighs the benefit. Assume the ORM itself is already tested; there could still be errors in your mappings, but you'll find and fix them soon enough.
I also find it useful to write a few integration tests (using the same testing framework), just to be sure I can make a connection to the database and that the ORM configuration doesn't throw any mapping exceptions. You certainly don't want to be writing unit tests that query the actual DB.
As you probably already know, unit testing is less about code coverage and preventing bugs than it is about good design. While I often skip testing the read-model event handlers when I'm in a hurry, it probably should be done, for all the reasons any code should be TDD'd.
I also have not been unit testing my HTTP actions (I don't have controllers per se, since I'm using Nancy rather than .NET MVC).
These are integration points and don't tend to contain much logic since most of it is encapsulated in the command handlers and domain model.
The reason I think it is fairly easy not to test these is that they are very simple and very repetitive; there is almost no deep thinking in the denormalization of events into the read model. The same is true for my HTTP handlers, which pretty much just process the request and issue a command to the domain, with some basic logic for returning a response to the client.
When developing, I often make mistakes in this code, and I would probably make far fewer of them if I were using TDD, but it would also take much longer, and these mistakes tend to be very easy to spot and fix.
My gut tells me I should still apply TDD here though, it is still all very loosely coupled and it shouldn't be hard to write the tests. If you are finding it hard to write the tests that could indicate a code smell in your controllers.
The way I have seen something like this unit tested is to have the test create a set of objects in the database, run the tests against them, and then clean up the created objects.
In one past job I saw this set up very nicely using a data structure to describe the objects and their relationships. This was run through the ORM to create those objects with those relationships, the resulting data was used for the queries, and then the ORM was used to delete the objects. To make unit tests easier to set up, every class specified default values to be used in tests that didn't override them, so the data structure in each test only needed to specify non-default values, which made test setup much more compact.
These unit tests were very useful, and caught a number of bugs in the database interaction.
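A rough sketch of that builder-with-defaults idea (the entity names, the TestData helper, and the LargeOrderQuery are all hypothetical, and the ORM calls are simplified):

using NUnit.Framework;

[TestFixture]
public class OrderQueryTests
{
    private TestData _data;

    [SetUp]
    public void CreateTestData()
    {
        // Only non-default values are specified; the builder fills in the rest
        // and persists the whole object graph through the ORM.
        _data = new TestData()
            .WithCustomer(name: "Alice")
            .WithOrder(total: 150m)
            .Persist();
    }

    [TearDown]
    public void RemoveTestData()
    {
        // Delete everything the builder created so tests stay independent.
        _data.Delete();
    }

    [Test]
    public void Orders_over_100_are_returned_by_the_large_order_query()
    {
        var results = new LargeOrderQuery(_data.Session).Execute(minimumTotal: 100m);
        Assert.That(results, Has.Count.EqualTo(1));
    }
}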
In one of my ASP.NET MVC applications I've also applied CQRS. But instead of SQL and ADO.NET queries or LINQ we use a document database (MongoDB), and each command or event handler updates the database directly.
I've created one test per command/event handler, and with 100% unit-test coverage I know my domain works about 95% correctly.
For actions/controllers/UI I've applied UI tests (using Selenium).
So it seems that both the unit tests for the domain (command/event handlers with direct updates to the database) and the UI tests act as your 'integration tests'.
I think you should test the domain part at least, because all your logic is encapsulated in the command/event handlers.
FYI: I started developing the domain part with Entity Framework, then moved to direct updates to the database through stored procedures, but was really happy with a document database in the end. I tried a few different document databases, and MongoDB looks like the best fit for me.
We have a build server which does not have a database running on its machine.
Now we want to write unit tests which also cover SQL-related tasks, for example writing mock data and reading it back again to verify the output (it gets manipulated by C# code in the process).
So the common approach would be to provide a connection-string to the remote server which actually runs a database.
This works, but it is unwanted because the database-server could be inaccessible when the build is triggered, and therefore the automatic tests would fail.
What is the best way to create a "pseudo-database" while running Unit tests?
Is this even possible without an actual database? Does it even make sense?
Use a mocking framework like Moq. That is the exact purpose of mocking frameworks: a given component can be unit tested in isolation, and any integration point such as the database can be mocked to return valid pre-defined results at run time, so you can verify the behaviour of the code under test.
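A minimal sketch of that with Moq and NUnit (the ICustomerRepository, Customer, and ReportGenerator types are hypothetical; only the Moq calls are real API):

using Moq;
using NUnit.Framework;

[TestFixture]
public class ReportGeneratorTests
{
    [Test]
    public void Report_uses_rows_returned_by_the_repository()
    {
        // The database dependency is replaced by a Moq mock of the repository interface.
        var repository = new Mock<ICustomerRepository>();
        repository
            .Setup(r => r.GetActiveCustomers())
            .Returns(new[] { new Customer { Name = "Alice" }, new Customer { Name = "Bob" } });

        var generator = new ReportGenerator(repository.Object);

        var report = generator.Build();

        Assert.That(report.RowCount, Is.EqualTo(2));
        repository.Verify(r => r.GetActiveCustomers(), Times.Once);
    }
}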
The commonly accepted answer to "how do I unit test something with external dependencies" is to use a mocking layer.
However, that very quickly becomes unwieldy - you end up writing and maintaining a lot of code just to mimic your database, and it's not clear this code will pay for itself. For instance, if your SQL statement has a typo, your unit tests won't catch it.
Instead, it is much better to factor the code which manipulates the data out into a layer which can be unit-tested without database access.
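As a sketch of that separation (all names here are hypothetical): keep the SQL in a thin repository and put the manipulation logic in a pure class that can be unit tested with plain in-memory objects.

using System.Collections.Generic;
using System.Linq;
using NUnit.Framework;

// Thin data access: just runs SQL and maps rows; covered by integration tests, not unit tests.
public interface IOrderRepository
{
    IReadOnlyList<OrderLine> GetLinesForOrder(int orderId);
}

public class OrderLine
{
    public decimal UnitPrice { get; set; }
    public int Quantity { get; set; }
}

// Pure logic with no database dependency; this is the part worth unit testing.
public class OrderTotalCalculator
{
    public decimal Total(IEnumerable<OrderLine> lines) =>
        lines.Sum(l => l.UnitPrice * l.Quantity);
}

[TestFixture]
public class OrderTotalCalculatorTests
{
    [Test]
    public void Total_multiplies_price_by_quantity()
    {
        var lines = new[] { new OrderLine { UnitPrice = 2m, Quantity = 3 } };
        Assert.That(new OrderTotalCalculator().Total(lines), Is.EqualTo(6m));
    }
}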
I am working on writing a tool which
- sets up a connection to Sql and runs a series of stored procedures
- Hits the file system to verify and also delete files
- Talks to other subsystems through exposed APIs
I am new to the concept of TDD but have been doing a lot of reading on it. I wanted to apply TDD to this development, but I am stuck. There are a lot of interactions with external systems which need to be mocked/stubbed or faked. What I am finding difficult is the proper approach to take in doing this with TDD. Here is a sample of what I would like accomplished:
public class MyConfigurator
{
    public static void Start()
    {
        CheckSystemIsLicenced();            // will throw if it's not licenced; calls a library owned by the company
        CleanUpFiles();                     // clean up several directories
        CheckConnectionToSql();             // ensure a connection to SQL can be made
        ConfigureSystemToolsOnDatabase();   // runs a set of stored procedures; a range of checks will throw if something goes wrong
    }
}
After this I have another class which cleans up the system if things have gone wrong. For the purpose of this question it's not that relevant, but it essentially just clears certain tables and fixes up the database so that the tool can run its configuration tasks again from scratch.
It almost appears that, when using TDD here, the only tests I end up with are things like (assuming I am using FakeItEasy):
A.CallTo(()=>fakeLicenceChecker.CheckSystemIsLicenced("lickey")).MustHaveHappened();
It just ends up being a whole lot of tests that are nothing more than "MustHaveHappened" assertions. Am I doing something wrong? Is there a different way to start this project using TDD? Or is this a particular scenario where perhaps TDD is not really recommended? Any guidance would be greatly appreciated.
In your example, if the arrangement of the unit test shows lickey as the input, then it is reasonable to assert that the endpoint has been called with the proper value. In more complex scenarios, the input-to-assert flow covers more subsystems so that the test itself doesn't seem as trivial. You might set up an ID value as input and test that down the line you are outputting a value for an object that is deterministically related to the input ID.
One aspect of TDD is that the code changes while the tests do not - except for functionally equivalent refactoring. So your first tests would naturally arrange and assert data at the outermost endpoints. You would start with a test that writes a real file to the filesystem, calls your code, and then checks to see that the file is deleted as expected. Of course, the file system is a messy workspace for portable testing, so you might decide early on to abstract the file system by one step. Ditto with the database by using EF and mocking your DbContext or by using a mocked repository pattern. These abstractions can be pre-TDD application architecture decisions.
Something I do frequently is to use utility code that starts with an IFileSystem interface that declares methods that mimic a lot of what is available in System.IO.File. In production I use an implementation of IFileSystem that just passes through to File.XXX() methods. Then you can mock up and verify the interface instead of trying to setup and cleanup real files.
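A minimal sketch of that idea; the interface and its pass-through implementation below are my own illustration, not a standard library type:

public interface IFileSystem
{
    bool FileExists(string path);
    void DeleteFile(string path);
}

// The production implementation just delegates to System.IO.File.
public class PhysicalFileSystem : IFileSystem
{
    public bool FileExists(string path) => System.IO.File.Exists(path);
    public void DeleteFile(string path) => System.IO.File.Delete(path);
}

In tests, a fake or mock of IFileSystem can simply record which paths were checked or deleted, so no real files need to be created or cleaned up.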
In this particular method the only thing you can test is that the methods were called. It's ok to do what you are doing by asserting the mock classes. It's up to you to determine if this particular test is valuable or not. TDD assumes tests for everything, but I find it to be more practical to focus your testing on scenarios where it adds value. Hard for others to make that determination, but you should trust yourself to make the call in each specific scenario.
I think integration tests would give the most bang for the buck. Use the real DB and file system.
If you have complex logic in the tool, then you may want to restructure the tool's design to abstract out the DB and file system and write the unit tests with mocks. From the code snippet you posted, it looks like a simple script to me.
I am learning TDD and I currently have a method that is working but I thought I'd have a go at rebuilding it using TDD.
The method essentially takes 6 parameters, queries a database, does some logic and returns a List<T>
My initial tests included checking for empty/zero values in the string and int method parameters, but now I'm not sure what to do. If I wasn't using TDD, I would just write code to find the DB connection string, open a DB connection, query the database, read the values, etc.
Obviously we can't do that in unit testing, so I was after some advice on how to proceed.
Remember that TDD is as much about good design as it is about testing. This method has too much going on; it violates the Separation of Concerns principle.
You've already identified several areas that will need to be tested:
The method essentially takes 6 parameters, queries a database, does some logic and returns a List<T>
You have several discrete steps there, and there are probably a few more hiding in the code. Breaking those up is the name of the game when it comes to TDD.
For starters, it might be a good idea to factor out the piece that performs the logic.
Is your method building a query dynamically? Break that piece out as well and test it to make sure the query is written properly.
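For instance, a rough sketch of testing a hypothetical query builder in isolation (the builder and its API are illustrative, not an existing library):

using NUnit.Framework;

[TestFixture]
public class CustomerSearchQueryBuilderTests
{
    [Test]
    public void Search_query_filters_by_status_and_region()
    {
        // The builder only assembles SQL text and parameters; it never opens a connection,
        // so this test runs without a database.
        var query = new CustomerSearchQueryBuilder()
            .WithStatus("Active")
            .WithRegion("EMEA")
            .Build();

        Assert.That(query.Sql, Does.Contain("WHERE Status = @status"));
        Assert.That(query.Parameters["@status"], Is.EqualTo("Active"));
    }
}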
You can put the execution of the query into a standalone repository or something similar, and write integration tests against that. That way you only have a simple test hitting the database instead of the current complex method.
If you try to test this as is, you'll likely end up with a monster test that requires a lot of setup and duplicates all of your business logic, and when it breaks it'll be unclear as to what went wrong.
In general, there's nothing "wrong" about using TDD to test database code. However, you might try abstracting out the database code, then mocking it out.
The method essentially takes 6 parameters, queries a database, does some logic and returns a List<T>
That seems to be too much for unit-testable code!
Unit-testable code should do very specific things in small modules. So, in your case, you need to refactor and break your method into the following (at least):
Database query: wrap it inside a DataProvider with a backing interface, and have your unit test mock that interface.
Does some logic: this is the best candidate for a unit test. It should be a module that takes the data provider interface, applies the logic, and returns the modified list, which you then validate in your unit test.
Also, remember a unit test should cover at least three scenarios for each testable module:
a positive test
a negative test
test throwing meaningful exception for invalid values.
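As a sketch of those three kinds of tests for a hypothetical discount calculation (the DiscountCalculator and its rules are purely illustrative):

using System;
using NUnit.Framework;

[TestFixture]
public class DiscountCalculatorTests
{
    [Test]
    public void Positive_case_applies_ten_percent_discount_to_large_orders()
    {
        Assert.That(new DiscountCalculator().Discount(orderTotal: 200m), Is.EqualTo(20m));
    }

    [Test]
    public void Negative_case_applies_no_discount_to_small_orders()
    {
        Assert.That(new DiscountCalculator().Discount(orderTotal: 50m), Is.EqualTo(0m));
    }

    [Test]
    public void Invalid_input_throws_a_meaningful_exception()
    {
        Assert.Throws<ArgumentOutOfRangeException>(
            () => new DiscountCalculator().Discount(orderTotal: -1m));
    }
}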
Hope this is helpful.
Another option is to start a transaction before each test and roll it back afterwards. That way tests are independent, so they can still, by some definitions, be considered unit tests.
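A sketch of that with NUnit and System.Transactions (the OrderRepository, Order, and TestConfig types are hypothetical):

using System.Transactions;
using NUnit.Framework;

[TestFixture]
public class OrderRepositoryTests
{
    private TransactionScope _scope;

    [SetUp]
    public void BeginTransaction()
    {
        // Everything each test writes happens inside this ambient transaction.
        _scope = new TransactionScope();
    }

    [TearDown]
    public void RollBack()
    {
        // Disposing without calling Complete() rolls the transaction back,
        // leaving the database exactly as it was before the test.
        _scope.Dispose();
    }

    [Test]
    public void Saving_an_order_makes_it_retrievable()
    {
        var repository = new OrderRepository(TestConfig.ConnectionString);
        var id = repository.Save(new Order { Total = 99m });
        Assert.That(repository.GetById(id), Is.Not.Null);
    }
}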
Contrary to what's mentioned in other answers, you should refactor the code to get to a better design after the test passes. Then you can verify that your refactoring didn't break anything just by rerunning the test.
You might want to try looking at DbUnit for running unit tests on your data access layer. It puts your database in a known state between test runs preventing corruption of your test database.
You can:
Use the class/test init to spin up a blank DB, or a copy of a small DB with a known set of data.
In the test method, enter test data (if the DB is empty), then perform the query, then compare the result with the expected result.
In the test/class cleanup remove DB.
This tests your unit but is considered an "integration test" by some.
- The term "unit test" attracts some disagreement because of the ambiguity of the word "unit".
You could also use an in-memory DB or an in-process DB to make the test environment simpler.
I am currently working on a C# project which makes use of an SQLite database. For the project I am required to do unit testing, but I was told that unit testing shouldn't involve external files, such as database files, and that the tests should instead emulate the database.
If I have a function that checks whether something exists in a database, how could this sort of method be covered by a unit test?
In general it makes life easier if external files are avoided and everything is done in code. There is no rule which says "shouldn't", and sometimes it just makes more sense to have the external dependency. But only after you have considered how not to have it and understood the trade-offs.
Secondly, what Bryan said is a good option and the one I've used before.
In an application that uses a database, there will be at least one component whose responsibility is to communicate with that database. The unit test for that component could involve a mocked database, but it is perfectly valid (and often desirable) to test the component using a real database. After all, the component is supposed to encapsulate and broker communication with that database -- the unit test should test that. There are numerous strategies to perform such unit tests conveniently -- see the list of related SO questions in the sidebar for examples.
The general prescription to avoid accessing databases in unit tests applies to non-database components. Since non-database components typically outnumber database-related components by a wide margin, the vast majority of unit tests should not involve a database. Indeed, if such non-database components required a database to be tested effectively, there is likely a design problem present -- probably improper separation of concerns.
Thus, the principle that unit tests should avoid databases is generally true, but it is not an absolute rule. It is just a (strong) guideline that aids in structuring complex systems. Following the rule too rigidly makes it very difficult to adequately test "boundary" components that encapsulate external systems -- places in which bugs find it very easy to hide! So, the question that one should really be asking oneself when a unit test demands a database is this: is the component under test legitimately accessing the database directly or should it instead collaborate with another that has that responsibility?
This same reasoning applies to the use of external files and other resources in unit tests as well.
With SQLite, you could use an in-memory database. You can stage your database by inserting data and then run your tests against it.
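A small sketch using Microsoft.Data.Sqlite; the table and the ProductStore class under test are hypothetical, but the connection string and ADO.NET calls are the standard API:

using Microsoft.Data.Sqlite;
using NUnit.Framework;

[TestFixture]
public class ProductStoreTests
{
    [Test]
    public void Exists_finds_a_staged_product()
    {
        using (var connection = new SqliteConnection("Data Source=:memory:"))
        {
            connection.Open(); // the in-memory database lives only while this connection is open

            using (var command = connection.CreateCommand())
            {
                command.CommandText = "CREATE TABLE Products (Id INTEGER PRIMARY KEY, Name TEXT)";
                command.ExecuteNonQuery();

                command.CommandText = "INSERT INTO Products (Id, Name) VALUES (1, 'Widget')";
                command.ExecuteNonQuery();
            }

            // ProductStore is the hypothetical code under test that queries the connection.
            var store = new ProductStore(connection);
            Assert.That(store.Exists(productId: 1), Is.True);
        }
    }
}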
Once databases get involved, the line between unit testing and integration testing always blurs. Having said that, it is always a nice and very useful test to be able to put something in a database (setup), do your test, and remove it at the end (cleanup). This lets you test one part of your application end to end.
Personally I like to do this in an attribute-driven fashion, by specifying the SQL scripts to run for each test as attributes, like so:
[PreSqlExecute("SetupTestUserDetails.sql")]
[PostSqlExecute("CleanupTestUserDetails.sql")]
public void UpdateUserNameTest()
{
    // ... exercise the code under test and assert against the seeded data ...
}
The connection strings come from the app.config as usual and can even be a symbolic link to the app.config in your main project.
Unfortunately this isn't a standard feature of the MS test runner that ships with Visual Studio. If you are using PostSharp as your AOP framework, this is easy to do. If not, you can still get the same functionality with the standard MS test runner by using a feature of .NET called "Context Bound Objects". This lets you inject custom code into an object's creation chain to do AOP-like things, as long as your objects inherit from ContextBoundObject.
I did a blog post with more details and a small, complete code sample here.
http://www.chaitanyaonline.net/2011/09/25/improving-integration-tests-in-net-by-using-attributes-to-execute-sql-scripts/
I think it is a really bad idea to have unit tests that depend on database information.
I also think it is a bad idea to use SQLite for unit tests.
You need to test your objects' protocol, so if you need something in your tests you should create it somewhere in the tests (usually in setUp).
Since it is difficult to remove persistence entirely, a popular way to do it is to use SQLite, but always create what you need inside the unit tests.
Check this link: Unit Tests And Databases. I think it will be more helpful.
It's best to use a mocking framework to mimic the database. In C#, for example, you could mock your Entity Framework data access behind an interface. Even the use of SQLite is an outside dependency for your code.
My question may appear really stupid to some of you, but I have to ask. Sorry.
I don't really understand the principles of unit testing. How can you test your business classes or data access layer without modifying your database?
Let me explain: I have a feature that updates a field in a database. Nothing amazing.
The business layer class is instantiated, and the method BLL.Update() performs some checks and finally instantiates a DAL class which launches a stored procedure in the database with the correct parameters.
It works, but my question is this:
To write unit tests for the DAL class I have to affect the database in the tests! To test, for example, that the value 5 is correctly passed to the database, I have to actually do it, and the database field will be 5 after the test!
I know that normally the system should not be affected by tests, so I don't understand how you can test without actually executing the methods.
Thanks for your answers, and excuse my poor English.
I will divide your question into several sub-questions because it is hard to answer them together.
Unit testing vs. integration testing
When you write a unit test you are testing a single unit. That means you are testing a single execution path through the tested method. You need to avoid testing its dependencies, like the database you mentioned. You usually write a simple unit test for each execution path so that your tests give good code coverage.
When you write an integration test you are testing all layers together to see whether the integration and configuration work. You usually don't write an integration test for each execution path because there are too many combinations across all layers.
Testing business classes - unit testing
You need to test your business classes without a dependency on the DAL and the DB. To do that you have to design your BL class so that those dependencies are injected from outside. First define an abstract class or interface for the DAL and pass that DAL interface as a constructor parameter (another way is to expose a settable property on the BL class). When you test your BL class you use another implementation of the DAL interface which is not dependent on the DB. There are well-known test patterns (Mock, Stub and Fake) which define how to create and use those dummy implementations. Mocking is also supported by many testing frameworks.
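A rough sketch of that constructor injection, with hypothetical names throughout:

using System;
using NUnit.Framework;

public interface ICustomerDal
{
    void UpdateName(int customerId, string newName);
}

// The BL class receives the DAL abstraction through its constructor.
public class CustomerService
{
    private readonly ICustomerDal _dal;

    public CustomerService(ICustomerDal dal) => _dal = dal;

    public void Rename(int customerId, string newName)
    {
        if (string.IsNullOrWhiteSpace(newName))
            throw new ArgumentException("Name is required", nameof(newName));
        _dal.UpdateName(customerId, newName);
    }
}

// A hand-rolled stub (a mock from a framework would work just as well) replaces the real DAL.
public class RecordingCustomerDal : ICustomerDal
{
    public string LastName;
    public void UpdateName(int customerId, string newName) => LastName = newName;
}

[TestFixture]
public class CustomerServiceTests
{
    [Test]
    public void Rename_passes_the_new_name_to_the_DAL()
    {
        var dal = new RecordingCustomerDal();
        new CustomerService(dal).Rename(5, "Alice");
        Assert.That(dal.LastName, Is.EqualTo("Alice"));
    }
}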
Testing data access layer - integration testing
You need to test your DAL against a real DB. You prepare a testing DB with a test set of data and write your tests to work with that data. Each test runs in its own transaction, which is rolled back at the end, so it does not modify the initial data set.
Best regards, Ladislav
For the scenario you describe in terms of db interaction, mocking is useful. If you have a chance, take a look at Rhino Mocks
You use Inversion of Control together with a mocking framework, e.g. Rhino Mocks, as someone already mentioned.
If you are not resorting to mocking and are using an actual DB in testing, then in layman's terms it is integration testing and no longer a unit test. I worked on a project where a dedicated SQL .mdf file was kept in source control; it was attached to the database server in the [SetUp] part of an NUnit [SetUpFixture] and similarly detached in [TearDown]. This was done every time the NUnit tests ran, and it could be very time consuming depending on the SQL code involved, with the size of the data making it worse.
Now the catch is the maintenance overhead: you may change the DB schema during your sprint cycle, and at release the DB change script has to be run on all databases used in your development and testing, including the one used for the integration testing mentioned above. Not just that, new test data (as someone mentioned above) has to be populated for newly created tables/columns, and likewise the existing data may need to be cleansed owing to requirements changes or bug fixes.
This is a task in itself, and someone competent on the team can take ownership of it; or, if time permits, you can integrate the execution of change scripts into your Continuous Integration setup if you already have one. Still, adding and cleansing the test data needs to be taken care of manually.