Hi guys, I'm starting to learn unit testing and TDD.
I have some experience with C#, so I started with the following link:
https://msdn.microsoft.com/en-us/library/dd264975.aspx
I followed the sample, and so far so good.
But when I started writing my first method by myself,
I had some questions.
I wrote a method named "Log"
that creates a .txt file with a timestamp.
For example, after I call Log("some thing error");
it creates a file named 201510210030.txt
whose content is "[2015-10-21 00:30:47] some thing error".
How do I test it?
Should I read the log file back?
But the file name changes every time,
and I might change the folder location in another situation.
Or maybe I will change the log target to a DB or some logging server (with IoC).
How do I test it then? Connect to the DB or the logging server? Read the file?
There are too many ways the method can fail (the DB is down, file permissions fail, and so on).
It's not really "unit" enough, so... how do I test a method like this?
Or is there some unit testing concept I don't understand?
Thanks a lot.
With unit testing (as with all things), you need to be pragmatic. It's always great to aim for the 100% code coverage benchmark, but that's often unrealistic. The moment you start introducing actual databases or file systems into your tests, you've left the realm of unit tests and entered integration testing. There's absolutely nothing wrong with integration testing, but it's important not to confuse the two.
What I would recommend is to make sure you have all the logging logic in its own separate class that is provided to the class you're testing via dependency injection (you mentioned IoC, so I assume you're familiar). Once you do that, you can pass a mock "Logger" into your class that doesn't touch the file system at all. This will ensure that the class you're testing works with anything that implements the "Logger" interface.
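For example, here is a minimal sketch of that separation (the ILogger, FileLogger, and FakeLogger names are mine, not from the question):

using System;
using System.Collections.Generic;
using System.IO;

public interface ILogger
{
    void Log(string message);
}

// Production implementation: deliberately thin, covered by integration tests.
public class FileLogger : ILogger
{
    public void Log(string message)
    {
        var fileName = DateTime.Now.ToString("yyyyMMddHHmm") + ".txt";
        File.AppendAllText(fileName, $"[{DateTime.Now:yyyy-MM-dd HH:mm:ss}] {message}");
    }
}

// Hand-rolled fake for unit tests: records calls instead of touching disk.
public class FakeLogger : ILogger
{
    public List<string> Messages { get; } = new List<string>();
    public void Log(string message) => Messages.Add(message);
}

Any class that takes an ILogger in its constructor can then be unit tested with FakeLogger and simply asserted against Messages.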
If you're looking to actually unit test the logger itself, then I'm afraid that's not really possible. The logger is very tightly coupled to the file system (or database), and so you can't really unit test it. You can always extract all the persistence logic to another class and mock that out, but you've only pushed the problem back. There's always a wall you'll come up against between your application and infrastructure, where the two are tightly coupled. The important thing is to keep the class that handles that interaction relatively simple, and make sure it's tested with integration tests.
Related
I am working on writing a tool which:
- sets up a connection to SQL Server and runs a series of stored procedures
- hits the file system to verify and also delete files
- talks to other subsystems through exposed APIs
I am new to the concept of TDD but have been doing a lot of reading on it. I wanted to apply TDD to this development, but I am stuck. There are a lot of interactions with external systems which need to be mocked/stubbed or faked. What I am finding difficult is the proper approach to take in doing this with TDD. Here is a sample of what I would like accomplished:
public class MyConfigurator
{
    public static void Start()
    {
        CheckSystemIsLicenced();          // will throw if it's not licenced; calls a library owned by the company
        CleanUpFiles();                   // clean up several directories
        CheckConnectionToSql();           // ensure a connection to SQL can be made
        ConfigureSystemToolsOnDatabase(); // runs a set of stored procedures; a range of checks will throw if something goes wrong
    }
}
After this I have another class which cleans up the system if things have gone wrong. For the purpose of this question it's not that relevant, but it essentially just clears certain tables and fixes up the database so that the tool can run again from scratch to do its configuration tasks.
It almost appears that when using TDD, the only tests I end up having are things like (assuming I am using FakeItEasy):

A.CallTo(() => fakeLicenceChecker.CheckSystemIsLicenced("lickey")).MustHaveHappened();

It just ends up being a whole lot of tests which are nothing but "MustHaveHappened" assertions. Am I doing something wrong? Is there a different way to start this project using TDD? Or is this a particular scenario where TDD is not really recommended? Any guidance would be greatly appreciated.
In your example, if the arrangement of the unit test shows "lickey" as the input, then it is reasonable to assert that the endpoint has been called with the proper value. In more complex scenarios, the input-to-assert flow covers more subsystems, so the test itself doesn't seem as trivial. You might set up an ID value as input and test that, down the line, you are outputting a value for an object that is deterministically related to the input ID.
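In full arrange-act-assert form that might look like this (it assumes Start() has been reworked to take its dependencies, so ILicenceChecker and Configurator here are placeholders):

// Arrange: fake the dependency instead of calling the real licence library.
var fakeLicenceChecker = A.Fake<ILicenceChecker>();
var configurator = new Configurator(fakeLicenceChecker);

// Act
configurator.Start("lickey");

// Assert: the endpoint was called with the value supplied in the arrangement.
A.CallTo(() => fakeLicenceChecker.CheckSystemIsLicenced("lickey")).MustHaveHappened();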
One aspect of TDD is that the code changes while the tests do not - except for functionally equivalent refactoring. So your first tests would naturally arrange and assert data at the outermost endpoints. You would start with a test that writes a real file to the filesystem, calls your code, and then checks to see that the file is deleted as expected. Of course, the file system is a messy workspace for portable testing, so you might decide early on to abstract the file system by one step. Ditto with the database by using EF and mocking your DbContext or by using a mocked repository pattern. These abstractions can be pre-TDD application architecture decisions.
Something I do frequently is to use utility code that starts with an IFileSystem interface declaring methods that mimic a lot of what is available in System.IO.File. In production I use an implementation of IFileSystem that just passes through to the File.XXX() methods. Then you can mock and verify the interface instead of trying to set up and clean up real files.
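A rough sketch of that utility code (the member list here is illustrative; mirror whichever System.IO.File methods you actually need):

using System.IO;

public interface IFileSystem
{
    bool FileExists(string path);
    void DeleteFile(string path);
    string[] GetFiles(string directory);
}

// Production implementation: a thin pass-through to System.IO.
public class PhysicalFileSystem : IFileSystem
{
    public bool FileExists(string path) => File.Exists(path);
    public void DeleteFile(string path) => File.Delete(path);
    public string[] GetFiles(string directory) => Directory.GetFiles(directory);
}

In tests, a mock of IFileSystem lets you verify that CleanUpFiles() deleted what it should, with no disk access at all.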
In this particular method the only thing you can test is that the methods were called. It's OK to do what you are doing by asserting against the mock classes. It's up to you to determine whether this particular test is valuable or not. TDD assumes tests for everything, but I find it more practical to focus your testing on scenarios where it adds value. It's hard for others to make that determination, but you should trust yourself to make the call in each specific scenario.
I think integration tests would give you the most bang for the buck here. Use the real DB and file system.
If you have complex logic in the tool, then you may want to restructure the tool's design to abstract out the DB and file system and write unit tests with mocks. From the code snippet you posted, it looks like a simple script to me.
I would like to create unit tests for data-dependent code. For example:
A user class that has the usual create, update, and delete.
If I wanted to create a test for the "user already exists" scenario, or an update or delete test, I would need to know that a specific user already exists in my database.
In such cases, what would be the best approach to having stand-alone tests for these operations that can run in any order?
When you have dependencies like this, think about whether you want to be integration testing as opposed to unit testing. If you do want to write unit tests, take a look at using mock data.
Integration Testing:
Tests how your code integrates with different parts of a system. This can be making sure your code connects to a database properly or has created a file on the file system. These tests are usually very straightforward and do not have the same constraint of "being able to run in any order." However, they require a specific configuration in order to pass, which means they do not port well from developer to developer.
Unit Testing: Tests your code's ability to perform a function. For example, "Does my function AddTwoNumbers(int one, int two) actually add two numbers?" Unit tests ensure that changes in code do not affect the expected results.
When getting into areas like "Does my code call the database and enter the result correctly?", you need to consider that unit tests are not meant to interact with the system. This is where we get into using "mock data." Mock classes and mock data take the place of an actual system, just to ensure that your code "called out in the way we were expecting." The difficult part is that it can be done, but most of the .NET Framework classes do not provide the interfaces needed to do it easily.
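As a concrete sketch using Moq (IUserRepository, UserService, and result.Succeeded are illustrative names, not an existing API):

// Arrange: the mock stands in for the database.
var repo = new Mock<IUserRepository>();
repo.Setup(r => r.Exists("alice")).Returns(true); // the "user already exists" scenario

// Act
var service = new UserService(repo.Object);
var result = service.Create("alice");

// Assert: the duplicate was rejected and nothing was written.
Assert.IsFalse(result.Succeeded);
repo.Verify(r => r.Save(It.IsAny<User>()), Times.Never());

Because the mock supplies the "existing user," these tests have no ordering requirement and no dependency on real database state.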
See the MSDN page on Testing for more info. Also, consider this MSDN article on Mock Data.
I am attempting to do TDD right! I was reading about TDD inside-out as opposed to outside-in. The reason is that I don't know my layers up front, so my idea was to start writing a test, have it fail, and then start writing my first layer.
While writing my first layer I notice that I need another layer; let's call it a service layer. This is where I get confused: what do I do here?
Do I stop and create a new test that fails so that I can implement my new service layer using TDD? Once that's done and I go back to my original layer, should I create a mock of my service layer there? Or use the service layer I just created via TDD?
This is TDD, right? So if I am mocking things out, then maybe my TDD is not driving my development? But of course, if I don't mock things out, these technically are not unit tests but more like integration tests?
If indeed my unit tests (written via TDD) use mocks, then I need some other kind of tests to test the integration of each individual layer as one unit?
An integration test or e2e test?
I think my problem is basically: when I need to introduce new layers, should I mock them out, and should I create a new test to drive the development of the new layer?
I hope somebody can help untangle this muddle I have gotten myself into!
Thanks
With more experience, you will get better at this.
But for now, let me say this.
First of all, think of TDD as a tool to design clean code (check Uncle Bob's Clean Code to get more insight). By no means does it replace any system design effort. That means you have to know what classes you want to design (at least roughly), and you have to define the interfaces between those classes as well.
Second, unit tests according to Michael Feathers (Working Effectively with Legacy Code, Chapter 2) are tests that do not:
talk to a database
communicate across a network
touch the file system
require you to do special things to your environment in order to run.
So, you should stay well within the limits of a unit test.
In general, you want to write unit tests for each component (or class). That means you create fake classes or mocks for each of the interfaces, e.g. for each of your service-layer classes. This means that you have to know the exact interface (method parameters and return value) that each call needs.
Try to get as far as possible with one instance and then move on to the next.
If you are unsure about how your design should look, consider building an untested prototype. Write just enough code to see the components working together and to help shape your interfaces. Then sketch down the design, throw away your prototype, and start over with a TDD approach.
When developing in TDD style you should work with interfaces as much as possible.
Unit testing means that you test every unit in isolation from most (ideally all) other code.
So in your case: if the code you are currently working on needs to make calls to some service layer, then just create an interface for the new modules and mock their correct behaviour (or expected error behaviour, if you want to test error handling).
... and put testing your new service layer on your todo list ;)
This way you concentrate your work on your current unit and have an interface ready for your service layer when you start working on it.
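For instance, a sketch with Moq (IPricingService and CheckoutHandler are invented names standing in for your service layer and your current unit):

// The service layer doesn't exist yet; the interface and mock are enough to drive the current unit.
var pricing = new Mock<IPricingService>();
pricing.Setup(p => p.GetPrice("SKU-1")).Returns(9.99m);                 // correct behaviour
// pricing.Setup(p => p.GetPrice("SKU-1")).Throws<TimeoutException>(); // or the error path, for error-handling tests

var handler = new CheckoutHandler(pricing.Object);
Assert.AreEqual(9.99m, handler.TotalFor("SKU-1"));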
If you want to test how your layers work together, you need some kind of integration test.
I have a method ExportXMLFiles(string path) that exports XML files to a certain path, with some elements inside them like FirstName, LastName, and MajorSubject. These values are picked from a database.
Now I need to write a unit test method for it, and I have not worked much on unit tests except simple and straightforward ones. My confusion is: do I need to connect to the database and create an XML file, or do I need to pass hard-coded values while creating the XML file so that I can validate the values in the XML created?
Is there any other way of doing this?
You absolutely do not want to use an actual database in your unit test. It adds a level of complexity that you don't want to deal with in your unit tests. It also makes your unit tests less reliable and slower. See if you can move the database functionality behind an interface that you can instantiate using a mocking framework. Try looking into something like Moq, or if that isn't enough, check out Moles from Microsoft.
Edit: another post mentioned that if the functionality is to write to disk, then your unit test should validate that the file was written out to disk. Using Moles you can simulate file systems, test your file-system calls, and simulate write failures or whatever other cases you need, which gives you a lot more flexibility and speed than actually physically writing to disk. Things like a disk write failing would be miserable to test without something like Moles.
A unit test should be small in scope and isolated from dependencies, e.g. databases and file systems. So what you want to do is look at mocking out the database access and what would get written to a file, so that you can run your test without needing particular values in the database. Unit tests should be fast to run, have repeatable results (i.e. run twice, get the same answer), be isolated from other tests, and be able to run in any order.
A unit test is looking at ONE item of functionality and not relying on the behaviour of anything else.
So look at using a pattern such as dependency injection so you can provide (i.e. inject) the database and file-system dependencies. Look at a mocking framework such as NMock, or write your own lightweight fake objects that implement the same interfaces as the dependencies; then you can pass those into the functions being tested.
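A lightweight fake can be as small as this (IFileWriter is an invented name for whichever dependency interface you extract):

using System.Collections.Generic;

public interface IFileWriter
{
    void Write(string path, string contents);
}

// Hand-written fake: records what would have been written instead of touching disk.
public class FakeFileWriter : IFileWriter
{
    public Dictionary<string, string> Written { get; } = new Dictionary<string, string>();
    public void Write(string path, string contents) => Written[path] = contents;
}

Inject FakeFileWriter in tests and assert against Written; inject the real implementation in production.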
What is the responsibility of this method?
Is it to dump given data in the form of XML files at a certain path? If yes, then you'd have to check that the files are in fact created.
This is not a unit test but an integration test (because this is the class at the boundary between your app and the file system). You should abstract away the input data source (the DB) via an interface/Role. You could also create a Role to CreateXmlFile(contents), but I think that's overkill.
// setup mock data source to supply canned data
// call myObject.ExportToXml(mockDataSource, tempPath)
// verify files are created in tempPath
Finally, this class needs to implement a Role (DataExporter) so that the tests that use DataExporter are fast and don't have to deal with file systems (or XML).
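Fleshing out that outline (Moq, NUnit, and all the type names here are assumptions on my part, not from the question):

[Test]
public void ExportXMLFiles_WritesOneFilePerRecord()
{
    // setup mock data source to supply canned data
    var dataSource = new Mock<IDataSource>();
    dataSource.Setup(d => d.GetStudents())
              .Returns(new[] { new Student("Ada", "Lovelace", "Maths") });

    var tempPath = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
    Directory.CreateDirectory(tempPath);

    // call the exporter
    var exporter = new XmlExporter(dataSource.Object);
    exporter.ExportXMLFiles(tempPath);

    // verify files are created in tempPath
    Assert.IsNotEmpty(Directory.GetFiles(tempPath, "*.xml"));
}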
My question may appear really stupid to some of you, but I have to ask. Sorry.
I don't really understand the principles of unit testing. How can you test your business classes or data access layer without modifying your database?
Let me explain: I have a feature that updates a field in a database. Nothing so amazing.
The business layer class is instantiated, and the method BLL.Update() performs some checks and finally instantiates a DAL class which launches a stored procedure in the database with the correct parameters.
It works, but my question is:
To write unit tests that cover the DAL class, I have to touch the database in the tests! To check, for example, that the value 5 is properly passed to the database, I have to actually do it, and the database field will be 5 after the test!
I know that normally the system should not be impacted by tests, so I don't understand how you can run tests without executing the methods...
Thanks for your answers, and excuse my poor English.
I will divide your question into several sub-questions, because it is hard to answer them together.
Unit testing vs. integration testing
When you write a unit test, you are testing a single unit. That means you are testing a single execution path through the tested method. You need to avoid testing its dependencies, like the database you mentioned. You usually write a simple unit test for each execution path so that your tests give you good code coverage.
When you write an integration test, you are testing all layers together to see whether the integration and configuration work. You usually don't write an integration test for each execution path, because there are too many combinations across all the layers.
Testing business classes - unit testing
You need to test your business classes without any dependency on the DAL and the DB. To do that, you have to design your BL class so that those dependencies are injected from outside. First, define an abstract class or interface for the DAL and pass that DAL interface as a constructor parameter (another way is to expose a settable property on the BL class). When you test your BL class, you will use another implementation of the DAL interface, one that does not depend on the DB. There are well-known test patterns (Mock, Stub, and Fake) which define how to create and use those dummy implementations. Mocking is also supported by many testing frameworks.
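A minimal sketch of that design (IUserDal, UserBll, and the method names are illustrative):

using System;

public interface IUserDal
{
    void UpdateField(int userId, int value);
}

public class UserBll
{
    private readonly IUserDal _dal;
    public UserBll(IUserDal dal) { _dal = dal; }

    public void Update(int userId, int value)
    {
        if (value < 0) throw new ArgumentOutOfRangeException(nameof(value));
        _dal.UpdateField(userId, value); // the real DAL runs the stored procedure
    }
}

// Hand-rolled fake for the unit test: records the call instead of hitting the DB.
public class FakeUserDal : IUserDal
{
    public int? LastValue;
    public void UpdateField(int userId, int value) => LastValue = value;
}

The unit test then becomes: var fake = new FakeUserDal(); new UserBll(fake).Update(1, 5); followed by an assertion that fake.LastValue is 5. No database is touched.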
Testing data access layer - integration testing
You need to test your DAL against a real DB. You will prepare a testing DB with a test set of data, and you will write your tests to work with that data. Each test will run in its own transaction, which will be rolled back at the end, so it will not modify the initial data set.
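One common way to get that per-test rollback in NUnit is a TransactionScope that is never completed (UserDal, ReadField, and the connection string below are placeholders):

using System.Transactions;
using NUnit.Framework;

[TestFixture]
public class UserDalIntegrationTests
{
    private TransactionScope _scope;

    [SetUp]
    public void BeginTransaction() => _scope = new TransactionScope();

    [TearDown]
    public void RollBack() => _scope.Dispose(); // Complete() is never called, so everything rolls back

    [Test]
    public void Update_Writes5ToTheField()
    {
        var dal = new UserDal(@"Server=.\SQLEXPRESS;Database=TestDb;Integrated Security=true");
        dal.UpdateField(userId: 1, value: 5);
        Assert.AreEqual(5, dal.ReadField(userId: 1)); // read back inside the same transaction
    }
}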
For the scenario you describe in terms of DB interaction, mocking is useful. If you have a chance, take a look at Rhino Mocks.
You use Inversion of Control together with a mocking framework, e.g. Rhino Mocks, as someone already mentioned.
If you are not resorting to mocking and you use an actual DB in testing, then in layman's terms it is integration testing and not a unit test anymore. I worked on a project where a dedicated SQL .mdf file was kept in source control; it was attached to the database server in the [SetUp] part of an NUnit [SetUpFixture] and similarly detached in [TearDown]. This was done every time the NUnit tests were run and could be very time-consuming, depending on the SQL code you have; a large data set can make it even worse.
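A rough sketch of that attach/detach pattern (the paths, database name, and connection string are placeholders, and newer NUnit versions spell the hooks [OneTimeSetUp]/[OneTimeTearDown]):

using System.Data.SqlClient;
using NUnit.Framework;

[SetUpFixture]
public class TestDatabaseFixture
{
    private const string Master =
        @"Server=.\SQLEXPRESS;Database=master;Integrated Security=true";

    [OneTimeSetUp]
    public void AttachTestDb()
    {
        using (var conn = new SqlConnection(Master))
        {
            conn.Open();
            new SqlCommand(
                @"CREATE DATABASE TestDb ON (FILENAME = 'C:\tests\TestDb.mdf') FOR ATTACH",
                conn).ExecuteNonQuery();
        }
    }

    [OneTimeTearDown]
    public void DetachTestDb()
    {
        using (var conn = new SqlConnection(Master))
        {
            conn.Open();
            // Detach fails if connections are still open; close or kill them first in real code.
            new SqlCommand("EXEC sp_detach_db 'TestDb'", conn).ExecuteNonQuery();
        }
    }
}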
Now the catch is the maintenance overhead: you may change the DB schema during your sprint cycle, and at release the DB change script has to be run on all databases used in your development and testing, including the one used for integration testing as mentioned above. Not just that: new test data (as someone mentioned above) has to be populated for newly created tables/columns, and likewise the existing data may need to be cleaned up as well, owing to requirements changes or bug fixes.
This is a task in itself; someone competent on the team can take ownership of it, or, if time permits, you can run the change scripts as part of Continuous Integration if you have already implemented it. Still, adding and cleaning up the test data needs to be taken care of manually.