Writing integration tests to test database and web service calls - C#

We are just starting to write integration tests to test the database, data access layer, web service calls, etc.
Currently I have a few ideas about how to write integration tests, like:
1) Always recreate the tables in the initialize function.
2) Always clear the data inside a function if you are saving new data inside the same function.
But I would like to know some more good practices.

As with all testing, it is imperative to start from a known state, and upon test completion, clear down to a clean state.
Also, test code often gets overlooked as "not real code" and hence not maintained properly... it is at least as important as production code, and at least as much design should go into the architecture of your tests. Plan reasonable levels of abstraction. For example, if you are testing a web app, consider having tiers like so: an abstraction of your browser interaction, an abstraction of the components on your pages, an abstraction of your pages, and your tests. Tests interact with pages and components, pages interact with components, components interact with the browser interaction tier, and the browser interaction tier interacts with your (possibly third-party) browser automation library.
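A rough sketch of what those tiers might look like in C# (the names IBrowserDriver, LoginForm and HomePage are illustrative assumptions; any browser automation library could sit behind the lowest tier):

public interface IBrowserDriver          // browser interaction tier
{
    void NavigateTo(string url);
    void Click(string cssSelector);
    void TypeInto(string cssSelector, string text);
    string ReadText(string cssSelector);
}

public class LoginForm                   // component tier
{
    private readonly IBrowserDriver _browser;

    public LoginForm(IBrowserDriver browser) { _browser = browser; }

    public void LogIn(string user, string password)
    {
        _browser.TypeInto("#username", user);
        _browser.TypeInto("#password", password);
        _browser.Click("#login-button");
    }
}

public class HomePage                    // page tier
{
    private readonly IBrowserDriver _browser;

    public HomePage(IBrowserDriver browser)
    {
        _browser = browser;
        LoginForm = new LoginForm(browser);
    }

    public LoginForm LoginForm { get; }

    public string WelcomeMessage => _browser.ReadText("#welcome");
}

Tests then talk only to pages and components, so a change in the automation library or in the page markup is absorbed in one tier instead of rippling through every test.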
If your test code is not maintained properly or well thought out, it becomes more of a hindrance than an aid to writing new code.
If your team is new to testing, there are many coding katas out there that aim to teach the importance of good tests (and, out of this, good code). They generally focus on the unit-testing level, but many of the principles are the same.

In general, I would suggest you look into mocking your database access layer and web service call classes in order to make your code more testable. There are lots of books on this subject, like Osherove's The Art of Unit Testing.
That having been said, integration tests should be repeatable. Thus, I would choose option 1 and write a script that can recreate the database from scratch with test data. Option 2 is more difficult, because it can be hard to be sure that the cleaning function does not leave unwanted data residue.
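A minimal sketch of what such a recreate-from-scratch runner might look like (the file name, TestConnectionString and the assumption that the script contains no GO batch separators are all illustrative):

[TestFixtureSetUp]
public void RecreateDatabase()
{
    // drops, recreates and reseeds the test tables from a single script
    var script = File.ReadAllText("RecreateTestDatabase.sql");

    using (var connection = new SqlConnection(TestConnectionString))
    using (var command = new SqlCommand(script, connection))
    {
        connection.Open();
        command.ExecuteNonQuery(); // note: SqlCommand cannot execute scripts containing GO separators
    }
}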

When unit testing the DAL, I do it like this:
[TestFixtureSetUp]
public void TestFixtureSetUp()
{
    //this grabs the data from the database using an XSD file to map the schema
    //and saves it as xml (backup.xml)
    DatabaseFixtureSetUp();
}

[SetUp]
public void SetUp()
{
    //this inserts test data into the database from xml (testdata.xml)
    //it clears the tables first so you have fresh data before every test.
    DatabaseSetUp();
}

[TestFixtureTearDown]
public void TestFixtureTearDown()
{
    //this clears the tables and inserts the backup data into the database
    //to return it to the state it was in before running the tests.
    DatabaseFixtureTearDown();
}
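The DatabaseSetUp helper is not shown above; a minimal sketch of what it might do, assuming an XSD-mapped DataSet and SQL Server (the file names and connection string are illustrative):

private void DatabaseSetUp()
{
    // load the fresh test data described by the XSD schema
    var testData = new DataSet();
    testData.ReadXmlSchema("schema.xsd");
    testData.ReadXml("testdata.xml");

    using (var connection = new SqlConnection(_connectionString))
    {
        connection.Open();

        foreach (DataTable table in testData.Tables)
        {
            // clear the table first so every test starts from the same state
            // (in a real schema, delete in an order that respects foreign keys)
            using (var clear = new SqlCommand("DELETE FROM [" + table.TableName + "]", connection))
            {
                clear.ExecuteNonQuery();
            }

            // bulk-insert the test rows
            using (var bulkCopy = new SqlBulkCopy(connection) { DestinationTableName = table.TableName })
            {
                bulkCopy.WriteToServer(table);
            }
        }
    }
}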

Related

How to write unit test for service having dependency on other service or database

Sorry if I am asking a very basic question.
I have a set of web services (developed using .NET Web API). These services are either business layer or data access layer APIs, and they depend either on other services or on the database itself.
I want to write unit test cases for them. I have the following questions:
As the business layer APIs have dependencies on the data access service or some other service, if I write a unit test that just invokes the business API, it will also invoke the data access API. Is this the correct way to write a unit test case, or should I inject all dependency objects in the unit test? I think the former would be an integration test, not a unit test.
Should I write unit tests for the data access layer at all? I checked this link (Writing tests for data access code: Unit tests are waste) and it says the DAL does not require unit tests. Should I still write tests for the data access layer? I think those would be integration tests, not unit tests.
Question 1:
I would say that if you want to do TDD, then it's not the "correct" way, because, as you said, you would be performing integration tests. Then again, maybe you don't want to do TDD and integration tests are good enough for you; but to answer the question: this wouldn't be the proper way to unit-test your code.
Question 2:
I would say it depends on what you have in your data access layer. For instance, if you implement repositories, you will probably want to write a few tests.
Save method
You want to make sure that, given an entity you have retrieved from your repository, editing some of its properties and persisting the changes will actually save the modifications rather than create a new entity. Now, you might think this is an integration test, but it really depends on how well designed your code is. For instance, your repository could be just an extra layer of logic on top of a low-level ORM. In that case, when testing the save method, you assert that the right methods are called, with the right parameters, on the ORM service injected into your repository.
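As a rough sketch of that last point, a save test could look like this (IOrmSession, AccountRepository, Account and the use of Moq are illustrative assumptions, not part of the original answer):

public class Account
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public interface IOrmSession
{
    void Update(object entity);
}

public class AccountRepository
{
    private readonly IOrmSession _session;

    public AccountRepository(IOrmSession session) { _session = session; }

    public void Save(Account account) => _session.Update(account);
}

[Test]
public void Save_ExistingEntity_CallsUpdateOnTheOrm()
{
    var orm = new Mock<IOrmSession>();
    var repository = new AccountRepository(orm.Object);
    var account = new Account { Id = 42, Name = "Bob" };

    repository.Save(account);

    // the repository must update the existing entity rather than create a new one
    orm.Verify(s => s.Update(account), Times.Once());
}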
Errors and exceptions
While accessing data, it is possible to run into problems such as the connection to the database being broken, the data not being in the expected format, or deserialization failures. If you want to provide good error handling, and perhaps create custom exceptions and add more information to them depending on the context, then you definitely want to write tests to make sure the correct information is propagated.
On the other hand
If your DAL is just a few classes that wrap a non-mockable ORM and you don't have any logic in there, then perhaps you don't need tests. But that doesn't seem to happen very often; you will pretty much always have a bit of logic that can go wrong and that you want to test.

TDD on a configuration tool touching database

I am working on writing a tool which
- sets up a connection to SQL and runs a series of stored procedures
- hits the file system to verify and also delete files
- talks to other subsystems through exposed APIs
I am new to the concept of TDD but have been doing a lot of reading on it. I wanted to apply TDD to this development but I am stuck. There are a lot of interactions with external systems which need to be mocked/stubbed or faked. What I am finding difficult is the proper approach to take in doing this with TDD. Here is a sample of what I would like to accomplish:
public class MyConfigurator
{
    public static void Start()
    {
        CheckSystemIsLicenced();           // will throw if it's not licenced; calls a library owned by the company
        CleanUpFiles();                    // clean up several directories
        CheckConnectionToSql();            // ensure a connection to SQL can be made
        ConfigureSystemToolsOnDatabase();  // runs a set of stored procedures; a range of checks will throw if something goes wrong
    }
}
After this I have another class which cleans up the system if things have gone wrong. For the purposes of this question it's not that relevant, but essentially it just clears certain tables and fixes up the database so that the tool can run again from scratch to do its configuration tasks.
It almost appears that when using TDD the only tests I end up with are things like this (assuming I am using FakeItEasy):
A.CallTo(()=>fakeLicenceChecker.CheckSystemIsLicenced("lickey")).MustHaveHappened();
It just ends up being a whole lot of tests which are all basically "MustHaveHappened". Am I doing something wrong? Is there a different way to start this project using TDD? Or is this a particular scenario where TDD is perhaps not really recommended? Any guidance would be greatly appreciated.
In your example, if the arrangement of the unit test shows lickey as the input, then it is reasonable to assert that the endpoint has been called with the proper value. In more complex scenarios, the input-to-assert flow covers more subsystems so that the test itself doesn't seem as trivial. You might set up an ID value as input and test that down the line you are outputting a value for an object that is deterministically related to the input ID.
One aspect of TDD is that the code changes while the tests do not - except for functionally equivalent refactoring. So your first tests would naturally arrange and assert data at the outermost endpoints. You would start with a test that writes a real file to the filesystem, calls your code, and then checks to see that the file is deleted as expected. Of course, the file system is a messy workspace for portable testing, so you might decide early on to abstract the file system by one step. Ditto with the database by using EF and mocking your DbContext or by using a mocked repository pattern. These abstractions can be pre-TDD application architecture decisions.
Something I do frequently is to use utility code that starts with an IFileSystem interface that declares methods mimicking much of what is available in System.IO.File. In production I use an implementation of IFileSystem that just passes through to the File.XXX() methods. Then you can mock and verify the interface instead of trying to set up and clean up real files.
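A rough sketch of that idea (the exact interface shape is an assumption, not a standard API):

using System.IO;

public interface IFileSystem
{
    bool FileExists(string path);
    void DeleteFile(string path);
    string[] GetFiles(string directory, string searchPattern);
}

// Production implementation: a thin pass-through to File/Directory.
public class PhysicalFileSystem : IFileSystem
{
    public bool FileExists(string path) => File.Exists(path);
    public void DeleteFile(string path) => File.Delete(path);
    public string[] GetFiles(string directory, string searchPattern)
        => Directory.GetFiles(directory, searchPattern);
}

The configurator then takes an IFileSystem through its constructor; in tests you hand it a mock and verify DeleteFile was called, with no real files involved.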
In this particular method the only thing you can test is that the methods were called. It's ok to do what you are doing by asserting the mock classes. It's up to you to determine if this particular test is valuable or not. TDD assumes tests for everything, but I find it to be more practical to focus your testing on scenarios where it adds value. Hard for others to make that determination, but you should trust yourself to make the call in each specific scenario.
I think integration tests would give you the most bang for your buck. Use the real DB and file system.
If you have complex logic in the tool, then you may want to restructure its design to abstract out the DB and file system, and write unit tests with mocks. From the code snippet you posted, it looks like a simple script to me.

Unit Testing Database Methods

I am currently working on a C# project which makes use of an SQLite database. For the project I am required to do unit testing, but I was told that unit testing shouldn't involve external files, like database files, and that the test should instead emulate the database.
If I have a function that checks whether something exists in the database, how could this sort of method be tested with a unit test?
In general, it makes life easier if external files are avoided and everything is done in code. There is no rule which says "shouldn't", and sometimes it just makes more sense to have the external dependency - but only after you have considered how not to have it and realized what the trade-offs are.
Secondly, what Bryan said is a good option and the one I've used before.
In an application that uses a database, there will be at least one component whose responsibility is to communicate with that database. The unit test for that component could involve a mocked database, but it is perfectly valid (and often desirable) to test the component using a real database. After all, the component is supposed to encapsulate and broker communication with that database -- the unit test should test that. There are numerous strategies to perform such unit tests conveniently -- see the list of related SO questions in the sidebar for examples.
The general prescription to avoid accessing databases in unit tests applies to non-database components. Since non-database components typically outnumber database-related components by a wide margin, the vast majority of unit tests should not involve a database. Indeed, if such non-database components required a database to be tested effectively, there is likely a design problem present -- probably improper separation of concerns.
Thus, the principle that unit tests should avoid databases is generally true, but it is not an absolute rule. It is just a (strong) guideline that aids in structuring complex systems. Following the rule too rigidly makes it very difficult to adequately test "boundary" components that encapsulate external systems -- places in which bugs find it very easy to hide! So, the question that one should really be asking oneself when a unit test demands a database is this: is the component under test legitimately accessing the database directly or should it instead collaborate with another that has that responsibility?
This same reasoning applies to the use of external files and other resources in unit tests as well.
With SQLite, you could use an in-memory database. You can stage your database by inserting data and then run your tests against it.
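A minimal sketch of that approach using Microsoft.Data.Sqlite (System.Data.SQLite works much the same way; the table, the data and the method under test are illustrative):

using (var connection = new SqliteConnection("Data Source=:memory:"))
{
    connection.Open(); // the in-memory database lives only while this connection is open

    using (var create = connection.CreateCommand())
    {
        create.CommandText = "CREATE TABLE Users (Id INTEGER PRIMARY KEY, Name TEXT)";
        create.ExecuteNonQuery();
    }

    using (var insert = connection.CreateCommand())
    {
        insert.CommandText = "INSERT INTO Users (Id, Name) VALUES (1, 'Bob')";
        insert.ExecuteNonQuery();
    }

    // run the method under test against the staged database
    Assert.IsTrue(UserExists(connection, 1)); // UserExists is the hypothetical method under test
}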
Once databases get involved, the line between unit testing and integration testing always blurs. Having said that, it is always a nice and very useful kind of test to be able to put something in the database (setup), do your test, and remove it at the end (cleanup). This lets you test one part of your application end to end.
Personally I like to do this in an attribute-driven fashion, by specifying the SQL scripts to run for each test as attributes, like so:
[PreSqlExecute("SetupTestUserDetails.sql")]
[PostSqlExecute("CleanupTestUserDetails.sql")]
public void UpdateUserNameTest()
{
    // ...
}
The connection strings come from the app.config as usual, which can even be a symbolic link to the app.config in your main project.
Unfortunately this isn't a standard feature of the MS test runner that ships with Visual Studio. If you are using PostSharp as your AOP framework, this is easy to do. If not, you can still get the same functionality with the standard MS test runner by using a .NET feature called "Context Bound Objects". This lets you inject custom code into the object creation chain to do AOP-like things, as long as your objects inherit from ContextBoundObject.
I did a blog post with more details and a small, complete code sample here.
http://www.chaitanyaonline.net/2011/09/25/improving-integration-tests-in-net-by-using-attributes-to-execute-sql-scripts/
I think it is a really bad idea to have unit tests that depend on database information.
I also think it is a bad idea to use SQLite for unit tests.
You need to test your objects' protocol, so if you need something in your tests you should create it somewhere in the tests (usually in setUp).
Since it is difficult to remove persistence entirely, the popular way to do it is using SQLite, but always create what you need inside the unit tests.
Check this link, Unit Tests And Databases; I think it will be more helpful.
It's best to use a mocking framework to mimic a database; for C#, you can mock your Entity Framework context. Even the use of SQLite is an outside dependency for your code.

Integration testing in .net with nhibernate

I've been doing some honest-to-God TDD for my latest project and am loving it. Having previously done unit testing but not serious TDD, I'm finding it very helpful.
Some background on my project:
- ASP.NET front end
- NHibernate for database interaction with SQL Server 2008 Express
- C# 4.0 DLLs for domain logic and unit tests, which are written in NUnit and run through ReSharper
- TeamCity CI server running a NAnt build script
I'm now in a sort of 'alpha' release, and I'm finding that all the bugs are integration bugs, mainly because my integration testing has been manual use of the site plus some minor automated stuff (which I've stopped running). This is pretty poor given how strictly I've nurtured my test suite, and I want to rectify it.
My question is: what is the best way to do integration tests, and are there any articles I can read? I understand that testing the UI is going to be a pain in ASP.NET Web Forms (I will move to a more testable framework in future, but one step at a time), but I want to make sure my interactions with NHibernate are tested correctly.
So I need some tips on integration testing in relation to:
- NHibernate (caching, sessions, etc.)
- Test data: I've found NDbUnit - is that what I should be looking at to get my data into a good state, and is it compatible with NHibernate?
- Should I swap the database out for SQLite or something, or just set up another SQL Server DB which holds test data only?
- How can I make these tests maintainable? I had a few integration tests, but they caused me hassle and I found myself avoiding them. I think this was mainly due to me not setting up a consistent state each time.
Just some general advice would be great too. I've read TDD by Example by Kent Beck and The Art of Unit Testing by Roy Osherove and they were great for unit testing/TDD, but I would love to read a little more about integration tests and strategies for writing them (what you should test, etc.).
Concerning the database part:
- You may use it directly, along the lines of this article: Unit testing with built-in NHibernate support.
- Using SQLite to speed up the tests can also be very useful. Just be aware that in this case you're no longer targeting the real production configuration - SQLite does not support all the features that SQL Server does. This article shows a code sample for the necessary test setup to switch to SQLite; it's quite easy and straightforward (a setup sketch also follows this list).
- NDbUnit as a mechanism for preparing the test data is also a good choice, as long as you don't expect frequent schema changes in your DB (in that case it becomes quite a lot of work to maintain all the related xml files...).
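A rough sketch of the SQLite switch with FluentNHibernate (AccountMap is an illustrative mapping class; the schema has to be exported onto the open connection because the in-memory database lives only as long as that connection):

NHibernate.Cfg.Configuration configuration = null;

var sessionFactory = Fluently.Configure()
    .Database(SQLiteConfiguration.Standard.InMemory().ShowSql())
    .Mappings(m => m.FluentMappings.AddFromAssemblyOf<AccountMap>())
    .ExposeConfiguration(cfg => configuration = cfg)
    .BuildSessionFactory();

var session = sessionFactory.OpenSession();

// create the tables on the connection that the tests will use
new SchemaExport(configuration).Execute(false, true, false, session.Connection, null);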
Concerning the ASP.NET part:
- You may find a tool like Selenium helpful for UI-driven tests.
HTH!
Thomas
A bit late to the party, but this is what I've been doing for a while.
I am writing REST APIs which are consumed by our mobile apps. The mobile apps are also written in C#, so it makes sense for me to write an API wrapper (SDK).
When integration testing, I set up test cases that exercise all endpoints of the API, using the SDK. When running the tests, the API is running on my local IIS in development mode. Every time the server is started, my dev database is wiped, recreated and seeded with data for all tables, giving me a somewhat realistic scenario. I also don't have to worry about testing updates/deletes, because all it takes is a rebuild of the server project and NHibernate will drop, create and seed my database. This could be changed to happen on every request if desired.
When testing my repositories, it's important for me to know whether my queries are translatable by NHibernate, so all my repository tests use LocalDB, which is recreated for every single test case. That means every test case can set up the seed data required for its query tests to succeed.
Another thing: when seeding your database with realistic data, you also get your foreign-key setup tested for free. And since the seeder uses your domain classes, it looks good, too!
An example of a seeder:
public void Seed(ISession s)
{
    using (var tx = s.BeginTransaction())
    {
        var account1 = new Account { FirstName = "Bob", LastName = "Smith" };
        var account2 = new Account { FirstName = "John", LastName = "Doe" };

        account1.AddFriend(account2); // manipulates a friends collection

        s.Save(account1);
        tx.Commit(); // without Commit the transaction would roll back when disposed
    }
}
You should call the seeder when creating your session factory.
Important: setting this up is done with an IoC container.
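A rough sketch of that wiring (DatabaseSeeder is an illustrative class holding the Seed(ISession) method above, and configuration is the NHibernate Configuration assumed to be built already; in practice the IoC container would resolve and run this):

var sessionFactory = configuration.BuildSessionFactory();

using (var session = sessionFactory.OpenSession())
{
    // assumes the schema has already been (re)created, e.g. via SchemaExport
    new DatabaseSeeder().Seed(session);
}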
The short answer is: don't do integration tests.
We have the following setup:
Our unit tests test as little as possible. We only test one function of the business code (not directly a method, but more a logical piece of functionality);
Write a unit test for every function of your business code;
Write a unit test for interaction between different functions of your business code;
Make sure these tests cover everything.
This will give you a set of unit tests that cover everything.
We do have integration tests, but these are written up in Word documents and are often just the original specs. A QA person runs them when the code is delivered, and most of the time it just works, because we've already tested every little piece.
On InfoQ, there is a terrific presentation which explains this way of working very well: Integration Tests Are a Scam.
One thing about testing NHibernate: we have applied the repository pattern, which makes our NHibernate queries very, very simple, like:
public class NhRepository<T> : IRepository<T>
{
    public T Get(int id)
    {
        return GetSession().Get<T>(id);
    }
}

public interface IUserRepository : IRepository<User>
{
    IEnumerable<User> GetActiveUsers();
}

public class UserRepository : NhRepository<User>, IUserRepository
{
    public IEnumerable<User> GetActiveUsers()
    {
        return
            from user in GetSession().Users
            where user.IsActive
            select user;
    }
}
This, in combination with the Windsor IoC container, provides our data access. The thing about this setup is that:
The queries are incredibly simple (98% of them, anyway) and we don't test them thoroughly. That may sound strange, but we tend to test these more through peer review than through any other mechanism;
These repositories can be easily mocked. This means that for the above repository, we have a mock that does something like this:
var repositoryMock = mocks.StrictMock<IUserRepository>();
var activeUsers = new List<User>();

for (int i = 0; i < 10; i++)
{
    var user = UserMocks.CreateUser();
    user.IsActive = true;
    activeUsers.Add(user);
}

Expect.Call(repositoryMock.GetActiveUsers()).Return(activeUsers);
The actual code is a lot more concise, but you get the idea.

Unit testing rules

My question may seem really stupid to some of you, but I have to ask.. Sorry..
I don't really understand the principles of unit testing. How can you test your business classes or data access layer without modifying your database?
Let me explain: I have a feature which updates a field in a database.. Nothing so amazing..
The business layer class is instantiated, and the method BLL.Update() performs some checks and finally instantiates a DAL class which launches a stored procedure in the database with the correct parameters.
It works, but my question is..
To write unit tests that exercise the DAL class, I have to affect the database in the tests! To test, for example, that the value 5 is correctly passed to the database, I have to actually do it, and the database field will be 5 after the test!
I know that normally the system should not be impacted by tests, so I don't understand how you can run tests without executing the methods..
Thanks for the answers, and excuse my poor English..
I will divide your question into several sub questions because it is hard to answer them together.
Unit testing vs. integration testing
When you write a unit test you are testing a single unit. That means you are testing a single execution path in the tested method. You need to avoid testing its dependencies, like the database you mentioned. You usually write a simple unit test for each execution path so that your tests give you good code coverage.
When you write an integration test you are testing all layers to see whether integration and configuration work. You usually don't write an integration test for each execution path because there are too many combinations across all layers.
Testing business classes - unit testing
You need to test your business classes without any dependency on the DAL and the DB. To do that you have to design your BL class so that those dependencies are injected from the outside. First define an abstract class or interface for the DAL and pass that DAL interface as a constructor parameter (another way is to expose a settable property on the BL class). When you test your BL class you will use another implementation of the DAL interface, one which is not dependent on the DB. There are well-known test patterns (Mock, Stub and Fake) which define how to create and use those dummy implementations. Mocking is also supported by many testing frameworks.
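A minimal sketch of that constructor injection, using a hand-written fake for the DAL (all names here are illustrative, not from the original answer):

public interface IOrderDal
{
    void UpdateStatus(int orderId, int status);
}

public class OrderService
{
    private readonly IOrderDal _dal;

    public OrderService(IOrderDal dal) { _dal = dal; }

    public void Update(int orderId, int status)
    {
        if (status < 0) throw new ArgumentOutOfRangeException("status");
        _dal.UpdateStatus(orderId, status); // the real implementation would call the stored procedure
    }
}

// The fake records the call instead of hitting the database.
public class FakeOrderDal : IOrderDal
{
    public int? LastStatus;
    public void UpdateStatus(int orderId, int status) { LastStatus = status; }
}

[Test]
public void Update_PassesValueToDal()
{
    var fakeDal = new FakeOrderDal();
    var service = new OrderService(fakeDal);

    service.Update(7, 5);

    Assert.AreEqual(5, fakeDal.LastStatus); // the value 5 reached the DAL, no database touched
}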
Testing data access layer - integration testing
You need to test your DAL against a real DB. You will prepare a test DB with a test data set and write your tests to work with that data. Each test runs in its own transaction, which is rolled back at the end, so it does not modify the initial data set.
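One common way to get that rollback-per-test behaviour is a TransactionScope opened in SetUp and disposed without Complete() in TearDown (a sketch, assuming NUnit and System.Transactions):

private TransactionScope _scope;

[SetUp]
public void OpenTransaction()
{
    _scope = new TransactionScope(); // connections opened inside the test enlist in this transaction
}

[TearDown]
public void RollbackTransaction()
{
    _scope.Dispose(); // disposing without calling Complete() rolls back every change the test made
}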
Best regards, Ladislav
For the scenario you describe in terms of DB interaction, mocking is useful. If you have a chance, take a look at Rhino Mocks.
You use Inversion of Control together with a mocking framework, e.g. Rhino Mocks, as someone already mentioned.
If you are not resorting to mocking and are using an actual DB in testing, then in layman's terms it is integration testing and not a unit test anymore. I worked on a project where a dedicated SQL .mdf file was kept in source control; it was attached to the database server by NUnit in the [SetUp] part of a [SetUpFixture] and similarly detached in [TearDown]. This was done every time the NUnit tests were run and could be very time consuming, depending on the SQL code you have, and a large data set makes it worse.
Now the catch is the maintenance overhead: you may change the DB schema during your sprint cycle, and at release the DB change script has to be run on all databases used in your development and testing, including the one used for integration testing as mentioned above. Not just that: new test data (as someone mentioned above) has to be populated for new tables/columns, and likewise the existing data may need to be cleansed owing to requirement changes or bug fixes.
This seems to be a task in itself; someone competent on the team can take ownership of it, or if time permits you can integrate the execution of the change scripts into your Continuous Integration setup if you have one. Still, adding and cleansing the test data needs to be taken care of manually.
