I am unit testing some CRUD operations. My questions are:
1) I need to test the Add, Get, and Delete methods, and the persistence layer is a database. Since I need a test object to Get and Delete, should I combine all three of these into one [TestMethod], or separate them into three methods and re-add the object before the Get and Delete tests?
Ideally you should have individual tests for each case.
You should use some sort of mocking - either via a framework or by setting up the database yourself - to set the initial conditions for each test.
So, to test add, you would start with a blank database, then add some new data, try to add the same data again (it should fail), add incomplete data, and so on.
Then to test get and delete you would start with a pre-populated database and perform the various tests you need.
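To make that concrete, here is a minimal NUnit sketch of what separate tests might look like; CustomerRepository, Customer, and the TestDatabase helpers are hypothetical stand-ins for your own code:

using NUnit.Framework;

[TestFixture]
public class CustomerRepositoryTests
{
    private CustomerRepository _repository;

    [SetUp]
    public void SetUp()
    {
        TestDatabase.Clear(); // every test starts from a blank slate
        _repository = new CustomerRepository(TestDatabase.ConnectionString);
    }

    [Test]
    public void Add_NewCustomer_PersistsIt()
    {
        _repository.Add(new Customer { Id = 1, Name = "Alice" });
        Assert.IsNotNull(_repository.Get(1));
    }

    [Test]
    public void Get_ExistingCustomer_ReturnsIt()
    {
        // Seed directly rather than through Add, so a broken Add
        // doesn't disguise itself as a Get failure.
        TestDatabase.Insert(new Customer { Id = 1, Name = "Alice" });
        Assert.AreEqual("Alice", _repository.Get(1).Name);
    }

    [Test]
    public void Delete_ExistingCustomer_RemovesIt()
    {
        TestDatabase.Insert(new Customer { Id = 1, Name = "Alice" });
        _repository.Delete(1);
        Assert.IsNull(_repository.Get(1));
    }
}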
I'd generally make a separate test. If I'm testing a "get" type method, the test setup would insert the object I expect to get (generally by way of some mock framework) as necessary; it just wouldn't be asserted against in the same way the result of the actual get would.
This does mean that if the add implementation breaks, both the tests for the get and the add will fail in one way or another, assuming proper coverage. But that's kind of what you want anyhow, right?
If you are writing your own ORM to handle CRUD, I suggest you separate each action into a different test. Don't create big tests that have many points of failure and many reasons to change, because they will make your test project hard to maintain. Test each feature separately.
Now, if you are using some third-party ORM to deal with CRUD, you should not test the tool at all, unless you don't trust it. But in that case, you should find a better alternative. :)
You can do some Acceptance Tests to check if everything is working and, at this time, you will really reach the database.
Whatever makes testing easier for you :) as long as you get a result stating which method passed or failed, it should be good.
Hello :) I'd separate them and test individually, setting up cleanly before and cleaning up after each.
Does anything described below exist?
Hi, I'm a C# and JavaScript programmer. When creating tests, the pain point for me is the creation of the test dependencies, especially when I am making assertions against values that I expect in the database.
I know that writing tests that call the database is considered bad practice, since many database calls can slow down the entire test suite. The alternative is that we as developers must create large, sometimes complicated mock objects that contain the values the database would otherwise be returning.
Instead, I would like to create my tests against an actual database. Then I would like my test runner or testing framework to make note of the object returned from the database, and to replace the dependency on the database with an automatically created stub object for all subsequent runs of that test.
Essentially the database would only get hit the very first time a test is run and from that point forward it would instead use that data it retrieved from the first pass of the test as the stub or mock object.
This would entirely mitigate the need to ever manually create an object for the purpose of testing.
You could use AutoFixture to create the data.
It does have some support for data annotations, but you'd probably still need to tweak it extensively to fit your particular database schema.
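For example, a rough sketch (Customer is a stand-in for one of your own entities; in older AutoFixture versions the namespace is Ploeh.AutoFixture):

using AutoFixture;

var fixture = new Fixture();

// One anonymous instance with all public properties populated.
var customer = fixture.Create<Customer>();

// Several at once, e.g. to seed a list or a test table.
var customers = fixture.CreateMany<Customer>(10);

// Pin only the values you actually assert against; leave the rest anonymous.
var named = fixture.Build<Customer>()
                   .With(c => c.Name, "Alice")
                   .Create();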
I've been writing some integration tests recently against ASP.NET MVC controller actions and have been frustrated by the difficulty of setting up the test data that needs to be present in order to run the test.
For example, I want to test the "add", "edit" and "delete" actions of a controller. I can write the "add" test fine, but then find that to write the "edit" test I am either going to have to call the code of the "add" test to create a record so that I can edit it, or do a lot of setup in the test class, neither of which is particularly appealing.
Ideally I want to use or develop an integration test framework to make it easier to add seed data in a reusable way for integration tests so that the arrange aspect of an arrange/act/assert test can focus on arranging what I specifically need to arrange for my test rather than concerning itself with arranging a load of reference data only indirectly related to the code under the test.
I happen to be using NHibernate, but I believe any data-seeding functionality should be oblivious to that and be able to manipulate the database directly; the ORM may change, but I will always be using a SQL database.
I'm using NUnit, so I envisage hooking into the test/testfixture setup/teardown (but I think a good solution would be transferable to other test frameworks).
I'm using FluentMigrator in my main project to manage schema and seeding of reference data so it would be nice, but not essential to be able to use the FluentMigrator framework for a consistent approach across the solution.
So my question is, "How do you seed your database data for integration testing in C#?" Do you execute the SQL directly? Do you use a framework?
You can run your integration tests on SQL Server Compact: you get a .sdf file and can connect to it by giving the file's path in the connection string. That is faster and easier to set up and work with.
Your integration tests probably won't need millions of rows of data. You can insert your test data into the database and save it as TestDbOriginal.sdf.
When you run your tests, just make a copy of TestDbOriginal.sdf and work on that copy, which is already seeded with data. If you want to test a specific scenario, prepare the data by calling methods like add, remove, and edit.
When you move on to production or performance testing, switch back to your original server version, be it SQL Server 2008 or whatever.
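A minimal sketch of that copy step, assuming the file names above and the System.Data.SqlServerCe provider:

using System.IO;
using System.Data.SqlServerCe;

// Copy the pre-seeded database so each run starts from a known state.
File.Copy("TestDbOriginal.sdf", "TestDbWorking.sdf", overwrite: true);

using (var connection = new SqlCeConnection("Data Source=TestDbWorking.sdf"))
{
    connection.Open();
    // ... run the code under test against this connection ...
}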
I don't know if it's necessarily the 'right' thing to do, but I've always seeded using my add/create method(s).
I'm trying to avoid using an in-memory database for testing (though I might have to if the following is impossible). I'm using NHibernate 3.0 with LINQ. I'd like to be able to mock session.Query<T>() to return some dummy values, but I can't, since it's an extension method and those are pretty much impossible to mock.
Does anyone have any suggestions (other than using an in memory database) for testing session queries with LINQ?
I've tried this before with previous versions of NH without much luck. I eventually used another class to wrap the query and mocked that instead.
I do think it's also worth writing an integration test against a real SQL Server, to make sure that the repository behaves as expected.
A better approach is to mock the concept of what you are trying to do, not the inner API of an external system.
For instance:
Write the query in a separate artifact, like IQuerySomething / QuerySomething.
Test your query against a database. Try to make this test database as close to the real one as possible.
When testing something that depends on IQuerySomething, mock IQuerySomething.
Fabio Maulo wrote about this pattern as EQO (Enhanced Query Object); I recommend his post.
This is the approach we use in .NET for almost everything.
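A rough sketch of the idea, with illustrative names (the real pattern is described in Fabio Maulo's post):

using System.Collections.Generic;
using System.Linq;
using NHibernate;
using NHibernate.Linq;

// The concept under test, expressed as an interface the rest of the
// code depends on. Consumers never see ISession or Query<T>().
public interface ICustomersByNameQuery
{
    IList<Customer> Execute(string name);
}

// The NHibernate implementation, covered by integration tests against
// a database as close to the real one as possible.
public class CustomersByNameQuery : ICustomersByNameQuery
{
    private readonly ISession _session;

    public CustomersByNameQuery(ISession session)
    {
        _session = session;
    }

    public IList<Customer> Execute(string name)
    {
        return _session.Query<Customer>()
                       .Where(c => c.Name == name)
                       .ToList();
    }
}

// Anything that depends on the query mocks ICustomersByNameQuery;
// no extension-method mocking required.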
It looks like you are going to overcomplicate things. I will try to save you some time =)
First of all, let's note that there are two types of testing for the typical project (I am sure you know this, but it is better to mention): integration tests and unit tests. And typically (I will assume that you have a typical application, so as not to add "typically" to every sentence) you need both of them.
Integration tests run against a real database, and some of them against an in-memory one for better test performance.
So you probably have mappings in your application and want to test them. This is better done with integration tests against a real DB, and if you are using Fluent NHibernate (if you aren't, it is better to start) this will be a piece of cake.
Then you probably have some kind of repository or other data access layer (where you are using LINQ) that you want to test too. And you probably want tests like:
When I submit a get-customer-by-name query, my data access component should return a customer with the specified name.
This is better achieved using an in-memory database, because it is cheaper. It will save you some time in the typical scenario.
But if you have a lot of complex queries, then I would agree with José F. Romaniello that it is better to use an Enhanced Query Object and test it separately.
You can also look at the Sharp Architecture framework, which targets a lot of the issues around using NHibernate and testing the persistence layer.
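For the mapping tests mentioned above, Fluent NHibernate ships a PersistenceSpecification helper. A minimal sketch, assuming an open ISession and a Customer entity of your own:

using FluentNHibernate.Testing;
using NUnit.Framework;

[Test]
public void Customer_mapping_round_trips()
{
    // Saves a Customer, reloads it in a fresh session, and compares
    // each checked property to verify the mapping both ways.
    new PersistenceSpecification<Customer>(session)
        .CheckProperty(c => c.Name, "Alice")
        .CheckProperty(c => c.Email, "alice@example.com")
        .VerifyTheMappings();
}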
I'm writing unit-tests for an app that uses a database, and I'd like to be able to run the app against some sample/test data - but I'm not sure of the best way to setup the initial test data for the tests.
What I'm looking for is a means to run the code-under-test against the same database (or schematically identical) that I currently use while debugging - and before each test, I'd like to ensure that the database is reset to a clean slate prior to inserting the test data.
I realize that using an IRepository pattern would allow me to remove the complexity of testing against an actual database, but I'm not sure that will be possible in my case.
Any suggestions or articles that could point me in the right direction?
Thanks!
--EDIT--
Thanks everyone, those are some great suggestions! I'll probably go the route of mocking my data access layer, combined with some simple set-up classes to generate exactly the data I need per test.
Here's the general approach I try to use. I conceive of tests at three or four levels: unit tests, interaction tests, integration tests, and acceptance tests.
At the unit test level, it's just code. Any database interaction is mocked out, either manually or using one of the popular frameworks, so loading data is not an issue. They run quickly, and make sure the objects work as expected. This allows for very quick write-test/write-code/run-test cycles. The mock objects serve up the data that is needed by each test.
Interaction tests cover non-trivial interactions between classes. Again, no database required; it's mocked out.
Now at the integration level, I'm testing integration of components, and that's where real databases, queues, services, yada yada, get thrown in. If I can, I'll use one of the popular in-memory databases, so initialization is not an issue. It always starts off empty, and I use utility classes to scrub the database and load exactly the data I want before each test, so that there's no coupling between the tests.
The problem I've hit using in-memory databases is that they often don't support all the features I need. For example, perhaps I require an outer join, and the in-memory DB doesn't support that. In that case, I'll typically test against a local conventional database such as MySQL, again, scrubbing it before each test. Since the app is deployed to production in a separate environment, that data is untouched by the testing cycle.
The best way I've found to handle this is to use a static test database with known data, and to use transactions to ensure that your tests don't change anything.
In your test setup you start a transaction, and in your test cleanup you roll the transaction back. This lets you modify data in your tests while making sure everything is restored to its original state when the test completes.
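With NUnit and System.Transactions, that setup/cleanup pair can be as simple as this sketch:

using System.Transactions;
using NUnit.Framework;

[TestFixture]
public class OrderRepositoryTests
{
    private TransactionScope _scope;

    [SetUp]
    public void SetUp()
    {
        // Everything the test does against the database is enlisted
        // in this ambient transaction.
        _scope = new TransactionScope();
    }

    [TearDown]
    public void TearDown()
    {
        // Disposing without calling Complete() rolls the transaction
        // back, so the static test data is untouched for the next test.
        _scope.Dispose();
    }

    // ... tests that insert, update, and delete freely ...
}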
I know you're using C#, but in the Java world there's the Spring framework. It allows you to run database manipulations in a transaction and, after the test, roll that transaction back. This means that you operate against a real database without touching its state after the test finishes. Perhaps this could be a hint for further investigation in C#.
Mocking is of course the best way to unit test your code.
As far as integration tests go, I have had some issues using in-memory databases like SQLite, mainly because of small differences in behaviour and/or syntax.
I have been using a local instance of MySQL for integration tests in several projects. A recurring problem is the server setup and the creation of test data.
I have created a small NuGet package called MySql.Server (see more at https://github.com/stumpdk/MySql.Server) that simply sets up a local instance of MySQL every time you run your tests.
With this instance running, you can easily set up table structures and sample data for your tests without being concerned about either your production environment or your local server setup.
I don't think there is an easy way around this. You just have to create pre-test SQL setup scripts and post-test teardown scripts, and then trigger those scripts for each run. A lot of people suggest SQLite for unit test setup.
I found it best to have my tests go to a different db so I could wipe it clean and put in the data I wanted for the test.
You may want to make the database configurable within the program, so that your tests can tell the classes to switch over to the test database.
This code clears all data from all user tables in MS SQL Server:

private DateTime _timeout;

public void ClearDatabase(SqlConnection connection)
{
    _timeout = DateTime.Now + TimeSpan.FromSeconds(30);
    do
    {
        using (SqlCommand command = connection.CreateCommand())
        {
            // sp_MSforeachtable visits tables in arbitrary order, so DELETE
            // can fail on foreign-key constraints; retrying lets the remaining
            // rows drain once the rows referencing them are gone.
            command.CommandText = "exec sp_MSforeachtable 'DELETE FROM ?'";
            try
            {
                command.ExecuteNonQuery();
                return; // every table was emptied successfully
            }
            catch (SqlException)
            {
                // ignore and retry until the timeout expires
            }
        }
    } while (!TimeOut());
    Assert.Fail("Failed to clear DB within the 30-second timeout");
}

private bool TimeOut()
{
    return DateTime.Now > _timeout;
}
If you are thinking about real database usage, then most likely we're talking about integration tests here, i.e. tests that check the app's behavior as a composition of different components, contrary to unit tests, where components are supposed to be tested in isolation.
Having the testing scope defined, I wouldn't recommend using things like in-memory databases or mocking libraries as the other authors suggested. The problem is that in-memory databases usually have slightly different behavior or a reduced feature set, and with mocking there is no database at all, so you'll be testing some other application in the general sense, not the one you'll be delivering to your customers.
I'd rather suggest minimizing the number of integration tests by covering just the crucial parts of your logic and leaving the rest to unit tests, while using a real database with a setup as close to the production one as possible. Test runs can become too slow and a real pain if there are a lot of integration tests.
Also, you might use some tricks to optimize the speed of your test execution:
Split tests into Read and Write with regard to the data mutations they introduce, and run the former in parallel and without any cleanup (e.g. HTTP GET requests are safe to run in parallel if the system under test is a web app and the tests are more like end-to-end);
Use a single insert/delete script for all the data and optimize it as much as possible. You might find the Reseed library I'm currently developing helpful, as it's able to generate both insert and delete scripts for you, so basically what you asked for. Or check out Respawn, which can be used for database cleanup (see the sketch after this list);
Use database snapshots for the restore, which might be faster than a full insert/delete cycle;
Wrap each test in a transaction and revert it afterwards (this one is also not 100% faithful and somewhat fragile);
Parallelize your tests by using a pool of databases instead of a single one. Docker and TestContainers could be suitable here.
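As an example of the cleanup-script route, here is a sketch using Respawn's classic Checkpoint API (the exact API differs between Respawn versions, and the ignored table name is illustrative):

using System.Threading.Tasks;
using Respawn;

public static class TestDatabase
{
    // One shared checkpoint; tables that must survive a reset
    // (e.g. migration history) can be excluded.
    private static readonly Checkpoint Checkpoint = new Checkpoint
    {
        TablesToIgnore = new[] { "__MigrationHistory" }
    };

    // Call before each integration test instead of re-creating the database.
    public static Task ResetAsync(string connectionString)
    {
        return Checkpoint.Reset(connectionString);
    }
}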
When there are a number of people working on a project, all of whom could alter the database schema, what's the simplest way to unit test / test / verify it? The main suggestion we've had so far is to write tests for each table to verify column names, constraints, etc.
Has anyone else done anything similar / simpler? We're using C# with SQL Server, if that makes any real difference.
Updates:
The segment of the project we're working on uses SSIS packages to do the bulk of the work, so there is very little C# code to write unit tests against.
The code for creating tables / stored procedures is spread across SQL files. Because of the build system, we could maintain a separate VS DB project file as well, but I'm not sure how that would help us verify the schema either.
One possible answer is to use Visual Studio for Database developers and keep your schema in source control with the rest of your code. This allows you to see differences, and you get a history of who changed what.
Alternatively you could use a tool like SQLCompare to see what has been modified in one database compared to another.
Your (relational) database does two things as far as I'm concerned: 1) Hold data and 2) Hold relations between data.
Holding data is not a behavior, so you would not test it.
And for ensuring relations just use constraints. Lots of constraints. All over the place.
That is an interesting question! There are lots of tools out there for testing stored procedures but not for testing the database schema.
Don't you find that the unit tests written for code generally find any problems with the database schema?
One approach I have used is to write stored procedures to copy test data from the developer's schema to a test schema. This is pretty rough and ready as the stored procedures generally crash when they come across any differences between the schemas but it does alert you to any changes you haven't been told about.
And nominate someone to be the DBA who monitors changes to the schema?
I've had to do this type of thing before, although not in C#. To begin with, I built a schema migration tool based on the discussion at Ode to Code (page 1 of 5) (there are also existing tools that do similar things). Importantly, the migration tool I built allowed you to specify both the database you were applying the changes to and which version you wanted to apply. Then, following a test-first methodology, whenever I needed to make a schema change I would write a test script that would create a test database, apply version changes up to the one before my target change script, add some data, apply the change script under test, and confirm that the data was in the expected state.
My main goal with this was to confirm that no data was lost or corrupted during schema migrations, not to check specifically that the schema was in a particular state. A good awareness of your production data set is required, so you can write representative sample data for the tests.
It's debatable if this should be considered unit testing or integration testing. I would tend to consider it integration testing, based on the fact that I don't want to run old tests every time I iterate my code. Whatever you want to call it, I found it to be a useful tool for that situation.
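In outline, each of those test scripts looked roughly like this; Migrator, TestDatabase, and the version numbers are placeholders for whatever migration tooling you use:

using NUnit.Framework;

[Test]
public void Migration_42_preserves_existing_orders()
{
    var db = TestDatabase.CreateEmpty();

    // Bring the schema up to the version just before the change under test.
    Migrator.MigrateTo(db, version: 41);

    // Insert representative sample data shaped like production.
    db.Execute("INSERT INTO Orders (Id, Total) VALUES (1, 99.95)");

    // Apply the change script under test.
    Migrator.MigrateTo(db, version: 42);

    // Confirm that no data was lost or corrupted.
    Assert.AreEqual(99.95m, db.QuerySingle<decimal>("SELECT Total FROM Orders WHERE Id = 1"));
}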
This is an old question, but it appears that people are still landing here. The best tool I have found so far is "SQL Test" by Red Gate. It allows you to create scripts that run as transactions, letting you run "sandboxed" queries to check the state of the database.
This does not really fit the unit test paradigm. I would suggest version-controlling the schema and limiting write access to a single qualified team member, such as the DBA or team lead, who can validate any requested changes against the entire application. Schema changes should not be made haphazardly.
"Don't you find that the unit tests written for code generally find any problems with the database schema?"
This assumes, of course, that your tests test everything.