Unit test code that interacts with a database without creating data in the database - C#

I want to test my C# code in Visual Studio with unit tests.
The code interacts with a MySQL database.
An example of what I want to test:
A user opens my application and is saved in the database via a web service. I want to verify that the Users table has one more row than it did before the new user was added. But if I test the web service method that creates the record, test data ends up in the database, and I don't want test data in my database.
Is there a solution for this problem?

I think what you're looking for is mocking. This allows you to simulate your data access code in order to test your business logic.
There are lots of good frameworks out there, including Moq and Rhino Mocks.
If you want to actually populate your database in order to test your data access layer, then you're looking at more of an integration test. I've written quite a thorough answer covering my approach to db integration tests here.

What you are looking for is a mock; for example, Moq for C#.
Wrap your database logic in an interface and mock it in your unit test. That way you can interact with the mocked database logic, without really executing it.
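As a minimal sketch of that idea (the IUserRepository interface, UserService class, and method names here are hypothetical, not from the question), wrapping the database access in an interface lets you verify the interaction with Moq without ever touching a real database:

using Moq;
using NUnit.Framework;

// Hypothetical abstraction over the database logic.
public interface IUserRepository
{
    void AddUser(string name);
    int CountUsers();
}

// The business logic depends only on the interface.
public class UserService
{
    private readonly IUserRepository _repository;
    public UserService(IUserRepository repository) => _repository = repository;

    public void RegisterUser(string name) => _repository.AddUser(name);
}

[Test]
public void RegisterUser_AddsUserToRepository()
{
    // The mock stands in for the real MySQL-backed implementation.
    var repositoryMock = new Mock<IUserRepository>();
    var service = new UserService(repositoryMock.Object);

    service.RegisterUser("alice");

    // Verify the interaction instead of counting rows in a real table.
    repositoryMock.Verify(r => r.AddUser("alice"), Times.Once());
}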

Mocking is a good way to do it.
If you want a more integration-test-like approach (which can be helpful when you want to test actual database interaction), you can use a locally running MySQL server instance.
I've created a project that lets you run tests on a live MySQL server without setting it up.
You can download it as a NuGet package (Mysql.Server) or check it out here: https://github.com/stumpdk/MySql.Server

You can also look at the Microsoft Fakes framework.
https://learn.microsoft.com/en-us/visualstudio/test/isolating-code-under-test-with-microsoft-fakes?view=vs-2017

Related

Any Automocking frameworks that generate from actual data in database

Does anything described below exist?
Hi, I'm a C# and JavaScript programmer. When creating tests, the pain point for me is the creation of the test dependencies, especially when I am making assertions against values that I expect in the database.
I know that writing tests that make calls to the database is bad practice, since many database calls can slow down the entire test suite. The alternative is that we as developers must create these large, sometimes complicated mock objects that contain the values the database would otherwise be returning.
Instead I would like to create my tests against an actual database. Then I would like for my test running application or testing framework to make note of the object returned from the database. The testing framework would replace the dependency on the database with an automatically created stub object for all subsequent runs of this test.
Essentially the database would only get hit the very first time a test is run and from that point forward it would instead use that data it retrieved from the first pass of the test as the stub or mock object.
This would entirely mitigate the need to ever manually create an object for the purpose of testing.
You could use AutoFixture to create the data.
It does have some support for data annotations, but you'd probably still need to tweak it extensively to fit your particular database schema.
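For instance, a minimal sketch of what AutoFixture generation looks like (the User type and its properties are made up for illustration):

using AutoFixture;
using NUnit.Framework;

// Hypothetical entity matching a row in the Users table.
public class User
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
}

[Test]
public void CreatesTestDataWithoutHittingTheDatabase()
{
    var fixture = new Fixture();

    // All properties are filled with generated values.
    var user = fixture.Create<User>();

    // Override specific members where your schema has constraints.
    var namedUser = fixture.Build<User>()
        .With(u => u.Name, "alice")
        .Create();

    Assert.That(namedUser.Name, Is.EqualTo("alice"));
}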

Is there any reason to use Mock objects when unit testing an Entity Framework DAL?

From what I understand, Mocking is used to remove dependency of another service so that you can properly test the execution of business logic without having to worry about those other services working or not.
However, if what you are testing IS that particular service (i.e. Entity Framework), Implementation-style unit tests against a preset test database are really the only tests that will tell you anything useful.
Am I missing anything? Does mocking have any place in my testing of an Entity Framework DAL?
You are correct in your assertion about mocking: "Mocking is used to remove dependency of another service so that you can properly test the execution of business logic without having to worry about those other services working or not."
In my words: the idea behind unit testing is to test a single code path through a single method. When that method hands execution over to another object there is a dependency. When control passes to an unmocked dependency you are no longer unit testing, but are instead integration testing.
Data access layer testing is typically an integration test. As you speculate, you can utilize a predictable data set to ensure your DAL is returning results as expected. I would not expect a DAL to have any dependencies that would require mocking. You're testing that the values returned by your DAL match what you would expect given your dataset.
All of the above said it is not your responsibility to test the Entity Framework. If you find yourself testing the way EF works and are not creating tests about your specific DAL implementation then you are writing the wrong tests. Put another way: you test your code, let someone else test theirs.
Finally, three years ago I asked a similar question which elicited answers which greatly improved my understanding of testing. While not identical to your question I'd recommend reading through the responses.
In my opinion, mocking objects has nothing to do with the layer you are about to test. You have a component that you want to test and it has dependencies (that you can mock). So go ahead and mock.
One can assume EF works. You would test your code that interacts with EF. In this case, you fake EF in order to test your interacting code.
You basically shouldn't be testing EF. Leave that to Microsoft.
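As an illustration of faking EF to test your interacting code (a sketch assuming EF6 with Moq; MyDbContext, User, and the test itself are hypothetical), the usual pattern is to expose the DbSet as virtual and back a mocked set with an in-memory list:

using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;
using Moq;
using NUnit.Framework;

// Hypothetical entity and context; Users must be virtual so Moq can override it.
public class User { public string Name { get; set; } }

public class MyDbContext : DbContext
{
    public virtual DbSet<User> Users { get; set; }
}

[Test]
public void QueryRunsAgainstInMemoryData()
{
    var data = new List<User>
    {
        new User { Name = "alice" },
        new User { Name = "bob" },
    }.AsQueryable();

    // Route the LINQ plumbing of the mocked DbSet to the in-memory list.
    var mockSet = new Mock<DbSet<User>>();
    mockSet.As<IQueryable<User>>().Setup(m => m.Provider).Returns(data.Provider);
    mockSet.As<IQueryable<User>>().Setup(m => m.Expression).Returns(data.Expression);
    mockSet.As<IQueryable<User>>().Setup(m => m.ElementType).Returns(data.ElementType);
    mockSet.As<IQueryable<User>>().Setup(m => m.GetEnumerator()).Returns(data.GetEnumerator());

    var mockContext = new Mock<MyDbContext>();
    mockContext.Setup(c => c.Users).Returns(mockSet.Object);

    // The code under test sees two users without any database round trip.
    Assert.That(mockContext.Object.Users.Count(), Is.EqualTo(2));
}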
One thing you might consider is that EF can work with a local file and not your actual repository and you can switch between the two with connection strings. This example would create the .sdf file in an AppData folder or in the bin folder if it's a console application.
<connectionStrings>
  <add name="SecurityContext"
       connectionString="Data Source=|DataDirectory|YourDBContext.sdf"
       providerName="System.Data.SqlServerCe.4.0" />
</connectionStrings>
I like this when I'm starting a project or testing. You can load the DB with data and such and presto: EF has a mocked DB for you to run tests against without touching production data.

How to seed database data for integration testing in C#?

I've been writing some integration tests recently against ASP.Net MVC controller actions and have been frustrated by the difficulty in setting up the test data that needs to be present in order to run the test.
For example, I want to test the "add", "edit" and "delete" actions of a controller. I can write the "add" test fine, but then find that to write the "edit" test I am either going to have to call the code of the "add" test to create a record so that I can edit it, or do a lot of setup in the test class, neither of which is particularly appealing.
Ideally I want to use or develop an integration test framework to make it easier to add seed data in a reusable way for integration tests so that the arrange aspect of an arrange/act/assert test can focus on arranging what I specifically need to arrange for my test rather than concerning itself with arranging a load of reference data only indirectly related to the code under the test.
I happen to be using NHibernate, but I believe any data seeding functionality should be oblivious to that and be able to manipulate the database directly; the ORM may change, but I will always be using a SQL database.
I'm using NUnit, so I envisage hooking into the test/testfixture setup/teardown (but I think a good solution would potentially be transferable to other test frameworks).
I'm using FluentMigrator in my main project to manage schema and seeding of reference data so it would be nice, but not essential to be able to use the FluentMigrator framework for a consistent approach across the solution.
So my question is, "How do you seed your database data for integration testing in C#?" Do you execute the SQL directly? Do you use a framework?
You can do your integration testing on SQL Server Compact: you will have an .sdf file and can connect to it by giving the file's path as the connection string. That would be faster and easier to set up and work with.
Your integration tests probably won't need millions of rows of data. You can insert your test data into the database and save it as TestDbOriginal.sdf.
When you run your tests, just make a copy of 'TestDbOriginal.sdf' and work on that copy, which is already seeded with data. If you want to test a specific scenario, you will need to prepare your data by calling methods like add, remove or edit.
When you go to production or performance testing, switch back to your original server version, be it SQL Server 2008 or whatever.
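A minimal sketch of that copy step in an NUnit fixture (the file names follow the answer's example; the fixture and paths are assumptions):

using System.IO;
using NUnit.Framework;

[TestFixture]
public class UserRepositoryTests
{
    private const string SeededDb = "TestDbOriginal.sdf";  // pre-seeded master copy
    private const string WorkingDb = "TestDbWorking.sdf";  // per-test scratch copy

    [SetUp]
    public void CopySeededDatabase()
    {
        // Each test starts from the same known data; changes only hit the copy.
        File.Copy(SeededDb, WorkingDb, overwrite: true);
    }

    [Test]
    public void AddUser_IncreasesRowCount()
    {
        // Connect with "Data Source=TestDbWorking.sdf" and exercise the code under test.
    }
}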
I don't know if it's necessarily the 'right' thing to do, but I've always seeded using my add/create method(s).

Unit testing, project with many database calls

I have a question on unit testing. Currently I have a large project that calls many stored procedures and does not return anything for most methods. Really it is a large wrapper around many SQL calls. There is not a lot of logic, as it is all held in the stored procedures; it also has sections of inline SQL.
I need to unit test this C# project, but it is becoming clear that the unit tests would be pointless, as they would call many stored procedures which would all be mocked. I am worried I am thinking about this incorrectly.
My question is that has anyone had this problem and what did they do? Should I be doing database unit tests instead, any insight would be a great help.
Thanks.
A unit test should not touch a data access layer as that would be an integration test/system test. What you can test is that your project in fact calls your data access layer. Doing this will give you peace of mind that during refactors that clicking a button does always call the data access layer.
//Arrange
var dataAccessMock = new Mock<IDataAccessMock>();
dataAccessMock.Setup(da => da.ExecuteSomething());
IYourApplication app = new YourApplication(dataAccessMock.Object);

//Act
app.SomeProcessThatCallsExecuteSomething("1234567890");

//Assert
dataAccessMock.Verify(da => da.ExecuteSomething(), Times.Once());
Note: in this example I am using Moq.
After this is tested to your liking, you can focus on your integration tests to verify that your stored procedures are working as intended. For this you will potentially need to do quite a bit of work to attach a database in a known state, run your stored procedures, and then revert or trash your database so the tests are repeatable.
You should split your testing strategy into integration testing and unit testing.
For integration testing you can rely on your existing database. You will typically write more high-level tests here and verify that your application interacts with your database correctly.
For unit testing you should only pick selected scenarios that actually make sense for mocking out. These are typically scenarios where a lot of business logic "sits on top" of your database logic and you want to verify that business logic.
Over time you can mock out more and more database calls, but for the beginning identify the critical spots.
You have discovered one reason that business logic should generally go in the business layer rather than the data access layer. Certainly there are exceptions dictated by performance and sometimes security concerns, but they should remain exceptions.
Having said that, you can still develop a strategy to test your sprocs (though depending on how extensive they are, it may or may not be correct to call those tests "unit tests").
You can use a unit testing framework either way.
In the initialization section, restore a testing copy of the database to a known state, e.g. by loading it from a previously saved copy.
Then, execute unit tests that exercise the stored procedures. Since the stored procedures generally do not return anything, your unit test code will have to select values from the database to check whether the expected changes were made or not.
It may be necessary, depending on possible interactions between stored procedures, to restore the database between each test, or between groups of related tests.
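A rough sketch of that pattern (the stored procedure name, connection string, table, and restore step are all hypothetical):

using System.Data.SqlClient;
using NUnit.Framework;

[TestFixture]
public class StoredProcedureTests
{
    private const string ConnectionString =
        "Server=localhost;Database=TestDb;Integrated Security=true";

    [SetUp]
    public void RestoreKnownState()
    {
        // Restore the test database from a saved copy (backup/restore, snapshot,
        // or re-running seed scripts) so every test starts from the same data.
    }

    [Test]
    public void AddUserProc_InsertsOneRow()
    {
        using (var connection = new SqlConnection(ConnectionString))
        {
            connection.Open();

            // Exercise the stored procedure under test.
            using (var command = new SqlCommand("EXEC dbo.AddUser @Name = 'alice'", connection))
            {
                command.ExecuteNonQuery();
            }

            // The proc returns nothing, so assert by querying the resulting state.
            using (var check = new SqlCommand(
                "SELECT COUNT(*) FROM dbo.Users WHERE Name = 'alice'", connection))
            {
                Assert.That((int)check.ExecuteScalar(), Is.EqualTo(1));
            }
        }
    }
}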
The data/persistence layer is often the most neglected code from a unit testing perspective (true unit testing using test doubles: mocks, stubs, fakes, etc.). If you are connecting to a database then you are integration testing. I find value in a) well-architected data/persistence layers that, as a side effect, are easy to test (interfaces, good data access framework abstraction, etc.) and b) layers that are actually unit and integration tested properly.

Unit-Testing: Database set-up for tests

I'm writing unit-tests for an app that uses a database, and I'd like to be able to run the app against some sample/test data - but I'm not sure of the best way to setup the initial test data for the tests.
What I'm looking for is a means to run the code-under-test against the same database (or schematically identical) that I currently use while debugging - and before each test, I'd like to ensure that the database is reset to a clean slate prior to inserting the test data.
I realize that using an IRepository pattern would allow me to remove the complexity of testing against an actual database, but I'm not sure that will be possible in my case.
Any suggestions or articles that could point me in the right direction?
Thanks!
--EDIT--
Thanks everyone, those are some great suggestions! I'll probably go the route of mocking my data access layer, combined with some simple set-up classes to generate exactly the data I need per test.
Here's the general approach I try to use. I conceive of tests at about three or four levels: unit tests, interaction tests, integration tests, acceptance tests.
At the unit test level, it's just code. Any database interaction is mocked out, either manually or using one of the popular frameworks, so loading data is not an issue. They run quick, and make sure the objects work as expected. This allows for very quick write-test/write code/run test cycles. The mock objects serve up the data that is needed by each test.
Interaction tests test the interactions of non-trivial class interactions. Again, no database required, it's mocked out.
Now at the integration level, I'm testing integration of components, and that's where real databases, queues, services, yada yada, get thrown in. If I can, I'll use one of the popular in-memory databases, so initialization is not an issue. It always starts off empty, and I use utility classes to scrub the database and load exactly the data I want before each test, so that there's no coupling between the tests.
The problem I've hit using in-memory databases is that they often don't support all the features I need. For example, perhaps I require an outer join, and the in-memory DB doesn't support that. In that case, I'll typically test against a local conventional database such as MySQL, again, scrubbing it before each test. Since the app is deployed to production in a separate environment, that data is untouched by the testing cycle.
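As one possible sketch of that integration level (assuming Microsoft.Data.Sqlite as the in-memory database; the table and seed data are made up):

using Microsoft.Data.Sqlite;

// An in-memory database that lives only as long as the connection.
using (var connection = new SqliteConnection("Data Source=:memory:"))
{
    connection.Open();

    // Utility step: create the schema and load exactly the rows this test needs.
    using (var create = connection.CreateCommand())
    {
        create.CommandText = "CREATE TABLE Users (Id INTEGER PRIMARY KEY, Name TEXT)";
        create.ExecuteNonQuery();
    }
    using (var seed = connection.CreateCommand())
    {
        seed.CommandText = "INSERT INTO Users (Name) VALUES ('alice')";
        seed.ExecuteNonQuery();
    }

    // Run the code under test against this connection, then let it all vanish.
}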
The best way I've found to handle this is to use a static test database with known data, and use transactions to ensure that your tests don't change anything.
In your test setup you would start a transaction, and in your test cleanup, you would roll the transaction back. This lets you modify data in your tests but also makes sure everything gets restored to its original state when the test completes.
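A minimal sketch of that with NUnit and System.Transactions (the fixture and test names are illustrative):

using System.Transactions;
using NUnit.Framework;

[TestFixture]
public class TransactionalTests
{
    private TransactionScope _scope;

    [SetUp]
    public void BeginTransaction()
    {
        // Every database operation in the test enlists in this ambient transaction.
        _scope = new TransactionScope();
    }

    [TearDown]
    public void RollbackTransaction()
    {
        // Disposing without calling Complete() rolls everything back,
        // leaving the static test data untouched.
        _scope.Dispose();
    }

    [Test]
    public void AddUser_CanBeReadBack()
    {
        // Insert and assert against the known data set here; the new row
        // disappears when the transaction is rolled back in teardown.
    }
}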
I know you're using C#, but in the Java world there's the Spring Framework. It allows you to run database manipulations in a transaction and roll that transaction back afterwards. This means that you operate against a real database without touching its state after the test finishes. Perhaps this could be a hint for further investigation in C#.
Mocking is of course the best way to unit test your code.
As far as integration tests go, I have had some issues using in-memory databases like SQLite, mainly because of small differences in behaviour and/or syntax.
I have been using a local instance of MySQL for integration tests in several projects. A recurring problem is the server setup and creation of test data.
I have created a small NuGet package called Mysql.Server (see more at https://github.com/stumpdk/MySql.Server) that simply sets up a local instance of MySQL every time you run your tests.
With this instance running you can easily set up table structures and sample data for your tests without being concerned with either your production environment or your local server setup.
I don't think there is an easy way around this. You just have to create those pre-test SQL setup scripts and post-test teardown scripts, then trigger those scripts for each run. A lot of people suggest SQLite for unit test setup.
I found it best to have my tests go to a different db so I could wipe it clean and put in the data I wanted for the test.
You may want to have the database be something that can be set within the program, then your test can tell the classes to change the database.
This code clears all data from all user tables in MS SQL Server:
private DateTime _timeout;

public void ClearDatabase(SqlConnection connection)
{
    _timeout = DateTime.Now + TimeSpan.FromSeconds(30);
    do
    {
        // sp_MSforeachtable runs the statement once per user table;
        // '?' is replaced with each table's name.
        SqlCommand command = connection.CreateCommand();
        command.CommandText = "exec sp_MSforeachtable 'DELETE FROM ?'";
        try
        {
            command.ExecuteNonQuery();
            return;
        }
        catch (SqlException)
        {
            // Deletes can fail on foreign key constraints while other tables
            // still hold referencing rows; retry until they all clear.
        }
    } while (!TimeOut());

    if (TimeOut())
        Assert.Fail("Failed to clear DB");
}

private bool TimeOut()
{
    return DateTime.Now > _timeout;
}
If you are thinking about real database usage, then most likely we're talking about integration tests here, i.e. tests that check app behavior as a composition of different components, contrary to unit tests, where components are supposed to be tested in isolation.
Having the testing scope defined, I wouldn't recommend using things like in-memory databases or mocking libraries as the other authors suggested. The problem is that in-memory databases usually have slightly different behavior or a reduced feature set, and with mocking there is no database at all, so you'll be testing some other application in the general sense, not the one you'll be delivering to your customers.
I'd rather suggest minimizing the number of integration tests by covering just the crucial parts of your logic with them, leaving the rest to unit tests, while using a real database with a setup as close to the production one as possible. Test runs can be too slow and a real pain if there are a lot of integration tests.
Also you might use some tricks to optimize the speed of your test execution:
Split tests into Read and Write with regard to the data mutations they introduce, and run the Read ones in parallel and without any cleanup (e.g. HTTP GET requests are safe to run in parallel if the system under test is a web app and the tests are more like end-to-end);
Use a single insert/delete script for all the data and optimize it as much as possible. You might find the Reseed library I'm currently developing helpful; it's able to generate both insert and delete scripts for you, so basically what you asked for. Or check out Respawn, which can be used for database cleanup (see the sketch below);
Use database snapshots for the restore, which might be faster than a full insert/delete cycle;
Wrap each test in a transaction and revert it afterwards (this one is also not 100% faithful and somewhat fragile);
Parallelize your tests by using a pool of databases instead of a single one. Docker and Testcontainers could be suitable here.
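A minimal sketch of the Respawn cleanup approach (assuming the older Checkpoint-style API; newer Respawn versions build a Respawner via Respawner.CreateAsync instead, and the connection string and ignored table are illustrative):

using System.Threading.Tasks;
using NUnit.Framework;
using Respawn;

[TestFixture]
public class DatabaseResetTests
{
    private static readonly Checkpoint Checkpoint = new Checkpoint
    {
        TablesToIgnore = new[] { "__MigrationHistory" } // keep static reference data
    };

    [TearDown]
    public async Task ResetDatabase()
    {
        // Deletes rows from all other tables in FK-safe order; much faster
        // than dropping and recreating the database between tests.
        await Checkpoint.Reset("Server=localhost;Database=TestDb;Integrated Security=true");
    }
}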
