Where do you call Context.SaveChanges if used in a service - c#

I'm building a .NET Web API and I abstract the logic away so I can unit test it. I use Entity Framework, and the DbContext is injected via DI.
The flow of data is as follows:
[Controller] -> [Service] -> [Controller]
To unit test the Service I mock the DbContext and everything else and test the Service:
[Test] -> [Service] -> [Test]
Nothing too exciting so far.
Now if, for example, I want to add a Point of Interest, then in the service I do something like this:
public class CreatePoiService
{
    public void InsertPoi(PoiData data)
    {
        DbContext.Pois.Add(data);
        DbContext.SaveChanges();
    }
}
No problem with that. But now I have a console application that imports 100,000 Points of Interest. While this console application uses the same InsertPoi function, DbContext.SaveChanges() is called after every insert, which slows things down; it's better to save only after every N inserts.
So I've added a flag to the service class:
public class CreatePoiService
{
    public bool SaveToContext { get; set; } = true;

    public void InsertPoi(PoiData data)
    {
        DbContext.Pois.Add(data);
        if (SaveToContext) DbContext.SaveChanges();
    }
}
Now in the console application I can set SaveToContext = false on the CreatePoiService so EF doesn't execute the changes on every insert, and call SaveChanges from the console application itself. This works great, but I still wonder whether there are better ways to do this?
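For illustration, the console import loop currently looks roughly like this (a sketch; poisToImport, batchSize, dbContext and service are illustrative names, and the service and DbContext are assumed to come from the same DI scope):
// Sketch of the batched import described above; names are illustrative.
service.SaveToContext = false;

int count = 0;
foreach (var poi in poisToImport)
{
    service.InsertPoi(poi);

    // Flush to the database every batchSize inserts instead of on every insert.
    if (++count % batchSize == 0)
        dbContext.SaveChanges();
}

dbContext.SaveChanges(); // persist the final partial batch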

The normal pattern is that your services will call SaveChanges, but you can still scope a transaction across multiple service calls.
The transaction will control the durability of all the changes saved by the services because they all share a single DbContext instance as a Scoped dependency.
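A minimal sketch of that idea, assuming EF Core and Microsoft.Extensions.DependencyInjection (serviceProvider, MyDbContext and poisToImport are illustrative names, not from the question):
using (var scope = serviceProvider.CreateScope())
{
    // Both resolutions come from the same scope, so they share one context.
    var context = scope.ServiceProvider.GetRequiredService<MyDbContext>();
    var service = scope.ServiceProvider.GetRequiredService<CreatePoiService>();

    using (var transaction = context.Database.BeginTransaction())
    {
        foreach (var poi in poisToImport)
        {
            // Each call still does its own SaveChanges, but nothing is
            // durable until the surrounding transaction commits.
            service.InsertPoi(poi);
        }

        transaction.Commit();
    }
}
A failure partway through the import then rolls everything back, and the commit happens only once at the end.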

Related

Unable to cast object of type 'System.Data.ProviderBase.DbConnectionClosedConnecting' to type 'System.Data.SqlClient.SqlInternalConnectionTds'

I am getting the following error on the first database access after the application starts: "Unable to cast object of type 'System.Data.ProviderBase.DbConnectionClosedConnecting' to type 'System.Data.SqlClient.SqlInternalConnectionTds'".
The error is only thrown once, by the first method that tries to read data from the database after the application starts.
Calling the same method a second time and onwards works fine.
I'm using .NET Core 1.1 with Entity Framework.
I recently had this same exception in an ASP.NET Core 2 app with EF Core. In my case, the root cause was a problem with the scope of my dependency-injected DbContext. I had a controller and a service both using an injected DbContext. The service was a singleton, like this:
public class TestService
{
    private readonly FooDbContext db;

    public TestService(FooDbContext db)
    {
        this.db = db;
    }
}

public class FooController
{
    private readonly FooDbContext db;
    private readonly TestService testService;

    public FooController(FooDbContext db, TestService testService)
    {
        this.testService = testService;
        this.db = db;
    }
}
public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        //...
        services.AddDbContext<FooDbContext>(options =>
            options.UseSqlServer(Configuration.GetConnectionString("FooDbContext"))
        );
        services.AddSingleton<TestService>();
    }
}
So the controller would use its instance, and then if the singleton service also tried to use its own instance, it would give the error above about 90% of the time. I'm a little fuzzy on why this would be an issue, or why it was intermittent, but it became pretty clear in debugging that EF was reusing some underlying resources. I didn't dig into the EF code while debugging, but I suspect the controller's instance was closed and the service's instance reused the connection, expecting it to be open. In my reading, others suggested MultipleActiveResultSets=true in the connection string would fix it, but this did not resolve the issue in my case. For me, the fix was to change the service to Transient in Startup.cs, which was acceptable in this case, and possibly better:
services.AddTransient<TestService>();
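If the service genuinely needs to stay a singleton, another option (not from the original answer, just a sketch) is to have it create its own DI scope, and therefore its own short-lived FooDbContext, per operation:
public class TestService
{
    private readonly IServiceScopeFactory scopeFactory;

    public TestService(IServiceScopeFactory scopeFactory)
    {
        this.scopeFactory = scopeFactory;
    }

    public void DoWork()
    {
        // Each operation gets its own scope and a fresh FooDbContext,
        // so the singleton never shares the request's context.
        using (var scope = scopeFactory.CreateScope())
        {
            var db = scope.ServiceProvider.GetRequiredService<FooDbContext>();
            // ... use db here; it is disposed when the scope ends
        }
    }
}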

Why does Ninject get different Db instances in NUnit?

I'm writing integration tests for an ASP.NET MVC-based application, and I'm trying to resolve a Ninject registration issue.
For my ASP.NET MVC registration I have:
kernel.Bind(typeof(ICustomDbContext), typeof(IUnitOfWork))
      .ToMethod(ctx => ctx.Kernel.Get<CustomDbContext>())
      .InRequestScope();
Just to clarify CustomDbContext implements IUnitOfWork and ICustomDbContext.
With that registration I guarantee that I have one unique instance of CustomDbContext per request. That registration works properly in the scope of ASP.NET.
The problem is when I write integration tests.
[SetUp]
public void SetUp()
{
    kernel = NinjectConfig.CreateKernel();
}

[Test]
public async Task Test()
{
    // Arrange
    var claaService = kernel.Get<IService>();
}
In the SetUp step I load my composition root (which is in the ASP.NET MVC project).
The problem is when I resolve IService (the implementation of IService is Service, and that service has dependencies on IUnitOfWork and IGenericRepository; IGenericRepository has a dependency on ICustomDbContext).
In the end, when I access IService I should get the same instance of CustomDbContext (and, as I said, it works in the scope of MVC).
I have tried to resolve it in a child scope, but the result is the same (the instances still have different hash codes):
using (var childKernel1 = new ChildKernel(kernel))
{
    childKernel1.Rebind(typeof(ICustomDbContext), typeof(IUnitOfWork))
                .ToMethod(ctx => ctx.Kernel.Get<CustomDbContext>())
                .InThreadScope();
    var claaService = childKernel1.Get<IClassService>();
}
My questions are:
Why is this happening?
How can I resolve it? (It works if I do not use Ninject, but I want to find a way with Ninject, even if I have to add additional configuration in the integration tests.)
Why is this happening?
Ninject's scoping is limited to the lifetime of the container. You have set up the container to be created for each [Test] because you are using [SetUp]:
This attribute is used inside a TestFixture to provide a common set of functions that are performed just before each test method is called.
[SetUp]
public void SetUp()
{
    kernel = NinjectConfig.CreateKernel();
}
If you want to use the same container across multiple tests in the same [TestFixture] (I'm assuming this because you said the instance is not the same, but you didn't mention the same as what), you need to use [OneTimeSetUp] instead.
This attribute is to identify methods that are called once prior to executing any of the tests in a fixture.
[OneTimeSetUp]
public void OneTimeSetUp()
{
    kernel = NinjectConfig.CreateKernel();
}
This of course assumes all of your relevant integration tests are in a single class.
In short, your Ninject container is being re-initialized on every test, which means all instances it manages are also being re-initialized.
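If the instances also differ within a single test, one possible test-only tweak (a sketch, not part of the original answer) is to bind CustomDbContext itself in singleton scope so both interfaces resolve to the same instance for the container's lifetime:
[OneTimeSetUp]
public void OneTimeSetUp()
{
    kernel = NinjectConfig.CreateKernel();

    // Outside a web request InRequestScope has no request to attach to,
    // so rebind to the container's lifetime for the integration tests.
    kernel.Rebind<CustomDbContext>().ToSelf().InSingletonScope();
    kernel.Rebind(typeof(ICustomDbContext), typeof(IUnitOfWork))
          .ToMethod(ctx => ctx.Kernel.Get<CustomDbContext>())
          .InSingletonScope();
}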

Web API global setup code, also for unit tests

Say I have some global application setup code, defined in my Global.asax, Application_Start. For example, to disable certificate checking:
public class WebApiApplication : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        // (...)
        ServicePointManager.ServerCertificateValidationCallback = delegate { return true; };
        // (...)
    }
}
Say I also have a unit test which depends on the code above having run. However, in my unit test Application_Start is not called, as I'm instantiating the controller directly:
var controller = new TestSubjectController();
Is there some mechanism in ASP.NET or Web API that solves this problem? What would be the best way to define the setup code, preventing duplication in my code?
Research
I've gone through multiple questions on SO already. Most of them focus on unit testing Application_Start itself, but that's not the goal here. Other questions tend to look at testing through the application's external (HTTP) interface, whereas I'd like to be able to instantiate the controller directly in my unit tests.
In addition to Batavia's suggestion, you could use a [TestInitialize]-attributed method (or your unit testing framework of choice's equivalent) to call the common static setup method for all tests in a given class, which would reduce the duplication you're concerned about.
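For example, a rough sketch with MSTest (TestSetup and Configure are illustrative names, not from the question) could be:
// Shared setup, callable both from Application_Start and from tests.
public static class TestSetup
{
    public static void Configure()
    {
        ServicePointManager.ServerCertificateValidationCallback = delegate { return true; };
    }
}

[TestClass]
public class TestSubjectControllerTests
{
    [TestInitialize]
    public void Init()
    {
        // Runs before every test in this class, mirroring Application_Start.
        TestSetup.Configure();
    }

    [TestMethod]
    public void SomeTest()
    {
        var controller = new TestSubjectController();
        // ... assert against the controller
    }
}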

Getting a new DbContext when unit testing with ASP.NET Boilerplate

I'm working on a project built on ASP.NET Boilerplate, and now I have to unit test the services using the real repositories with a real database connection (no mocking). I've been using the last post by BringerOd in https://gist.github.com/hikalkan/1e5d0f0142484da994e0 as a guide for setting up my UnitOfWorkScope instance. So my code currently looks something like this:
IDisposableDependencyObjectWrapper<IUnitOfWork> _unitOfWork;

[TestInitialize]
public void SetUpService()
{
    //initialize service
    _unitOfWork = IocManager.Instance.ResolveAsDisposable<IUnitOfWork>();
    UnitOfWorkScope.Current = _unitOfWork.Object;
    UnitOfWorkScope.Current.Initialize(true);
    UnitOfWorkScope.Current.Begin();
}

[TestCleanup]
public void CleanUpService()
{
    UnitOfWorkScope.Current.Cancel();
    _unitOfWork.Dispose();
    UnitOfWorkScope.Current = null;
}
This works like a charm for the first unit test, but when I try to make a repository call in a second test, I get: "The operation cannot be completed because the DbContext has been disposed."
My guess is that when the TestInitialize method runs again, the unit of work scope is getting assigned the same (disposed) DbContext, rather than a new one. I suppose that inside my actual test methods I could set up my UnitOfWorkScope inside a using block with the IUnitOfWork. However, I really don't want to repeat that logic inside every single test. Does anyone know how to manually get the effect of a using block so that I get a brand new DbContext each time?
Check: http://aspnetboilerplate.com/Pages/Documents/Repositories
You must mark the calling method with [UnitOfWork] attribute.
The reason for this, as explained in the linked document, is:
When you call GetAll() out of a repository method, there must be an open database connection. This is because of deferred execution of IQueryable<T>. It does not perform database query unless you call ToList() method or use the IQueryable<T> in a foreach loop (or somehow access to queried items). So, when you call ToList() method, database connection must be alive. This can be achieved by marking caller method with the [UnitOfWork] attribute of ASP.NET Boilerplate. Note that Application Service methods are already using [UnitOfWork] as default, so, GetAll() will work without adding the [UnitOfWork] attribute for application service methods.
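As a rough illustration of that rule (MyEntity and MyEntityManager are made-up names, not from the question), a repository call made outside an application service could be wrapped like this:
public class MyEntityManager : ITransientDependency
{
    private readonly IRepository<MyEntity> _repository;

    public MyEntityManager(IRepository<MyEntity> repository)
    {
        _repository = repository;
    }

    [UnitOfWork]
    public virtual List<MyEntity> GetAllEntities()
    {
        // ToList() executes the deferred query here, so [UnitOfWork] is what
        // keeps the connection (and DbContext) alive long enough to run it.
        return _repository.GetAll().ToList();
    }
}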

How to create a RESTful web service with a TDD approach?

I've been given the task of creating a RESTful web service with JSON formatting, using WCF, with the methods below, following a TDD approach. The service should store the Product as a text file on disk:
CreateProduct(Product product)
GetAProduct(int productId)
URI Templates:
POST to /MyService/Product
GET to /MyService/Product/{productId}
Creating the service and its web methods is the easy part, but how would you approach this task with TDD? You should create a test before writing the SUT code.
The rules of unit tests say they should also be independent and repeatable.
I have a number of questions and issues:
1) Should I write my unit tests against the actual service implementation by adding a reference to it, or against the URLs of the service (in which case I'd have to host and run the service)? Or both?
2) I was thinking one approach could be to create just one test method in which I create a product, call the CreateProduct() method, then call the GetAProduct() method and assert that the product which was sent is the one I received. In the TearDown() event I would just remove the product that was created.
But the issues I have with the above are:
- It tests more than one feature, so it's not really a unit test.
- It doesn't check whether the data was stored in the file correctly.
- Is it really TDD?
If I create a separate unit test for each web method then, for example, to test the GetAProduct() web method I'd have to have some test data stored physically on the server, since it can't rely on the CreateProduct() unit tests; they should be able to run independently.
Please advise.
Thanks,
I'd suggest not worrying about the web service endpoints and focusing on the behavior of the system. For the sake of this discussion I'll drop all technical jargon and talk about what I see as the core business problem you're trying to solve: creating a product catalog.
To do so, start by thinking through what a product catalog does, not the technical details of how to do it. Use that as the starting point for your tests.
public class ProductCatalogTest
{
    [Test]
    public void allowsNewProductsToBeAdded() {}

    [Test]
    public void allowsUpdatesToExistingProducts() {}

    [Test]
    public void allowsFindingSpecificProductsUsingSku() {}
}
I won't go into detail about how to implement the tests and production code here, but this is a starting point. Once you've got the ProductCatalog production class worked out, you can turn your attention to the technical details like making a web service and marshaling your JSON.
I'm not a .NET guy, so this will be largely pseudocode, but it probably winds up looking something like this.
public class ProductCatalogServiceTest
{
    [Test]
    public void acceptsSkuAsParameterOnGetRequest()
    {
        var mockCatalog = new MockProductCatalog(); // Hand rolled mock here.
        var catalogService = new ProductCatalogService(mockCatalog);

        catalogService.find("some-sku-from-url");

        mockCatalog.assertFindWasCalledWith("some-sku-from-url");
    }

    [Test]
    public void returnsJsonFromGetRequest()
    {
        var mockCatalog = new MockProductCatalog(); // Hand rolled mock here.
        mockCatalog.findShouldReturn(new Product("some-sku-from-url"));
        var mockResponse = new MockHttpResponse(); // Hand rolled mock here.
        var catalogService = new ProductCatalogService(mockCatalog, mockResponse);

        catalogService.find("some-sku-from-url");

        mockResponse.assertWriteWasCalledWith("{ 'sku': 'some-sku-from-url' }");
    }
}
You've now tested end to end, and test drove the whole thing. I personally would test drive the business logic contained in ProductCatalog and likely skip testing the marshaling as it's likely to all be done by frameworks anyway and it takes little code to tie the controllers into the product catalog. Your mileage may vary.
Finally, while test-driving the catalog I would expect the code to be split into multiple classes, and mocking comes into play there so they can be unit tested rather than covered by one large integration test. Again, that's a topic for another day.
Hope that helps!
Brandon
Well, to answer your question: what I would do is write a test around the code that calls the REST service, using something like Rhino Mocks to arrange (i.e. set up an expectation for the call), act (actually run the code which calls the unit under test), and assert that you get back what you expect. You could mock out the expected results of the REST call. An actual front-to-back test of the REST service would be an integration test, not a unit test.
To be clearer, the unit test you need to write is a test around the business logic that actually calls the REST web service...
Say this is your proposed implementation (let's pretend it hasn't been written yet):
public class SomeClass
{
    private IWebServiceProxy proxy;

    public SomeClass(IWebServiceProxy proxy)
    {
        this.proxy = proxy;
    }

    public void PostTheProduct()
    {
        proxy.Post("/MyService/Product");
    }

    public void RestGetCall()
    {
        proxy.Get("/MyService/Product/{productId}");
    }
}
This is one of the tests you might consider writing.
[TestFixture]
public class TestingOurCalls
{
    [Test]
    public void TestTheProductCall()
    {
        var webServiceProxy = MockRepository.GenerateMock<IWebServiceProxy>();
        SomeClass someClass = new SomeClass(webServiceProxy);

        webServiceProxy.Expect(p => p.Post("/MyService/Product"));

        someClass.PostTheProduct();

        webServiceProxy.VerifyAllExpectations();
    }
}
