EF DbContext Factory with Ninject C#

I am looking for some advice on the best way to re-structure an existing C# Console Application that I have written.
The application uses Entity Framework for data access, coupled with the Repository pattern. I am also using Ninject for DI.
The issue I am facing is as follows:
I have a class, say ClassA, which has a Repository passed in through the constructor. The repository constructor takes in a DbContext. Ninject is currently handling all of this for me.
public class ClassA
{
    private IRepository Repository;

    public ClassA(IRepository repository)
    {
        Repository = repository;
    }

    public void Process()
    {
        var RV = Function1();
        Function2(RV);
        Function3(RV);
    }

    private IList<Item> Function1()
    {
        // Populate the database using the Repository and return a list of objects.
        var items = // call an external web service to get the list of items
        foreach (var item in items)
        {
            Repository.AddEntry(item);
        }
        return Repository.Items.ToList();
    }

    private void Function2(IList<Item> items)
    {
        // Long-running process, maybe 20/30 minutes.
    }

    private void Function3(IList<Item> items)
    {
        // Remove the objects in the list from the database via the Repository.
        foreach (var item in items)
        {
            Repository.DeleteEntry(item);
        }
    }
}
public class Repository : IRepository
{
    private DbContext DbContext;

    public IQueryable<Item> Items
    {
        get { return DbContext.Items; }
    }

    public Repository(DbContext dbContext)
    {
        DbContext = dbContext;
    }

    public void AddEntry(Item item)
    {
        DbContext.Items.Add(item);
        DbContext.SaveChanges();
    }

    public void DeleteEntry(Item item)
    {
        DbContext.Items.Remove(item);
        DbContext.SaveChanges();
    }
}
ClassA then has a process function that works through a series of 3 private functions.
Only functions 1 and 3 need access to the Repository: function 1 populates the database via the Repository and then returns an in-memory collection, function 2 uses this collection, and function 3 then deletes records from the database via the Repository.
Function 2 can take several minutes to complete, and the behaviour I am noticing is that when function 3 calls the database I receive a "DbContext has been disposed" error. It has not been disposed anywhere in my code, but it seems another process running on the same SQL Server is causing this.
In an attempt to work around this I want to be able to dispose of the Repository after function 1 and then use a new one for function 3, in essence wrapping the code in functions 1 and 3 inside a using statement. This is where my head starts to hurt when trying to figure out this scenario whilst utilising Ninject.
What should I be passing in to ClassA via the constructor to allow me to create Repositories on the fly? Do I need some sort of Factory pattern or Unit of Work pattern?
Any help or thoughts gratefully received.
Thanks
pf79

You can inject your DbContext as a factory instead of as an instance per se; take a look at these:
https://github.com/vany0114/EF.DbContextFactory
http://elvanydev.com/EF-DbContextFactory/
There is an extension to Ninject that does this in a very easy way: just call kernel.AddDbContextFactory<YourContext>();. You also need to change your repository to receive a Func<YourContext> instead of the context itself.
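Something like this, as a minimal sketch (MyContext and Item are placeholders for your types, and the IRepository interface would need its Items property replaced by a materializing method, because an IQueryable must not outlive its context):

public class Repository : IRepository
{
    private readonly Func<MyContext> _contextFactory;

    public Repository(Func<MyContext> contextFactory)
    {
        _contextFactory = contextFactory;
    }

    public void AddEntry(Item item)
    {
        // Each operation gets its own short-lived context, so the long pause
        // in Function2 no longer leaves a stale context behind.
        using (var context = _contextFactory())
        {
            context.Items.Add(item);
            context.SaveChanges();
        }
    }

    public void DeleteEntry(Item item)
    {
        using (var context = _contextFactory())
        {
            // The item was loaded by a different context, so attach it first.
            context.Items.Attach(item);
            context.Items.Remove(item);
            context.SaveChanges();
        }
    }

    public IList<Item> GetItems()
    {
        using (var context = _contextFactory())
        {
            return context.Items.ToList();
        }
    }
}

ClassA only needs Function1 to call the new materializing method instead of Repository.Items.ToList(); each repository call now opens and disposes its own context.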

Related

Generic C# Repository, service and controller design

I'm learning about generics and was wondering what a generic controller, service, and EF Core repository design would look like.
My case: let's say an incoming POST request needs to add a Smartphone and a Keyboard object to the smartphone and keyboard tables.
My repository setup is:
public class GenericRepository<TEntity> : IGenericRepository<TEntity>
    where TEntity : class, IProductGenericEntities
{
    private readonly MyDbContext _db;

    public GenericRepository(MyDbContext db)
    {
        _db = db;
    }

    public async Task<bool> AddProduct(TEntity entity)
    {
        try
        {
            await _db.Set<TEntity>().AddAsync(entity);
            return (await _db.SaveChangesAsync()) > 0;
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.Message);
            return false;
        }
    }
}
And my service
public class ProductService<TEntity> : IProductService<TEntity>
    where TEntity : class
{
    private readonly IGenericRepository<TEntity> _repo;

    public ProductService(IGenericRepository<TEntity> repo)
    {
        _repo = repo;
    }

    public async Task<bool> AddProduct(TEntity entity)
    {
        return await _repo.AddProduct(entity);
    }
}
And my Controller.cs
[ApiController]
[Route("api/[controller]")]
public class ProductController : ControllerBase
{
    private readonly IProductService<Keyboards> _keyService;
    private readonly IProductService<Smartphones> _smartService;

    public ProductController(IProductService<Keyboards> keyService, IProductService<Smartphones> smartService)
    {
        _keyService = keyService;
        _smartService = smartService;
    }

    [HttpPost("Post-generated-items")]
    public async Task<ActionResult> PostProducts(List<TEntity> entities) // <-- this is the part I would like to make generic
    {
        foreach (var item in entities)
        {
            // ... and sort the objects here
        }
    }
}
Is it correct to initialize two IProductService instances and sort the incoming objects to their correct dependency in the controller?
private readonly IProductService<Keyboards> _keyService;
private readonly IProductService<Smartphones> _smartService;
Is there a way to make it more automatic by detecting the incoming object type and then initializing it all the way down to the repository, so I don't need two IProductService<> instances?
Or is what I'm doing plain wrong for a generic service layer?
OK, so your approach is completely valid. I would not worry about initializing two repositories, since they're essentially empty memory-wise: they just hold a reference to an existing DbContext, which by default is registered with a Scoped lifetime.
There will come a time when you need to use several repositories to complete the task at hand. I would suggest going for a NON-generic services approach. That way you could make a ProductsService which has all the needed generic repositories injected and can orchestrate their work to achieve the use-case goal (see the sketch below).
You might as well look into the UoW (Unit of Work) pattern for even more complex situations.
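A rough sketch of that non-generic service idea (IProductsService and the method names are made up for illustration):

public class ProductsService : IProductsService
{
    private readonly IGenericRepository<Keyboards> _keyboards;
    private readonly IGenericRepository<Smartphones> _smartphones;

    // The DI container injects both generic repositories; the service
    // orchestrates them for the use case at hand.
    public ProductsService(
        IGenericRepository<Keyboards> keyboards,
        IGenericRepository<Smartphones> smartphones)
    {
        _keyboards = keyboards;
        _smartphones = smartphones;
    }

    public Task<bool> AddKeyboard(Keyboards keyboard) => _keyboards.AddProduct(keyboard);

    public Task<bool> AddSmartphone(Smartphones smartphone) => _smartphones.AddProduct(smartphone);
}

The controller then depends on a single IProductsService instead of two IProductService<T> instances.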
Answering your question:
Is there a way to make it more automatic by detecting the incoming object type and then initializing it all the way down to the repository, so I don't need two IProductService<> instances?
You could write some code that does that for you using reflection, but I would advise against it. By initializing your repositories explicitly you make yourself less error-prone, and the code becomes more self-documenting.
For example, right now you have a controller that asks DI for two services, and that instantly tells you what's going on in this controller. On the other hand, if everything were generic, you would end up with one huge knot of spaghetti that "does everything".

ASP MVC Entity Framework Database context and classes

In my ASP.NET MVC project using EF, I have some objects (from classes) whose fields store data coming from my database.
My question is:
How should I populate these fields, or manage the methods of these objects, using a DbContext variable?
Sol 1: Is it better, each time I need a connection to the db in my classes, to use a using (resource) block (see below)?
Sol 2: Is it better to code a singleton class so there is one instance of the context?
Sol 3: Or should I use another way to link my classes to the database?
What is the best method considering performance and code quality?
Thanks for your attention.
Solution 1:
public class Test
{
    private T1 a;
    private T2 b;

    public Test()
    { }

    public void CreateFrom(int id)
    {
        using (var db = new WebApplicationMVCTest.Models.dbCtx())
        {
            a = db.T1s.Find(id);
            b = db.T2s.Find(a.id2);
        }
    }
}
Solution 2:
public class DbSingleton
{
    private static dbCtx instance;
    private int foo;

    private DbSingleton()
    { }

    public static dbCtx Current
    {
        get
        {
            if (instance == null)
            {
                instance = new dbCtx();
            }
            return instance;
        }
    }

    public static void Set(dbCtx x)
    {
        if (instance == null)
        {
            instance = x;
        }
    }
}
For a web project, never use a static DbContext. EF DbContexts are not thread safe so handling multiple requests will lead to exceptions.
A DbContext's lifespan should only be as long as it is needed. Apart from the one-time setup cost incurred the first time a DbContext type is used, instantiating DbContexts is fast.
My advice is to start simple:
public ActionResult Create(/* details */)
{
    using (var context = new AppDbContext())
    {
        // do stuff.
    }
}
When you progress to the point where you learn about, and want to start implementing, dependency injection, the DbContext can be injected into your controller / service constructors. With the IoC container managing the DbContext, the lifetime scope of the context should be set to per-web-request or equivalent.
private readonly AppDbContext _context;

public MyController(AppDbContext context)
{
    _context = context ?? throw new ArgumentNullException("context");
}

public ActionResult Create(/* details */)
{
    // do stuff with _context.
}
The gold standard for enabling unit testing would be injecting a Unit of Work pattern and considering something like the Repository pattern to make your dependencies easier to unit test.
The best advice I can give you when starting out with EF and MVC is to avoid the temptation to pass entities between the controller (server) and the UI (views). You will come across dozens of examples doing just this, but it is a poor choice: it hides a lot of land mines and booby traps around performance, exceptions, and data security. The most important detail is that when the UI calls the controller passing what you expect to be an entity, you are not actually getting an entity but a de-serialized JSON object cast to an entity; it is not an entity that is tracked by the DbContext handling the request. Instead, get accustomed to passing view models (serializable data containers with the data the view needs or can provide) and IDs + values, where the controller re-loads entities and updates their data only as needed.
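As a rough illustration of that view model approach (the view model, AppDbContext, and field names are invented for this example):

public class UserViewModel
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public ActionResult Edit(UserViewModel model)
{
    using (var context = new AppDbContext())
    {
        // Re-load the tracked entity by ID and copy across only the fields
        // the view is allowed to change.
        var user = context.Users.Find(model.Id);
        if (user == null)
            return HttpNotFound();

        user.Name = model.Name;
        context.SaveChanges();
    }
    return RedirectToAction("Index");
}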

Disposing EF DBContext in MVC controller

I want to make sure I'm disposing of my EF DbContext objects.
Currently I'm using static methods to invoke EF CRUD operations, to keep all the data-layer stuff black-boxed and out of the controllers.
In the example below I have one method that returns an IQueryable and uses a using statement, which causes an exception when the query tries to run against a disposed context object.
The other doesn't use the using statement and works fine, but is it getting disposed?
Should I just return IEnumerable instead of IQueryable?
public class MyContext : DbContext
{
    public MyContext() : base("MyConnectionString")
    {
        Database.SetInitializer<MyContext>(null);
    }

    public DbSet<User> Users { get; set; }
    public DbSet<Role> Roles { get; set; }
}
public class Data
{
    // Fails when the IQueryable is run against the now-disposed MyContext
    public static T Get<T>(params string[] joins)
    {
        using (var context = new MyContext())
        {
            return context.Get<T>(joins);
        }
    }

    // Works fine, but when is the context disposed?
    public static T Get<T>(params string[] joins)
    {
        return new MyContext().Get<T>(joins);
    }
}
public ActionResult GetUser(int id = 0)
{
    var data = Data.Get<User>();
    return View("Users", data);
}
The issue is not about using IQueryable or IEnumerable. You get the exception because, by returning an IQueryable or IEnumerable, you're not executing the LINQ query: you just open a connection to your database and then close it again after configuring your query (deferred execution).
To solve this, you need to execute the LINQ query while the context is still alive, by calling the ToList, ToArray, Single[OrDefault], First[OrDefault], etc. extension methods.
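A minimal sketch of the first variant fixed this way, assuming your Get<T> extension on the context builds an IQueryable<T> with the requested joins (the GetList name is made up here):

// Execute the query while the context is still alive, then return plain objects.
public static List<T> GetList<T>(params string[] joins)
{
    using (var context = new MyContext())
    {
        return context.Get<T>(joins).ToList();
    }
}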
Because you're in a web application, it is best practice to have one instance of your DbContext for the whole life of a web request. I recommend you use DI (Dependency Injection); it will help you a lot. You can try Simple Injector, which is a very simple DI container.
If you're not able to use DI, then just follow these steps (a rough sketch follows the list), but please take the time to learn about DI :)
When a web request arrives, store a new instance of your DbContext in the HttpContext.Items collection.
In your methods, just retrieve the stored DbContext from HttpContext.Items and use it.
When the web request is terminating, dispose of the DbContext.
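Something like this, as an illustrative sketch only (the key, the RequestDbContext helper, and MyContext are invented names; wire the dispose call up from Application_EndRequest in Global.asax):

// Per-request DbContext stored in HttpContext.Items (requires System.Web).
public static class RequestDbContext
{
    private const string Key = "__RequestDbContext";

    public static MyContext Current
    {
        get
        {
            var items = System.Web.HttpContext.Current.Items;
            if (items[Key] == null)
            {
                // First use in this request: create and stash a new context.
                items[Key] = new MyContext();
            }
            return (MyContext)items[Key];
        }
    }

    // Call this from Application_EndRequest in Global.asax.
    public static void DisposeCurrent()
    {
        var context = System.Web.HttpContext.Current.Items[Key] as MyContext;
        if (context != null)
        {
            context.Dispose();
        }
    }
}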

Using a DbContext variable from one Controller to Another

Hi, I am using MVC 4 and C# to develop an application that has two controllers.
The first one is called BusinessController; it has a Create method that calls a CreatePartner method on another controller named PartnerController.
public class BusinessController : Controller
{
    private storeContext db = new storeContext();

    public ActionResult Create(Business business)
    {
        // Some stuff here
        PartnerController pt = new PartnerController();
        pt.CreatePartner(business.PartnerId);
        // Here is other stuff that uses the db DbContext variable
        return RedirectToAction("Index");
    }
}
This is the second controller Called Partner
public class PartnerController : Controller
{
    private storeContext db = new storeContext();

    public void CreatePartner(int partner_id)
    {
        // Some interesting stuff
    }
}
Each controller has its own Dispose() method.
The problem is: after I call the CreatePartner method from BusinessController, I try to use the db variable again to save other data, but it throws the following exception:
The operation cannot be completed because the DbContext has been disposed
- What is the best way to use methods from one controller in another that has the same DbContext variable name?
- Something strange happens: my stuff works locally, but when I publish my code to the IIS server the app throws that exception.
Thanks!
Might I suggest an alternative approach?
Controllers are not very good places for business logic; that is they're not very good places for "doing stuff". It's often demonstrated in MVC tutorials and examples in this manner but it's really only good for getting into MVC quickly - it's not very good practice.
Furthermore, Controllers aren't really supposed to have methods that get called directly, whether from themselves or from another Controller. Controllers should really just contain their Actions.
Instead, extract your logic to an external class. A Service is a design pattern in which commonly used business logic is abstracted away. That way things can have a reference to the service and execute the logic without knowing anything about the implementation.
Observe:
IPartnerService
public interface IPartnerService
{
    void CreatePartner(int partnerId);
}
DefaultPartnerService
public class DefaultPartnerService : IPartnerService
{
    private StoreContext db;

    public DefaultPartnerService()
    {
        db = new StoreContext();
    }

    public void CreatePartner(int partnerId)
    {
        // Something interesting
    }
}
BusinessController
public class BusinessController : Controller
{
    private IPartnerService _partnerService;

    public BusinessController()
    {
        _partnerService = new DefaultPartnerService();
    }

    public ActionResult Create(Business business)
    {
        _partnerService.CreatePartner(business.PartnerId);
        return RedirectToAction("Index");
    }
}
Of course this approach is also greatly simplified for educational purposes. It's not best practice yet, but it might put you on the right track. Eventually you'll discover problems with this approach and you'll gravitate to reading about Repositories, Unit of Work, Dependency Injection and so on.
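For instance, once you bring in a DI container (Ninject, Simple Injector, or similar), the controller stops newing up its dependency and simply asks for the abstraction; a hedged sketch, with the container registration omitted:

public class BusinessController : Controller
{
    private readonly IPartnerService _partnerService;

    // The container supplies the service, so the controller depends only on
    // the IPartnerService abstraction and never touches a DbContext directly.
    public BusinessController(IPartnerService partnerService)
    {
        _partnerService = partnerService;
    }

    public ActionResult Create(Business business)
    {
        _partnerService.CreatePartner(business.PartnerId);
        return RedirectToAction("Index");
    }
}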

Performance consideration of destroying dataContext vs. keeping it open for future db access?

I'm using LINQ2SQL to handle my database needs in an ASP.NET MVC 3 project. I have a separate model which contains all my database access in its own class, as follows:
public class OperationsMetricsDB
{
    public IEnumerable<client> GetAllClients()
    {
        OperationsMetricsDataContext db = new OperationsMetricsDataContext();
        var clients = from r in db.clients
                      orderby r.client_name ascending
                      select r;
        return clients;
    }

    public void AddClient(client newClient)
    {
        OperationsMetricsDataContext db = new OperationsMetricsDataContext();
        db.clients.InsertOnSubmit(newClient);
        db.SubmitChanges();
    }
}
I have about 50 different methods in this class, all of which create and then destroy a copy of my DataContext. My reasoning was that this would save memory, because it destroys the DataContext after I use the connection and frees up that memory. However, I have a feeling it may be better to use one copy of the DataContext and keep it open, instead of disposing of it and re-establishing the connection over and over again, e.g.:
public class OperationsMetricsDB
{
    OperationsMetricsDataContext db = new OperationsMetricsDataContext();

    public IEnumerable<client> GetAllClients()
    {
        var clients = from r in db.clients
                      orderby r.client_name ascending
                      select r;
        return clients;
    }

    public void AddClient(client newClient)
    {
        db.clients.InsertOnSubmit(newClient);
        db.SubmitChanges();
    }
}
What is the best practice on this?
I personally use the Unit of Work pattern in conjunction with Repositories for this.
The UnitOfWork creates and manages the DataContext. It then passes the context to each repository when requested. Each time the caller wants to do a new set of operations with the database, they create a new UnitOfWork.
The interfaces would look something like:
public interface IUnitOfWork
{
    IRepository<T> GenerateRepository<T>() where T : class;
    void SaveChanges();
}

public interface IRepository<T> where T : class
{
    IQueryable<T> Find();
    T Create(T newItem);
    T Delete(T item);
    T Update(T item);
}
That ensures that the context's lifespan is exactly one Unit of Work long (which is longer than a single operation but shorter than the lifespan of the application).
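A rough EF-flavoured sketch of how those interfaces might be implemented (AppDbContext is an assumed DbContext; the question itself uses LINQ to SQL, so treat this purely as an illustration of the shape):

public class UnitOfWork : IUnitOfWork, IDisposable
{
    private readonly AppDbContext _context = new AppDbContext();

    public IRepository<T> GenerateRepository<T>() where T : class
    {
        // Every repository handed out shares this unit of work's context.
        return new Repository<T>(_context);
    }

    public void SaveChanges()
    {
        _context.SaveChanges();
    }

    public void Dispose()
    {
        _context.Dispose();
    }
}

public class Repository<T> : IRepository<T> where T : class
{
    private readonly DbContext _context;

    public Repository(DbContext context)
    {
        _context = context;
    }

    public IQueryable<T> Find()
    {
        return _context.Set<T>();
    }

    public T Create(T newItem)
    {
        return _context.Set<T>().Add(newItem);
    }

    public T Delete(T item)
    {
        return _context.Set<T>().Remove(item);
    }

    public T Update(T item)
    {
        _context.Entry(item).State = System.Data.Entity.EntityState.Modified;
        return item;
    }
}

A caller would then write something like using (var uow = new UnitOfWork()) { var clients = uow.GenerateRepository<client>().Find().ToList(); }, so the context lives exactly as long as that unit of work.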
It's not recommended to carry a DataContext around for a long time, so you are on the right path. It uses connection pooling as far as I know, so the performance hit of creating more than one DataContext in an application's lifetime is not too serious.
But I would not create a new context instance for every single method call of your data class.
I prefer to use it in a unit-of-work style. Within a web application, the processing of an HTTP request can be seen as a unit of work.
So my advice is to create one DataContext instance for the lifetime of an HTTP request and dispose of it afterwards.
One context per request is usually fine for most applications.
http://blogs.microsoft.co.il/blogs/gilf/archive/2010/05/18/how-to-manage-objectcontext-per-request-in-asp-net.aspx
