How to handle on-demand data fetching for Domain Models - c#

Given the following scenario...
I am concerned about two things...
1) Is it okay to inject a provider into a business model object? - like I did with the Folder implementation because I want to load Sub-folders on demand.
2) Since I am injecting the DbContext in the Sql implementation of IFolderDataProvider, the context could be disposed or it could live on forever, therefore should I instantiate the context in the constructor?
If this design is incorrect, then please tell me how business models should be loaded.
//Business model.
interface IFolder
{
    int Id { get; }
    IEnumerable<IFolder> GetSubFolders();
}

class Folder : IFolder
{
    private readonly int id_;
    private readonly IFolderDataProvider provider_;

    public Folder(int id, IFolderDataProvider provider)
    {
        id_ = id;
        provider_ = provider;
    }

    public int Id => id_;

    public IEnumerable<IFolder> GetSubFolders()
    {
        return provider_.GetSubFoldersByParentFolderId(id_);
    }
}

interface IFolderDataProvider
{
    IFolder GetById(int id);
    IEnumerable<IFolder> GetSubFoldersByParentFolderId(int id);
}

class SqlFolderDataProvider : IFolderDataProvider
{
    private readonly DbContext context_;

    public SqlFolderDataProvider(DbContext context)
    {
        context_ = context;
    }

    public IFolder GetById(int id)
    {
        //uses the context to fetch the required folder entity and translates it to the business object.
        return new Folder(id, this);
    }

    public IEnumerable<IFolder> GetSubFoldersByParentFolderId(int id)
    {
        //uses the context to fetch the required subfolder entities and translates them to business objects.
        throw new NotImplementedException(); //implementation elided in the question
    }
}

Is it okay to inject a provider into a business model object? - like I did with the Folder implementation because I want to load Sub-folders on demand.
Yes, how else would you be able to call the provider and get the data?
However, the suffix DataProvider is confusing, because that term usually refers to the component used to connect to the database. I recommend renaming it to something else, for example Repository or Context.
Since I am injecting the DbContext in the Sql implementation of IFolderDataProvider, the context could be disposed or it could live on forever, therefore should I instantiate the context in the constructor?
It won't necessarily live on forever. You decide its life span in your ConfigureServices function when you're adding it as a service, so you can change its scope from Singleton to whatever you like. I personally set the scope of my DbContext service to Transient and I also initialize it there with the connection string:
services.AddTransient<IDbContext, DbContext>(sp =>
    new DbContext(Configuration.GetConnectionString("DefaultDB")));
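Applied back to the question's types, the same registration idea could also cover the provider itself (a sketch; it assumes SqlFolderDataProvider is adjusted to accept whatever context abstraction you registered above):
services.AddTransient<IFolderDataProvider, SqlFolderDataProvider>();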
I then open and close the database connection in every function in my data layer files (what you call a provider). I open it inside a using statement, which guarantees the connection is closed under any condition (normal completion or exception). Something like this:
public async Task<Location> GetLocation(int id)
{
    string sql = "SELECT * FROM locations WHERE id = @p_Id;";
    using (var con = _db.CreateConnection())
    {
        // get results -- for example with Dapper (an assumption; the original answer elides this step):
        return await con.QueryFirstOrDefaultAsync<Location>(sql, new { p_Id = id });
    }
}

Is it okay to inject a provider into a business model object
Yes, if you call it a "business" provider :). Don't take all of this terminology ("inject", "provider") too seriously. As long as you pass the business model layer's method/constructor an interface that is declared in the business model layer (and document any abstraction leaks), you are fine.
should I instantiate the context in the constructor?
This could be seen as an abstraction leak that should be documented. A reused context can become corrupted, can be shared with another thread, etc. -- all of which can bring side effects. So developers tend to create one "heavy" object like a DbContext per "user request" (that usually means per service call, e.g. using (var context = new DbContext()), but not always; e.g. sometimes I share it with an authentication service call, to check whether the next operation is allowed for this user). BTW, a DbContext is quite quick to create, so do not reuse it just for "optimization".
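To illustrate, here is a minimal sketch of that "one context per call" idea applied to the question's provider; FoldersDbContext, its Folders set and the ParentId column are assumptions for illustration, not part of the original code:
class SqlFolderDataProvider : IFolderDataProvider
{
    public IFolder GetById(int id)
    {
        return new Folder(id, this);
    }

    public IEnumerable<IFolder> GetSubFoldersByParentFolderId(int id)
    {
        // short-lived context per operation instead of one held by the provider
        // (FoldersDbContext, Folders and ParentId are hypothetical names)
        using (var context = new FoldersDbContext())
        {
            var childIds = context.Folders
                .Where(f => f.ParentId == id)
                .Select(f => f.Id)
                .ToList(); // materialize before mapping to business objects

            return childIds.Select(childId => (IFolder)new Folder(childId, this)).ToList();
        }
    }
}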

Related

ASP MVC Entity Framework Database context and classes

In my ASP.NET MVC project, using EF, I have some objects (from classes) whose fields store data coming from my database.
My question is:
How do I populate these fields, or manage the methods of these objects, using a DbContext variable?
Sol 1: Is it better to open a connection with the db each time I need one in my classes, with a using (resource) instruction (see below)?
Sol 2: Is it better to code a singleton class so that one instance of the context is used?
Sol 3: Or should I use another way for the links between my classes and the database?
What is the best method considering performance and code quality?
Thanks for your attention.
Solution 1
public class Test
{
    private T1 a;
    private T2 b;

    public Test()
    { }

    public void CreateFrom(int id)
    {
        using (var db = new WebApplicationMVCTest.Models.dbCtx())
        {
            a = db.T1s.Find(id);
            b = db.T2s.Find(a.id2);
        }
    }
}
Solution 2:
public class DbSingleton
{
    private static dbCtx instance;
    private int foo;

    private DbSingleton()
    { }

    public static dbCtx Current
    {
        get
        {
            if (instance == null)
            {
                instance = new dbCtx();
            }
            return instance;
        }
    }

    public static void Set(dbCtx x)
    {
        if (instance == null)
        {
            instance = x;
        }
    }
}
For a web project, never use a static DbContext. EF DbContexts are not thread-safe, so handling multiple requests will lead to exceptions.
A DbContext's lifespan should only be as long as it is needed. Apart from the one-time setup cost the first time a DbContext is used, instantiating DbContexts is fast.
My advice is to start simple:
public ActionResult Create(/* details */)
{
    using (var context = new AppDbContext())
    {
        // do stuff.
    }
}
When you progress to the point where you learn about, and want to start implementing, dependency injection, the DbContext can instead be injected into your controller / service constructors. Again, with the IoC container managing the DbContext, the lifetime scope of the context should be set to PerWebRequest or equivalent.
private readonly AppDbContext _context;

public MyController(AppDbContext context)
{
    _context = context ?? throw new ArgumentNullException("context");
}

public ActionResult Create(/* details */)
{
    // do stuff with _context.
}
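For reference, if this were an ASP.NET Core project, the "PerWebRequest or equivalent" lifetime is roughly what AddDbContext gives you out of the box: it registers the context with a scoped (per-request) lifetime. A minimal sketch, assuming AppDbContext has the usual DbContextOptions constructor and a connection string named "DefaultConnection" (both assumptions):
// Startup.ConfigureServices -- scoped (per web request) lifetime by default.
services.AddDbContext<AppDbContext>(options =>
    options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")));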
The gold standard for enabling unit testing would be injecting a Unit of Work pattern and considering something like the Repository pattern to make your dependencies easier to unit test.
The best advice I can give you when starting out with EF and MVC is to avoid the temptation to pass entities between the controller (server) and the UI (views). You will come across dozens of examples doing just this, but it is a poor choice: it hurts performance and hides a LOT of land mines and booby traps around performance, exceptions, and data security. The most important detail is that when the UI calls the controller passing what you expect to be an entity, you are not actually getting an entity, but a de-serialized JSON object cast to an entity; it is not an entity that is tracked by the DbContext handling the request. Instead, get accustomed to passing view models (serializable data containers with the data the view needs or can provide) plus IDs and values, and have the controller re-load entities to update data only as needed.
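To make that concrete, here is a minimal sketch of the view-model approach; the entity and property names (Order, Customer, Orders, Customers, ProductCode) are made up for illustration and are not from the question:
// The view posts only the data it owns -- never a tracked EF entity.
public class CreateOrderViewModel
{
    public int CustomerId { get; set; }
    public string ProductCode { get; set; }
    public int Quantity { get; set; }
}

[HttpPost]
public ActionResult Create(CreateOrderViewModel model)
{
    using (var context = new AppDbContext())
    {
        // Re-load the entity by ID so it is tracked by *this* context.
        var customer = context.Customers.Find(model.CustomerId);
        if (customer == null)
            return HttpNotFound();

        context.Orders.Add(new Order
        {
            Customer = customer,
            ProductCode = model.ProductCode,
            Quantity = model.Quantity
        });
        context.SaveChanges();
    }
    return RedirectToAction("Index");
}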

EntityFrameworkCore swap DbConnection (add Transaction)

I have a large DB model (hundreds of tables) split into multiple EntityFrameworkCore DbContexts. It is a quite common use case that I modify multiple entities in two (or more) different DbContexts, but I need to commit these operations within a single transaction.
I use an IRepository pattern where an instance of an ISomeRepository implementation gets injected into the Controller and looks like:
[HttpPost]
public async Task DoSomeWorkAsync()
{
    using (var transaction = this.IEmployeesRepository.BeginTransaction())
    {
        // do some work
        await this.IEmployeesRepository.SaveChangesAsync();
        // do another work
        await this.IPayrollRepository.SaveChangesAsync();
    }
}
EmployeeDbContext implements the IEmployeeRepository interface, and PayrollDbContext implements IPayrollRepository.
I end up with error:
System.InvalidOperationException: The specified transaction is not associated with the current connection. Only transactions associated with the current connection may be used.
There exists very handy documentation, which basically solves the problem.
Cool, but I am not able to create a new instance of EmployeeDbContext as described in the documentation, because I am working with the abstraction (interface) only. I am looking for some way to change / swap / inject / replace the DbConnection in an existing DbContext.
I was thinking of implementing Clone method like
[HttpPost]
public async Task DoSomeWorkAsync()
{
    using (var transaction = this.IEmployeesRepository.BeginTransaction())
    {
        await this.IEmployeesRepository.SaveChangesAsync();
        var payrollRepoClone = IPayrollRepository.Clone(transaction);
        await payrollRepoClone.SaveChangesAsync();
    }
}
and then I would do
public class PayrollDbContext : DbContext, IPayrollRepository
{
    private readonly DbConnection dbConnection;

    public PayrollDbContext Clone(IDbContextTransaction tran)
    {
        return new PayrollDbContext(tran.GetDbTransaction().Connection);
    }

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        optionsBuilder.UseSqlServer(dbConnection);
    }
}
but I am trying to avoid this kind of tight coupling with SQL Server, because currently UseNpgsql is called in the IoC setup in the service container where I resolve the IPayrollRepository instance, and UseInMemoryDatabase in unit tests. This would crash my tests (or at least require some dirty if/else in OnConfiguring).
Do you have any hints on how to inject a transaction or DbConnection into an existing DbContext?
Thanks
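For what it's worth, the mechanism the referenced documentation describes boils down to sharing one DbConnection between the contexts and enlisting the second context in the first one's transaction via Database.UseTransaction. A minimal sketch outside the repository abstractions (the shared optionsUsingSharedConnection variable and the context constructors are assumptions):
// requires Microsoft.EntityFrameworkCore.Storage for GetDbTransaction()
using (var employeeContext = new EmployeeDbContext(optionsUsingSharedConnection))
using (var transaction = await employeeContext.Database.BeginTransactionAsync())
{
    // ... modify employee entities ...
    await employeeContext.SaveChangesAsync();

    using (var payrollContext = new PayrollDbContext(optionsUsingSharedConnection))
    {
        // enlist the second context in the transaction already open on the shared connection
        payrollContext.Database.UseTransaction(transaction.GetDbTransaction());

        // ... modify payroll entities ...
        await payrollContext.SaveChangesAsync();
    }

    transaction.Commit();
}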

Performance consideration of destroying dataContext vs. keeping it open for future db access?

I'm using LINQ2SQL to handle my database needs in an ASP.NET MVC 3 project. I have a separate model which contains all my database access in its own class, as follows:
public class OperationsMetricsDB
{
    public IEnumerable<client> GetAllClients()
    {
        OperationsMetricsDataContext db = new OperationsMetricsDataContext();
        var clients = from r in db.clients
                      orderby r.client_name ascending
                      select r;
        return clients;
    }

    public void AddClient(client newClient)
    {
        OperationsMetricsDataContext db = new OperationsMetricsDataContext();
        db.clients.InsertOnSubmit(newClient);
        db.SubmitChanges();
    }
}
I have about 50 different methods in this class which all create and then destroy a copy of my DataContext. My reasoning was that this way would save memory, because it would destroy the DataContext after I use the connection and free up that memory. However, I have a feeling that it may be better to use one copy of the DataContext and keep it open, instead of disposing and re-establishing the connection over and over again, e.g.
public class OperationsMetricsDB
{
    OperationsMetricsDataContext db = new OperationsMetricsDataContext();

    public IEnumerable<client> GetAllClients()
    {
        var clients = from r in db.clients
                      orderby r.client_name ascending
                      select r;
        return clients;
    }

    public void AddClient(client newClient)
    {
        db.clients.InsertOnSubmit(newClient);
        db.SubmitChanges();
    }
}
What is the best practice on this?
I personally use the Unit of Work pattern in conjunction with Repositories for this.
The UnitOfWork creates and manages the DataContext. It then passes the context to each repository when requested. Each time the caller wants to do a new set of operations with the database, they create a new UnitOfWork.
The interfaces would look something like:
public interface IUnitOfWork
{
    IRepository<T> GenerateRepository<T>();
    void SaveChanges();
}

public interface IRepository<T> where T : class
{
    IQueryable<T> Find();
    T Create(T newItem);
    T Delete(T item);
    T Update(T item);
}
That ensures that the context's lifespan is exactly one Unit of Work long (which is longer than a single operation but shorter than the lifespan of the application).
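If it helps, here is a rough sketch of what an implementation over the LINQ to SQL context could look like. It is not part of the answer above, and it assumes GenerateRepository carries a where T : class constraint (which GetTable<T>() requires):
public class LinqToSqlUnitOfWork : IUnitOfWork, IDisposable
{
    private readonly OperationsMetricsDataContext _context = new OperationsMetricsDataContext();

    public IRepository<T> GenerateRepository<T>() where T : class
    {
        return new LinqToSqlRepository<T>(_context.GetTable<T>());
    }

    public void SaveChanges()
    {
        _context.SubmitChanges();
    }

    public void Dispose()
    {
        _context.Dispose();
    }
}

public class LinqToSqlRepository<T> : IRepository<T> where T : class
{
    private readonly System.Data.Linq.Table<T> _table;

    public LinqToSqlRepository(System.Data.Linq.Table<T> table)
    {
        _table = table;
    }

    public IQueryable<T> Find()
    {
        return _table;
    }

    public T Create(T newItem)
    {
        _table.InsertOnSubmit(newItem);
        return newItem;
    }

    public T Delete(T item)
    {
        _table.DeleteOnSubmit(item);
        return item;
    }

    public T Update(T item)
    {
        // LINQ to SQL tracks changes on attached entities automatically,
        // so there is nothing to do here until SaveChanges() is called.
        return item;
    }
}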
It's not recommended to carry a DataContext around for a long time, so you are on the right path. It uses connection pooling as far as I know, so the performance hit of creating more than one DataContext in an application's lifetime is not too serious.
But I would not create a new context instance for every single method call of your data class.
I prefer to use it in a unit-of-work style. Within a web application, the processing of an HTTP request can be seen as a unit of work.
So my advice is to create one DataContext instance for the lifetime of one HTTP request and dispose it afterwards.
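A common way to get that per-request lifetime in classic ASP.NET without a DI container is to park the context in HttpContext.Items from Global.asax; a rough sketch (the item key is arbitrary):
// Global.asax.cs -- one DataContext per HTTP request
protected void Application_BeginRequest()
{
    HttpContext.Current.Items["MetricsDataContext"] = new OperationsMetricsDataContext();
}

protected void Application_EndRequest()
{
    var db = HttpContext.Current.Items["MetricsDataContext"] as IDisposable;
    if (db != null)
    {
        db.Dispose();
    }
}

// data classes then use the per-request instance instead of creating their own:
private static OperationsMetricsDataContext Db
{
    get { return (OperationsMetricsDataContext)HttpContext.Current.Items["MetricsDataContext"]; }
}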
One context per request is usually fine for most applications.
http://blogs.microsoft.co.il/blogs/gilf/archive/2010/05/18/how-to-manage-objectcontext-per-request-in-asp-net.aspx

How to create a connection string manager to detect if server is available, return cache connection string if not

I have an occasionally connected application where there is a server that stores information about products. I am using a database cache to store this information locally, so when a connection is unavailable the application will still work when trying to read the database.
Since the database is already configured and I do not have access to modify the tables, I did not implement two-way updating; it only downloads a snapshot. A side question is whether it is possible to create a database cache and have two-way sync with tracking columns only on the client machine? I cannot add any columns or tables to the server. I know this might be a question for a separate post, but if it is possible, it would change my direction for this problem completely: a separate module would detect and sync the database, handle any sync errors that are thrown, and the application would always connect to the cache.
I am using a generic repository and I am wondering what the best practice to go about handling if a connection is available or not and using either a local or remote database depending on this status.
Should I add an interface to the generic repository that handles returning the correct string and lets the repository know whether it is live or not? I need to enable/disable certain features depending on the connection state, so I will also need a property somewhere so that, when this repository is used, there is a way to bind various controls' enabled state to this status.
Or should I instead have a wrapper that contains, for example, an IRepository and an IConnectionStringManager and then handles feeding and initializing the repository connection string based on availability? This wrapper would expose the repository and any status properties required.
I guess I am not sure whether I should be setting up my program to use IRepository with all the automatic connection sensing behind the scenes, or whether I should have an IRepositoryManager that has an IRepository and an IConnectionStringManager in it.
Maybe both of those options are wrong?
I like the way Entity Framework allows you to provide a connection string as a constructor argument to its contexts. That way you can leverage a dependency injection framework to apply special logic when creating the context, and you only have to change the code in one place (assuming you're using DI principles). You could do something similar with your repository implementation.
Update
Since you're using Entity Framework, here's an idea of what it might look like in code:
// DI Bindings, Ninject style
Bind<Func<MyContext>>().ToMethod(
    c => () => new MyContext(
        c.Kernel.Get<IConnectionManager>().IsOnline()
            ? OnlineConnectionString
            : OfflineConnectionString));
// Usage
public class ThingRepository : IThingRepository
{
    private Func<MyContext> _getContext;

    public ThingRepository(Func<MyContext> getContext)
    {
        _getContext = getContext;
    }

    public IEnumerable<Thing> GetAllThings()
    {
        using (var context = _getContext())
        {
            return context.Things.ToList();
        }
    }
}
Or, if you prefer to use a more explicit factory implementation:
public interface IMyContextFactory
{
    MyContext Get();
}

public class MyContextFactory : IMyContextFactory
{
    private const string OnlineConnectionString = "...";
    private const string OfflineConnectionString = "...";
    private IConnectionManager _connectionManager;

    public MyContextFactory(IConnectionManager connectionManager)
    {
        _connectionManager = connectionManager;
    }

    public MyContext Get()
    {
        var connectionString = _connectionManager.IsOnline()
            ? OnlineConnectionString
            : OfflineConnectionString;
        return new MyContext(connectionString);
    }
}
// DI Bindings, Ninject style
Bind<IMyContextFactory>().To<MyContextFactory>();

// Usage
public class ThingRepository : IThingRepository
{
    private IMyContextFactory _myContextFactory;

    public ThingRepository(IMyContextFactory myContextFactory)
    {
        _myContextFactory = myContextFactory;
    }

    public IEnumerable<Thing> GetAllThings()
    {
        using (var context = _myContextFactory.Get())
        {
            return context.Things.ToList();
        }
    }
}

How to scope out Dbcontexts (to prevent singleton context for entire application)

I was wondering how you scope your DbContexts in Entity Framework so you don't use a single DbContext for your entire application. I am new to Entity Framework and have been reading tutorials, but they all used a single DbContext as an example, so EF is pretty much a black box for me right now.
Let's say for example I have 3 models:
Post
User
Comment
Each model is related to the others (a Post belongs to a User, a Comment belongs to a User and a Post). Do I make a DbContext for each one individually? That wouldn't be correct, since they are all related. Or would I make a DbContext for each scenario that I need? For example, if I only need to query for Posts and Comments and not Users, that would be a PostCommentsContext. And then we would have a PostUserCommentContext...
The best solution would be to use a Unit of Work to wrap the Data Context, manage the connection lifetime, and allow you to work with multiple Repositories (if you were so inclined to go down that path).
Summary of implementation:
Create an interface (IUnitOfWork) which exposes properties for your DbSets, as well as a single method called Commit
Create an implementation (EntityFrameworkUnitOfWork), implementing as required. Commit simply calls SaveChanges on the base class (DbContext) and also provides a good hook-in for last-minute logic (a rough sketch of these two pieces follows after this list).
Your controller accepts an IUnitOfWork; use DI (preferably) to resolve an EntityFrameworkUnitOfWork, with an HTTP-context-scoped lifetime setting (StructureMap is good for this)
(optional, but recommended) create a Repository which also takes the IUnitOfWork, and work off that via your Controller.
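A rough sketch of the first two points, using the question's Post/User/Comment models and assuming classic EF (EF 5/6), where IDbSet<T> is available:
public interface IUnitOfWork
{
    IDbSet<Post> Posts { get; }
    IDbSet<User> Users { get; }
    IDbSet<Comment> Comments { get; }
    void Commit();
}

public class EntityFrameworkUnitOfWork : DbContext, IUnitOfWork
{
    public IDbSet<Post> Posts { get; set; }
    public IDbSet<User> Users { get; set; }
    public IDbSet<Comment> Comments { get; set; }

    public void Commit()
    {
        // good hook-in point for last-minute logic (auditing, validation, ...)
        SaveChanges();
    }
}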
HTH
EDIT - In Response to Comments
Oh, how can you do work that involves creating records in multiple models then? i.e., create a new user and a new post in the same transaction.
Given you're using ASP.NET MVC, your controllers should accept an IUnitOfWork in their constructor.
Here's an example, based on what you asked
public class SomeController : Controller
{
    private IUnitOfWork _unitOfWork;
    private IUserRepo _userRepo;
    private IPostRepo _postRepo;

    public SomeController(IUnitOfWork unitOfWork, IUserRepo userRepo, IPostRepo postRepo)
    {
        _unitOfWork = unitOfWork; // use DI to resolve EntityFrameworkUnitOfWork
        _userRepo = userRepo;
        _postRepo = postRepo;
    }

    [HttpPost]
    public ActionResult CreateUserAndPost(User user, Post post)
    {
        // at this stage, an HTTP request has come in and been resolved to this Controller.
        // your DI container then sees this Controller needs an IUnitOfWork, as well
        // as two Repositories. DI smarts will resolve each dependency.
        // The end result is a single DataContext (wrapped by UoW) shared by all Repos.
        try
        {
            _userRepo.Add(user);
            _postRepo.Add(post);
            // nothing has been sent to the DB yet, only two objects in the EF graph set to EntityState.Added
            _unitOfWork.Commit(); // two INSERTs pushed to the DB
        }
        catch (Exception exc)
        {
            ModelState.AddModelError("UhOh", exc.ToString());
        }

        return View(); // return whatever result is appropriate for your app
    }
}
And one more question, what does the HTTP-context scoped lifetime do?
Objects in DI-talk have scope management settings that include per thread, per session, per HTTP request, singleton, etc.
HTTP-context scoped is the recommended setting for web apps. It means "new up a context when an HTTP request comes in, and get rid of it when the request is finished".
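For example, with the StructureMap versions of that era the registration looked roughly like this (written from memory; treat the exact API as an assumption):
// "new up an EntityFrameworkUnitOfWork per HTTP request, dispose it when the request ends"
ObjectFactory.Initialize(x =>
{
    x.For<IUnitOfWork>().HttpContextScoped().Use<EntityFrameworkUnitOfWork>();
});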
Use 1 DbContext! That will make life easier for you. Don't worry about performance: data that isn't needed or queried won't be loaded and won't consume any resources.
public class UserContext : DbContext
{
    public DbSet<User> Users { get; set; }
    public DbSet<Post> Posts { get; set; }
    public DbSet<Comment> Comments { get; set; }
}
For some scenarios you might want 2 or more contexts:
a context like the one above to hold all the front-end data needed for your application to work, and another context, as an example, to store reports generated from that front-end data, which is only used in the back-end of your application.
I am experimenting with the Unit of Work pattern; here is what I have come up with...
First I created an IUnitOfWork that contains only one method: Save();
Then my DbContext looks like this
public class myContext : DbContext, IUnitOfWork
{
    public DbSet<Users> Users { get; set; }
    public DbSet<Addresses> Address { get; set; }

    public void Save()
    {
        SaveChanges();
    }
}
My repository classes take an IUnitOfWork in their constructors.
public class UserRepository : IRepository<Users>
{
    private myContext _context;

    public UserRepository(IUnitOfWork unitOfWork)
    {
        if (unitOfWork == null)
            throw new ArgumentNullException("unitOfWork");
        _context = unitOfWork as myContext;
    }

    /// other methods ///
}
Then the code in the controller would be something like this
_unitOfWork = new myContext();
_userDB = new UserRepository(_unitOfWork);
_addressDB = new AddressRepository(_unitOfWork);

_userDB.Add(newUser);
_addressDB.Add(newAddress);

_unitOfWork.Save();
I have debugged and proved that no data is committed until the Save method of the _unitOfWork is called. Very cool stuff!!
