My implementation of unit of work and repository might be the root of the issue here, but given that everywhere I look for a unit of work implementation I see a different one, I'm sure Ninject has a way of working around it.
So I'm injecting implementations of IRepository<T> into the constructor of my "unit of work".
_kernel.Bind<IRepository<SomeType1>>().To<RepoOfSomeType1>().WhenInjectedInto<IMyUoWFactory>();
_kernel.Bind<IRepository<SomeType2>>().To<RepoOfSomeType2>().WhenInjectedInto<IMyUoWFactory>();
...
I've set up the kernel to instantiate a DataContext in singleton scope which I thought would mean I'm injecting the same instance of this DataContext whenever I need it:
_kernel.Bind<DataContext>().ToConstructor(arg => new MyDataContext(arg.Inject<string>()))
.InSingletonScope() // pass same instance to all repositories?
("connection", _someConnectionString);
The problem I'm finding is that each repository seems to have its own instance, and the unit of work has its own as well - hence the exception I'm getting when I try to commit a transaction (something about a cross-context transaction).
To top it all off, in certain specific situations I need the connection string to be dynamic, so that the unit of work can work off a database that's selected by the user - hence the "need" for a factory.
The result is that I have 1 connection per repository plus 1 connection per unit of work, which defeats the purpose of UoW (i.e. transactional data operations), and all this overhead, I think, is causing serious performance issues (running +/- 200,000 small data operations in... 3-4 hours!!).
This is how I'm creating the unit of work instance; I want to be able to inject repository implementations in there, yet still be able to use the connection string that the user asked for:
public MyUoWFactory(IRepository<Type1> type1repo, IRepository<Type2> type2repo,
IRepository<Type3> type3repo, IRepository<Type4> type4repo,
IRepository<Type5> type5repo, IRepository<Type6> type6repo)
{
_type1Repository = type1repo;
_type2Repository = type2repo;
_type3Repository = type3repo;
_type4Repository = type4repo;
_type5Repository = type5repo;
_type6Repository = type6repo;
}
public IUnitOfWork Create(string userSelectedConnectionString)
{
return new MyUoW(new MyDataContext(userSelectedConnectionString),
_type1Repository, _type2Repository, _type3Repository,
_type4Repository, _type5Repository, _type6Repository);
}
With the kernel bindings I've defined, this causes the DataContext of the repositories to point to wherever the kernel dictates, and the DataContext of the created UoW to point to the database the user asked for.
How can I pull this off without resorting to a service locator? I need the repositories to have their DataContext injected not at app start-up, but after the user has selected a database. Is this where Ninject.Extensions.Factory comes into play?
Argh... did it again (finding the answer shortly after posting in premature despair). What I wanted was for the unit of work factory to actually create the repositories. So the answer is pretty simple: instead of injecting IRepository<T>, I created an IRepositoryFactory<T> interface and injected that instead:
public interface IRepositoryFactory<T> where T : class
{
IRepository<T> Create(DataContext context);
}
This way, the Create method of MyUoWFactory, which takes the user-selected connection string, can create a single context and pass it to all the repository factories:
public MyUoWFactory(IRepositoryFactory<Type1> type1repoFactory, IRepositoryFactory<Type2> type2repoFactory,
IRepositoryFactory<Type3> type3repoFactory, IRepositoryFactory<Type4> type4repoFactory,
IRepositoryFactory<Type5> type5repoFactory, IRepositoryFactory<Type6> type6repoFactory)
{
_type1RepositoryFactory = type1repoFactory;
_type2RepositoryFactory = type2repoFactory;
_type3RepositoryFactory = type3repoFactory;
_type4RepositoryFactory = type4repoFactory;
_type5RepositoryFactory = type5repoFactory;
_type6RepositoryFactory = type6repoFactory;
}
public IUnitOfWork Create(string userSelectedConnectionString)
{
var context = new MyDataContext(userSelectedConnectionString);
return new MyUoW(context,
_type1RepositoryFactory.Create(context),
_type2RepositoryFactory.Create(context),
_type3RepositoryFactory.Create(context),
_type4RepositoryFactory.Create(context),
_type5RepositoryFactory.Create(context),
_type6RepositoryFactory.Create(context));
}
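For completeness, the repository factory implementations can stay trivial. A minimal sketch (RepositoryFactory<T> and Repository<T> are hypothetical generic implementations; the open-generic binding is standard Ninject):
public class RepositoryFactory<T> : IRepositoryFactory<T> where T : class
{
    // Builds a repository around whatever context the UoW factory hands in,
    // so every repository inside one unit of work shares the same DataContext.
    public IRepository<T> Create(DataContext context)
    {
        return new Repository<T>(context);
    }
}

// Composition root:
_kernel.Bind(typeof(IRepositoryFactory<>)).To(typeof(RepositoryFactory<>));
_kernel.Bind<IMyUoWFactory>().To<MyUoWFactory>();
(If each entity type has its own repository class rather than a generic Repository<T>, the per-type factory implementations would simply new up those classes instead.)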
Related
I have a legacy application that I'm maintaining that is leaking memory.
I am reasonably confident that the source is the session management/dependency injection code. It uses Simple Injector and NHibernate.
To start, here are some helper classes and interfaces we use:
public class SessionFactory : Dictionary<string, Func<ISession>>, Helpers.ISessionFactory, IDisposable
{
public ISession CreateNew(string name)
{
return this[name]();
}
public void Dispose()
{
foreach (var key in Keys)
{
this[key]().Close();
this[key]().SessionFactory.Close();
}
}
}
public interface ISessionFactory
{
ISession CreateNew(string name);
}
Here is what the container initialization looks like:
private static void InitializeContainer(Container container)
{
var connectionStrings = System.Configuration.
ConfigurationManager.ConnectionStrings;
var sf1 = new Configuration().Configure().SetProperty(
"connection.connection_string",
connectionStrings["db1"].ConnectionString
).BuildSessionFactory();
var sf2 = new Configuration().Configure().SetProperty(
"connection.connection_string",
connectionStrings["db2"].ConnectionString
).BuildSessionFactory();
var sf3 = new Configuration().Configure().SetProperty(
"connection.connection_string",
connectionStrings["db3"].ConnectionString
).BuildSessionFactory();
container.Register<ISessionFactory>(() =>
new SessionFactory
{
{"db1", sf1.OpenSession},
{"db2", sf2.OpenSession},
{"db3", sf3.OpenSession}
}, Lifestyle.Scoped);
}
Then, inside our base controller (other controllers inherit from it), this happens:
protected BaseController(ISessionFactory factory)
{
this.factory = factory;
db1Session = factory.CreateNew("db1");
db2Session = factory.CreateNew("db2");
db3Session = factory.CreateNew("db3");
}
From there, all of our methods can use a session from any database. Some request methods use multiple database sessions to complete their tasks. This project does not utilize the repository pattern at this point -- rewriting it would be an expensive operation. Is there any obvious memory leak I'm missing in this code?
I find your design very suspicious. First of all, your factory is leaking connections: although you try to dispose it, the only thing you achieve is closing sessions that you open during disposal itself, which isn't very useful and means the sessions that were already handed out are never closed.
Second, a design where your application requests the proper connection using a string-based key is error prone. Your application is probably dealing with multiple database schemas, where each connection relates to a certain schema. This means the connections aren't interchangeable, and that warrants a unique abstraction per schema. So instead of having one generic ISessionFactory abstraction that tries to serve all consumers (and currently fails), make things explicit by giving each unique schema its own abstraction. For instance:
public interface IDb1SessionProvider
{
ISession Session { get; }
}
public interface IDb2SessionProvider
{
ISession Session { get; }
}
public interface IDb3SessionProvider
{
ISession Session { get; }
}
For lack of context, I named the interfaces IDbXSessionProvider, but I bet you can come up with a better name.
This might look weird, since all the interfaces have the same method signature, but remember that they each have a very different contract. The Liskov Substitution Principle is the reason they should not share the same interface.
An implementation for such provider can be made as follows:
public class FuncDb1SessionProvider : IDb1SessionProvider
{
    private readonly Func<ISession> provider;
    public FuncDb1SessionProvider(Func<ISession> sessionProvider)
    {
        this.provider = sessionProvider;
    }
    public ISession Session => provider();
}
And you can register such implementation in Simple Injector as follows:
var factory = new Configuration().Configure().SetProperty(
"connection.connection_string",
connectionStrings["db1"].ConnectionString)
.BuildSessionFactory();
var session1Producer = Lifestyle.Scoped.CreateProducer<ISession>(
factory.OpenSession, container);
container.RegisterSingleton<IDb1SessionProvider>(
new FuncDb1SessionProvider(session1Producer.GetInstance));
What this code does is create a scoped InstanceProducer for the db1 session. The scoped InstanceProducer will ensure only one instance of that session is created during a given scope (usually a web request), and it will ensure that the ISession implementation is disposed at the end of the scope (if it implements IDisposable). The call to InstanceProducer.GetInstance() is wrapped in the FuncDb1SessionProvider, which forwards the creation of the session to the wrapped delegate.
With this design you can let your application code depend on IDb1SessionProvider, and that code can use it without needing to dispose of anything. Every call to IDb1SessionProvider.Session within the same scope will return the same session, and Simple Injector guarantees disposal at the end of the request.
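As an illustration (a sketch; the OrdersController, the Order entity and the Query<T>() LINQ call from NHibernate.Linq are just for the example), a consumer then simply takes the provider as a constructor dependency and uses the session directly:
public class OrdersController : Controller
{
    private readonly IDb1SessionProvider db1;
    public OrdersController(IDb1SessionProvider db1)
    {
        this.db1 = db1;
    }
    public ActionResult Index()
    {
        // Same ISession for the whole request; no explicit Dispose() needed here.
        var orders = db1.Session.Query<Order>().ToList();   // Query<T>() comes from NHibernate.Linq
        return View(orders);
    }
}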
It looks like you have invented your own interface called ISessionFactory. Given that you are using NHibernate which also provides an interface under this name, I would argue that it's VERY unfortunate to use the same names in your own code. You should pick a different name for your own interface and class to avoid confusion.
As for the question itself, NHibernate's ISessionFactory.OpenSession() does exactly that. It will open and return a session. There is no basis to assume that it will do something magic with regards to reuse or scoping.
To have NHibernate assist with contextual sessions, you need to configure the proper "context provider" and use, among other things, ISessionFactory.GetCurrentSession(). See Contextual Sessions in the NHibernate reference.
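For reference, the contextual-sessions approach looks roughly like this (a sketch; "thread_static" is just one of the available context classes, and where you bind/unbind depends on how you scope your units of work):
// Tell NHibernate which context class manages the "current" session.
var configuration = new Configuration().Configure()
    .SetProperty("current_session_context_class", "thread_static");
var sessionFactory = configuration.BuildSessionFactory();

// At the start of a unit of work (e.g. the beginning of a request):
var session = sessionFactory.OpenSession();
CurrentSessionContext.Bind(session);                 // NHibernate.Context

// Anywhere else during that unit of work:
var current = sessionFactory.GetCurrentSession();    // returns the bound session

// At the end of the unit of work:
CurrentSessionContext.Unbind(sessionFactory);
session.Dispose();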
Alternatively, you can manage the sessions using whatever you like, but then you must use that mechanism to retrieve the current session and not expect NHibernate to know about it.
I have a service layer that I am reusing (AKA Business Layer).
Here's an example of one of my services with an IMyContextFactory dependency, this returns an instance of IMyContext.
public class MyService : IMyService
{
private IMyContextFactory DbContextFactory;
public MyService(IMyContextFactory dbContextFactory)
{
this.DbContextFactory = dbContextFactory;
}
public void DoSomething(int id)
{
// Get instance of the db for use
IMyContext dbContext = this.DbContextFactory.CreateMyDbContext();
// Use the business layer for something
var user = dbContext.Set<User>().Find(id);
}
}
I am using the Ninject Factory extension.
Is it possible to make the IMyContextFactory to return the same instance of IMyContext every time?
Background
Originally I was injecting IMyDbContext straight into the service without the factory and I had this InRequestScope() when initialized by my ASP.NET MVC website.
But now I am making use of the service in a Windows Service too, and I don't want my DbContext to become bloated because of frequent looping. I didn't want my service to be newed up for every request either, so that's why I thought a factory within the service would do the trick.
I need the best of both worlds: InRequestScope() or a new instance every time, depending on the configuration. I already have a separate config for ASP.NET and for the Windows Service - the question is just how I get the same instance each time from the factory.
I'm not fully proficient with Ninject, but according to this page https://github.com/ninject/Ninject.Extensions.Factory/wiki/Factory-interface it seems that the instance returned by the factory is retrieved from the IResolutionRoot.
My take would be that you have to register your IMyContext concrete type with a singleton lifetime.
(But it seems it's not that good an idea to never destroy your context, according to Erik Fukenbusch's comment.)
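Concretely, the registration could look something like this (a sketch; MyContext is a placeholder for your concrete context class, and ToFactory() comes from Ninject.Extensions.Factory):
// ASP.NET composition root: one shared context.
kernel.Bind<IMyContext>().To<MyContext>().InSingletonScope();   // or InRequestScope() with Ninject.Web.Common
kernel.Bind<IMyContextFactory>().ToFactory();                   // Ninject.Extensions.Factory

// Windows Service composition root: a fresh context per call.
kernel.Bind<IMyContext>().To<MyContext>();                      // transient
kernel.Bind<IMyContextFactory>().ToFactory();

// Because the generated factory resolves IMyContext from the kernel,
// CreateMyDbContext() returns the same instance under the singleton binding
// and a new instance under the transient binding.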
I am using EntityFramework.Extended library to perform batch updates. The only problem is EF does not keep track of the batch updates performed by the library. So when I query the DbContext again it does not return the updated entities.
I found that using AsNoTracking() method while querying disables the tracking and gets fresh data from the database. However, since EF does not keep track of the entities queried with AsNoTracking(), I am not able to perform any update on the queried data.
Is there any way to force EF to get the latest data while tracking changes?
Please try this to refresh a single entity:
Context.Entry(entity).Reload();
Edit:
To get fresh data for a collection of entities, it is worth trying to dispose the DbContext instance after each request.
I stumbled upon this question while searching for a solution to a problem I was having where the navigation properties were not populating after updating the entity. Whenever I attempted to reload the entity from the database, it would grab the entry from the local store instead which would not populate the navigation properties via lazy loading. Instead of destroying the context and recreating one, I found this allowed me to get fresh data with the proxies working:
_db.Entry(entity).State = EntityState.Detached;
The logic behind it was: my update attached the entity so it would track changes to it, which adds it to the local store. Thereafter, any attempt to retrieve the entity with functional proxies would grab the local one instead of going out to the db and returning a fresh, proxy-enabled entity. I tried the reload option above, which does refresh the object from the database, but that doesn't give you the proxied object with lazy loading. I tried doing a Find(id), Where(t => t.Id == id), First(t => t.Id == id). Finally, I checked the available states and saw there was a "Detached" state. Eureka! Hope this helps someone.
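In other words, the pattern was roughly this (a sketch; the Users set and Id property are hypothetical):
// Detach the stale, change-tracked instance...
_db.Entry(entity).State = EntityState.Detached;
// ...then query again: EF now goes to the database and returns
// a fresh, proxy-enabled instance with lazy loading intact.
var fresh = _db.Users.First(u => u.Id == entity.Id);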
I declared the entity variable, without assignment, as part of the class. This allowed me to dispose of an instance without losing the variable for reference by other methods. I just came across this, so it doesn't have a lot of runtime under its belt, but so far it seems to be working fine.
public partial class frmMyForm
{
    private My_Entities entity;
    public frmMyForm()
    {
        InitializeComponent();
    }
    private void SomeControl_Click(object sender, EventArgs e)
    {
        entity.SaveChanges();
        entity.Dispose();
        entity = new My_Entities();
        //more code using entity ...
    }
}
Stumbled onto this problem. My app wasn't returning fresh data from the database.
There seem to be 3 solutions:
Reload on select: first you select the object, then reload. Loading it twice if it's not cached?
Detach after use: if you forget to detach an object after use, it's going to cause bugs in completely separate parts of the application that are going to be extremely hard to track down.
Disposing the DbContext after use. Definitely seems like the way to go.
I was creating my DbContext instance in the Repository class. If the DbContext is declared at the Repository level, then I have no control over how it gets disposed. That's a no-no. If I create a new DbContext on every call, then I cannot call Select, modify data, and then call Update.
Seems like something is fundamentally missing in my Repository pattern.
After some research on the fundamentals of the Repository pattern, I found the solution: the Unit of Work pattern alongside the Repository pattern.
This is an excellent article on the Unit of Work pattern
Or this article from Microsoft. What I currently have is the Repository shown further up the page, and what's missing is the section "Implement a Generic Repository and a Unit of Work Class".
Basically, instead of injecting repositories into your services, you access all repositories via a UnitOfWork that you inject into your service. It will solve many problems.
public class UnitOfWork : IUnitOfWork
{
private readonly ApplicationContext _context;
public UnitOfWork(ApplicationContext context)
{
_context = context;
Developers = new DeveloperRepository(_context);
Projects = new ProjectRepository(_context);
}
public IDeveloperRepository Developers { get; private set; }
public IProjectRepository Projects { get; private set; }
public int Complete()
{
return _context.SaveChanges();
}
public void Dispose()
{
_context.Dispose();
}
}
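To illustrate the point, a consumer might look like this (a sketch; ProjectService, the Get methods and the Developers collection are hypothetical):
public class ProjectService
{
    private readonly IUnitOfWork _unitOfWork;
    public ProjectService(IUnitOfWork unitOfWork)
    {
        _unitOfWork = unitOfWork;
    }
    public void AssignDeveloper(int projectId, int developerId)
    {
        // Both repositories share the UnitOfWork's single ApplicationContext,
        // so one Complete() call commits everything together.
        var project = _unitOfWork.Projects.Get(projectId);
        var developer = _unitOfWork.Developers.Get(developerId);
        project.Developers.Add(developer);
        _unitOfWork.Complete();
    }
}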
The question remains: how do you create the IUnitOfWork instance?
If I create it in the class constructor to be injected just like the repository, then it gets created and destroyed exactly the same way, and we're back to the same problem. In ASP.NET and MVC, class instances are short-lived, so injecting in the constructor may be fine, but in Blazor and desktop apps class instances are much longer-lived, and it becomes more of a problem.
This article from Microsoft clearly states that Dependency Injection isn't suitable to manage the lifetime of DbContext in Blazor:
In Blazor Server apps, scoped service registrations can be problematic because the instance is shared across components within the user's circuit. DbContext isn't thread safe and isn't designed for concurrent use. The existing lifetimes are inappropriate for these reasons:
Singleton shares state across all users of the app and leads to inappropriate concurrent use.
Scoped (the default) poses a similar issue between components for the same user.
Transient results in a new instance per request; but as components can be long-lived, this results in a longer-lived context than may be intended.
They suggest using the Factory pattern, which can be implemented like this:
/// <summary>
/// Creates instances of UnitOfWork. Repositories and UnitOfWork are not automatically injected through dependency injection,
/// and this class is the only one injected into classes to give access to the rest.
/// </summary>
public class UnitOfWorkFactory : IUnitOfWorkFactory
{
private readonly IDateTimeService _dateService;
private readonly DbContextOptions<PaymentsContext> _options;
public UnitOfWorkFactory(IDateTimeService dateService, DbContextOptions<PaymentsContext> options)
{
_dateService = dateService;
_options = options;
}
/// <summary>
/// Creates a new Unit of Work, which can be viewed as a transaction. It provides access to all data repositories.
/// </summary>
/// <returns>The new Unit of Work.</returns>
public IUnitOfWork Create() => new UnitOfWork(CreateContext(), _dateService);
/// <summary>
/// Creates a new DbContext.
/// </summary>
/// <returns>The new DbContext.</returns>
public PaymentsContext CreateContext() => new(_options);
}
Neither IUnitOfWork nor any repository will be registered in the IoC container. Only IUnitOfWorkFactory will.
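The registration itself might then look something like this (a sketch using the Microsoft.Extensions.DependencyInjection container; the connection string name, the UseSqlServer provider and the DateTimeService implementation are assumptions, and services/configuration are the usual host-builder objects):
// Program.cs / Startup: only the factory and its dependencies go into the container.
var options = new DbContextOptionsBuilder<PaymentsContext>()
    .UseSqlServer(configuration.GetConnectionString("Payments"))
    .Options;
services.AddSingleton(options);
services.AddSingleton<IDateTimeService, DateTimeService>();
services.AddSingleton<IUnitOfWorkFactory, UnitOfWorkFactory>();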
And finally... how to share a transaction between various services?
I have a SetStatus method that updates the status field in the database. How is this method supposed to know whether it's a stand-alone operation or part of a larger transaction?
Since class-level dependency injection isn't suitable for managing and sharing the Unit of Work, the only option is to pass it as a parameter to the methods that need it.
I add an optional IUnitOfWork? workScope = null parameter to every method that needs it, and call Save only if this parameter is null. Here's an implementation.
public virtual async Task<TempOrder?> SetStatusAsync(int orderId, PaymentStatus status, IUnitOfWork? workScope = null)
{
using var unitOfWork = _workFactory.Create();
var work = workScope ?? unitOfWork;
var order = await work.Orders.GetByIdAsync(orderId);
if (order != null)
{
order.Status = status;
work.Orders.Update(order); // DateModified gets set here
if (workScope == null)
{
await work.SaveAsync();
}
}
return order;
}
Another option is to have IUnitOfWorkFactory.Create take the workScope parameter, and when set:
Re-use the existing DbContext
Do not dispose
IUnitOfWork.Save won't submit
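One way that factory overload might look (a sketch; the Context property on IUnitOfWork and the ownsContext flag are assumptions made for illustration):
public IUnitOfWork Create(IUnitOfWork? workScope = null)
{
    // Re-use the caller's context when an outer scope is supplied; otherwise start a fresh one.
    // When ownsContext is false, SaveAsync() and Dispose() become no-ops, so only the
    // outermost unit of work actually commits and disposes the DbContext.
    return workScope == null
        ? new UnitOfWork(CreateContext(), _dateService, ownsContext: true)
        : new UnitOfWork(workScope.Context, _dateService, ownsContext: false);
}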
My final implementation can be used like this:
public virtual async Task<TempOrder?> SetStatusAsync(int orderId, PaymentStatus status, IUnitOfWork? workScope = null)
{
using var unitOfWork = _workFactory.Create(workScope);
var order = await unitOfWork.Orders.GetByIdAsync(orderId);
if (order != null)
{
order.Status = status;
unitOfWork.Orders.Update(order); // DateModified gets set here
await unitOfWork.SaveAsync(); // Ignored if workScope != null
}
return order;
}
Pheww! That bug was a rabbit hole. It's a pretty long solution but should solve it for good with a solid architecture.
Making the code run on the same context will not give you updated entities; it will only append new entities created in the database between runs. You can force EF to reload like this:
// Entity here is the ObjectContext; OverwriteChanges refreshes already-tracked entities from the database.
ObjectQuery<MyEntity> _query = Entity.MyEntity;
_query.MergeOption = MergeOption.OverwriteChanges;
var myEntity = _query.Where(x => x.Id > 0).ToList();
For me ...
I access my DbContext like this:
_viewModel.Repo.Context
To force EF to hit the database I do this:
_viewModel.Repo.Context = new NewDispatchContext();
Overwriting the current DbContext with a new instance. Then the next time I use my data services they get the data from the database.
Reloading specific entities was not an option for me because I didn't know the exact entity. I also did not want to create a new DbContext as it is injected by DI. So I resorted to the following trick to "reset" the whole context.
foreach (var entry in db.ChangeTracker.Entries())
{
entry.State = EntityState.Detached;
}
I am using Entity Framework code first in my data layer using the repository pattern. I'm currently designing my WCF web services to connect to the data layer and I'm just a little confused about how to link up with the data layer.
In my test project for the data layer I create a new DbContext for each test and wrap it in a using block; within that, I create the repository class, passing in the context as the constructor parameter. I can then call my methods on the repository to get the data back.
Is this correct for a start and then do I do the same in the WCF service methods?
E.g. would I have:
public class UserService : IUserService
{
    public bool CheckEmailAvailability(string email)
    {
        using (var context = new MyDbContext())
        {
            var repository = new UserDataRepository(context);
            var emailAvailable =
                repository.GetItems().Count(
                    u => u.EmailAddress.Equals(email, StringComparison.InvariantCultureIgnoreCase)) == 0;
            return emailAvailable;
        }
    }
}
Or am I using the context/repository concept wrong?
It also strikes me that it would be handy to use DI here so I could mock the data context/repository objects in a WCF service test project. Is this something usually done and if so, does anyone have any links to an example or tutorial on this?
First of all, yes it would be better to inject the repository, or (if your chosen DI framework is unable to resolve them) a factory for the repository.
Additionally, it might be a good idea to remove the idea of the context from your service. To my mind the repository and its helpers should deal with the context, and if necessary with sharing the context between the various DB calls required to assemble your entities. Then the services can request entities from the repositories without worrying about whether they're being sourced from a DB or some other data store.
public class UserService: IUserService
{
IUserRepository userRepository;
... // ctor to inject repository
public bool CheckEmailAvailability(string email)
{
var emailAvailable = !userRepository.GetUserEmails().Any(u => u.EmailAddress.Equals(email, StringComparison.InvariantCultureIgnoreCase));
return emailAvailable;
}
}
public class UserRepository : IUserRepository
{
    ...
    public IEnumerable<User> GetUserEmails()
    {
        // actually this context should be handled more centrally, included here for sake of illustration
        using (var context = new MyDbContext())
        {
            return context.Set<User>().ToList();
        }
    }
}
Finally, with regard to WCF services, I'm not certain what the best advice is. I've seen code where WCF services were created in the middle of business logic, which was awful, but that was a limitation of an oldish WPF code-base with limited DI. The better situations are the ones where a ChannelFactory or service factory can be injected into your service layer. You may find this answer on Dependency Injection in WCF helpful.
And as always, if you're planning to TDD your code, I'd definitely recommend wrapping the intersection points between layers of your application in interfaces so that they're easier to mock out. It sounds like you've got the right idea there.
Looking at this answer on SO, I am a bit confused by the following "principle":
Apply the Hollywood Principle
The Hollywood Principle in DI terms says: Don't call the DI Container, it'll call you.
Never directly ask for a dependency by calling a container from within your code. Ask for it implicitly by using Constructor Injection.
But what if I have a repository class in my DAL, and I want to supply this instance to an object which is created when a TCP/IP client connects? At what point should I make the injection?
Right now, I have something like:
// gets created when a new TCP/IP client is connected
class Worker
{
private readonly IClient client;
public Worker(IClient client)
{
// get the repository
var repo = IoC.GetInstance<IClientMessagesRepo>();
// create an object which will parse messages
var parser = new MessageParser(client);
// create an object which will save them to repo
var logger = new MessageLogger(parser, repo);
}
}
I obviously cannot create this instance when my app is started. So where do I inject the repo?
Thanks a lot!
You should strive to only call IoC.GetInstance() once.
Since you cannot create the Worker at startup, you should instead create a WorkerFactory and have the DI container inject the dependency into that:
public class WorkerFactory
{
private readonly IClientMessagesRepo clientMessagesRepo;
public WorkerFactory(IClientMessagesRepo clientMessagesRepo)
{
this.clientMessagesRepo = clientMessagesRepo;
}
public Worker Create(IClient client)
{
return new Worker(client, clientMessagesRepo);
}
}
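Usage then looks something like this (a sketch; OnClientConnected stands in for wherever your TCP/IP listener accepts a new client):
// Composition root: touch the container once, at startup.
var workerFactory = IoC.GetInstance<WorkerFactory>();

// Called for every new connection; no container access needed here.
void OnClientConnected(IClient client)
{
    var worker = workerFactory.Create(client);
    // ... hand the worker off to whatever drives the connection
}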
Move IClientMessagesRepo to your constructor arguments:
public Worker(IClient client, IClientMessagesRepo clientMessagesRepo)
Now of course this only moves the problem a bit, to the point where the worker is created. At some point calls into the IoC container are necessary, but in those cases I'd rather pass the container in as a parameter than access it from a static property. Or use some kind of factory.
Have IClientMessagesRepo in your arguments, and let the IoC fill that for you:
public Worker(IClient client, IClientMessagesRepo repo)
{
[...]
}
Obviously, your constructor should do a little more than just create a couple local variables, but you get the idea.
As I understand it, you have the repository in your IoC container, but not the IClient. Assuming that you have access to the IoC container at the time you create your worker class, and assuming that you are using StructureMap, you can write:
IClient concreteClient = ...;
var worker = container.With<IClient>(concreteClient).GetInstance<Worker>();
That way you tell StructureMap to use a specific IClient instance, but obtain the other dependencies from the container.
note: It is some time since I last used StructureMap, so perhaps the code is not 100% correct, but the concept is there, you can provide a concrete dependency when creating a component.