Say I have a controller like below:
public class MyController : Controller
{
    private MyDBContext db = new MyDBContext();

    public ActionResult Index()
    {
        return View(db.Items.ToList());
    }
    ...
Typically when I need to make EF calls I will instantiate the DBContext in the function I am using it in and wrap it in a using statement like so:
public ActionResult Index()
{
    using (MyDBContext db = new MyDBContext())
    {
        return View(db.Items.ToList());
    }
}
I found the first example on the www.asp.net website, which seems like a reputable source (right?), but I'm concerned about the context not being manually disposed of after each use.
Is having a context that is defined outside of the function scope without a using statement a bad practice?
Not if you have something like this:
protected override void Dispose(bool disposing)
{
    db.Dispose();
    base.Dispose(disposing);
}

Since Controller implements IDisposable, it's okay.
But generally I try to have a better separation between my controllers and my model.
The first way is the "recommended" way because the context will be disposed when the controller is disposed. Since the controller survives for the life of the request, you're ensured that your context will hang around the entire time as well. Using using with contexts is dangerous, because the context is then disposed at a different point in the request, and that can cause problems if it's accessed after disposal. You're probably okay here since the return is inside the using block, but suppose you did something like the following instead:
List<Item> items;
using (MyDBContext db = new MyDBContext())
{
    items = db.Items.ToList();
}
return View(items);
You'd be in a world of hurt the first time you accessed a navigation property that happened to be lazy-loaded. While you didn't make that mistake in your code, it's far too easy to make in general. If you avoid using altogether with your contexts, you're always safe.
That said, the best way is to actually use dependency injection. For all intents and purposes, your context should be treated as a singleton within a request: you don't want multiple instances floating around, as that's a recipe for disaster. Using a DI container is a good way to ensure that, no matter where and in how many different ways your context is used. Which DI container to use is a highly personal choice. I prefer Ninject, but there are quite a few other choices that may work better for your personal style. Regardless of which you go with, there should be some sort of option for "request scope". That is what you'll want to use with your context, as it ensures that there's only one instance per request, and every request gets its own instance.
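As a minimal sketch of what that looks like with Ninject (assuming the Ninject.Web.Mvc/Ninject.Web.Common packages are installed so controllers are built by the kernel; the context type is the one from the question):

```csharp
// NinjectWebCommon.cs (or wherever your bindings are registered):
private static void RegisterServices(IKernel kernel)
{
    // One context instance per HTTP request; Ninject disposes it
    // automatically when the request ends.
    kernel.Bind<MyDBContext>().ToSelf().InRequestScope();
}

// The controller then receives the request-scoped context via its constructor
// instead of newing it up itself:
public class MyController : Controller
{
    private readonly MyDBContext db;

    public MyController(MyDBContext db)
    {
        this.db = db;
    }

    public ActionResult Index()
    {
        return View(db.Items.ToList());
    }
}
```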
Related
I would like to learn more about creating a database entity (connection?).
Coming from PHP/MySQL, I was creating only one connection and reusing it over and over through a connection pool.
I noticed that in MVC I create a new db entity almost every chance I get. Is this really the correct way to do it in a real-world application?
For example, I have code that tells the user how many unread messages they have on every refresh/page view. It goes like this:
public int UnreadMessages()
{
    using (dbEntities db = new dbEntities())
    {
        return db.messages.Count(m => m.status == "Unread");
    }
}
In my _Layout.cshtml, I have a line that calls this code, so it executes on every request. The way I look at it, this is a terrible way of doing it because I keep creating new connections. Or maybe this is the way it's supposed to be done in MVC?
Could someone please explain the best way of doing this, or provide some links that may help me understand it better?
P.S. I'm also not sure how db connections work in MVC: whether one connection is made and a new db entity (not a connection, just a call?) is created per request, or a brand new connection is made per request.
Two things: Entity Framework uses ADO.NET underneath, which supports powerful connection pooling, and the context closes its database connections promptly. So you don't need to worry about connection pooling.
However, it is not a good idea to create and destroy a context for every single operation. Ideally, only one context should be created for the entire lifecycle of a request. Since creating and destroying a context is a little costly, it does affect performance under high load.
Controller has a Dispose(bool) method you can override, and this is how you can easily implement it:
public abstract class DBController : Controller
{
    public MyDbContext DbContext { get; private set; }

    public DBController()
    {
        DbContext = new MyDbContext();
        // The controller's HttpContext property is not yet available in the
        // constructor, so use the static accessor to stash the context.
        System.Web.HttpContext.Current.Items["DbContext"] = DbContext;
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            DbContext.Dispose();
        }
        base.Dispose(disposing);
    }
}
Every controller should then derive from DBController, and in the layout file you can use the same context by retrieving HttpContext.Items["DbContext"].
This way, the same context is used for the entire request. And yes, a new context is created for every request. An EF context is not designed to be thread safe and should not be reused across requests.
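A minimal sketch of the layout side (assuming Razor and the DBController base class above; the cast target is whatever concrete context type you stashed, and the messages/status names come from the question's model):

```csharp
// In _Layout.cshtml: retrieve the request-scoped context stashed by the
// controller and query it.
@{
    var db = (MyDbContext)ViewContext.HttpContext.Items["DbContext"];
    var unread = db.messages.Count(m => m.status == "Unread");
}
<span>@unread unread messages</span>
```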
In the MVC world, views (including the layout) should only use data from the model, or include partial views with RenderAction() that get their models from other actions.
You ask about connections and EF though, and while opening and disposing objects frequently isn't great, you need to understand that ADO.NET has its own connection pool underneath, so even if your action calls a bunch of methods that each create and dispose their own dbEntities() object, only one connection to the actual database will typically be used.
In my opinion, it's recommended to use using when creating a new instance, as it will automatically close the connection and dispose of the instance.
If you want to use a global variable, you need to make sure to open and close the db connection in each method; then it will still be fine.
However, the bad thing you are doing is calling the database from your _Layout.cshtml. That is the view; it should only render, not connect to the DB.
I have a problem with EF not returning the newest data in a 3 layered WPF application, and I suspect it has something to do with how I handle the lifetime of my context. This is the scenario:
There are several repositories wrapped inside a UnitOfWork. There is also one service (MyService), which uses the UnitOfWork. This UnitOfWork must also be called from the UI directly, without passing through a service.
In the ViewModel of my main window at some point I create a new window (using ViewModel first):
var dialog = new DialogViewModel(_eventAggregator, _unitOfWork, Container.Resolve<CarService>());
This main window ViewModel has a UnitOfWork, which has been injected in the constructor, and that is passed to the DialogViewModel.
CarService's constructor also needs a UnitOfWork, which is also injected in its constructor:
public CarService(IUnitOfWork unitOfWork)
{
    _unitOfWork = unitOfWork;
}
When CarService is used in DialogViewModel to make a query to retrieve some data and make some updates, it works fine the first time. However, when the same query is made the next time to retrieve that data, instead of returning the newest modified one it returns the old/cached one. The query using UnitOfWork (inside CarService) looks like this:
var values = _unitOfWork.GarageRepository.GetSomeValues();
_unitOfWork.GarageRepository.MakeSomeChangesToTheValuesUsingStoredProcedure();
The second time this is called, values doesn't contain the newest version of the data; however it has been updated successfully in the DB.
I'm doing DI using Unity, and this is what my container looks like:
public class Container
{
    public static UnityContainer Container = new UnityContainer();

    // Called once in the AppBootstrapper, as soon as the GUI application starts
    public void BuildUp()
    {
        Container.RegisterType<IUnitOfWork, UnitOfWork>();
        Container.RegisterType<ICarService, CarService>();
    }
}
Why isn't the right data being returned, and how can I fix it?
I finally found the problem, which had to do with my management of the unitOfWork/dbcontext lifecycle.
I was loading some entities, then updating them with a stored procedure (so the entities in the code were not up to date anymore), and then loading the queries again; at this point EF was getting the values from the cache rather than from the DB.
I found two ways of fixing this:
A rather "hacky" one, force the entities to reload:
Context.Entry(entity).Reload();
Encapsulate the unitOfWork usage in a using block, so that the context is disposed at the end of each transaction and you get fresh data the next time. I think this is more in line with what the UnitOfWork is meant for, and it feels more robust to me. I've also wrapped the UnitOfWork in a factory, which now gets injected into the constructors.
using (var uow = unitOfWorkFactory.GetNew())
{
    // make the queries
}
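The factory itself isn't shown in the answer; a minimal sketch (the IUnitOfWorkFactory name and the assumption that UnitOfWork takes the context in its constructor are mine, based on the snippet above) could look like:

```csharp
public interface IUnitOfWorkFactory
{
    IUnitOfWork GetNew();
}

public class UnitOfWorkFactory : IUnitOfWorkFactory
{
    // Each call hands back a fresh unit of work with its own context,
    // so every transaction starts with an empty cache and sees fresh data.
    public IUnitOfWork GetNew()
    {
        return new UnitOfWork(new MyEntities());
    }
}
```

Registering IUnitOfWorkFactory with Unity (transient is fine, since the factory itself is stateless) then lets consumers create and dispose short-lived units of work without knowing about the context.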
The default LifetimeManager for Unity is the TransientLifetimeManager, which means you get a new instance each time it resolves (including when injected). So given your registrations, you'll get a new CarService, with a new instance of a UnitOfWork each time you call Resolve(), and a new, different instance injected into your main window ViewModel.
So your ViewModel gets a UoW, the CarService gets a separate UoW, and updating one will mean the other is now out of date due to caching.
What you need to do is set up a LifetimeManager for the context that has appropriate scope, or defer to a factory. Unity doesn't have that many LMs built in, but the LifetimeManager class is basically a glorified map (has a Set, Get, and Remove method essentially).
I don't know enough about WPF and its lifetimes to suggest an implementation. Maybe it can be singleton (which will keep the same context the entire time the program is running), maybe it can be backed by a thread's CallContext.
Your other option is to pass along the UoW instance when you resolve the CarService by calling Container.Resolve<CarService>(new ParameterOverride("unitOfWork", _unitOfWork)). That'll keep the lifecycle management tied to the lifetime of the main window ViewModel. However, this approach has problems because your VM class knows a little too much about the CarService (notably, that it has a UoW in it).
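As a sketch of the lifetime-manager route (ContainerControlledLifetimeManager is Unity's built-in singleton manager; whether a process-wide singleton is appropriate for your WPF app is the judgment call discussed above):

```csharp
// Every Resolve<IUnitOfWork>(), whether called directly or satisfied by
// constructor injection, now returns the same instance for the lifetime
// of the container, so the ViewModel and CarService share one UoW.
Container.RegisterType<IUnitOfWork, UnitOfWork>(
    new ContainerControlledLifetimeManager());
```

Note this trades the stale-cache problem for a long-lived context, so you would still want to reload or recreate it around stored-procedure updates.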
I am currently using a DbContext similar to this:
namespace Models
{
    public class ContextDB : DbContext
    {
        public DbSet<User> Users { get; set; }
        public DbSet<UserRole> UserRoles { get; set; }

        public ContextDB()
        {
        }
    }
}
I am then using the following line at the top of ALL my controllers that need access to the database. I'm also using it in my UserRepository class, which contains all methods relating to the user (such as getting the active user, checking what roles he has, etc.):
ContextDB _db = new ContextDB();
Thinking about this, there are occasions when one visitor can have multiple DbContexts active, for instance when visiting a controller that uses the UserRepository, which might not be the best of ideas.
When should I make a new DbContext? Alternatively, should I have one global context that is passed around and reused in all places? Would that cause a performance hit? Suggestions of alternative ways of doing this are also welcome.
I use a base controller that exposes a DataBase property that derived controllers can access.
public abstract class BaseController : Controller
{
    public BaseController()
    {
        Database = new DatabaseContext();
    }

    protected DatabaseContext Database { get; set; }

    protected override void Dispose(bool disposing)
    {
        Database.Dispose();
        base.Dispose(disposing);
    }
}
All of the controllers in my application derive from BaseController and are used like this:
public class UserController : BaseController
{
    [HttpGet]
    public ActionResult Index()
    {
        return View(Database.Users.OrderBy(p => p.Name).ToList());
    }
}
Now to answer your questions:
When should I make a new DbContext / should I have one global context
that I pass around?
The context should be created per request. Create the context, do what you need to do with it then get rid of it. With the base class solution I use you only have to worry about using the context.
Do not try and have a global context (this is not how web applications work).
Can I have one global Context that I reuse in all places?
No. If you keep a context around, it will keep track of all the updates, additions, deletes, etc., and this will slow your application down and may even cause some pretty subtle bugs to appear in your application.
You should probably choose to either expose your repository or your context to your controller, but not both. Having two contexts accessed from the same method is going to lead to bugs if they have different ideas about the current state of the application.
Personally, I prefer to expose DbContext directly as most repository examples I have seen simply end up as thin wrappers around DbContext anyway.
Does this cause a performance hit?
The first time a DbContext is created it is pretty expensive, but once this has been done a lot of the information is cached, so subsequent instantiations are a lot quicker. You are more likely to see performance problems from keeping a context around than from instantiating one each time you need database access.
How is everyone else doing this?
It depends.
Some people prefer to use a dependency injection framework to pass a concrete instance of their context to their controller when it is created. Both options are fine. Mine is more suitable for a small-scale application where you know the specific database being used isn't going to change.
Some may argue that you can't know this, and that is why the dependency injection method is better, as it makes your application more resilient to change. My opinion is that it probably won't change (SQL Server and Entity Framework are hardly obscure) and that my time is best spent writing the code that is specific to my application.
I'll try to answer from my own experience.
1. When should I make a new DbContext / should I have one global context that I pass around?
The context should be injected by dependency injection and should not be instantiated by yourself. Best practice is to have it created as a scoped service by the dependency injection container. (See my answer to question 4.)
Please also consider using a properly layered application structure like Controller > BusinessLogic > Repository. In that case your controller would not receive the db context; the repository would instead. Getting a db context injected into, or instantiated in, a controller tells me that your application architecture mixes many responsibilities in one place, which I cannot recommend under any circumstances.
2. Can i have one global Context that I reuse in all places?
Yes, you can, but the question should be "Should I have..." -> NO. The context is meant to be used per request: it applies your changes to the repository and then it's gone again.
3. Does this cause a performance hit?
Yes, it does, because the DbContext is simply not made to be global. It stores all the data that has been entered or queried into it until it is destroyed. That means a global context will get larger and larger, and operations on it slower and slower, until you get an out-of-memory exception or you die of old age because it all slowed to a crawl.
You will also get exceptions and many errors when multiple threads access the same context at once.
4. How is everyone else doing this?
DBContext injected through dependency-injection by a factory; scoped:
services.AddDbContext<UserDbContext>(o => o.UseSqlServer(this.settings.DatabaseOptions.UserDBConnectionString));
I hope my answers were of help.
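A sketch of the consuming side, to match the scoped AddDbContext registration above (ASP.NET Core style; the UserRepository class and the IsActive property are hypothetical names for illustration):

```csharp
public class UserRepository
{
    private readonly UserDbContext context;

    // The scoped UserDbContext is injected: one instance per request,
    // created and disposed by the DI container, never newed up here.
    public UserRepository(UserDbContext context)
    {
        this.context = context;
    }

    public List<User> GetActiveUsers()
    {
        return context.Users.Where(u => u.IsActive).ToList();
    }
}
```

The repository would itself be registered as a scoped service, so everything touching the context within one request shares the same instance.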
From a performance point of view, the DbContext should be created only when it is actually needed. For example, when you need the list of users inside your business layer, you create an instance of your DbContext and immediately dispose of it when your work is done:
using (var context = new DbContext())
{
    var users = context.Users.Where(x => x.ClassId == 4).ToList();
}
The context instance will be disposed after leaving the using block.
But what would happen if you do not dispose it immediately?
DbContext is in essence a cache, and the more queries you make, the more memory it occupies.
This is much more noticeable when concurrent requests flood your application; in that case, every millisecond that you occupy a memory block matters, let alone a second.
The longer you postpone disposing of unnecessary objects, the closer your application is to crashing!
Of course, in some cases you need to preserve your DbContext instance and use it in another part of your code, but within the same request context.
I refer you to the following link to get more info regarding managing DbContext:
dbcontext Scope
You should dispose of the context immediately after each Save() operation; otherwise each subsequent Save will take longer. I had a project that created and saved complex database entities in a loop. To my surprise, the operation became three times faster after I moved
using (var ctx = new MyContext()){...}
inside the loop.
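A sketch of the before and after (MyContext, the Entities set, and the items collection are placeholders):

```csharp
// Slow: one context's change tracker accumulates every entity saved in
// the loop, so change detection gets more expensive with each iteration.
using (var ctx = new MyContext())
{
    foreach (var item in items)
    {
        ctx.Entities.Add(item);
        ctx.SaveChanges();
    }
}

// Faster in this scenario: a fresh context per iteration keeps the
// change tracker small, at the cost of one context creation per item.
foreach (var item in items)
{
    using (var ctx = new MyContext())
    {
        ctx.Entities.Add(item);
        ctx.SaveChanges();
    }
}
```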
Right now I am trying this approach, which avoids instantiating the context when you call actions that don't use it.
public abstract class BaseController : Controller
{
    public BaseController() { }

    private DatabaseContext _database;

    protected DatabaseContext Database
    {
        get
        {
            if (_database == null)
                _database = new DatabaseContext();
            return _database;
        }
    }

    protected override void Dispose(bool disposing)
    {
        if (_database != null)
            _database.Dispose();
        base.Dispose(disposing);
    }
}
This is obviously an older question, but if you're using DI you can do something like this and scope all your objects to the lifetime of the request:
public class UnitOfWorkAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(HttpActionContext actionContext)
    {
        var context = IoC.CurrentNestedContainer.GetInstance<DatabaseContext>();
        context.BeginTransaction();
    }

    public override void OnActionExecuted(HttpActionExecutedContext actionContext)
    {
        var context = IoC.CurrentNestedContainer.GetInstance<DatabaseContext>();
        context.CloseTransaction(actionContext.Exception);
    }
}
I have an UnitOfWork attribute, something like this:
public class UnitOfWorkAttribute : ActionFilterAttribute
{
    public IDataContext DataContext { get; set; }

    public override void OnActionExecuted(ActionExecutedContext filterContext)
    {
        if (filterContext.Controller.ViewData.ModelState.IsValid)
        {
            DataContext.SubmitChanges();
        }
        base.OnActionExecuted(filterContext);
    }
}
As you can see, it has a DataContext property, which is injected by Castle.Windsor. DataContext has a lifestyle of PerWebRequest, meaning a single instance is reused within each request.
The thing is, from time to time I get a "DataContext is disposed" exception in this attribute, and I remember that ASP.NET MVC 3 tries to cache action filters somehow, so maybe that causes the problem?
If so, how do I solve the issue? By not using any properties and resolving via a ServiceLocator inside the method?
Is it possible to tell ASP.NET MVC not to cache the filter, if it does cache it?
I would strongly advise against using such a construct, for a couple of reasons:
It is not the responsibility of the controller (or an on the controller decorated attribute) to commit the data context.
This would lead to lots of duplicated code (you'll have to decorate lots of methods with this attribute).
At that point in the execution (in the OnActionExecuted method) you cannot determine whether it is actually safe to commit the data.
Especially the third point should have drawn your attention. The mere fact that the model is valid, doesn't mean that it is okay to submit the changes of the data context. Look at this example:
[UnitOfWorkAttribute]
public View MoveCustomer(int customerId, Address address)
{
    try
    {
        this.customerService.MoveCustomer(customerId, address);
    }
    catch { }

    return View();
}
Of course this example is a bit naive. You would hardly ever swallow each and every exception, that would just be plain wrong. But what it does show is that it is very well possible for the action method to finish successfully, when the data should not be saved.
But besides this, is committing the transaction really a concern of MVC? And if you decide it is, do you really want to decorate every action method with this attribute? Wouldn't it be nicer to implement this without having to do anything at the controller level? Because which attributes are you going to add after this one? Authorization attributes? Logging attributes? Tracing attributes? Where does it stop?
What you can try instead is to model all business operations that need to run in a transaction, in a way that allows you to dynamically add this behavior, without needing to change any existing code, or adding new attributes all over the place. A way to do this is to define an interface for these business operations. For instance:
public interface ICommandHandler<TCommand>
{
    void Handle(TCommand command);
}
Using this interface, your controller would look like this:
private readonly ICommandHandler<MoveCustomerCommand> handler;

// constructor
public CustomerController(
    ICommandHandler<MoveCustomerCommand> handler)
{
    this.handler = handler;
}

public View MoveCustomer(int customerId, Address address)
{
    var command = new MoveCustomerCommand
    {
        CustomerId = customerId,
        Address = address,
    };

    this.handler.Handle(command);
    return View();
}
For each business operation in the system you define a class (a DTO and Parameter Object); in the example, the MoveCustomerCommand class. This class contains merely the data. The logic is defined in a class that implements ICommandHandler<MoveCustomerCommand>. For instance:
public class MoveCustomerCommandHandler
    : ICommandHandler<MoveCustomerCommand>
{
    private readonly IDataContext context;

    public MoveCustomerCommandHandler(IDataContext context)
    {
        this.context = context;
    }

    public void Handle(MoveCustomerCommand command)
    {
        // TODO: Put logic here.
    }
}
This looks like an awful lot of extra useless code, but this is actually really useful (and if you look closely, it isn't really that much extra code anyway).
Interesting about this is that you can now define one single decorator that handles the transactions for all command handlers in the system:
public class TransactionalCommandHandlerDecorator<TCommand>
    : ICommandHandler<TCommand>
{
    private readonly IDataContext context;
    private readonly ICommandHandler<TCommand> decoratedHandler;

    public TransactionalCommandHandlerDecorator(IDataContext context,
        ICommandHandler<TCommand> decoratedHandler)
    {
        this.context = context;
        this.decoratedHandler = decoratedHandler;
    }

    public void Handle(TCommand command)
    {
        this.decoratedHandler.Handle(command);
        this.context.SubmitChanges();
    }
}
This is not much more code than your UnitOfWorkAttribute, but the difference is that this handler can be wrapped around any implementation and injected into any controller, without the controller knowing about it. And directly after executing a command is really the only safe place where you actually know whether you can save the changes or not.
You can find more information about this way of designing your application in this article: Meanwhile... on the command side of my architecture
Today I half-accidentally found the original cause of the problem.
As seen in the question, the filter has a property that is injected by Castle.Windsor. Those who use ASP.NET MVC know that for this to work you need an IFilterProvider implementation that can use the IoC container for dependency injection.
So I started to look at its implementation and noticed that it derives from FilterAttributeFilterProvider, and FilterAttributeFilterProvider has the constructor:
public FilterAttributeFilterProvider(bool cacheAttributeInstances)
So you can control whether or not your attribute instances are cached.
After disabling this cache, the site blew up with NullReferenceExceptions, which led me to one more thing that had been overlooked and was causing undesired side effects.
The thing was that the original filter provider was not removed after we added the Castle.Windsor filter provider. So while caching was enabled, the IoC filter provider was creating instances and the default filter provider was reusing them, meaning all dependency properties were filled with values. That was not clearly noticeable, except that the filters were running twice. After caching was disabled, the default provider had to create instances by itself, so the dependency properties were left unfilled, and that is why the NullReferenceExceptions occurred.
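A sketch of the registration fix implied above (the WindsorFilterProvider name is an assumption for the custom container-aware provider; the key point is removing the default provider so each filter has exactly one provider):

```csharp
// In Application_Start: replace the default attribute filter provider
// with the container-aware one instead of registering both side by side.
var defaultProvider = FilterProviders.Providers
    .OfType<FilterAttributeFilterProvider>()
    .Single();
FilterProviders.Providers.Remove(defaultProvider);

// The custom provider would pass cacheAttributeInstances: false to the
// FilterAttributeFilterProvider base constructor, so injected properties
// are resolved fresh for every request instead of reused from a cache.
FilterProviders.Providers.Add(new WindsorFilterProvider(container));
```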
I have some questions about the desired lifetime of an Entity Framework context in an ASP.NET MVC application. Isn't it best to keep the context alive for the shortest time possible?
Consider the following controller action:
public ActionResult Index()
{
    IEnumerable<MyTable> model;
    using (var context = new MyEntities())
    {
        model = context.MyTable;
    }
    return View(model);
}
The code above won't work because the Entity Framework context has gone out of scope while the view renders the page. How would others structure the code above?
Let's get controversial!
I disagree with the general MVC + EF consensus that keeping a context alive throughout the entire request is a good thing for a number of reasons:
Low performance increase
Do you know how expensive creating a new database context is? Well... "A DataContext is lightweight and is not expensive to create". That's from MSDN.
Get the IoC wrong and it'll seem fine.. until you go live
If you set up your IoC container to dispose of your context for you and you get it wrong, you really, really get it wrong. I've now twice seen massive memory leaks created by an IoC container not always disposing of a context correctly. You won't realise you've set it up wrong until your servers start crumbling under normal levels of concurrent users. It won't happen in development, so do some load tests!
Accidental lazy loading
You return an IQueryable of your most recent articles so that you can list them on your homepage. One day someone else is asked to show the number of comments next to each article, so they add a simple bit of code to the view to show the comment count, like so...
@foreach (var article in Model.Articles) {
    <div>
        <b>@article.Title</b> <span>@article.Comments.Count() comments</span>
    </div>
}
Looks fine, works fine. But actually you didn't include the comments in your returned data, so now this makes a new database call for each article in the loop: the SELECT N+1 issue. 10 articles = 11 database calls. Okay, so the code is wrong, but it is an easy mistake to make, so it will happen.
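For reference, the standard fix is to eager-load the comments in the original query (a sketch; assumes EF's lambda Include from the System.Data.Entity namespace, and the PublishedOn property is a placeholder for whatever you order by):

```csharp
// One database call brings back the articles and their comments together,
// so the view never triggers lazy loads.
var articles = context.Articles
    .Include(a => a.Comments)
    .OrderByDescending(a => a.PublishedOn)
    .Take(10)
    .ToList();
```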
You can prevent this by shutting your context down in your data layer. But won't the code then break with a NullReferenceException on article.Comments.Count()? Yes, it will, so it forces you to edit the data layer to fetch the data the view layer needs. This is how it should be.
Code smell
There is just something wrong about hitting the database from your view. You know that an IQueryable hasn't actually hit the database yet, right? So forget that object. Make sure your database is hit before the data leaves your data layer.
So the answer
Your code should be (in my opinion) like this
DataLayer:
public List<Article> GetArticles()
{
    List<Article> model;
    using (var context = new MyEntities())
    {
        // For this example I've assumed your "MyTable" is a table of news articles
        model = (from mt in context.Articles
                 select mt).ToList();
        // Data is in a List<T>, so the database has been hit and the data is final
    }
    return model;
}
Controller:
public ActionResult Index()
{
    var model = new HomeViewModel(); // class with the bits needed for your view
    model.Articles = _dataService.GetArticles(); // irrelevant how _dataService was initialised
    return View(model);
}
Once you have done this and understand it, then perhaps you can begin experimenting with having an IoC container handle the context, but definitely not before. Heed my warning: I've seen two large-scale failures :)
But honestly, do what you like; programming is fun and should be a matter of preference. I'm just telling you mine. But whatever you do, don't start using an IoC context per controller or per request just because "all the cool kids are doing it." Do it because you really, truly care about its benefits and understand how it's done correctly.
I agree with one context per request. We normally do this by binding the context .InRequestScope() using Ninject, which works really well:
Bind<MyContext>().ToSelf().InRequestScope();
It's also really good practice to enumerate the set as close to the query as possible, i.e.:
public ActionResult Index()
{
    IEnumerable<MyTable> model;
    using (var context = new MyEntities())
    {
        model = (from mt in context.MyTable
                 select mt).ToArray();
    }
    return View(model);
}
This will help you avoid unintentionally augmenting the query from your view.
Firstly, you should consider moving your database access into separate classes.
Secondly, my favorite solution to this is to use "one context per request" (if you're using MVC, I believe it's one context per controller).
Requested edit:
Have a look at this answer; maybe it will help you too. Please note I'm using WebForms, so I can't verify it in MVC at the moment, but it may be helpful for you or at least give you some pointers. https://stackoverflow.com/a/10153406/1289283
Some example usage of this dbcontext:
public class SomeDataAccessClass
{
    public static IQueryable<Product> GetAllProducts()
    {
        var products = from o in ContextPerRequest.Current.Products
                       select o;
        return products;
    }
}
Then you can do something like this:
public ActionResult Index()
{
    var products = SomeDataAccessClass.GetAllProducts();
    return View(products);
}
Simple, right? You don't have to worry about disposing of your context anymore; you write only the code you really need.
Some people like to further spice things up a little bit by adding UnitOfWork pattern, or maybe IoC containers... But I like this approach more because of its simplicity.
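The ContextPerRequest helper referenced above isn't shown in this excerpt; a common minimal sketch (my assumption of its shape, backed by HttpContext.Items so each request lazily gets its own ContextDB) looks like this:

```csharp
public static class ContextPerRequest
{
    private const string Key = "ContextPerRequest.ContextDB";

    // Lazily creates one context per HTTP request and stores it in
    // HttpContext.Items, so repeated calls within a request share it.
    public static ContextDB Current
    {
        get
        {
            var items = HttpContext.Current.Items;
            if (items[Key] == null)
            {
                items[Key] = new ContextDB();
            }
            return (ContextDB)items[Key];
        }
    }

    // Call this from Application_EndRequest in Global.asax so the
    // context is disposed when the request finishes.
    public static void Dispose()
    {
        var context = HttpContext.Current.Items[Key] as ContextDB;
        if (context != null)
        {
            context.Dispose();
        }
    }
}
```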
You can utilize LINQ's .ToList() extension method, like so:
public ActionResult Index()
{
    IEnumerable<MyTable> model;
    using (var context = new MyEntities())
    {
        model = (from mt in context.MyTable
                 select mt).ToList();
    }
    return View(model);
}