I would like to learn more about creating a Database Entity (Connection?).
Coming from PHP/MySQL, I created only one connection and reused it over and over through a connection pool.
I noticed that in MVC I create a new db entity almost every chance I get. Is this really the correct way to do it in a real-world application?
For example, I have code that tells the user how many unread messages they have on every refresh/page view. It goes like this:
public int UnreadMessages()
{
using (dbEntities db = new dbEntities())
{
return db.messages.Where(M => M.status == "Unread").Count();
}
}
On my _Layout.html, I have a line that calls this code, so it is executed on every request. The way I look at it, this is a terrible way of doing it because I keep creating a new connection. Or maybe this is the way it is supposed to be done in MVC?
Could someone please explain the best way of doing this, or provide some links that may help me understand it better?
P.S. I am also not too sure how db connections work in MVC: whether one connection is made and a new db entity (not a connection, just a lightweight object?) is created per request, or whether a brand new connection is opened for every request.
Two things: Entity Framework uses ADO.NET underneath, which supports powerful connection pooling, and the context closes its database connections as soon as each operation completes. So you don't need to worry about connection pooling.
However, it is not a good idea to create and destroy a context every time for a single operation. Ideally only one context should be created for the entire lifecycle of a request; since creating and destroying a context is a little costly, it does affect performance under high load.
Controller has a Dispose method you can override, so you can implement this easily:
public abstract class DBController : Controller {
    public MyDbContext DbContext { get; private set; }

    public DBController() {
        DbContext = new MyDbContext();
        // The controller's HttpContext property is still null in the constructor,
        // so go through the static System.Web.HttpContext.Current instead
        System.Web.HttpContext.Current.Items["DbContext"] = DbContext;
    }

    // Controller exposes Dispose(bool disposing) rather than OnDispose
    protected override void Dispose(bool disposing) {
        if (disposing) {
            DbContext.Dispose();
        }
        base.Dispose(disposing);
    }
}
Every controller should then derive from DBController, and in your layout file you can use the same context by retrieving HttpContext.Items["DbContext"], as sketched below.
This way the same context is used for the entire request. And yes, a new context will be created for every request; EF is not designed to be thread safe and a context should not be reused across requests.
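As a minimal sketch, assuming the DBController above and the MyDbContext / messages names from the question, the layout could pull the shared context back out like this (going through the static HttpContext.Current works from any view):
@{
    // _Layout.cshtml: reuse the per-request context stashed by DBController
    var db = (MyDbContext)System.Web.HttpContext.Current.Items["DbContext"];
    var unreadCount = db.messages.Count(m => m.status == "Unread");
}
<span>@unreadCount unread messages</span>
A cleaner variant is to have the base controller put just the unread count (not the whole context) into ViewBag, so the view stays free of query logic.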
In the MVC world, views (including the layout) should only use data from the model, or include partial views via Html.RenderAction() that get their models from other actions.
You ask about connections and EF, though, and while creating and disposing objects frequently isn't great, you need to understand that EF sits on top of ADO.NET connection pooling; so if your action calls a bunch of methods that each create and dispose their own dbEntities() object, only one connection to the actual database will typically be used.
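As a sketch of the RenderAction approach (the MessagesController name is illustrative; dbEntities and messages are borrowed from the question):
using System.Linq;
using System.Web.Mvc;

public class MessagesController : Controller
{
    // Child action: only callable via Html.Action/RenderAction, not by URL
    [ChildActionOnly]
    public ActionResult UnreadCount()
    {
        using (var db = new dbEntities())
        {
            int count = db.messages.Count(m => m.status == "Unread");
            return Content(count.ToString());
        }
    }
}
The layout then renders it with @{ Html.RenderAction("UnreadCount", "Messages"); } and never touches the database itself.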
In my opinion, it's best to use a using block to create a new instance, as it will automatically close the connection and dispose the instance when the block ends.
If you want to use a global variable instead, you need to make sure you open and close the db connection in each method; then it is still fine.
However, the real problem is that you are calling the database from your _Layout.html: that is a view, and a view should only render the view, not connect to the DB.
I came in late to a project developed with ASP.NET MVC, EF6 and SQL Server. The code is a mess and they use LINQ everywhere to retrieve data. Now they are about to add a second SQL Server for a different country/language. I can't seem to get my head around the DbContext / IDbConnectionFactory stuff, but can I do it the easy way here and override something, somewhere, so I can insert the right connection string depending on the selected culture?
To simplify:
I want to override DbContext so that when it is created I can insert a connection string depending on a session value (for example), without specifying a connection string in the constructor.
How would I achieve this? Doing it the right way, of course.
Thx!
/Mike
Your DBContext must have been created as a partial class. Create your second partial class next to it, and add a static factory method there:
public static DBContext Create()
Implement the logic inside it, and use this factory everywhere you need a context to be created. You can define your connection strings in the config file and simply create the context from the right connection string name depending on your conditions.
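A rough sketch of that factory, assuming named connection strings such as "DBContext_SE" / "DBContext_DE" in web.config and a hypothetical Session["Culture"] value (all of these names are illustrative, not the project's actual ones):
public partial class DBContext
{
    // Extra constructor so the factory can pass a named connection string through
    public DBContext(string nameOrConnectionString) : base(nameOrConnectionString) { }

    public static DBContext Create()
    {
        var culture = System.Web.HttpContext.Current.Session["Culture"] as string ?? "SE";
        return new DBContext("name=DBContext_" + culture);
    }
}
Callers then write using (var db = DBContext.Create()) { ... } instead of new DBContext(). If the model is database-first, the named connection strings must be full EF connection strings (including the metadata part).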
What you're trying to achieve is, to some extent, a database sharding architecture.
There are multiple resources on the internet you may want to have a look at to get acquainted with db shards...
i.e. http://www.4tecture.ch/blog/sql-azure-federations-with-entity-framework
I have a problem with EF not returning the newest data in a 3 layered WPF application, and I suspect it has something to do with how I handle the lifetime of my context. This is the scenario:
There are several repositories wrapped inside a UnitOfWork. There is also one service (MyService), which uses the UnitOfWork. This UnitOfWork must also be called from the UI directly, without passing through a service.
In the ViewModel of my main window at some point I create a new window (using ViewModel first):
var dialog = new DialogViewModel(_eventAggregator, _unitOfWork, Container.Resolve<CarService>());
This main window ViewModel has a UnitOfWork, which has been injected in the constructor, and that is passed to the DialogViewModel.
CarService's constructor also needs a UnitOfWork, which is also injected in its constructor:
public CarService(IUnitOfWork unitOfWork){
_unitOfWork = unitOfWork;
}
When CarService is used in DialogViewModel to run a query and make some updates, it works fine the first time. However, when the same query is run again to retrieve that data, instead of returning the newest, modified values it returns the old, cached ones. The query inside CarService, using the UnitOfWork, looks like this:
var values = _unitOfWork.GarageRepository.GetSomeValues();
_unitOfWork.GarageRepository.MakeSomeChangesToTheValuesUsingStoredProcedure();
The second time this is called, values doesn't contain the newest version of the data, even though the update was applied successfully in the DB.
I'm doing DI using Unity, and this is what my container looks like:
public class Container
{
public static UnityContainer Container = new UnityContainer();
// Called once in the AppBootstrapper, as soon as the GUI application starts
public void BuildUp()
{
Container.RegisterType<IUnitOfWork, UnitOfWork>();
Container.RegisterType<ICarService, CarService>();
}
}
Why isn't the right data being returned, and how can I fix it?
I finally found the problem, which had to do with my management of the unitOfWork/DbContext lifecycle.
I was loading some entities, then updating them with a stored procedure (so the entities tracked in code were no longer up to date), and then running the queries again; at this point EF was returning the values from its cache rather than from the DB.
I found two ways of fixing this:
A rather "hacky" one: force the entities to reload:
Context.Entry(entity).Reload();
Encapsulate the unitOfWork usage in a using block, so that the context is disposed at the end of each transaction and fresh data is loaded the next time. I think this is more in line with what a UnitOfWork is meant for, and it feels more robust to me. I've also wrapped the UnitOfWork in a factory, so that is what now gets injected into the constructors (see the sketch after the snippet below).
using (var uOw = unitOfWorkFactory.GetNew())
{
// make the queries
}
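The factory mentioned above can be as simple as this minimal sketch (the names are illustrative, and IUnitOfWork is assumed to be IDisposable so it works in a using block):
public interface IUnitOfWorkFactory
{
    IUnitOfWork GetNew();
}

public class UnitOfWorkFactory : IUnitOfWorkFactory
{
    public IUnitOfWork GetNew()
    {
        // A fresh, context-backed unit of work per call, so disposing it
        // really does throw away the cached entities
        return new UnitOfWork();
    }
}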
The default LifetimeManager for Unity is the TransientLifetimeManager, which means you get a new instance each time one is resolved (including when injected). So given your registrations, you'll get a new CarService with a new UnitOfWork instance each time you call Resolve(), and a new, separate instance injected into your main window ViewModel.
So your ViewModel gets a UoW, the CarService gets a separate UoW, and updating one will mean the other is now out of date due to caching.
What you need to do is set up a LifetimeManager for the context that has appropriate scope, or defer to a factory. Unity doesn't have that many lifetime managers built in, but the LifetimeManager class is basically a glorified map (it has GetValue, SetValue, and RemoveValue methods, essentially).
I don't know enough about WPF and its lifetimes to suggest an implementation. Maybe it can be singleton (which will keep the same context the entire time the program is running), maybe it can be backed by a thread's CallContext.
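As one possible sketch (PerResolveLifetimeManager ships with Unity and shares a single instance within one resolve graph; whether that scope fits your dialogs is a judgment call):
// Everything built during a single Resolve() call shares the same IUnitOfWork
Container.RegisterType<IUnitOfWork, UnitOfWork>(new PerResolveLifetimeManager());
Container.RegisterType<ICarService, CarService>();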
Your other option is to pass along the UoW instance when you resolve the CarService by calling Container.Resolve<CarService>(new ParameterOverride("unitOfWork", _unitOfWork)). That'll keep the lifecycle management tied to the lifetime of the main window ViewModel. However, this approach has problems because your VM class knows a little too much about the CarService (notably, that it has a UoW in it).
I am currently using a DbContext similar to this:
namespace Models
{
public class ContextDB: DbContext
{
public DbSet<User> Users { get; set; }
public DbSet<UserRole> UserRoles { get; set; }
public ContextDB()
{
}
}
}
I am then using the following line at the top of ALL my controllers that need access to the database. I'm also using it in my UserRepository class, which contains all methods relating to the user (such as getting the active user, checking what roles he has, etc.):
ContextDB _db = new ContextDB();
Thinking about this, there are occasions when one visitor can have multiple DbContexts active, for instance when visiting a controller that uses the UserRepository, which might not be the best of ideas.
When should I make a new DbContext? Alternatively, should I have one global context that is passed around and reused in all places? Would that cause a performance hit? Suggestions of alternative ways of doing this are also welcome.
I use a base controller that exposes a Database property that derived controllers can access.
public abstract class BaseController : Controller
{
public BaseController()
{
Database = new DatabaseContext();
}
protected DatabaseContext Database { get; set; }
protected override void Dispose(bool disposing)
{
Database.Dispose();
base.Dispose(disposing);
}
}
All of the controllers in my application derive from BaseController and are used like this:
public class UserController : BaseController
{
[HttpGet]
public ActionResult Index()
{
return View(Database.Users.OrderBy(p => p.Name).ToList());
}
}
Now to answer your questions:
When should I make a new DbContext / should I have one global context that I pass around?
The context should be created per request. Create the context, do what you need to do with it, then get rid of it. With the base class solution I use, you only have to worry about using the context, not creating or disposing it.
Do not try to have a global context (that is not how web applications work).
Can I have one global Context that I reuse in all places?
No. If you keep a context around it will keep track of all the updates, additions, deletes, etc., and this will slow your application down and may even cause some pretty subtle bugs to appear.
You should probably choose to expose either your repository or your context to your controller, but not both. Having two contexts accessed from the same method is going to lead to bugs if they have different ideas about the current state of the application.
Personally, I prefer to expose DbContext directly as most repository examples I have seen simply end up as thin wrappers around DbContext anyway.
Does this cause a performance hit?
The first time a DbContext is created it is pretty expensive, but once this has been done a lot of the information is cached, so subsequent instantiations are a lot quicker. You are more likely to see performance problems from keeping a context around than from instantiating one each time you need access to your database.
How is everyone else doing this?
It depends.
Some people prefer to use a dependency injection framework to pass a concrete instance of their context to the controller when it is created. Both options are fine. Mine is more suitable for a small-scale application where you know the specific database being used isn't going to change.
Some may argue that you can't know this, and that is why the dependency injection method is better, as it makes your application more resilient to change. My opinion is that it probably won't change (SQL Server and Entity Framework are hardly obscure) and that my time is best spent writing the code that is specific to my application.
I'll try to answer from my own experience.
1. When should I make a new DbContext / should I have one global context that I pass around?
The context should be injected via dependency injection and should not be instantiated yourself. Best practice is to have it created as a scoped service by the DI container (see my answer to question 4).
Please also consider using a properly layered application structure like Controller > BusinessLogic > Repository. In that case it would be the repository, not the controller, that receives the db context. Injecting or instantiating a db context in a controller tells me that your application architecture mixes many responsibilities in one place, which I cannot recommend under any circumstances.
2. Can I have one global context that I reuse in all places?
Yes, you can, but the question should be "Should I have..." -> NO. The context is meant to be used per request: do the work against your repository and then it goes away again.
3. Does this cause a performance hit?
Yes, it does, because the DbContext is simply not made to be global. It stores all the data that has been entered or queried through it until it is destroyed. That means a global context will get larger and larger, and operations on it will get slower and slower, until you either get an out-of-memory exception or die of old age because everything has slowed to a crawl.
You will also get exceptions and many errors when multiple threads access the same context at once.
4. How is everyone else doing this?
A DbContext injected through dependency injection, registered as a scoped service:
services.AddDbContext<UserDbContext>(o => o.UseSqlServer(this.settings.DatabaseOptions.UserDBConnectionString));
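The scoped context then arrives by constructor injection wherever it is needed, for example in a repository. A minimal sketch (the User entity and its IsActive property are just placeholders):
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

public class UserRepository
{
    private readonly UserDbContext _db;

    // Resolved once per request; the container disposes the context at the end of the scope
    public UserRepository(UserDbContext db)
    {
        _db = db;
    }

    public Task<List<User>> GetActiveUsersAsync()
    {
        // IsActive is a hypothetical property on the User entity
        return _db.Users.Where(u => u.IsActive).ToListAsync();
    }
}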
I hope my answers were of help.
From a performance point of view, the DbContext should be created only when it is actually needed. For example, when you need a list of users inside your business layer, you create an instance of your DbContext and dispose it immediately when your work is done:
using (var context = new MyDbContext())   // your derived context type
{
    var users = context.Users.Where(x => x.ClassId == 4).ToList();
}
The context instance will be disposed when execution leaves the using block.
But what would happen if you do not dispose it immediately?
A DbContext is essentially a cache: the more queries you run through it, the more memory it occupies.
This becomes much more noticeable when concurrent requests flood your application; in that case every millisecond you hold on to a block of memory matters, let alone a second.
The longer you postpone disposing unnecessary objects, the closer your application gets to crashing!
Of course, in some cases you need to preserve your DbContext instance and use it in another part of your code, but within the same request context.
I refer you to the following link to get more info regarding managing DbContext:
dbcontext Scope
You should dispose the context immediately after each Save() operation; otherwise each subsequent Save will take longer. I had a project that created and saved complex database entities in a loop. To my surprise, the operation became three times faster after I moved
using (var ctx = new MyContext()){...}
inside the loop.
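In other words, something along these lines (the entity set, the MapToEntity helper, and recordsToImport are only illustrative):
foreach (var record in recordsToImport)
{
    // A short-lived context per iteration keeps the change tracker small,
    // so each SaveChanges call stays fast
    using (var ctx = new MyContext())
    {
        ctx.ComplexEntities.Add(MapToEntity(record));
        ctx.SaveChanges();
    }
}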
Right now I am trying this approach, which avoids instantiating the context when you call actions that don't use it.
public abstract class BaseController : Controller
{
public BaseController() { }
private DatabaseContext _database;
protected DatabaseContext Database
{
get
{
if (_database == null)
_database = new DatabaseContext();
return _database;
}
}
protected override void Dispose(bool disposing)
{
if (_database != null)
_database.Dispose();
base.Dispose(disposing);
}
}
This is obviously an older question, but if you're using DI you can do something like this and scope all your objects to the lifetime of the request:
public class UnitOfWorkAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(HttpActionContext actionContext)
    {
        // IoC.CurrentNestedContainer and BeginTransaction/CloseTransaction are
        // custom helpers from this codebase (a request-scoped nested container
        // and transaction methods on DatabaseContext)
        var context = IoC.CurrentNestedContainer.GetInstance<DatabaseContext>();
        context.BeginTransaction();
    }

    public override void OnActionExecuted(HttpActionExecutedContext actionContext)
    {
        // Commit or roll back depending on whether the action threw
        var context = IoC.CurrentNestedContainer.GetInstance<DatabaseContext>();
        context.CloseTransaction(actionContext.Exception);
    }
}
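Presumably the filter is then applied per controller (or registered globally); a hypothetical example:
// Every action in this Web API controller now runs inside a request-scoped transaction
[UnitOfWork]
public class OrdersController : ApiController
{
    // ...
}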
I have some code which forms a database-centric class that performs CRUD operations. The Insert() and Select() methods use the same connection string, and at the moment both methods repeat the same boilerplate for setting up a SqlConnection.
How is this best refactored? Should I have a property for the SqlConnection?
Thanks
Pull all your DB operations out into a single class, and pass the class to the objects that need it. You can do this via constructor injection (each new object gets an IDBProvider passed to it which it then uses for database operations).
Something like this:
public interface IDBProvider {
// ... list of DB operations you care about
List<Products> GetProducts(string vendor);
}
public class SomeWorkerClass {
private IDBProvider dbConnection;
public SomeWorkerClass(IDBProvider dbProvider) {
dbConnection = dbProvider;
}
public void SomeFunction() {
List<Products> products = dbConnection.GetProducts("test");
}
}
There are lots of frameworks that do this kind of thing for you, like NHibernate, but in some cases it's just as easy to roll your own (upgrading existing code, organizations that don't want external framework dependencies, etc.).
I usually do it one of two ways.
A class that contains static methods and properties to connect to a database. I can then use it in any other class I need.
A SqlConnection property in a class that connects to databases. I then inject the connection from the controlling class when I need it.
By far the first option is used most frequently. The only issue is that if the database server changes, I need to recompile the class. We only have one database server, though, and it doesn't change that often, so it really isn't that much of an issue.
The good thing about the second option is that it's a lot more flexible; if the server, user, or password changes it's as simple as updating that information in the controlling class.
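A minimal sketch of the first option might look like this (the "Main" connection string name is just an assumption):
using System.Configuration;
using System.Data.SqlClient;

public static class Db
{
    private static readonly string ConnectionString =
        ConfigurationManager.ConnectionStrings["Main"].ConnectionString;

    // Callers wrap this in a using block so the connection returns to the pool promptly
    public static SqlConnection OpenConnection()
    {
        var connection = new SqlConnection(ConnectionString);
        connection.Open();
        return connection;
    }
}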
This depends on your application. As far as I'm concerned: no. You might want to add a new method to set up the connection (that might even be a good idea), but a connection as a field/property sounds like "I open a connection for the whole lifetime of my form", which is evil in my world.
Create a helper method and write something like
using (var connection = CreateSqlConnection()) {
// Do the operations here
}
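Where CreateSqlConnection is just a small private helper inside the CRUD class, roughly like this (the connectionString field is assumed to be loaded from configuration elsewhere):
private SqlConnection CreateSqlConnection()
{
    // The single place that knows how to build and open a connection;
    // callers dispose it via the using block above
    var connection = new SqlConnection(connectionString);
    connection.Open();
    return connection;
}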
I'm currently writing a data access layer for an application. The access layer makes extensive use of LINQ classes to return data. Currently, in order to reflect data back to the database, I've added a private data context member and a public Save method. The code looks something like this:
private DataContext myDb;
public static MyClass GetMyClassById(int id)
{
DataContext db = new DataContext();
MyClass result = (from item in db.MyClasses
where item.id == id
select item).Single();
result.myDb = db;
return result;
}
public void Save()
{
myDb.SubmitChanges();
}
That's a gross oversimplification, but it gives the general idea. Is there a better way to handle that sort of pattern? Should I be instantiating a new data context every time I want to hit the db?
It actually doesn't matter too much. I asked Matt Warren from the LINQ to SQL team about this a while ago, and here's the reply:
There are a few reasons we implemented IDisposable:

If application logic needs to hold onto an entity beyond when the DataContext is expected to be used or valid, you can enforce that contract by calling Dispose. Deferred loaders in that entity will still be referencing the DataContext and will try to use it if any code attempts to navigate the deferred properties. These attempts will fail. Dispose also forces the DataContext to dump its cache of materialized entities so that a single cached entity will not accidentally keep alive all entities materialized through that DataContext, which would otherwise cause what appears to be a memory leak.

The logic that automatically closes the DataContext connection can be tricked into leaving the connection open. The DataContext relies on the application code enumerating all results of a query, since getting to the end of a resultset triggers the connection to close. If the application uses IEnumerable's MoveNext method instead of a foreach statement in C# or VB, you can exit the enumeration prematurely. If your application experiences problems with connections not closing and you suspect the automatic closing behavior is not working, you can use the Dispose pattern as a workaround.
But basically you don't really need to dispose of them in most cases - and that's by design. I personally prefer to do so anyway, as it's easier to follow the rule of "dispose of everything which implements IDisposable" than to remember a load of exceptions to it - but you're unlikely to leak a resource if you do forget to dispose of it.
Treat your DataContext as a resource. And the rule for using resources says: "acquire a resource as late as possible, release it as soon as it's safe".
DataContext is pretty lightweight and is intended for unit-of-work usage, which is how you are using it. I don't think I would keep the DataContext in my object, however. You might want to look at the repository pattern if you aren't going to use the designer-generated code to manage your business objects. The repository pattern will allow you to work with your objects detached from the data context, then reattach them before doing updates, etc.
Personally, I'm able to live with the DBML designer-generated code for the most part, with partial class implementations for my business and validation logic. I also make the designer-generated data context abstract and inherit from it, which allows me to intercept things like stored-procedure and table-valued function methods that are added directly to the data context and apply business logic there.
A pattern that I've been using in ASP.NET MVC is to inject a factory class that creates appropriate data contexts as needed for units of work. Using the factory allows me to mock out the data context reasonably easily, by (1) using a wrapper around the existing data context class so that it's mockable (mock the wrapper, since DataContext itself is not easily mockable) and (2) creating fake/mock contexts and factories to create them. Being able to create contexts at will from a factory means I don't have to keep one around for long periods of time.
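A rough sketch of that factory idea, with all names being illustrative rather than a definitive implementation:
public interface IDataContextFactory
{
    MyDataContext Create();
}

public class DataContextFactory : IDataContextFactory
{
    public MyDataContext Create()
    {
        return new MyDataContext();
    }
}

public class OrderService
{
    private readonly IDataContextFactory _factory;

    public OrderService(IDataContextFactory factory)
    {
        _factory = factory;
    }

    public void ProcessOrders()
    {
        // One context per unit of work; disposed as soon as the work is done
        using (var db = _factory.Create())
        {
            // ... query, modify, then db.SubmitChanges();
        }
    }
}
In tests, a fake IDataContextFactory can hand back a wrapped or mock context instead of a real one.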