Generally, working with EF in a web application looks like this:
Some web controller calls a method that contains something like this:
using (var ctx = new EntityContext())
{
    ...
    return something;
}
But I guess that in a highly loaded application that handles several thousand requests per minute, this might create problems.
So my question is: does it make sense to manually open a connection and keep it alive?
If yes, can anybody share a proper piece of code for such a task?
Thanks in advance
No, don't try to keep open connections alive. It just won't work; see https://stackoverflow.com/a/9416275/870604.
You're building an ASP.NET MVC application, so the pattern to follow with Entity Framework in this case is quite simple: instantiate a new context for each new controller instance (for example using dependency injection). Since a new controller instance is created for each user request, you'll have one fresh EF context per request, which is highly desirable.
Don't worry about creating a lot of context instances; thanks to connection pooling, it won't open a new DB connection each time.
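As an illustration, here is a minimal sketch of the per-request pattern without any DI container: the controller owns one context for its lifetime and disposes it with the request. EntityContext and the Products set are assumptions based on the snippet above.

public class ProductsController : Controller
{
    // One context per controller instance; MVC creates a new controller
    // (and therefore a new context) for every request.
    private readonly EntityContext _ctx = new EntityContext();

    public ActionResult Index()
    {
        var products = _ctx.Products.ToList();
        return View(products);
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            _ctx.Dispose();
        }
        base.Dispose(disposing);
    }
}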
If you want your application to be scalable, your best option is to also use the async pattern if your version of Entity Framework supports it. See http://blogs.msdn.com/b/webdev/archive/2013/09/18/scaffolding-asynchronous-mvc-and-web-api-controllers-for-entity-framework-6.aspx
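For example, a hedged sketch of an async action with EF 6 (it assumes EntityContext exposes a Products DbSet and that you are on .NET 4.5+):

public async Task<ActionResult> Details(int id)
{
    using (var ctx = new EntityContext())
    {
        // FindAsync frees the request thread while the query runs.
        var product = await ctx.Products.FindAsync(id);
        if (product == null)
        {
            return HttpNotFound();
        }
        return View(product);
    }
}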
EDIT:
Have a look at this overview of the unit of work pattern with ASP.NET MVC and EF.
The using block automatically closes the connection when it ends, so you don't need to worry about it. But if you really want to open the connection manually, try the following (I couldn't test it; you may need to supply connection settings):
ctx.Database.Connection.Open();
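For context, an untested sketch of what that might look like inside a using block (Clients is a made-up DbSet):

using (var ctx = new EntityContext())
{
    // Opening the connection yourself means EF keeps it open for the
    // lifetime of the context instead of opening/closing it per command.
    ctx.Database.Connection.Open();

    var clients = ctx.Clients.ToList();   // runs on the already-open connection
    // ... more work on the same connection ...
}   // disposing the context closes the connection again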
Related
We're working on a project using ASP.NET MVC4. In one of the team's meetings, the idea of using a session-per-request pattern came up.
I did a little searching and found some questions here on SO saying, in general, that this pattern (if it can be called that) is recommended for ORM frameworks.
A little example
//GET Controller/Test
public ActionResult Test()
{
    //open database connection
    var model = new TestViewModel
    {
        Clients = _clientService.GetClients(),
        Products = _productService.GetProducts()
    };
    //close database connection
    return View(model);
}
Without session per request:
//GET Controller/Test
public ActionResult Test()
{
    var model = new TestViewModel
    {
        Clients = _clientService.GetClients(), // Open and close database connection
        Products = _productService.GetProducts() // Open and close database connection.
    };
    return View(model);
}
Doubts
To contextualize: how does session per request work?
Is it a good solution?
What is the best way to implement it? Open the connection in the web layer?
Is it recommended in projects with complex queries / operations?
Is there a possibility of concurrency problems when transactions are involved?
Looks like you mean "DB context per request". You can achieve it with the Unit of Work pattern.
You can check a simple implementation of it in this article by Radu Pascal: https://www.codeproject.com/Articles/243914/Entity-Framework-context-per-request
Another, more complex implementation (for Entity Framework and NHibernate) can be found in ASP.NET Boilerplate: http://www.aspnetboilerplate.com/Pages/Documents/Unit-Of-Work
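For illustration only (not the implementation from either link), a stripped-down unit of work might look like this; MyDbContext is a placeholder for your own context, and the instance would be created once per request by your DI container:

public class UnitOfWork : IDisposable
{
    // One context per unit of work; with a per-request lifetime this
    // becomes "one DB context per request".
    public MyDbContext Context { get; private set; }

    public UnitOfWork()
    {
        Context = new MyDbContext();
    }

    public void Commit()
    {
        Context.SaveChanges();
    }

    public void Dispose()
    {
        Context.Dispose();
    }
}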
In a web scenario (web application, WCF, ASP.NET Web API) it is a good idea to use one DB context per request. Why? Because requests are short-lived (at least that is the idea, otherwise your application will have a slow response time), so there is no point in creating many DB contexts.
For example, if you are using EF as your ORM and you call the Find method, EF will first search the db context's local cache for whatever you are asking for. If it is found, it will simply return it. If not found, it will go to the database, pull it out, and keep it in the cache. This can be really beneficial in scenarios where you query for the same items multiple times while your web application fulfils the request. If you create a context, query something, and close the context, there is a possibility you will make many trips to the database which could be avoided.
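A small illustration of that caching behaviour (entity names are made up; both calls happen within the same context, i.e. the same request):

using (var ctx = new MyDbContext())
{
    var first = ctx.Customers.Find(42);    // hits the database, entity is cached
    var second = ctx.Customers.Find(42);   // served from the context's local cache, no DB trip

    bool sameInstance = ReferenceEquals(first, second);   // true
}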
To elaborate further, imagine you create many new records: a customer record, an order record, then you do some work and, based on whatever criteria, you create some discount records for the customer, then some other records, and then some order item records. If you use the single-context-per-request approach, you can keep adding them and call SaveChanges at the end. EF will do this in one transaction: either they all succeed or everything is rolled back. This is great because you are getting transactional behavior without even creating transactions. If you do it without the single-context-per-request approach, then you need to take care of such things yourself. That does not mean everything has to be in one transaction in the single-context approach: you can call SaveChanges as many times as you want within the same HTTP request. Consider other possibilities, where you pull a record and later on decide to edit it, and then edit it some more: again, in the single-context approach, all of the changes are applied to the same object and then saved in one shot.
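Roughly, the flow described above looks like this (entity and property names are invented for the example):

using (var ctx = new MyDbContext())
{
    var customer = new Customer { Name = "Acme" };
    var order = new Order { Customer = customer };
    ctx.Customers.Add(customer);
    ctx.Orders.Add(order);

    // ... later in the same request, based on some business rule ...
    ctx.Discounts.Add(new Discount { Customer = customer, Percent = 10 });
    ctx.OrderItems.Add(new OrderItem { Order = order, Quantity = 2 });

    // One SaveChanges: EF wraps all the inserts in a single transaction,
    // so either they all succeed or everything is rolled back.
    ctx.SaveChanges();
}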
In addition to the above, if you still want to read more then you may find this helpful. Also, if you search for Single Context Per-Request, you will find many articles.
Working on a WPF application using MVVM and powered by Entity Framework. We were very keen to allow users to multi-window this app, for usability purposes. However, that has the potential to cause problems with EF. If we stick to the usual advice of creating one copy of the Repository per ViewModel and someone opens multiple windows of the same ViewModel, it could cause "multiple instances of IEntityChangeTracker" errors.
Rather than go with a Singleton, which has its own problems, we solved this by putting a Refresh method on the repository that gets a fresh data context. Then we do things like this all over the shop:
using (IRepository r = Rep.Refresh())
{
    r.Update(CurrentCampaign);
    r.SaveChanges();
}
Which is mostly fine. However, it causes problems with maintaining state. If the context is refreshed while a user is working with an object, their changes will be lost.
I can see two ways round this, both of which have their own drawbacks.
We call SaveChanges constantly. This has the potential to slow down the application with constant database calls. Plus there are occasions when we don't want to store incomplete objects.
We copy EF objects into memory when loaded, have the user work with those, and then add a "Save" button which copies all the objects back to the EF objects and saves. We could do this with an automapper, but it still seems unnecessary faff.
Is there another way?
I believe that having the repository which accesses Entity Framework as a singleton may not always be wrong.
If you have a scenario where you have a client-side repository, i.e. a repository which is part of the executable of the client application and is used by one client, then a singleton might be OK. Of course I would not use a singleton on the server side.
I asked Brian Noyes (Microsoft MVP) a similar question on an "MVVM" course on the Pluralsight website.
I asked: "What is the correct way to dispose of client services which are used in the view model?"
And in his response he wrote: "...most of my client services are Singletons anyway and live for the life of the app."
In addition having a singleton does not prevent you from writing unit tests, as long as you have an interface for the singleton.
And if you use a dependency injection framework (see Mark's comment), which is a good idea in itself, changing to singleton instantiation is just a matter of a small change in the setup of the injection container for the respective class.
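As a sketch of that, here is how a singleton registration behind an interface might look with Unity (any container works the same way; IRepository and Repository stand in for your own types):

var container = new UnityContainer();
container.RegisterType<IRepository, Repository>(new ContainerControlledLifetimeManager());

// Every resolve (and therefore every view model) gets the same instance:
var repo1 = container.Resolve<IRepository>();
var repo2 = container.Resolve<IRepository>();
// ReferenceEquals(repo1, repo2) == true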
I have a Windows Forms app and server-side services based on ADO.NET Data Services.
Is it bad practice to create and initialize one static data service client in the Windows app and use it across the program? For example, I could use it in all open forms (which have bindings to the service data context's objects) to call SaveChanges() and not lose tracking. Or is creating a service client instance for every new form better (because I think that after some time with one static client there will be huge memory growth)? But when I create a new client for every form, I assume I create a new connection to the service every time.
Maybe I'm wrong and a bit confused about using services in a client application. Please help me understand the right way it works.
Actually the DataServiceContext class doesn't create a connection to the service. The OData protocol it uses is based on REST and as such it's stateless. So creation of the context alone doesn't even touch the service. Each operation (query, save changes) issues a separate and standalone request to the service. From the point of view of the service it's just a number of unrelated requests.
As noted above, it's usually a good idea to have a separate context for each "section" of your application. What that is exactly depends on your app. If you are not going to load/track a huge number of entities (1000s at least) then one context might be fine. On the other hand, several contexts give you the ability to "cancel" the update operations by simply dropping the context and not calling SaveChanges, which might be handy in some applications.
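To illustrate the statelessness (the service URI is made up and Customer is assumed to be a service-reference generated type):

var ctx = new DataServiceContext(new Uri("http://example.com/MyService.svc"));
// No request has been sent yet; creating the context is purely local.

var customers = ctx.CreateQuery<Customer>("Customers").ToList();   // one HTTP request

// Pending changes live only in the context. If you never call SaveChanges,
// nothing is sent, so dropping the context effectively cancels them.
ctx.UpdateObject(customers.First());
// ctx.SaveChanges();   // would issue a separate, standalone request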
I would say: it depends. ;) Your problem is similar to the decision you have to make when using Entity Framework directly, so I recommend you search for such articles and extract their point.
My own experience with EF tells me that an application with several workflows should have a context for every workflow, especially when more than one workflow can be started at the same time and the user can switch between them.
If the application is simple, using only one context is a proper approach.
Brief introduction:
I have this ASP.NET WebForms site with the particularity that it doesn't have only one database; it has many.
Why? Because you can create new "instances" of the site on the fly. Every "instance" shares the same codebase, but has its own database. All these databases have the same schema (structure) but of course different data. Don't ask "why don't you put everything in one database and use an InstanceId to know which is which", because it's a business policy thing.
The application knows which instance is being requested based on the URL. There is one extra database to accomplish this (I do know its connection string at design time). This database has only 2 tables and associates URLs with 'application instances'. Each 'application instance', of course, has its associated connection string.
Current situation: There is nothing being used right now to help us with the job of maintaining every instance database in sync (propagating schema changes to every one). So we are doing it by hand, which of course is a total mess.
Question: I'd like to use a Rails-migrations-style way to handle schema changes, preferably MigratorDotNet, but could use any other if it's easier to set up.
The problem is that MigratorDotNet needs the connection string to be declared in the proj.build file, and I don't know them until runtime.
What would be REALLY useful is some kind of method running on Application_Start that applies the latest migration to every database.
How could this be done with MigratorDotNet or anything similar? Any other suggestion is thankfully welcomed.
Thank you!
Since this is an old question, I assume that you have solved the problem in some manner or another, but I'll post a solution anyway for the benefit of other people stumbling across this question. It is possible to invoke MigratorDotNet from code, rather than having it as an MSBuild target:
public static void MigrateToLastVersion(string provider, string connectionString)
{
    var silentLogger = new Logger(false, new ILogWriter[0]);
    var migrator = new Migrator.Migrator(provider, connectionString,
        typeof(YourMigrationAssembly).Assembly, false, silentLogger);
    migrator.MigrateToLastVersion();
}
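From there, something like the following could run on Application_Start to bring every instance database up to date. GetInstanceConnectionStrings is a placeholder for however you read the per-instance connection strings from your "master" database, and "SqlServer" is assumed to be the MigratorDotNet provider name for SQL Server:

protected void Application_Start()
{
    foreach (var connectionString in GetInstanceConnectionStrings())
    {
        MigrateToLastVersion("SqlServer", connectionString);
    }
}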
RedGate has a SQL Comparison SDK that could be used. Here is a Case Study that looks promising, but I can't tell you anything from experience as I haven't used it. Download the trial and kick the tires.
You could use Mig# to maintain your migrations in your C# or .NET code: https://github.com/dradovic/MigSharp
Check out Fluent-Migrator.
We are undergoing a migration from classic ASP with SQL and sprocs. Our choice fell upon C# .NET 4 WebForms with Entity Framework 4.
My question is how to handle the context. Example:
Calling the repository function GetProductById() opens up a new context (in a using block); then we change something on the object and we save it.
When we save it, we won't be in the same context as when we fetched the object.
The above didn't really work for us. We then tried to send the context around in our application. While this worked we didn't really want to work this way if we didn't have to.
We are currently using a third option: storing the current context in a global variable. We dispose the context when we have saved it. However, we're not sure if this is a viable approach in the long run or if we'll hit a wall with this method.
We've tried to search for best practices on this topic but haven't really been able to find any. We would appreciate any help on this topic.
The Unit of Work pattern described in this article might help, although the examples in the article are with an MVC app rather than a WebForms one.
http://msdn.microsoft.com/en-us/ff714955.aspx
Use object keys
When you get an object from the DB, manipulate it, and after some time (after the context has been discarded) want to save it back, use primary key values.
Create a new instance of the correct entity type, set its key properties (and those that have to be changed), and then save it against a newly created context.
Or you can fetch it again before saving. It's a waste of resources, but it is the most bulletproof way of doing it, even though I wouldn't recommend it.
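A sketch of the "stub with key" approach using the DbContext API (EF 4.1+; with the plain EF4 ObjectContext the idea is the same, only the calls differ; Product and Name are placeholder names):

using (var ctx = new MyDbContext())
{
    // A stub entity that only carries the key and the changed value.
    var product = new Product { Id = productId, Name = newName };

    ctx.Products.Attach(product);
    ctx.Entry(product).Property(p => p.Name).IsModified = true;

    ctx.SaveChanges();   // issues UPDATE ... SET Name = ... WHERE Id = ...
}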
Using ASP? Go with MVC then
I highly recommend you switch to ASP.NET MVC instead if you're used to classic ASP; you'll feel more at home in MVC.
Another option available to you is to put the context on the request. This is done by creating the context in the BeginRequest event of an HttpModule. You then need to be sure to dispose of any resources you have created in the EndRequest event.
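A minimal sketch of such a module (MyDbContext and the Items key are placeholders; the module still has to be registered in web.config):

public class DbContextPerRequestModule : IHttpModule
{
    private const string ContextKey = "__dbContext";

    public void Init(HttpApplication application)
    {
        application.BeginRequest += (sender, e) =>
        {
            // One context per request, available anywhere via HttpContext.Current.Items
            HttpContext.Current.Items[ContextKey] = new MyDbContext();
        };

        application.EndRequest += (sender, e) =>
        {
            var ctx = HttpContext.Current.Items[ContextKey] as MyDbContext;
            if (ctx != null)
            {
                ctx.Dispose();   // release the context together with the request
            }
        };
    }

    public void Dispose() { }
}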
After getting a label for our way of working (the Unit of Work pattern), we found this link, which describes pretty much exactly how we work; it might be helpful for anyone with the same thoughts as we had:
http://dotnet.dzone.com/news/using-unit-work-pattern-entity?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+zones%2Fdotnet+%28.NET+Zone%29