I have an MVC application with all the Ninject stuff wired up properly. Within the application I wanted to add functionality to call a WCF service, which then sends bulk messages (i.e. bulk printing) to a RabbitMQ queue.
A 'processor' app subscribes to messages in the queue and processes them. This is where I also want to update some stuff in the database, so I want all my services and repositories from the MVC app to be available too.
The processor app implements the following:
public abstract class KernelImplementation
{
    private IKernel _kernel;

    public IKernel Kernel
    {
        get
        {
            if (_kernel != null)
                return _kernel;
            else
            {
                _kernel = new StandardKernel(new RepositoryModule(),
                    new DomainModule(),
                    new ServiceModule(),
                    new MessageModule());
                return _kernel;
            }
        }
    }
}
All Ninject repository bindings are specified within RepositoryModule, which is also used within the MVC app and looks like this:
Bind<IReviewRepository>().To<ReviewRepository>().InCallScope();
The processor class:
public class Processor : KernelImplementation
{
    private readonly IReviewPrintMessage _reviewPrintMessage;

    public Processor()
    {
        _reviewPrintMessage = Kernel.Get<IReviewPrintMessage>();
        [...]
        _bus.Subscribe<ReviewPrintContract>("ReviewPrint_Id",
            (reviewPrintContract) => _reviewPrintMessage.ProcessReviewPrint(reviewPrintContract));
        //calling ProcessReviewPrint where I want my repositories to be available
    }
}
Everything works fine until I update the database from the MVC app or the database directly. The processor app doesn't know anything about those changes, and the next time it tries to process something it works on a 'cached' DbContext. I'm sure it's something to do with not disposing the DbContext properly, but I'm not sure what scope should be used for a console app (I tried all sorts of different scopes to no avail).
The only solution I can think of at the moment is to call the WCF service back from the processor app and perform all the necessary updates within the service, but I would want to avoid that.
UPDATE: Adding update logic
Simplified ReviewPrintMessage:
public class ReviewPrintMessage : IReviewPrintMessage
{
    private readonly IReviewService _reviewService;

    public ReviewPrintMessage(IReviewService reviewService)
    {
        _reviewService = reviewService;
    }

    public void ProcessReviewPrint(ReviewPrintContract reviewPrintContract)
    {
        var review = _reviewService.GetReview(reviewPrintContract.ReviewId);
        [...]
        //do all sorts of stuff here
        [...]
        _reviewService.UpdateReview(review);
    }
}
UpdateReview method in ReviewService:
public void UpdateTenancyAgreementReview(TenancyAgreementReview review)
{
    _tenancyAgreementReviewRepository.Update(review);
    _unitOfWork.Commit();
}
RepositoryBase:
public abstract class EntityRepositoryBase<T> where T : class
{
    protected MyContext _dataContext;

    protected EntityRepositoryBase(IDbFactory dbFactory)
    {
        this.DbFactory = dbFactory;
        _dbSet = this.DataContext.Set<T>();
    }

    [...]

    public virtual void Update(T entity)
    {
        try
        {
            DataContext.Entry(entity).State = EntityState.Modified;
        }
        catch (Exception exception)
        {
            throw new EntityException(string.Format("Failed to update entity '{0}'", typeof(T).Name), exception);
        }
    }
}
Context itself is bound like this:
Bind<MyContext>().ToSelf().InCallScope();
From the description of scopes I thought that Transient scope was the right choice, but as I said earlier I tried all sorts, including RequestScope, TransientScope, NamedScope and even Singleton (although I knew it wouldn't be the desired behaviour), but none of them seems to dispose the context properly.
What you'll need is one DbContext instance per transaction.
Now other "applications" like web-applications or wcf-service may be doing one transaction per request (and thus use something like InRequestScope(). Also note, that these application create an object graph for each request. However, that is a concept unknown to your console application.
Furthermore, scoping only affects the instantiation of objects. Once they are instantiated, Scoping does not have any effect on them.
So one way to solve your issue would be to create the (relevant) object tree/graph per transaction and then you could use InCallScope() (InCallScope really means "once per instantiation of an object graph", see here).
That would mean that you'd need a factory for IReviewPrintMessage (have a look at ninject.extensions.factory) and create an instance of IReviewPrintMessage every time you want to execute IReviewPrintMessage.ProcessReviewPrint.
Now you have re-created the "per request pattern".
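A minimal sketch of that factory approach, assuming the ninject.extensions.factory package is installed (the factory interface and field names are illustrative, not from the question):
// Factory interface; Ninject.Extensions.Factory generates the implementation:
public interface IReviewPrintMessageFactory
{
    IReviewPrintMessage CreateReviewPrintMessage();
}

// Registered in one of your modules:
// Bind<IReviewPrintMessageFactory>().ToFactory();

// In Processor, resolve the factory once, then create a fresh
// IReviewPrintMessage (a fresh object graph, and thus a fresh DbContext
// under InCallScope) for every message:
_reviewPrintMessageFactory = Kernel.Get<IReviewPrintMessageFactory>();
_bus.Subscribe<ReviewPrintContract>("ReviewPrint_Id",
    (reviewPrintContract) => _reviewPrintMessageFactory
        .CreateReviewPrintMessage()
        .ProcessReviewPrint(reviewPrintContract));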
However, with regard to the Composition Root pattern, resolving services on the fly like this is not recommended.
Alternative: you can also re-create only the DbContext as needed. Instead of passing it around everywhere (DbContext as an additional parameter on almost every method), you use SynchronizationContext local storage (or, if you don't use TPL/async-await, a ThreadLocal). I've already described this method in more detail here.
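For the ThreadLocal variant, a minimal sketch might look like this (the holder class is my illustration, not the code from the linked answer):
using System.Threading;

// Each worker thread gets its own MyContext; Reset() is called at
// transaction boundaries so the next transaction starts with a fresh context.
public static class ContextHolder
{
    private static readonly ThreadLocal<MyContext> _context =
        new ThreadLocal<MyContext>(() => new MyContext());

    public static MyContext Current => _context.Value;

    public static void Reset()
    {
        _context.Value.Dispose();
        _context.Value = new MyContext();
    }
}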
In my ASP.NET MVC Framework project, using EF, I have some objects (from classes) whose fields store data coming from my database.
My question is: how should I populate these fields, or manage the methods of these objects, using a DbContext variable?
Sol 1: Is it better, each time I need a connection to the db in my classes, to create the context in a using (resource) block (see below)?
Sol 2: Is it better to code a singleton class to use one instance of the context?
Sol 3: Or should I use another way for the links between my classes and the database?
What is the best method considering performance and code quality?
Thanks for your attention.
Solution 1:
public class Test
{
    private T1 a;
    private T2 b;

    public Test()
    { }

    public void CreateFrom(int id)
    {
        using (var db = new WebApplicationMVCTest.Models.dbCtx())
        {
            a = db.T1s.Find(id);
            b = db.T2s.Find(a.id2);
        }
    }
}
Solution 2:
public class DbSingleton
{
    private static dbCtx instance;
    private int foo;

    private DbSingleton()
    { }

    public static dbCtx Current
    {
        get
        {
            if (instance == null)
            {
                instance = new dbCtx();
            }
            return instance;
        }
    }

    public static void Set(dbCtx x)
    {
        if (instance == null)
        {
            instance = x;
        }
    }
}
For a web project, never use a static DbContext. EF DbContexts are not thread-safe, so handling multiple requests will lead to exceptions.
A DbContext's lifespan should only be as long as it is needed. Outside of the first-time setup cost when a DbContext is used for the first time, instantiating DbContexts is fast.
My advice is to start simple:
public ActionResult Create(/* details */)
{
    using (var context = new AppDbContext())
    {
        // do stuff.
    }
}
When you progress to the point where you learn about, and want to start implementing, dependency injection, the DbContext can be injected into your controller / service constructors. Again, with the IoC container managing the DbContext, the lifetime scope of the context should be set to PerWebRequest or equivalent.
private readonly AppDbContext _context;

public MyController(AppDbContext context)
{
    _context = context ?? throw new ArgumentNullException("context");
}

public ActionResult Create(/* details */)
{
    // do stuff with _context.
}
The gold standard for enabling unit testing would be injecting a Unit of Work and considering something like the Repository pattern to make your dependencies easier to unit test.
The best advice I can give you starting out with EF and MVC is to avoid the temptation to pass entities between the controller (server) and the UI (views). You will come across dozens of examples doing just this, but it is a poor choice: it hides a LOT of land mines and booby traps for performance, exceptions, and data security issues. The most important detail is that when the UI calls the controller passing what you expect to be an entity, you are not actually getting an entity, but a de-serialized JSON object cast to an entity; it is not an entity that is tracked by the DbContext handling the request. Instead, get accustomed to passing view models (serializable data containers with the data the view needs or can provide) and IDs + values, where the controller re-loads entities to update the data only as needed.
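A minimal sketch of that view-model approach (the view model, DbSet, and property names are illustrative, not from the question):
// Serializable container with only the fields the view can edit.
public class ReviewViewModel
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public ActionResult Edit(ReviewViewModel model)
{
    using (var context = new AppDbContext())
    {
        // Re-load the tracked entity and copy across only the editable fields.
        var review = context.Reviews.Find(model.Id);
        if (review == null)
            return HttpNotFound();

        review.Title = model.Title;
        context.SaveChanges();
        return RedirectToAction("Index");
    }
}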
I am working on a test suite implementation which uses SpecFlow + SpecRunner and xUnit. We are trying to do parallel test execution, and I would like to know whether there is a way to run a hook at the beginning of the test run and store the token value in a static variable so that it can be shared among threads.
To summarize: does SpecFlow offer a mechanism to share data between threads during parallel execution?
We can share the data using either of the below approaches:
Scenario Context
Context Injection
Approaches 1 and 2 will not have any issues across multiple threads, since both the ScenarioContext and Context Injection are scoped to the scenario level.
Approach 1: we can define the token generation step within a BeforeScenario hook, and the generated token value can be stored in the ScenarioContext.
We can then access the token directly from the scenario context anywhere, as shown below.
Here, the token will be generated before each scenario run, and it will not affect parallel execution. For more details, see Parallel-Execution:
Scenarios and their related hooks (Before/After scenario, scenario block, step) are isolated in the different threads during execution and do not block each other. Each thread has a separate (and isolated) ScenarioContext.
Hooks Class:
[Binding]
public class CommonHooks
{
    [BeforeScenario]
    public static void Setup()
    {
        // Add Token Generation Step
        var adminToken = "<Generated Token>";
        ScenarioContext.Current["Token"] = adminToken;
    }
}
Step Class:
[Given(#"I Get the customer details""(.*)""")]
public void WhenIGetTheCustomerDetails(string endpoint)
{
if(ScenarioContext.Current.ContainsKey("Token"))
{
var token = ScenarioContext.Current["Token"].ToString();
//Now the Token variable holds the token value from the scenario context and It can be used in the subsequent steps
}
else
{
Assert.Fail("Unable to get the Token from the Scenario Context");
}
}
If you wish to share the same token across multiple steps, then you can assign the token value in the constructor, and it can be used in all steps of that binding class.
For example:
[Binding]
public class CustomerManagementSteps
{
    public readonly string token;

    public CustomerManagementSteps()
    {
        token = ScenarioContext.Current["Token"].ToString();
    }

    [Given(@"I Get the customer details""(.*)""")]
    public void WhenIGetTheCustomerDetails(string endpoint)
    {
        // Now the token variable holds the token value from the scenario
        // context, and it can be used in the subsequent steps.
    }
}
Approach 2: Context Injection details, with an example, can be found at the link below:
Context Injection
Updated
Given the downvote and comments, I've updated my code example to better show exactly one way you can use dependency injection here with code of your own design. This shared data will last the lifetime of the scenario and be used by all bindings. I think that's what you're looking for, unless I'm mistaken.
//Stores whatever data you want to share.
//Write this however you want; it's your code.
//You can use more than one of these custom data classes, of course.
public class SomeCustomDataStructure
{
    //If this is run in parallel, this should be thread-safe. Using List<T> for simplicity purposes.
    //Use EF, ConcurrentCollections, synchronization (like lock), etc...
    //Again, do NOT copy this code for parallel uses, as List<int> is NOT thread-safe.
    //You can force things to not run in parallel, so this can be useful by itself.
    public List<int> SomeData { get; } = new List<int>();
}
//Will be injected and the shared instance between any number of bindings.
//Lifespan is that of a scenario.
public class CatalogContext : IDisposable
{
    public SomeCustomDataStructure CustomData { get; private set; }

    public CatalogContext()
    {
        //Init shared data however you want here.
        CustomData = new SomeCustomDataStructure();
    }

    //Added to show Dispose WILL be called at the end of a scenario.
    //Feel free to do cleanup here if necessary.
    //You do NOT have to implement IDisposable, but it's supported and called.
    public void Dispose()
    {
        //Below obviously not thread-safe, as mentioned earlier.
        //Simple example is all.
        CustomData.SomeData.Clear();
    }
}
[Binding]
public class SomeSteps
{
    //Data shared here via instance variable, accessible to multiple steps.
    private readonly CatalogContext catalogContext;

    //Dependency injection handled automatically here.
    //Will get the same instance between other bindings.
    public SomeSteps(CatalogContext catalogContext)
    {
        this.catalogContext = catalogContext;
    }

    [Given(@"the following ints")]
    public void GivenTheFollowingInts(int[] numbers)
    {
        //This will be visible to all other steps in this binding,
        //and all other bindings sharing the context.
        catalogContext.CustomData.SomeData.AddRange(numbers);
    }
}
ASP.NET MVC 5 Project.
I know that the best practice for using an EF context object is the following:
using(var context = new ContextDB())
{
}
But I am working with a large existing project which does not use this practice.
The project uses the following pattern:
public abstract class BaseService
{
    private static ContextDB _data { get; set; }

    public static ContextDB Data
    {
        get
        {
            if (_data == null)
                _data = new ContextDB();
            return _data;
        }
    }
}
Actually, because of this pattern, I am receiving this exception (sometimes, not always)
So to solve this I would have to change all the code which uses the shared Data property and replace it with a new instance of ContextDB, as I mentioned at the beginning of the question.
The problem is that this is a very large modification, and I will not be allowed to do it.
The question: can I solve this problem without changing a ton of code? In other words, can I solve it with modifications made only inside the BaseService class? For example, is there any event I could handle to know when a query has been executed, and then dispose of the ContextDB?
Here is the pseudo-code of the idea in my mind:
public abstract class BaseService
{
    public static ContextDB Data
    {
        get
        {
            ContextDB _data = new ContextDB();
            _data.SqlQueryExecuted += () => { _data.Dispose(); };
            return _data;
        }
    }
}
NOTE: the SavingChanges event is not suitable, because not all of the queries are updates or inserts.
You may use the following solution.
In Global.asax:
Begin request: create an instance of your dbContext and store it in HttpContext.Current.Items.
End request: grab the context and close / dispose of it. A sketch of this follows.
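A rough sketch of that approach (the "ContextDB" item key is my choice; error handling omitted):
// In Global.asax.cs
protected void Application_BeginRequest()
{
    HttpContext.Current.Items["ContextDB"] = new ContextDB();
}

protected void Application_EndRequest()
{
    var context = HttpContext.Current.Items["ContextDB"] as ContextDB;
    if (context != null)
        context.Dispose();
}

// BaseService then returns the per-request instance instead of a static field:
public abstract class BaseService
{
    public static ContextDB Data
    {
        get { return (ContextDB)HttpContext.Current.Items["ContextDB"]; }
    }
}
This keeps the public surface of BaseService.Data intact, so the calling code doesn't need to change.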
Another, better solution is to use DI (Dependency Injection) and limit the scope of your instance. There are many ways, like Singleton, PerRequest, etc.
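For example, with Ninject and the Ninject.Web.Common package, a per-request binding would look something like this (other containers have equivalents):
// Dispose is called automatically at the end of each web request.
kernel.Bind<ContextDB>().ToSelf().InRequestScope();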
Given the following scenario...
I am concerned about two things...
1) Is it okay to inject a provider into a business model object, like I did with the Folder implementation, because I want to load sub-folders on demand?
2) Since I am injecting the DbContext into the Sql implementation of IFolderDataProvider, the context could be disposed or it could live on forever. Should I therefore instantiate the context in the constructor?
If this design is incorrect, then someone please tell me how business models should be loaded.
//Business model.
interface IFolder
{
    int Id { get; }
    IEnumerable<IFolder> GetSubFolders();
}

class Folder : IFolder
{
    private readonly int id_;
    private readonly IFolderDataProvider provider_;

    public Folder(int id, IFolderDataProvider provider)
    {
        id_ = id;
        provider_ = provider;
    }

    public int Id => id_;

    public IEnumerable<IFolder> GetSubFolders()
    {
        return provider_.GetSubFoldersByParentFolderId(id_);
    }
}
interface IFolderDataProvider
{
    IFolder GetById(int id);
    IEnumerable<IFolder> GetSubFoldersByParentFolderId(int id);
}

class SqlFolderDataProvider : IFolderDataProvider
{
    private readonly DbContext context_;

    public SqlFolderDataProvider(DbContext context)
    {
        context_ = context;
    }

    public IFolder GetById(int id)
    {
        //uses the context to fetch the required folder entity and translates it to the business object.
        return new Folder(id, this);
    }

    public IEnumerable<IFolder> GetSubFoldersByParentFolderId(int id)
    {
        //uses the context to fetch the required subfolder entities and translates them to the business objects.
    }
}
Is it okay to inject a provider into a business model object? - like I did with the Folder implementation because I want to load Sub-folders on demand.
Yes, how else would you be able to call the provider and get the data?
However, the suffix DataProvider is very confusing, because that suffix is used for the provider that you use to connect to the database. I recommend changing it to something else. Examples: Repository, Context.
Since I am injecting the DbContext in the Sql implementation of IFolderDataProvider, the context could be disposed or it could live on forever, therefore should I instantiate the context in the constructor?
It won't necessarily live on forever. You decide its lifespan in your ConfigureServices function when you're adding it as a service, so you can change its scope from Singleton to whatever you like. I personally set the scope of my DbContext service to Transient, and I also initialize it there with the connection string:
services.AddTransient<IDbContext, DbContext>(provider =>
    new DbContext(Configuration.GetConnectionString("DefaultDB")));
I then open and close the database connection in every function in my data layer files (what you call the provider). I open it inside a using() statement, which guarantees closing the connection under any condition (normal or exception). Something like this:
public async Task<Location> GetLocation(int id)
{
    string sql = "SELECT * FROM locations WHERE id = @p_Id;";
    using (var con = _db.CreateConnection())
    {
        //get results
    }
}
Is it okay to inject a provider into a business model object
Yes, if you call it a "business" provider :). Actually, do not take all this terminology ("inject", "provider") too seriously. As long as you pass (to the business model layer's method/constructor) an interface that is declared on the business model layer (and document the abstraction leaks), you are OK.
should I instantiate the context in the constructor?
This could be seen as an abstraction leak that should be documented. A reused context can be corrupted, or can be shared with another thread, etc.; all of this can bring side effects. So developers tend to create one "heavy" object like a DbContext per "user request" (which usually means per service call, using (var context = new DbContext()), but not always; e.g. sometimes I share it with an authentication service call, to check whether the next operation is allowed for this user). BTW, a DbContext is quite quick to create, so do not reuse it just for "optimization".
I'm using LINQ2SQL to handle my database needs in an ASP.NET MVC 3 project. I have a separate model which contains all my database access in its own class, as follows:
public class OperationsMetricsDB
{
    public IEnumerable<client> GetAllClients()
    {
        OperationsMetricsDataContext db = new OperationsMetricsDataContext();
        var clients = from r in db.clients
                      orderby r.client_name ascending
                      select r;
        return clients;
    }

    public void AddClient(client newClient)
    {
        OperationsMetricsDataContext db = new OperationsMetricsDataContext();
        db.clients.InsertOnSubmit(newClient);
        db.SubmitChanges();
    }
}
I have about 50 different methods in this class, which all create and then destroy a copy of my DataContext. My reasoning was that this would save memory, because it destroys the DataContext after I use the connection and frees up that memory. However, I have a feeling it may be better to use one copy of the DataContext and keep it open, instead of disposing and re-establishing the connection over and over again, e.g.:
public class OperationsMetricsDB
{
    OperationsMetricsDataContext db = new OperationsMetricsDataContext();

    public IEnumerable<client> GetAllClients()
    {
        var clients = from r in db.clients
                      orderby r.client_name ascending
                      select r;
        return clients;
    }

    public void AddClient(client newClient)
    {
        db.clients.InsertOnSubmit(newClient);
        db.SubmitChanges();
    }
}
What is the best practice on this?
I personally use the Unit of Work pattern in conjunction with Repositories for this.
The UnitOfWork creates and manages the DataContext. It then passes the context to each repository when requested. Each time the caller wants to do a new set of operations with the database, they create a new UnitOfWork.
The interfaces would look something like:
public interface IUnitOfWork
{
    IRepository<T> GenerateRepository<T>() where T : class;
    void SaveChanges();
}

public interface IRepository<T> where T : class
{
    IQueryable<T> Find();
    T Create(T newItem);
    T Delete(T item);
    T Update(T item);
}
That ensures that the context's lifespan is exactly one Unit of Work long (which is longer than a single operation but shorter than the lifespan of the application).
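A minimal LINQ to SQL sketch of those interfaces might look like this; the implementation details are mine, not part of the original answer (each repository is backed by the context's GetTable<T>(), and Update relies on LINQ to SQL's automatic change tracking):
public class UnitOfWork : IUnitOfWork, IDisposable
{
    private readonly OperationsMetricsDataContext _context =
        new OperationsMetricsDataContext();

    public IRepository<T> GenerateRepository<T>() where T : class
    {
        return new Repository<T>(_context);
    }

    public void SaveChanges()
    {
        _context.SubmitChanges();
    }

    public void Dispose()
    {
        _context.Dispose();
    }
}

public class Repository<T> : IRepository<T> where T : class
{
    private readonly OperationsMetricsDataContext _context;

    public Repository(OperationsMetricsDataContext context)
    {
        _context = context;
    }

    public IQueryable<T> Find()
    {
        return _context.GetTable<T>();
    }

    public T Create(T newItem)
    {
        _context.GetTable<T>().InsertOnSubmit(newItem);
        return newItem;
    }

    public T Delete(T item)
    {
        _context.GetTable<T>().DeleteOnSubmit(item);
        return item;
    }

    public T Update(T item)
    {
        // LINQ to SQL tracks entities loaded through this context, so changes
        // are picked up when SaveChanges (SubmitChanges) is called.
        return item;
    }
}
A caller then creates one UnitOfWork per logical operation (per HTTP request in a web app), works through its repositories, calls SaveChanges once, and disposes the unit.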
It's not recommended to carry a DataContext around for a long time, so you are on the right path. It uses connection pooling as far as I know, so the performance hit of creating more than one DataContext in an application's lifetime is not too serious.
But I would not create a new context instance for every single method call of your data class.
I prefer to use it in a unit-of-work style: within a web application, the processing of an HTTP request can be seen as a unit of work.
So my advice is to create one DataContext instance for the lifetime of one HTTP request and dispose of it afterwards.
One context per request is usually fine for most applications.
http://blogs.microsoft.co.il/blogs/gilf/archive/2010/05/18/how-to-manage-objectcontext-per-request-in-asp-net.aspx