How can one set the TransactionHandler for ObjectContext?
I am checking this example: Handling of Transaction Commit Failures, but it only shows for DbContext.
TransactionHandler also works for ObjectContext. The only problem is that code-based configuration (DbConfiguration) is not evaluated until the first DbContext is instantiated.
There are two possible workarounds.
First, use a dummy DbContext:
public class MyDbConfiguration : DbConfiguration
{
public MyDbConfiguration()
{
SetTransactionHandler(SqlProviderServices.ProviderInvariantName,
() => new CommitFailureHandler());
}
}
public class TestContext : DbContext { }
static void Main(string[] args)
{
// instantiate DbContext to initialize code based configuration
using (var db = new TestContext()) { }
using (var db = new TransactionHandlerDemoEntities()) {
var handler = db.TransactionHandler; // should be CommitFailureHandler
db.AddToDemoTable(new DemoTable { Name = "TestEntity1" });
db.SaveChanges();
}
}
Alternatively, use the DbConfiguration.Loaded event:
static void Main(string[] args)
{
DbConfiguration.Loaded += DbConfiguration_Loaded;
using (var db = new TransactionHandlerDemoEntities()) {
var handler = db.TransactionHandler;
db.AddToDemoTable(new DemoTable { Name = "TestEntity1" });
db.SaveChanges();
}
}
static void DbConfiguration_Loaded(object sender, DbConfigurationLoadedEventArgs e)
{
e.AddDependencyResolver(new TransactionHandlerResolver(
() => new CommitFailureHandler(),
SqlProviderServices.ProviderInvariantName,
null),true);
}
TransactionHandlerDemoEntities is an ObjectContext.
This feature is exclusively for DbContext. If you can, refactor your ObjectContext-based application to DbContext as soon as possible. I expect many more new features will appear that only work with the DbContext API, and ObjectContext may even be deprecated as a public API some day.
You can create a DbContext from an ObjectContext, but I don't think that helps you much here. The main problem is that the rest of your data logic currently expects an ObjectContext.
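For completeness, a minimal sketch of wrapping an existing ObjectContext in a DbContext (this assumes EF6, which provides a DbContext constructor that accepts an ObjectContext; how useful it is depends on how much of your code still expects the ObjectContext API):
// Sketch only (EF6): wrap the existing ObjectContext in a DbContext.
// The second argument controls whether disposing the DbContext also disposes the ObjectContext;
// false leaves the original code responsible for the ObjectContext's lifetime.
var objectContext = new TransactionHandlerDemoEntities();
var dbContext = new System.Data.Entity.DbContext(objectContext, false);
// From here on, the DbContext API (dbContext.Set<DemoTable>(), dbContext.SaveChanges(), etc.) is available.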
Related
I have a model class:
public class Work
{
public long Id { get; set; }
[Required]
public string Name { get; set; }
}
I want Work.Name to be unique, so I define the DbContext:
public class MyDbContext : DbContext
{
public MyDbContext() : base() { }
public MyDbContext(DbContextOptions<MyDbContext> options) : base(options) { }
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
modelBuilder.Entity<Work>(entity =>
entity.HasIndex(e => e.Name).IsUnique()
);
}
public DbSet<Work> Works { get; set; }
}
And I want to test this, so I have a test like this:
[Fact]
public void Post_InsertDuplicateWork_ShouldThrowException()
{
var work = new Work
{
Name = "Test Work"
};
using (var context = new MyDbContext (options))
{
context.Works.Add(work);
context.SaveChanges();
}
using (var context = new MyDbContext (options))
{
context.Works.Add(work);
context.SaveChanges();
}
using (var context = new MyDbContext (options))
{
Assert.Equal(1, context.Works.Count());
}
}
(The options object contains the settings for the InMemory database provider.)
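For reference, the options are built roughly like this (a sketch assuming the Microsoft.EntityFrameworkCore.InMemory package; the database name is made up):
var options = new DbContextOptionsBuilder<MyDbContext>()
    .UseInMemoryDatabase(databaseName: "WorksTestDb")
    .Options;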
I am not sure what to check, but the test fails at the Assert, not at the second SaveChanges(): the database (the context) ends up containing two objects with the same Name.
I went over all the related questions, but I did not see anyone answering what I am asking.
As others have pointed out, the InMemory database provider ignores all relational constraints.
My suggestion would therefore be to use the SQLite provider with its in-memory feature, which does throw an exception for duplicate unique keys.
public MyDbContext CreateSqliteContext()
{
    var connectionString =
        new SqliteConnectionStringBuilder { DataSource = ":memory:" }.ToString();
    var connection = new SqliteConnection(connectionString);
    connection.Open(); // the in-memory database only exists while the connection is open
    var options = new DbContextOptionsBuilder<MyDbContext>()
        .UseSqlite(connection)
        .Options;
    var context = new MyDbContext(options);
    context.Database.EnsureCreated(); // create the schema, including the unique index
    return context;
}
private void Insert(Work work)
{
using (var context = CreateSqliteContext())
{
context.Works.Add(work);
context.SaveChanges();
}
}
[Fact]
public void Post_InsertDuplicateWork_ShouldThrowException()
{
var work1 = new Work { Name = "Test Work" };
var work2 = new Work { Name = "Test Work" };
Insert(work1);
Action saveDuplicate = () => Insert(work2);
saveDuplicate.Should().Throw<DbUpdateException>(); // Pass Ok
}
The test fails because the second SaveChanges() will throw an exception from the database telling you that you cannot add another item, since one with that Name already exists.
Unique constraints are not enforced silently; attempting to add a duplicate value throws an exception at the point where you try to do it. This is so that you can actually react to it, instead of only noticing after the fact (when you see that the data you attempted to add is not there).
You can test that by using Assert.Throws (or Assert.ThrowsAny, which also accepts derived exception types):
[Fact]
public void Post_InsertDuplicateWork_ShouldThrowException()
{
var work = new Work
{
Name = "Test Work"
};
using (var context = new MyDbContext (options))
{
context.Works.Add(work);
context.SaveChanges();
}
using (var context = new MyDbContext (options))
{
context.Works.Add(work);
Assert.ThrowsAny<Exception>(() => context.SaveChanges());
}
}
You can also specify the exact exception there (I don't remember off the top of my head which exception exactly is thrown), and you can assign it to a variable (Assert.Throws() returns the exception) and verify the exception message to make sure it is the exact exception you expect.
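For example, a sketch of capturing and inspecting the exception (the exact type and message text depend on the database provider; with a relational provider it is typically a DbUpdateException wrapping a unique-index violation):
using (var context = new MyDbContext(options))
{
    context.Works.Add(work);
    var ex = Assert.ThrowsAny<Exception>(() => context.SaveChanges());
    // Inspect the message to confirm it really is the unique-index violation.
    Assert.Contains("unique", ex.Message, StringComparison.OrdinalIgnoreCase);
}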
Let's say I want to use a new DbContext object whenever a method is called in a class, but without receiving it as a parameter. Like so:
class MyClass {
public virtual void MethodOne() {
// Having automatically a new instance of DbContext
}
public virtual void MethodTwo() {
// Also having automatically a new instance of DbContext
}
}
What I was really hoping for was a DI way of doing this, like public void Method(IMyWayOfContext context):
class MyClass {
public virtual void MethodOne(IMyWayOfContext context) {
}
public virtual void MethodTwo(IMyWayOfContext context) {
}
}
Other classes inheriting from this class must be provided with a new instance of the DbContext. That's why I don't want to create a new instance inside the method.
You could do something like this (generic interface, plus a wrapper with multiple constraints):
class DBContext{ }
interface IDoesMethods<TContext> where TContext : new()
{
void MethodOne(TContext context = default(TContext));
void MethodTwo(TContext context = default(TContext));
}
class MyClass : IDoesMethods<DBContext>
{
public void MethodOne(DBContext context)
{
}
public void MethodTwo(DBContext context)
{
}
}
class MyContextWrapper<TClass, TContext> : IDoesMethods<TContext> where TContext : new() where TClass : IDoesMethods<TContext>, new()
{
public void MethodOne(TContext context = default(TContext))
{
instance.MethodOne(new TContext());
}
public void MethodTwo(TContext context = default(TContext))
{
instance.MethodTwo(new TContext());
}
private TClass instance = new TClass();
}
class Program
{
static void Main(string[] args)
{
var wrapper = new MyContextWrapper<MyClass, DBContext>();
wrapper.MethodOne();
wrapper.MethodTwo();
}
}
Make a property with only a getter that returns a new instance every time:
protected DbContext MyDBContext
{
get
{
return new DbContext();
}
}
EDIT: If you want some kind of dependency injection, you can make your class generic and tell each instance of the class, via the type parameter, which type of context you want:
class MyClass<T> where T : DbContext, new() {
protected DbContext MyDBContext
{
get
{
return Activator.CreateInstance<T>();
}
}
public void MethodOne() {
// Having automatically a new instance of DbContext
}
public void MethodTwo() {
// Also having automatically a new instance of DbContext
}
}
Your simple solution can work this way:
class MyClass {
protected DbContext InternalContext {
get { return new DbContext(); }
}
public virtual void MethodOne(DbContext dc = null) {
if(dc == null)
dc = InternalContext;
// do your work
}
public virtual void MethodTwo(DbContext dc = null) {
if(dc == null)
dc = InternalContext;
// do your work
}
}
In that case, you have to take care of disposing the context created by InternalContext.
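One hedged way to handle that disposal (a sketch, not part of the original answer): only dispose the context when the method created it itself.
public virtual void MethodOne(DbContext dc = null) {
    // Track whether this method created the context; only dispose what it owns.
    bool ownsContext = dc == null;
    if (ownsContext)
        dc = InternalContext;
    try {
        // do your work
    }
    finally {
        if (ownsContext)
            dc.Dispose();
    }
}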
While the answers here look valid, they don't seem to fully satisfy your requirement of a solution that relies on DI.
DI in its simplest form is most often achieved with constructor injection.
Your design was already good and DI-ready.
Indeed, asking for dependencies via the constructor is good.
It is at the composition root of your application that you decide which implementation to pass.
Using a DI library can help (but it is not required to enable DI).
With your current class design:
class MyClass {
public virtual void MethodOne(IMyWayOfContextFactory contextFactory) {
using(var context = contextFactory.Create()){
//play with context
}
}
public virtual void MethodTwo(IMyWayOfContextFactory contextFactory) {
using(var context = contextFactory.Create()){
//play with context
}
}
}
public class ContextFactory : IMyWayOfContextFactory {
public IMyWayOfContext Create() {
return new MyWayOfContext();
}
}
Without a factory and with a DI container like SimpleInjector, you could have:
class MyClass {
public virtual void MethodOne(IMyWayOfContext context) {
//play with context
}
public virtual void MethodTwo(IMyWayOfContext context) {
//play with context
}
}
And register your component once at the composition root with configurable Lifestyle management:
container.Register<IMyWayOfContext, MyWayOfContext>(Lifestyle.Transient);
The latter approach is simpler if you want to configure when to inject which instance of your context; such configuration is built into a DI container library. For instance, see: Lifestyle of component with SimpleInjector.
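For illustration, a minimal composition-root sketch with SimpleInjector, assuming MyClass is reshaped to receive IMyWayOfContext through its constructor (the interface and class names come from the question; the rest is an assumption):
// Composition root (e.g. in Main or at application startup);
// requires the SimpleInjector package and "using SimpleInjector;".
var container = new Container();
container.Register<IMyWayOfContext, MyWayOfContext>(Lifestyle.Transient);
container.Register<MyClass>();
container.Verify();

// The container builds MyClass and supplies its context dependency.
var myClass = container.GetInstance<MyClass>();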
What is the customary way to carry out record inserting/updating?
I have this Log table in an MS SQL Server database, and a C# class (the example is simplified):
[Table(Name = "dbo.Sys_Log")]
public class Sys_Log
{
// Read-only, db-generated primary key ID
private int _logID;
[Column(IsPrimaryKey=true, Storage="_logID", IsDbGenerated=true)]
public int logID
{
get
{
return this._logID;
}
}
// Read-only db-generated datetime field
private System.DateTime _logTime;
[Column(Storage="_logTime", IsDbGenerated=true)]
public System.DateTime logTime
{
get
{
return this._logTime;
}
}
// Read-write string field
private string _logEvent;
[Column(Storage="_logEvent")]
public string logEvent
{
get
{
return this._logEvent;
}
set
{
this._logEvent = value;
}
}
public Sys_Log() {}
public Sys_Log(string logEvent)
{
this.logEvent = logEvent;
}
}
And this is how I add a log entry:
Table<Sys_Log> linqLog = db.GetTable<Sys_Log>();
Sys_Log l = new Sys_Log("event");
linqLog.InsertOnSubmit(l);
db.SubmitChanges();
I am not particularly happy about this code. I'd like something like this instead:
Sys_Log.Log("event");
I have an idea of how this can be achieved, but I'd like to know whether I am following the LINQ philosophy. With this code added to the Sys_Log class:
private static DataContext db;
public static void Connect(DataContext db)
{
Sys_Log.db = db;
}
public static void Log(string logEvent)
{
Table<Sys_Log> linqLog = db.GetTable<Sys_Log>();
Sys_Log l = new Sys_Log(logEvent);
linqLog.InsertOnSubmit(l);
db.SubmitChanges();
}
I can now do this:
Sys_Log.Connect(db); // Only once, at init
Sys_Log.Log("event1");
Sys_Log.Log("event2");
Are there any pitfalls, apart from the fact that the database is updated several times, which could be considered inefficient?
Update
Following the advice of @usr not to reuse the DataContext object, I have made these changes to the Sys_Log class:
private static SqlConnection db;
public static void Connect(SqlConnection db)
{
Sys_Log.db = db;
}
public static void Log(string logEvent)
{
DataContext ctx = new DataContext(db);
ctx.CommandTimeout = 240;
Table<Sys_Log> linqLog = ctx.GetTable<Sys_Log>();
Sys_Log l = new Sys_Log(logEvent);
linqLog.InsertOnSubmit(l);
ctx.SubmitChanges();
}
Use a fresh data context each time. Reusing the same context has two catastrophic consequences:
- No entity memory is ever released.
- When an invalid entity enters the context (due to a bug), it is stuck and will forever prevent SubmitChanges from succeeding. The application will never recover.
Also note that LINQ to SQL (L2S) is deprecated and EF has superseded it.
You can share a SqlConnection and use it long-term if you really want. That requires, though, that you deal with broken connections. Thanks to connection pooling there is little performance incentive to do this.
It is usually easiest and clearest to use throw-away connections. Inject a factory, for example:
Func<SqlConnection> myFactory = () => new SqlConnection(myConnStr);
That's all there is to it. Use it, as always, with using:
using(var conn = myFactory()) { ... }
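For instance, a sketch of the Log method rewritten on top of such a factory (the class and connection-string names follow the question and the snippet above; the rest is an assumption, not the answerer's exact code):
private static Func<SqlConnection> connectionFactory = () => new SqlConnection(myConnStr);

public static void Log(string logEvent)
{
    // A fresh connection and a fresh DataContext per call; both are disposed when done.
    using (var conn = connectionFactory())
    using (var ctx = new DataContext(conn))
    {
        ctx.GetTable<Sys_Log>().InsertOnSubmit(new Sys_Log(logEvent));
        ctx.SubmitChanges();
    }
}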
My services take a DbContext in their constructor, and I have created a UnitOfWork class that contains all my services in order to make sure the same DbContext is used between them all.
Sample of the UnitOfWork class:
private myEntities myContext;
public UnitOfWork()
{
myContext = new myEntities();
}
private RequestService requestService;
public RequestService RequestService
{
get
{
if (requestService == null)
requestService = new RequestService(myContext);
return requestService;
}
}
By using this UnitOfWork class, the DbContext for all my services is now consistent, and a change made in one service will appear in another.
However, if I need to replace the actual entity context instance, that change does not get propagated to each service.
Below I have a "Refresh" method that re-initializes it (I need to refresh the context so I can make this class work with some legacy code):
public void Refresh()
{
myContext = new myEntities();
}
However, my service classes' DbContext fields aren't passed by ref, so they are not set to the new instance of my entity class, and this results in the context not being refreshed.
So I think I can solve this by passing by ref, as shown below.
Service class sample:
MyEntities myContext;
public RequestService(ref MyEntities myContext)
{
this.myContext = myContext;
}
However, I have seen people say you should not pass context classes by ref, so I am curious whether there is a better way out there, or whether I am looking at this the wrong way.
Edit
Sorry, it turns out my proposed solution of passing by ref does not solve my problem, but I am still interested in how I can update the entity context on the UnitOfWork class (e.g. by setting it to null) and have that affect the service classes.
Never, ever share a DbContext, whether by reference or otherwise. It is not thread safe.
http://msdn.microsoft.com/en-us/data/jj729737.aspx
If you need an easy way to generate multiple DbContext, use ObjectPool from Parallel Extensions Extras.
Update 1
@tia is correct in saying that the private instance will not be updated when the original changes:
class Program
{
static void Main(string[] args)
{
var pool1 = new ObjectPool<IDbConnection>(() => new SqlConnection("Data Source=server1"));
var service = new Service(ref pool1);
pool1 = new ObjectPool<IDbConnection>(() => new SqlConnection("Data Source=server2"));
Console.WriteLine(service.Pool.GetObject().ConnectionString);
}
}
class Service
{
private ObjectPool<IDbConnection> connectionPool;
public Service(ref ObjectPool<IDbConnection> pool) { this.connectionPool = pool; }
public ObjectPool<IDbConnection> Pool { get { return connectionPool; } }
}
This will print "Data Source=server1", even if connectionPool were a static field.
Enter Monostate, a wicked pattern very similar to Singleton.
class Program
{
static void Main(string[] args)
{
var mop = new MonoObjectPool();
mop.Pool = new ObjectPool<IDbConnection>(() => new SqlConnection("Data Source=server1"));
var service = new Service();
mop.Pool = new ObjectPool<IDbConnection>(() => new SqlConnection("Data Source=server2"));
Console.WriteLine(service.Pool.GetObject().ConnectionString);
}
}
internal class MonoObjectPool
{
private static ObjectPool<IDbConnection> pool1;
public ObjectPool<IDbConnection> Pool
{
get { return pool1; }
set { pool1 = value; }
}
}
class Service
{
public ObjectPool<IDbConnection> Pool { get { return new MonoObjectPool().Pool; } }
}
I am getting rid of the constructor for Service, as I can always get the current IDbConnection generator. There will only ever be one instance of it, regardless of how many times someone instantiates MonoObjectPool.
Update 2
The other option might be to use Autofac, but I am not too familiar with it yet, so I can't give you an example of how a type could get resolved inside a service instance. Here is a simple example:
class Program
{
private static IContainer container { get; set; }
static void Main(string[] args)
{
var builder = new ContainerBuilder();
builder.RegisterType<DbCtx1>().As<IDbCtx>();
container = builder.Build();
using (var scope = container.BeginLifetimeScope())
{
var dbctx = scope.Resolve<IDbCtx>();
Console.WriteLine(dbctx.GetType());
}
builder = new ContainerBuilder();
builder.RegisterType<DbCtx2>().As<IDbCtx>();
container = builder.Build();
using (var scope = container.BeginLifetimeScope())
{
var dbctx = scope.Resolve<IDbCtx>();
Console.WriteLine(dbctx.GetType());
}
}
}
interface IDbCtx
{
}
class DbCtx1 : IDbCtx { }
class DbCtx2 : IDbCtx { }
Update 3
So going back to the Monostate, this works as expected:
class Program
{
static void Main(string[] args)
{
var mop = new MonoObjectPool();
mop.Pool = new ObjectPool<IDbConnection>(() => new SqlConnection("Data Source=server1"));
var service = new Service(mop);
mop.Pool = new ObjectPool<IDbConnection>(() => new SqlConnection("Data Source=server2"));
Console.WriteLine(service.Pool.GetObject().ConnectionString);
}
}
internal class MonoObjectPool
{
private static ObjectPool<IDbConnection> pool1;
public ObjectPool<IDbConnection> Pool
{
get { return pool1; }
set { pool1 = value; }
}
}
class Service
{
private MonoObjectPool myPool;
public Service(MonoObjectPool pool) { myPool = pool; }
public ObjectPool<IDbConnection> Pool { get { return myPool.Pool; } }
}
I hope this helps.
I have a problem with some simple code. I'm refactoring some existing code from LINQ to SQL to the Entity Framework. I'm testing my saves and deletes, and the delete is really bugging me:
[TestMethod]
public void TestSaveDelete()
{
ObjectFactory.Initialize(x =>
{
x.For<IArticleCommentRepository>().Use<ArticleCommentRepository>();
});
PLArticleComment plac = new PLArticleComment();
plac.Created = DateTime.Now;
plac.Email = "myemail";
plac.Name = "myName";
plac.Text = "myText";
plac.Title = "myTitle";
IArticleCommentRepository acrep = ObjectFactory.GetInstance<IArticleCommentRepository>();
try
{
PortalLandEntities ple = new PortalLandEntities();
int count = ple.PLArticleComment.Count();
acrep.Save(plac);
Assert.AreEqual(ple.PLArticleComment.Count(), count + 1);
//PLArticleComment newPlac = ple.PLArticleComment.First(m => m.Id == plac.Id);
//ple.Attach(newPlac);
acrep.Delete(plac);
Assert.AreEqual(ple.PLArticleComment.Count(), count + 1);
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
}
Every time I try to run this code, I get an exception in the delete statement telling me that the entity is not contained within the current ObjectStateManager. Please note that my Save and Delete look like this:
public void Delete(PLCore.Model.PLArticleComment comment)
{
using (PortalLandEntities ple = Connection.GetEntityConnection())
{
ple.DeleteObject(comment);
ple.SaveChanges();
}
}
public void Save(PLCore.Model.PLArticleComment comment)
{
using (PortalLandEntities ple = Connection.GetEntityConnection())
{
ple.AddToPLArticleComment(comment);
ple.SaveChanges();
}
}
and the connection thingy:
public class Connection
{
public static PortalLandEntities GetEntityConnection()
{
return new PortalLandEntities();
}
}
Any ideas on what I could do to make it work?
You cannot load an entity from one ObjectContext (in your case, an ObjectContext is an instance of PortalLandEntities) and then delete it from another ObjectContext, unless you detach it from the first and attach it to the second. Your life will be much, much simpler if you use only one ObjectContext at a time. If you cannot do that, you must manually Detach and then Attach first, all the while keeping track of which entities are attached to which ObjectContext.
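As a rough sketch of that detach/attach dance (the entity-set and context names are taken from the question; this assumes the comment was loaded and is tracked by the first context):
using (var first = new PortalLandEntities())
using (var second = new PortalLandEntities())
{
    var comment = first.PLArticleComment.First();

    first.Detach(comment);                    // stop tracking it in the first context
    second.PLArticleComment.Attach(comment);  // start tracking it in the second context
    second.DeleteObject(comment);
    second.SaveChanges();
}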
How to use DI with your Connection: make it non-static.
public class Connection
{
private PortalLandEntities _entities;
public PortalLandEntities GetEntityConnection()
{
return _entities;
}
public Connection(PortalLandEntities entities)
{
this._entities = entities;
}
}
Then use a DI container per request. Most people do this via a controller factory.
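To illustrate (a sketch, not your existing code: the repository shape is borrowed from the question, the rest is assumption), the repository would then receive the Connection through its constructor, so the container controls the context's lifetime and both operations share one ObjectStateManager:
public class ArticleCommentRepository : IArticleCommentRepository
{
    private readonly Connection _connection;

    // The DI container supplies one Connection (and therefore one context) per request.
    public ArticleCommentRepository(Connection connection)
    {
        _connection = connection;
    }

    public void Save(PLCore.Model.PLArticleComment comment)
    {
        var ple = _connection.GetEntityConnection();
        ple.AddToPLArticleComment(comment);
        ple.SaveChanges();
    }

    public void Delete(PLCore.Model.PLArticleComment comment)
    {
        var ple = _connection.GetEntityConnection();
        ple.DeleteObject(comment);
        ple.SaveChanges();
    }
}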