We're implementing Entity Framework inside a WinForms application using DbContext/Code First and have the following question regarding the proper way to check/handle when an entity has been deleted or updated in another context.
For example, we have some auxiliary table data (e.g. StateCodes), and the user could go into another form and add/remove states as needed. This auxiliary editor form uses its own DbContext and saves the changes once the user exits the form. Upon returning to the main form, the main context is unaware of the changes made to the database, so we'd like to reload the DbSet for the entity. Unfortunately, it appears that if we remove the "MI" state code, it still exists in the Local property of the DbSet with an EntityState of Unchanged, even after we call "Load" to bring in everything.
Outside of completely disposing of the main context, would the following be the best way to check which entities have been removed from the database?
// Iterate over a copy of Local; removing entities while enumerating the
// live collection would throw an InvalidOperationException.
foreach (State state in db.States.Local.ToList())
{
    DbEntityEntry entry = db.Entry(state);
    DbPropertyValues databaseValues = entry.GetDatabaseValues();
    if (databaseValues == null)
    {
        // The row no longer exists in the database.
        db.States.Remove(state);
    }
    else
    {
        entry.OriginalValues.SetValues(databaseValues);
    }
}
Thank you for your help
You shouldn't keep the context alive past its unit of work. The context should only survive as long as it's needed; otherwise you're bound to run into caching pitfalls like the one you're observing. (Also, the context really isn't so heavy that instantiating it when you need it is overly time-consuming or resource-intensive.)
If you really must keep it alive, you may want to look into passing the context to the auxiliary form.
Mirrored from my comment; figured it's best served as an answer.
First, what Brad said. Only keep the context alive for the specific unit of work and dispose it. Not doing this will lead to nothing but headaches.
You can also check the entity's state by using the ObjectStateManager, passing in the object or the entity key. You can also use the
public void Refresh(
RefreshMode refreshMode,
IEnumerable collection
)
method on the context. Also, you can check the entry's state.
http://msdn.microsoft.com/en-us/library/bb503718.aspx
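For example, a minimal sketch of the Refresh approach against the question's DbContext (assuming EF 4.x/5, where the underlying ObjectContext is reachable via IObjectContextAdapter; in EF6 the RefreshMode type lives in System.Data.Entity.Core.Objects instead):

using System.Data.Entity.Infrastructure;   // IObjectContextAdapter
using System.Data.Objects;                 // RefreshMode
using System.Linq;

var objectContext = ((IObjectContextAdapter)db).ObjectContext;

// StoreWins overwrites the locally cached values with the current database values;
// cached entities whose rows no longer exist should end up Detached.
objectContext.Refresh(RefreshMode.StoreWins, db.States.Local.ToList());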
The scenario here is that each screen (view) has one ViewModel behind it. And as a best (or recommended) practice, we should use one long-lived DbContext for each ViewModel.
So there is a requirement to reload the related entities if some change (newly added / deleted entities) was made in another ViewModel.
Here are some solutions to this issue:
Publish some event or send some message to notify about the change; the subscriber ViewModels can then:
Add/remove the added/deleted entities accordingly without having to reload the entities; this amounts to syncing data between ViewModels. It has its own complexity, because the added/removed entities here should not have their state tracked (meaning the state should be Unchanged, not Added or Deleted, because these changes have already been saved to the database). Also, proxied entities cannot be added to multiple DbContexts... too many issues here.
Reload all the related entities. This is not naturally supported by EF.
Just reload the whole ViewModel when switching screens (meaning the ViewModel won't be kept alive for the whole lifetime of the application). This may be applicable in some cases, but it's not flexible enough to cover every case (e.g. some change may be made from outside the application, by another application; usually we just need a Refresh button on the current view to refresh the data, so reloading the whole ViewModel affects the current view unnecessarily and may cause some bad visual effects, ...).
So I'm really looking for a good solution to this by reloading the related entities. From Googling around, it looks like this is not easily done with Entity Framework; the quickest and safest way is to just create and use a new DbContext, which means creating and using a new ViewModel (please note that I'm using dependency injection to inject the DbContext into the ViewModel, so the DbContext's lifetime is actually the same as the ViewModel's).
I can Google for some hacky code to reload entities in Entity Framework, but I don't really like hacky stuff. So if possible, please share your approach or your solutions to this issue, or even persuade me that the hacky stuff is just fine.
we should use one long-lived DbContext for each ViewModel
I wouldn't say this is true.
You can, and probably should, create a new DbContext instance for every load/update operation.
Using different DbContext instances gives you the ability to execute queries asynchronously.
For Windows applications (WinForms, WPF), asynchronous database access can hugely improve loading times while the application remains responsive.
With a single DbContext this wouldn't be easy.
Instead of injecting a DbContext, create a DbContext factory and inject it into the ViewModel; then:
using (var context = _contextFactory.Create<MyDbContext>())
{
    // Load on a short-lived context and map to DTOs before the context is disposed.
    var orders = await context.Orders.ToListAsync();
    return orders.Select(order => order.ToOrderDto());
}
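The factory itself can be very thin. A minimal sketch (the IDbContextFactory name and Create<TContext> signature here are illustrative, matching the usage above rather than any particular library):

public interface IDbContextFactory
{
    TContext Create<TContext>() where TContext : DbContext, new();
}

public class DbContextFactory : IDbContextFactory
{
    // Every call hands back a fresh, short-lived context that the caller disposes.
    public TContext Create<TContext>() where TContext : DbContext, new()
    {
        return new TContext();
    }
}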
But what I am afraid of is that your business and view logic rely entirely on the database structure.
Your ViewModel shouldn't depend on the DbContext; instead, depend on an abstraction of the database layer. (Actually, your question is the first wall you hit when you rely on the DbContext directly.)
public interface OrderDataAccess
{
    Task<Order> GetOrder(Guid id);
    Task<IEnumerable<OrderLine>> GetOrderLines(Guid orderId);
}
When you load the whole view, you can load the order and the order lines in parallel:
var orderTask = _dataAccess.GetOrder(id);
var orderLinesTask = _dataAccess.GetOrderLines(id);
await Task.WhenAll(orderTask, orderLinesTask);
this.OrderViewModel = orderTask.Result;
this.OrderLinesViewModels = orderLinesTask.Result;
Then when, for example, you need to reload the order lines:
this.OrderLinesViewModels = await _dataAccess.GetOrderLines(id);
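An EF-backed implementation of that abstraction could look roughly like this sketch (OrdersDbContext, the Id/OrderId properties and the factory are assumptions for illustration, not part of the original answer):

// Assumes System.Data.Entity (EF6) or Microsoft.EntityFrameworkCore for AsNoTracking/ToListAsync.
public class EfOrderDataAccess : OrderDataAccess
{
    private readonly IDbContextFactory _contextFactory;

    public EfOrderDataAccess(IDbContextFactory contextFactory)
    {
        _contextFactory = contextFactory;
    }

    public async Task<Order> GetOrder(Guid id)
    {
        using (var context = _contextFactory.Create<OrdersDbContext>())
        {
            // No tracking needed: the ViewModel only reads the data.
            return await context.Orders.AsNoTracking().SingleOrDefaultAsync(o => o.Id == id);
        }
    }

    public async Task<IEnumerable<OrderLine>> GetOrderLines(Guid orderId)
    {
        using (var context = _contextFactory.Create<OrdersDbContext>())
        {
            return await context.OrderLines.AsNoTracking()
                                .Where(l => l.OrderId == orderId)
                                .ToListAsync();
        }
    }
}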
Using transient DbContext instances just kicks the can down the road. Your ViewModel has some entity data that might be out-of-date. But it also might have unsaved changes. You simply have to decide how you want to handle that on a ViewModel-by-ViewModel basis.
In a Desktop App the ViewModel is the Unit-of-Work, and is still the proper scope for the DbContext.
If you decide you want to reload a tracked entity, or all the tracked entities for a DbContext instance, it shouldn't be a problem. EG something like:
void ReloadAllTrackedEntities()
{
    // Take a snapshot first: Reload() detaches entries whose rows were deleted,
    // which would otherwise modify the collection while it is being enumerated.
    foreach (var entry in ChangeTracker.Entries().ToList())
    {
        entry.Reload();
    }
}
On a side-note, since you're building a desktop app, did you know EF Core supports using INotifyPropertyChanged for change tracking?
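A rough sketch of what that looks like in EF Core (the entity and property names here are placeholders, not from the question):

using System.ComponentModel;
using Microsoft.EntityFrameworkCore;

public class Order : INotifyPropertyChanged
{
    private string _status;

    public int Id { get; set; }

    public string Status
    {
        get { return _status; }
        set
        {
            _status = value;
            PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(nameof(Status)));
        }
    }

    public event PropertyChangedEventHandler PropertyChanged;
}

public class MyDbContext : DbContext
{
    public DbSet<Order> Orders { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Tell EF Core to rely on the property-changed notifications
        // instead of snapshot comparison when detecting changes.
        modelBuilder.HasChangeTrackingStrategy(ChangeTrackingStrategy.ChangedNotifications);
    }
}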
I hope you can help me out; I've been scratching my head all night trying to figure out where this bug comes from.
I'm writing an invoicing application in WinForms.
I have a grid on the form with its data source set to a BindingList object.
Let's just say it's along the lines of:
BindingList<InvoiceLine> MyInvoiceLines = new BindingList<InvoiceLine> { };
Invoice MyInvoice = new Invoice();
Both InvoiceLine and Invoice are entity objects in my model.
I add lines to the grid via:
MyInvoiceLines.Add(new InvoiceLine());
I remove lines from the grid via:
MyInvoiceLines.Remove(LineToBeRemoved);
Where LineToBeRemoved is a property that gets the selected line when the user wants to remove a line, etc.
So eventually I want to save the invoice, so I do this...
foreach(var line in MyInvoiceLines)
{
MyInvoice.InvoiceLines.Add(line);
}
and then call SaveChanges(). However, the lines that were removed from the MyInvoiceLines BindingList are also saved... I've been scratching my head trying to work this out, as NOWHERE in my code from start to finish is the MyInvoiceLines collection referenced by or connected to the data context object before this method, which eventually performs the save.
This is a simplified version of my code, but I can't help thinking I must have got something conceptually wrong, either with the BindingList or with the data context object. It really isn't obvious to me as I'm a noob.
Any help would be appreciated; I'm not just after a fix, maybe some tools or methods I can use to further diagnose this problem...
Update: detaching the item before adding it to the BindingList seemingly fixed it, but deleting the object from the entity has some strange behaviours :/ Thanks everyone.
You could try deleting the object explicitly, i.e.:
foreach (var order in deletedObjectCollection)
{
    _currentContext.DeleteObject(order);
}
rather than relying on its absence from a collection to trigger a delete. In my experience (with EF4) that doesn't work. I use lazy loading, and the absence of an object from the collection could just mean it hasn't been loaded, so it doesn't feel right to rely on its absence. There are probably (almost certainly) more elegant ways to do this, but it is currently working for me.
Generally I've had to do a lot more explicitly with EF than I thought I would.
Has the entity that was removed from the BindingList also been detached from the DbContext?
If the entity is still attached to the context it will still be tracked and therefore changes will be saved.
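One way to see this is to dump what the context is tracking just before SaveChanges(). A minimal sketch, assuming you're on DbContext (EF 4.1+) and the instance is called context; with the older ObjectContext, ObjectStateManager.GetObjectStateEntries serves the same purpose:

foreach (var entry in context.ChangeTracker.Entries())
{
    // Any InvoiceLine that shows up here will be persisted by SaveChanges().
    Console.WriteLine("{0}: {1}", entry.Entity.GetType().Name, entry.State);
}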
I think you have to set a dirty flag for unchanged records and then check it in the SaveChanges() event.
Maybe this helps you...
Are you doing this multiple times? You may have added all the InvoiceLine items to your Invoice, then removed some from the BindingList (not your actual entity!), and then re-added the rest to your entity.
I suppose (though to be honest I'm a little unsure about this point) that since the primary keys match, duplicates aren't saved twice. However, the items that are supposed to be removed are still there.
If your Invoice object is an Entity object it is context-aware and will be tracked by the context. Calling SaveChanges() will save all changes for all Entity objects unless they are detached.
Keep in mind that if you relate these Entity objects to an object graph and attach any node of the object graph to the context, the entire graph will be attached. So if you create a new entity object, like an InvoiceLine, and you relate this new InvoiceLine to an object graph:
MyInvoiceLines.Add(new InvoiceLine());
the entire graph should be tracked by the context at this point.
Ok, I'm working on a project and was just handed a bug that I'm having a bit of trouble with. The code is written in a "different" manner and I think the way the original developers approached this project set it up for some problems, one of which I'm dealing with today. Basically, we have something like this:
// Delete the comment using its own, separate context...
Review_Comment comment = commentContext.Review_Comment.First(c => c.CommentID == commentID);
commentContext.DeleteObject(comment);
commentContext.SaveChanges();

// ...then try to rebuild the Review's comment collection on the main context.
review.Review_Comment.Clear();
review.Review_Comment.Load(System.Data.Objects.MergeOption.OverwriteChanges);
context.SaveChanges();
Let me explain a few things and then I'll explain the problem:
"review" is an instance of the Review class, which is the parent of a set of "Review_Comments" (i.e. Review_Comments belong to a single Review).
The function above is to delete a comment.
The comments, for better or worse, use their own EF4 context (separate from the context that the "review" variable is attached to). This is important.
What the original developer tried to do, I think, was load the comment in a separate context, delete it, then manually update the EntityCollection of Review_Comments on the "Review" entity.
However, when context.SaveChanges() is called, we get the following error:
The operation failed: The relationship could not be changed because one or more of the foreign-key properties is non-nullable. When a change is made to a relationship, the related foreign-key property is set to a null value. If the foreign-key does not support null values, a new relationship must be defined, the foreign-key property must be assigned another non-null value, or the unrelated object must be deleted.
I've seen this error described when people are trying to delete say, an Order object and the related OrderItems are not deleted correctly but this case is different. We're trying to delete a single child object and then update the EntityCollection on another entity using a separate context.
Hope that all makes sense, let me know if I can help clarify anything. Any thoughts?
EDIT: I should mention I was able to get this problem to go away by using the same context that the rest of the page uses. However, in this case, because of several dependencies previous developers have introduced, I have to keep the second context or else I have to rewrite a ton of code to remove the dependencies on the second context just to fix this bug. I'm hoping to find a solution that doesn't involve that much time. The goal is to delete the comment, then to reload a separate entity's Review_Comment EntityCollection and be able to call SaveChanges() without this error.
Your problem is that the .Clear() causes the second context to disassociate the Review_Comments from their Review; it never realizes that the Review_Comment was actually deleted.
You should do this instead:
context.Refresh(RefreshMode.StoreWins, review.Review_Comment);
context.SaveChanges();
If you watch the entity state of the comment in "review.Review_Comment" you should see that after the Refresh, its state becomes "Detached" rather than "Modified"
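If you want to verify that, something along these lines (a sketch using the question's variable names) should show it:

// Grab a reference to the comment tracked by the main context before the Refresh...
var trackedComment = review.Review_Comment.First(c => c.CommentID == commentID);

context.Refresh(RefreshMode.StoreWins, review.Review_Comment);

// ...then check whether the main context still tracks it afterwards.
ObjectStateEntry entry;
bool stillTracked = context.ObjectStateManager.TryGetObjectStateEntry(trackedComment, out entry);
Console.WriteLine(stillTracked ? entry.State.ToString() : "Detached");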
I have a Linq object, and I want to make changes to it and save it, like so:
public void DoSomething(MyClass obj) {
obj.MyProperty = "Changed!";
MyDataContext dc = new MyDataContext();
dc.GetTable<MyClass>().Attach(obj, true); // throws exception
dc.SubmitChanges();
}
The exception is:
System.InvalidOperationException: An entity can only be attached as modified without original state if it declares a version member or does not have an update check policy.
It looks like I have a few choices:
put a version member on every one of my Linq classes & tables (100+) that I need to use in this way.
find the data context that originally created the object and use that to submit changes.
implement OnLoaded in every class and save a copy of this object that I can pass to Attach() as the baseline object.
To hell with concurrency checking; load the DB version just before attaching and use that as the baseline object (NOT!!!)
Option (2) seems the most elegant method, particularly if I can find a way of storing a reference to the data context when the object is created. But - how?
Any other ideas?
EDIT
I tried to follow Jason Punyon's advice and create a concurrency field on one table as a test case. I set all the right properties (Time Stamp = true etc.) on the field in the dbml file, and I now have a concurrency field... and a different error:
System.NotSupportedException: An attempt has been made to Attach or Add an entity that is not new, perhaps having been loaded from another DataContext. This is not supported.
So what the heck am I supposed to attach, then, if not an existing entity? If I wanted a new record, I would do an InsertOnSubmit()! So how are you supposed to use Attach()?
Edit - FULL DISCLOSURE
OK, I can see it's time for full disclosure of why all the standard patterns aren't working for me.
I have been trying to be clever and make my interfaces much cleaner by hiding the DataContext from the "consumer" developers. This I have done by creating a base class
public class LinqedTable<T> where T : LinqedTable<T> {
...
}
... and every single one of my tables has the "other half" of its generated version declared like so:
public partial class MyClass : LinqedTable<MyClass> {
}
Now LinqedTable has a bunch of utility methods, most particularly things like:
public static T Get(long ID) {
// code to load the record with the given ID
// so you can write things like:
// MyClass obj = MyClass.Get(myID);
// instead of:
// MyClass obj = myDataContext.GetTable<MyClass>().Where(o => o.ID == myID).SingleOrDefault();
}
public static Table<T> GetTable() {
// so you can write queries like:
// var q = MyClass.GetTable();
// instead of:
// var q = myDataContext.GetTable<MyClass>();
}
Of course, as you can imagine, this means that LinqedTable must somehow be able to have access to a DataContext. Up until recently I was achieving this by caching the DataContext in a static context. Yes, "up until recently", because that "recently" is when I discovered that you're not really supposed to hang on to a DataContext for longer than a unit of work, otherwise all sorts of gremlins start coming out of the woodwork. Lesson learned.
So now I know that I can't hang on to that data context for too long... which is why I started experimenting with creating a DataContext on demand, cached only on the current LinqedTable instance. This then led to the problem where the newly created DataContext wants nothing to do with my object, because it "knows" that it's being unfaithful to the DataContext that created it.
Is there any way of pushing the DataContext info onto the LinqedTable at the time of creation or loading?
This really is a poser. I definitely do not want to compromise on all these convenience functions I've put into the LinqedTable base class, and I need to be able to let go of the DataContext when necessary and hang on to it while it's still needed.
Any other ideas?
Updating with LINQ to SQL is, um, interesting.
If the data context is gone (which in most situations, it should be), then you will need to get a new data context, and run a query to retrieve the object you want to update. It's an absolute rule in LINQ to SQL that you must retrieve an object to delete it, and it's just about as iron-clad that you should retrieve an object to update it as well. There are workarounds, but they are ugly and generally have lots more ways to get you in trouble. So just go get the record again and be done with it.
Once you have the re-fetched object, then update it with the content of your existing object that has the changes. Then do a SubmitChanges() on the new data context. That's it! LINQ to SQL will generate a fairly heavy-handed version of optimistic concurrency by comparing every value in the record to the original (in the re-fetched) record. If any value changed while you had the data, LINQ to SQL will throw a concurrency exception. (So you don't need to go altering all your tables for versioning or timestamps.)
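A sketch of that pattern, using the question's MyClass/MyDataContext names (the ID and MyProperty members are assumed for illustration):

public void SaveChangesFor(MyClass changedObj)
{
    using (var dc = new MyDataContext())
    {
        // Re-fetch the current record on a fresh context...
        MyClass fresh = dc.GetTable<MyClass>().Single(o => o.ID == changedObj.ID);

        // ...copy the edited values onto it...
        fresh.MyProperty = changedObj.MyProperty;

        // ...and let LINQ to SQL build the UPDATE with its value-by-value concurrency check.
        dc.SubmitChanges();
    }
}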
If you have any questions about the generated update statements, you'll have to break out SQL Profiler and watch the updates go to the database. Which is actually a good idea, until you get confidence in the generated SQL.
One last note on transactions - the data context will generate a transaction for each SubmitChanges() call, if there is no ambient transaction. If you have several items to update and want to run them as one transaction, make sure you use the same data context for all of them, and wait to call SubmitChanges() until you've updated all the object contents.
If that approach to transactions isn't feasible, then look up the TransactionScope object. It will be your friend.
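For example, a sketch of running two SubmitChanges() calls as one transaction (MyDataContext is a placeholder name):

using (var scope = new System.Transactions.TransactionScope())
{
    using (var dc1 = new MyDataContext())
    {
        // ...update some objects...
        dc1.SubmitChanges();
    }

    using (var dc2 = new MyDataContext())
    {
        // ...update other objects...
        dc2.SubmitChanges();
    }

    // Both submits commit or roll back together.
    scope.Complete();
}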
I think 2 is not the best option. It sounds like you're going to create a single DataContext and keep it alive for the entire lifetime of your program, which is a bad idea. DataContexts are lightweight objects meant to be spun up when you need them. Trying to keep the references around is also probably going to tightly couple areas of your program you'd rather keep separate.
Running a hundred ALTER TABLE statements one time, regenerating the context and keeping the architecture simple and decoupled is the elegant answer...
find the data context that originally created the object and use that to submit changes
Where did your datacontext go? Why is it so hard to find? You're only using one at any given time right?
So what the heck am I supposed to attach, then, if not an existing entity? If I wanted a new record, I would do an InsertOnSubmit()! So how are you supposed to use Attach()?
You're supposed to attach an instance that represents an existing record... but one that was not loaded by another DataContext; you can't have two contexts tracking record state on the same instance. If you produce a new instance (i.e. a clone), you'll be good to go.
You might want to check out this article and its concurrency patterns for update and delete section.
The "An entity can only be attached as modified without original state if it declares a version member" error when attaching an entity that has a timestamp member will (should) only occur if the entity has not travelled 'over the wire' (read: been serialized and deserialized again). If you're testing with a local test app that is not using WCF or something else that results in the entities being serialized and deserialized, then they will still keep references to the original DataContext through entity sets/entity refs (associations/nav. properties).
If this is the case, you can work around it by serializing and deserializing it locally before calling the DataContext's .Attach method. E.g.:
internal static T CloneEntity<T>(T originalEntity)
{
    // Round-trip through the DataContractSerializer to drop the original
    // DataContext's event registrations (simulates the entity going "over the wire").
    Type entityType = typeof(T);
    DataContractSerializer ser = new DataContractSerializer(entityType);
    using (MemoryStream ms = new MemoryStream())
    {
        ser.WriteObject(ms, originalEntity);
        ms.Position = 0;
        return (T)ser.ReadObject(ms);
    }
}
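Usage would then be along these lines (a sketch; attaching as modified still relies on the timestamp member you added, and MyClass/MyDataContext are the question's names):

MyClass detachedCopy = CloneEntity(obj);

using (var dc = new MyDataContext())
{
    // Attach the clone as modified; the timestamp column supplies the concurrency check.
    dc.GetTable<MyClass>().Attach(detachedCopy, true);
    dc.SubmitChanges();
}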
Alternatively you can detach it by setting all entity sets/entity refs to null, but that is more error-prone, so although it's a bit more expensive, I just use the DataContractSerializer method above whenever I want to simulate n-tier behavior locally...
(related thread: http://social.msdn.microsoft.com/Forums/en-US/linqtosql/thread/eeeee9ae-fafb-4627-aa2e-e30570f637ba )
You can reattach to a new DataContext. The only thing that prevents you from doing so under normal circumstances is the property-changed event registrations that occur within the EntitySet<T> and EntityRef<T> classes. To allow the entity to be transferred between contexts, you first have to detach the entity from its DataContext by removing these event registrations, and then later reattach it to the new context by using the Table<T>.Attach() method.
Here's a good example.
When you retrieve the data in the first place, turn off object tracking on the context that does the retrieval. This will prevent the object state from being tracked on the original context. Then, when it's time to save the values, attach to the new context, refresh to set the original values on the object from the database, and then submit changes. The following worked for me when I tested it.
MyClass obj = null;
using (DataContext context = new DataContext())
{
    // Turn off tracking so the original context never registers this object.
    context.ObjectTrackingEnabled = false;
    obj = (from p in context.MyClasses
           where p.ID == someId
           select p).FirstOrDefault();
}

obj.Name += "test";

using (DataContext context2 = new DataContext())
{
    context2.MyClasses.Attach(obj);
    // KeepCurrentValues refreshes the original values from the database while
    // keeping the edits made above, so SubmitChanges() sees them as changes.
    context2.Refresh(System.Data.Linq.RefreshMode.KeepCurrentValues, obj);
    context2.SubmitChanges();
}
I've run into a scenario where I essentially need to write the changes of a child entity of a one-to-many association to the database, but not save any changes made to the parent entity.
The Entity Framework currently deals with database commits in the context scope (EntityContext.SaveChanges()), which makes sense for enforcing relationships, etc. But I'm wondering if there is some best practice or maybe a recommended way to go about doing fine-grained database commits on individual entites instead of the entire context.
Best practices? Do you mean, besides, "Don't do it!"?
I don't think there is a best practice for making an ObjectContext different than the state of the database.
If you must do this, I would new up a new ObjectContext and make the changes to the child entity there. That way, both contexts are consistent.
I have a similar need. The solution I am considering is to implement wrapper properties on all entities that store any property changes privately without affecting the actual entity property. I then would add a SaveChanges() method to the entity which would write the changes to the entity and then call SaveChanges() on the context.
The problem with this approach is that you need to make all your entities conform to this pattern. But, it seems to work pretty well. It does have another downside in that if you make a lot of changes to a lot of objects with a lot of data, you end up with extraneous copies in memory.
The only other solution I can think of is to, upon saving changes, save the entity states of all changed/added/deleted entities, set them to unmodified except the one you're changing, save the changes, and then restore the states of the other entities. But that sounds potentially slow.
This can be accomplished by using AcceptAllChanges().
Make your changes to the parent entity, call AcceptAllChanges(), then make your changes to the related Entities and call SaveChanges(). The changes you have made to the parent will not be saved because they have been "committed" to the Entity but not saved to the database.
using (AdventureWorksEntities adv = new AdventureWorksEntities())
{
    var completeHeader = (from o in adv.SalesOrderHeader.Include("SalesOrderDetail")
                          where o.DueDate > System.DateTime.Now
                          select o).First();

    // This change will NOT be persisted...
    completeHeader.ShipDate = System.DateTime.Now;

    // ...because AcceptAllChanges() marks everything as Unchanged without saving.
    adv.AcceptAllChanges();

    var details = completeHeader.SalesOrderDetail.Where(x => x.UnitPrice > 10.0m);
    foreach (SalesOrderDetail d in details)
    {
        d.UnitPriceDiscount += 5.0m;
    }

    // Only the detail changes made after AcceptAllChanges() are saved.
    adv.SaveChanges();
}
This worked for me. Use the ChangeTracker.Clear() method to clear out changes for other entities.
_contextICH.ChangeTracker.Clear();
var x = _contextICH.UnitOfMeasure.Attach(parameterModel);
x.State = (parameterModel.ID != null) ? Microsoft.EntityFrameworkCore.EntityState.Modified : Microsoft.EntityFrameworkCore.EntityState.Added;
_contextICH.SaveChanges();