I am trying to use the Attach method to update an entity that was retrieved via a stored proc.
The stored proc is set up to return a specific instance, which is present in my dbml. The retrieval works as expected and returns a fully populated object. The reason I need to use a stored proc is that I need to update a property on that entity at the same time that it is retrieved.
After I have retrieved this entity, I am mapping it using AutoMapper to another model which is used in another tier of the app. This tier performs a few operations, and makes a change to the entity, and passes it back to the repository for updating.
The repository converts this business model back into a database model, and attempts to attach it to the datacontext in order to take advantage of the automagic updating.
No matter what combination I try, Attach(entity, true), Attach(entity), etc., it gives me messages like "Row not found or changed" or "Unable to add an entity with the same primary key".
Does anyone have any experience with the Attach method and how it can be used to update entities that did not necessarily come from the data context using query syntax (ie in this case a stored proc)?
Thanks a lot
First, if you are creating a copy of the object, making changes and then trying to attach the copied object to the same DataContext as the one with the original object in it, then this would probably result in the "Unable to add an entity with the same primary key" message. One way to handle this is:
1. Get object from DataContext
2. Make changes and map object (or vice versa - whatever order)
3. Update the original object with the new values made in the other tier
4. SubmitChanges on the DataContext containing the original object
or
1. Get the object from a DataContext and close the DataContext
2. Make your changes and do your mapping
3. Retrieve the object from the DataContext to which you want to save
4. Update that object with the values from your mapped object
5. SubmitChanges
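A minimal sketch of that second flow; the context, entity, and property names here are placeholders, not from the original question:

public void SaveChanges(AppModel appModel)
{
    using (var db = new MyDataContext())
    {
        // Re-fetch the row on the context you intend to save with.
        var dbEntity = db.GetTable<DbEntity>().Single(e => e.Id == appModel.Id);

        // Copy the values produced by the other tier onto the tracked instance.
        dbEntity.Name = appModel.Name;
        dbEntity.StatusId = appModel.StatusId;

        // The context tracks the changes and generates the UPDATE.
        db.SubmitChanges();
    }
}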
Alternatively, when you say you are using the proc because you need to update a property at the same time that you retrieve it: I'd need to see the proc, but if you are somehow committing this update after retrieving the information, then the "Row not found or changed" message is indeed correct. This would be hard to do, but you could do it if you're loading the data into a temp table, doing the update, and then using a select from the temp table to populate the object. One thing you could try is setting that property, in the L2S designer, to AutoUpdate = Never and see if that makes the problem go away. If so, this is your problem.
1. Is it the same data context, and
2. Is it the same entity instance (or one that looks like it)?
This would only happen for the same data-context, I suspect. If it is the same entity, then it is already there; just call SubmitChanges. Otherwise, either use a second data-context or detach the original entity.
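For the stored-proc scenario in the question, the second-data-context option might look roughly like this. It is only a sketch: the context and entity names are placeholders, and Attach(entity, true) assumes the entity has a version member or no update-check columns.

public void Persist(Models.AppModel model)
{
    // However the mapping is done; the result is an untracked database entity.
    var dbEntity = Mapper.Map<DbEntity>(model);

    // A fresh context that has never seen this row, so Attach won't collide
    // with an instance the original context is still tracking.
    using (var db = new MyDataContext())
    {
        // true = attach as modified so an UPDATE is issued on SubmitChanges.
        db.GetTable<DbEntity>().Attach(dbEntity, true);
        db.SubmitChanges();
    }
}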
So if I retrieved the entity via a stored proc, is it being tracked by the datacontext?
The thing is, I'm going from the data model to another model that is used by another component, and then back. It's not really the same instance, but it does have all the same properties. For example:
public Models.Tag GetEntity()
{
    var dbTag = db.PROJ_GetEntity((int)EntityStatuses.Created, (int)EntityStatuses.CreatingInApi).SingleOrDefault();
    return FromDbEntity(dbTag);
}
var appModel = GetEntity(); // gets an Entity from a stored proc (NOT GetEntity_RESULT)
appModel.MakeSomeChanges();
_Repo.Persist(appModel);
public void Persist(Models.AppModel model)
{
    var dbEntity = Mapper.Map(model);
    db.Attach(dbEntity);
    db.SubmitChanges();
}
This is somewhat pseudo-code-like, but it demonstrates pretty much exactly what I am doing.
Thanks
I'm upvoting weenet's answer because he's right - you can't use Attach to apply the changes.
Unlike Entity Framework, you can only attach an L2S object to a datacontext if it has never been attached before - i.e. it's a newed entity that you want to Insert into a table.
This does cause numerous problems in multi-layered environments - however I've been able to get around many of the issues by creating a generic entity synchronisation system, which uses reflection and expression trees.
After an object has been modified, I run the dynamic delegate against a new object from the DC and the modified object, so that only the differences are tracked in the DC before generating the UPDATE statement. It does get a bit tricky with related entities, though.
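Stripped down to plain reflection (the expression-tree version just compiles the same copy logic into a delegate for speed), the synchronisation idea is roughly the following sketch, not the actual code:

using System;
using System.Linq;
using System.Reflection;

public static class EntitySync
{
    // Copies scalar property values that differ from 'source' onto 'target',
    // so the DataContext only sees the members that actually changed.
    // Association properties (EntityRef/EntitySet) are deliberately skipped.
    public static void SyncScalarProperties<T>(T source, T target)
    {
        var scalarProperties = typeof(T)
            .GetProperties(BindingFlags.Public | BindingFlags.Instance)
            .Where(p => p.CanRead && p.CanWrite)
            .Where(p => p.PropertyType.IsValueType || p.PropertyType == typeof(string));

        foreach (var property in scalarProperties)
        {
            var newValue = property.GetValue(source, null);
            var oldValue = property.GetValue(target, null);
            if (!object.Equals(newValue, oldValue))
                property.SetValue(target, newValue, null);
        }
    }
}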
Related
I'm new to C# and .NET Core.
I'm wondering why, when editing a model using Bind, it creates a new model and binds the attributes from the post. But if you do not place all your fields in the form post as hidden fields, the bind will null them out.
Shouldn't it load the model, update the bound parameters, and leave the other ones alone?
For example if I'm updating a person and person has
Id, Name, Age, updated, Created
Edit(int id, [Bind("Id,Name,Age")] Person p)
When I go to _context.Update(p), it's nulling out updated and Created because they weren't bound.
WHY does it work like that?
How can I make it only update the bound parameters without nulling out the ones I don't need to load?
What you pass in is a deserialized block of data that MVC is mapping into an entity definition. It doesn't auto-magically open a DbContext, load the entity, and overwrite values, it just creates an instance of the entity, and copies in the values across. Everything else would be left as defaults.
As a general rule I advise against ever passing entities from the client to the server, to avoid confusion about what is being sent back. When performing an update, accept a view model with the applicable properties to update, and ideally the data model and view model should include a row version # (i.e. a Timestamp column in SQL Server that can be converted to/from a Base64 string to send/compare in the ViewModel).
From there, when performing an update, you fetch the entity by ID, compare the timestamp, then leverage AutoMapper to handle copying data from the ViewModel into the Entity, or copy the values across manually.
That way, when your ViewModel comes back with the data to update:
using (var context = new AppDbContext())
{
    var entity = context.Persons.Single(x => x.Id == id);
    if (entity.TimeStampBase64 != viewModel.TimestampBase64)
    {
        // Handle fact that data has changed since the client last loaded it.
    }

    // AutoMapper copies from source (the ViewModel) to destination (the tracked entity).
    Mapper.Map(viewModel, entity);
    context.SaveChanges();
}
You could use the entity definition as-is and still load the existing entity and use AutoMapper to copy values from the transit entity class to the DbContext-tracked one. However, it's better to avoid mixing up tracked "real" entity instances with potentially incomplete, untracked transit instances. Code will often have methods that accept entities as parameters to do things like validation, calculations, etc., and it can get confusing if those methods assume they will get "real" entities but are later called somewhere that only has a transient DTO flavour of an entity.
It might seem simpler to take an entity in and just call DbContext.Update(entity) with it, however this leaves you with a number of issues including:
You need to pass everything about the entity to the client so that the client can send it back to the server. This requires hidden inputs or serializing the entire entity in the page, which exposes more about your domain model to the browser and increases the payload size to the client and back.
Because you need to serialize everything, "quick fixes" like serializing the entire entity in a <script> block for later use can lead to lazy-load "surprises" as the serializer will attempt to touch all navigation properties.
Passing an entity back to the server to perform an Update() means trusting the data coming back from the client. Browser debug tools can intercept a Form submit or Ajax POST and tamper with the data in the payload, which opens the door to unexpected tampering. DbContext.Update also results in an UPDATE statement that overwrites all columns whether anything changed or not, whereas change-tracked entities only build UPDATE statements for the values that actually changed, and only when something has changed.
Let's assume I have the following situation: the update method in my service accepts a model (the one that is going to be updated) as an input parameter. The model can be unattached (in which case the Attach method is called before submitting changes) or attached (in which case we just submit changes). Edit actions just call this update method in my service. Now let's assume I cannot change the code in those actions (the code that produces the model to be updated). Can I still somehow prevent certain columns from updating from within the update method? Note that I might want to set those columns using LINQ to SQL, but only during the insert method.
I'm quite sure I'm trying something unconventional here, but it might help me write some easy to reuse code. If it cannot be done, then I'll solve it differently, but it never hurts to try something new.
The Attach method does provide an overload that accepts both a modified and an original entity.
Attach Modified Entity on Data Context
When using this the internal change tracker will figure out which columns have been updated and will only update those ones on the datasource which have changed, rather than updating all columns.
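A sketch of that two-argument overload, in the same shape as the example further down (it assumes the model carries both the modified and the original instance):

void Update(MyModel model)
{
    using (MyContext ctx = new MyContext())
    {
        // Both instances go in: the change tracker diffs them and only the
        // columns whose values differ are included in the UPDATE statement.
        ctx.MyEntities.Attach(model.ModifiedEntity, model.OriginalEntity);
        ctx.SubmitChanges();
    }
}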
Alternatively if you want more explicit control over which properties are updated, you can reattach your entity as unmodified in its original state:
Attach Modified/Unmodified Entity on Data Context
This will hook up the internal change tracker to the PropertyChanging events on the entity so it can be tracked. You would then simply change the values of the properties on that entity in the Update method on your Service.
void Update(MyModel model)
{
    using (MyContext ctx = new MyContext())
    {
        // Disable deferred loading so attaching doesn't trigger lazy loads.
        ctx.DeferredLoadingEnabled = false;

        // Attach as unmodified; the change tracker now watches this instance.
        ctx.MyEntities.Attach(model.OriginalEntity);

        // Only the properties changed after the Attach end up in the UPDATE.
        model.OriginalEntity.Value = model.ModifiedEntity.Value;
        ctx.SubmitChanges();
    }
}
The pitfall of these approaches is that you must maintain both the original and modified entities in your model, but they could be set when your entities are loaded: a simple shallow copy of the object should do the trick, implementing ICloneable in a partial class for each entity.
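A shallow copy in a partial class might look like this (a sketch; MyEntity stands in for one of your generated entity classes):

public partial class MyEntity : ICloneable
{
    // Shallow copy: scalar members are copied as-is, which is enough to keep
    // an "original" snapshot around for the two-argument Attach overload.
    public object Clone()
    {
        return this.MemberwiseClone();
    }
}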
For each of my tables there is a Key value, so my question is: how can I get the Key property?
I want to search for the object I got in the parameter and update the table with this updated object. How do I find my object in the table?
public static void Update<TEntity>(TEntity UpdatedObject) where TEntity : class
{
    DatabaseLinqDataContext ClientLinq = new DatabaseLinqDataContext(Patch);
    ClientLinq.GetTable<TEntity>(); // I want to search for the object I got in the parameter and update it. How do I find my object in the table?
    ClientLinq.SubmitChanges();
}
Any idea what the right way to solve this problem is?
This action is written in the Business Logic Layer.
There are a few cases to consider.
If the entity instance was previously loaded by this datacontext, just call SubmitChanges. There's no need to do anything else (all modified instances will be updated at that point).
If the entity instance was previously loaded by a different datacontext, you need to make a new copy which will not be tracked, then follow the next case.
If the entity instance was not loaded by a datacontext (or the instance is a copy from above), then you need to call
//attach the changes to the tracked object graph
dataContext.GetTable<T>().Attach(t, true);
//if type uses optimistic concurrency, then you need to refresh
//load the "original" state for the tracked object graph
dataContext.Refresh(t, RefreshMode.KeepCurrentValues);
dataContext.SubmitChanges();
Or, it would be better to call:
dataContext.GetTable<T>().Attach(tChanged, tOriginal);
dataContext.SubmitChanges();
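Applied to the generic method from the question, the first option might look like this. It is only a sketch: it assumes the entity declares a version member or has UpdateCheck = Never on its columns; otherwise pair it with the Refresh call shown above.

public static void Update<TEntity>(TEntity updatedObject) where TEntity : class
{
    using (var clientLinq = new DatabaseLinqDataContext(Patch))
    {
        // Attach as modified: LINQ to SQL locates the row by the mapped
        // primary key, so there is no need to query for the object first.
        clientLinq.GetTable<TEntity>().Attach(updatedObject, true);
        clientLinq.SubmitChanges();
    }
}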
I'm struggling with Entity Framework code first and merging.
I have an MVC controller with a generic repository. A view model gets posted up and I convert that into the type that EF knows about
var converted = AutoMapper.Mapper.Map<RoutineViewModel, Routine>(result);
_routineRepository.Update(converted);
In the repository I have:
/*
Routines.Attach(item);
ChangeTracker.Entries<Routine>().Single(x => x.Entity.Id == item.Id).State = EntityState.Modified;*/
var match = Routines.Single(x => x.Id == item.Id);
var entity = Entry(match);
entity.CurrentValues.SetValues(item);
I commented out the first bit because it was throwing an error about already tracking the entity even though a check like this:
if (ChangeTracker.Entries<Routine>().Count(x => x.Entity.Id == item.Id) != 0)
returned false
The problem I'm having is that the Routine object has an ICollection property of Steps. When I set the values of the tracked entity to match those of the POCO, the ICollection changes aren't propagated down. Looking around this site, there look to be a few nasty-looking recursive solutions. Is this really how it works, or am I missing something?
Is there any easy way to say, here is the source object (untracked), copy everything about it into the tracked object?
Just to be clear, I don't think that getting the object first and updating its properties should be done outside of the repository. That not only forces you to pass your data models across domain boundaries, but it also turns what should be the equivalent of a single SQL statement (update x, y where id = 1) into something like: insert into a temp table where id = 1, then for each row in the temp table update x..., then for each row update table set x = tempx where id = 1.
Edit --
So the problem is that SetValues is not a recursive call. The Routine object has two simple properties (Id and Name) and one complex one (ICollection<Step>). If the item coming in has the name changed and some steps changed, SetValues picks up the name change but doesn't apply the changes to the children. Is there some other way to do this? It seems a little creaky to me that I have to hand-roll this functionality.
From what I can tell you are creating your entity, populating properties and then attaching it to the context. This is kind of the wrong way round with EF.
If you want to attach an object which is already in the DB but isn't being tracked, you can use Attach, but only changes made after the Attach call are recorded to be committed to the DB. If you want to use Attach, make sure you make your changes after calling that method.
In addition EF only allows you to attach an object which is not currently in the object graph. So if you try to attach the same object twice (or one with the same key) you will be given an error such as the one you are seeing.
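For the child-collection issue from the question: SetValues only copies scalar properties on the parent, so the Steps collection has to be reconciled by hand. Below is a rough sketch of one way to do that, assuming the collection property is named Steps, Step has an Id key, and the repository derives from DbContext as the question's code suggests:

// Load the parent with its children so both sides can be compared.
var match = Routines.Include("Steps").Single(x => x.Id == item.Id);
Entry(match).CurrentValues.SetValues(item);   // scalar properties on the parent

// Remove children the client deleted.
foreach (var existingStep in match.Steps.ToList())
{
    if (!item.Steps.Any(s => s.Id == existingStep.Id))
        Set<Step>().Remove(existingStep);
}

// Add new children and update existing ones.
foreach (var incomingStep in item.Steps)
{
    var existingStep = match.Steps.SingleOrDefault(s => s.Id == incomingStep.Id);
    if (existingStep == null)
        match.Steps.Add(incomingStep);
    else
        Entry(existingStep).CurrentValues.SetValues(incomingStep);
}

SaveChanges();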
I have a Linq object, and I want to make changes to it and save it, like so:
public void DoSomething(MyClass obj) {
    obj.MyProperty = "Changed!";
    MyDataContext dc = new MyDataContext();
    dc.GetTable<MyClass>().Attach(obj, true); // throws exception
    dc.SubmitChanges();
}
The exception is:
System.InvalidOperationException: An entity can only be attached as modified without original state if it declares a version member or does not have an update check policy.
It looks like I have a few choices:
1. Put a version member on every one of my Linq classes & tables (100+) that I need to use in this way.
2. Find the data context that originally created the object and use that to submit changes.
3. Implement OnLoaded in every class and save a copy of this object that I can pass to Attach() as the baseline object.
4. To hell with concurrency checking; load the DB version just before attaching and use that as the baseline object (NOT!!!)
Option (2) seems the most elegant method, particularly if I can find a way of storing a reference to the data context when the object is created. But - how?
Any other ideas?
EDIT
I tried to follow Jason Punyon's advice and create a concurrency field on one table as a test case. I set all the right properties (Time Stamp = true etc.) on the field in the dbml file, and I now have a concurrency field... and a different error:
System.NotSupportedException: An attempt has been made to Attach or Add an entity that is not new, perhaps having been loaded from another DataContext. This is not supported.
So what the heck am I supposed to attach, then, if not an existing entity? If I wanted a new record, I would do an InsertOnSubmit()! So how are you supposed to use Attach()?
Edit - FULL DISCLOSURE
OK, I can see it's time for full disclosure of why all the standard patterns aren't working for me.
I have been trying to be clever and make my interfaces much cleaner by hiding the DataContext from the "consumer" developers. This I have done by creating a base class
public class LinqedTable<T> where T : LinqedTable<T> {
...
}
... and every single one of my tables has the "other half" of its generated version declared like so:
public partial class MyClass : LinqedTable<MyClass> {
}
Now LinqedTable has a bunch of utility methods, most particularly things like:
public static T Get(long ID) {
    // code to load the record with the given ID
    // so you can write things like:
    //     MyClass obj = MyClass.Get(myID);
    // instead of:
    //     MyClass obj = myDataContext.GetTable<MyClass>().Where(o => o.ID == myID).SingleOrDefault();
}

public static Table<T> GetTable() {
    // so you can write queries like:
    //     var q = MyClass.GetTable();
    // instead of:
    //     var q = myDataContext.GetTable<MyClass>();
}
Of course, as you can imagine, this means that LinqedTable must somehow be able to have access to a DataContext. Up until recently I was achieving this by caching the DataContext in a static context. Yes, "up until recently", because that "recently" is when I discovered that you're not really supposed to hang on to a DataContext for longer than a unit of work, otherwise all sorts of gremlins start coming out of the woodwork. Lesson learned.
So now I know that I can't hang on to that data context for too long... which is why I started experimenting with creating a DataContext on demand, cached only on the current LinqedTable instance. This then led to the problem where the newly created DataContext wants nothing to do with my object, because it "knows" that it's being unfaithful to the DataContext that created it.
Is there any way of pushing the DataContext info onto the LinqedTable at the time of creation or loading?
This really is a poser. I definitely do not want to compromise on all these convenience functions I've put into the LinqedTable base class, and I need to be able to let go of the DataContext when necessary and hang on to it while it's still needed.
Any other ideas?
Updating with LINQ to SQL is, um, interesting.
If the data context is gone (which in most situations, it should be), then you will need to get a new data context, and run a query to retrieve the object you want to update. It's an absolute rule in LINQ to SQL that you must retrieve an object to delete it, and it's just about as iron-clad that you should retrieve an object to update it as well. There are workarounds, but they are ugly and generally have lots more ways to get you in trouble. So just go get the record again and be done with it.
Once you have the re-fetched object, then update it with the content of your existing object that has the changes. Then do a SubmitChanges() on the new data context. That's it! LINQ to SQL will generate a fairly heavy-handed version of optimistic concurrency by comparing every value in the record to the original (in the re-fetched) record. If any value changed while you had the data, LINQ to SQL will throw a concurrency exception. (So you don't need to go altering all your tables for versioning or timestamps.)
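In code, the re-fetch-and-copy pattern looks roughly like this (a sketch; the type and property names are placeholders):

public void Update(MyClass changedObj)
{
    using (var dc = new MyDataContext())
    {
        // Re-fetch the current row on a fresh data context.
        var fresh = dc.GetTable<MyClass>().Single(o => o.ID == changedObj.ID);

        // Copy across the values you changed.
        fresh.MyProperty = changedObj.MyProperty;

        // Change tracking on this context generates the UPDATE.
        dc.SubmitChanges();
    }
}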
If you have any questions about the generated update statements, you'll have to break out SQL Profiler and watch the updates go to the database. Which is actually a good idea, until you get confidence in the generated SQL.
One last note on transactions - the data context will generate a transaction for each SubmitChanges() call, if there is no ambient transaction. If you have several items to update and want to run them as one transaction, make sure you use the same data context for all of them, and wait to call SubmitChanges() until you've updated all the object contents.
If that approach to transactions isn't feasible, then look up the TransactionScope object. It will be your friend.
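A minimal TransactionScope sketch, in case a single data context isn't practical (the context name is a placeholder):

// Requires a reference to System.Transactions.
using (var scope = new System.Transactions.TransactionScope())
{
    using (var dc1 = new MyDataContext())
    {
        // ... first batch of changes ...
        dc1.SubmitChanges();
    }

    using (var dc2 = new MyDataContext())
    {
        // ... second batch of changes ...
        dc2.SubmitChanges();
    }

    // Both submits commit together; disposing without Complete() rolls back.
    scope.Complete();
}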
I think option 2 is not the best one. It sounds like you're going to create a single DataContext and keep it alive for the entire lifetime of your program, which is a bad idea. DataContexts are lightweight objects meant to be spun up when you need them. Trying to keep the references around is also probably going to tightly couple areas of your program you'd rather keep separate.
Running a hundred ALTER TABLE statements one time, regenerating the context and keeping the architecture simple and decoupled is the elegant answer...
find the data context that originally created the object and use that to submit changes
Where did your datacontext go? Why is it so hard to find? You're only using one at any given time right?
So what the heck am I supposed to attach, then, if not an existing entity? If I wanted a new record, I would do an InsertOnSubmit()! So how are you supposed to use Attach()?
You're supposed to attach an instance that represents an existing record, but one that was not loaded by another DataContext; you can't have two contexts tracking record state on the same instance. If you produce a new instance (i.e. a clone) you'll be good to go.
You might want to check out this article and its concurrency patterns for update and delete section.
The "An entity can only be attached as modified without original state if it declares a version member" error when attaching an entitity that has a timestamp member will (should) only occur if the entity has not travelled 'over the wire' (read: been serialized and deserialized again). If you're testing with a local test app that is not using WCF or something else that will result in the entities being serialized and deserialized then they will still keep references to the original datacontext through entitysets/entityrefs (associations/nav. properties).
If this is the case, you can work around it by serializing and deserializing it locally before calling the datacontext's .Attach method. E.g.:
internal static T CloneEntity<T>(T originalEntity)
{
    Type entityType = typeof(T);
    DataContractSerializer ser = new DataContractSerializer(entityType);

    using (MemoryStream ms = new MemoryStream())
    {
        ser.WriteObject(ms, originalEntity);
        ms.Position = 0;
        return (T)ser.ReadObject(ms);
    }
}
Alternatively you can detach it by setting all entitysets/entityrefs to null, but that is more error prone so although a bit more expensive I just use the DataContractSerializer method above whenever I want to simulate n-tier behavior locally...
(related thread: http://social.msdn.microsoft.com/Forums/en-US/linqtosql/thread/eeeee9ae-fafb-4627-aa2e-e30570f637ba )
You can reattach to a new DataContext. The only thing that prevents you from doing so under normal circumstances is the property changed event registrations that occur within the EntitySet<T> and EntityRef<T> classes. To allow the entity to be transferred between contexts, you first have to detach the entity from the DataContext, by removing these event registrations, and then later on reattach to the new context by using the DataContext.Attach() method.
Here's a good example.
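The detach step usually ends up as a member on a partial class that replaces the entity's association storage. Here is a rough sketch; the field names (_Orders, _Customer) and association types are hypothetical and depend on your generated classes, which use EntitySet<T> and EntityRef<T> from System.Data.Linq:

public partial class MyClass
{
    // Replacing the association storage drops the event registrations that
    // tie this instance back to the DataContext that loaded it.
    public void Detach()
    {
        this._Orders = new EntitySet<Order>();
        this._Customer = default(EntityRef<Customer>);
    }
}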
When you retrieve the data in the first place, turn off object tracking on the context that does the retrieval. This will prevent the object state from being tracked on the original context. Then, when it's time to save the values, attach to the new context, refresh to set the original values on the object from the database, and then submit changes. The following worked for me when I tested it.
MyClass obj = null;

using (DataContext context = new DataContext())
{
    context.ObjectTrackingEnabled = false;
    obj = (from p in context.MyClasses
           where p.ID == someId
           select p).FirstOrDefault();
}

obj.Name += "test";

using (DataContext context2 = new DataContext())
{
    context2.MyClasses.Attach(obj);
    context2.Refresh(System.Data.Linq.RefreshMode.KeepCurrentValues, obj);
    context2.SubmitChanges();
}