Updating an Object Using Linq to SQL - c#

I am trying to write some business layer logic for an ASP.NET app to insert or update an object. I am getting an object out of the business layer and then passing it back in to be saved to the database. The data context is contained in the business layer, which is what I think is causing the exception.
The exception is: "An attempt has been made to Attach or Add an entity that is not new, perhaps having been loaded from another DataContext. This is not supported."
I'm sure I am missing some small setting but I'm just not sure what.
This is the code that does the insertion and updating:
public static void Save(Order order)
{
    using (TicketInformationDataContext db = new TicketInformationDataContext())
    {
        if (order.OrderID <= 0)
            db.Orders.InsertOnSubmit(order);
        else
        {
            db.ObjectTrackingEnabled = true;
            ITable table = db.GetTable(typeof(Order));
            table.Attach(order, true);
            db.Orders.Attach(order, true);
        }
        db.SubmitChanges();
    }
}

I believe your problem is that you are creating separate contexts for loading the Order and later for saving it.
You should design your repositories to work against a context you can pass during construction (or preferably inject through IoC).
That way, both operations would work against the same context.
Remember that entities are bound to their context, and attempting to mix them will cause all sorts of problems, especially with lazy initialization or associated entities.
Please see the accepted response for this similar question, which discusses repository design and the unit of work design pattern.
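As a rough sketch of that idea (the OrderRepository class is hypothetical, and I'm assuming OrderID is the integer key used in your Save method), the repository receives the shared context instead of creating its own:
public class OrderRepository
{
    private readonly TicketInformationDataContext _db;

    // The caller (or the IoC container) owns the context and its lifetime.
    public OrderRepository(TicketInformationDataContext db)
    {
        _db = db;
    }

    public Order GetById(int orderId)
    {
        return _db.Orders.SingleOrDefault(o => o.OrderID == orderId);
    }

    public void Save(Order order)
    {
        if (order.OrderID <= 0)
            _db.Orders.InsertOnSubmit(order);
        // An order loaded through GetById is already tracked by _db,
        // so no Attach call is needed before submitting.
        _db.SubmitChanges();
    }
}
Loading and saving then happen against the same context, so Attach never comes into play.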

Related

Insert of entity with association causes duplicate association

When I try to insert an entity that has an association with another entity which is already in the database (and already has an id), Entity Framework inserts the associated entity again. This causes a duplicate entry for the associated entity.
Insert method in the Repository class
public T Insert(T entity)
{
    DbSet.Add(entity);
    Context.SaveChanges();
    return entity;
}
Call of the insert method
This happens somewhere in my code; I save the result into a session variable.
using (var repository = new Repository<User>())
{
    user = repository.GetById(id);
}
Then some other place:
Post post = new Post { User = user, Content = "oO" };
using (var rep = new Repository<Post>())
{
    rep.Insert(post);
}
I resolved the duplicate insert with the snippet below. Is there a better way than casting for every entity type and reattaching the associated entities?
if (entity is Post)
{
    Post post = (Post)(object)entity;
    Context.Users.Attach(post.User);
}
Most probably your repositories create a new instance of the context instead of sharing one. That's why the newly created context in the Post repository doesn't "see" the user object instance.
Fix this properly by controlling the lifetime of your context and injecting the same instance into different repositories in the same unit of work.
Edit: Container-based injection could help you a lot, but start by making the data context a required constructor parameter of your repositories. Then, think about the lifetime of your context.
In a web application a request lifetime is usually most convenient. In a WPF application you could possibly have a view model lifetime, a new context in each view model.
And whether or not the dependency will be satisfied by a container is another story.
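For illustration only, here is a sketch of the Repository&lt;T&gt; from the question reworked to accept a shared context; the GetById helper and the empty Dispose are assumptions on my part:
public class Repository<T> : IDisposable where T : class
{
    private readonly DbContext Context; // shared instance, owned by the unit of work
    private readonly DbSet<T> DbSet;

    public Repository(DbContext context)
    {
        Context = context;
        DbSet = context.Set<T>();
    }

    public T GetById(object id)
    {
        return DbSet.Find(id);
    }

    public T Insert(T entity)
    {
        DbSet.Add(entity);
        Context.SaveChanges();
        return entity;
    }

    public void Dispose()
    {
        // Intentionally empty: whoever created the shared context disposes it.
    }
}
With both the User and the Post repository constructed from the same DbContext for the duration of the request, the user fetched earlier is already known to the context when the post is inserted, and no duplicate is created.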
These generic repositories are out. There are numerous reasons not to use them, and this is one of them. The repositories suggest that you're obeying the single responsibility principle, but they don't: they contain a "Trojan horse", the DbContext, a component that carries far more responsibilities than a generic repository should have. So when a repository saves its "own" entities, at any moment it may step out of line and save others too.
Make your life much easier by throwing out this useless layer. A DbSet is a repository and a DbContext is a unit of work. (I'm only quoting here). In the EF architecture the UoW contains the repositories and these repositories don't have save responsibilities! The UoW has.
Dropping these generic repositories, you can simply do
using (var db = new MyContext())
{
    var user = db.Users.Find(id);
    Post post = new Post { User = user, Content = "oO" };
    db.Posts.Add(post); // User is not added because it is known to the context.
    db.SaveChanges();
}
This is a dedicated method that saves a post connected to a user. There is no point trying to make some generic method for this, because it is a specific task (or use case). As you already noticed, when trying to generalize such tasks you probably can't do without inspecting objects and introducing ugly and clunky if or switch chains.

Load table from a database at the first reference in context

I'm using Entity Framework 4.0. I do a lot of read operations on the database (data analysis); no data will be saved. Currently, despite lazy loading, the number of I/O operations against the database server slows the application down considerably. I decided to load most of the small tables into memory (.ToList()) and then run the calculations against them. Is there a way to have the context automatically read a table's data only on the first reference to it and keep it for the lifetime of the context? The idea is that further references to the table would be served from application memory instead of querying the database.
Now, I use this code:
public class cDBReader
{
    private List<RISK_T_MEMBERS> fMembers;

    public List<RISK_T_MEMBERS> Members
    {
        get
        {
            if (fMembers == null)
            {
                using (RiskEntities context = new RiskEntities(TConfiguration.connectionString))
                {
                    context.RISK_T_MEMBERS.MergeOption = System.Data.Objects.MergeOption.NoTracking;
                    fMembers = context.RISK_T_MEMBERS.ToList();
                }
            }
            return fMembers;
        }
        set { fMembers = value; }
    }
}
Could you please add some more information? What is wrong with the current implementation?
It seems like what you need is some kind of caching. Here is an example implementation of caching.
An even simpler (less overkill, in my opinion) way to implement caching would be this.
Last but not least, there is an implementation of your own (like the one you already made). Maybe the singleton pattern could help you here: a singleton object that holds all the data you need globally and exposes getters that take the currently active context; if the data is null, the getter uses that active context to load it. Dropping and recreating the context from time to time could also speed the application up after a while (which is why I think you should pass the active context into the getData functions).
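Purely as a sketch of that suggestion (the DataCache and GetMembers names are made up, and it is not thread-safe as written), such a singleton might look like this:
public sealed class DataCache
{
    private static readonly DataCache instance = new DataCache();
    public static DataCache Instance { get { return instance; } }

    private List<RISK_T_MEMBERS> fMembers;

    private DataCache() { }

    // The caller hands in whatever context is currently active; the
    // database is only hit the first time the data is requested.
    public List<RISK_T_MEMBERS> GetMembers(RiskEntities activeContext)
    {
        if (fMembers == null)
            fMembers = activeContext.RISK_T_MEMBERS.ToList();
        return fMembers;
    }
}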
You could try eager loading instead of lazy loading first, to get full control over what EF loads, and manually preload what you need once. Loaded data stays alive within the bounds of one context's lifetime; if you need it for longer and are sure the database is read-only, you can store it somewhere and attach the preloaded data when a new context is created.
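A minimal sketch of the eager-loading variant, assuming RiskEntities is an ObjectContext and "RISK_T_CONTRACTS" stands in for whatever navigation property you actually need to preload:
List<RISK_T_MEMBERS> members;
using (var context = new RiskEntities(TConfiguration.connectionString))
{
    context.ContextOptions.LazyLoadingEnabled = false; // take lazy loading out of the picture
    members = context.RISK_T_MEMBERS
                     .Include("RISK_T_CONTRACTS")      // hypothetical association to preload
                     .ToList();                        // fully materialized, outlives the context
}
// Further calculations run against the in-memory list; no more round trips.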

How to revert the ef4 context, or at least some entities to their original values?

Scenario:
Retrieve some entities
Update some properties on those entities
You perform some sort of business logic which dictates that you should no longer have those properties updated; instead you should insert some new entities documenting the results of your business logic.
Insert said new entities
SaveChanges();
Obviously, in the above example, calling SaveChanges() will not only insert the new entities but also update the properties on the original entities. Previously I managed to rearrange my code so that changes to the context (and its entities) were only made once I knew for sure that I wanted all my changes saved; however, that's not always possible. So the question is: what is the best way to handle this scenario? I don't work with the context directly, but through repositories, if that matters. Is there a simple way to revert the entities to their original values? What is the best practice in this sort of scenario?
Update
Although I disagree with Ladislav that the business logic should be rearranged so that validation always comes before any modification to the entities, I agree that the solution should really be to persist the wanted changes on a different context. The reason I disagree is that my business transaction is fairly long, and validation or error checking that might happen at the end of the transaction is not always obvious up front. Imagine a Christmas tree you're decorating with lights from the top down; you've already modified the tree by the time you're working on the lower branches. What happens if one of the lights breaks? You want to roll back all of your changes, but you also want to create some ERROR entities. As Ladislav suggested, the most straightforward way would be to save the ERROR entities on a different context, allowing the original one (with the modified metaphorical tree) to expire without SaveChanges ever being called.
Now, in my situation I use Ninject for dependency injection, injecting one EF context into all of my repositories that are in the scope of the top-level service. What this means is that my business layer classes don't really have control over creating new EF contexts. Not only do they not have access to the EF context (remember, they work through repositories), but the injection has already occurred higher in the object hierarchy. The only solution I found is to create another class that uses Ninject to create a new UOW within it.
//business logic executing against repositories with an already injected and shared (unit of work) context
Tree = treeRepository.Get();
Lights = lightsRepsitory.Get();
//update the tree as you're decorating it with lights

if (errors.Count == 0)
{
    //no errors: calling SaveChanges() on any one repository will commit the entire UOW,
    //as they all share the same injected EF context
    repository1.SaveChanges();
}
else
{
    //oops, one of the lights broke; we need to insert some Error entities.
    //However, if we just add them to the errorRepository and call SaveChanges(),
    //the modifications that happened to the tree will also be committed.
    TreeDecoratorErroHandler.Handle(errors);
}
internal class TreeDecoratorErroHandler
{
    //declare repositories

    //constructor that takes repository instances

    public static void Handle(IList<Error> errors)
    {
        //create a new Ninject kernel
        using (Ninject... = new Ninject...)
        {
            //this will create an instance that will get injected with repositories sharing a new EF instance,
            //completely separate from the one outside of this class
            TreeDecoratorErroHandler errorHandler = ninjectKernel.Get<TreeDecoratorErroHandler>();

            //this will insert the errors and call SaveChanges(); the only changes in this new context are the errors
            errorHandler.InsertErrors(errors);
        }
    }

    //other methods
}
You should definitely use a new context for this. The context is a unit of work, and once your business logic says "Hey, I don't want to update this entity", the entity is no longer part of that unit of work. You can either detach the entity or create a new context.
There is also the possibility of using the Refresh method, but it is meant for scenarios where you have to deal with optimistic concurrency. Because of that, it refreshes only scalar and complex properties, and foreign keys if they are part of the entity; if you made changes to navigation properties, those can still be present after you refresh the entity.
Take a look at ObjectContext.Refresh with RefreshMode.StoreWins; I think that will do what you want. Starting a new context would achieve the same thing, I guess, but it would not be as neat.
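As a small sketch of that suggestion, assuming you can reach the underlying ObjectContext and have the modified tree entities at hand (both names below are placeholders):
// Throw away the pending scalar changes on the entities you no longer
// want to update; StoreWins overwrites them with the database values.
context.Refresh(System.Data.Objects.RefreshMode.StoreWins, modifiedTreeEntities);

// Entities added afterwards (for example the Error records) are still
// pending, so the next SaveChanges() persists only those.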

Linq to SQL - How should I manage database requests?

I have looked into the lifespan of the DataContext a bit, trying to work out the best possible way of doing things.
Given that I want to reuse my DAL in a web application, I decided to go with the DataContext-per-business-object-request approach.
My idea was to extend my L2S entities from the dbml file to retrieve information from the database, creating a separate context per request, e.g.
public partial class AnEntity
{
    public IEnumerable<RelatedEntity> GetRelatedEntities()
    {
        using (var dc = new MyDataContext())
        {
            // Materialize the results before the context is disposed;
            // a deferred query would fail when enumerated later.
            return dc.RelatedEntities.Where(r => r.EntityID == this.ID).ToList();
        }
    }
}
In terms of returning the entities... do I need to return POCOs at this point, or is it OK to simply return the business object returned from the query? I understand that if I were to try to access properties of the returned entity (after the DataContext has been disposed) it would fail. However, this is the reason I have decided to implement these kinds of methods, e.g.
Instead of:
AnEntity entity = null;
using (var repo = new EntityRepo())
{
    entity = repo.GetEntity(12345);
}
var related = entity.RelatedEntities; // this would cause an exception
In theory I should be able to do:
AnEntity entity = null;
using (var repo = new EntityRepo())
{
    entity = repo.GetEntity(12345);
}
var related = entity.GetRelatedEntities();
Given the circumstances of my particular app (needs to work in a windows service & web application) I would like to know if this seems a plausible approach, whether there are obvious flaws and if there are better approaches for what it is I am trying to do.
Thanks.
Generally speaking, as long as you are not calling a single DataContext object using more than one thread, you should be OK. In other words, use one DataContext object per thread, and do not share data or state between different DataContext objects.
The remaining multi-threaded issues have to do with concurrency in the database, not threading operations.
Other than these caveats, your approach seems sound. You can either use partial classes to implement your business methods, or you can add a business layer between the Linq to SQL classes and your repository.
You can get away with this:
var repo = new EntityRepo();
entity = repo.GetEntity(12345);
var related = entity.RelatedEntities;
See this StackOverflow post for an explanation of why not disposing your context doesn't cause connection leaks or the like.
The repository and context will get cleaned up by the garbage collector when the entities that were fetched by them fall out of scope (when building a website, at the end of the request).
Edit: MSDN documents that connections are opened as late as possible and closed as soon as possible, so skipping the using doesn't cause connection pool problems.

How do you save a Linq object if you don't have its data context?

I have a Linq object, and I want to make changes to it and save it, like so:
public void DoSomething(MyClass obj) {
    obj.MyProperty = "Changed!";
    MyDataContext dc = new MyDataContext();
    dc.GetTable<MyClass>().Attach(obj, true); // throws exception
    dc.SubmitChanges();
}
The exception is:
System.InvalidOperationException: An entity can only be attached as modified without original state if it declares a version member or does not have an update check policy.
It looks like I have a few choices:
1. Put a version member on every one of my Linq classes & tables (100+) that I need to use in this way.
2. Find the data context that originally created the object and use that to submit changes.
3. Implement OnLoaded in every class and save a copy of this object that I can pass to Attach() as the baseline object.
4. To hell with concurrency checking; load the DB version just before attaching and use that as the baseline object (NOT!!!)
Option (2) seems the most elegant method, particularly if I can find a way of storing a reference to the data context when the object is created. But - how?
Any other ideas?
EDIT
I tried to follow Jason Punyon's advice and created a concurrency field on one table as a test case. I set all the right properties (Time Stamp = true etc.) on the field in the dbml file, and I now have a concurrency field... and a different error:
System.NotSupportedException: An attempt has been made to Attach or Add an entity that is not new, perhaps having been loaded from another DataContext. This is not supported.
So what the heck am I supposed to attach, then, if not an existing entity? If I wanted a new record, I would do an InsertOnSubmit()! So how are you supposed to use Attach()?
Edit - FULL DISCLOSURE
OK, I can see it's time for full disclosure of why all the standard patterns aren't working for me.
I have been trying to be clever and make my interfaces much cleaner by hiding the DataContext from the "consumer" developers. This I have done by creating a base class
public class LinqedTable<T> where T : LinqedTable<T> {
...
}
... and every single one of my tables has the "other half" of its generated version declared like so:
public partial class MyClass : LinqedTable<MyClass> {
}
Now LinqedTable has a bunch of utility methods, most particularly things like:
public static T Get(long ID) {
    // code to load the record with the given ID
    // so you can write things like:
    //     MyClass obj = MyClass.Get(myID);
    // instead of:
    //     MyClass obj = myDataContext.GetTable<MyClass>().Where(o => o.ID == myID).SingleOrDefault();
}

public static Table<T> GetTable() {
    // so you can write queries like:
    //     var q = MyClass.GetTable();
    // instead of:
    //     var q = myDataContext.GetTable<MyClass>();
}
Of course, as you can imagine, this means that LinqedTable must somehow have access to a DataContext. Up until recently I was achieving this by caching the DataContext in a static member. Yes, "up until recently", because that "recently" is when I discovered that you're not really supposed to hang on to a DataContext for longer than a unit of work, otherwise all sorts of gremlins start coming out of the woodwork. Lesson learned.
So now I know that I can't hang on to that data context for too long... which is why I started experimenting with creating a DataContext on demand, cached only on the current LinqedTable instance. This then led to the problem where the newly created DataContext wants nothing to do with my object, because it "knows" that it's being unfaithful to the DataContext that created it.
Is there any way of pushing the DataContext info onto the LinqedTable at the time of creation or loading?
This really is a poser. I definitely do not want to compromise on all these convenience functions I've put into the LinqedTable base class, and I need to be able to let go of the DataContext when necessary and hang on to it while it's still needed.
Any other ideas?
Updating with LINQ to SQL is, um, interesting.
If the data context is gone (which in most situations, it should be), then you will need to get a new data context, and run a query to retrieve the object you want to update. It's an absolute rule in LINQ to SQL that you must retrieve an object to delete it, and it's just about as iron-clad that you should retrieve an object to update it as well. There are workarounds, but they are ugly and generally have lots more ways to get you in trouble. So just go get the record again and be done with it.
Once you have the re-fetched object, then update it with the content of your existing object that has the changes. Then do a SubmitChanges() on the new data context. That's it! LINQ to SQL will generate a fairly heavy-handed version of optimistic concurrency by comparing every value in the record to the original (in the re-fetched) record. If any value changed while you had the data, LINQ to SQL will throw a concurrency exception. (So you don't need to go altering all your tables for versioning or timestamps.)
If you have any questions about the generated update statements, you'll have to break out SQL Profiler and watch the updates go to the database. Which is actually a good idea, until you get confidence in the generated SQL.
One last note on transactions - the data context will generate a transaction for each SubmitChanges() call, if there is no ambient transaction. If you have several items to update and want to run them as one transaction, make sure you use the same data context for all of them, and wait to call SubmitChanges() until you've updated all the object contents.
If that approach to transactions isn't feasible, then look up the TransactionScope object. It will be your friend.
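A minimal sketch of that re-fetch-and-copy approach, reusing the MyClass/MyDataContext names from the question and assuming MyClass exposes an ID key (an assumption on my part):
public void DoSomething(MyClass obj)
{
    obj.MyProperty = "Changed!";
    using (MyDataContext dc = new MyDataContext())
    {
        // Re-fetch the current record on this context...
        MyClass fresh = dc.GetTable<MyClass>().Single(x => x.ID == obj.ID);

        // ...copy over the values you actually changed...
        fresh.MyProperty = obj.MyProperty;

        // ...and let LINQ to SQL generate the UPDATE, complete with its
        // value-by-value optimistic concurrency check.
        dc.SubmitChanges();
    }
}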
I think 2 is not the best option. It sounds like you're going to create a single DataContext and keep it alive for the entire lifetime of your program, which is a bad idea. DataContexts are lightweight objects meant to be spun up when you need them. Trying to keep the references around is also probably going to tightly couple areas of your program you'd rather keep separate.
Running a hundred ALTER TABLE statements one time, regenerating the context and keeping the architecture simple and decoupled is the elegant answer...
find the data context that originally created the object and use that to submit changes
Where did your datacontext go? Why is it so hard to find? You're only using one at any given time right?
So what the heck am I supposed to attach, then, if not an existing entity? If I wanted a new record, I would do an InsertOnSubmit()! So how are you supposed to use Attach()?
You're supposed to attach an instance that represents an existing record... but was not loaded by another datacontext - can't have two contexts tracking record state on the same instance. If you produce a new instance (ie. clone) you'll be good to go.
You might want to check out this article and its concurrency patterns for update and delete section.
The "An entity can only be attached as modified without original state if it declares a version member" error when attaching an entitity that has a timestamp member will (should) only occur if the entity has not travelled 'over the wire' (read: been serialized and deserialized again). If you're testing with a local test app that is not using WCF or something else that will result in the entities being serialized and deserialized then they will still keep references to the original datacontext through entitysets/entityrefs (associations/nav. properties).
If this is the case, you can work around it by serializing and deserializing it locally before calling the datacontext's .Attach method. E.g.:
internal static T CloneEntity<T>(T originalEntity)
{
    Type entityType = typeof(T);
    DataContractSerializer ser = new DataContractSerializer(entityType);
    using (MemoryStream ms = new MemoryStream())
    {
        ser.WriteObject(ms, originalEntity);
        ms.Position = 0;
        return (T)ser.ReadObject(ms);
    }
}
Alternatively you can detach the entity by setting all EntitySets/EntityRefs to null, but that is more error prone, so although it is a bit more expensive I just use the DataContractSerializer method above whenever I want to simulate n-tier behavior locally...
(related thread: http://social.msdn.microsoft.com/Forums/en-US/linqtosql/thread/eeeee9ae-fafb-4627-aa2e-e30570f637ba )
You can reattach to a new DataContext. The only thing that prevents you from doing so under normal circumstances is the property changed event registrations that occur within the EntitySet<T> and EntityRef<T> classes. To allow the entity to be transferred between contexts, you first have to detach the entity from the DataContext, by removing these event registrations, and then later on reattach to the new context by using the DataContext.Attach() method.
Here's a good example.
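In case that example ever disappears, the core of the trick looks roughly like this; _Orders and _Customer are hypothetical stand-ins for the EntitySet/EntityRef fields the designer generated for your entity:
public partial class MyClass
{
    // Replacing the generated EntitySet/EntityRef fields discards the
    // change-notification registrations that bind this instance to the
    // DataContext that loaded it.
    public void Detach()
    {
        _Orders = new System.Data.Linq.EntitySet<Order>();          // hypothetical child collection
        _Customer = default(System.Data.Linq.EntityRef<Customer>);  // hypothetical parent reference
    }
}
After calling Detach(), the instance can be handed to the new context's Attach() method as usual.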
When you retrieve the data in the first place, turn off object tracking on the context that does the retrieval. This will prevent the object state from being tracked on the original context. Then, when it's time to save the values, attach to the new context, refresh to set the original values on the object from the database, and then submit changes. The following worked for me when I tested it.
MyClass obj = null;
using (DataContext context = new DataContext())
{
    context.ObjectTrackingEnabled = false;
    obj = (from p in context.MyClasses
           where p.ID == someId
           select p).FirstOrDefault();
}

obj.Name += "test";

using (DataContext context2 = new DataContext())
{
    context2.MyClasses.Attach(obj);
    context2.Refresh(System.Data.Linq.RefreshMode.KeepCurrentValues, obj);
    context2.SubmitChanges();
}
