How to insert records into unrelated tables using EF - C#

I have a Comment table which can be linked to many different entities that have comments, but for reasons, I have not linked those tables. Instead Comment contains TableReferenceId and EntryReferenceId. TableReferenceId is just an int that we can check in the app layer as to which entity/table that comment refers to, and EntryReferenceId is an int that refers to a particular entry in said entity/table to which the comment belongs.
Querying such comments by table and entry reference would be fine, but when inserting bulk data, I am drawing a blank. For example if I have Vehicle entity and a Vehicle can have many comments, when inserting the data, how would I link them since I don't have a VehicleId yet? Is this doable or is it better to just go many-to-many route for each of the tables that link to comments?

If you can avoid this situation you should try to, or at least try to avoid supporting a bulk insert. If you must do it, either of the following patterns may work for you.
Perform the bulk insert in two stages: during the normal import, maintain a map or dictionary of records and the comments they are linked to; after the first call to SaveChanges() the IDs will be available and the comments can be inserted.
Alternatively, store the mapped comments inside an unmapped collection on the entity itself; after SaveChanges(), any entries in this collection are inserted using the new record's Id.
Let's look at the first option:
var mappedComments = new Dictionary<Vehicle, Comment[]>();
// bulk processing, however you choose to do it;
// importantly, for each item capture the record reference and its comments
foreach (var item in source)
{
    Vehicle newItem;
    // ... construct/parse the new entity object
    List<Comment> newComments = new List<Comment>();
    // ... parse the comment records

    // store the map
    mappedComments.Add(newItem, newComments.ToArray());
    // add the entity to the context
    db.Vehicles.Add(newItem);
}
db.SaveChanges();

foreach (var mapEntry in mappedComments)
{
    var newVehicle = mapEntry.Key;
    // replace this with your actual lookup logic of course...
    // (assumes the TableReference entity exposes its key as Id)
    int vehicleTableReferenceId = db.TableReferences
        .Single(x => x.TableName == nameof(Vehicle)).Id;
    foreach (var comment in mapEntry.Value)
    {
        comment.TableReferenceId = vehicleTableReferenceId;
        comment.EntryReferenceId = newVehicle.Id; // the Id that is now populated
        db.Comments.Add(comment);
    }
}
db.SaveChanges();
If you have a lot of entity types that exhibit this linking behaviour, you can build the functionality into the entities themselves by embedding the mapped comments within each entity.
Define an interface that describes an object holding a weak reference to these comments:
public interface ICommentsToInsert
{
    // Only necessary if your convention is NOT to use a common name for the PK
    int Id { get; }
    ICollection<Comment> CommentsToInsert { get; set; }
}
Implement this interface and add an unmapped collection property to the entities to store the Comment Entries to insert against each record.
partial class Vehicle : ICommentsToInsert
{
    [NotMapped]
    int ICommentsToInsert.Id { get => Vehicle_Id; }

    [NotMapped]
    public ICollection<Comment> CommentsToInsert { get; set; } = new HashSet<Comment>();
}
In your bulk logic, add the Comment records into the Vehicle.CommentsToInsert collection; the mapping details I'll leave to you, but a rough sketch follows.
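For example, a minimal sketch of that step (the import row's CommentTexts field and the Comment.Text property are hypothetical, just stand-ins for your own mapping):
foreach (var item in source)
{
    var newVehicle = new Vehicle
    {
        // ... map the vehicle fields from the import row ...
    };
    foreach (var text in item.CommentTexts) // hypothetical field on the import row
    {
        newVehicle.CommentsToInsert.Add(new Comment { Text = text }); // Text is a hypothetical Comment property
    }
    db.Vehicles.Add(newVehicle);
}
db.SaveChanges(); // the override below picks the comments up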
Override SaveChanges() to detect entities that have comments and re-process them after the save operation.
In this example I am storing the EntityState for all modified entries before the save. That is overkill for this particular example, but the state information is lost during the save, and keeping a record of it is useful for a whole range of other post-processing logic.
public override int SaveChanges()
{
    var beforeStates = BeforeSaveChanges();
    int result = base.SaveChanges();
    if (AfterSaveChanges(beforeStates))
        result += base.SaveChanges();
    return result;
}
private Dictionary<DbEntityEntry, EntityState> BeforeSaveChanges()
{
    var beforeSaveChanges = new Dictionary<DbEntityEntry, EntityState>();
    foreach (var entry in this.ChangeTracker.Entries())
    {
        // skip unchanged entries!
        if (entry.State == EntityState.Unchanged)
            continue;
        // today, only cache the ICommentsToInsert records...
        if (entry.Entity is ICommentsToInsert)
            beforeSaveChanges.Add(entry, entry.State);
    }
    return beforeSaveChanges;
}
private bool AfterSaveChanges(Dictionary<DbEntityEntry, EntityState> statesBeforeSaveChanges)
{
    bool moreChanges = false;
    foreach (var entry in statesBeforeSaveChanges)
    {
        if (entry.Key.Entity is ICommentsToInsert hasComments)
        {
            if (hasComments.CommentsToInsert.Any())
            {
                moreChanges = true;
                // Get the Id of the TableReference, based on the name of the entity type.
                // You would normally cache this type of lookup rather than hitting the DB every time.
                int tableReferenceId = this.TableReferences
                    .Single(x => x.TableName == entry.Key.Entity.GetType().Name).Id;
                foreach (var comment in hasComments.CommentsToInsert)
                {
                    comment.TableReferenceId = tableReferenceId;
                    comment.EntryReferenceId = hasComments.Id;
                    this.Comments.Add(comment);
                }
            }
        }
    }
    return moreChanges;
}
You can further evolve this by wrapping the whole operation in a transaction scope so it can be rolled back if anything fails; a sketch of that follows. This code is paraphrased from common routines I use in production code, so whilst it may not work exactly as is, the concept has served me well in many projects.
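A minimal sketch of the transaction wrapper, assuming EF6's Database.BeginTransaction(); both internal SaveChanges calls made by the override will participate in the same transaction:
using (var transaction = db.Database.BeginTransaction())
{
    try
    {
        db.SaveChanges(); // runs both stages via the override above
        transaction.Commit();
    }
    catch
    {
        transaction.Rollback();
        throw;
    }
}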

Related

Insert order of Entity Framework children

I have a structure like this
public class Son {
    public string Name { get; set; }
    public int Age { get; set; }
}
public class Daughter {
    public string Name { get; set; }
    public int Age { get; set; }
}
public class Parent {
    public Daughter[] Daughters { get; set; }
    public Son[] Sons { get; set; }
}
Where there is a FK Parent -> Son and Parent -> Daughter
Currently when doing a Context.SaveChanges() on a parent object it saves the Parent, then saves the Daughters, and then saves the Sons. I need it to save the Sons before the Daughters because we have a database trigger that validates the Sons based on the Daughters (and will deny the whole thing if it doesn't meet a requirement).
This trigger is obviously outside the knowledge of EF.
How can I specify that Sons are dependent on Daughters in EF such that Sons get inserted first; or is there a specification or attribute that I can define insert order?
PS: Do not look too much into the contrived example (such as why we don't save it under one thing called Children). The real-world example is much more complicated, but the idea of saving Sons before Daughters is there.
I love a challenge!
Firstly a declaration: I'm not a fan of Triggers or building a requirement for making order of insert important. My first exercise would be to exhaust all options to remove such a requirement.
After a bit of tinkering, from what I can see, at least when adding entities (for instance a Parent with one or more Daughters and one or more Sons), the insert order is consistently alphabetical based on the entity names. For example, with entities named "Parent", "Daughter", and "Son", the insert was always Parent > Daughter > Son. The order of the properties, configuration, inserts, or even the table names had no bearing on the operations; however, renaming the entity class "Son" to "ASon" resulted in Sons being inserted before Daughters. I don't know if this will carry forward to edits, but it's something to consider without getting too hacky. (Though something like this would definitely need to be documented well in the system in case someone questions a naming convention used to get something inserting before something else.)
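If you did lean on that naming trick, you could at least keep the physical table name stable by mapping the renamed class explicitly; a sketch, assuming a Code First model (the SonId and Parent members match the code further down):
[Table("Sons")]
public class ASon
{
    public int SonId { get; set; }
    public string Name { get; set; }
    public int Age { get; set; }
    public virtual Parent Parent { get; set; }
}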
That said, getting into the hacky fun business!
Using a Son entity called ASon to force Sons before Daughters, it is possible to get EF to reverse that insert order:
using (var context = new ParentDbContext())
{
    var parent = context.Parents.Create();
    parent.Name = "Steve";
    parent.Daughters.Add(new Daughter { Name = "Elise" });
    parent.Daughters.Add(new Daughter { Name = "Susan" });
    parent.Sons.Add(new ASon { Name = "Jason" });
    parent.Sons.Add(new ASon { Name = "Duke" });
    context.Parents.Add(parent);
    context.SaveChanges();
}
Out of the box this inserted Parent, Son, Son, Daughter, Daughter.
To reverse it, I overrode SaveChanges, looking for our Sons to defer saving until after everything else:
public override int SaveChanges()
{
    var trackedStates = new[] { EntityState.Added, EntityState.Modified };
    var trackedParentIds = ChangeTracker.Entries<Parent>()
        .Where(x => trackedStates.Contains(x.State))
        .Select(x => x.Entity.ParentId)
        .ToList();
    var addedSons = ChangeTracker.Entries<ASon>().Where(x => x.State == EntityState.Added).ToList();
    var modifiedSons = ChangeTracker.Entries<ASon>().Where(x => x.State == EntityState.Modified).ToList();
    int tempid = -1;
    int modifiedParentCount = addedSons.Select(x => x.Entity.Parent.ParentId)
        .Where(x => trackedParentIds.Contains(x))
        .Count();
    List<Tuple<Parent, ASon>> associatedSons = new List<Tuple<Parent, ASon>>();

    modifiedSons.ForEach(x => { x.State = EntityState.Unchanged; });
    addedSons.ForEach(x =>
    {
        x.Entity.SonId = tempid--;
        associatedSons.Add(new Tuple<Parent, ASon>(x.Entity.Parent, x.Entity));
        x.Entity.Parent.Sons.Remove(x.Entity);
        x.State = EntityState.Unchanged;
    });

    var result = base.SaveChanges();

    addedSons.ForEach(x =>
    {
        x.Entity.Parent = associatedSons.Single(a => a.Item2 == x.Entity).Item1;
        x.State = EntityState.Added;
    });
    modifiedSons.ForEach(x => { x.State = EntityState.Modified; });

    result += base.SaveChanges() - modifiedParentCount;
    return result;
}
So what this is doing:
The first bit is easy: we find our added and modified sons. We also take a count of parents with both modified and added sons, since these will get double-counted when this is done.
For modified sons, we just set their state to Unchanged.
For added sons, we need to do a bit of dirty work. We give them a temporary unique ID, because even when marked Unchanged EF still wants to track their ID, and adding two sons with the same default ID would fail here. Note that when we put them back to Added, they will receive proper IDs from the identity column, not these temporary negative ones. We also track the association between each added son and its parent in a Tuple, because we need to temporarily remove those sons from their parent. Finally, each added son is also marked Unchanged.
Now we call the base SaveChanges which will save our parents and their daughters.
For modified sons, we just need to update the state back to Modified.
For our added sons, we use our saved association to re-assign them to their parent, and then set their state back to added.
We call the base SaveChanges again, appending the affected row count to the first run and subtracting the duplicate parent references (parents that were already counted because they were modified).
The sketchy bit is adjusting the result count for the double-save, this might not be 100% accurate but it should only be an issue if you happen to be using the result of SaveChanges. I can't say I've ever really paid much attention to that return value :)
Hopefully that gives you some ideas to play with.

How to prevent retrieving duplicate values from the db using entityframework

I have a situation where I have a counter field in a table named Profile. On form submit, I retrieve the counter field, add 1 to it, and update the Profile table. The incremented counter is stored in a variable which I then use to create new records in another table [Bidder]. The problem is that when there are multiple form submits at the same time, duplicate record values are created in the Bidder table.
Profile profile = db.Profile.Where(w => w.TenderId == tender_Id && w.IsDeleted == false).FirstOrDefault();
int submission = profile.TotalSubmission + 1;
if (profile != null)
{
    profile.TotalSubmission = submission;
    profile.ModifiedBy = user_id;
    profile.ModifiedOn = DateTime.Now;
    db.Entry(profile).State = EntityState.Modified;
    db.SaveChanges();
}
bid.ROId = string.Format("RO{0}", submission);
db.Entry(bid).State = EntityState.Modified;
db.SaveChanges();
How do I prevent duplicate ROIds from being created?
The uniqueness should be enforced using a unique index or a unique constraint.
You can create these using code first (from MSDN):
public class User
{
    public int UserId { get; set; }

    [Index(IsUnique = true)]
    public string Username { get; set; }

    public string DisplayName { get; set; }
}
or directly via the database.
The counter should be protected using optimistic concurrency:
public class MyEntity
{
    [Key]
    public Guid Id { get; set; }

    // Add a timestamp property to your class
    [Timestamp]
    [Required]
    [DatabaseGenerated(DatabaseGeneratedOption.Computed)]
    [ConcurrencyCheck]
    public byte[] VersionTimestamp { get; set; }

    public int Counter { get; set; }
}
If you try to update the row with a stale VersionTimestamp (i.e. after it has been changed without re-reading it from the database), you'll get an OptimisticConcurrencyException, e.g. in this test scenario:
// Read the entity
MyEntity entity;
using (var context = new MyContext())
{
    entity = context.MyEntities.Single(e => e.Id == id1);
}

// Read and update the entity
using (var context = new MyContext())
{
    var entity2 = context.MyEntities.Single(e => e.Id == id1);
    entity2.Counter++;
    context.SaveChanges();
}

// Try to update stale data
// - a concurrency exception will be thrown
using (var context = new MyContext())
{
    entity.Counter++;
    context.Entry(entity).State = EntityState.Modified; // attach the stale entity to the new context
    context.SaveChanges();
}
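In real application code you would typically catch the concurrency exception and retry with fresh data. A minimal sketch, assuming EF6 where the failure surfaces as a DbUpdateConcurrencyException:
bool saved = false;
while (!saved)
{
    try
    {
        using (var context = new MyContext())
        {
            // re-read the current row (including the current timestamp) on every attempt
            var current = context.MyEntities.Single(e => e.Id == id1);
            current.Counter++;
            context.SaveChanges();
            saved = true;
        }
    }
    catch (DbUpdateConcurrencyException)
    {
        // another request won the race; loop and try again with fresh values
    }
}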
If you are using SQL Server 2012 or newer, you can use a Sequence to accomplish this. You would also want to enforce uniqueness through a unique constraint.
public partial class YourEfContext : DbContext
{
    // .... (other EF stuff) ......

    // get your EF context
    public int GetNextSequenceValue()
    {
        var rawQuery = Database.SqlQuery<int>("SELECT NEXT VALUE FOR dbo.SomeSequence;");
        var task = rawQuery.SingleAsync();
        int nextVal = task.Result;
        return nextVal;
    }
}
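Usage from the submit handler might then look something like this (a sketch based on the question's code; the Bidders DbSet name and the ROId format are assumptions):
using (var db = new YourEfContext())
{
    int next = db.GetNextSequenceValue();
    bid.ROId = string.Format("RO{0}", next);
    db.Bidders.Add(bid);
    db.SaveChanges();
}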
Another option, if you don't have a version that supports sequences, is to use a stored procedure on the database to issue Id numbers. The stored proc can work in conjunction with an ID table on which it places an explicit lock. This means you can request an id from the proc: it locks the table, reads the current number, increments it, stores it back in the table, releases the lock, and returns the id. You would need to call your proc from code to get the new id to assign. The lock on the db side ensures that you are only ever assigned unique values. As long as your id column is only ever given a value assigned by the proc, you will have unique values. You will still be able to assign arbitrary numbers though, which could include duplicates; that can be solved with a unique constraint.
None of this is Entity Framework specific, though you can still access all of it through Entity Framework in one way or another; a sketch of such a call follows.
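For example, added to the same partial context class (dbo.GetNextBidId and its locking logic are assumptions, not an existing API; the proc would lock the ID table, increment the stored counter, and select the new value):
public int GetNextBidId()
{
    // executes the hypothetical proc and maps its single int result
    return Database.SqlQuery<int>("EXEC dbo.GetNextBidId").Single();
}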
You can not rely only on entity framework for your solution. Only the database has a full picture of the stored data. Your different entity context instances don't even know if other instances exist, so coordinating sequence numbers on a global scale is extremely difficult on EF level.
Depending on the frequency of conflicts, two options come to my mind to enforce the uniqueness of the sequence number:
Unique constraint
Stored procedure for writing the data
Unique constraint
You can create a UNIQUE constraint over the ProfileId and Sequence columns. When you store the data with a duplicate sequence number, you will get an exception. Either the exception itself or one of its inner exceptions will be an SqlException. You can examine the error number of that exception and if it's error number 2627 (if your DBMS is SQL Server; if it is not, check for the similar error in your DBMS), you know it's a unique key constraint violation. In this case you get the current sequence number from the DB and write the data again with a new sequence. You have to repeat that until the insert was successful.
In case you're using SQL server, you can selectively handle a UNIQUE KEY constraint violation like this (using C# 6.0 exception filters):
private bool IsUniqueKeyViolation(Exception exception)
{
    Exception currentException = exception;
    while (currentException != null)
    {
        SqlException sqlException = currentException as SqlException;
        if (sqlException != null)
        {
            return sqlException.Errors.Cast<SqlError>().Any(error => error.Number == 2627);
        }
        currentException = currentException.InnerException;
    }
    return false;
}

//...
//...Code to set up the POCOs before Save...
while (true)
{
    try
    {
        context.SaveChanges();
    }
    catch (Exception exc) when (IsUniqueKeyViolation(exc))
    {
        //...Code to update the sequence number...
        continue;
    }
    break;
}
This solution is only practical if the number of conflicts is expected to be small. If the number of conflicts is large, you will see a lot of unsuccessful UPDATE requests to the DB, which can become a performance issue.
EDIT:
As some other answers suggested, you could also use optimistic concurrency with a timestamp column. As long as you only update the DB from your own code, this works fine. However, a UNIQUE KEY constraint will protect the integrity of your data also from changes that don't originate from your application (like migration scripts etc.). Optimistic concurrency does not give you the same guarantee.
Stored procedure
You can create a stored procedure that will set the new sequence number from the last existing number in the same INSERT or UPDATE statement. The stored procedure can return the new sequence number back to the client and you can process it accordingly.
Since this solution will always update the DB in a single statement, it works well for a larger amount of conflicting updates. The disadvantage is that you have to write a part of your program logic in SQL on the DB level.

Azure Table Storage Doesn't Save Everything in Entity definition

This is the definition of the entity that I am planning to save in Azure Table Storage (ATS):
public class CarEntity : TableEntity
{
    public CarEntity(string objPartitionKey, string objRowKey)
    {
        this.PartitionKey = objPartitionKey;
        this.RowKey = objRowKey;
    }

    public string TableName
    {
        get { return "EntityTableName"; }
    }

    public string Property1 { get; set; }
    // and this goes on
    public string Property60 { get; set; }
}
Not all properties are required. Which fields are populated depends on the selections the user is saving (e.g. for this CarEntity: if the user ordered wheels, WheelSize and WheelQuantity would be populated; if the user asks for repainting, RepaintingColor would be populated, and so on).
Given that there are 60 properties in this entity, not all of them get saved in ATS. Even though a property is defined and no error is returned, the data doesn't get saved in the table. I know there's a limit of 1 MB per entity, but based on the calculations we have done we are well under that limit.
Any idea why columns don't appear even though the properties are set accordingly? My save function is defined as follows:
public static CarEntity CarInsertOrReplace(CarEntity entity)
{
    if (entity == null)
    {
        throw new ArgumentNullException("entity");
    }

    var table = SetupTable(entity.TableName);
    table.CreateIfNotExists();

    TableOperation insertOrReplaceOperation = TableOperation.InsertOrReplace(entity);
    TableResult result = table.Execute(insertOrReplaceOperation);
    CarEntity objEntity = result.Result as CarEntity;
    return objEntity;
}
Sounds like the properties for your entity vary based on usage. What's probably happening is that Azure Table Storage only creates columns for properties that are not null (have a value set), so you will only see columns created for properties that have been set.
It sounds as if Table Storage is performing as you require but not necessarily as you expect. As answered by @Paul Fryer, ATS will not store null values, and as you do not (going by your quoted code) initialise the CarEntity properties, they will be null by default. Therefore only properties set by the user will be saved to the table.
From https://msdn.microsoft.com/en-us/library/azure/hh452242.aspx (Remarks): If the Insert Or Replace Entity operation is used to replace an entity, any properties from the previous entity will be removed if the new entity does not define them. Properties with a null value will also be removed.
Also, from your code
TableResult result = table.Execute(insertOrMergeOperation);
CarEntity objEntity = result.Result as CarEntity;
result will contain the outcome of the TableOperation, not a copy of the full entity, in case that was what you were expecting.
This scenario might be the difference between using, for example, a SQL table where fields that are not given a value have a database default or default to null, against the Azure table model where fields not given a value do not exist. You just need to be aware of that difference when reading/writing to the storage chosen.
If you require all fields to be persisted to the table then provide a default value for each property, e.g. string.Empty
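For example, a sketch of that using C# 6 auto-property initializers (the property names are just those from the question; the parameterless constructor is the one Table Storage needs when reading entities back):
public class CarEntity : TableEntity
{
    public CarEntity() { } // parameterless constructor used when deserializing

    public CarEntity(string objPartitionKey, string objRowKey)
    {
        this.PartitionKey = objPartitionKey;
        this.RowKey = objRowKey;
    }

    public string Property1 { get; set; } = string.Empty;
    // ...
    public string Property60 { get; set; } = string.Empty;
}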
If strings are null or empty then it doesn't save the property at all. You're not doing anything wrong, you just have to consider this in your design when you're working with them.
If you're using TableEntity then it does the null/empty check for you. If you're using DynamicTableEntity then you have to do the check yourself.

Failure to attach a detached entity (entity with the same key is already in the context)

I'm using Entity Framework 6, Code First approach. I'll try to present my problem with a simple piece of code:
public void ViewEntity(MyEntity Entity) // Want to read properties of my entity
{
    using (var Db = new MyDbContext())
    {
        var DummyList = Db.MyEntities.ToList(); // Iteration on this DbSet
        Db.MyEntities.Attach(Entity); // Exception
    }
}
The exception message is: Attaching an entity of type 'MyProgram.MyEntity' failed because another entity of the same type already has the same primary key value.
From what I've read on MSDN it's expected behaviour. But what I want on that last line is to first check whether an entity with the same key is already attached to the context; if it is, use it instead, and only otherwise attach my entity to the context.
But I've failed to find a way to do so. There are many utility methods on the ObjectContext instance (for example GetObjectByKey). I can't test them all because they all ultimately need a qualifiedEntitySetName, and I don't have one in my real implementation, because this method should be on an abstract class and it should work for all entity types. Calling Db.Entity(this) is no use; there is no EntityKey which would have an EntitySetName.
So all of this became complex really fast. In my terms, I just want to check if the object is already in the "cache" (context) and use it; otherwise use my object and attach it to the context.
To be clear, I have a detached object from a TreeNode.Tag in the first place, and I just want to use it again, or, if that's impossible (because there already is one in the context), use that one instead. Maybe I'm missing some crucial concepts of EF6; I'm just starting out with EF.
I've found a solution for me. As I guessed, the ObjectContext.GetObjectByKey method does what I need, but first I needed to construct the qualifiedEntitySetName, and I found a way to do so. It's a tad cumbersome (using reflection, iterating the properties of MyDbContext), but that doesn't compare to the headache of a problem I made out of all this. Just in case, here's the piece of code that is a solution for me:
public SdsAbstractObject GetAttachedToContext()
{
    var ObjContext = (SdsDbContext.Current as IObjectContextAdapter).ObjectContext;
    var ExistingItem = ObjContext.GetObjectByKey(GetEntityKey()) as SdsAbstractObject;
    if (ExistingItem != null)
        return ExistingItem;
    else
    {
        DbSet.Attach(this);
        return this;
    }
}

public EntityKey GetEntityKey()
{
    string DbSetName = "";
    foreach (var Prop in typeof(SdsDbContext).GetProperties())
    {
        if (Prop.PropertyType.IsGenericType
            && Prop.PropertyType.GenericTypeArguments[0] == ObjectContext.GetObjectType(GetType()))
            DbSetName = Prop.Name;
    }

    if (String.IsNullOrWhiteSpace(DbSetName))
        return null;
    else
        return new EntityKey("SdsDbContext." + DbSetName, "Id", Id);
}
An entity can be in one of five states: Added, Unchanged, Modified, Deleted, Detached.
public void ViewEntity(MyEntity entity) // Want to read properties of my entity
{
    using (var Db = new MyDbContext())
    {
        var DummyList = Db.MyEntities.ToList(); // Iteration on this DbSet

        // Attach the entity if it is not already tracked,
        // then set its state to Modified
        if (Db.Entry(entity).State == EntityState.Detached)
        {
            Db.MyEntities.Attach(entity);
        }
        Db.Entry(entity).State = EntityState.Modified;

        Db.SaveChanges();
    }
}
Since EF doesn't know which properties are different from those in the database, it will update them all.
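If you only want specific columns written, you can mark individual properties as modified instead of the whole entity; a small sketch (the Name property here is hypothetical):
using (var Db = new MyDbContext())
{
    Db.MyEntities.Attach(entity);            // attached as Unchanged
    Db.Entry(entity).Property(e => e.Name).IsModified = true; // only Name is sent in the UPDATE
    Db.SaveChanges();
}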

Is there a way to find all Entities that have had their relationships deleted?

I am trying not to have my Business Logic know the inner workings of my Data Layer and vice versa.
But Entity Framework is making that hard. I can insert into a collection (in my Business Layer) without a reference to the ObjectContext:
order.Containers.Add(new Container { ContainerId = containerId, Order = order });
And that saves fine when it comes time to do a SaveChanges() in the Data Layer.
But to delete an item from a collection I need a reference to the ObjectContext. (I am case #1 in this guide to deleting EF Entities.) If I just do this:
delContainers.ForEach(container => order.Containers.Remove(container));
Then when I call SaveChanges() I get an exception telling me that I need to delete the object as well as the reference.
So, my options as I see it are:
To pass a delegate to my Business Logic that will call the Entity Framework ObjectContext Delete method.
Or (I am hoping) find a way to get all entities that have had their reference deleted and actually delete them. (Right before calling SaveChanges() in my data layer.)
Does anyone know a way to do that?
UPDATE:
I tried this:
// Add an event when Save Changes is called
this.ObjectContext.SavingChanges += OnSavingChanges;

...

void OnSavingChanges(object sender, EventArgs e)
{
    var objectStateEntries = ObjectContext.ObjectStateManager
        .GetObjectStateEntries(EntityState.Deleted);
    foreach (var objectStateEntry in objectStateEntries)
    {
        if (objectStateEntry.IsRelationship)
        {
            // Find some way to delete the related entity
        }
    }
}
But even though I deleted a relationship, the set of deleted items is empty.
(I tried viewing all the items too, and my relationship is not in there. Clearly there is something fundamental that I don't get about the ObjectStateManager.)
The correct solution for EF is point 3 from the linked article. It means promoting the FK to the principal entity into part of the dependent entity's PK. This forms what is called an identifying relationship, which automatically deletes the dependent entity when it is removed from the parent entity.
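As a rough illustration in Code First terms (the question uses an ObjectContext model, so treat this as conceptual; the names are taken from the question):
public class Container
{
    [Key, Column(Order = 0)]
    public int OrderId { get; set; }     // FK to the principal Order, now part of the composite PK

    [Key, Column(Order = 1)]
    public int ContainerId { get; set; }

    public virtual Order Order { get; set; }
}
With this mapping, removing a Container from order.Containers marks the Container itself as Deleted instead of only deleting the reference.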
If you don't want to change your model and still want to achieve this in a persistence-ignorant way, you probably can, but it will work only for independent associations. Here is an initial implementation which works at least for my simple test solution:
public partial class YourObjectContext
{
    public override int SaveChanges(SaveOptions options)
    {
        foreach (ObjectStateEntry relationEntry in ObjectStateManager
            .GetObjectStateEntries(EntityState.Deleted)
            .Where(e => e.IsRelationship))
        {
            var entry = GetEntityEntryFromRelation(relationEntry, 0);
            // Find representation of the relation
            IRelatedEnd relatedEnd = entry.RelationshipManager
                .GetAllRelatedEnds()
                .First(r => r.RelationshipSet == relationEntry.EntitySet);
            RelationshipType relationshipType = relatedEnd.RelationshipSet.ElementType;
            if (!SkipDeletion(relationshipType))
            {
                // Now we know that the model is inconsistent and the entity on the many side must be deleted
                if (!(relatedEnd is EntityReference)) // related end is many side
                {
                    entry = GetEntityEntryFromRelation(relationEntry, 1);
                }
                if (entry.State != EntityState.Deleted)
                {
                    DeleteObject(entry.Entity);
                }
            }
        }
        return base.SaveChanges(options);
    }

    private ObjectStateEntry GetEntityEntryFromRelation(ObjectStateEntry relationEntry, int index)
    {
        var firstKey = (EntityKey)relationEntry.OriginalValues[index];
        ObjectStateEntry entry = ObjectStateManager.GetObjectStateEntry(firstKey);
        return entry;
    }

    private bool SkipDeletion(RelationshipType relationshipType)
    {
        return
            // Many-to-many
            relationshipType.RelationshipEndMembers.All(
                r => r.RelationshipMultiplicity == RelationshipMultiplicity.Many) ||
            // ZeroOrOne-to-many
            relationshipType.RelationshipEndMembers.Any(
                r => r.RelationshipMultiplicity == RelationshipMultiplicity.ZeroOrOne);
    }
}
To make it work your entities must be enabled for dynamic change tracking (all properties must be virtual and entity must be proxied) or you must manually call DetectChanges.
In the case of foreign key associations the situation will probably be much worse, because you will not find any deleted relation in the state manager. You will have to track changes to collections or keys manually and compare them to find discrepancies (I'm not sure how to do that in a generic way). A foreign key association IMHO requires the identifying relation; using FK properties already means that you have included an additional persistence dependency in your model.
One way is to write a change handler in your data layer:
private void ContainersChanged(object sender, CollectionChangeEventArgs e)
{
    // Check for a related reference being removed.
    if (e.Action == CollectionChangeAction.Remove)
    {
        Context.DeleteObject(e.Element);
    }
}
There are many places you can wire this up -- in your object's constructor or repository get or SavingChanges or wherever:
entity.Containers.AssociationChanged += new CollectionChangeEventHandler(ContainersChanged);
Now you can remove the association from elsewhere and it will "cascade" to the entity.
