A little curiosity. I have a User object which contains a bag each of UserPhoto, UserMatchInterest and UserPreference objects. I have given each item in the bags a reference to the parent User, and with NHibernate I have the two-way mapping sorted so that when you create the User object for the first time, it automatically creates the UserPhoto, UserMatchInterest and UserPreference objects in the collection bags, setting the UserId from the parent User object. That works fine.
As an example, the UserPhoto table has a PhotoId PK column and a UserId FK column. The UserPhoto object has the PhotoId property and a User property (not UserId), so rather than holding the UserId it holds a reference to the parent and populates the DB column from the User's PK.
The problem I have is when I want to update the User object all in one go. The rest of the User object updates fine, but when it comes to the photos, it creates new photos in the database. I can understand why: they are not linked at all to the previously persisted photo objects, which is acceptable since, being an ASP.NET website, I will be dealing with detached objects. But it leaves the old ones behind. So if you had photos 1 and 2 with UserId=1, after the update you will have photos 1, 2, 3 and 4 with UserId=1. What I want is for photos 1 and 2 to be deleted and photos 3 and 4 inserted instead.
I have tried to retrieve them independently as a collection and delete them in the transaction first, but I get the message
Message = "deleted object would be re-saved by cascade (remove deleted object from associations)
The code to delete is as follows:
// First delete existing photos, interests and preferences
var photos = from row in repository.GetItemsAsQuery<UserPhoto>()
where row.User.UserId == user.UserId
select row;
repository.DeleteItems(photos.ToList());
var interests = from row in repository.GetItemsAsQuery<UserMatchInterest>()
where row.User.UserId == user.UserId
select row;
repository.DeleteItems(interests.ToList());
var preferences = from row in repository.GetItemsAsQuery<UserPreference>()
where row.User.UserId == user.UserId
select row;
repository.DeleteItems(preferences.ToList());
// Now update the user object and re-add the above linked items
repository.UpdateItem(user);
The error is thrown on the repository.DeleteItems(interests.ToList()); line; the first delete passes fine, though it is all in one transaction.
My question is: am I approaching this the right way to update an object in the DB which has bags of other objects that need to be updated as well? I don't see any way to update the existing photo objects without manually setting IDs, and the user may have replaced all photos or added/deleted some anyway, so it is probably cleaner to delete the existing ones and re-add new ones. But how do I delete the existing ones without getting this cascade error?
From your description I envision the following classes:
public class User
{
public virtual int Id { get; set; }
public virtual ICollection<Photo> Photos { get; private set; }
}
public class Photo
{
public User Parent { get; set; }
public virtual string Name { get; set; }
public virtual byte[] Data { get; set; }
}
then the mapping would look like
public class UserMap : ClassMap<User>
{
public UserMap()
{
Id(x => x.Id);
HasMany(x => x.Photos)
.AsSet() // no duplicate entries, allows NH to optimise some things
.Cascade.AllDeleteOrphan()
.Component(c =>
{
c.ParentReference(x => x.Parent);
c.Map(x => x.Name);
c.Map(x => x.Data);
});
}
}
Note: Cascade.AllDeleteOrphan will automatically delete all children which are no longer part of the collection.
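With that mapping in place, a minimal sketch of the update path for a detached user might look like the following (assuming an open ISession; userId and newPhotos are illustrative names, not from the question):
using (var tx = session.BeginTransaction())
{
    // Work against the persistent instance instead of the detached one
    var persistentUser = session.Get<User>(userId);

    // Replacing the collection contents is enough: rows that are no longer
    // in the collection are removed, and the new entries are inserted on commit.
    persistentUser.Photos.Clear();
    foreach (var photo in newPhotos)
        persistentUser.Photos.Add(photo);

    tx.Commit();
}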
The other table contains reference data with well-known IDs. The use case is to read data from a file, create entities, then insert them in a batch. I don't need to query anything first, so all entities are "disconnected" from the context.
Simple example:
public class Post
{
public int ID { get; set; }
public string Text { get; set; }
public virtual ICollection<Tag> Tags { get; set; }
}
public class Tag
{
public int ID { get; set; }
[Required]
public string Label { get; set;}
public virtual ICollection<Post> Posts { get; set; }
}
First try
List<Post> posts = new List<Post>();
loop
var post = new Post { Text = "some text" };
post.Tags.Add(new Tag {ID = 1});
post.Tags.Add(new Tag {ID = 2});
posts.Add(post);
...
context.Posts.AddRange(posts);
context.SaveChanges();
I get an error because EF tries to update the Tag records by setting the other Tag column to null. I don't want EF to update the Tags table at all, only the join table.
Second try
After reading "Long story short: Use Foreign key and it will save your day", I did not find a way to make it work with a collection of FKs, because in my case it's a many-to-many relation.
Third try
Instead of using context.Posts.AddRange(posts);, I attach only the parent entity:
var post = new Post { Text = "some text" };
post.Tags.Add(new Tag {ID = 1});
post.Tags.Add(new Tag {ID = 2});
context.Posts.Attach(post).State = EntityState.Added;
context.SaveChanges();
That worked. The Post is inserted, and the join table PostsTags contains the relation data, with the Tags table left untouched.
BUT that will not work in a batch (same context), because I can't then create another post with the same tag. The context tracks the tags by their ID, so I can't insert a "new" one with the same ID.
Fourth try
What I'm doing right now is, instead of adding a new Tag with post.Tags.Add(new Tag {ID = 1});, adding the tag from the DB with post.Tags.Add(context.Tags.Find(1));. That means many trips to the database for information that is already known.
Other options I have thought of are: keep a local dictionary of tags that are already attached to the context, change the context between each post, find a way to insert data directly into the entity type that represents the join table, query all references beforehand (but some reference tables contain thousands of elements), or simply use raw SQL queries.
I can't imagine that there is no simple way to insert a model with FK IDs, the way it works for a one-to-many by using a foreign key property.
Thank you
The issue will be due to the tracking, or lack of tracking, on the Tags. Since you don't want to query the database, you can opt to Attach tag instances that you can guarantee correspond to legal Tag rows. If you have a reasonable number of Tag IDs to use, you could create and attach the full set to reference. Otherwise you could derive it from the data IDs coming in.
I.e. if we have 20 Tags to select from, IDs 1-20:
for (int tagId = 1; tagId <= 20; tagId++)
{
var tag = new Tag { Id = tagId };
context.Tags.Attach(tag);
}
We don't need to track these tags separately in a list; once they are associated with the DbContext we can use context.Tags, or, to be extra cautious about triggering reads, context.Tags.Local.
Then, when populating your Posts:
var post = new Post { Text = "some text" };
post.Tags.Add(context.Tags.Local.Single(x => x.Id == 1));
post.Tags.Add(context.Tags.Local.Single(x => x.Id == 2));
posts.Add(post);
//...
context.Posts.AddRange(posts);
If you have a large number of tags and pass in a structure for the posts that nominates the Tag IDs you want to associate with each new post, then you can build a list from that:
var tags = postViewModels.SelectMany(x => x.TagIds)
    .Distinct()
    .Select(t => new Tag { Id = t })
    .ToList();
Such as the case where a provided set of post ViewModels contains a list of TagIds. We select all of the distinct Tag IDs, then build Tags to associate.
The caveat here is that the DbContext might already be tracking a Tag with one of the desired IDs. Calling Attach for a Tag that the DbContext has already loaded will result in an exception. Whether you build a complete set of tags or build a set from the provided posts, the solution should check the DbContext for any locally cached/tracked tags and only attach the ones that aren't already tracked.
var tags = postViewModels.SelectMany(x => x.TagIds)
    .Distinct()
    .Select(t => new Tag { Id = t })
    .ToList();
foreach (var tag in tags)
{
    if (!context.Tags.Local.Any(x => x.Id == tag.Id))
        context.Tags.Attach(tag);
}
There may be a better way to build the set of Tags to attach that excludes already-tracked tags (such as using Except, though that requires an EqualityComparer), but this guards against attaching a Tag that is already tracked. From there we create the Posts and associate the desired tags as per the first example, using context.Tags.Local. Every tag referenced by each post will either have been attached or already be tracked and available.
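For illustration, a hedged sketch of that Except variant; the TagIdComparer class is hypothetical and not part of the original answer:
class TagIdComparer : IEqualityComparer<Tag>
{
    public bool Equals(Tag x, Tag y) => x?.Id == y?.Id;
    public int GetHashCode(Tag tag) => tag.Id.GetHashCode();
}

// Attach only the tags the context is not already tracking.
var newTags = tags.Except(context.Tags.Local, new TagIdComparer()).ToList();
foreach (var tag in newTags)
    context.Tags.Attach(tag);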
The remaining caveat is that this assumes the provided Tag actually exists in the database. We don't want to set the attached Tag's EntityState to anything like Added or Modified, to avoid inserting incomplete/invalid rows or overwriting data in the Tags table.
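As a small illustrative check (assumed usage, not from the original answer), Attach marks entities as Unchanged, which is exactly what we want here:
// Unchanged means SaveChanges will not touch the Tags table for these entities;
// only the join rows for the new Posts are inserted.
foreach (var attachedTag in context.Tags.Local)
    System.Diagnostics.Debug.Assert(context.Entry(attachedTag).State == EntityState.Unchanged);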
When you have a many-to-many relationship between Post and Tag, EF Core automatically adds a table to store that relationship. You will need to define that table manually using a POCO and then use it to add your relationship, as shown here.
Here is the relevant code:
Create a class that contains the relationship.
public class PostTag
{
public int PostId { get; set; }
public Post Post { get; set; }
public int TagId { get; set; }
public Tag Tag { get; set; }
}
Then in the OnModelCreating method, add the relationship like this:
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
modelBuilder.Entity<PostTag>()
.HasKey(t => new { t.PostId, t.TagId });
modelBuilder.Entity<PostTag>()
.HasOne(pt => pt.Post)
.WithMany(p => p.PostTags)
.HasForeignKey(pt => pt.PostId);
modelBuilder.Entity<PostTag>()
.HasOne(pt => pt.Tag)
.WithMany(t => t.PostTags)
.HasForeignKey(pt => pt.TagId);
}
Alternatively, you can use annotations to set up the relationship using the [ForeignKey(nameof(...))] attribute.
Then you can manually add the PostTag entity with the required data.
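A minimal sketch of that last step, assuming the join entity is part of the model (reachable via context.Set<PostTag>() or a DbSet<PostTag>) and that the referenced tag IDs already exist:
var post = new Post { Text = "some text" };
context.Posts.Add(post);

// Insert only the join rows; the Tags table itself is never touched.
context.Set<PostTag>().AddRange(new[]
{
    new PostTag { Post = post, TagId = 1 },
    new PostTag { Post = post, TagId = 2 },
});

context.SaveChanges();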
I wrote a query which is pretty simple:
var locations = await _context.Locations
.Include(x => x.LocationsOfTheUsers)
.Include(x => x.Address)
.ThenInclude(x => x.County)
.Where(CalculateFilters(searchObj))
.ToListAsync(cancellationToken);
Every time, LocationsOfTheUsers was null, so I added .Include(x => x.LocationsOfTheUsers) and received the results I expected, but I'm not sure why I have to include this collection, since it's defined like this:
public class Location
{
public string Title { get; set; }
public long? RegionId { get; set; }
public Region Region { get; set; }
public long? AddressId { get; set; }
public Address Address { get; set; }
public long? CountyId { get; set; }
public County County { get; set; }
public ICollection<LocationsOfTheUsers> LocationsOfTheUsers { get; set; }
}
I thought this would be automatically included since it exists as an ICollection in the Location class.
So why is .Include() on LocationsOfTheUsers needed here?
Thanks guys
Cheers
In Entity Framework, the non-virtual properties represent the columns of the tables; the virtual properties represent the relations between the tables (one-to-many, many-to-many, ...).
So your property should have been defined as:
public virtual ICollection<LocationsOfTheUsers> LocationsOfTheUsers { get; set; }
One of the slower parts of a database query is the transfer of the selected data from the database management system to your local process. Hence it is wise to limit the selected data to the values you actually plan to use.
If you have a one-to-many relation between Schools and Students, and you ask for School [10] you don't want automatically to fetch its 2000 Students.
Even if you would like to have "School [10] with all its Students", it would not be efficient to use Include to fetch the Students as well. Every Student will have a foreign key SchoolId with a value of [10]. If you use Include, you transfer this foreign key 2000 times. What a waste!
When using Entity Framework, always use Select to fetch data and select only the properties that you actually plan to use. Only use Include if you plan to change the included items.
This way you separate your database table structure from the actual query. If your database structure changes, only the query changes; users of your query don't notice the internal changes.
Apart from better performance and more robustness against changes, readers of your code can more easily see what values are in their query.
Certainly don't use Include just to save yourself some typing. Having to debug one error after future changes will cost far more time than you will ever save by typing Include instead of Select.
Finally: limit your data early in your process, so put the Where in front.
So your query should be:
var predicate = CalculateFilters(searchObj);
var queryLocations = dbContext.Locations
    .Where(predicate)
    .Select(location => new
    {
        // Select only the location properties that you plan to use
        Id = location.Id,
        Name = location.Name,

        // Locations of the users:
        UserLocations = location.LocationsOfTheUsers
            .Select(userLocation => new
            {
                // again: only the properties that you plan to use
                Id = userLocation.Id,
                ...

                // Not needed, you already know the value:
                // LocationId = userLocation.LocationId
            })
            .ToList(),

        Address = new
        {
            Street = location.Address.Street,
            PostCode = location.Address.PostCode,
            ...
            County = location.Address.County.Name, // if you only want one property
            // or, if you want more properties:
            // County = new
            // {
            //     Name = location.Address.County.Name,
            //     Abbr = location.Address.County.Abbr,
            //     ...
            // },
        },
    });
I thought this would be automatically included since it exists as an ICollection in the Location class.
Well, it's not automatically included, probably for performance reasons as the graph of related entities and their recursive child entities may be rather deep.
That's why you use eager loading to explicitly include the related entities that you want using the Include method.
The other option is to use lazy loading which means that the related entities are loaded as soon as you access the navigation property in your code, assuming some prerequisites are fulfilled and that the context is still around when this happens.
Please refer to the docs for more information.
I believe you are using Entity Framework Core. In Entity Framework (EF6), lazy loading is enabled by default; in Entity Framework Core, however, lazy loading of related entities is handled by a separate package, Microsoft.EntityFrameworkCore.Proxies.
To enable the behaviour you are seeking, install the above package and add the following code:
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
optionsBuilder.UseLazyLoadingProxies();
}
After this, the related entities will be loaded without the Include call.
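Note that the lazy-loading proxies require every navigation property to be virtual; a hedged sketch of the adjusted entity (other members omitted):
public class Location
{
    public string Title { get; set; }
    public long? AddressId { get; set; }
    public virtual Address Address { get; set; } // virtual so the proxy can override it
    public virtual ICollection<LocationsOfTheUsers> LocationsOfTheUsers { get; set; }
}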
I have been struggling for a while now to understand how EF loads/updates entities.
First of all I want to explain what my app (WPF) is about. I am developing an application where users can store Todo items in Categories; these categories are predefined by the application. Each user can read all items but can only delete/update his own items. It's a multi-user system, meaning the application runs multiple times on the network, accessing the same SQL Server database.
When a user adds/deletes/updates items, the UI of all the other running apps has to update.
My model looks like this:
public class Category
{
public int Id { get; set; }
public string Name { get; set; }
public List<Todo> Todos { get; set; }
}
public class Todo
{
public int Id { get; set; }
public string Content { get; set; }
public DateTime LastUpdate { get; set; }
public string Owner { get; set; }
public Category Category { get; set; }
public List<Info> Infos { get; set; }
}
public class Info
{
public int Id { get; set; }
public string Value { get; set; }
public Todo Todo { get; set; }
}
I am making the initial load like this, which works fine:
Context.dbsCategories.Where(c => c.Id == id).Include(c => c.Todos.Select(t => t.Infos)).FirstOrDefault();
Now I was trying to load only the Todos which belong to the current user, therefore I tried this:
Context.dbsCategories.Where(c => c.Id == id).Include(c => c.Todos.Where(t => t.Owner == Settings.User).Select(t => t.Infos)).FirstOrDefault();
This does not work because it's not possible to filter within include, so I tried this:
var cat = Context.dbsCategories.Where(c => c.Id == id).FirstOrDefault();
Context.dbsTodos.Where(t => t.Category.Id == cat.Id && t.Owner == Settings.User).Include(t=>t.Infos);
After executing the second line, where I look for the Todo items, these items were automatically added to cat's Todos collection. Why? I would have expected that I had to add them manually to cat's Todos collection.
Just for my understanding, what is EF doing here exactly?
Now to my main problem: the synchronization of the data between database and client. I am using a long-running context, which lives as long as the application is running, to save changes made to owned items to the database. The user does not have the possibility to manipulate/delete data from other users; this is guaranteed by the user interface.
To synchronize the data I built this Synchronize method, which will run every 10 seconds; right now it's triggered manually.
This is my synchronization code, which only synchronizes items to the client that do not belong to it.
private async Task Synchronize()
{
using (var ctx = new Context())
{
var database = ctx.dbsTodos().Where(x => x.Owner != Settings.User).Select(t => t.Infos).AsNoTracking();
var loaded = Context.dbsTodos.Local.Where(x => x.Owner != Settings.User);
// In local context but not in database anymore -> detach
foreach (var detach in loaded.Except(database, new TodoIdComparer()).ToList())
{
Context.ObjectContext.Detach(detach);
Log.Debug(this, $"Item {detach} detached");
}
//In database and local context -> Check Timestamp -> Update
foreach (var update in loaded.Intersect(database, new TodoIdTimeStampComparer()))
{
await Context.Entry(update).ReloadAsync();
Log.Debug(this, $"Item {update} updated");
}
//In database but not in local context -> Attach
foreach (var attach in database.ToList().Except(loaded, new TodoIdComparer()))
{
Context.dbsTodos().Attach(attach);
Log.Debug(this, $"Item {attach} attached");
}
}
}
I am having the following problems/issues of unknown origin with it:
Detaching deleted items seems to work; right now I am not sure whether only the Todo items are detached or the Infos as well.
Updating items works only for the Todo item itself; it does not reload the Infos within it. How can I reload the whole entity with all its relations?
Attaching new items and Infos does not work so far. What am I doing wrong here?
Is this the right approach to synchronize data between client and database? Is there any "How to Sync" tutorial? I have not found anything helpful so far.
I am thankful for any help on this, even if you tell me that everything I am doing here is wrong!
Thanks!
My, you do like to deviate from Entity Framework code-first conventions, don't you?
(1) Incorrect class definitions
The relations between your tables are Lists instead of ICollections, they are not declared virtual, and you forgot to declare the foreign keys.
There is a one-to-many relation between Todo and Category: every Todo belongs to exactly one Category (using a foreign key), every Category has zero or more Todos.
You choose to give Category a property:
List<Todo> Todos {get; set;}
Are you sure that category.Todos[4] has a defined meaning?
What would category.Todos.Insert(4, new Todo()) mean?
Better stick to an interface where you can't use functions that have no proper meaning in your database: use ICollection<Todo> Todos {get; set;}. This way you'll have only access to functions that Entity Framework can translate to SQL.
Besides, a query will probably be faster: you give entity framework the possibility to query the data in its most efficient way, instead of forcing it to put the result into a List.
In entity framework the columns of a table are represented by non-virtual properties; the virtual properties represent the relations between the tables (one-to-many, many-to-many)
public class Category
{
public int Id { get; set; }
public string Name { get; set; }
... // other properties
// every Category has zero or more Todos (one-to-many)
public virtual ICollection<Todo> Todos { get; set; }
}
public class Todo
{
public int Id { get; set; }
public string Content { get; set; }
... // other properties
// every Todo belongs to exactly one Category, using foreign key
public int CategoryId { get; set }
public virtual Category Category { get; set; }
// every Todo has zero or more Infos:
public virtual ICollection<Info> Infos { get; set; }
}
You'll probably guess Info by now:
public class Info
{
public int Id { get; set; }
public string Value { get; set; }
... // other properties
// every info belongs to exactly one Todo, using foreign key
public int TodoId {get; set;}
public virtual Todo Todo { get; set; }
}
Three major improvements:
ICollections instead of Lists;
the ICollections are virtual, because they are not real columns in your tables;
the foreign key definitions are non-virtual: they are real columns in your tables.
(2) Use Select instead of Include
One of the slower parts of a database query is the transport of the selected data from the Database Management System to your local process. Hence it is wise to limit the amount of transported data.
Suppose Category with Id [4] has a thousand Todos. Every Todo of this Category will have a foreign key with a value 4. So this same value 4 will be transported 1001 times. What a waste of processing power!
In entity framework use Select instead of Include to query data and select only the properties you actually plan to use. Only use Include if you plan to update the Selected data.
Give me all Categories that ... with their Todos that ...
var results = dbContext.Categories
.Where(category => ...)
.Select(category => new
{
// only select properties that you plan to use
Id = category.Id,
Name = category.Name,
...
Todos = category.Todos
.Where(todo => ...) // only if you don't want all Todos
.Select(todo => new
{
// again, select only the properties you'll plan to use
Id = todo.Id,
...
// not needed, you know the value:
// CategoryId = todo.CategoryId,
// only if you also want some infos:
Infos = todo.Infos
.Select(info => ....) // you know the drill by now
.ToList(),
})
.ToList(),
});
(3) Don't keep DbContext alive for such a long time!
Another problem is that you keep your DbContext open for quite some time. This is not how a DbContext is meant to be used. If your database changes between your query and your update, you'll have trouble. I can hardly imagine that you query so much data that you need to optimize by keeping your DbContext alive. Even if you query a lot of data, displaying this huge amount of data would be the bottleneck, not the database query.
Better fetch the data once, dispose the DbContext, and when updating fetch the data again, update the changed properties and SaveChanges.
fetch data:
RepositoryCategory FetchCategory(int categoryId)
{
using (var dbContext = new MyDbContext())
{
return dbContext.Categories.Where(category => category.Id == categoryId)
.Select(category => new RepositoryCategory
{
... // see above
})
.FirstOrDefault();
}
}
Yes, you'll need an extra class RepositoryCategory for this. The advantage is that you hide the fact that you fetched your data from a database. Your code would hardly change if you fetched your data from a CSV file, or from the internet. This is far better testable, and also far better maintainable: if the Category table in your database changes, users of your RepositoryCategory won't notice it.
Consider creating a special namespace for the data you fetch from your database. This way you can still name the fetched Category Category, instead of RepositoryCategory, and you hide even better where you fetched your data from.
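To complete the picture, a minimal update sketch in the same short-lived-context style (the method and property names are assumptions for illustration):
void UpdateCategoryName(int categoryId, string newName)
{
    using (var dbContext = new MyDbContext())
    {
        // Fetch the row again so this fresh context tracks it
        var category = dbContext.Categories.FirstOrDefault(c => c.Id == categoryId);
        if (category == null) return;

        category.Name = newName;  // change only the properties that changed
        dbContext.SaveChanges();  // short unit of work; the context is disposed right after
    }
}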
Back to your question
You wrote:
Now i was trying to load only the Todos which are from the current user
After the previous improvements, this will be easy:
string owner = Settings.User; // or something similar
var result = dbContext.Todos.Where(todo => todo.Owner == owner)
    .Select(todo => new
    {
        // properties you need
    });
I am building a registration site for a conference for my organization, with multiple VIPs and guest speakers. The requirement is to track many details about each attendee including their arrival and departure plans and their local lodging information. In order to facilitate discussion with stakeholders on the types of reports we need to build, I want to populate my dev database with a batch of records from a CSV containing randomly generated information like name, arrival/departure date/time, etc. This will allow us to look at a working site without having to register and re-register many times.
However, I simply cannot get the Seed method to persist the relevant records properly, yet my controller which handles the registration works perfectly.
My database structure is basically an Attendee entity with child entities for TravelSchedule, LodgingArrangement, and various lookups. Here are excerpts from my entities:
public class Attendee
{
public int Id { get; set; }
... other strings/etc ...
public virtual TravelSchedule TravelSchedule { get; set; }
public int TravelScheduleId { get; set; }
public virtual LodgingArrangment LodgingArrangement { get; set; }
public int LodgingArrangementId { get; set; }
}
public class TravelSchedule
{
public int Id { get; set; }
... other properties ...
public virtual Attendee Attendee { get; set; }
public int AttendeeId { get; set; }
}
public class LodgingArrangement
{
public int Id { get; set; }
... other properties ...
public virtual Attendee Attendee { get; set; }
public int AttendeeId { get; set; }
}
Here is the content of my context's OnModelCreating method:
modelBuilder.Entity<Attendee>()
.HasOptional(a => a.TravelSchedule)
.WithRequired(r => r.Attendee);
modelBuilder.Entity<TravelSchedule>()
.HasRequired(m => m.ArrivalMode)
.WithMany(m => m.Arrivals)
.HasForeignKey(m => m.ArrivalModeId)
.WillCascadeOnDelete(false);
modelBuilder.Entity<TravelSchedule>()
.HasRequired(m => m.DepartureMode)
.WithMany(m => m.Departures)
.HasForeignKey(m => m.DepartureModeId)
.WillCascadeOnDelete(false);
modelBuilder.Entity<Attendee>()
.HasOptional(a => a.LodgingArrangement)
.WithRequired(l => l.Attendee);
The following is an excerpt from my Seed method.
var attendees = GetAttendeesFromCsv();
context.Attendees.AddOrUpdate(a => a.Email, attendees.ToArray());
context.SaveChanges();
var dbAttendees = context.Attendees.ToList();
foreach (var attendee in dbAttendees)
{
attendee.TravelSchedule = CreateTravelSchedule();
context.Entry<Attendee>(attendee).State = EntityState.Modified;
context.SaveChanges();
}
GetAttendeesFromCsv() extracts the records from the CSV into Attendee objects, thanks to the CsvHelper package. CreateTravelSchedule creates a new TravelSchedule entity and populates it with data from lookup tables using the SelectRandom() method from extensionmethod.com. The bottom line is that I extract the CSV rows into Attendee objects, add a new randomly-generated TravelSchedule entity, and save the resulting Attendee with attached TravelSchedule.
Except this does not work. Instead the above code adds the TravelSchedule records to the database, but the AttendeeId is always set to 0 in the table. Also the Attendees table is populated with all of the records from the CSV, but the TravelScheduleId on each row is always 0 as well. However, when stepping through the update-database call with the debugger the attendee.Id is populated properly, so by my understanding EF should pick up that the two are related and persist the related TravelSchedule at the same time as the Attendee. So why isn't EF connecting the two records?
Changing the loop to this:
foreach (var attendee in dbAttendees)
{
var travel = CreateTravelSchedule();
travel.AttendeeId = attendee.Id; // I also tried just travel.Attendee = attendee, without success
context.TravelSchedules.Add(travel);
context.SaveChanges();
}
Results in this error:
System.Data.Entity.Core.UpdateException: An error occurred while updating the entries. See the inner exception for details. ---> System.Data.SqlClient.SqlException: The INSERT statement conflicted with the FOREIGN KEY constraint "FK_dbo.TravelSchedules_dbo.Attendees_Id". The conflict occurred in database "MySite.DAL.MyContext", table "dbo.Attendees", column 'Id'.
So it appears I cannot add the TravelSchedule entity to the Attendee, and I also cannot go "backwards" by creating the TravelSchedule and then attaching the Attendee.
The frustrating part is that the registration form logic in my controller works perfectly fine, excerpt below. The walkthrough is that the registration controller stores each screen's data (view models) in the session using a static WorkflowManager class which handles persistence between screens. After user confirmation the controller pulls each screen's details from the WorkflowManager, runs them through AutoMapper to convert them to the relevant populated DAL entities, attaches those entities to the attendee entity, and saves it all to the database.
Again, this works perfectly fine, saving the attendee and its two child entities without error. Here is the relevant excerpt of the controller action:
var attendeeRegistration = WorkflowManager.GetAttendeeRegistration();
var travelRegistration = WorkflowManager.GetTravelRegistration();
using (var db = new MyContext())
{
var attendee = Mapper.Map<Attendee>(attendeeRegistration);
attendee.AnotherChildEntity = db.ChildEntities.Find(attendeeRegistration.SelectedChildEntityId);
var travel = Mapper.Map<TravelSchedule>(travelRegistration);
travel.ArrivalMode = db.TravelModes.Find(travelRegistration.SelectedArrivalModeId);
travel.DepartureMode = db.TravelModes.Find(travelRegistration.SelectedDepartureModeId);
var lodging = Mapper.Map<LodgingArrangement>(lodgingRegistration);
lodging.LodgingLocation = db.LodgingLocations.Find(lodgingRegistration.SelectedLodgingLocationId);
attendee.Comments = comments;
attendee.TravelSchedule = travel;
attendee.LodgingArrangement = lodging;
db.Attendees.Add(attendee);
db.SaveChanges();
}
This works perfectly. None of these objects are in the database until after the user confirms the registration is complete. So I don't understand why I can persist new entities to the database here, yet I can't do what appears to me to be the same thing in the Seed method above.
Any help or ideas much appreciated. This has been causing me endless grief for days now. Thanks.
The association between Attendee and TravelSchedule is 1:1. EF implements 1:1 associations by creating a primary key in the dependent entity (here: TravelSchedule) that's also a foreign key to its principal entity (Attendee).
So Attendee.TravelScheduleId and TravelSchedule.AttendeeId are not used for the association and their values remain 0. Which means: the Seed works without these fields/properties (and I'd even expect it to work with them), it establishes the associations through the Id fields.
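To make this concrete, a hedged sketch of what the 1:1 shape implies for the classes (the explicit TravelScheduleId/AttendeeId properties can simply be dropped; other members omitted):
public class Attendee
{
    public int Id { get; set; }
    public virtual TravelSchedule TravelSchedule { get; set; }
}

public class TravelSchedule
{
    // Shared primary key: this Id is also the foreign key to Attendee.Id
    public int Id { get; set; }
    public virtual Attendee Attendee { get; set; }
}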
I have two simple entities named Country and City.
public class Country : Entity
{
public Country()
{
Cities = new List<City>();
}
public virtual string Name { get; set; }
public virtual IList<City> Cities { get; set; }
}
public class City : Entity
{
public virtual Country Country { get; set; }
public virtual string Name { get; set; }
}
The DB used is SQL Server and City has a foreign key to Country with cascade delete.
I am using Fluent NHibernate, this is the mapping configuration for the relation:
public CountryMap()
{
Id(x => x.Id, "IdCountry").GeneratedBy.Identity();
Map(x => x.Name).Not.Nullable().Length(50);
HasMany(x => x.Cities).KeyColumn("IdCountry").ForeignKeyConstraintName("FK_Cities_Countries")
.Not.KeyNullable().Cascade.AllDeleteOrphan().ExtraLazyLoad();
Table("Countries");
}
public CityMap()
{
Id(x => x.Id, "IdCity").GeneratedBy.Identity();
Map(x => x.Name).Not.Nullable().Length(50);
References(x => x.Country, "IdCountry").ForeignKey("FK_Cities_Countries")
.Not.Nullable().Not.Insert().Not.Update().Cascade.All().LazyLoad();
Table("Cities");
}
All works fine, but after deleting a country the cities remain in the parent collection, and I want the cities to be removed from that collection (as EF does).
The only way I found to get it working is refreshing the session (Clear, Evict...)
Manual deletion of the collection items is not a solution. It in fact breaks the cascading feature.
In case we do have the mapping Cascade.AllDeleteOrphan(), we should expect that deleting the parent will delete the children as well; there is no need to do more than delete the parent. It's just that NHibernate does not care about clearing that collection in memory (on the app server / in the application).
In case you are searching for a long-term, stable, solid solution, I would strongly suggest:
Split READ and WRITE operations
The current frameworks we have (like Web API) help us go in this direction. We should create a set of operations:
PUT, POST, DELETE ... to handle client requests for data amendments
GET ... to retrieve data by ID or by criteria (Find())
Each of these operations should have its own session and its own transaction, and should represent a unit of work. All or nothing: either the operation is successful and everything is persisted, or it is not (rollback).
That will help our application to be successful and to grow in the long run. We do a split. We care about:
1) DELETE, UPDATE or INSERT operations (or more of them around some root entity; you can see more here), and
2) READ operations - expecting only SELECT statements, and therefore working with up-to-date data. Also check this, including the comments.
While this could be a bit out of scope, this quote from the 9.8. Exception handling documentation is also a clue:
...If the ISession throws an exception you should immediately rollback the transaction, call ISession.Close() and discard the ISession instance. Certain methods of ISession will not leave the session in a consistent state...
I wanted to demonstrate that each operation (session, transaction) should have only one goal (WRITE or READ) and be as short as possible...
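As an illustration of such a short, single-purpose WRITE operation, here is a minimal sketch (the sessionFactory field and the method name are assumptions):
public void DeleteCountry(int countryId)
{
    using (var session = sessionFactory.OpenSession())
    using (var tx = session.BeginTransaction())
    {
        var country = session.Get<Country>(countryId);
        if (country != null)
            session.Delete(country); // the cascade mapping removes the cities as well

        tx.Commit();
    } // session disposed: no stale in-memory Cities collection to worry about
}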
With that kind of mapping, the easiest way to work is to remove the City from the Country collection and save the Country. With the cascade, the city will be deleted and the collection will be in the state you want.
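A minimal sketch of that approach, assuming an open ISession and that the city to remove has already been identified (countryId and cityId are illustrative):
using (var tx = session.BeginTransaction())
{
    var country = session.Get<Country>(countryId);
    var city = country.Cities.First(c => c.Id == cityId);

    // AllDeleteOrphan: the orphaned city row is deleted when the session flushes on commit
    country.Cities.Remove(city);

    tx.Commit();
}
// The in-memory collection and the database now agree: the city is gone from both.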