Is this really the expected behavior? I'm using the standard T4 POCO templates (with the Repository and UnitOfWork generated via http://geekswithblogs.net/danemorgridge/archive/2010/06/28/entity-framework-repository-amp-unit-of-work-t4-template-on.aspx, although the problem seems to be with the POCO fixup itself).
If I do the following:
var UOW = new EFUnitOfWork();
UOW.LazyLoadingEnabled = true;
UOW.ProxyCreationEnabled = true;
var horderRepo = RepositoryHelper.GetHORDERRepository(UOW);
var subrelmRepository = RepositoryHelper.GetSUBRELMRepository(UOW);
var ho = horderRepo.Where(h => h.RECORD_NUMBER == 1).FirstOrDefault();
var somerelm = subrelmRepository.Where(r => r.RECORD_NUMBER == ho.REALM_KEY + 1).FirstOrDefault();
ho.SUBRELM = somerelm;
UOW.Commit();
return View(ho);
each time I change ho.SUBRELM to a new RELM, the expected POCO fixup is called. If that RELM is pointed to by 100,000 other HORDERs (which some are), the fixup seems to walk the lot of them, taking forever (or until memory runs out, whichever comes sooner).
If I turn lazy loading off this doesn't happen, but should I really expect fixup to back-walk all of the relationships in my database? Or has something else gone wrong? If so, what?
This is a well-known problem with POCO entities generated by the T4 template when lazy loading is enabled. You can't avoid it unless you modify your RELM so it no longer has a navigation property to all of its HORDERs. Other options are modifying the T4 template to not use fixup collections, or writing the POCOs yourself.
Just to conclude: it is not incorrect behavior of EF. It is unexpected behavior of the code generated by the T4 template.
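To make the mechanism concrete, here is a simplified sketch of the kind of fixup code the T4 POCO template generates (entity names taken from the question; this is an illustration of the pattern, not the template's exact output):
using System.Collections.Generic;

public class SUBRELM
{
    // Lazy inverse collection; when 100,000 HORDERs point at one SUBRELM,
    // this is the collection that fixup forces EF to load.
    public virtual ICollection<HORDER> HORDERS { get; set; }
}

public class HORDER
{
    private SUBRELM _subrelm;

    public virtual SUBRELM SUBRELM
    {
        get { return _subrelm; }
        set
        {
            if (!ReferenceEquals(_subrelm, value))
            {
                var previous = _subrelm;
                _subrelm = value;
                FixupSUBRELM(previous);
            }
        }
    }

    // Fixup keeps both ends of the association in sync. Merely touching
    // previous.HORDERS or SUBRELM.HORDERS triggers lazy loading of the whole
    // inverse collection, which is the walk described in the question.
    private void FixupSUBRELM(SUBRELM previous)
    {
        if (previous != null && previous.HORDERS.Contains(this))
            previous.HORDERS.Remove(this);

        if (SUBRELM != null && !SUBRELM.HORDERS.Contains(this))
            SUBRELM.HORDERS.Add(this);
    }
}
Replacing the SUBRELM property with a plain auto-property (or dropping HORDERS from SUBRELM entirely) removes the walk, at the cost of EF no longer keeping both ends of the association consistent for you.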
Related
I would like to know if the following scenario is possible with Entity Framework:
I want to load several tables with the option AsNoTracking since they are all like static tables that cannot be changed by the user.
Those tables also happen to be navigation properties of others. Up till now I have relied on the AutoMapping feature of Entity Framework, and don't use the .Include() or lazy loading functionality.
So instead of:
var result = from x in context.TestTable.Include("ChildTestTable")
             select x;
I am using it like this:
context.ChildTestTable.Load();
context.TestTable.Load();
var result = context.TestTable.Local;
This works smoothly because the application is designed so that the tables in the database are very small; no table exceeds 600 rows (and that's already a pretty high value in my app).
Now my way of loading data isn't working with .AsNoTracking().
Is there any way to make it working?
So I can write:
context.ChildTestTable.AsNoTracking().ToList();
var result = context.TestTable.AsNoTracking().ToList();
Instead of:
var result = from x in context.TestTable.AsNoTracking().Include("ChildTestTable")
             select x;
So basically, I want to have one or more tables loaded with the AutoMapping feature on, but without loading them into the Object State Manager. Is that a possibility?
The simple answer is no. For normal tracking queries, the state manager is used for both identity resolution (finding a previously loaded instance of a given entity and using it instead of creating a new instance) and fixup (connecting navigation properties together). When you use a no-tracking query it means that the entities are not tracked in the state manager. This means that fixup between entities from different queries cannot happen because EF has no way of finding those entities.
If you were to use Include with your no-tracking query then EF would attempt to do some fixup between entities within the query, and this will work a lot of the time. However, some queries can result in referencing the same entity multiple times and in some of those cases EF has no way of knowing that it is the same entity being referenced and hence you may get duplicates.
I guess the thing you don't really say is why you want to use no-tracking. If your tables don't have a lot of data then you're unlikely to see significant perf improvements, although many factors can influence this. (As a digression, using the ObservableCollection returned by .Local could also impact perf and should not be necessary if the data never changes.) Generally speaking you should only use no-tracking if you have an explicit need to do so since otherwise it ends up adding complexity without benefit.
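To make the trade-off concrete, here is a small sketch against the context from the question (assuming the same TestTable/ChildTestTable sets and the System.Data.Entity extension methods; a sketch of the behavior, not a workaround):
// Tracked: Load() puts the entities into the state manager, so EF fixes up
// navigation properties across the two separate queries.
context.ChildTestTable.Load();
context.TestTable.Load();
var tracked = context.TestTable.Local; // navigation properties populated by fixup

// No-tracking: the entities never enter the state manager, so cross-query
// fixup cannot happen. Navigation properties are only populated for related
// data brought back within the same query via Include.
var untracked = context.TestTable
    .AsNoTracking()
    .Include("ChildTestTable") // fixup within this single query only
    .ToList();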
I am adding a new entity to the context, and I would like to populate all its reference collections once the add is done. The issue is that I am reading back the same entity instance I created during the Add(); EF doesn't go to the DB. This is correct behaviour, but how do I get around it?
Repo().Add(newEntity);
Repo().Reload(newEntity);
This reloads the entity from the DB, but I am not getting the references (FK relations). I have found how to load a single reference, but I need a generic way to load all the references for any entity.
var entry = Context.Entry(entity);
entry.Reference("ReferenceName").Load();
Is the above the correct approach, or is there some other way?
Without seeing your repository code, I am guessing it's a lazy loading issue.
This site explains eager vs lazy loading: http://msdn.microsoft.com/en-us/data/jj574232.aspx
You can either specify exactly which references you want to bring back by using .Include() in your repo (good for long chains where you don't need everything):
context.Set<WhateverType>().Include(t => t.WhateverYouAreReferencing).Where(t => t.Id == id);
Or you can turn off lazy loading on the context as a whole and bring back everything you need with your retrieval:
context.Configuration.LazyLoadingEnabled = false;
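To address the 'generic way' part of the question above, here is a sketch of an extension method that loads every navigation property of an attached entity by dropping down to the ObjectContext API (an assumption on my part, written against the EF5/EF6 DbContext API; LoadAllReferences is a made-up name):
using System.Data.Entity;
using System.Data.Entity.Infrastructure;

public static class DbContextExtensions
{
    // Loads all references and collections of an entity attached to the context.
    public static void LoadAllReferences(this DbContext context, object entity)
    {
        var objectContext = ((IObjectContextAdapter)context).ObjectContext;
        var relationshipManager =
            objectContext.ObjectStateManager.GetRelationshipManager(entity);

        // Each related end corresponds to one navigation property
        // (reference or collection) and loads with one query.
        foreach (var relatedEnd in relationshipManager.GetAllRelatedEnds())
        {
            if (!relatedEnd.IsLoaded)
                relatedEnd.Load();
        }
    }
}
Usage would then be Context.LoadAllReferences(entity) after the Reload call.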
I want to cache a never-changing aggregate which is accessible through a root object only (all other entities are reachable only via Reference/HasMany properties on the root object).
Should I use the NHibernate second-level cache (we are already using NHibernate), or is it better to build some sort of singleton that provides access to all entities in the aggregate?
I found a blog post about getting everything with MultiQuery, but my database does not support it.
The 'old way' to do this would be to:
Do a select * from all aggregate tables
Loop over the entities and set the references and collections manually
Something like:
foreach (var e in Entities)
{
    e.Parent = loadedParentEntities.SingleOrDefault(pe => e.ParentId == pe.Id);
}
But surely there is a way to tell NHibernate to do this for me?
Update
Currently I have tried simply fetching everything from the DB and hoping NHibernate does all the reference setting. It does not, however :(
var getRoot = Session.Query<RootObject>().ToList();
var getRoot_hasMany = Session.Query<RootObjectCollection>().ToList();
var getRoot_hasMany_ref = Session.Query<RootObjectCollectionReference>().ToList();
var getRoot_hasMany_hasMany = Session.Query<RootObjectCollectionCollection>().ToList();
Domain:
Root objects are getRoot. These have a collection property 'HasMany'. Each HasMany has a reference back to getRoot, a reference to another entity (getRoot_hasMany_ref), and a collection of its own (getRoot_hasMany_hasMany). If this doesn't make sense, I'll create an ERD, but the actual structure is not really relevant to the question (I think).
This results in 4 queries being executed (which is good).
However, when accessing properties like getRoot.First().HasMany.First().Ref or getRoot.First().HasMany.First().HasMany.First(), extra queries are still executed, even though everything should already be known to the ISession.
So how do I tell NHibernate to perform those 4 queries and then build the graphs without using any proxy properties, so that I have access to everything even after the ISession goes out of scope?
I think there are several questions in one.
I stopped trying to trick NHibernate too much. I wouldn't access entities from multiple threads, because they are usually not thread-safe, at least when using lazy loading. Caching lazy entities is therefore something evil.
I would avoid firing too many queries by using batch-size, which is by far the cleanest and easiest solution and in most cases "good enough". It's fully transparent to the business logic, which makes it so cool.
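For illustration, this is roughly how a batch size is declared with Fluent NHibernate (an assumption on my part; with hbm.xml mappings it is the batch-size attribute, and the entity/property names follow the question):
using FluentNHibernate.Mapping;

// With a batch size of 25, initializing one pending HasMany collection makes
// NH initialize up to 25 pending collections of the same role in a single
// query, instead of issuing one query per root object.
public class RootObjectMap : ClassMap<RootObject>
{
    public RootObjectMap()
    {
        Id(x => x.Id);
        HasMany(x => x.HasMany)
            .BatchSize(25);
    }
}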
I would:
Consider not caching the entity at all. Use the NH first-level cache (that is, always load it using session.Get()). Make use of lazy loading when only a small part of the data is used in a single transaction.
If there is a proven need to cache the data, consider turning lazy loading off entirely (by making the entities non-lazy and setting all the collections to non-lazy). Load the entity once and cache it. Still consider thread safety when accessing the cached data.
Should the entities need to stay lazy, because some instances of the same type are not in the cache, consider using a DTO-like structure as the cache. Copy all the data into a similar class structure that does not consist of entities. This may sound like a lot of additional work, but in the end it will avoid many strange problems and save you a lot of time.
Usually, query time matters less than flush time. Flush time is spent by NH finding which entities changed in a session. To avoid this cost, make entities read-only if you can.
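A sketch of the read-only option (assuming NHibernate 3.x or later, with sessionFactory and rootId in scope):
using (var session = sessionFactory.OpenSession())
{
    // Entities loaded read-only are skipped by dirty checking at flush time,
    // which removes most of the flush cost for read-mostly data.
    session.DefaultReadOnly = true;

    var root = session.Get<RootObject>(rootId);

    // ...or mark a single instance instead:
    session.SetReadOnly(root, true);
}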
If the whole object tree never changes (config settings?), then just load it efficiently with all references/collections initialised:
using (var session = sessionFactory.OpenSession())
{
    var root = session.Query<RootObject>().FetchMany(x => x.Collection).ToFutureValue();
    session.Query<RootObjectCollection>().Fetch(x => x.Ref).FetchMany(x => x.Collection).ToFuture();

    // Do something with root.Value
}
I've got a couple of entities in a parent-child relationship: Family (parent) and Updates (child). I want to read a list of families without the corresponding updates. There are only 17 families but about 60,000 updates so I really don't want the updates.
I used EntitiesToDTOs to generate a DTO from the Family entity and to create an assembler for converting the Family entity to the FamilyDTO. The ToDTO method of the assembler looks like:
public static FamilyDTO ToDTO(this Family entity)
{
    if (entity == null) return null;

    var dto = new FamilyDTO();
    dto.FamilyCode = entity.FamilyCode;
    dto.FamilyName = entity.FamilyName;
    dto.CreateDatetime = entity.CreateDatetime;
    dto.Updates_ID = entity.Updates.Select(p => p.ID).ToList();
    entity.OnDTO(dto);
    return dto;
}
When I run the assembler I find each resulting FamilyDTO has the Updates_ID list populated, although lazy loading is set to true for the EF model (edmx file). Is it possible to configure EntitiesToDTOs to support lazy loading of child elements or will it always use eager loading? I can't see any option in EntitiesToDTOs that could be set to support lazy loading when generating the assembler.
By the way, I'm part of a larger team that uses EntitiesToDTOs to regenerate the assemblers on an almost daily basis, so I'd prefer not to modify the assembler by hand if possible.
I'm Fabian, creator of EntitiesToDTOs.
First of all, thanks a lot for using it.
What you have detected is in fact what I don't want the Assembler to do. I want the developer to map the navigation properties only if needed, using the partial methods OnDTO and OnEntity. Otherwise you run into problems like the one you have.
It seems I had just never run into that problem using the tool myself. THANKS A LOT.
So right now I'm fixing this. (Edit: it's now fixed in version 3.1.)
Based on the code you've posted here, and on how I think someone would implement such a solution (i.e. converting records to a DTO format), I think you would have no choice but to do eager loading.
Some key points:
1) Your Updates_ID field is clearly a List, which means the collection is hydrated right away (ToList always executes immediately; only a regular IEnumerable uses deferred execution).
2) If you put any sort of navigation property into a DTO, it will be loaded eagerly. Once you so much as touch a navigation property that was brought back by Entity Framework, the framework loads it from the database, and it doesn't care that all you wanted was to populate a DTO with it.
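If the goal is simply to avoid materializing 60,000 Updates entities to collect their IDs, one alternative (a sketch, not EntitiesToDTOs output; it assumes a Families set on the context) is to project inside the query so only the IDs come back from the database:
// The projection is translated to SQL, so no entities are tracked and no lazy
// loading fires; only the scalar columns and the update IDs travel back.
var dtos = context.Families
    .Select(f => new
    {
        f.FamilyCode,
        f.FamilyName,
        f.CreateDatetime,
        UpdateIds = f.Updates.Select(u => u.ID)
    })
    .AsEnumerable() // switch to LINQ to Objects before constructing the DTOs
    .Select(x => new FamilyDTO
    {
        FamilyCode = x.FamilyCode,
        FamilyName = x.FamilyName,
        CreateDatetime = x.CreateDatetime,
        Updates_ID = x.UpdateIds.ToList()
    })
    .ToList();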
I'm having trouble with Entity Framework returning proxies when I want the actual entity class. The first time I run my code everything works properly (no proxies), but on every iteration afterwards one of my DbSets always returns proxies instead of the actual type.
I dispose of the context after every iteration, so I don't understand why the first time through it works, and every time after doesn't.
My code fails on the line below. All my POCOs have the Table attribute set, but because a proxy class is returned, there is no Table attribute on it.
TableAttribute attrib = (TableAttribute)attributes.Single();
Is there some behind the scenes static magic in the DbContext that lives after I destroy the object?
I move my objects into memory using the following:
MajorClasses = ctx.MajorClasses.ToArray();
I also tried:
MajorClasses = ctx.MajorClasses.AsNoTracking().ToArray();
In my OnModelCreating I have the following set:
base.Configuration.ProxyCreationEnabled = false;
base.Configuration.LazyLoadingEnabled = false;
You can set ObjectContext.ContextOptions.ProxyCreationEnabled to false. This will prevent you from using some of EF's fancy features like lazy loading and, I believe, change tracking.
As far as your app cares, it should be able to treat the proxies just like the types they represent. Is there a specific issue you are having?
Edit
We have some code that requires the POCO type instead of the proxy type and we do the following to detect if the current type is a proxy.
if (entityType.BaseType != null && entityType.Namespace == "System.Data.Entity.DynamicProxies")
{
    entityType = entityType.BaseType;
}
To turn off proxy creation in Entity Framework 5 you can use the following:
_dbContext.Configuration.ProxyCreationEnabled = false;
Simply set this property once before using the context to pull data.
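One detail worth noting about the code in the question: Configuration is per context instance, while OnModelCreating runs only once per AppDomain (when the model is first built), so flags set there apply only to the first context instance. That would be one plausible explanation for "works the first time, returns proxies afterwards". Setting the flags in the constructor covers every instance (a sketch; MajorClass is the question's type):
using System.Data.Entity;

public class MyContext : DbContext
{
    public MyContext()
    {
        // Runs for every instance, unlike OnModelCreating.
        Configuration.ProxyCreationEnabled = false;
        Configuration.LazyLoadingEnabled = false;
    }

    public DbSet<MajorClass> MajorClasses { get; set; }
}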
By default, EF uses change tracking and keeps an in-memory cache of all entities. You can use different merge options when working with EF; by default, EF 4.1 uses the AppendOnly merge option. As I understand it, this means that if you have already queried an entity, subsequent queries will get it from the cache (if no changes are detected in the database), so you might be seeing the cached entity coming back.
In EF 4.1 you can use the NoTracking merge option instead. This will go to the database for every call.
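With the ObjectContext API the merge option is set per query (the rough equivalent in the DbContext API is AsNoTracking()). A sketch, assuming an ObjectContext instance and the MajorClass type from the question:
// NoTracking skips the state manager entirely: every execution hits the
// database and returns plain, untracked entities.
var query = objectContext.CreateObjectSet<MajorClass>();
query.MergeOption = MergeOption.NoTracking;
var majorClasses = query.ToArray();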
In EF 6.1.3 you can get the right type using
using (var context = new BloggingContext())
{
    var blog = context.Blogs.Find(1);
    var entityType = ObjectContext.GetObjectType(blog.GetType());
}
Note that if the type passed to GetObjectType is an instance of an entity type that is not a proxy type then the type of entity is still returned. This means you can always use this method to get the actual entity type without any other checking to see if the type is a proxy type or not.
From MSDN
A simple solution: you may be missing an object that must be Included. Also, set the following before reading values:
_dbContext.Configuration.ProxyCreationEnabled = false;
In my case this issue was fixed by setting Lazy Loading Enabled to false.
Open the .edmx (diagram)
Hit F4 to bring up the properties
Set Lazy Loading Enabled to false
Save and rebuild