Reading several tables from a single Entity Framework ExecuteStoreQuery request - C#

I have a library which uses EF4 to access a SQL Server data store. For various reasons I have to use SQL Server specific syntax to read data from the store (for free-text search), so I have to build the SQL by hand and send it through the ExecuteStoreQuery method.
This works fine, except that the query uses joins to pull in several tables alongside the main one (the main one being the one I specify as the target entity set when calling ExecuteStoreQuery), and EF never fills the main entity's relationship properties with the other tables' data.
Is there anything special to do to populate these relationships? Using other EF methods, special table names in the query, or something similar?
Thanks for your help.

Executing direct SQL follows a very simple rule: it uses columns from the result set to fill properties with the same names on the materialized entity. I think I read somewhere that this works only with the main entity you materialize (the entity type given to ExecuteStoreQuery = no relations) but I can't find it now. I did several tests and it really doesn't populate any relations.
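To illustrate the rule, here is a minimal sketch (the table, column and property names are made up for the example; only columns whose names match properties of MainEntity get materialized):

    // MainEntity is assumed to have Id and Name properties.
    string strQuery =
        @"SELECT m.Id, m.Name, o.Caption
          FROM MainEntities m
          LEFT JOIN OtherEntities o ON o.MainId = m.Id
          WHERE FREETEXT(m.Name, @text)";
    var entities = context.ExecuteStoreQuery<MainEntity>(
        strQuery, new SqlParameter("text", searchText)).ToList();
    // o.Caption matches no property on MainEntity, so it is discarded,
    // and the relationship properties stay unloaded.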

Ok, so I'll write here what I ended up doing. It does not look like a perfect solution, but it doesn't seem that there is any perfect solution in this case.
As Ladislav pointed out, ExecuteStoreQuery (as well as the other "custom query" method, Translate) only maps the columns of the entity you specify, leaving all the other columns aside. Therefore I had to load the related entities separately, like this:
// Execute
IEnumerable<MainEntity> result = context.ExecuteStoreQuery<MainEntity>(
    strQuery, "MainEntities", MergeOption.AppendOnly, someParams).ToArray();

// Load relations, first method
foreach (MainEntity e in result)
{
    if (!e.Relation1Reference.IsLoaded)
        e.Relation1Reference.Load();
    if (!e.Relation2Reference.IsLoaded)
        e.Relation2Reference.Load();
    // ...
}
// Load relations, second method
// The main entity contains a navigation property pointing
// to a record in the OtherEntity entity. Simply enumerating
// the set attaches all of its records to the context (query
// results are tracked already, so no explicit Attach call is
// needed), and EF then fixes up the relationships.
context.OtherEntities.ToList();
There. I think the choice between these two techniques depends on the number and size of the generated requests. The first technique issues a one-record request for every required related record, but no unnecessary record is loaded. The second technique uses fewer requests (one per table) but retrieves all the records, so it uses more memory.
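A possible middle ground (a sketch only; it assumes MainEntity exposes the foreign key as a Relation1Id property and OtherEntity has an Id key, both names being illustrative) is to load, with one query per table, only the related records the results actually reference:

    // Collect the foreign keys present in the results...
    int[] ids = result.Select(e => e.Relation1Id).Distinct().ToArray();
    // ...and fetch just those rows; AppendOnly tracking attaches them,
    // after which EF fixes up the Relation1 references automatically.
    context.OtherEntities.Where(o => ids.Contains(o.Id)).ToList();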

Related

EntityFrameworkCore reading data using reflection - how to create DbSet when type unknown

I am working with a .NET 6 console application where I need to read data from tables in a custom DbContext using Microsoft.EntityFrameworkCore.
I have added the entities to the model in OnModelCreating() and can get them back using a call to
var entity = ctx.Model.GetEntityTypes().FirstOrDefault(e => e.FullName().IndexOf(tableName) >= 0);
Given that, how do I retrieve a list of data, for example entity.ToList()? The type returned for entity is IEntityType.
As an alternative (and my preferred way if possible), I have created an array of table types using reflection (they all inherit from BaseTable); they are stored as a list.
I would like to create a DbSet<> using DbContext.Set() so that I can use Find(), AsNoTracking() and other such methods (including write operations).
I have the following:
IQueryable<Object> dbSet = (IQueryable<Object>)ctx
    .GetType()
    .GetMethod("Set", 1, Type.EmptyTypes)
    .MakeGenericMethod(t)
    .Invoke(ctx, null);
This allows me to do something like dbSet.ToList(), but I would really like to cast it to a DbSet.
Does anyone know if it is possible to make such a conversion?
(I am reading only a few records from sets of tables and then writing data back to a different database with the same tables.)
Update: Another way of thinking about this: I am iterating across a collection of tables. I need to pull out the PK and two other columns (whose names I have at runtime); if the value of column 1 contains a specific value, I need to update the value of column 2.
You can't really do that. However, you can cast the database value to specific types based on discriminators. Take a look at this: https://learn.microsoft.com/en-us/ef/core/modeling/value-conversions?tabs=data-annotations
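A common workaround, though, is to avoid the cast altogether (a sketch; ProcessTable and the surrounding class are illustrative, not from the question): keep the DbSet<T> strongly typed inside a generic method and close the type parameter via reflection, so the DbSet<T> is only handled where T is known.

    public static class TableProcessor
    {
        // All strongly typed work happens here, where T is known.
        public static void ProcessTable<T>(DbContext ctx) where T : class
        {
            DbSet<T> set = ctx.Set<T>();
            foreach (var row in set.AsNoTracking().ToList())
            {
                // read the PK and columns, decide what to write back, etc.
            }
        }
    }

    // Closing the generic parameter for a runtime Type t:
    typeof(TableProcessor)
        .GetMethod(nameof(TableProcessor.ProcessTable))
        .MakeGenericMethod(t)
        .Invoke(null, new object[] { ctx });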

Dynamic table name EF CORE 2.2

I want to make a universal method for working with tables. I studied these links:
Dynamically Instantiate Model object in Entity Framework DB first by passing type as parameter
Dynamically access table in EF Core 2.0
As an example, the ASP.NET Core controller for one of the SQL tables is shown below. There are many tables, and you have to implement such methods (delete, add, change) for each one:
[Authorize(Roles = "Administrator")]
[HttpPost]
public ActionResult DeleteToDB(string id)
{
    webtm_mng_16Context db = new webtm_mng_16Context();
    var Obj_item1 = (from o1 in db.IT_bar
                     where o1.id == int.Parse(id)
                     select o1).SingleOrDefault();
    if ((Obj_item1 != null))
    {
        db.IT_bar.Remove(Obj_item1);
        db.SaveChanges();
    }
    var Result = "ok";
    return Json(Result);
}
I want to get a universal method for all such operations, with the ability to change the name of the table dynamically - ideally, setting the table name as a string. I know this can be done with raw SQL, but is there really no simple way to implement this in EF Core?
Sorry, but you need to rework your model.
It is possible to do something generic as long as you have one table per type - you can go into the configuration and change the database table. OpenIddict allows that. You can override the constructors of the DbContext and do whatever you want with the object model, and that includes changing table names.
What you can also do is a generic base class taking the classes you deal with as type parameters. I have those - taking (a) the db entity type and (b) the API-side DTO type, then using some generic functions and AutoMapper to map between them.
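A minimal sketch of that idea (the controller name and DbContext wiring are illustrative; the real version described above would also take a DTO type and map with AutoMapper):

    public abstract class CrudControllerBase<TEntity> : Controller
        where TEntity : class
    {
        protected readonly DbContext Db;
        protected CrudControllerBase(DbContext db) { Db = db; }

        [Authorize(Roles = "Administrator")]
        [HttpPost]
        public ActionResult DeleteFromDb(string id)
        {
            // Find() resolves the primary key of TEntity from the model.
            var item = Db.Set<TEntity>().Find(int.Parse(id));
            if (item != null)
            {
                Db.Set<TEntity>().Remove(item);
                Db.SaveChanges();
            }
            return Json("ok");
        }
    }

    // One thin subclass per table replaces the copy-pasted actions:
    public class ITBarController : CrudControllerBase<IT_bar>
    {
        public ITBarController(webtm_mng_16Context db) : base(db) { }
    }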
But the moment you need to grab the table name dynamically, you are in a world of pain. EF's standard architecture assumes that an object type is mapped to a database entity. As such, an ID is unique within a table - the whole relational model depends on that. Id 44 has to be unique for a specific object type, not for an object plus whatever table it happened to be loaded from.
You also miss out significantly on actual logic, i.e. for delete. I hate to tell you, but while you can implement security on other layers for reading, every single one of my write/update methods is handwritten. Now, it may seem that "Authorize" works - but no, it does not. Or rather, it does if your application is "Hello World" complex. I sometimes run pages of testing code to decide whether an operation is allowed in a specific business context, and this IS specific - whether the user has set an override switch (which may or may not be valid depending on who he is) to bypass certain business rules. All of that is specific anyway.
Oh, and one more thing you can do, because you seem to have a lot of tables: do NOT write the classes by hand - generate them. Scaffolding is not that complex. I hardly remember when I last generated EF Core database classes - nowadays they all come out of Entity Developer (a tool from Devart), while the db is handled with change scripts (I work db first - I actually want to USE the database, and that means filtered indices, triggers, some sprocs and views with specific SQL), so migrations do not really work at all.
But overwriting the table name dynamically - while keeping the same object in the background - will bite you quite fast. It likely only works for extremely simplistic things - you know, "Hello World" examples - and falls apart the moment you actually have logic.

Load Entities AsNoTracking() with navigation properties, without specifying includes

I would like to know if the following scenario is possible with Entity Framework:
I want to load several tables with the AsNoTracking option, since they are all more or less static tables that cannot be changed by the user.
Those tables also happen to be navigation properties of others. Up till now I have relied on the AutoMapping (relationship fixup) feature of Entity Framework, and don't use .Include() or lazy loading.
So instead of:
var result = from x in context.TestTable
                              .Include("ChildTestTable")
             select x;
I am using it like this:
context.ChildTestTable.Load();
context.TestTable.Load();
var result = context.TestTable.Local;
This works smoothly because the application is designed so that the tables within the database are very small; there won't be a table that exceeds 600 rows (and that's already a pretty high value in my app).
Now my way of loading data isn't working with .AsNoTracking().
Is there any way to make it work? So that I can write:
context.ChildTestTable.AsNoTracking().ToList();
var result = context.TestTable.AsNoTracking().ToList();
Instead of:
var result = from x in context.TestTable.AsNoTracking()
                              .Include("ChildTestTable")
             select x;
So basically, I want to have one or more tables loaded with the AutoMapping (fixup) feature on, but without loading them into the Object State Manager. Is that a possibility?
The simple answer is no. For normal tracking queries, the state manager is used for both identity resolution (finding a previously loaded instance of a given entity and using it instead of creating a new instance) and fixup (connecting navigation properties together). When you use a no-tracking query it means that the entities are not tracked in the state manager. This means that fixup between entities from different queries cannot happen because EF has no way of finding those entities.
If you were to use Include with your no-tracking query then EF would attempt to do some fixup between entities within the query, and this will work a lot of the time. However, some queries can result in referencing the same entity multiple times and in some of those cases EF has no way of knowing that it is the same entity being referenced and hence you may get duplicates.
I guess the thing you don't really say is why you want to use no-tracking. If your tables don't have a lot of data then you're unlikely to see significant perf improvements, although many factors can influence this. (As a digression, using the ObservableCollection returned by .Local could also impact perf and should not be necessary if the data never changes.) Generally speaking you should only use no-tracking if you have an explicit need to do so since otherwise it ends up adding complexity without benefit.
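If the data truly never changes, one way out is to do the fixup manually after two no-tracking queries (a sketch; it assumes TestTable rows expose a ChildTestTableId foreign key and the child has an Id key, names being illustrative):

    var children = context.ChildTestTable.AsNoTracking().ToList();
    var childById = children.ToDictionary(c => c.Id);

    var parents = context.TestTable.AsNoTracking().ToList();
    foreach (var p in parents)
    {
        // Manual fixup: the state manager won't connect untracked
        // entities, so wire the navigation property up by key.
        p.ChildTestTable = childById[p.ChildTestTableId];
    }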

Nhibernate Polymorphic Query - Eager Load Associations Without Polymorphic Fetch

I will start by saying I have already looked thoroughly on Stack Overflow, in the nhusers group and in the documentation for a possible solution to my issue.
I need to be able to query only the base class table in parts of my multi/future query when eagerly loading associations (although from the research I have done I don't think this is possible).
I have started to map an existing schema using Fluent NHibernate as a proof of concept. I have mapped an inheritance hierarchy using table-per-subclass (the mappings all work perfectly fine, so I won't paste them all here). The hierarchy has around 15 subclasses and the base class has some additional associations, e.g.:
Base
    Dictionary<string, Attribute> Attributes
    List<EntityChange> Changes
I need to eagerly load both of the collections, as in the given scenario they are required for post-processing, and lazily loading them causes performance issues. I am eagerly loading them via a multi/future query:
var baseQuery = session.CreateCriteria<Base>("b")
    .CreateCriteria("Nested", JoinType.LeftOuterJoin)
    .CreateCriteria("Nested2", JoinType.LeftOuterJoin)
    .CreateCriteria("Nested2.AdditionalNested", JoinType.LeftOuterJoin);

var logsQuery = session.CreateCriteria<Base>("b")
    .CreateAlias("Changes", "c", JoinType.LeftOuterJoin,
        Expression.And(
            Expression.Ge("c.EntryDate", changesStartDate),
            Expression.Le("c.EntryDate", changesEndDate)))
    .AddOrder(Order.Desc("c.EntryDate"));

var attributesQuery = session.CreateCriteria<Base>("t")
    .SetFetchMode("Attributes", FetchMode.Join);

logsQuery.Future<Base>();
attributesQuery.Future<Base>();
var results = baseQuery.Future<Base>().ToList();
The queries execute and return the correct results. But just eagerly loading the associations in this manner means that the attributes and changes queries have to perform a polymorphic fetch (the addition of about 15 left outer joins per query that aren't required). I know this is required for polymorphic querying, but the base query will return the hierarchy I desire; the other parts of the multi query that issue a polymorphic query are redundant.
I haven't yet mapped the whole hierarchy, so there will be additional unnecessary joins being performed, and there are also other associations that could be loaded up front. These two combined, even without an increase in volume, will lead to performance issues. The performance of this query is currently about 6 seconds (which admittedly is better than the 20 it was taking), but by experimenting with the query and taking out the extra joins I can get it down to about 2 seconds. (This is a common query, so getting it as low as possible is beneficial, not just pleasing to me. It will also be run from multiple distributed machines, so I would rather not get into a discussion about caching / second-level caching.)
I have tried:
1. Using the class modifier in the query, 'class = Base'. I initially did this blindly, but believe it is for discriminator values; even if it feeds the CASE statement in the SQL, it will not prevent the extra joins.
2. Doing everything in a single query. This is slower than splitting it up as above and gives a cartesian product.
3. Using 'Polymorphism.Explicit();' in the fluent mappings. This has no effect, as I am using ClassMap with SubclassMaps, so it is ignored. I tried changing all the maps to ClassMaps and using Join, but this didn't give the desired behaviour.
4. Tricking NHibernate into joining the base class table onto itself for loading associations (basically loading a more concrete type to prevent the polymorphic query): I created a derived class 'BaseOnlyLoading' which uses the same table and primary key as the base class. This was obviously a hack, but I was just trying to see what's possible; NHibernate doesn't allow a class and its subclass to use the same table.
5. Defining the BaseOnlyLoadingMap as a ClassMap with the same associations as the BaseMap, with a join back onto Base. This was hopeful, as association collections are resolved in the context based on the full type name.
6. Using an interceptor which modifies the SQL before it's executed. I wouldn't use this in production and just tried it out of interest. I passed an interceptor into a local session; this caused issues and I didn't proceed.
7. The HQL 'Type' query operator, as explained here, although I am not sure this has been implemented in the .NET version and it might behave similarly to 1.
8. There is a comment on the highest-rated answer of "How to perform a non-polymorphic HQL query in Hibernate?" which suggests overriding IsExplicitPolymorphism on the persister. I had a quick look, and from what I remember the persister was either global per entity or created in the SessionImpl from a static factory, which would prevent doing this. Even if it were possible, I am not sure what side effects it would have.
9. I tried using some SQL to load everything, but even if I use a stored proc I am not sure how NHibernate would piece the graph back together. Maybe I could specify all the entities and aliases?
Specifying explicit polymorphism per query would be nice. Any suggestions?
Thanks in advance.

Using Attach with Linq To Sql and Stored Procs

I am trying to use the Attach method to update an entity that was retrieved via a stored proc.
The stored proc is set up to return a specific instance, which is present in my dbml. The retrieval works as expected and returns a fully populated object. The reason I need to use a stored proc is that I need to update a property on that entity at the same time that it is retrieved.
After I have retrieved this entity, I am mapping it using AutoMapper to another model which is used in another tier of the app. This tier performs a few operations, makes a change to the entity, and passes it back to the repository for updating.
The repository converts this business model back into a database model and attempts to attach it to the DataContext in order to take advantage of the automagic updating.
No matter what combination of Attach(entity, true), Attach(entity), etc. I try, I get messages like "Row not found or changed" or "Unable to add an entity with the same primary key".
Does anyone have any experience with the Attach method and how it can be used to update entities that did not necessarily come from the data context via query syntax (i.e. in this case a stored proc)?
Thanks a lot
First, if you are creating a copy of the object, making changes and then trying to attach the copied object to the same DataContext as the one containing the original object, this will probably result in the "Unable to add an entity with the same primary key" message. One way to handle this is:
1. Get object from DataContext
2. Make changes and map object (or vice versa - whatever order)
3. Update the original object with the new values made in the other tier
4. SubmitChanges on the DataContext containing the original object
or
1. Get the object from a DataContext and close the DataContext
2. Make your changes and do your mapping
3. Retrieve the object from the DataContext to which you want to save
4. Update that object with the values from your mapped object
5. SubmitChanges
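A sketch of that second sequence (MyDataContext, Tags and the property names are assumed for illustration, not taken from the question):

    using (var db = new MyDataContext())
    {
        // Step 3: retrieve the row on the context you will save with.
        var original = db.Tags.Single(t => t.Id == model.Id);

        // Step 4: copy the changed values onto the tracked instance;
        // L2S diffs against the originally loaded state on update.
        original.Name = model.Name;
        original.Status = model.Status;

        // Step 5: submit on the context that owns the tracked object.
        db.SubmitChanges();
    }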
Alternately, when you say you are using the proc because you need to update a property at the same time that you retrieve it, I'd need to see the proc, but if you are somehow committing this update after retrieving the information, then indeed the message "row not found or changed" is correct. This would be hard to do, but you could do it if you're loading the data into a temp table, doing the update, and then using a select from the temp table to populate the object. One thing you could try is setting that property, in the L2S designer, to AutoUpdate = Never and see if that makes the problem go away. If so, this is your problem.
Two things to check: 1: is it the same data-context, and 2: is it the same entity instance (or one that looks like it)?
This would only happen for the same data-context, I suspect. If it is the same entity, then it is already there; just call SubmitChanges. Otherwise, either use a second data-context or detach the original entity.
So if I retrieved the entity via a stored proc, is it being tracked by the datacontext?
The thing is, I'm going from the data model to another model that is used by another component, and then back. It's not really the same instance, but it does have all the same properties.
i.e.:
public Models.Tag GetEntity()
{
    var dbTag = db.PROJ_GetEntity(
        (int)EntityStatuses.Created,
        (int)EntityStatuses.CreatingInApi).SingleOrDefault();
    return FromDbEntity(dbTag);
}

var appModel = GetEntity(); // gets an Entity from a stored proc (NOT GetEntity_RESULT)
appModel.MakeSomeChanges();
_Repo.Persist(appModel);

public void Persist(Models.AppModel model)
{
    var dbEntity = Mapper.Map(model);
    db.Attach(dbEntity);
    db.SubmitChanges();
}

This is somewhat pseudo-code-like, but it demonstrates pretty much exactly what I am doing.
Thanks
I'm upvoting weenet's answer because he's right - you can't use Attach to apply the changes.
Unlike Entity Framework, you can only attach an L2S object to a DataContext if it has never been attached before - i.e. it's a newly created entity that you want to Insert into a table.
This does cause numerous problems in multi-layered environments; however, I've been able to get around many of the issues by creating a generic entity synchronisation system which uses reflection and expression trees.
After an object has been modified, I run the dynamic delegate against a fresh object from the DC and the modified object, so that only the differences are tracked in the DC before generating the Update statement. It does get a bit tricky with related entities, though.
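A rough sketch of that synchronisation idea (reflection-only; the expression-tree variant mentioned above compiles the same comparison into a cached delegate for speed):

    static void CopyDifferences<T>(T modified, T tracked)
    {
        var props = typeof(T).GetProperties()
            .Where(p => p.CanRead && p.CanWrite);
        foreach (var prop in props)
        {
            var newValue = prop.GetValue(modified, null);
            // Only touch properties that actually changed, so the
            // DataContext records just those differences for the UPDATE.
            if (!Equals(newValue, prop.GetValue(tracked, null)))
                prop.SetValue(tracked, newValue, null);
        }
    }

    // Usage: 'tracked' comes fresh from the DataContext, 'modified' is
    // the mapped object from the upper tier; then call SubmitChanges().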
