Mapping List<dynamic> using Slapper - c#

I've got the following code snippet in a repository class, using Dapper to query and Slapper.Automapper to map:
class MyPocoClass
{
    public int MyPocoClassId;
    ...
}

// later:
var results = connection.Query<dynamic>("select MyPocoClassID, ...");
return AutoMapper.MapDynamic<MyPocoClass>(results).ToList();
The results variable above contains many items, but the list returned by AutoMapper.MapDynamic has only one item (which is clearly wrong). However, I found that adding the following configuration to AutoMapper fixes the problem:
AutoMapper.Configuration.AddIdentifier(typeof(MyPocoClass), "MyPocoID");
Why does Slapper.AutoMapper need to know the key of my class simply to map one list to another? Is it trying to eliminate duplicates? I'll also note that this only happens while mapping a certain one of my POCOs (so far)... and I can't figure out why this particular POCO is special.

Turns out this is a bug in Slapper.AutoMapper.
The library supports case-insensitive mapping and convention-based keys. The SQL result set has MyPocoClassID and the class itself has MyPocoClassId, which is not a problem for Slapper.AutoMapper as far as mapping goes. But internally, Slapper.AutoMapper decides (by convention) that MyPocoClassId is the identifier of MyPocoClass, and it can't find that field in the result set. The library uses that key to eliminate duplicates in the output list (for some reason), and since the identifier values are all null/empty, every row is treated as the same record and we get only one item back.
I may submit a pull request to fix this problem, but since the library appears to be unmaintained I don't think it'll help.
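For reference, here's a minimal sketch of the workaround, assuming the fix is simply to register the identifier with the exact casing the result set uses (the MyPocoClassID column name comes from the explanation above; adjust it to whatever your query actually returns):

// Register the identifier using the casing of the SQL column, not the C# member,
// so Slapper's duplicate detection has a real value to group on.
Slapper.AutoMapper.Configuration.AddIdentifier(typeof(MyPocoClass), "MyPocoClassID");

var rows = connection.Query<dynamic>("select MyPocoClassID, ...");
return Slapper.AutoMapper.MapDynamic<MyPocoClass>(rows).ToList(); // one item per distinct MyPocoClassID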

Get column name from dynamic query

I have a problem and it is taking me too long to solve. I've searched a lot of websites, including other questions here on Stack Overflow, and nothing has worked so far.
My problem is: I need to use the SqlQuery() method from Entity Framework to run generic queries and return the results in dynamically created objects (using C# reflection, for instance). The issue is easy to solve when the query specifies the columns to be returned: with those, I can create my object using the column names and everything works. But when I get a query requesting all columns (like 'select * from mytable'), I need a way to get the names of the columns from the result, so that I can access the data.
I have not found a way to do that yet. All the posts and articles I have read on this assume that I already know the structure of the returned object, even those which use reflection.
Any help?
You can do it with a DataTable, if I'm remembering the method correctly:
DataTable.Columns[index].Caption
and you can get the column count with:
DataTable.Columns.Count
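A small sketch of that approach, assuming a plain ADO.NET connection (the connection string and the 'select * from mytable' query are placeholders); once the DataTable is filled, the column names are available without knowing the schema in advance:

// Requires System.Data and System.Data.SqlClient.
var table = new DataTable();
using (var connection = new SqlConnection(connectionString))
using (var adapter = new SqlDataAdapter("select * from mytable", connection))
{
    adapter.Fill(table); // Fill opens and closes the connection as needed
}

for (int i = 0; i < table.Columns.Count; i++)
{
    string columnName = table.Columns[i].Caption; // or .ColumnName
    // use columnName (plus table.Rows[r][i]) to build your dynamic object via reflection
}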

NHibernate Polymorphic Query - Eager Load Associations Without Polymorphic Fetch

I will start by saying I have already looked thoroughly on Stack Overflow, in nhusers and in the documentation for a possible solution to my issue.
I need to be able to query only the base class table in parts of my multi/future query when eagerly loading associations (although from the research I have done I don't think this is possible)
I have started to map an existing schema using fluent nhibernate as a proof of concept. I have mapped an inheritance hierarchy using table per sub class (The mappings all work perfectly fine so I won't paste them all in here). The hierarchy has around 15 sub classes and the base class has some additional associations. E.g.
Base
    Dictionary<string, Attribute> Attributes
    List<EntityChange> Changes
I need to eagerly load both of the collections as in the given scenario they are required for post processing and lazily loading them causes performance issues. I am eagerly loading them by a multi / future query:
var baseQuery = session.CreateCriteria<Base>("b")
    .CreateCriteria("Nested", JoinType.LeftOuterJoin)
    .CreateCriteria("Nested2", JoinType.LeftOuterJoin)
    .CreateCriteria("Nested2.AdditionalNested", JoinType.LeftOuterJoin);

var logsQuery = session.CreateCriteria<Base>("b")
    .CreateAlias("Changes", "c", JoinType.LeftOuterJoin,
        Expression.And(Expression.Ge("c.EntryDate", changesStartDate), Expression.Le("c.EntryDate", changesEndDate)))
    .AddOrder(Order.Desc("c.EntryDate"));

var attributesQuery = session.CreateCriteria<Base>("t")
    .SetFetchMode("Attributes", FetchMode.Join);

logsQuery.Future<Base>();
attributesQuery.Future<Base>();
var results = baseQuery.Future<Base>().ToList();
The queries execute and return the correct results. But eagerly loading the associations in this manner means that the attributes and changes queries each have to perform a polymorphic fetch (the addition of about 15 left outer joins per query that aren't required). I know this is required for polymorphic querying, but the base query will return the hierarchy that I desire; the other parts of the multi query that issue a polymorphic query are redundant.
I haven't yet mapped the whole of the hierarchy, so there will be additional unnecessary joins being performed, and there are also other associations that could be loaded up front. These two combined, even without an increase in volume, will lead to performance issues. This query currently takes about 6 seconds (which admittedly is better than the 20 seconds the existing approach takes), but by messing around a bit with the query and taking out the extra joins I can get it down to about 2 seconds. This is a common query, so getting it as low as possible is beneficial, not just pleasing to me. It will also be run from multiple distributed machines, so I would rather not get into a discussion about caching / 2nd-level caching.
I have tried:

1. Using the class modifier in the query, 'class = base'. I initially did this blindly, but I believe it is for discriminator values. Even if it affects the case statement in the SQL, it will not prevent the extra joins.
2. Doing everything in a single query. This is slower than splitting it up as above and gives the cartesian product.
3. Using 'Polymorphism.Explicit();' in the fluent mappings. This has no effect as I am using ClassMap with SubclassMaps, so it is ignored. I tried changing all the maps to ClassMaps and using Join, but this didn't give the desired behaviour.
4. Tricking NHibernate into joining the base class table onto itself for loading associations (basically loading a more concrete type to prevent the polymorphic query) by creating a derived class 'BaseOnlyLoading' which uses the same table and primary key as the base class. This was obviously a hack, but I was just trying to see what's possible. NHibernate doesn't allow a class and its subclass to use the same table.
5. Defining the BaseOnlyLoadingMap as a ClassMap with the same associations as the BaseMap, with a join back onto Base. This was hopeful, as association collections are resolved in the context based on the full type name.
6. Using an interceptor which modifies the SQL before it's executed. I wouldn't use this in production and only tried it out of interest. I passed an interceptor into a local session. This caused issues and I didn't proceed.
7. The HQL 'Type' query operator, as explained here, although I am not sure this has been implemented in the .NET version and it might behave similarly to 1.
8. There is a comment on the highest rated answer (How to perform a non-polymorphic HQL query in Hibernate?) which suggests overriding IsExplicitPolymorphism on the persister. I had a quick look, and from what I remember the persister was either global per entity or created in the SessionImpl from a static factory, which would prevent doing this. Even if it were possible, I am not sure what sort of side effects it would have.
9. Using some SQL to load everything, but even if I use a stored proc I am not sure how NHibernate will piece the graph back together. Maybe I could specify all the entities and aliases?

Specifying explicit polymorphism per query would be nice. Any suggestions?
Thanks in advance.

Sorting on related field in LLBLGen 2.6

I inherited an application that uses LLBLGen 2.6. I have a PersonAppointmentType entity that has an AppointmentType property (n:1 relation). Now I want to sort a collection of PersonAppointmentTypes on the name of the AppointmentType. I tried this so far in the Page_Load:
if (!Page.IsPostBack)
{
    var p = new PrefetchPath(EntityType.PersonAppointmentTypeEntity);
    p.Add(PersonAppointmentTypeEntity.PrefetchPathAppointmentType);
    dsItems.PrefetchPathToUse = p;

    // dsItems.SorterToUse = new SortExpression(new SortClause(PersonAppointmentTypeFields.StartDate, SortOperator.Ascending)); // This works
    dsItems.SorterToUse = new SortExpression(new SortClause(AppointmentTypeFields.Name, SortOperator.Ascending));
}
I'm probably just not getting it.
EDIT:
Phil put me on the right track, this works:
if (!Page.IsPostBack)
{
    dsItems.RelationsToUse = new RelationCollection(PersonAppointmentTypeEntity.Relations.AppointmentTypeEntityUsingAppointmentTypeId);
    dsItems.SorterToUse = new SortExpression(new SortClause(AppointmentTypeFields.Name, SortOperator.Ascending));
}
You'll need to share more code if you want an exact solution. You didn't post the code where you actually fetch the entity (or collection). This may not seem relevant but it (probably) is, as I'm guessing you are making a common mistake that people make with prefetch paths when they are first trying to sort or filter on a related entity.
You have a prefetch path from PersonAppointmentType (PAT) to AppointmentType (AT). This basically tells the framework to fetch PATs as one query, then, after that query completes, to fetch ATs based on the results of the PAT query. LLBLGen takes care of all of this for you and wires the objects together once the queries have completed.
What you are trying to do is sort the first query by the entity you are fetching in the second query. If you think in SQL terms, you need a join from PAT => AT in the first query. To achieve this, you need to add a relation (join) via a RelationPredicateBucket and pass that as part of your fetch call.
It may seem counter-intuitive at first, but relations and prefetch paths are completely unrelated (although you can use them together). You may not even need the prefetch path at all; it may be that you ONLY need the relation and sort clause added to your fetch code (depending on whether you actually want the AT entity in your graph, vs. just the ability to sort by its fields).
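For illustration only, here's a rough sketch of what such a fetch might look like using the self-servicing API (to match the code in your question). The collection class name and the exact GetMulti overload are assumptions on my part, not taken from your project:

// Hypothetical generated collection class for PersonAppointmentType.
var appointments = new PersonAppointmentTypeCollection();

// Join PAT => AT so AppointmentType.Name can be used in the ORDER BY.
var relations = new RelationCollection(
    PersonAppointmentTypeEntity.Relations.AppointmentTypeEntityUsingAppointmentTypeId);

var sorter = new SortExpression(
    new SortClause(AppointmentTypeFields.Name, SortOperator.Ascending));

// null filter, 0 = no row limit; overload assumed: (filter, maxItems, sorter, relations).
appointments.GetMulti(null, 0, sorter, relations);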
There is a very good explanation of prefetch paths and how they work here:
http://www.llblgening.com/archive/2009/10/prefetchpaths-in-depth/
Post the remainder of your fetch code and I may be able to give you a more exact answer.

Reading several tables from a single Entity Framework ExecuteStoreQuery request

I have a library which uses EF4 for accessing a SQL Server data store. For different reasons, I have to use SQL Server specific syntax to read data from the store (for free text search), so I have to create the SQL code by hand and send it through the ExecuteStoreQuery method.
This works fine, except that the query uses joins to request several tables alongside the main one (the main one being the one I specify as the target entity set when calling ExecuteStoreQuery), and EF never fills the main entity's relationship properties with the other tables' data.
Is there anything special to do to fill up these relationships? Using other EF methods or using special table names in the query or something?
Thanks for your help.
Executing direct SQL follows a very simple rule: it uses columns from the result set to fill the properties with the same names in the materialized entity. I think I read somewhere that this works only with the main entity you materialize (the entity type defined in ExecuteStoreQuery = no relations), but I can't find it now. I did several tests and it really doesn't populate any relations.
OK, so I'll write here what I ended up doing. It does not look like a perfect solution, but it does not seem that there is any perfect solution in this case.
As Ladislav pointed out, ExecuteStoreQuery (as well as the other "custom query" method, Translate) only maps the columns of the entity you specify, leaving all the other columns aside. Therefore I had to load the dependencies separately, like this:
// Execute
IEnumerable<MainEntity> result = context.ExecuteStoreQuery<MainEntity>(strQuery, "MainEntities", MergeOption.AppendOnly, someParams).ToArray();

// Load relations, first method
foreach (MainEntity e in result)
{
    if (!e.Relation1Reference.IsLoaded)
        e.Relation1Reference.Load();
    if (!e.Relation2Reference.IsLoaded)
        e.Relation2Reference.Load();
    // ...
}

// Load relations, second method
// The main entity contains a navigation property pointing
// to a record in the OtherEntity entity
foreach (OtherEntity e in context.OtherEntities)
    context.OtherEntities.Attach(e);
There. I think these two techniques have to be chosen depending on the number and size of the generated requests. The first technique generates a one-record request for every required side record, but no unnecessary records are loaded. The second technique uses fewer requests (one per table) but retrieves all the records, so it uses more memory.

Using Unmapped Class with NHibernate Named Query

I'm using a custom named query with NHibernate which I want to return a collection of Person objects. The Person object is not mapped with an NHibernate mapping which means I'm getting the following exception:
System.Collections.Generic.KeyNotFoundException: The given key was not present in the dictionary.
It's getting thrown when the Session gets created because it can't find the class name when it calls NHibernate.Cfg.Mappings.GetClass(String className). This is all fairly understandable but I was wondering if there was any way to tell NHibernate to use the class even though I haven't got a mapping for it?
Why don't you use:
query.SetResultTransformer(Transformers.AliasToBean(typeof(Person)));
It will insert data from each column in your query into the Person object's properties, using the column alias as the property name.
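Roughly, assuming a named SQL query (the name 'FindPeople' and the Person properties here are placeholders), that would look like:

// Transformers lives in NHibernate.Transform. The SQL behind the named query
// must alias its columns to match the Person property names (e.g. "... as Name").
IList<Person> people = session.GetNamedQuery("FindPeople")
    .SetResultTransformer(Transformers.AliasToBean(typeof(Person)))
    .List<Person>();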
How can you create a query which would return instances of a type that is not mapped?
I think Michal has a point here, and maybe you should have a look at projections. (At least, this is what I think you're looking for).
You create a query on some mapped type, and then, you can 'project' that query to a 'DTO'.
In order to do this, you'll have to 'import' your Person class, so that it is known to NHibernate, and you'll have to use a ResultTransformer.
Something like this:
ICriteria crit = session.CreateCriteria(typeof(Person));
// set some filter criteria
crit.SetProjection(Projections.ProjectionList()
    .Add(Projections.Property("Name"), "Name")
    .Add(Projections.Property( ... ))
);
crit.SetResultTransformer(Transformers.AliasToBean(typeof(PersonView)));
return crit.List<PersonView>();
But, this still means you'll have to import the class, so that NHibernate knows about it.
By using the class, NHibernate would basically be guessing about everything involved, including which table you meant to use for Person and the field mappings. NHibernate could probably be hacked to do dynamic binding based on matching names or something, but the whole idea is to create the mappings from plain old data objects to database fields using the XML files.
If there's not a really good reason not to map the class, simply adding the mapping will give you the best results...
That said, you can't use a named query to directly inject results into an unmapped class. You would need to tell it which columns to put into which fields or in other words, a mapping. ;) However, you can return scalar values from a named query and you could take those object arrays and build your collection manually.
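A sketch of that last option, assuming the named query returns plain scalar columns (the query name, column order and Person properties here are made up for illustration):

var people = new List<Person>();
// Each row of a multi-column scalar query comes back as an object[].
foreach (object[] row in session.GetNamedQuery("FindPeople").List())
{
    people.Add(new Person
    {
        Name = (string)row[0],
        Age = Convert.ToInt32(row[1])
    });
}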
To solve this, I ended up using the TupleToPropertyResultTransformer and providing the list of property values. There are a few limitations to this, the main one being that the SQL query must return the results in the same order as you provide your properties to the TupleToPropertyResultTransformer constructor.
Also, the property types are inferred, so you need to be careful with decimal columns returning only integer values, etc. Apart from that, using the TupleToPropertyResultTransformer provided a reasonably easy way to use a SQL query to return a collection of objects without explicitly mapping them within NHibernate.
