AutoMapper to copy EF connected object without lines and lines of code - C#

This is a bit of a long one, so get yourself a coffee (other quality libations are available).
In an effort to write as little code as possible, to keep my apps simple, testable, maintainable and clean, and to turn apps around quickly, I'm using a really simple method I wrote to copy from MVC objects to EF objects. It saves me writing loads of code when I have objects with loads of properties. In fact, I don't care what the object is or how many properties it has; I just want to copy it without prepping loads of code in a map or somewhere.
Please don't start off on view models and all that, or quote me the big book of Microsoft. All I'm looking for is a little advice from my peers and the community in general about AutoMapper. The example here is simple so you can see what I'm getting at.
What I didn't want to do, and I've seen it lots, was this:
item ... // the original item, populated from somewhere - MVC, database, who cares - it's got stuff in it
Item newItem = new Item();
newItem.prop1 = item.prop1;
newItem.prop2 = item.prop2;
newItem.prop3 = item.prop3;
newItem.prop4 = item.prop4;
//... you get the idea
or even this ...
Item newItem = new Item {
prop1 = item.prop1,
prop2 = item.prop2,
prop3 = item.prop3,
prop4 = item.prop4,
//... you get the idea
};
So I came up with this: a function called CopyObject that does exactly what I want it to do, so I don't have to care about any object or how many properties it has. I write one line of code anywhere I need to, and it does all the work for me. See the example below.
//show the item in a view, typically a bootstrap popup dialog
public IActionResult EditItem(int ID)
{
Item item = _dbContext.Items.Where(i => i.ID == ID).FirstOrDefault();
if (item == null)
item = new Item { ... property defaults ... };
return View(item);
}
//save the item from the view
[HttpPost]
public JsonResult EditItem(Item item)
{
Item newItem = _dbContext.Items.Where(i => i.ID == item.ID).FirstOrDefault();
if (newItem == null)
{
newItem = new Item { ... property defaults ... };
_dbContext.Items.Add(newItem);
}
//here is the fun part
CopyObject(item, newItem, ...ignore some properties);
_dbContext.SaveChanges();
return new JsonResult( new { result = "success", message = "Updated" });
}
CopyObject is my function. It does nothing clever: it uses reflection to copy properties from one object to another (EF) object without losing the connection to EF. Its signature looks like this (below); I won't bore you with the implementation, but in short, it uses reflection to copy properties between any two objects.
At the moment it only copies the top level, because that's all I need it to do right now, but it wouldn't be a massive stretch to get it to copy a hierarchy of stuff.
It doesn't actually care whether the object types match, and it doesn't care whether the property types match. It only cares that it finds properties on each object with the same name, and it will then attempt to copy the values. You can also specify properties not to copy.
/// <summary>
/// Copies the values of the properties from one object to another object with the same properties if they exist.
/// This will try to copy from any two objects whether they are the same object type or not
/// </summary>
/// <param name="CopyFrom">The object to copy property data from</param>
/// <param name="CopyTo">The object to copy property data to</param>
/// <param name="DontCopy">A (string) list field names not to be copied</param>
/// <returns>True if at least one property was copied, otherwise false</returns>
public bool CopyObject(object CopyFrom, object CopyTo, params string[] DontCopy) { ... }
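For context, a minimal sketch of what a reflection-based copier along these lines might look like (an illustration only, not the author's actual implementation; it assumes using System.Linq and System.Reflection, and only copies same-named properties with assignment-compatible types, whereas the version described above also attempts copies across differing types):
public bool CopyObject(object CopyFrom, object CopyTo, params string[] DontCopy)
{
    // Writable public properties on the target, excluding the ones we were told to skip
    var targets = CopyTo.GetType()
        .GetProperties(BindingFlags.Public | BindingFlags.Instance)
        .Where(p => p.CanWrite && !DontCopy.Contains(p.Name))
        .ToDictionary(p => p.Name);
    bool copiedAnything = false;
    foreach (var source in CopyFrom.GetType().GetProperties(BindingFlags.Public | BindingFlags.Instance))
    {
        // Copy only when a same-named, type-compatible property exists on the target
        if (targets.TryGetValue(source.Name, out var target) &&
            target.PropertyType.IsAssignableFrom(source.PropertyType))
        {
            target.SetValue(CopyTo, source.GetValue(CopyFrom));
            copiedAnything = true;
        }
    }
    return copiedAnything;
}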
There is nothing wrong with my code; it works perfectly fine, just the way I need it to. I don't have any extra work to do when I start a new project, no matter how many properties any of the objects have. I just import that one function.
Anyway, because I'm not published or any kind of authority, my code is getting frowned upon. I've been told AutoMapper can do the same thing, but I can't seem to make it work. I always get a disconnected object that I then have to do some tomfoolery with to get back into EF and ultimately the database.
So my question is: how do you achieve the same thing using AutoMapper without lots of code? Remember, my goal is not to have to write loads of code, in prep or inline.

AutoMapper can ignore properties (such as Name in the following example) by using a profile like the one below:
public class MyProfile : Profile
{
public MyProfile ()
{
CreateMap<Item, Item>().ForMember(x => x.Name, opt => opt.Ignore());
}
}
Your Action:
var newItem = _mapper.Map<Item, Item>(item);
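If the goal is to keep the EF-tracked instance rather than producing a new, disconnected object, map onto the existing entity with the Map(source, destination) overload instead:
// newItem is the entity loaded from (and tracked by) the DbContext
_mapper.Map(item, newItem);
_dbContext.SaveChanges();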
Refer to
Ignore mapping one property with Automapper
https://medium.com/ps-its-huuti/how-to-get-started-with-automapper-and-asp-net-core-2-ecac60ef523f

Related

Why won't two views of the same type correctly return the current row?

On a custom screen, I have two identical views defined like this:
public PXSelectJoin<MXMix,
InnerJoin<MXBatch, On<MXBatch.batchID, Equal<MXMix.batchID>>,
InnerJoin<SOLine, On<SOLine.orderType, Equal<MXBatch.sOOrderType>,
And<SOLine.orderNbr, Equal<MXBatch.sOOrderNbr>,
And<SOLine.lineNbr, Equal<MXBatch.sOLineNbr>>>>,
InnerJoin<Customer, On<Customer.bAccountID, Equal<SOLine.customerID>>,
LeftJoin<Address, On<Address.addressID, Equal<Customer.defAddressID>>,
LeftJoin<INLocation, On<INLocation.locationID, Equal<MXMix.locationID>>>>>>>> MixesBlk;
public PXSelectJoin<MXMix,
InnerJoin<MXBatch, On<MXBatch.batchID, Equal<MXMix.batchID>>,
InnerJoin<SOLine, On<SOLine.orderType, Equal<MXBatch.sOOrderType>,
And<SOLine.orderNbr, Equal<MXBatch.sOOrderNbr>,
And<SOLine.lineNbr, Equal<MXBatch.sOLineNbr>>>>,
InnerJoin<Customer, On<Customer.bAccountID, Equal<SOLine.customerID>>,
LeftJoin<Address, On<Address.addressID, Equal<Customer.defAddressID>>,
LeftJoin<INLocation, On<INLocation.locationID, Equal<MXMix.locationID>>>>>>>> MixesBag;
(For what it's worth, the page primary view is NOT of the same type.)
Each of these has a data view delegate; the delegates are also identical except for one condition which separates bags from bulk.
These views are tied to two separate data grids, which correctly display the records matching the data view delegate conditions (fig 1).
The issue is, when I attempt to get the selected row from either grid, it simply returns the first row from the database regardless of the data view delegate conditions.
Also, I have SyncPosition enabled on both grids.
Am I doing something incorrectly, or is this somehow a limitation in Acumatica?
The Current property of your view actually maps to the Caches dictionary of the graph behind the scenes. Here's how the Current property is implemented on PXSelect:
/// <summary>Gets or sets the <tt>Current</tt> property of the cache that
/// corresponds to the DAC specified in the type parameter.</summary>
public virtual Table Current
{
get
{
return (Table)View.Cache.Current;
}
set
{
View.Cache.Current = value;
}
}
And PXView.Cache implementation is:
/// <summary>Gets the cache corresponding to the first DAC mentioned in
/// the BQL command.</summary>
public virtual PXCache Cache
{
get
{
if (_Cache == null || _Graph.stateLoading)
{
_Cache = Graph.Caches[CacheType];
}
return _Cache;
}
}
MixesBlk.Current and MixesBag.Current both use the same underlying cache, so retrieving the Current property of either view is equivalent to Caches[typeof(MXMix)].Current.
The solution you took is the same one I would have used -- by creating an inherited type, you ensure that both views will have a different cache in the graph. You will find this pattern used in a couple of places in Acumatica.
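A rough sketch of that pattern (the derived DAC name here is made up): declare a trivial subclass of MXMix and use it as the main DAC of the second view, so each view resolves to its own cache entry in the graph.
// Exists only so the second view gets its own cache slot in the graph
public class MXMixBag : MXMix
{
    // Field classes referenced in this view's BQL are typically redeclared,
    // as Acumatica does for DACs such as SOLine2
    public new abstract class batchID : PX.Data.IBqlField { }
    public new abstract class locationID : PX.Data.IBqlField { }
}
// The second view is then declared over MXMixBag instead of MXMix, e.g.:
// public PXSelectJoin<MXMixBag, InnerJoin<MXBatch, On<MXBatch.batchID, Equal<MXMixBag.batchID>>, ...>> MixesBag;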

Merging C# objects with rules

I have some rather large files around that I would like to load in automatically with deserialization and then simply merge together, if possible maintaining memory references in a central object to make the merge as benign as possible.
Merges seem to be anything but simple, though; in concept it seems easy:
If there are some nulls, use the non-null values.
If there are conflicting objects, dig deeper and merge their internals, or just prefer one; maybe write custom code just for those classes.
If there are collections, definitely combine them; if there are keys, like in a dictionary, try to add them, and when there is a key conflict, as before, merge them or prefer one.
I've seen a lot of people around Stack Overflow recommending I use AutoMapper to try and achieve this, though that seems flawed. AutoMapper isn't made for this, and the overall task doesn't seem complex enough to warrant it. It's also not great encapsulation to put all your class-specific merge code anywhere but that class. Data pertaining to a given aspect of your code should sit in a central location, like the class itself, to enable programmers to understand the usage of the data structure around them more readily. So I don't feel that AutoMapper is a good solution for merging objects rather than simply keeping one.
How would you recommend automating the merge of two structurally identical C# objects with nested hierarchies of custom classes?
I will post my own solution as well, but I encourage other developers, certainly many more intelligent than I, to recommend solutions.
While JodySowald's answer describes a nice generic approach, merging sounds to me like something that could involve an awful lot of class-specific business logic.
I would simply add a MergeWith method to each and every class in my hierarchy, down to a level where "merging" means a simple repeatable generic operation.
class A
{
    public string Description { get; set; }
    public B MyB { get; set; }

    public void MergeWith(A other)
    {
        // Simple local logic
        Description = string.IsNullOrWhiteSpace(Description) ? other.Description : Description;
        // Let class B do its own logic (B.MergeWith is assumed to return the merged B)
        MyB = MyB.MergeWith(other.MyB);
    }
}
I think that in ~70% of use cases, someone will have a large hierarchical structure of many classes in a class library and will wish to be able to merge the whole hierarchy at once. For that purpose I think the code should iterate across the properties of the object and the nested properties of subclasses, but only the ones defined in the assembly you've created. No merging the internals of System.String; who knows what could happen. So only types internal to this assembly should be dug into for further merging:
var internalTypes = Assembly.GetExecutingAssembly().DefinedTypes;
We also need a way to define custom code on a given class; there are always edge cases. I believe this is what interfaces were created for: to generically define functionality for several classes and have a specific implementation available to each specific class. But I found that if merging requires knowledge of the data hierarchically above this class, such as the key it is stored with in a dictionary or perhaps an enum indicating the types or modes of data present, a reference to the containing data structure should be available. So I defined a quick interface, ICombinable:
internal interface ICombinable
{
/// <summary>
/// use values on the incomingObj to set correct values on the current object
/// note that merging comes after the individual DO has been loaded and populated as necessary and is the last step in adding the objects to the central DO that already exists.
/// </summary>
/// <param name="containingDO"></param>
/// <param name="incomingObj">an object from the class being merged in.</param>
/// <returns></returns>
ICombinable Merge(object containingDO, ICombinable incomingObj);
}
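For illustration, a (hypothetical) class with class-specific merge rules might implement it like this:
internal class Ingredient : ICombinable
{
    public string Name { get; set; }
    public double Quantity { get; set; }
    public ICombinable Merge(object containingDO, ICombinable incomingObj)
    {
        var other = (Ingredient)incomingObj;
        // Class-specific rule: keep the existing name, sum the quantities
        Quantity += other.Quantity;
        return this;
    }
}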
Bringing this together into a functional piece of code basically requires a little bit of property reflection, a little bit of recursion, and a little bit of logic, all of which is nuanced, so I just commented my code instead of explaining it beforehand. Since the goal is to affect a central object and not to create a new, merged copy, this is an instance method in the base class of the data structure, but you could probably convert it to a helper method pretty easily.
internal void MergeIn(Object singleDO)
{
var internalTypes = Assembly.GetExecutingAssembly().DefinedTypes;
var mergableTypes = internalTypes.Where(c => c.GetInterfaces().Contains(typeof(ICombinable)));
MergeIn(this, this, singleDO, internalTypes, mergableTypes);
}
private void MergeIn(Object centralDORef, object centralObj, object singleObj, IEnumerable<TypeInfo> internalTypes, IEnumerable<TypeInfo> mergableTypes)
{
var itemsToMerge = new List<MergeMe>();
//all at once to open up parallelization later.
IterateOver(centralObj, singleObj, (f, t, i) => itemsToMerge.Add(new MergeMe(t, f, i)));
//check each property on these structures.
foreach (var merge in itemsToMerge)
{
//if either is null take non-null
if (merge.From == null || merge.To == null)
merge.Info.SetValue(centralObj, merge.To ?? merge.From);
//if collection merge
else if (merge.Info.PropertyType.IsGenericType && merge.Info.PropertyType.GetGenericTypeDefinition().IsAssignableFrom(typeof(IList<>)))
foreach (var val in (IList)merge.From)
((IList)merge.To).Add(val);
//if dictionary merge
else if (merge.Info.PropertyType.IsGenericType && merge.Info.PropertyType.GetGenericTypeDefinition().IsAssignableFrom(typeof(IDictionary<,>)))
{
var f = ((IDictionary)merge.From);
var t = ((IDictionary)merge.To);
foreach (var key in f.Keys)
if (t.Contains(key))
{
//if conflicted, check for icombinable
if (typeof(ICombinable).IsAssignableFrom(merge.Info.PropertyType.GenericTypeArguments[1]))
t[key] = ((ICombinable)t[key]).Merge(centralDORef, (ICombinable)f[key]);
}
else
t.Add(key, f[key]);
}
//if both non null and not collections, merge.
else if (merge.From != null && merge.To != null)
{
//check for Icombinable.
if (typeof(ICombinable).IsAssignableFrom(merge.Info.PropertyType))
merge.Info.SetValue(centralObj, ((ICombinable)merge.To).Merge(centralDORef, (ICombinable)merge.From));
//if we made the object, dig deeper
else if (internalTypes.Contains(merge.Info.PropertyType))
{
//recurse.
MergeIn(centralDORef, merge.To, merge.From, internalTypes, mergableTypes);
}
//else do nothing, keeping the original
}
}
}
private class MergeMe{
public MergeMe(object from, object to, PropertyInfo info)
{
From = from;
To = to;
Info = info;
}
public object From;
public object To;
public PropertyInfo Info;
}
private static void IterateOver<T>(T destination, T other, Action<object, object, PropertyInfo> onEachProperty)
{
foreach (var prop in destination.GetType().GetProperties(BindingFlags.Public | BindingFlags.Instance))
onEachProperty(prop.GetValue(destination), prop.GetValue(other), prop);
}
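Usage at the call site then stays a one-liner (the root type and the Deserialize helper are illustrative):
// Both objects are the same root type, loaded independently
MyDataObject central = Deserialize<MyDataObject>("first.xml");
MyDataObject incoming = Deserialize<MyDataObject>("second.xml");
central.MergeIn(incoming); // incoming data is folded into the existing central object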

Ignore null values in context update

I have a large model which has been partially updated via deserialization. Since it has only been partially updated I would like to ignore any null values when I pass this to my entity framework update. Ultimately the EntityState.Modified is set but the issue I am having is that all fields are updated. This means anything that was null is now blanked in the database.
Is it possible to change this default behavior through a setting or override a method to check for null? It seems that since the context is expecting the full model I cannot simply set only a few values.
I've verified this by mapping only what I need to modify and the same behavior occurs.
You could implement something like this.
In this case I'm using a generic repository with reflection, to iterate through the properties and exclude null values in the update method.
public virtual TEntity Update(TEntity entity)
{
dbSet.Attach(entity);
dbContext.Entry(entity).State = EntityState.Modified;
var entry = dbContext.Entry(entity);
Type type = typeof(TEntity);
PropertyInfo[] properties = type.GetProperties();
foreach (PropertyInfo property in properties)
{
if (property.GetValue(entity, null) == null)
{
entry.Property(property.Name).IsModified = false;
}
}
dbContext.SaveChanges();
return entity;
}
An alternative approach (shown below) is to attach the entity and explicitly mark only the fields you want to change as modified:
public IHttpActionResult PutProduct(int id, Product product)
{
NorthwindEntities db = new NorthwindEntities();
if (!ModelState.IsValid)
{
return BadRequest(ModelState);
}
db.Products.Attach(product);
// Only the fields you want to update, will change.
db.Entry(product).Property(p => p.ProductName).IsModified = true;
db.Entry(product).Property(p => p.UnitPrice).IsModified = true;
db.Entry(product).Property(p => p.UnitsInStock).IsModified = true;
// Only if the value is not null will the field change.
db.Entry(product).Property(p => p.UnitsOnOrder).IsModified =
product.UnitsOnOrder != null;
db.SaveChanges();
return Ok(product);
}
This is a somewhat tedious problem I've had to solve.
For lack of a more straightforward solution, I eventually decided not to try solving it downstream, at the point where EF's SaveChanges and the like get called (i.e., no need to "hook into" EF that late), but instead as far upstream / as early as possible --
that is, right after I obtain a satisfactory deserialization that is meaningful to mutate the model with, on a per-entity-instance basis (in my use case, none of the updatable properties represent relationships, only independent attributes, in E/R parlance -- YMMV).
So, I opted for a "Populate" helper, along the lines of:
static void Populate(object from, object to)
{
    var sourceType = from.GetType();
    foreach (PropertyInfo target in to.GetType().GetProperties())
    {
        // Is the property at the target object writable and *not* marked as [NotMapped]?
        var isUpdatable =
            target.CanWrite &&
            (target.GetCustomAttribute<NotMappedAttribute>(true) == null);
        if (isUpdatable)
        {
            // If so, just find the corresponding property with the same name at the source object
            // (caller is assumed responsible to guarantee that there is one, and of the same type, here)
            var source = sourceType.GetProperty(target.Name);
            var propertyType = source.PropertyType;
            var @default = propertyType.IsValueType ? Activator.CreateInstance(propertyType) : null;
            var equality = (IEqualityComparer)typeof(EqualityComparer<>)
                .MakeGenericType(propertyType)
                .GetProperty("Default", BindingFlags.Public | BindingFlags.Static)
                .GetValue(null);
            var value = source.GetValue(from);
            // Test for <property value> != default(<property type>)
            // (as we don't want to lose information on the target because of a "null"
            // (or "default(...)") coming from the source)
            if (!equality.Equals(value, @default))
            {
                target.SetValue(to, value, null);
            }
        }
    }
}
where "from" is the fresh entity instance that just got partially populated by whatever deserialization code, and where "to" is the actual target entity that lives in the DbContext (be it an EF proxy or not);
and where NotMappedAttribute is EF's usual data-annotation attribute.
You'd typically have Populate called some time after the deserialization (&/or DTO mapping) onto your "from" instance is done, but in any case before SaveChanges() gets called on your DbContext for all the "to" entities -- obviously, we assume there is a feasible 1-to-1 mapping "from" ... "to" that the caller of Populate knows about / could figure out.
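In other words, the flow at the call site looks roughly like this (entity and property names are illustrative):
// "from": the fresh instance partially populated by deserialization / DTO mapping
var from = JsonSerializer.Deserialize<Customer>(requestBody);
// "to": the actual target entity tracked by the DbContext
var to = dbContext.Customers.Find(from.Id);
Populate(from, to);      // copy only non-default values onto the tracked entity
dbContext.SaveChanges(); // persist just what actually changed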
Note I still don't know if there is a more elegant (more straightforward) way to do that, without recourse to reflection -- so, there, FWIW.
Remarks
1) The above code can (or should) be made more defensive in various ways, depending on the caller's assumptions;
2) one may want to cache those IEqualityComparer's (&/or the PropertyInfo's) for whatever (good) reason may arise -- in my case, I didn't have to;
3) finally, my understanding is that third-party libraries such as AutoMapper are also specially designed for that sort of task, if you can afford the additional dependency.
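For example, AutoMapper can be told to skip null source members with a per-member condition, which gives roughly the same effect as the Populate helper above (a sketch, not tested against this model):
var config = new MapperConfiguration(cfg =>
{
    cfg.CreateMap<Product, Product>()
       .ForAllMembers(opt => opt.Condition((src, dest, srcMember) => srcMember != null));
});
var mapper = config.CreateMapper();
// Map the deserialized object onto the tracked entity; null source members are left alone
mapper.Map(partialProduct, trackedProduct);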
HTH

Reflection, contravariance and polymorphism

I have a base class (abstract) with multiple implementations, and some of them contain collection properties of other implementations - like so:
class BigThing : BaseThing
{
/* other properties omitted for brevity */
List<SquareThing> Squares { get; set; }
List<LittleThing> SmallThings { get; set;}
/* etc. */
}
Now sometimes I get a BigThing and I need to map it to another BigThing, along with all of its collections of BaseThings. However, when this happens, I need to be able to tell if a BaseThing in a collection from the source BigThing is a new BaseThing, and thus should be Add()-ed to the destination BigThing's collection, or if it's an existing BaseThing that should be mapped to one of the BaseThings that already exist in the destination collection. Each implementation of BaseThing has a different set of matching criteria on which it should be evaluated for new-ness. I have the following generic extension method to evaluate this:
static void UpdateOrCreateThing<T>(this T candidate, ICollection<T> destinationEntities) where T : BaseThing
{
var thingToUpdate = destinationEntities.FirstOrDefault(candidate.ThingMatchingCriteria);
if (thingToUpdate == null) /* Create new thing and add to destinationEntities */
else /* Map thing */
}
Which works fine. However, I think I am getting lost with the method that deals in BigThings. I want to make this method generic because there are a few different kinds of BigThings, and I don't want to have to write methods for each, and if I add collection properties I don't want to have to change my methods. I have written the following generic method that makes use of reflection, but it is not working as I expect.
void MapThing<T>(T sourceThing, T destinationThing) where T : BaseThing
{
//Take care of first-level properties
Mapper.Map(sourceThing, destinationThing);
//Now find all properties which are collections
var collectionPropertyInfo = typeof(T).GetProperties().Where(p => typeof(ICollection).IsAssignableFrom(p.PropertyType)).ToList();
//Get property values for source and destination
var sourceProperties = collectionPropertyInfo.Select(p => p.GetValue(sourceThing)).ToList();
var destinationProperties = collectionPropertyInfo.Select(p => p.GetValue(destinationThing)).ToList();
//Now loop through collection properties and call extension method on each item
for (int i = 0; i < collectionPropertyInfo.Count; i++)
{
//These casts make me suspicious, although they do work and the values are retained
var thisSourcePropertyCollection = sourceProperties[i] as ICollection;
var sourcePropertyCollectionAsThings = thisSourcePropertyCollection.Cast<BaseThing>();
//Repeat for destination properties
var thisDestinationPropertyCollection = destinationProperties[i] as ICollection;
var destinationPropertyCollectionAsThings = thisDestinationPropertyCollection.Cast<BaseThing>();
foreach (BaseThing thing in sourcePropertyCollectionAsThings)
{
thing.UpdateOrCreateThing(destinationPropertyCollectionAsThings);
}
}
}
This compiles and runs, and the extension method runs successfully (matching and mapping as expected), but the collection property values in destinationThing remain unchanged. I suspect I have lost the reference to the original destinationThing properties with all the casting and assigning to other variables and so on. Is my approach here fundamentally flawed? Am I missing a more obvious solution? Or is there some simple bug in my code that's leading to the incorrect behavior?
Without thinking too much, I'd say you have fallen into an inheritance abuse trap, and now, trying to save yourself, you might want to consider how you can solve your problem while ditching the existing design which leads you to do such things in the first place. I know, this is painful, but it's an investment in the future :-)
That said,
var destinationPropertyCollectionAsThings =
thisDestinationPropertyCollection.Cast<BaseThing>();
foreach (BaseThing thing in sourcePropertyCollectionAsThings)
{
thing.UpdateOrCreateThing(destinationPropertyCollectionAsThings);
}
You are losing your ICollection when you use the LINQ Cast operator, which creates a new IEnumerable<BaseThing>. You can't rely on variance either, because ICollection<T> does not support it. If it did, you could get away with an "as ICollection<BaseThing>" cast, which would be nice.
Instead, you have to build the generic method call dynamically and invoke it. The simplest way is probably to use the dynamic keyword and let the runtime figure it out, as such:
thing.UpdateOrCreateThing((dynamic)thisDestinationPropertyCollection);
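One wrinkle: extension methods cannot be invoked with extension-method syntax when an argument is dynamic (the compiler reports CS1973), so in practice the call goes through the static class that declares UpdateOrCreateThing. A sketch, with "ThingExtensions" standing in for that class:
foreach (BaseThing thing in sourcePropertyCollectionAsThings)
{
    // The runtime binder picks the right UpdateOrCreateThing<T> from the
    // runtime types of the item and of the destination ICollection<T>
    ThingExtensions.UpdateOrCreateThing((dynamic)thing, (dynamic)thisDestinationPropertyCollection);
}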

List<Type> Remove

Somebody explain to me this:
I am trying to delete items from a list with matching ids contained in another list of strings.
Step 1 is as below:
I'm trying to delete Items from myListingSyncIDs where the ListingNumber matches ListingNumbers in lstListingsUpdatedIn24Hrs.
The item at [0] Equals a value from lstListingsUpdatedIn24Hrs, as shown in Step 2:
But as shown in Step 3, the Remove fails.
Then, after doing a RemoveAll(func) (Step 4), the Remove works.
Can somebody explain why the Remove(item) doesn't work, please?
Code:
myListingSyncIDs.AddRange(myListingSync.Listings);
#region Remove Listing References Fetched In The Last 24Hrs
// Listing References Fetched In The Last 24Hrs
// These will be excluded to optimise the running of the App.
// Basically meaning that a complete sync of all listings
// will only be done once every 24hrs
// So that if this is run every hr, it will not slow down the most recent additions
List<String> lstListingsUpdatedIn24Hrs = DAL.PropertyPortalDAL.GetSahtWebserviceUpdatesIn24Hrs();
List<P24SyncService.ListingSyncItem> myListingsUpdatedIn24Hrs =
lstListingsUpdatedIn24Hrs.Select(p => new P24SyncService.ListingSyncItem()
{
ListingNumber = p,
Status = P24SyncService.ListingState.AddedModified
}).ToList();
foreach (P24SyncService.ListingSyncItem myLSI in myListingsUpdatedIn24Hrs)
{
myListingSyncIDs.Remove(myLSI);
}
myListingSyncIDs.RemoveAll(p => lstListingsUpdatedIn24Hrs.Contains(p.ListingNumber));
#endregion
ListingSyncItem is:
public partial class ListingSyncItem {
private string listingNumberField;
private ListingState statusField;
/// <remarks/>
public string ListingNumber {
get {
return this.listingNumberField;
}
set {
this.listingNumberField = value;
}
}
/// <remarks/>
public ListingState Status {
get {
return this.statusField;
}
set {
this.statusField = value;
}
}
}
At a guess, your ListingSyncItem type doesn't override Equals, so List<T>.Remove doesn't know that the item to remove is "equal" to the item in your list.
Simply overriding Equals and GetHashCode appropriately (to check for the equality of ListingNumber and Status, presumably, and build a hash code based on those) should fix the problem.
For RemoveAll, you're providing a predicate. That doesn't use ListingSyncItem.Equals, which is why it's working.
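For example, adding something along these lines to ListingSyncItem (a sketch based on the two properties shown in the question) would let Remove find the matching element:
public partial class ListingSyncItem
{
    public override bool Equals(object obj)
    {
        var other = obj as ListingSyncItem;
        if (other == null)
            return false;
        return ListingNumber == other.ListingNumber && Status == other.Status;
    }
    public override int GetHashCode()
    {
        unchecked
        {
            int hash = ListingNumber != null ? ListingNumber.GetHashCode() : 0;
            return (hash * 397) ^ Status.GetHashCode();
        }
    }
}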
I can't be sure without seeing the definition of ListingSyncItem, but I'm guessing this has to do with the fact that you have two instances referring to the same item.
You know that two different instances with the same ListingNumber refer to the same conceptual object, but the List<> doesn't. By default, .NET knows two objects are identical if they share the same instance, but since you're creating a new ListingSyncItem in your internal lambda function, it's not being removed.
What you should do is implement IEquatable in your ListingSyncItem class and make it return True for two objects with the same ListingNumber. Then List will know to remove the right items from the list.
As stated, it is because you aren't overriding Equals.
By default, it will check for reference equality, which, obviously, isn't the case here.
Either override Equals and GetHashCode, or use some way of getting the correct reference (lambdas, for instance).
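For example, instead of relying on Equals, look up the reference that is actually stored in the list (by predicate) and remove that:
foreach (P24SyncService.ListingSyncItem myLSI in myListingsUpdatedIn24Hrs)
{
    // Find the instance that is really in the list, then remove that exact reference
    var match = myListingSyncIDs.FirstOrDefault(p => p.ListingNumber == myLSI.ListingNumber);
    if (match != null)
        myListingSyncIDs.Remove(match);
}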
