I have a referenced file (DLL) containing a class, which I use as my base class:
public class Group
{
public Group();
public int Id { get; set; }
public int League_Id { get; set; }
public string Name { get; set; }
public string Nationality { get; set; }
}
(Please note that the Group class contains about 30 properties; I have only shown some of them.)
Since I use Entity Framework, the ID needs to be unique. I get this data from an API call and noticed that the ID is not unique, so I have added my own class:
public class modGroup : APICall.Group
{
public modGroup()
{
modID = 0;
}
[Key]
public int modID { get; set; }
}
This setup works; EF creates the database.
What I'd like to do is get the data from the API (which is structured like the Group class), create a new modGroup(), and copy all the data from the API call into it without assigning each individual property.
List<APICall.Group> groupData = _ApiRequester.GetGroup();
using (var a = new databaseModel())
{
    foreach (APICall.Group x in groupData)
    {
        var modGroup = new Models.modGroup();
        modGroup.modID = 0;
        // fill modGroup with the rest of the properties without doing:
        // modGroup.League_Id = x.League_Id;
        // modGroup.Name = x.Name;
        // etc.
    }
}
I would use AutoMapper to map the two classes in one call; this type of situation is exactly what it was designed for. All you need to do is create a mapping configuration and then map the two classes. It's a very simple but powerful tool. For example, something like this:
var config = new MapperConfiguration(cfg =>
    cfg.CreateMap<APICall.Group, modGroup>()
       .ForMember(dest => dest.modID, opt => opt.MapFrom(src => src.Id))
       // etc. create more mappings here
);
var mapper = config.CreateMapper();
List<modGroup> modGroupList = mapper.Map<List<modGroup>>(groupData);
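To tie this back to your loop: the mapped list can then be saved in one go. (The modGroups DbSet name below is an assumption about your databaseModel context.)
using (var a = new databaseModel())
{
    // 'modGroups' is a hypothetical DbSet<Models.modGroup> on your context
    a.modGroups.AddRange(modGroupList);
    a.SaveChanges();
}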
Related
I am looking for some utility with which I can avoid writing extra lines of code.
For example:
config.CreateMap<ModelClass, DTOClass>();
Even though there is no difference between ModelClass and DTOClass, do I still need to create the map, or can AutoMapper do it by itself?
As per the comments, seemingly the Automapper team elected to remove the CreateMissingTypeMaps option in the 9.0 upgrade. Speculatively, this was because automatic map creation could lead to unexpected mappings and awkward runtime bugs, and also, it is preferable to define all maps at bootstrap time and compile them, rather than have them compiled lazily on the fly during first mapping.
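For reference, the removed behaviour looked roughly like this in AutoMapper 8.x and earlier (shown from memory of the old API; it no longer compiles against 9.0+):
// AutoMapper 8.x and earlier only - CreateMissingTypeMaps was removed in 9.0
var config = new MapperConfiguration(cfg =>
{
    cfg.CreateMissingTypeMaps = true; // maps were generated on demand at first use
});
var mapper = config.CreateMapper();
var dto = mapper.Map<DTOClass>(model); // no explicit CreateMap<ModelClass, DTOClass>() needed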
However, if you use a consistent naming scheme between your POCO class layers, you can quite easily replicate the auto-mapping capability, at bootstrap time, for all classes with the same names in the two POCO layers. For example, if your convention is:
namespace Models
{
public class MyModel
{
public int Id { get; set; }
public string Name { get; set; }
public decimal Amount { get; set; }
public DateTime Date { get; set; }
}
}
namespace Dtos
{
public class MyDto
{
public int Id { get; set; }
public string Name { get; set; }
public decimal Amount { get; set; }
public DateTime Date { get; set; }
}
}
You can then reflect out all the classes meeting your naming convention in each of the layers, and explicitly create a mapping between the matching classes:
const string modelSuffix = "Model";
const string dtoSuffix = "Dto";
var mapperConfiguration = new MapperConfiguration(cfg =>
{
// You may need to repeat this if you have Pocos spread across multiple assemblies
var modelTypes = typeof(MyModel).Assembly
.GetTypes()
.Where(type => type.Namespace == "Models" && type.Name.EndsWith(modelSuffix))
.ToDictionary(t => StripSuffix(t.Name, modelSuffix));
var dtoTypes = typeof(MyDto).Assembly
.GetTypes()
.Where(type => type.Namespace == "Dtos"
&& type.Name.EndsWith(dtoSuffix));
foreach (var dtoType in dtoTypes)
{
if (modelTypes.TryGetValue(StripSuffix(dtoType.Name, dtoSuffix), out var modelType))
{
// I've created forward and reverse mappings ... remove as necessary
cfg.CreateMap(dtoType, modelType);
cfg.CreateMap(modelType, dtoType);
}
}
});
var mapper = mapperConfiguration.CreateMapper();
StripSuffix is simply:
public static string StripSuffix(string someString, string someSuffix)
{
if (someString.EndsWith(someSuffix))
return someString.Substring(0, someString.Length - someSuffix.Length);
else
return someString;
}
You can then test it as applicable:
var model = new MyModel
{
Id = 1,
Name = "Foo",
Amount = 1.23m,
Date = DateTime.UtcNow
};
var dto = mapper.Map<MyDto>(model);
var backToModel = mapper.Map<MyModel>(dto);
I am learning how to use AutoMapper. First things first: I don't use Entity Framework to read my data.
Hence, in my case I have to map each property of my response model manually.
The code below may give more insight into this:
Response model:
public class TotalLossResults
{
public string N_CLAIM_NUMBER { get; set; }
public string N_CLAIM_ID { get; set; }
}
MapperClass:
public class TLResultsMapper : Profile
{
private TotalLossResults tlResultsObj = new TotalLossResults();
public TLResultsMapper()
{
IMappingExpression<DataRow, TotalLossResults> mappingExpression = CreateMap<DataRow, TotalLossResults>();
foreach (var prop in tlResultsObj.GetType().GetProperties())
{
mappingExpression.ForMember(prop.Name, y => y.MapFrom(s => s[prop.Name]));
}
}
}
Note: in the mapper class I used a foreach loop to avoid writing a mappingExpression.ForMember statement for each property. But this only works when the property name is the same as the column name (the entity name, for example) of the result I get from the database.
I am looking for an option that takes a similar approach to map the data values to my response model properties when the property names do not match the column names.
I tried doing something like this:
I created another class which has the properties with different names:
public class TLResultsDifferentNames
{
public string N_CLAIM_NUMBER { get; set; }
public string N_CLAIM_ID { get; set; }
}
and a mapper implementation like this:
private TLResultsDifferentNames tlResultsObj = new TLResultsDifferentNames();
private TotalLossResults tlResultsColObj = new TotalLossResults();
for (int i = 0, j = 0;
     i < tlResultsObj.GetType().GetProperties().Length - 1 && j < tlResultsColObj.GetType().GetProperties().Length - 1;
     i++, j++)
{
    mappingExpression.ForMember(tlResultsObj.GetType().GetProperties()[i].Name,
        y => y.MapFrom(s => s[tlResultsColObj.GetType().GetProperties()[j].Name]));
}
But this doesn't work: it binds the last column's values to all the model properties.
Any help or suggestion on how to achieve this mapping without mapping each property manually would be very helpful.
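As an aside, the symptom described (every property bound to the last column) is consistent with the loop variables i and j being captured by the MapFrom lambdas and only being evaluated after the loop has finished. A sketch of a fix, under that assumption, is to copy the names into locals before building each member mapping:
var destProps = tlResultsObj.GetType().GetProperties();
var srcProps = tlResultsColObj.GetType().GetProperties();

for (int i = 0; i < destProps.Length && i < srcProps.Length; i++)
{
    // Copy into locals so each lambda captures its own values,
    // not the shared loop variables.
    var destName = destProps[i].Name;
    var srcName = srcProps[i].Name;
    mappingExpression.ForMember(destName, y => y.MapFrom(s => s[srcName]));
}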
I found something really interesting in AutoMapper today: Attribute Mapping. Using it, I don't need to worry about any manual or dynamic mapping for my models.
Below is the code, which now works for all the properties.
Ex1: here all the property names are the same
[AutoMap(typeof(object))] // this takes our source class's type
public class TotalLossResults
{
public string N_CLAIM_NUMBER { get; set; }
public string N_CLAIM_ID { get; set; }
}
Ex2: here the properties have different names
[AutoMap(typeof(TotalLossResults))] // this takes our source class's type
public class TLResultsDifferentNames
{
[SourceMember(nameof(TotalLossResults.N_CLAIM_NUMBER))]
public string claimNumberOfJack { get; set; }
public string claimIDofJack { get; set; }
}
For the mapping configuration, we use the code below:
var config1 = new MapperConfiguration(cfg =>
cfg.AddMaps(typeof(TotalLossResults)));
var mapper = new Mapper(config1);
var response = mapper.Map<TotalLossResults>(sourceObject);
Note: it's better to create the configuration at application start.
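For example, if you are on ASP.NET Core with the AutoMapper.Extensions.Microsoft.DependencyInjection package (an assumption about your setup; TotalLossResultsReader below is just an illustrative consumer), the profile can be registered once at startup and injected wherever it is needed:
using System.Data;
using AutoMapper;
using Microsoft.Extensions.DependencyInjection;

// Startup.cs: register once at application start.
// AddAutoMapper scans the assembly containing TLResultsMapper for Profile classes.
public void ConfigureServices(IServiceCollection services)
{
    services.AddAutoMapper(typeof(TLResultsMapper));
}

// Any consumer then takes IMapper as a constructor dependency:
public class TotalLossResultsReader
{
    private readonly IMapper _mapper;

    public TotalLossResultsReader(IMapper mapper) => _mapper = mapper;

    public TotalLossResults Read(DataRow row) => _mapper.Map<TotalLossResults>(row);
}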
To elaborate on what I'm trying to achieve with ServiceStack.OrmLite: imagine that a franchise business has several branches, each branch has the system and a local database, and all of these databases replicate each other. In the system, every model has a property called store_id, as shown below.
public class UserEntity : EntityBase
{
[PrimaryKey, AutoIncrement]
public int id { get; set; }
public string user_id { get; set; }
public string name { get; set; }
public string email { get; set; }
public string password { get; set; }
public int role { get; set; }
}
public class EntityBase
{
public int store_id {get;set;}
public bool is_delete {get;set;}
}
We have 40+ entities and repos. Is there any way to have all ServiceStack.OrmLite read APIs filtered by store_id in one place instead of coding it repo by repo? I have an abstract RepoBase from which all repos are derived, and some repos need to read data across different store_ids.
Any help is much appreciated!
This question is still unclear about what answer it wants; it says you don't know which API to use to filter by store_id, yet your screenshot includes two different examples of filtering by store_id:
db.Where<T>(new { store_id = _store_id });
db.Where<T>("store_id", _store_id);
Both of these should work, although I'd recommend using the typed version when possible. You can also use nameof() instead of magic strings:
db.Where<T>(nameof(EntityBase.store_id), _store_id);
Maybe you're after different examples of doing the same thing inside a generic repo?
You can also query using a typed SqlExpression<T>:
var q = db.From<T>().Where(x => (x as EntityBase).store_id == _store_id);
var all = db.Select(q);
Or if you want to combine it with an additional typed expression:
var q = db.From<T>().Where(x => (x as EntityBase).store_id == _store_id);
var filtered = db.Select(q.And(expr));
Since you're already using generic constraints, you can also add a constraint that the entity must be an EntityBase as well, e.g.:
class RepoBase<T> where T : EntityBase, new() { ... }
That way you can query without casting, e.g.:
var q = db.From<T>().Where(x => x.store_id == _store_id);
var all = db.Select(q);
and
var q = db.From<T>().Where(x => x.store_id == _store_id);
var filtered = db.Select(q.And(expr));
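Putting those pieces together, one way to apply the store_id filter across all 40+ repos in a single place is a shared read method on the base class. This is only a sketch: the IDbConnection db and _store_id fields stand in for however your RepoBase actually obtains its connection and the branch's store id.
using System;
using System.Collections.Generic;
using System.Data;
using System.Linq.Expressions;
using ServiceStack.OrmLite;

public abstract class RepoBase<T> where T : EntityBase, new()
{
    protected readonly IDbConnection db;  // however your repos obtain their connection
    protected readonly int _store_id;     // the current branch's store id

    protected RepoBase(IDbConnection db, int storeId)
    {
        this.db = db;
        _store_id = storeId;
    }

    // Every derived repo gets store-scoped reads for free; the few repos that
    // need data across stores pass includeAllStores: true.
    public virtual List<T> GetAll(Expression<Func<T, bool>> filter = null, bool includeAllStores = false)
    {
        var q = db.From<T>();
        if (!includeAllStores)
            q = q.Where(x => x.store_id == _store_id);
        if (filter != null)
            q = q.And(filter);
        return db.Select(q);
    }
}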
I have two objects with a many-to-one relationship:
public class Product
{
public int ProductID { get; set; }
public string ProductName { get; set; }
public virtual Collection<ProductInventory> ProductInventorys { get; set; } = new Collection<ProductInventory>();
}
public class ProductInventory
{
public int ProductInventoryID { get; set; }
public string ProductInventoryName { get; set; }
public int ProductID { get; set; }
public virtual Product ProductFK { get; set; }
}
I would like to add a new Product with a collection of existing ProductInventory records (my API input would be an array of ProductInventoryIDs) to the database, so I do something like this:
private void AddProduct(int[] productInventoryIDs)
{
    Product newProduct = new Product();
    newProduct.ProductName = "New Product";

    // Here I have no clue which would be the correct way... should I use:

    // Approach A - fetch each related "ProductInventory" entity from the database,
    // then add it to the collection of my new base entity (Product)
    productInventoryIDs.ToList().ForEach(p =>
    {
        newProduct.ProductInventorys.Add(_dbContext.ProductInventory.Find(p));
    });
    _dbContext.Products.Add(newProduct);
    _dbContext.SaveChanges();

    // Approach B - save the base "Product" entity first, then grab the new ProductID,
    // then fetch each "ProductInventory" from the database, assign its foreign key
    // the new ProductID value, and save each one
    _dbContext.Products.Add(newProduct);
    _dbContext.SaveChanges();
    var newProductID = newProduct.ProductID; // the generated key is populated after SaveChanges
    productInventoryIDs.ToList().ForEach(pi =>
    {
        var existingProductInventoryFromDb = _dbContext.ProductInventory.Find(pi);
        existingProductInventoryFromDb.ProductID = newProductID;
        _dbContext.SaveChanges();
    });
}
With approach (A), my newProduct failed to save. I looked at the generated SQL, and it looks like EF is trying to insert the ProductInventory rows as well, even though they already exist in the database. I guess that's because I add them to my base entity's collection?
With approach (B), it feels a little awkward, as I'm fetching and saving multiple times for just one object, and I doubt it is the correct way...
Maybe I am wrong with both approaches, so what would be the correct way to deal with the above scenario?
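As a side note, and only as a sketch rather than a definitive answer: if the Find() calls load the related ProductInventory entities into the same change-tracking context, EF should treat them as existing rows, so approach (A) becomes a single SaveChanges that inserts the new Product and updates the foreign keys:
private void AddProduct(int[] productInventoryIDs)
{
    var newProduct = new Product { ProductName = "New Product" };

    foreach (var id in productInventoryIDs)
    {
        // Find returns a tracked entity (or the instance the context already tracks),
        // so EF should not try to INSERT it again on SaveChanges.
        var inventory = _dbContext.ProductInventory.Find(id);
        if (inventory != null)
            newProduct.ProductInventorys.Add(inventory);
    }

    _dbContext.Products.Add(newProduct); // only newProduct is marked as Added
    _dbContext.SaveChanges();            // one call: insert Product, update each FK
}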
I have the following entity model, which I use with Entity Framework:
public class User {
public int Id { get; set; }
public string Name { get; set; }
public string EMail { get; set; }
}
Now I'm trying to display the user on a view (MVVM in WPF, MVC in ASP.NET, ...), along with other information that isn't available in the database but can be fetched at runtime from a service.
For this, I created a derived model class:
public class UserDetail : User {
public bool IsOnline { get; set; }
}
And now some gibberish code that describes what I want to achieve:
var users = _myContext.Users
.ToList()
.Select(x => new UserDetail() {
IsOnline = _myUserService.IsOnline(x.Id)
} = (UserDetail)x); // downcast x (User) to the new UserDetail instance
return View["MyView", users];
Now, downcasting doesn't work that way in C#... do I have any other options to achieve what I want?
You can add a copy constructor to UserDetail:
public class UserDetail : User
{
public UserDetail(User x)
{
this.Id = x.Id;
this.Name = x.Name;
this.EMail = x.EMail;
}
public bool IsOnline { get; set; }
}
(That kind of constructor can be generated by T4 if you have many classes with this behaviour.)
Then change your LINQ and use that constructor:
var users = _myContext.Users
.ToList()
.Select(x => new UserDetail(x) {
IsOnline = _myUserService.IsOnline(x.Id)
});
return View["MyView", users];
No, you have to copy the properties one by one, or write some code which will do it for you.
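As a sketch of the "write some code which will do it for you" option (the names here are illustrative, not an existing API), a small reflection-based copier can populate the inherited properties:
using System.Linq;

public static class ObjectCopier
{
    // Copies every readable/writable property declared on TBase from source
    // into a new TDerived instance.
    public static TDerived CopyToDerived<TBase, TDerived>(TBase source)
        where TDerived : TBase, new()
    {
        var result = new TDerived();
        foreach (var prop in typeof(TBase).GetProperties().Where(p => p.CanRead && p.CanWrite))
        {
            prop.SetValue(result, prop.GetValue(source));
        }
        return result;
    }
}

// Usage:
// var detail = ObjectCopier.CopyToDerived<User, UserDetail>(user);
// detail.IsOnline = _myUserService.IsOnline(user.Id);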
Create a separate view model. You shouldn't expand your entity models to accommodate properties required by your view. Then you can either copy the properties one by one, as zahorak suggested, or use a library specifically made for this task, like AutoMapper.
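For the AutoMapper route, a minimal sketch (UserViewModel is a hypothetical view model that mirrors User's properties plus IsOnline, rather than inheriting from the entity):
public class UserViewModel
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string EMail { get; set; }
    public bool IsOnline { get; set; }
}

// Properties with matching names are mapped by convention.
var config = new MapperConfiguration(cfg => cfg.CreateMap<User, UserViewModel>());
var mapper = config.CreateMapper();

var users = _myContext.Users
    .ToList()
    .Select(x =>
    {
        var vm = mapper.Map<UserViewModel>(x);
        vm.IsOnline = _myUserService.IsOnline(x.Id);
        return vm;
    })
    .ToList();

return View["MyView", users];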