Have I misunderstood the PetaPoco.IgnoreAttribute? - c#

I have a table containing service announcements. For this table I have a 1:1 POCO, except that it contains one extra field: in my query this field is the joined-in username of the author, while the table itself contains only the author id.
I thought that I could just tack an [Ignore] attribute onto this field and then be able to use the POCO for inserts/updates without problems? My problem is that with the [Ignore] attribute, the BrukerNavn field is not filled, and without the attribute it goes bang on insert/update.
[TableName("tblDriftsmelding")]
[PrimaryKey("DriftID")]
public class Driftsmelding
{
    public int DriftID { get; set; }
    [Column("tittel")] public string Tittel { get; set; }
    public string Tekst { get; set; }
    public string HTMLTekst { get; set; }
    [Column("gyldigfra")] public DateTime? Fra { get; set; }
    [Column("gyldigtil")] public DateTime? Til { get; set; }
    [Column("publisert")] public bool Publisert { get; set; }
    [Column("CreatedBy")] public int? BrukerID { get; set; }
    public string BrukerNavn { get; set; }
}
This is the POCO. The table is a 1:1 mapping, except the "BrukerNavn" field at the end.
select d.DriftID, d.Tekst, d.Created, d.gyldigtil, d.gyldigfra, d.publisert, d.tittel, d.HTMLTekst, d.createdby, b.brukerident as BrukerNavn
from tblDriftsmelding d
left outer join tblbruker b on d.CreatedBy = b.brukerid
order by DriftID desc
This is the query that feeds the POCO. (I have also tried using select d.*, b.brukerid. No difference)
(Note, the actual question is in bold in the above text, since it sort of got intermingled with the rest of the text)

I think what you need is the [ResultColumn] attribute - this will fill the column if your query contains data for it and it will not get used for inserts and updates.
You can see more on it here -> https://github.com/CollaboratingPlatypus/PetaPoco/wiki/Mapping-Pocos
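For illustration, a sketch of the POCO above with the extra field mapped as a result column (the rest of the mapping kept exactly as in the question):
[TableName("tblDriftsmelding")]
[PrimaryKey("DriftID")]
public class Driftsmelding
{
    public int DriftID { get; set; }
    // ... the other mapped columns stay as in the question ...

    [Column("CreatedBy")] public int? BrukerID { get; set; }

    // Populated from the "BrukerNavn" alias in the SELECT,
    // skipped on INSERT and UPDATE.
    [ResultColumn] public string BrukerNavn { get; set; }
}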

Related

Dapper is converting my dates into 01-Jan-01 and I cannot see why?

I'm using SQL Server 2012 and Dapper v1.50.5.
I have a standard query:
SELECT *
FROM [TsData].[ImportingLogs]
WHERE ImportContextGuid = '{3c19d706-0895-49e4-b96c-38eb6a3cc579}'
which returns some data:
and the CreationTime column I am interested in has valid datetimes.
The table is simply defined as:
The POCO is:
public class OzCsImportingLogsTableModel
{
    public DateTime CreationDateTime { get; set; }
    public int? CreatorUserId { get; set; }
    public int? DeleterUserId { get; set; }
    public DateTime DeletionTime { get; set; }
    public int DurationMs { get; set; }
    public int Id { get; set; }
    public Guid ImportContextGuid { get; set; }
    public bool IsDeleted { get; set; }
    public DateTime? LastModificationTime { get; set; }
    public int? LastModifierUserId { get; set; }
    public string Message { get; set; }
    public OzCsImportManagerMessageKindEnum MessageKindId { get; set; }
    public string Source { get; set; }
    public string StructuredData { get; set; }
    public string Tags { get; set; }
}
and the Dapper call is:
DbContext.Execute($"[{Schema}].[usp_ImportingQueue_FinaliseImport]", storedProcParams, aCommandType: CommandType.StoredProcedure)
However when I look at OzCsImportingLogsTableModel.CreationTime, the value is always 01-Jan-01 00:00:00 which indicates the value is NULL.
I don't understand this. Can someone point me in the right direction here please?
@John has it correct in his comment. The name of the property (here CreationDateTime) generally must match the name of the column (here CreationTime). That said, it is not really the name of the table column that matters, but the name of the result column, so you could do something like this:
SELECT CreationTime as CreationDateTime, ...
if you can modify the actual query.
As commented by @fstam: the behavior is that 01-Jan-01 is the default value of a DateTime; since the property is a non-nullable type and never set to anything by Dapper, that default is the value you see.
Note that the logic applied to find the member for a column name is available here:
// preference order is:
// exact match over underscore match, exact case over wrong case,
// backing fields over regular fields, match-inc-underscores over match-exc-underscores
In your case, however, none of the above applies; it is probably best to adjust the name of the property in code.
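If the property cannot be renamed and the query cannot be aliased, Dapper also supports custom type maps. Below is a minimal sketch, assuming the model from the question and that only the CreationTime column needs remapping; it should run once at startup:
using System.Reflection;
using Dapper;

// Map the CreationTime result column onto the CreationDateTime property;
// every other column falls back to a case-insensitive name lookup.
SqlMapper.SetTypeMap(
    typeof(OzCsImportingLogsTableModel),
    new CustomPropertyTypeMap(
        typeof(OzCsImportingLogsTableModel),
        (type, columnName) =>
            columnName == "CreationTime"
                ? type.GetProperty("CreationDateTime")
                : type.GetProperty(columnName,
                    BindingFlags.Public | BindingFlags.Instance | BindingFlags.IgnoreCase)));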
Update:
Another thing that I just saw: the DurationMs column in your schema is nullable, but the DurationMs property is not. You might want to define
this property as public int? DurationMs { get; set; } instead.
(I haven't checked all members).

Entity Framework Joining Three Tables and Using Group Concat

I have three models like the following:
public class Team
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class Document
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Type { get; set; }
    public string Application { get; set; }
    public ICollection<DocumentResponsible> DocumentResponsibles { get; set; }
    public string Pcda { get; set; }
    public DateTime CreatedAt { get; set; }
    public DateTime UpdatedAt { get; set; }
}

public class DocumentResponsible
{
    public int Id { get; set; }
    public int DocumentId { get; set; }
    public int TeamId { get; set; }
}
I want to write an Entity Framework expression that joins the three tables and selects all the fields of the Document table plus the team names, with one row per document. So basically I want to join the three tables and use GROUP_CONCAT for the team names. Then I want to bind the result to a GridView in Web Forms.
What I have tried:
(from dc in DBContext.Document
 join dr in DBContext.DocumentResponsible on dc.Id equals dr.DocumentId
 join t in DBContext.Team on dr.TeamId equals t.Id
 select new
 {
     Name = dc.Name,
     Type = dc.Type,
     Application = dc.Application,
     Pcda = dc.Pcda,
 }).ToList();
and I have also just tried this:
var data = DBContext.Document.Include("DocumentResponsibles").ToList();
It's hard to help without your DbContext and the Entity Mappings, but I'll go out on a limb saying you might just want to mark Document.DocumentResponsibles as virtual.
Also, in DocumentResponsible, maybe you'd want to add a property for Document and one for Team (both marked as virtual too). That way you don't have to spell out the join keys every time you want to work with your data; EF would do it for you once it is properly configured (see the sketch below).
If it doesn't work, can you add the following information to your question: first, the context class and the mappings you have; second, if you do var firstDocument = yoneylemDB.Document.First(), what does firstDocument look like? Does it have all its fields and properties filled out? Is there anything weird?
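Building on that, here is a rough sketch of the GROUP_CONCAT-style projection the question asks for. It assumes DocumentResponsible has been given a virtual Team navigation property (and optionally a Document one) as suggested above; since LINQ-to-Entities cannot translate string.Join, the team names are collected per document in SQL and concatenated in memory:
var rows = DBContext.Document
    .Select(d => new
    {
        d.Name,
        d.Type,
        d.Application,
        d.Pcda,
        // Team names via the (assumed) dr.Team navigation
        TeamNames = d.DocumentResponsibles.Select(dr => dr.Team.Name)
    })
    .AsEnumerable() // switch to LINQ to Objects for string.Join
    .Select(x => new
    {
        x.Name,
        x.Type,
        x.Application,
        x.Pcda,
        Teams = string.Join(", ", x.TeamNames)
    })
    .ToList();
// 'rows' can then be bound to the GridView as its data source.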

Update Entity from ViewModel in MVC using AutoMapper

I have a Supplier.cs Entity and its ViewModel SupplierVm.cs. I am attempting to update an existing Supplier, but I am getting the Yellow Screen of Death (YSOD) with the error message:
The operation failed: The relationship could not be changed because one or more of the foreign-key properties is non-nullable. When a change is made to a relationship, the related foreign-key property is set to a null value. If the foreign-key does not support null values, a new relationship must be defined, the foreign-key property must be assigned another non-null value, or the unrelated object must be deleted.
I think I know why it is happening, but I'm not sure how to fix it. Here's a screencast of what is happening. I think the reason I'm getting the error is because that relationship is lost when AutoMapper does its thing.
CODE
Here are the Entities that I think are relevant:
public abstract class Business : IEntity
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string TaxNumber { get; set; }
    public string Description { get; set; }
    public string Phone { get; set; }
    public string Website { get; set; }
    public string Email { get; set; }
    public bool IsDeleted { get; set; }
    public DateTime CreatedOn { get; set; }
    public DateTime? ModifiedOn { get; set; }
    public virtual ICollection<Address> Addresses { get; set; } = new List<Address>();
    public virtual ICollection<Contact> Contacts { get; set; } = new List<Contact>();
}

public class Supplier : Business
{
    public virtual ICollection<PurchaseOrder> PurchaseOrders { get; set; }
}

public class Address : IEntity
{
    public Address()
    {
        CreatedOn = DateTime.UtcNow;
    }

    public int Id { get; set; }
    public string AddressLine1 { get; set; }
    public string AddressLine2 { get; set; }
    public string Area { get; set; }
    public string City { get; set; }
    public string County { get; set; }
    public string PostCode { get; set; }
    public string Country { get; set; }
    public bool IsDeleted { get; set; }
    public DateTime CreatedOn { get; set; }
    public DateTime? ModifiedOn { get; set; }
    public int BusinessId { get; set; }
    public virtual Business Business { get; set; }
}

public class Contact : IEntity
{
    public Contact()
    {
        CreatedOn = DateTime.UtcNow;
    }

    public int Id { get; set; }
    public string Title { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string Phone { get; set; }
    public string Email { get; set; }
    public string Department { get; set; }
    public bool IsDeleted { get; set; }
    public DateTime CreatedOn { get; set; }
    public DateTime? ModifiedOn { get; set; }
    public int BusinessId { get; set; }
    public virtual Business Business { get; set; }
}
And here is my ViewModel:
public class SupplierVm
{
    public SupplierVm()
    {
        Addresses = new List<AddressVm>();
        Contacts = new List<ContactVm>();
        PurchaseOrders = new List<PurchaseOrderVm>();
    }

    public int Id { get; set; }

    [Required]
    [Display(Name = "Company Name")]
    public string Name { get; set; }

    [Display(Name = "Tax Number")]
    public string TaxNumber { get; set; }

    public string Description { get; set; }
    public string Phone { get; set; }
    public string Website { get; set; }
    public string Email { get; set; }

    [Display(Name = "Status")]
    public bool IsDeleted { get; set; }

    public IList<AddressVm> Addresses { get; set; }
    public IList<ContactVm> Contacts { get; set; }
    public IList<PurchaseOrderVm> PurchaseOrders { get; set; }

    public string ButtonText => Id != 0 ? "Update Supplier" : "Add Supplier";
}
My AutoMapper mapping configuration is like this:
cfg.CreateMap<Supplier, SupplierVm>();
cfg.CreateMap<SupplierVm, Supplier>()
    .ForMember(d => d.Addresses, o => o.UseDestinationValue())
    .ForMember(d => d.Contacts, o => o.UseDestinationValue());
cfg.CreateMap<Contact, ContactVm>();
cfg.CreateMap<ContactVm, Contact>()
    .Ignore(c => c.Business)
    .Ignore(c => c.CreatedOn);
cfg.CreateMap<Address, AddressVm>();
cfg.CreateMap<AddressVm, Address>()
    .Ignore(a => a.Business)
    .Ignore(a => a.CreatedOn);
Finally, here's my SupplierController Edit Method:
[HttpPost]
public ActionResult Edit(SupplierVm supplier)
{
    if (!ModelState.IsValid)
        return View(supplier);

    _supplierService.UpdateSupplier(supplier);
    return RedirectToAction("Index");
}
And here's the UpdateSupplier Method on the SupplierService.cs:
public void UpdateSupplier(SupplierVm supplier)
{
    var updatedSupplier = _supplierRepository.Find(supplier.Id);
    Mapper.Map(supplier, updatedSupplier); // I lose navigational property here
    _supplierRepository.Update(updatedSupplier);
    _supplierRepository.Save();
}
I've done a load of reading and according to this blog post, what I have should work! I've also read stuff like this but I thought I'd check with readers before ditching AutoMapper for Updating Entities.
The cause
The line ...
Mapper.Map(supplier, updatedSupplier);
... does a lot more than meets the eye.
During the mapping operation, updatedSupplier loads its collections (Addresses, etc) lazily because AutoMapper (AM) accesses them. You can verify this by monitoring SQL statements.
AM replaces these loaded collections by the collections it maps from the view model. This happens despite the UseDestinationValue setting. (Personally, I think this setting is incomprehensible.)
This replacement has some unexpected consequences:
It leaves the original items in the collections attached to the context, but no longer in scope of the method you're in. The items are still in the Local collections (like context.Addresses.Local) but now deprived of their parent, because EF has executed relationship fixup. Their state is Modified.
It attaches the items from the view model to the context in an Added state. After all, they're new to the context. If at this point you'd expect 1 Address in context.Addresses.Local, you'd see 2. But you only see the added items in the debugger.
It's these parent-less Modified items that cause the exception. And if they didn't, the next surprise would have been that you add new items to the database when you only expected updates.
OK, now what?
So how do you fix this?
A. I tried to replay your scenario as closely as possible. For me, one possible fix consisted of two modifications:
Disable lazy loading. I don't know how you would arrange this with your repositories, but somewhere there should be a line like
context.Configuration.LazyLoadingEnabled = false;
Doing this, you'll only have the Added items, not the hidden Modified items.
Mark the Added items as Modified. Again, "somewhere", put lines like
foreach (var addr in updatedSupplier.Addresses)
{
    context.Entry(addr).State = System.Data.Entity.EntityState.Modified;
}
... and so on.
B. Another option is to map the view model to new entity objects ...
var updatedSupplier = Mapper.Map<Supplier>(supplier);
... and mark it, and all of its children, as Modified. This is quite "expensive" in terms of updates though, see the next point.
C. A better fix in my opinion is to take AM out of the equation completely and paint the state manually. I'm always wary of using AM for complex mapping scenarios. First, because the mapping itself is defined a long way away from the code where it's used, making the code difficult to inspect. But mainly because it brings its own ways of doing things, and it's not always clear how it interacts with other delicate operations, like change tracking.
Painting the state is a painstaking procedure. The basis could be a statement like ...
context.Entry(updatedSupplier).CurrentValues.SetValues(supplier);
... which copies supplier's scalar properties to updatedSupplier if their names match. Or you could use AM (after all) to map individual view models to their entity counterparts, but ignoring the navigation properties.
Option C gives you fine-grained control over what gets updated, as you originally intended, instead of the sweeping update of option B. When in doubt, this may help you decide which option to use.
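For completeness, here is a minimal sketch of option C. It assumes the underlying DbContext is reachable (the _context field, the Suppliers set and the Include calls below are assumptions, not the question's repository API) and that every child in the view model corresponds to an existing row; newly added or deleted children would need extra handling:
// using System.Data.Entity;  // for the lambda Include overload
public void UpdateSupplier(SupplierVm supplier)
{
    var updatedSupplier = _context.Suppliers
        .Include(s => s.Addresses)
        .Include(s => s.Contacts)
        .Single(s => s.Id == supplier.Id);

    // Copy the scalar properties whose names match (Name, Phone, Email, ...).
    _context.Entry(updatedSupplier).CurrentValues.SetValues(supplier);

    // "Paint the state" of each child by copying its scalars onto the tracked
    // entity with the same Id; EF will then detect what actually changed.
    foreach (var addressVm in supplier.Addresses)
    {
        var address = updatedSupplier.Addresses.Single(a => a.Id == addressVm.Id);
        _context.Entry(address).CurrentValues.SetValues(addressVm);
    }
    foreach (var contactVm in supplier.Contacts)
    {
        var contact = updatedSupplier.Contacts.Single(c => c.Id == contactVm.Id);
        _context.Entry(contact).CurrentValues.SetValues(contactVm);
    }

    _context.SaveChanges();
}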
I searched all the Stack Overflow answers and Google results. Finally I just added the line db.Configuration.LazyLoadingEnabled = false; and it worked perfectly for me.
var message = JsonConvert.DeserializeObject<UserMessage>(@"{.....}");
using (var db = new OracleDbContex())
{
    db.Configuration.LazyLoadingEnabled = false;
    var msguser = Mapper.Map<BAPUSER>(message);
    var dbuser = db.BAPUSER.FirstOrDefault(w => w.BAPUSERID == 1111);
    Mapper.Map(msguser, dbuser);
    // db.Entry(userx).State = EntityState.Modified;
    db.SaveChanges();
}
I've gotten this issue many times, and it's normally this:
The FK Id on the parent reference doesn't match the PK on that FK entity. E.g. if you have an Order table and an OrderStatus table, and you load both into entities, Order has OrderStatusId = 1 and OrderStatus.Id = 1. If you change OrderStatusId to 2 but do not update OrderStatus.Id to 2, then you'll get this error. To fix it, you either need to load the OrderStatus with Id 2 and update the reference entity, or just set the OrderStatus reference entity on Order to null before saving.
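For illustration, a tiny sketch of that last fix, using the hypothetical Order/OrderStatus names from this answer:
// The FK now points at status 2, but the loaded navigation still holds status 1.
order.OrderStatusId = 2;

// Either clear the stale navigation and let EF save by FK alone...
order.OrderStatus = null;

// ...or load the matching entity and assign it instead:
// order.OrderStatus = context.OrderStatuses.Find(2);

context.SaveChanges();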
I am not sure if this is going to fit your requirements, but I would suggest the following.
From your code it certainly looks like you are losing the relationship somewhere during mapping.
To me it looks like the UpdateSupplier operation does not actually update any of the supplier's child details.
If that is the case, I would suggest updating only the changed properties from the SupplierVm on the domain Supplier class. You can write a separate method that assigns property values from SupplierVm to the Supplier object (this should change only non-child properties such as Name, Description, Website, Phone, etc.), as sketched below.
Then perform the DB update. This will save you from a possible mess-up of the tracked entities.
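A minimal sketch of such a method, using the property names from the question's classes:
private static void CopyScalarProperties(SupplierVm source, Supplier target)
{
    // Copy only the non-child properties; Addresses, Contacts and
    // PurchaseOrders are deliberately left untouched.
    target.Name = source.Name;
    target.TaxNumber = source.TaxNumber;
    target.Description = source.Description;
    target.Phone = source.Phone;
    target.Website = source.Website;
    target.Email = source.Email;
    target.IsDeleted = source.IsDeleted;
    target.ModifiedOn = DateTime.UtcNow; // assumption: stamp the modification time here
}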
If you are changing the child entities of the supplier, I would suggest updating them independently of the supplier, because retrieving an entire object graph from the database requires a lot of queries, and updating it also executes unnecessary update queries against the database.
Updating entities independently saves a lot of DB operations and improves the performance of the application.
You can still retrieve the entire object graph if you have to display all the details about the supplier on one screen; for updates, though, I would not recommend updating the entire object graph.
I hope this helps resolve your issue.

Servicestack OrmLite Ignore insert update - POCO

Is there any attribute to set a POCO field just for SELECT?
Something like the below:
public class Poco
{
    public string Id { get; set; }
    public string Name { get; set; }

    [IgnoreUpdate]
    public DateTime CreatedOn { get; set; }

    [IgnoreInsert]
    public DateTime UpdateOn { get; set; }
}
OrmLite has [Ignore] to ignore the property completely, [IgnoreOnInsert] to ignore the property during INSERT's and [IgnoreOnUpdate] to ignore the property during updates.
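Applied to the POCO from the question, that would look roughly like this (property names kept as posted):
public class Poco
{
    public string Id { get; set; }
    public string Name { get; set; }

    [IgnoreOnUpdate]   // written on INSERT, left untouched on UPDATE
    public DateTime CreatedOn { get; set; }

    [IgnoreOnInsert]   // skipped on INSERT, written on UPDATE
    public DateTime UpdateOn { get; set; }
}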
An alternative solution is to use a different Model for SELECT's where you can use the [Alias] attribute to map it back to the original tablename, e.g.
[Alias("Poco")]
public class PocoDetails
{
    public string Id { get; set; }
    public string Name { get; set; }
    public DateTime CreatedOn { get; set; }
    public DateTime UpdateOn { get; set; }
}
Just a small update: in the current version of ServiceStack (v5) it is possible to mark a property with the attributes [IgnoreOnInsert], [IgnoreOnUpdate] or [IgnoreOnSelect].

ADO.NET EF Code First - adding dummy data when we have two tables with 1:N relation

I have these two tables in my database:
Material:
public class Material
{
    [Required]
    [MaxLength(10)]
    public string Code { get; set; }

    [MaxLength(40)]
    public string Color { get; set; }

    [MaxLength(40)]
    public string Description { get; set; }

    [MaxLength(255)]
    public string Picture { get; set; }

    public long MaterialTypeId { get; set; }
    public virtual MaterialType MaterialType { get; set; }
}
and MaterialType :
public class MaterialType
{
    [MaxLength(40)]
    public string MatType { get; set; }

    public virtual ICollection<Material> Materials { get; set; }
}
Then I call a method that populates all my tables with dummy data, but the problem is that the foreign key cannot be null and the data it should reference obviously hasn't been generated yet. I have tried changing the order in which the methods that create the dummy data are called, but that does not seem to work. Is there an easy way to work around this problem, or perhaps something I don't know about for managing it?
In my case, where the dummy data is created when the DB is created, the easiest way to work around the non-nullable foreign key was to explicitly make the property's type nullable, i.e. long? in my code. This allows me to populate the table with a null FK first and then set the values I need afterwards.
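A sketch of that workaround applied to the Material class from the question; only the FK property changes:
public class Material
{
    [Required]
    [MaxLength(10)]
    public string Code { get; set; }

    // ... Color, Description and Picture unchanged ...

    // Nullable FK: rows can be seeded before any MaterialType exists,
    // and MaterialTypeId can be filled in afterwards.
    public long? MaterialTypeId { get; set; }
    public virtual MaterialType MaterialType { get; set; }
}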
