I'm creating a Web API program using Entity Framework. As the basis, I have a SQL Server database which I'm connected to with Entity Framework through my Web API program. Using an Entity Framework add-on, I've generated classes from my database tables. However, I don't want to use these classes for my web services, because I don't need to expose some of the attributes Entity Framework generates, and it gets a little tricky with all the proxy problems. These attributes are generated mainly because of the foreign keys. For the generated class below, I don't need to expose the "Societe" and "Utilisateur" objects:
public partial class FonctionnalitePerUser
{
public int FonctionUserLngId { get; set; }
public int FonctionUserLngUserId { get; set; }
public int FonctionUserLngSocieteId { get; set; }
public virtual Societe Societe { get; set; }
public virtual Utilisateur Utilisateur { get; set; }
}
I would need some advice on how to avoid exposing those entities in my web services.
I was thinking about 3 possibilities:
As it's a partial class, I could create another partial class with the same name where I put only the attributes I need, and override the constructor.
I could inherit a custom class from that one and override the constructor to get one structured the way I need.
I could create management classes with functions that build exactly the objects I need for my web services, i.e. functions that convert "FonctionnalitePerUser" objects to "FonctionnalitePerUserCustom" objects.
These are the three solutions I've found. For the best performance, I was wondering if anyone could give me some advice on them, or propose other solutions.
Thanks in advance
If you're using Newtonsoft Json.NET, which I think is the default in MVC 5, then you can attribute your properties to tell Newtonsoft what to serialize and what to ignore.
public class Car
{
// included in JSON
public string Model { get; set; }
public DateTime Year { get; set; }
public List<string> Features { get; set; }
// ignored
[JsonIgnore]
public DateTime LastModified { get; set; }
}
Or, if you have more properties you want to ignore than you want to serialize, you can do this:
[DataContract]
public class Computer
{
// included in JSON
[DataMember]
public string Name { get; set; }
[DataMember]
public decimal SalePrice { get; set; }
// ignored
public string Manufacture { get; set; }
public int StockCount { get; set; }
public decimal WholeSalePrice { get; set; }
public DateTime NextShipmentDate { get; set; }
}
This information was taken from here.
In general, it is often useful to expose a different type of object for a web service API than for persistence. This is for exactly the reason you state: because you don't need to expose all of that persistence stuff to the rest of the world (clients).
Usually, you would map the information that you want to expose from your persistence model (EF entities etc) to a view model object (or DTO).
So, I would say your option 3 is on the right track.
I might create Management classes with functions that create the perfect objects that I need for my webservices. I mean functions that convert "FonctionnalitePerUser" objects to "FonctionnalitePerUserCustom" objects.
There are several tools out there that help with converting or mapping the objects. One is AutoMapper, which maps by convention and can save a lot of mapping code.
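As a minimal sketch of that approach (the DTO type name here is hypothetical, not from the question), AutoMapper can map the generated entity to a trimmed-down view model by convention:

```csharp
using AutoMapper;

// Hypothetical DTO exposing only the scalar columns; the Societe and
// Utilisateur navigation properties simply have no counterpart here.
public class FonctionnalitePerUserDto
{
    public int FonctionUserLngId { get; set; }
    public int FonctionUserLngUserId { get; set; }
    public int FonctionUserLngSocieteId { get; set; }
}

public static class Mapping
{
    // Same-named properties are matched by convention; source members
    // with no destination counterpart (the navigation properties) are skipped.
    public static readonly IMapper Mapper = new MapperConfiguration(cfg =>
        cfg.CreateMap<FonctionnalitePerUser, FonctionnalitePerUserDto>())
        .CreateMapper();
}

// Usage: var dto = Mapping.Mapper.Map<FonctionnalitePerUserDto>(entity);
```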
Related
The Microsoft.Azure.Cosmos (v3.29.1.0) and Microsoft.Azure.Documents (v2.10.3.0) packages both have a class called UnixDateTimeConverter within the same namespace, Microsoft.Azure.Documents. My project needs to use both packages, and I cannot rename either. How can I specifically use the class from the Microsoft.Azure.Documents package? (Because the one in Microsoft.Azure.Documents has a public access modifier and can be called from anywhere.)
Microsoft.Azure.Cosmos package
Microsoft.Azure.Documents package
Update: the problem occurs in a CosmosModel/Student.cs file:
namespace DataAccess.CosmosModel{
public class Student{
public string id { get; set; }
public string Email { get; set; }
public bool IsEmailConfirmed { get; set; }
public string FirstName { get; set; }
public string LastName { get; set; }
public DateTime DateOfBirth { get; set; }
public string PhoneNumber { get; set; }
public string CountryCode { get; set; }
[Newtonsoft.Json.JsonConverter(typeof(Microsoft.Azure.Documents.UnixDateTimeConverter))]
[Newtonsoft.Json.JsonProperty(PropertyName = "_ts")]
public virtual DateTime Timestamp { get; set; }
}
}
The key is that they are not in the same namespace, so you can choose which one you would like to use in the using section of your code, for instance:
using UnixDateTimeConverter = Microsoft.Azure.Cosmos.UnixDateTimeConverter;
This instructs the compiler to use this implementation and not the older one, but it is typically only necessary when you are using both namespaces to begin with, so just as often you can simply remove one of the using statements.
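Note that if the two conflicting types really do live in the same namespace (as the question states: both are named Microsoft.Azure.Documents.UnixDateTimeConverter, just in different assemblies), a plain using alias cannot tell them apart. In that case C#'s extern alias feature is the tool for disambiguating by assembly; a sketch (assuming an SDK-style project, and assuming the v2 SDK comes from the Microsoft.Azure.DocumentDB.Core package):

```csharp
// In the .csproj, give one of the package references an assembly alias:
//
//   <PackageReference Include="Microsoft.Azure.DocumentDB.Core" Version="2.10.3">
//     <Aliases>DocumentsSdk</Aliases>
//   </PackageReference>
//
// Then, at the very top of the source file (before any using directives):
extern alias DocumentsSdk;

// Now the aliased assembly's type can be named unambiguously:
using UnixDateTimeConverter =
    DocumentsSdk::Microsoft.Azure.Documents.UnixDateTimeConverter;
```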
Update 1 (one more to follow):
When the issue is serializing to and from Azure Cosmos, you can choose to let the driver handle the conversion:
[BsonDateTimeOptions]
public DateTime UpdatedUTCZero { get; set; } = DateTime.UtcNow;
Details about serialization in the .NET driver can be found here:
http://mongodb.github.io/mongo-csharp-driver/2.2/apidocs/html/R_Project_CSharpDriverDocs.htm
So there is a huge amount of capability in the Mongo driver:
http://mongodb.github.io/mongo-csharp-driver/2.2/apidocs/html/N_MongoDB_Bson_Serialization.htm
Update 2:
To elaborate: I'm sure you can find a hook to completely replace serialization, but if you want to use the Mongo driver to, for instance, support 'schema' changes with ExtraElements and similar features I recommend looking into, you're left with the collection interface IMongoCollection; so if you're not using the BSON serializer, you'd have to store the payload as a string, which is not good for querying.
However, if you implement the serializing methods from the link above, you can absolutely use Newtonsoft for serializing; just remember to tag your id with [BsonId] or one will be generated for you.
Update 3: an example using MongoDB.Driver and MongoDB.Bson
I wouldn't normally interfere with the internal timestamp (a.k.a. _ts), though I don't see why you couldn't expose it; like the _id field, it relates to the MongoDB engine used to interface with Cosmos. I wouldn't put virtual on it, as that's a .NET thing, not a BSON/JSON thing.
using MongoDB.Bson;
using MongoDB.Driver;
using MongoDB.Bson.Serialization.Attributes;
// ...
public class Student
{
    [BsonId]
    public string Id { get; set; }

    [BsonDateTimeOptions]
    public DateTime LastUpdatedDateTime { get; set; }

    [BsonElement("ts")]
    public BsonTimestamp TimeStamp { get; set; }

    // ...
}
See section on timestamps: https://www.mongodb.com/docs/v4.4/reference/bson-types/
Also found a good article about dealing with mongo db engine related time issues: http://www.binaryintellect.net/articles/6c715186-97b1-427a-9ccc-deb3ece7b839.aspx this field has quite a few options.
The project I am working on is a Xamarin mobile application with a REST API and offline-first practices.
To achieve this I've set up a generic repository which is supposed to handle the communication with the SQLite database on the device.
Offline first is used only as a fallback mechanism to read data and does not need to handle 'complex scenarios'.
We're using AutoMapper to convert HTTP JSON objects to domain objects to database tables; the problem we are having is with storing class properties of the database DTO objects.
When mapping, the unload and load addresses are correctly converted, but SQLite does not seem able to serialize the address DTO object to a string.
See the address and address-blob properties of the order DTO below.
/// <summary>
/// DTO wrapper for storing order information to the database
/// </summary>
/// <seealso cref="CarrierConnectMobile.Core.Repositories.EntityBase" />
[Table("Orders")]
public class DTO_DBOrder : EntityBase
{
public string? LicensePlate { get; set; }
public string Quantity { get; set; }
public string ProductKind { get; set; }
public string ProductType { get; set; }
public string Product { get; set; }
public OrderStatus Status { get; set; }
public bool ProvidedUnloadData { get; set; }
public bool ProvidedLoadData { get; set; }
[TextBlob("UnLoadAddressBlob")]
public DTO_DBAddress UnloadAddress { get; set; }
[TextBlob("LoadAddressBlob")]
public DTO_DBAddress LoadAddress { get; set; }
public string UnLoadAddressBlob { get; set; }
public string LoadAddressBlob { get; set; }
}
The AutoMapper profiles for the database DTO objects use the same naming convention for UnloadAddress and LoadAddress, and they are correctly mapped with the following minimal setup:
public DBProfiles()
{
CreateMap<DTO_DBAddress, DTO_HttpAddress>().ReverseMap();
CreateMap<DTO_DBAddress, Address>().ReverseMap();
CreateMap<DTO_DBOrder, Order>()
.ReverseMap();
CreateMap<DTO_DBOrder, DTO_httpOrderListItem>()
.ForMember(d => d.Status, opt => opt.ConvertUsing(new OrderStatusStringToEnumConverter(), src => src.Status))
.ReverseMap();
}
Is anything configured incorrectly? I cannot figure out why SQLite is not storing the address object as a string blob and parsing it back correctly.
Better late than never; placing this here for someone who needs it in the future.
The problem was caused by using the regular SQLite library methods and not the SQLiteNetExtensions library that complements sqlite-net-pcl.
The using SQLiteNetExtensionsAsync.Extensions directive did not get suggested by the autocomplete I relied on, and I did not know that without the extensions, serialization of the child objects does not work.
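For reference, with the extensions in place, reads and writes have to go through the *WithChildren* methods so the [TextBlob] properties are (de)serialized; a sketch assuming sqlite-net plus the SQLiteNetExtensionsAsync package:

```csharp
using System.Threading.Tasks;
using SQLite;
using SQLiteNetExtensionsAsync.Extensions; // brings the *WithChildrenAsync methods into scope

public static class OrderStore
{
    // InsertWithChildrenAsync serializes the [TextBlob] address properties
    // into their string blob columns; a plain InsertAsync leaves them null.
    public static Task SaveAsync(SQLiteAsyncConnection db, DTO_DBOrder order) =>
        db.InsertWithChildrenAsync(order);

    // GetWithChildrenAsync reads the row and deserializes the blobs back
    // into the DTO_DBAddress objects.
    public static Task<DTO_DBOrder> LoadAsync(SQLiteAsyncConnection db, int id) =>
        db.GetWithChildrenAsync<DTO_DBOrder>(id);
}
```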
I am a beginner with DDD, and I am trying to model the following scenario elegantly in C#:
A template that basically has only a name property on it and a list of items that have to be executed in a specific order.
public class Template
{
public string Name { get; set; }
public List<Item> Items { get; set; }
}
public class Item
{
public string Name { get; set; }
public int Order { get; set; }
}
A type called Profile.
public class Profile
{
public string Name { get; set; }
}
The profile class is intended to say:
I am using template A to know what items I have and in what order
If template A changes, then I am using the new version because I don't want to keep a clone of the list template A had.
If I am deleted then the template is not affected in any way
If I am created then I require a template
I can be looked after by my name only
This suggests the aggregate root would be the template, which would have a list of Items and a list of Profiles. But I feel that searching by the name of the profile would require me to search all the templates that have a profile with the given name. Coming from a CRUD background, that seems a high price to pay. Also, the profile is the one that uses the template, and having the template know about the profiles that use it seems wrong.
How do you model this? What should be the aggregate root here? Is there more than one? How do you perform the search if you want to use it from the UI?
Don't. Do not start meta-modeling and over-abstracting when you need to learn DDD. It is a really bad idea: it will focus your attention on things that have nothing to do with learning DDD, it will distract you, and it will lead you to making bad decisions.
You need to start by solving concrete problems. Abstractions need to come from the concrete solutions. After you have implemented (at least three of) them, it is time to look at abstractions.
Neither Profile nor Template can be nested within the other aggregate; they need to exist as separate aggregates. It sounds as though the Profile needs to keep a reference to which Template it is using, so I'd include a reference to the template by its id (Template.Name).
public class Template
{
public string Name { get; set; }
public List<Item> Items { get; set; }
}
public class Item
{
public string Name { get; set; }
public int Order { get; set; }
}
public class Profile
{
public string Name { get; set; }
public string TemplateName { get; set; }
}
I am writing a set of data structures to ingest third-party JSON into (no writing out) using Json.NET.
I have a case where I need to read some of the top-level JSON elements into a member object of the object being deserialized.
My JSON:
{
    "Id": 1,
    "Checksum": 42,
    "Name": "adam",
    "Hair": true
}
My ideal object structure:
public class EntityHeader
{
    public int Id { get; set; }
    public int Checksum { get; set; }
}
public class Entity
{
[HeroicJsonAttribute( "Id", "Checksum" )]
public EntityHeader Header { get; set; }
public string Name { get; set; }
public bool Hair { get; set; }
}
Is there a simple way to achieve this? I will have a number of types that need this, and I'd hate to have to write a JsonConverter for each.
This question has been asked before, here, but the accepted answer doesn't address the question.
Thanks!
An alternative approach would be to use an EntityHeader field in the Entity class as a backing store for private properties which can be deserialized into:
public class EntityHeader
{
    public int Id { get; set; }
    public int Checksum { get; set; }
}
public class Entity
{
private EntityHeader m_Header = new EntityHeader();
public EntityHeader Header { get { return m_Header; } }
[JsonProperty]
private int Id { set { m_Header.Id = value; } }
[JsonProperty]
private int Checksum { set { m_Header.Checksum = value; } }
public string Name { get; set; }
public bool Hair { get; set; }
}
Thus, all the properties in the JSON can be read straight into the Entity object, but consumers of Entity objects have access to a "nicely encapsulated" EntityHeader property.
I haven't tested this, and it may even be kludgey, but it would technically work for me (OP). I am still interested in other answers!
Based on your example, you could either use the adapter pattern:
public class EntityJson
{
    public int Id { get; set; }
    public int Checksum { get; set; }
    public string Name { get; set; }
    public bool Hair { get; set; }
}
// quick/poor example
public class EntityAdapter : IEntity
{
public EntityAdapter(EntityJson model)
{
Header = new EntityHeader(); // and populate this object's fields
Name = model.Name; // populate other properties
}
public EntityHeader Header { get; set; }
public string Name { get; set; }
public bool Hair { get; set; }
}
Or take advantage of the fact that Json.NET ignores properties that are not present:
var entity = JsonConvert.DeserializeObject<Entity>(json);
var header = JsonConvert.DeserializeObject<EntityHeader>(json);
entity.Header = header;
I'm going to go ahead and post this answer which is a little bit too long for a comment, so please take this more as an extended comment than an actual attempt to answer your specific question. And of course, you know your requirements best so this is just my considered opinion :)
With that in mind, my advice is:
Don't do this.
I would instead create a simple DTO class that has a 1:1 relationship with the JSON being received, and I'd put all my validation attributes on the properties of that class.
Once I had deserialised the JSON into this simple DTO, I would then use a mapping layer of some kind (roll your own, or use AutoMapper, etc.) to map the DTO into a more meaningful structure such as your Entity class.
My reasoning is that unless your Entity class is itself only a simple DTO (in which case it should be as simple as possible and ideally not be a composite), you are mixing OOP concerns with data-mapping concerns; whilst this in and of itself is not such a bad thing, it only serves to increase the complexity of your code.
Consider, for example, if your incoming JSON ends up with 30 or 40 properties and you manage to figure out a way (maybe adapting some of the nice techniques from the other answers) to map it to the Entity class. When something goes wrong, it's going to be much easier to reason about, and therefore debug, a process which you have much more control over; it's also going to be much easier to make special adaptations for odd edge cases where the serialiser behaviour just can't help you out.
Granted, it's a bit of work to write and maintain these DTOs, but not that much - Webtools already does this for you.
Reference: At the boundaries, Applications are not Object-Oriented
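A minimal sketch of that DTO-plus-mapping approach (the EntityDto and EntityMapper names are hypothetical; Entity and EntityHeader mirror the question, with the header properties made public so they can be set):

```csharp
// Flat DTO matching the incoming JSON one-to-one; validation attributes
// would go on these properties.
public class EntityDto
{
    public int Id { get; set; }
    public int Checksum { get; set; }
    public string Name { get; set; }
    public bool Hair { get; set; }
}

public class EntityHeader
{
    public int Id { get; set; }
    public int Checksum { get; set; }
}

public class Entity
{
    public EntityHeader Header { get; set; }
    public string Name { get; set; }
    public bool Hair { get; set; }
}

public static class EntityMapper
{
    // Hand-rolled mapping layer; AutoMapper could do the same by convention.
    public static Entity ToEntity(EntityDto dto) => new Entity
    {
        Header = new EntityHeader { Id = dto.Id, Checksum = dto.Checksum },
        Name = dto.Name,
        Hair = dto.Hair
    };
}
```

Deserialise with JsonConvert.DeserializeObject&lt;EntityDto&gt;(json), then call EntityMapper.ToEntity(dto); the serializer never touches Entity at all.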
My client has 10 tables that it needs to load via an internal WCF service to a server. Since all of this is internal, I can write both client and server using whatever technique I want.
On the client, I thought to use LINQ to load data from the tables into a List<Table1>, List<Table2> and so on...
On the Server, I thought to have a [DataContract] as follow:
[DataContract]
[KnownType(typeof(Table1))]
[KnownType(typeof(Table2))]
[KnownType(typeof(Table3))]
public class GenericType<T>
{
[DataMember]
public List<T> Data { get; set; }
}
and then add the classes that represent the matching tables on the client.
[DataContract]
public class Table1
{
[DataMember]
public int UserID { get; set; }
[DataMember]
public string FullName { get; set; }
}
[DataContract]
public class Table2
{
[DataMember]
public int UserID { get; set; }
[DataMember]
public string Address1 { get; set; }
}
[DataContract]
public class Table3
{
[DataMember]
public string Name { get; set; }
[DataMember]
public string Description { get; set; }
}
When I create the client reference, I'm NOT getting all the classes declared on the server; it seems that ONLY the first [KnownType] specified on the [DataContract] becomes visible to the client.
I was under the impression that generics were meant to allow multiple types, but am I right to think that WCF can only handle one [KnownType] per class?
And if so, is my only option to copy and paste the GenericType class 10 times and, on each copy, change the [KnownType]?
Because if that's the only solution, then what are the real benefits of using generics instead of straight List<Table1>, List<Table2> for my parameters?
Any thoughts will help clarify my mind here.
The problem seems to happen because unless ONE of the WCF methods uses one of the classes declared as [DataContract], WCF does NOT bring those classes to the client.
Is this the expected case?
You could try attributing your interface method with the ServiceKnownType attribute for each of the classes.
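A sketch of that (the service and method names here are hypothetical): the attribute goes on the contract and tells WCF which closed generic types may actually travel across the wire:

```csharp
using System.ServiceModel;

[ServiceContract]
[ServiceKnownType(typeof(GenericType<Table1>))]
[ServiceKnownType(typeof(GenericType<Table2>))]
[ServiceKnownType(typeof(GenericType<Table3>))]
public interface ITableLoader
{
    // The parameter is declared as object, but the known types above
    // let the serializer handle any of the three closed GenericType<T>
    // instances at runtime.
    [OperationContract]
    void Load(object batch);
}
```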
There is another option, which is to implement the generic lists in classes that are attributed with CollectionDataContract:
[CollectionDataContract]
public class Table1Collection : List<Table1>
{
}
On the client side, you can then edit Reference.svcmap and enumerate each of the collections in the CollectionMappings section:
<CollectionMappings>
  <CollectionMapping TypeName="My.Namespace.Table1Collection" Category="List" />
</CollectionMappings>
This allows you to reuse the same code on both ends of the pipe.