Having a response with a complex property, I want to map it to my ResponseDTO properly. For all basic types it works flawlessly.
The ResponseDTO looks like this:
public class ResponseDto
{
    public string Id { get; set; }

    public struct Refs
    {
        public Genre GenreDto { get; set; }
        public Location LocationDto { get; set; }
    }

    public Refs References { get; set; }
}
Genre and Location are, for now, both simple classes with simple properties (int/string):
public class GenreDto
{
    public string Id { get; set; }
    public string Name { get; set; }
}
Question:
Is there any way, without changing/replacing the generic deserializer (in this example, JSON), to map such complex properties?
One specific difference from the GithubResponse example is that I can't use a dictionary of one type, since I have different types under references. That's why I use a struct, but that doesn't seem to work. Maybe only IEnumerables are allowed?
Update
There is a way, using lambda expressions, to parse the JSON manually (github.com/ServiceStack/ServiceStack.Text/blob/master/tests/ServiceStack.Text.Tests/UseCases/CentroidTests.cs#L136), but I would really like to avoid this, since the ResponseDTO becomes kind of useless that way - when writing this kind of manual mapping I would no longer use AutoMapper to map from ResponseDto to DomainModel - and I like that abstraction and "separation".
Thanks
I used lambda expressions to solve this issue; a more complex example would be:
static public Func<JsonObject, Cart> fromJson = cart => new Cart(new CartDto {
    Id = cart.Get<string>("id"),
    SelectedDeliveryId = cart.Get<string>("selectedDeliveryId"),
    SelectedPaymentId = cart.Get<string>("selectedPaymentId"),
    Amount = cart.Get<float>("amount"),
    AddressBilling = cart.Object("references").ArrayObjects("address_billing").FirstOrDefault().ConvertTo(AddressDto.fromJson),
    AddressDelivery = cart.Object("references").ArrayObjects("address_delivery").FirstOrDefault().ConvertTo(AddressDto.fromJson),
    AvailableShippingTypes = cart.Object("references").ArrayObjects("delivery").ConvertAll(ShippingTypeDto.fromJson),
    AvailablePaymentTypes = cart.Object("references").ArrayObjects("payment").ConvertAll(PaymentOptionDto.fromJson),
    Tickets = cart.Object("references").ArrayObjects("ticket").ConvertAll(TicketDto.fromJson)
});
This lambda expression is used to parse the JsonObject response of the request and map everything inside, even nested resources. This works out very well and is flexible.
Some time ago I stumbled upon a similar problem. Actually, ServiceStack works well with complex properties. The problem in my scenario was that I was fetching data from a database and passing the objects returned by the DB provider directly to ServiceStack. The solution was to either create DTOs out of the models returned by the DB provider or invoke .ToList() on those same models.
I'm just sharing some experience with ServiceStack, but maybe you can specify what's not working for you. Is an exception thrown, or is it something else?
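For illustration, a minimal sketch of the two workarounds described above; the GetGenres request DTO, the Db field, and the Genre table shape are assumptions (typical ServiceStack/OrmLite names), not from the original answer:

public object Get(GetGenres request)
{
    // 1) Materialize the DB provider's lazy result before ServiceStack serializes it:
    var genres = Db.Select<Genre>().ToList();

    // 2) ...or map the DB models into plain DTOs and return those instead
    //    (GenreDto.Id is a string in the question, hence the ToString()):
    return genres.ConvertAll(g => new GenreDto { Id = g.Id.ToString(), Name = g.Name });
}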
Related
Say I have a database in which I am storing user details of this structure:
public class User
{
public string UserId { get; set; }
public string Name { get; set; }
public string Email { get; set; }
public string PasswordHash { get; set; }
}
I have a data access layer that works with this class, containing methods such as GetById() that return a User object.
But then say I have an API which needs to return a users details, but not sensitive parts such as the PasswordHash. I can get the User from the database but then I need to strip out certain fields. What is the "correct" way to do this?
I've thought of a few ways to deal with this, most of which involve splitting the User class into a base class with the non-sensitive data and a derived class that contains the properties I would want kept secret, and then converting or mapping the object to the base class before returning it; however, this feels clunky and dirty.
It feels like this should be a relatively common scenario, so am I missing an easy way to handle it? I'm working with ASP.Net core and MongoDB specifically, but I guess this is more of a general question.
It seems for my purposes the neatest solution is something like this:
Split the User class into a base class and derived class, and add a constructor to copy the required fields:
public class User
{
public User() { }
public User(UserDetails user)
{
this.UserId = user.UserId;
this.Name = user.Name;
this.Email = user.Email;
}
public string UserId { get; set; }
public string Name { get; set; }
public string Email { get; set; }
}
public class UserDetails : User
{
public string PasswordHash { get; set; }
}
The data access class would return a UserDetails object which could then be converted before returning:
UserDetails userDetails = _dataAccess.GetUser();
User userToReturn = new User(userDetails);
This could also be done using AutoMapper, as Daniel suggested, instead of the constructor approach. I don't love doing this (hence why I asked the question), but it seems to be the neatest solution and requires the least duplication.
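If you go the AutoMapper route instead, a minimal sketch might look like this (the configuration is shown inline for brevity; where you register it is up to your app):

// Map only the non-sensitive members; PasswordHash exists only on UserDetails,
// so it is simply never copied onto User.
var config = new MapperConfiguration(cfg => cfg.CreateMap<UserDetails, User>());
var mapper = config.CreateMapper();

UserDetails userDetails = _dataAccess.GetUser();
User userToReturn = mapper.Map<User>(userDetails);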
There are two ways to do this:
Use the same class and only populate the properties that you want to send. The problem with this is that value types will have the default value (int properties will be sent as 0, when that may not be accurate).
Use a different class for the data you want to send to the client. This is basically what Daniel is getting at in the comments - you have a different model that is "viewed" by the client.
The second option is most common. If you're using Linq, you can map the values with Select():
users.Select(u => new UserModel { Name = u.Name, Email = u.Email });
A base type will not work the way you hope. If you cast a derived type to its parent type and serialize it, it still serializes the properties of the derived type.
Take this for example:
public class UserBase {
public string Name { get; set; }
public string Email { get; set; }
}
public class User : UserBase {
public string UserId { get; set; }
public string PasswordHash { get; set; }
}
var user = new User() {
UserId = "Secret",
PasswordHash = "Secret",
Name = "Me",
Email = "something"
};
var serialized = JsonConvert.SerializeObject((UserBase) user);
Notice the cast while serializing. Even so, the result is:
{
"UserId": "Secret",
"PasswordHash": "Secret",
"Name": "Me",
"Email": "something"
}
It still serialized the properties from the User type even though it was cast to UserBase.
If you want to ignore the property, just add the ignore annotation to your model like this; it will skip the property when the model is serialized:
[JsonIgnore]
public string PasswordHash { get; set; }
If you want to ignore it at runtime (i.e. dynamically), there is a built-in mechanism available in Newtonsoft.Json:
public class User
{
    public string UserId { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
    public string PasswordHash { get; set; }

    // FYI: the convention is ShouldSerialize_PROPERTY_NAME_HERE()
    public bool ShouldSerializePasswordHash()
    {
        // return true only when the hash should actually be serialized
        return PasswordHash != null;
    }
}
This is called "conditional property serialization" and the documentation can be found here. Hope this helps.
The problem is that you're viewing this wrong. An API, even if it's working directly with a particular database entity, is not dealing with entities. There's a separation of concerns issue at play here. Your API is dealing with a representation of your user entity. The entity class itself is a function of your database. It has stuff on it that only matters to the database, and importantly, stuff on it that does not matter to your API. Trying to have one class that can satisfy multiple different applications is folly, and will only lead to brittle code with nested dependencies.
More to the point, how are you going to interact with this API? Namely, if your API exposes your User entity directly, then any code that consumes this API either must take a dependency on your data layer so it can access User or it must implement its own class representing a User and hope that it matches up with what the API actually wants.
Now imagine the alternative. You create a "common" class library that will be shared between your API and any client. In that library, you define something like UserResource. Your API binds to/from UserResource only, and maps that back and forth to User. Now, you have completely segregated your data layer. Clients only know about UserResource and the only thing that touches your data layer is your API. And, of course, now you can limit what information on User is exposed to clients of your API, simply by how you build UserResource. Better still, if your application's needs change, User can change without spiraling out into an API conflict for each consuming client. You simply fix up your API, and clients go on unaware. If you do need to make a breaking change, you can do something like create a UserResource2 class, along with a new version of your API. You cannot create a User2 without causing a whole new table to be created, which would then spiral out into conflicts in Identity.
Long and short, the right way to go with APIs is to always use a separate DTO class, or even multiple DTO classes. An API should never consume an entity class directly, or you're in for nothing but pain down the line.
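As a rough sketch of that shape (UserResource and the mapping method are illustrative names, not something prescribed by any framework):

// Lives in the shared/common class library, visible to both API and clients.
public class UserResource
{
    public string UserId { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
}

// In the API layer only: translate the data-layer entity into the resource.
public static class UserMappings
{
    public static UserResource ToResource(User user) => new UserResource
    {
        UserId = user.UserId,
        Name = user.Name,
        Email = user.Email
    };
}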
I have a class in C# that has a number of variables. Let's call it "QuestionItem".
I have a list of this object, which the user modifies, and then sends it via JSON serialization (with Newtonsoft JSON library) to the server.
To do so, I deserialize the objects that are already in the server, as a List<QuestionItem>, then add this new modified object to the list, and then serialize it back to the server.
In order to display this list of QuestionItems to the user, I deserialize the JSON as my object, and display it somewhere.
Now, the problem is - that I want to change this QuestionItem and add some variables to it.
But I can't send this NewQuestionItem to the server, because the items in the server are of type OldQuestionItem.
How do I merge these two types, or convert the old type to the new one, while the users with the old version will still be able to use the app?
You are using an object-oriented language, so you might as well use inheritance if possible.
Assuming your old QuestionItem to be:
[JsonObject(MemberSerialization.OptOut)]
public class QuestionItem
{
[JsonConstructor]
public QuestionItem(int Id, int Variant)
{
this.Id = Id;
this.Variant = Variant;
}
public int Id { get; }
public int Variant { get; }
public string Name { get; set; }
}
you can extend it by creating a child class:
[JsonObject(MemberSerialization.OptOut)]
public class NewQuestionItem : QuestionItem
{
private DateTime _firstAccess;
[JsonConstructor]
public NewQuestionItem(int Id, int Variant, DateTime FirstAccess) : base(Id, Variant)
{
this.FirstAccess = FirstAccess;
}
public DateTime FirstAccess { get; }
}
Note that using anything other than the default constructor for a class requires you to use the [JsonConstructor] attribute on that constructor, and every argument of said constructor must be named exactly like the corresponding JSON property. Otherwise you will get an exception, because there is no default constructor available.
Your Web API will now send serialized NewQuestionItems, which can be deserialized to QuestionItems. In fact, by default JSON.NET, like most JSON libraries, will deserialize into any object as long as there is at least one property in common. Just make sure that any member of the object you want to serialize/deserialize can actually be serialized.
You can test the example above with the following three lines of code:
var newQuestionItem = new NewQuestionItem(1337, 42, DateTime.Now) {Name = "Hello World!"};
var jsonString = JsonConvert.SerializeObject(newQuestionItem);
var oldQuestionItem = JsonConvert.DeserializeObject<QuestionItem>(jsonString);
and simply looking at the property values of the oldQuestionItem in the debugger.
So, this is possible as long as your NewQuestionItem only adds properties to the object and neither removes nor modifies existing ones.
If that is not the case, then your objects are genuinely different and thus require completely different resources with different URIs in your API, as long as you still need to maintain the old instance at the existing URI.
Which brings us to the general architecture:
The cleanest and most streamlined approach to what you are trying to achieve is to properly version your API.
For the purpose of this answer I am assuming ASP.NET Web API, since you are handling the JSON in C#/.NET. This allows different controller methods to be called for different versions and thus lets you make structural changes to the resources your API provides, depending on the version of the implementation. Other API frameworks provide equal or at least similar features, or this can be implemented manually.
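A minimal sketch of what that could look like with ASP.NET Web API 2 attribute routing (the controller and route names are assumptions for illustration):

[RoutePrefix("api/v1/questions")]
public class QuestionsV1Controller : ApiController
{
    // Old clients keep getting the original shape.
    [Route("")]
    public IEnumerable<QuestionItem> Get() => new List<QuestionItem>();
}

[RoutePrefix("api/v2/questions")]
public class QuestionsV2Controller : ApiController
{
    // New clients opt into the extended shape via the new URI.
    [Route("")]
    public IEnumerable<NewQuestionItem> Get() => new List<NewQuestionItem>();
}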
Depending on the number and size of the actual objects and the potential complexity of the request and result sets, it might also be worth looking into wrapping requests or responses with additional information. So instead of asking for an object of type T, you ask for an object of type QueryResult<T>, with it being defined along the lines of:
[JsonObject(MemberSerialization.OptOut)]
public class QueryResult<T>
{
    [JsonConstructor]
    public QueryResult(T Result, ResultState State,
                       Dictionary<string, string> AdditionalInformation)
    {
        this.Result = Result;
        this.State = State;
        this.AdditionalInformation = AdditionalInformation;
    }

    public T Result { get; }
    public ResultState State { get; }
    public Dictionary<string, string> AdditionalInformation { get; }
}
public enum ResultState : byte
{
    Success = 0,
    Obsolete = 1,
    AuthenticationError = 2,
    DatabaseError = 4,
    // ...
}
which will allow you to ship additional information (such as the API version number, the API version release date, links to different API endpoints, or error information) without changing the object type.
The alternative to using a wrapper with a custom header is to fully implement the HATEOAS constraint, which is also widely used. Both can, together with proper versioning, save you most of the trouble with API changes.
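For example, a small usage sketch of the QueryResult<T> wrapper above (the header keys are made up for illustration):

var result = new QueryResult<QuestionItem>(
    new QuestionItem(1337, 42) { Name = "Hello World!" },
    ResultState.Success,
    new Dictionary<string, string>
    {
        { "api-version", "2.0" },
        { "deprecated-after", "2024-01-01" }
    });
var jsonString = JsonConvert.SerializeObject(result);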
How about wrapping your OldQuestionItem as a property of NewQuestionItem? For example:
public class NewQuestionItem
{
public OldQuestionItem OldItem { get; set; }
public string Property1 {get; set; }
public string Property2 {get; set; }
...
}
This way you can maintain the previous version of the item, yet define new information to be returned.
Koda
You can use something like
public class OldQuestionItem
{
public DateTime UploadTimeStamp { get; set; } // if earlier than the new version's release date, it's an old QuestionItem
public string Property1 {get; set; }
public string Property2 {get; set; }
...
public OldQuestionItem(NewQuestionItem newItem)
{
// logic to convert the new item into the old shape
}
}
public class NewQuestionItem : OldQuestionItem
{
}
and use UploadTimeStamp as a marker to tell which kind of question it is.
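A tiny sketch of that check (the cutover date is an assumption for illustration):

static bool IsOldQuestionItem(OldQuestionItem item)
{
    // Anything uploaded before the new version shipped is treated as the old shape.
    var newVersionReleaseDate = new DateTime(2020, 1, 1);
    return item.UploadTimeStamp < newVersionReleaseDate;
}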
I need to add filtering to my API requests that support AutoQuery, so based on this SO answer I used q.And to add conditions. The issue is that one of the POCO properties is a List<string>, and it seems a simple Contains() won't work. Here's a simple example of what I have:
public class PocoObject
{
public int Id { get; set; }
public List<string> Names { get; set; }
}
My service looks like this:
public object Get(PocoObjects request)
{
var q = AutoQuery.CreateQuery(request, Request.GetRequestParams());
if (someCondition)
{
q.And(x => x.Names.Contains(request.TargetName));
}
return AutoQuery.Execute(request, q);
}
Problem is, I get an error like this:
variable 'x' of type 'TestProject.ServiceModel.Types.PocoObject' referenced from scope '', but it is not defined
If I change the Contains to a simpler equality comparison on another property, the AutoQuery works. Any ideas how to accomplish this?
You can't do a server-side SQL query on a blobbed complex type property like List<string>. Any queries need to be applied on the client after the results are returned from the db and deserialized back into typed POCOs.
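A rough sketch of applying the filter in memory after the query runs (assuming AutoQuery's usual QueryResponse<T> result shape; adjust to your actual response type):

public object Get(PocoObjects request)
{
    var q = AutoQuery.CreateQuery(request, Request.GetRequestParams());
    var response = AutoQuery.Execute(request, q);

    // The Names column is blobbed, so filter it after the rows have been
    // deserialized back into PocoObject instances.
    if (someCondition)
    {
        response.Results = response.Results
            .Where(x => x.Names != null && x.Names.Contains(request.TargetName))
            .ToList();
    }
    return response;
}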
I've been looking for answers for this relatively simple task but with no success. So I thought I'd ask my question here. I have got a simple database with two tables, Books and Authors.
I got my models generated by the ADO.NET entity data model. This is the auto-generated Books model:
public partial class Book
{
public int BookID { get; set; }
public string Title { get; set; }
public string Description { get; set; }
public int ISBN { get; set; }
public int AuthorID { get; set; }
public virtual Author Author { get; set; }
}
And this is the auto-generated Authors model:
public partial class Author
{
public Author()
{
this.Books = new HashSet<Book>();
}
public int AuthorID { get; set; }
public string Name { get; set; }
public virtual ICollection<Book> Books { get; set; }
}
And this is a part of the controller, the method for getting a list of all the books in JSON format.
// api/books
public IQueryable<Book> GetBooks()
{
// return db.Books.Include(x => x.Author); // doesn't work
return db.Books;
}
This is my JS for calling the endpoint:
$.getJSON("api/books")
.done(function (data) {
console.log(data);
})
.fail(function (xhr) { console.log(xhr.responseText) });
Nothing fancy, just trying to make a GET request and receiving a list of all the books and their related authors.
This is a portion of the error message:
{"Message":"An error has occurred.","ExceptionMessage":"The 'ObjectContent`1' type failed to serialize the response body for content type 'application/json; charset=utf-8'.","ExceptionType":"System.InvalidOperationException","StackTrace":null,"InnerException":{"Message":"An error has occurred.","ExceptionMessage":"Self referencing loop detected for property 'Author' with type 'System.Data.Entity.DynamicProxies.Author_5968F94A1BBB745A30D62CD59D0AC5F96A198B3F16F0EA2C2F61575F70034886'. Path '[0].Books[0]'.","ExceptionType":"Newtonsoft.Json.JsonSerializationException","StackTrace":"
I have tried preserving object references in JSON, but that mangles the response. Is that the only option?
If you examine the inner exception it says:
Self referencing loop detected for property 'Author'
This tells you that your Author class references back to the parent (i.e. Books or vice versa).
In your web api config (App_Start/WebApiConfig.cs), add this:
public static class WebApiConfig
{
public static void Register(HttpConfiguration config)
{
// Prevent "Self referencing loop detected" error occurring for recursive objects
var serializerSettings = new JsonSerializerSettings()
{
ReferenceLoopHandling = ReferenceLoopHandling.Ignore
};
config.Formatters.JsonFormatter.SerializerSettings = serializerSettings;
}
}
This tells JSON.NET to ignore nested objects referring back to the parent object.
adaam's answer is 100% correct, but I thought I'd chime in with a bit of advice; it's too long to fit in a comment, so here goes.
Directly serialising Entity Framework objects is generally not a great idea; it's better to use simple DTO-style objects for passing data back to clients.
Of course this is just advice and YMMV :)
The benefits of using DTOs include
Proper decoupling from your Entity Framework objects in your Controllers (you could create a Repository abstraction and use that in your controller, meaning your controller is free from a dependency on Entity Framework and thus more testable)
Simpler serialization control - with Entity Framework you will have difficulty controlling which public properties are sent across the wire to clients when the Entity Framework proxy is serialized directly; typically in DB First the declarations for these properties are in auto-generated files that are rewritten each time your edmx changes; therefore it becomes painful to maintain non-serialization attributes on the properties you don't want sent across the wire (e.g. [IgnoreDataMember], etc.)
If you're planning on accepting models via POST, PUT, etc, then you'll rapidly find that it is a pain (if not impossible) to effectively serialize "inward" directly to the Entity Framework proxies so you'll have to write mapping code anyway; by using a DTO approach you accept that you have to map up-front.
Circular references don't happen therefore you never need to worry about it and more importantly, you don't pay the cost of ignoring it (albeit minor, but the serializer has to do some work to avoid these references)
You can easily perform extra transformations, for example flattening, to better suit the client and/or hide details you don't want sent across the wire.
Example
public class BookDTO
{
    public int BookID { get; set; }
    public string Title { get; set; }
    public string Description { get; set; }
    public int ISBN { get; set; }
    public string AuthorName { get; set; }
}
public HttpResponseMessage GetBooks()
{
    // ideally you'd be using a repository abstraction instead of db directly,
    // but I want to keep this simple
    var books = db.Books.Select(
        book => new BookDTO()
        {
            BookID = book.BookID,
            Title = book.Title,
            Description = book.Description,
            ISBN = book.ISBN,
            AuthorName = book.Author.Name // <- flattening
        });
    return Request.CreateResponse(HttpStatusCode.OK, books);
}
This produces an array of nice, flat objects for the client to consume, without exposing, for example, the AuthorID, which might be an internal concept you don't particularly want clients to know about.
Once you get the hang of it, you can then look at using something like AutoMapper, which will greatly reduce the maintenance burden and allow you to perform inline projection in your queries.
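As a hedged sketch of that AutoMapper approach (the mapping profile and ProjectTo call below are illustrative and assume a reasonably recent AutoMapper version, not something from this answer):

using AutoMapper;
using AutoMapper.QueryableExtensions;

static readonly MapperConfiguration MapperConfig = new MapperConfiguration(cfg =>
    cfg.CreateMap<Book, BookDTO>()
       .ForMember(d => d.AuthorName, o => o.MapFrom(s => s.Author.Name)));

public HttpResponseMessage GetBooks()
{
    // ProjectTo translates the mapping into the query itself, so only the
    // DTO's columns are selected from the database.
    var books = db.Books.ProjectTo<BookDTO>(MapperConfig);
    return Request.CreateResponse(HttpStatusCode.OK, books);
}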
It is always best to create your own custom models when returning data, but if you just want to use the Entity Framework classes then, along with ignoring reference loop handling, you also need to disable proxy creation on your Entity Framework context.
Follow these steps:
Step 1: As adaam described, put this in the WebApiConfig Register function:
// Prevent "Self referencing loop detected" error occurring for recursive objects
var serializerSettings = new JsonSerializerSettings()
{
ReferenceLoopHandling = ReferenceLoopHandling.Ignore
};
config.Formatters.JsonFormatter.SerializerSettings = serializerSettings;
Step 2: Most important in my case with the latest EF: disable proxy creation in the EF context.
Go to your [FILE].edmx file, then to your [FILE].Context.cs, and add the line below to your constructor:
Configuration.ProxyCreationEnabled = false;
Now you won't get the related class results any more.
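To make it concrete, here's roughly what the edited, auto-generated context constructor ends up looking like (the context name BooksEntities is just a stand-in for whatever your [FILE].Context.cs actually contains):

public partial class BooksEntities : DbContext
{
    public BooksEntities() : base("name=BooksEntities")
    {
        // Stops EF from returning DynamicProxies.* types that drag related
        // entities (and their back-references) into the serializer.
        Configuration.ProxyCreationEnabled = false;
    }
    // ... generated DbSet<Book> / DbSet<Author> properties remain unchanged
}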
Class (Entity)
public class Entity
{
public ObjectId Id { get; set; }
public Entity()
{
Id = ObjectId.GenerateNewId();
}
}
Class (Member)
public class Member : Entity
{
public string FirstName { get; set; }
public string LastName { get; set; }
public string Email { get; set; }
public string MobileNumber { get; set; }
}
Action
public dynamic Get()
{
var response = UnitOfWork.MemberRepository.GetMembers();
return response;
}
I'm building an API using .NET Web API with MongoDB as a datastore, and I'm having some trouble serializing the response objects from the database.
I can't understand why; I searched the internet for a while and found similar problems with no solutions. Either I'm bad at googling or the answer is hidden somewhere deep :)
Full stack trace: http://pastie.org/8389787
This is a little guesswork, but the code really isn't too telling.
I'm pretty sure this is because the C# Mongo driver's BsonDocument exposes a ton of properties like AsBoolean, AsInt, AsString, etc. Calling those getters on data that isn't convertible to the respective type causes an exception. While I don't see them in the stack trace, that might be a compiler optimization.
One solution is to make the code strongly typed (if it isn't already). I don't know what UnitOfWork.MemberRepository.GetMembers() is, but it hides what you're doing, and it's also not clear what it returns. You're losing a lot of the advantages of the C# driver. The Collection<T> class is pretty much a repository pattern already, by the way.
A cleaner approach (they aren't mutually exclusive) is to not serialize the database object to the outside world, but to use DTOs on the Web API side and translate between them, for instance using AutoMapper. I would always do this, because otherwise you're throwing an object that might be decorated with DB attributes into a serializer you don't control - that can lead to all sorts of problems. Also, you often want to hide certain information from the outside, or make it read-only.
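A rough sketch of that DTO idea for the Member case above (MemberDto, and the assumption that GetMembers() returns Member objects, are mine, not the poster's):

public class MemberDto
{
    public string Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string Email { get; set; }
}

public IEnumerable<MemberDto> Get()
{
    return UnitOfWork.MemberRepository.GetMembers()
        .Select(m => new MemberDto
        {
            Id = m.Id.ToString(), // ObjectId becomes a plain string on the wire
            FirstName = m.FirstName,
            LastName = m.LastName,
            Email = m.Email
        });
}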
Another option is to use ServiceStack.Text as a JSON-serializer instead, which tends to cause less trouble in my experience.