I have API calls that return ExpandoObject and dynamic objects. Even though I have NullValueHandling set to Ignore, nulls are still sent back in the resulting JSON. I have tested it with my other calls that return POCO objects, and they function correctly (omitting null-valued fields). Is there a way around this?
I was thinking of trying to convert the expando/dynamic objects into something that the serializer could process the same way it does POCO object results, but I don't know what that would be.
I tried manually serializing the object and then deserializing it into a JSONArrayObject before it gets to the final serialization in the MVC middleware, but that didn't work.
Also, I can't just create a POCO for these objects, because they are the result of "Data Shaping": the user sends in the fields they want to receive in the object, and then we take the resulting POCO and turn it into an ExpandoObject with only the fields they requested.
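For context, the usual reason this happens is that ExpandoObject is serialized as an IDictionary<string, object>, and NullValueHandling.Ignore applies to object properties rather than dictionary entries. A minimal sketch of one workaround (the extension method below is my own, not something the framework provides): strip the null entries from the shaped expando before returning it.

// Sketch: remove null-valued members from a shaped ExpandoObject so the
// serializer never sees them.
using System.Collections.Generic;
using System.Dynamic;

public static class ExpandoExtensions
{
    public static ExpandoObject WithoutNulls(this ExpandoObject source)
    {
        var result = new ExpandoObject();
        var target = (IDictionary<string, object>)result;

        foreach (var pair in (IDictionary<string, object>)source)
        {
            if (pair.Value != null)
            {
                target[pair.Key] = pair.Value;
            }
        }

        return result;
    }
}

The shaping code would call WithoutNulls() on the result just before returning it, so the null members never reach the serializer.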
I have a class with properties of type object, such as:
public object Meta { get; set; }
Previously, I stored these properties without serialization, which resulted in an exception (due to deserialization errors) when attempting to access them. To resolve this issue, I now serialize these fields before inserting the documents into the database. This solution works well, as they are deserialized on the front end. However, I am curious whether there is a more efficient method, such as a custom serializer, that could allow these properties to be stored in the database as an object rather than as a long, unreadable string.
If we have this object, for example:
"meta":{"description": "some long text here", "someValue" : "another long text here","anotherValue":"some really long text here"}
It would be quite nice and beneficial to be able to easily access and read individual fields within the meta object separately from the database.
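A minimal sketch of one option (the helper below is my own, assuming the MongoDB .NET driver and Json.NET are both available): convert the Meta value to a BsonDocument before inserting, so it is stored as a real sub-document whose fields stay individually readable and queryable, instead of one long string.

// Sketch: turn an arbitrary CLR object into a BsonDocument sub-document.
// Round-tripping through JSON keeps it independent of the object's concrete type.
using MongoDB.Bson;
using Newtonsoft.Json;

public static class MetaConversion
{
    public static BsonDocument ToMetaDocument(object meta)
    {
        string json = JsonConvert.SerializeObject(meta);
        return BsonDocument.Parse(json);
    }
}

Stored that way, a field such as meta.description can be projected or filtered on directly in queries.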
I'm trying to deserialize an unstructured JSON object (there are multiple schema possibilities) to a BsonDocument, and I'm trying to specify the correct types for some properties (which I know in advance).
Say for example (very simplified example):
{ "Id": "039665be-a1a8-4062-97d6-e44fea2affff", "Foo":"Bar", "Baz":30 }
So I know that every time I find an "Id" property (which may or may not be there) at the root of the object, it is to be converted to a UUID type (BSON binary subtype 4).
I've made a simple JsonReader descendant, and I'm overriding ReadBsonType, and both returning a new CurrentBsonType and providing a converted value there, then overriding all methods for every possible type (ReadDateTime(), ReadInt32(), ReadInt64(), ReadBinaryData(), etc.) and providing a parsed value.
This works fine (albeit I find it a bit uncomfortable) when the JSON object is flat, but if it has nested objects with properties of the same name (which I do not want to parse), then problems arise.
I've tried overriding ReadStartArray(), ReadStartDocument(), etc., and tried building a "path" I could query, but the order in which the JsonReader's methods are called is baffling to me (it seems to check for the type before checking for the name of the property, so when checking for the type the CurrentName property refers to the previous property, and so on).
I've somewhat circumvented it with very ugly code... and I'm sure there must be a better way to do this without class mapping, although finding documentation is proving hard (since Mongo often refers to Extended JSON simply as "JSON", so documentation from here and there gets mixed up).
Has anyone ever found themselves in such situation?
PS: before anyone asks, I'm storing data returned as JSON strings from a third-party server in a Mongo database (to be mined later on), and while there are some schemas for the data types available (and I could class-map them), there might be new schemas in the future, so I can't just class-map everything. Some properties (if they exist) are always the same, though, so instead of storing everything in Mongo as a string I'd rather give them the correct types from the start.
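For what it's worth, a minimal sketch of a simpler alternative (my own code, assuming a driver version that has the BsonBinaryData(Guid, GuidRepresentation) constructor): parse the whole document first and then coerce only the known root-level properties, which sidesteps the reader's call ordering entirely.

// Sketch: parse first, then fix up root-level properties whose types are known.
// Nested objects with an "Id" property are left untouched.
using System;
using MongoDB.Bson;

public static class DocumentTypeFixer
{
    public static BsonDocument Fix(string json)
    {
        BsonDocument doc = BsonDocument.Parse(json);

        if (doc.TryGetValue("Id", out BsonValue id)
            && id.IsString
            && Guid.TryParse(id.AsString, out Guid guid))
        {
            // Store as BSON binary subtype 4 (standard UUID representation).
            doc["Id"] = new BsonBinaryData(guid, GuidRepresentation.Standard);
        }

        return doc;
    }
}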
Here's my situation: I have an MVC3 app that has some very complex C# objects, and those get rendered to views in this application. However, I have a new requirement: a console application (which I am also writing) will run under a scheduler, and it needs to pull these objects from the MVC3 app and then do something else with them.
Since I have control over both apps, I can share a library of the complex objects between them. All of these objects are marked [Serializable]. However, I cannot figure out an easy way to serialize these objects and send them from the MVC3 app to the Console app.
I tried a simple JavaScriptSerializer, using HttpClient to read the string and then deserializing it on the console-app end of things, but unfortunately it doesn't deserialize the data correctly. Everything is null. I can inspect the string on a breakpoint when it arrives at the console app, and all the data is there in the string, but it just doesn't get deserialized correctly.
Is there an easier way to do this? I don't care what the serialization method is. The data doesn't have to be passed as JSON and no other application but mine is going to consume these objects. But so far I can't figure out the easiest way to produce/consume these objects.
I know I can go down the whole "create a web service contract" and use data annotations route, but I was hoping there was an easier, less time-consuming way of doing it.
Using Json.NET:
Server-Side
string serializedObject = JsonConvert.SerializeObject(yourComplexObject);
// Send the string to the client...
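For example (a sketch; the action name and the BuildComplexObject helper are mine, not from the original answer), in an MVC controller the string can be returned with a JSON content type:

public ActionResult GetComplexObject()
{
    var yourComplexObject = BuildComplexObject(); // however the object is produced
    string serializedObject = JsonConvert.SerializeObject(yourComplexObject);

    // Content() hands the pre-serialized string back to the caller as-is.
    return Content(serializedObject, "application/json");
}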
Client-Side
In the client, you don't even have to know the deserialized object's type; you can take advantage of dynamic:
string serializedObject = // ... Fetch from server
dynamic complexObject = JsonConvert.DeserializeObject(serializedObject);
// ...
string id = complexObject.UserId;
P.S.: Please note that the object's methods and non-public state are not going to be serialized; only the public properties are.
Can your action just return your object? If so, your client code would look something like (using HttpClient)
// ReadAsAsync<T> comes from System.Net.Http.Formatting (Microsoft.AspNet.WebApi.Client package).
var result = await client.GetAsync(url);
var myObj = await result.Content.ReadAsAsync<T>();
I need help mapping a complex C# object to a table dynamically, without manually defining a model. I'm using ASP.NET 4.5.
So I am using JSON.NET to serialize my complex C# object into a JSON object with
string json = JsonConvert.SerializeObject(myObject, Formatting.Indented);
This object is being read in dynamically from an external API. It includes complex attributes, like dictionaries, as well as standard strings, integers, etc. I input a key and it returns the C# object.
I can't manually map the object to a model, since the object changes frequently during development and needs to remain extensible as the C# object changes. What is the best way to map this JSON object to a table? Essentially, I need to be able to render the JSON serialized from the object into a readable table with ASP.NET.
Thanks so much!
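A minimal sketch of one way to do this with Json.NET's JObject/JToken API (the helper name and the flattening approach are my own): parse the serialized object and walk its tokens, flattening nested values into dotted paths so everything can be rendered as simple name/value rows.

// Sketch: flatten a JSON object into (path, value) pairs that are easy to bind
// to a table control such as a GridView or an HTML table in a view.
using System.Collections.Generic;
using Newtonsoft.Json.Linq;

public static class JsonTableMapper
{
    public static IEnumerable<KeyValuePair<string, string>> Flatten(string json)
    {
        var root = JObject.Parse(json);

        foreach (JToken token in root.Descendants())
        {
            // Only leaf values become rows; objects and arrays act as grouping paths.
            var value = token as JValue;
            if (value != null)
            {
                yield return new KeyValuePair<string, string>(value.Path, value.ToString());
            }
        }
    }
}

Each pair can then be bound to a two-column table (for example, a GridView with Key and Value columns), and new fields on the API object simply show up as additional rows without any model changes.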
I have a class MyClass containing a private List<MySecondClass> myList. The list is exposed through a getter as follows:
public IEnumerable<MySecondClass> MyList
{
get { return myList.Select(a => a); }
}
The list is modified through public AddItem(MySecondClass itemToAdd) and ClearItems() methods. I believe that this is a properly encapsulated list.
The problem lies in that I need to pass an object of type MyClass (containing myList) via SOAP to a web service, which fills myList (using the AddItem() method), and then returns the object.
However, when the web method returns the class, myList is empty after serialization. I suspect this is because I do not have a setter for myList, which is causing the list not to be set when the object is deserialized.
Is this a good assumption, or am I way off? If the problem is what I think it is, is there a way to allow for the list to be successfully passed from the webmethod without breaking encapsulation (I do not want to expose a generic list)?
Without trying this directly myself, I believe that you could definitely be correct.
Serialization in .NET makes using read-only properties a fun circus, because the default .NET serialization process requires a setter in order to deserialize the object. Without a setter, the serialization piece will still work, allowing you to serialize to a drive or across the network, but it is the deserialization process that will fail, which could definitely be why your collection is empty. I'm just amazed it doesn't error out, to be honest.
Have you tried adding a simple setter, just to verify with 100% certainty that this is in fact the problem before working to solve it?
While I never really solved the initial problem, what I did to get it working was simplify the data being passed to the web method. Instead of passing an entire object, I pass a unique identifier. The web method then returns the list I need, and I handle actually adding the items in that list to the object client-side.
The XML Serializer used by ASMX services only serializes public read/write properties.
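A minimal sketch of how to work with that constraint (my own example built around the class names from the question, with a placeholder MySecondClass): keep the private list encapsulated for callers, and expose a public read/write array property purely so the XML Serializer has something it can set when the object comes back from the web method.

using System.Collections.Generic;
using System.Linq;
using System.Xml.Serialization;

public class MySecondClass { public string Name { get; set; } } // placeholder for the real type

public class MyClass
{
    private readonly List<MySecondClass> myList = new List<MySecondClass>();

    public void AddItem(MySecondClass itemToAdd) { myList.Add(itemToAdd); }
    public void ClearItems() { myList.Clear(); }

    // The encapsulated view that callers use; hidden from the XML Serializer.
    [XmlIgnore]
    public IEnumerable<MySecondClass> MyList
    {
        get { return myList.Select(a => a); }
    }

    // Read/write property that exists only so serialization can round-trip
    // the items; it is not intended for normal callers.
    [XmlArray("MyList")]
    public MySecondClass[] MyListForSerialization
    {
        get { return myList.ToArray(); }
        set
        {
            myList.Clear();
            if (value != null) { myList.AddRange(value); }
        }
    }
}

The same idea works with a List<MySecondClass> property instead of an array; the key point is that the serializer needs a public getter and setter to fill the collection back in during deserialization.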