Json.Net not serializing decimals the same way twice - c#

I was testing Json.NET serialization of a shopping cart I'm working on and noticed that when I serialize -> deserialize -> serialize again, I'm getting a difference in the trailing zero formatting of some of the decimal fields. Here is the serialization code:
private static void TestRoundTripCartSerialization(Cart cart)
{
    string cartJson = JsonConvert.SerializeObject(cart, Formatting.Indented);
    Console.WriteLine(cartJson);

    Cart cartClone = JsonConvert.DeserializeObject<Cart>(cartJson);
    string cloneJson = JsonConvert.SerializeObject(cartClone, Formatting.Indented);
    Console.WriteLine(cloneJson);

    Console.WriteLine("\r\n Serialized carts are " + (cartJson == cloneJson ? "" : "not") + " identical");
}
The Cart implements IEnumerable<T> and is marked with [JsonObject] so that it serializes as an object, including its properties as well as its inner list. The decimal properties of Cart itself do not change, but some of the decimal properties of the objects (and their nested objects) in the inner list/array do change, as in this excerpt from the output of the code above:
First time serializing:
...
"Total": 27.0000,
"PaymentPlan": {
"TaxRate": 8.00000,
"ManualDiscountApplied": 0.0,
"AdditionalCashDiscountApplied": 0.0,
"PreTaxDeposit": 25.0000,
"PreTaxBalance": 0.0,
"DepositTax": 2.00,
"BalanceTax": 0.0,
"SNPFee": 25.0000,
"cartItemPaymentPlanTypeID": "SNP",
"unitPreTaxTotal": 25.0000,
"unitTax": 2.00
}
}
],
}
Second time serializing:
...
"Total": 27.0,
"PaymentPlan": {
"TaxRate": 8.0,
"ManualDiscountApplied": 0.0,
"AdditionalCashDiscountApplied": 0.0,
"PreTaxDeposit": 25.0,
"PreTaxBalance": 0.0,
"DepositTax": 2.0,
"BalanceTax": 0.0,
"SNPFee": 25.0,
"cartItemPaymentPlanTypeID": "SNP",
"unitPreTaxTotal": 25.0,
"unitTax": 2.0
}
}
],
}
Notice that Total, TaxRate, and some of the others have changed from several trailing zeroes to a single trailing zero. I did find some discussion of changes to trailing-zero handling in the Json.NET source code at one point, but nothing I understood well enough to connect to this. I can't share the full Cart implementation here, but I built a bare-bones model of it and couldn't reproduce the results. The most obvious differences were that my bare-bones version dropped some inheritance/implementation of abstract base classes and interfaces, and some generic type usage on those (where the generic type parameter defines the type of some of the nested child objects).
So I'm hoping someone can still answer without that: any idea why the trailing zeroes change? The objects appear identical to the original after deserializing either JSON string, but I want to be sure there isn't something in Json.NET causing a loss of precision or rounding that could gradually change one of these decimals over many serialization round trips.
Updated
Here's a reproducible example. I thought I had ruled out the JsonConverter, but I was mistaken. Because my inner _items list is typed on an interface, I have to tell Json.NET which concrete type to deserialize back to. I didn't want the actual type names in the JSON, so rather than using TypeNameHandling.Auto, I've given the items a unique string identifier property. The JsonConverter uses that to choose a concrete type to create, but I guess the JObject has already parsed my decimals to doubles? This is maybe my second time implementing a JsonConverter, and I don't have a complete understanding of how they work because finding documentation has been difficult. So I may have ReadJson all wrong.
[JsonObject]
public class Test : IEnumerable<IItem>
{
    [JsonProperty(ItemConverterType = typeof(TestItemJsonConverter))]
    protected List<IItem> _items;

    public Test() { }

    [JsonConstructor]
    public Test(IEnumerable<IItem> o)
    {
        _items = o == null ? new List<IItem>() : new List<IItem>(o);
    }

    public decimal Total { get; set; }

    IEnumerator IEnumerable.GetEnumerator()
    {
        return _items.GetEnumerator();
    }

    IEnumerator<IItem> IEnumerable<IItem>.GetEnumerator()
    {
        return _items.GetEnumerator();
    }
}

public interface IItem
{
    string ItemName { get; }
}

public class Item1 : IItem
{
    public Item1() { }
    public Item1(decimal fee) { Fee = fee; }

    public string ItemName { get { return "Item1"; } }
    public virtual decimal Fee { get; set; }
}

public class TestItemJsonConverter : JsonConverter
{
    public override bool CanConvert(Type objectType) { return (objectType == typeof(IItem)); }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        object result = null;
        JObject jObj = JObject.Load(reader);
        string itemTypeID = jObj["ItemName"].Value<string>();

        //NOTE: My real implementation doesn't have hard coded strings or types here.
        //See the code block below for actual implementation.
        if (itemTypeID == "Item1")
            result = jObj.ToObject(typeof(Item1), serializer);

        return result;
    }

    public override bool CanWrite { get { return false; } }

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer) { throw new NotImplementedException(); }
}

class Program
{
    static void Main(string[] args)
    {
        Test test1 = new Test(new List<Item1> { new Item1(9.00m), new Item1(24.0000m) })
        {
            Total = 33.0000m
        };

        string json = JsonConvert.SerializeObject(test1, Formatting.Indented);
        Console.WriteLine(json);
        Console.WriteLine();

        Test test1Clone = JsonConvert.DeserializeObject<Test>(json);
        string json2 = JsonConvert.SerializeObject(test1Clone, Formatting.Indented);
        Console.WriteLine(json2);

        Console.ReadLine();
    }
}
Snippet from my actual converter:
if (CartItemTypes.TypeMaps.ContainsKey(itemTypeID))
    result = jObj.ToObject(CartItemTypes.TypeMaps[itemTypeID], serializer);
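For context, CartItemTypes.TypeMaps is essentially a lookup from that string identifier to a concrete type; a simplified sketch (the entries here are placeholders, not my real item types):
public static class CartItemTypes
{
    // Maps an item's string identifier to the concrete type it should deserialize to.
    public static readonly Dictionary<string, Type> TypeMaps = new Dictionary<string, Type>
    {
        { "Item1", typeof(Item1) },
        // ... one entry per concrete cart item type
    };
}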

If your polymorphic models contain decimal properties, then in order not to lose precision you must temporarily set JsonReader.FloatParseHandling to FloatParseHandling.Decimal when pre-loading your JSON into a JToken hierarchy, like so:
public class TestItemJsonConverter : JsonConverter
{
    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        object result = null;
        var old = reader.FloatParseHandling;
        try
        {
            reader.FloatParseHandling = FloatParseHandling.Decimal;
            JObject jObj = JObject.Load(reader);
            string itemTypeID = jObj["ItemName"].Value<string>();

            //NOTE: My real implementation doesn't have hard coded strings or types here.
            //See the code block below for actual implementation.
            if (itemTypeID == "Item1")
                result = jObj.ToObject(typeof(Item1), serializer);
        }
        finally
        {
            reader.FloatParseHandling = old;
        }
        return result;
    }

    // CanConvert, CanWrite, and WriteJson are unchanged from the converter above.
}
Demo fiddle here.
Why is this necessary? As it turns out, you have encountered an unfortunate design decision in Json.NET. When JsonTextReader encounters a floating-point value, it parses it to either decimal or double as defined by the above-mentioned FloatParseHandling setting. Once the choice is made, the JSON value is parsed into the target type and stored in JsonReader.Value, and the underlying character sequence is discarded. Thus, if a poor choice of floating-point type is made, it's difficult to correct the mistake later on.
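To see the effect in isolation, here is a minimal sketch (assuming the default serializer settings):
decimal original = 25.0000m;

// Serializing the decimal directly preserves its scale, i.e. the trailing zeros:
Console.WriteLine(JsonConvert.SerializeObject(original));                  // 25.0000

// Pre-loading into a JToken uses the reader's default FloatParseHandling.Double,
// so the number is parsed as a double and the original digits are discarded:
JToken token = JToken.Parse("25.0000");
Console.WriteLine(JsonConvert.SerializeObject(token.ToObject<decimal>())); // 25.0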
So, ideally, we would like to choose as the default floating-point type the "most general" one, a type that can be converted to all the others without loss of information. Unfortunately, no such type exists in .NET. The possibilities are summarized in Characteristics of the floating-point types (approximate figures from the .NET documentation):
Type       Approximate range                     Precision
double     ±5.0 × 10^-324 to ±1.7 × 10^308       ~15-17 digits
decimal    ±1.0 × 10^-28 to ±7.9228 × 10^28      28-29 significant digits
As you can see, double supports a larger range while decimal supports a larger precision. As such, to minimize data loss, sometimes decimal would need to be chosen and sometimes double. And, again unfortunately, no such logic is built into JsonReader; there is no FloatParseHandling.Auto option to choose the most appropriate representation.
In the absence of such an option or the ability to load the original floating-point value as a string and re-parse it later, you will need to hardcode your converter with an appropriate FloatParseHandling setting based upon your data model(s) when you pre-load your JToken hierarchy.
In cases where your data models contain both double and decimal members, pre-loading using FloatParseHandling.Decimal will likely meet your needs, because Json.NET will throw a JsonReaderException when attempting to deserialize a too-large value into a decimal (demo fiddle here) but will silently round the value off when attempting to deserialize a too-precise value into a double. Practically speaking, it's unlikely you will have floating-point values larger than 10^28 with more than 15 digits of precision + trailing zeros in the same polymorphic data model. In the unlikely chance you do, by using FloatParseHandling.Decimal you'll get an explicit exception explaining the problem.
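Both behaviors are easy to reproduce in a couple of lines (a sketch; StringReader comes from System.IO):
// Too precise for double: with the default FloatParseHandling.Double the extra
// digits are silently rounded away -- no exception is thrown.
JToken rounded = JToken.Parse("0.0050000012852251529693603515625");
Console.WriteLine(rounded);   // prints a double rounded to ~15-17 significant digits

// Too large for decimal: with FloatParseHandling.Decimal the reader throws instead.
var reader = new JsonTextReader(new StringReader("1e300")) { FloatParseHandling = FloatParseHandling.Decimal };
JToken.ReadFrom(reader);      // throws JsonReaderException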
Notes:
I don't know why double was chosen instead of decimal as the "default default" floating point format. Json.NET was originally released in 2006; my recollection is that decimal wasn't widely used back then, so maybe this is a legacy choice that was never revisited?
When deserializing directly to a decimal or double member, the serializer will override the default floating-point type by calling ReadAsDouble() or ReadAsDecimal(), so precision is not lost when deserializing directly from a JSON string. The problem only arises when pre-loading into a JToken hierarchy then subsequently deserializing.
Utf8JsonReader and JsonElement from System.Text.Json, Microsoft's replacement for Json.NET in .NET Core 3.0, avoid this problem by always maintaining the underlying byte sequence of a floating-point JSON value, which is one example of the new API being an improvement over the old.
If you actually have values larger than 10^28 with more than 15 digits of precision + trailing zeros in the same polymorphic data model, switching to this new serializer might be a valid option.
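For illustration, a short sketch of that difference using System.Text.Json (the JsonDocument/JsonElement API available from .NET Core 3.0):
using (JsonDocument doc = JsonDocument.Parse("{\"Total\": 27.0000}"))
{
    JsonElement total = doc.RootElement.GetProperty("Total");
    Console.WriteLine(total.GetRawText());   // 27.0000 -- the original number text is preserved
    Console.WriteLine(total.GetDecimal());   // parsed on demand from that text as a decimal
}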

Related

Handling invalid inputs when deserializing JSON to decimal values

I have a JSON file with an array of objects, each containing a string value, grade, that I'd like to parse to decimal.
The string value contains a valid decimal about 99% of the time, but in that other 1% I'm getting values such as "grade":"<1", which is obviously not a valid decimal. The grade property is just one of about 100 properties that can sometimes be set to "<1".
This of course throws the following error:
Newtonsoft.Json.JsonReaderException: 'Could not convert string to
decimal'
Here is how I am currently deserializing my JSON:
public static Product FromJson(string json) => JsonConvert.DeserializeObject<Product>(json, Converter.Settings);
Is there anything I can do to handle cases where I'm getting those pesky "<1" values? Hopefully something that does the following: when attempting to deserialize a value to decimal, if the value cannot be parsed as a decimal, default to zero.
Any ideas if this is possible? I obviously don't want to have to update my table columns to switch all values from decimal to varchar, because that just sucks and is going to require decimal <-> varchar conversions every time someone wants to query my data.
You can solve this problem by making a custom JsonConverter to handle the decimals:
class TolerantDecimalConverter : JsonConverter
{
    public override bool CanConvert(Type objectType)
    {
        return objectType == typeof(decimal);
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        if (reader.TokenType == JsonToken.Float || reader.TokenType == JsonToken.Integer)
        {
            return Convert.ToDecimal(reader.Value);
        }
        if (reader.TokenType == JsonToken.String && decimal.TryParse((string)reader.Value, out decimal d))
        {
            return d;
        }
        return 0.0m;
    }

    public override bool CanWrite
    {
        get { return false; }
    }

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        throw new NotImplementedException();
    }
}
To use the converter, just add an instance to the Converters collection in the JsonSerializerSettings that you are passing to JsonConvert.DeserializeObject<T>.
Settings.Converters.Add(new TolerantDecimalConverter());
Note: since you are using decimals, you should probably also set FloatParseHandling to Decimal if you are not already; the default is Double.
Settings.FloatParseHandling = FloatParseHandling.Decimal;
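Putting the two together, a minimal sketch (assuming Product has a decimal grade property, as implied by the question):
var settings = new JsonSerializerSettings
{
    FloatParseHandling = FloatParseHandling.Decimal
};
settings.Converters.Add(new TolerantDecimalConverter());

// "<1" cannot be parsed as a decimal, so grade falls back to 0 instead of throwing:
Product product = JsonConvert.DeserializeObject<Product>(@"{ ""grade"": ""<1"" }", settings);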
Working demo here: https://dotnetfiddle.net/I4n00o

Modify/Add property via custom IContractResolver

I have a class:
public class MyClass
{
    public MyEnum Foo { get; set; }
}
During serialization I'd like to change the output from
{
    "Foo": 1
}
to
{
    "Foo": "EnumName"
}
I've tried creating an IValueProvider but hit dead ends every way I go. (My scenario is a bit more complicated than stated; I need to find a way to do this entirely within the IContractResolver.)
You could create a custom ContractResolver inheriting from DefaultContractResolver that automatically applies StringEnumConverter to every contract for an enum or nullable enum:
public class StringEnumContractResolver : DefaultContractResolver
{
    readonly StringEnumConverter converter;

    public StringEnumContractResolver() : this(true, false) { }

    public StringEnumContractResolver(bool allowIntegerValue, bool camelCaseText)
    {
        this.converter = new StringEnumConverter { AllowIntegerValues = allowIntegerValue, CamelCaseText = camelCaseText };
    }

    protected override JsonPrimitiveContract CreatePrimitiveContract(Type objectType)
    {
        var contract = base.CreatePrimitiveContract(objectType);
        var type = Nullable.GetUnderlyingType(contract.UnderlyingType) ?? contract.UnderlyingType;
        if (type.IsEnum && contract.Converter == null)
            contract.Converter = converter;
        return contract;
    }
}
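A minimal usage sketch (assuming MyEnum has a member named EnumName, matching the expected output in the question):
var settings = new JsonSerializerSettings
{
    // Consider caching the resolver in a static field; see the notes below.
    ContractResolver = new StringEnumContractResolver()
};
string json = JsonConvert.SerializeObject(new MyClass { Foo = MyEnum.EnumName }, settings);
Console.WriteLine(json); // {"Foo":"EnumName"}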
Notes:
If the enum type already has a JsonConverter applied, that is kept in preference to the default StringEnumConverter.
Adding the converter to the JsonPrimitiveContract for the enum itself, rather than to every JsonProperty for members that return the enum, ensures that the converter is applied to enums in collections and dictionaries.
The IValueProvider merely provides methods to get and set values and thus is less convenient for this purpose than the converter. You would need to perform a nested serialization and deserialization of the enum value as a JSON string inside it, but it isn't designed for this and so doesn't have access to the JSON reader, writer, or serializer. In addition, there is no value provider for dictionary values or collection items that are enums.
You may want to cache the contract resolver for best performance as explained here.
Sample .Net fiddle.

How do I deserialize a high-precision decimal value with Json.NET?

I want to deserialize JSON containing long decimal values into custom types to maintain their precision (i.e., a custom BigDecimal class). I'm using Json.NET 9.0.1 and .NET 4.6.1. I've tried using a JsonConverter, but it seems that the value available when ReadJson is called has already been identified and read by Json.NET as a .NET decimal type and is limited to its precision.
Ideally I would have access to the raw string so I could put it in a custom type. I can use string properties on the target object and it deserializes the full string, but then I'd have to further process the object (i.e., copy it into another representation) and that's especially messy across a large schema.
Any thoughts on a better approach?
Here's the target class:
public class DecimalTest
{
    public string stringValue { get; set; }
    public decimal decimalValue { get; set; }
    public BigDecimal bigDecimalValue { get; set; }
}
Here's a test with JSON:
[TestMethod]
public void ReadBigDecimal_Test()
{
    var json = @"{
        ""stringValue"" : 0.0050000012852251529693603515625,
        ""decimalValue"" : 0.0050000012852251529693603515625,
        ""bigDecimalValue"" : 0.0050000012852251529693603515625
    }";

    var settings = new JsonSerializerSettings();
    settings.FloatParseHandling = FloatParseHandling.Decimal;
    settings.Converters.Add(new JsonBigDecimalConverter());

    var result = JsonConvert.DeserializeObject<DecimalTest>(json, settings);

    Assert.IsNotNull(result);
    Assert.AreEqual("0.0050000012852251529693603515625", result.stringValue);
    Assert.AreEqual(0.0050000012852251529693603516m, result.decimalValue);
    // *** This case fails ***
    Assert.AreEqual("0.0050000012852251529693603515625", result.bigDecimalValue.ToString());
}
Here's the custom converter:
public class JsonBigDecimalConverter : JsonConverter
{
    public override bool CanConvert(Type objectType)
    {
        return (objectType == typeof(BigDecimal));
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        // *** reader.Value here already appears to be a .NET decimal.
        // *** If I had access to the raw string I could get this to work.
        return BigDecimal.Parse(reader.Value.ToString());
    }

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        throw new NotImplementedException();
    }
}
Could you try whether the following implementation of ReadJson works as you expect:
public override object ReadJson(
    JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
{
    var token = JToken.Load(reader);
    return BigDecimal.Parse(token.ToString());
}
Update
Unfortunately the above won't work. There seems to be no way to read the raw string from the JSON data.
Also note that in my tests the assert for stringValue fails first. See this working example: https://dotnetfiddle.net/s0pqg3
I assume this is because Json.NET internally immediately parses any number token it encounters according to the specified FloatParseHandling. The raw data is never preserved.
I think the only solution is to wrap the big decimal string in quotes like so:
"bigDecimalValue" : "0.0050000012852251529693603515625"
Here is a working example that does exactly that in order to preserve the desired precision: https://dotnetfiddle.net/U1UG3z
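If you go that route, here is a minimal converter sketch, assuming the value arrives as a JSON string and that BigDecimal.Parse/ToString round-trip the decimal text (as in the question):
public class JsonBigDecimalConverter : JsonConverter
{
    public override bool CanConvert(Type objectType)
    {
        return objectType == typeof(BigDecimal);
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        // The value arrives as a JSON string token, so its full precision is still available.
        return BigDecimal.Parse((string)reader.Value);
    }

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        // Write back out as a string so no precision is lost on serialization either.
        writer.WriteValue(value.ToString());
    }
}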

Ignore duplicates when serializing array with JSON.Net

Clarification (to anyone in the same situation):
Note that my task is to serialize an existing legacy object. As such, I would prefer to tune the serializer rather than interfere with the data structure.
I believe in most cases it's better to remove the duplicates directly from the data, as indicated by @Danny Chen's answer.
As part of my object that I want to serialize with JSON.Net, there is a string[] files property which contains duplicates:
some/path/to/f1.jpg
some/path/to/f1.jpg
some/path/to/f2.jpg
some/path/to/f3.jpg
some/path/to/f2.jpg
And let's suppose these are not necessarily in order (f2, f3, f2).
Is it possible to serialize the array and ignore the duplicates? Expected result:
{
    "files": [
        "some/path/to/f1.jpg",
        "some/path/to/f2.jpg",
        "some/path/to/f3.jpg"
    ]
}
I have tried the PreserveReferencesHandling setting, but it didn't work as each file in the array is a different object, with a possibly repeated value.
It's not part of the serialization, it's part of the data processing. I suggest you remove the duplicates before serialization.
string[] files = GetFiles();
data.Files = files.Distinct().ToArray();
//serialize data
//instead of data.Files = files; and do tricky things in serialization
The simplest solution is to filter the list before serialization, as suggested by @Danny Chen. However, if you absolutely have to do it during serialization, you can use a custom JsonConverter.
Here is the code you would need:
public class RemoveDuplicatesConverter<T> : JsonConverter
{
    public override bool CanConvert(Type objectType)
    {
        return typeof(IEnumerable<T>).IsAssignableFrom(objectType);
    }

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        writer.WriteStartArray();
        foreach (T item in ((IEnumerable<T>)value).Distinct())
        {
            serializer.Serialize(writer, item);
        }
        writer.WriteEndArray();
    }

    public override bool CanRead
    {
        get { return false; }
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        throw new NotImplementedException();
    }
}
To use the converter, add a [JsonConverter] attribute to the list or array property in your class for which you'd like to remove duplicates, as shown below. Be sure the generic type of the converter matches the type of your list.
class Foo
{
    [JsonProperty("files")]
    [JsonConverter(typeof(RemoveDuplicatesConverter<string>))]
    public string[] Files { get; set; }
}
Then serialize as normal. The list in the JSON will have the duplicates removed, but the original list in your object will be unaffected.
string json = JsonConvert.SerializeObject(your_object, Formatting.Indented);
Fiddle: https://dotnetfiddle.net/vs2oWQ

NullValueHandling.Ignore influences deserialization into [ExtensionData] despite matching class member

My server responses consist of a set of known and unknown properties. For the known ones, I created a DTO class with members for each property. The unknown properties shall be put inside a dictionary annotated with the [JsonExtensionData] attribute:
[JsonObject(MemberSerialization = MemberSerialization.OptIn)]
public class Dto
{
    [JsonExtensionData]
    private readonly Dictionary<string, object> unknownProperties = new Dictionary<string, object>();

    public IDictionary<string, object> UnknownProperties
    {
        get
        {
            return new ReadOnlyDictionary<string, object>(this.unknownProperties);
        }
    }

    [JsonProperty(Required = Required.Default, PropertyName = "KNOWN_PROPERTY")]
    public string KnownProperty { get; private set; }
}
Null is allowed as a value for KnownProperty. If I deserialize a JSON object that contains KNOWN_PROPERTY : null and configure the serializer with NullValueHandling.Ignore, this property also ends up in the UnknownProperties dictionary, even though a class member exists for KNOWN_PROPERTY:
static void Main(string[] args)
{
    string jsonString = @"{
        KNOWN_PROPERTY : null,
        UNKNOWN_PROPERTY : null
    }";

    JsonSerializer serializer = JsonSerializer.CreateDefault(new JsonSerializerSettings()
    {
        NullValueHandling = NullValueHandling.Ignore
    });

    using (var textReader = new StringReader(jsonString))
    {
        Dto dto = serializer.Deserialize<Dto>(new JsonTextReader(textReader));
        foreach (var pair in dto.UnknownProperties)
        {
            Console.WriteLine("{0}: {1}", pair.Key, pair.Value == null ? "null" : pair.Value.ToString());
        }
    }
}
Output:
KNOWN_PROPERTY: null
UNKNOWN_PROPERTY: null
If I configure the serializer with NullValueHandling.Include or set a value for KNOWN_PROPERTY in the JSON string, the dictionary contains only UNKNOWN_PROPERTY, as expected.
As I understand it, [JsonExtensionData] is not working correctly when NullValueHandling is set to Ignore, since the documentation states the extension data is used only if no matching class member is found.
Is the behavior I'm seeing intended? Can I do something to avoid this? Because I don't want to send null values to the server, I'd like to stick with the currently set NullValueHandling.
I'm using Json.NET 8.0.2
Update
Reported in JsonExtensionData should not include the null values that are real object properties #1719 and fixed in commit e079301. The fix should be included in the next release of Json.NET after 11.0.2.
Original Answer
Confirmed - in JsonSerializerInternalReader.PopulateObject(object, JsonReader, JsonObjectContract, JsonProperty, string) there is the following logic:
// set extension data if property is ignored or readonly
if (!SetPropertyValue(property, propertyConverter, contract, member, reader, newObject))
{
    SetExtensionData(contract, member, reader, memberName, newObject);
}
The intent seems to be to put the value into the extension data if the property is to be ignored, but Json.NET puts it into the extension data if the value is to be ignored -- a slightly different concept. I agree this could be a bug. You might want to report it.
There is a workaround. Json.NET has two attributes that affect how null/default values are serialized:
NullValueHandling. Specifies to include or ignore null values when serializing and deserializing objects. Values are Include and Ignore.
DefaultValueHandling. This has more elaborate semantics:
Include: Include members where the member value is the same as the member's default value when serializing objects. Included members are written to JSON. Has no effect when deserializing.
Ignore: Ignore members where the member value is the same as the member's default value when serializing objects, so that it is not written to JSON. This option will ignore all default values (e.g. null for objects and nullable types; 0 for integers, decimals and floating point numbers; and false for booleans).
Populate: Members with a default value but no JSON will be set to their default value when deserializing.
IgnoreAndPopulate: Ignore members where the member value is the same as the member's default value when serializing objects and sets members to their default value when deserializing.
So, how do these overlapping settings interact? It turns out that Json.NET checks them both: a value is serialized only if both settings allow it, and deserialized only if both settings allow it. And DefaultValueHandling.IgnoreAndPopulate appears to do what you want: it omits nulls when serializing, but reads and sets them when deserializing, if present.
Thus I was able to get your desired behavior with the following JsonProperty:
public class Dto
{
    [JsonProperty(Required = Required.Default, PropertyName = "KNOWN_PROPERTY", DefaultValueHandling = DefaultValueHandling.IgnoreAndPopulate, NullValueHandling = NullValueHandling.Include)]
    public string KnownProperty { get; private set; }

    // Remainder as before.
}
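A quick sketch of the resulting behavior, reusing the JSON and the NullValueHandling.Ignore setting from the question:
var settings = new JsonSerializerSettings { NullValueHandling = NullValueHandling.Ignore };
Dto dto = JsonConvert.DeserializeObject<Dto>(
    @"{ KNOWN_PROPERTY : null, UNKNOWN_PROPERTY : null }", settings);

// Only the genuinely unknown property ends up in the extension data now:
foreach (var pair in dto.UnknownProperties)
    Console.WriteLine(pair.Key);   // UNKNOWN_PROPERTY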
Prototype fiddle.
