When serializing with Json.NET, I need to re-escape an embedded JSON string that I previously unescaped while deserializing. That is, I unescaped the following JSON according to this post.
Here is my JSON:
{
"Message":null,
"Error":false,
"VData":{
"RNumber":null,
"BRNumber":"Session1"
},
"onlineFields":{
"CCode":"Web",
"MNumber":"15478655",
"Product":"100",
"JsonFile":" {
\"evaluation\":{
\"number\":[
{
\"#paraID\":\"1000\",
\"#Value\":\"\",
\"#label\":\"We are america\"
},
{
\"#paraID\":\"2000\",
\"#Value\":\"100\",
\"#label\":\"We are japan\"
},
{
\"#paraID\":\"3000\",
\"#Value\":\"1000\",
\"#label\":\"We are UK\"
},
{
\"#paraID\":\"4000\",
\"#Value\":\"\",
\"#label\":\"We are China\"
}
]
}
} "
}
}
After unescaping, I bind the above JSON to my model classes, and it works properly. To bind the JSON to a model I used the following code:
private static void showJSON(string testJson){
Response response = JsonConvert.DeserializeObject<Response>(testJson);
var dropdowns = response.OnlineFields.JsonFile;
string json = JsonConvert.SerializeObject(dropdowns, Newtonsoft.Json.Formatting.Indented);
Console.WriteLine(json);
}
After binding the JSON to the model, some logic sets values in the JSON and returns it unescaped, which means JsonFile also comes back unescaped. I again need the original format above (with the embedded JsonFile escaped) to send to the client API.
This is the unescaped JSON; I need to convert it back to the escaped form above (with the embedded JsonFile escaped):
{
"Message":null,
"Error":false,
"VData":{
"RNumber":null,
"BRNumber":"Session1"
},
"onlineFields":{
"CCode":"Web",
"MNumber":"15478655",
"Product":"100",
"JsonFile":{
"evaluation":{
"number":[
{
"#paraID":"1000",
"#Value":"",
"#label":"We are america"
},
{
"#paraID":"2000",
"#Value":"100",
"#label":"We are japan"
},
{
"#paraID":"3000",
"#Value":"1000",
"#label":"We are UK"
},
{
"#paraID":"4000",
"#Value":"",
"#label":"We are China"
}
]
}
}
}
}
Previously I asked how to deserialize such embedded JSON directly into C# classes, but the answer there did not explain how to re-serialize in the same format. I need to extend that answer to cover writing.
You can extend EmbeddedLiteralConverter<T> from this answer to How do I convert an escaped JSON string within a JSON object? by overriding JsonConverter.WriteJson() and doing a nested serialization, then writing the resulting string literal, like so:
public class EmbeddedLiteralConverter<T> : JsonConverter
{
public override bool CanConvert(Type objectType)
{
return typeof(T).IsAssignableFrom(objectType);
}
public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
{
using (new PushValue<bool>(true, () => Disabled, (canWrite) => Disabled = canWrite))
{
using (var sw = new StringWriter(writer.Culture))
{
// Copy relevant settings
using (var nestedWriter = new JsonTextWriter(sw)
{
DateFormatHandling = writer.DateFormatHandling,
DateFormatString = writer.DateFormatString,
DateTimeZoneHandling = writer.DateTimeZoneHandling,
StringEscapeHandling = writer.StringEscapeHandling,
FloatFormatHandling = writer.FloatFormatHandling,
Culture = writer.Culture,
// Remove if you don't want the escaped \r\n characters in the embedded JSON literal:
Formatting = writer.Formatting,
})
{
serializer.Serialize(nestedWriter, value);
}
writer.WriteValue(sw.ToString());
}
}
}
[ThreadStatic]
static bool disabled;
// Disables the converter in a thread-safe manner.
bool Disabled { get { return disabled; } set { disabled = value; } }
public override bool CanWrite { get { return !Disabled; } }
public override bool CanRead { get { return !Disabled; } }
public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
{
if (reader.TokenType == JsonToken.Null)
return null;
var contract = serializer.ContractResolver.ResolveContract(objectType);
if (contract is JsonPrimitiveContract)
throw new JsonSerializationException("Invalid type: " + objectType);
if (existingValue == null)
existingValue = contract.DefaultCreator();
if (reader.TokenType == JsonToken.String)
{
var json = (string)JToken.Load(reader);
using (var subReader = new JsonTextReader(new StringReader(json)))
{
// By populating a pre-allocated instance we avoid an infinite recursion in EmbeddedLiteralConverter<T>.ReadJson()
// Re-use the existing serializer to preserve settings.
serializer.Populate(subReader, existingValue);
}
}
else
{
serializer.Populate(reader, existingValue);
}
return existingValue;
}
}
struct PushValue<T> : IDisposable
{
Action<T> setValue;
T oldValue;
public PushValue(T value, Func<T> getValue, Action<T> setValue)
{
if (getValue == null || setValue == null)
throw new ArgumentNullException();
this.setValue = setValue;
this.oldValue = getValue();
setValue(value);
}
#region IDisposable Members
// By using a disposable struct we avoid the overhead of allocating and freeing an instance of a finalizable class.
public void Dispose()
{
if (setValue != null)
setValue(oldValue);
}
#endregion
}
Then, add the converter to JsonSerializerSettings.Converters when deserializing and serializing:
var settings = new JsonSerializerSettings
{
Converters = { new EmbeddedLiteralConverter<JsonFile>() },
};
var response = JsonConvert.DeserializeObject<Response>(testJson, settings);
var json2 = JsonConvert.SerializeObject(response, Formatting.Indented, settings);
Or, you could apply the converter directly to your model using JsonConverterAttribute like so:
public class OnlineFields
{
public string CCode { get; set; }
public string MNumber { get; set; }
public string Product { get; set; }
[JsonConverter(typeof(EmbeddedLiteralConverter<JsonFile>))]
public JsonFile JsonFile { get; set; }
}
Notes:
Your input JSON is, strictly speaking, not well formed. The string value for the property JsonFile contains unescaped carriage return characters:
"JsonFile":" {
\"evaluation\":{
\"number\":[
According to the original JSON proposal as well as JSON RFC 7159 Page 8 such control characters must be escaped:
"{\r\n \"evaluation\": {\r\n \"number\": ..."
To confirm this, you can upload your initial JSON to https://jsonformatter.curiousconcept.com/ which reports the following error:
Invalid JSON (RFC 4627): Error:Invalid characters found.[Code 18, Structure 39]
As it turns out, Json.NET will read such invalid JSON without complaint, but will only write well-formed JSON by correctly escaping the carriage returns and line feeds inside the nested JSON literal. Thus your re-serialized JSON will not look identical to the initial JSON. It will, however, be well-formed, and should be consumable by any JSON parser.
To prevent a stack overflow exception when serializing, EmbeddedLiteralConverter<T>.WriteJson() disables itself when called recursively by using the technique from this answer to JSON.Net throws StackOverflowException when using [JsonConvert()].
Working sample .Net fiddle here.
This is the JSON that I am reading from a .json file.
{
"steps": [
{
"stepType": "runFolderUpdate",
"stepData": {
"actionType": "update",
"folderData": {
"folderName": "New Folder 1",
"dirName": "C:/demo/demo.xml",
"machineAddress": "10.23.44.12"
}
}
},
{
"stepType": "runFolderCreate",
"stepData": {
"actionType": "create",
"actionData": {
"folderName": "New Folder 2",
"dirName": "C:/Demo",
"machineAddress": "10.23.211.2"
}
}
},
{ . . . },
{ . . . }
]
}
My requirement is to get an array out of this JSON so that I have all the fields and can access each entry by its "stepType" and, further, by its "actionType" value.
For stepType = "runFolderUpdate"
{
"stepType": "runFolderUpdate",
"stepData": {
"actionType": "update",
"folderData": {
"folderName": "New Folder 1",
"dirName": "C:/demo/demo.xml",
"machineAddress": "10.23.44.12"
}
}
}
For stepType = "runFolderCreate"
{
"stepType": "runFolderCreate",
"stepData": {
"actionType": "create",
"actionData": {
"folderName": "New Folder 2",
"dirName": "C:/Demo",
"machineAddress": "10.23.211.2"
}
}
}
So now that I have two blocks, one for create and one for update, I can go on and access values as required, and I am not restricted by how the JSON keys are arranged.
I tried to do this using JsonReader from the Newtonsoft library, but the problem is that it is a forward-only reader and I cannot go back. Since this is a JSON file, the order of the keys should not matter, but with JsonReader my hands are tied.
For example, if stepType appears below stepData, I cannot use a JsonReader to go back to stepData after I know which stepType I am dealing with.
I am looking for an approach to convert this steps JSON into an array, where each blob acts as a block of information I can access by index (just like an array), so I don't have to worry about the order of keys.
////////UPDATE////////
I am trying to do something like this....
JObject object = Read Json from file...
JArray array = object.get("steps");
Now that I have array, based on stepType I can work on...
is this even possible?
You can convert all this to C# classes fairly easily with a custom converter. Newtonsoft gives some really useful extensibility points. So, let's say you had the following class structure:
public class Root
{
public List<Step> Steps { get; set; }
}
// Here we are telling the serialiser to use the converter
[JsonConverter(typeof(StepConverter))]
public class Step
{
public string StepType { get; set; }
public IStepData StepData { get; set; }
}
public interface IStepData
{
string ActionType { get; set; }
}
public class RunFolderUpdate : IStepData
{
public string ActionType { get; set; }
//etc - you can fill in the rest here
}
public class RunFolderCreate : IStepData
{
public string ActionType { get; set; }
//etc - you can fill in the rest here
}
Now we can implement the converter like this:
public class StepConverter : JsonConverter<Step>
{
public override Step ReadJson(JsonReader reader, Type objectType,
[AllowNull] Step existingValue, bool hasExistingValue, JsonSerializer serializer)
{
var step = JObject.ReadFrom(reader);
var stepType = step["stepType"].Value<string>();
switch(stepType)
{
case "runFolderUpdate":
return new Step
{
StepType = stepType,
StepData = step["stepData"].ToObject<RunFolderUpdate>()
};
case "runFolderCreate":
return new Step
{
StepType = stepType,
StepData = step["stepData"].ToObject<RunFolderCreate>()
};
}
throw new Exception("Errr, unknown step type!");
}
public override void WriteJson(JsonWriter writer, [AllowNull] Step value,
JsonSerializer serializer)
{
throw new NotImplementedException();
}
}
And finally you can deserialise like this:
var result = JsonConvert.DeserializeObject<Root>(json);
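From there you can branch on the concrete step data type. A minimal usage sketch (only ActionType is shown, since the remaining members were elided above):
foreach (var step in result.Steps)
{
    switch (step.StepData)
    {
        case RunFolderUpdate update:
            // Handle the "update" action here.
            Console.WriteLine($"{step.StepType}: {update.ActionType}");
            break;
        case RunFolderCreate create:
            // Handle the "create" action here.
            Console.WriteLine($"{step.StepType}: {create.ActionType}");
            break;
    }
}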
You can use JObject, JToken and JArray for your work.
Your JSON starts with an object, which is then followed by a "[" representing an array, i.e. a JArray, so you can do something like this.
First, read the file and get the "steps" value like this:
JObject obj= null;
using (StreamReader file = File.OpenText(filePath))
using (JsonTextReader reader = new JsonTextReader(file))
{
obj = (JObject)JToken.ReadFrom(reader);
}
JToken token = obj.GetValue("steps");
JArray array = (JArray) token;
Now you have an array which looks like this (since "steps" has already been read):
[
{ .... },
{ .... }
]
Each pair of curly braces is an array element (here a JSON object) that you can reach with a for loop:
for (int i = 0; i < array.Count; i++) {
//Now here get the next token which is an object again so you can
//parse through it and perform your action as needed for create or update
}
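A filled-in sketch of that loop, assuming the "steps" JSON shown earlier, might look like this:
for (int i = 0; i < array.Count; i++) {
    JObject step = (JObject)array[i];
    string stepType = (string)step["stepType"];
    JObject stepData = (JObject)step["stepData"];
    if (stepType == "runFolderUpdate")
    {
        // e.g. stepData["folderData"]["folderName"] -> "New Folder 1"
    }
    else if (stepType == "runFolderCreate")
    {
        // e.g. stepData["actionData"]["folderName"] -> "New Folder 2"
    }
}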
This is the same as in Java.
Don't worry about the order of keys; JObject gives you that freedom, and you DO NOT NEED a model for this. A model is certainly the cleaner way to do it, but it is tightly coupled, so if your JSON changes it is of no use.
Please mark as answer if you feel this is correct.
I have JSON data that I need to parse into a C# object.
This is an example of the JSON:
{
"types":
[
[
"tour_type",
[
["groups",1],
["individual",2]
]
]
]
}
Here are my C# classes that are meant to contain that data:
using System;
using Newtonsoft.Json;
namespace JsonDeserializationTest
{
[JsonProperty("types")]
public class Types
{
[JsonProperty]
public List<Type> Values {get;set;}
}
public class Type
{
[JsonProperty]
public string Key {get;set;}
[JsonProperty]
public List<Dictionary<string, int>> Values { get; set; }
}
}
It's not working now.
How can I fix it?
Use the JsonSerializer (System.Text.Json) object.
Code:
YourClass obj = JsonSerializer.Deserialize<YourClass>(jsonString);
Your JSON has a list of lists of objects... but you are declaring only a list of objects.
public class Types
{
[JsonProperty("types")]
public List<List<object>> Values { get; set; }
// ------ UPDATE: This can only be list of list of 'object' ------- \\
}
Also, you are using the JsonProperty on the class, which is not where that normally goes. You want to use that on the property of the class.
UPDATE:
You cannot use List<List<Type>> for the json you are getting, it can only be List<List<object>>. You have to use object because it can either be a string or a List<List<string>>. After you update your Types class, you can successfully deserialize the json above.
var obj = JsonConvert.DeserializeObject<Types>(json);
Based on your JSON definition, you can access tour_type using the following code:
obj.Values.First()[0].ToString()
// output: tour_type
List<List<string>> data = JsonConvert.DeserializeObject<List<List<string>>>(obj.Values.First()[1].ToString());
// data[0]
[0]: "groups"
[1]: "1"
// data[1]
[0]: "individual"
[1]: "2"
Since both of the items in types are plain objects, you will have to convert them to a string, a list of lists of strings, or whatever type they actually are.
The JSON payload in the provided example is formatted quite strangely, especially since it contains seemingly unnecessary array nesting. A payload like this usually includes more nested objects (rather than a bunch of nested arrays). Additionally, it has a list of (string, int) pairs, which is semantically very similar to a Dictionary<string, int>, but the payload doesn't lend itself to that. It would be helpful to know where it is coming from (what context) to understand how it might change.
The example JSON brings up a few questions (that you may want to ask yourself):
Can the "types" array contain multiple entries (at its immediate nesting)?
Can the "tour_type" key name appear after the array of string, int pairs? Is it possible for an entry where no such name exists?
What other elements can exist in the arrays within "tour_type"?
Is it guaranteed that the most nested array will contain just a single (string, int) pair?
Similarly, it is hard to understand what the example C# class is trying to encapsulate. Is List<Dictionary<string, int>> necessary?
All that said, here's a solution using the built-in System.Text.Json library that could work for you. You could write something similar using Newtonsoft.Json, if necessary. The solution assumes:
We can't change the JSON payload (and that the third-party API response will always return something that is structurally similar to the example)
We can only make minimal changes to the C# class object provided in the example
The solution creates a JsonConverter<T> that uses the low-level Utf8JsonReader to manually parse and build the custom object. This is required since nested "[" tokens are being used to delineate what should be objects rather than "{". The converter is then registered by annotating the class with the attribute. Now, simply call JsonSerializer.Deserialize, passing in the JSON payload.
public class Tours
{
[JsonPropertyName("types")]
public List<UserType> Types { get; set; }
}
// Annotate the type to register the converter to use
[JsonConverter(typeof(CustomUserTypeConverter))]
public class UserType
{
public string Key { get; set; }
public Dictionary<string, int> Values { get; set; }
}
// This will use the low-level reader to build up the UserType
public class CustomUserTypeConverter : JsonConverter<UserType>
{
// Extra structural validation was done for invalid/incomplete JSON
// which might be too strict or incorrect and hence might require adjustments.
public override UserType Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
{
var result = new UserType();
if (!reader.Read())
{
throw new JsonException("Incomplete JSON.");
}
if (reader.TokenType != JsonTokenType.EndArray)
{
result.Key = reader.GetString();
ReadAndValidate(ref reader, JsonTokenType.StartArray);
int depthSnapshot = reader.CurrentDepth;
var values = new Dictionary<string, int>();
do
{
reader.Read();
if (reader.TokenType != JsonTokenType.StartArray && reader.TokenType != JsonTokenType.EndArray)
{
throw new JsonException($"Invalid JSON payload. Expected Start or End Array. TokenType: {reader.TokenType}, Depth: {reader.CurrentDepth}.");
}
if (reader.CurrentDepth <= depthSnapshot)
{
break;
}
reader.Read();
if (reader.TokenType != JsonTokenType.EndArray)
{
string key = reader.GetString();
reader.Read();
int value = reader.GetInt32();
values.Add(key, value);
ReadAndValidate(ref reader, JsonTokenType.EndArray);
}
} while (true);
ReadAndValidate(ref reader, JsonTokenType.EndArray);
result.Values = values;
}
return result;
}
private void ReadAndValidate(ref Utf8JsonReader reader, JsonTokenType expectedTokenType)
{
bool readNext = reader.Read();
if (!readNext || reader.TokenType != expectedTokenType)
{
string message = readNext ?
$"Invalid JSON payload. TokenType: {reader.TokenType}, Depth: {reader.CurrentDepth}, Expected: {expectedTokenType}" :
$"Incomplete JSON. Expected: {expectedTokenType}";
throw new JsonException(message);
}
}
// Implement this method if you need to Serialize (i.e. write) the object
// back to JSON
public override void Write(Utf8JsonWriter writer, UserType value, JsonSerializerOptions options)
{
throw new NotImplementedException();
}
}
Here's how you would use the above converter to deserialize the JSON string provided in the example, along with how to access the values.
public static Tours ParseJson(string json)
{
Tours tours = JsonSerializer.Deserialize<Tours>(json);
return tours;
}
public static void AccessValues(Tours tours)
{
foreach (UserType data in tours.Types)
{
string typeName = data.Key; // "tour_type"
foreach (KeyValuePair<string, int> pairs in data.Values)
{
string key = pairs.Key; // "groups" or "individual"
int value = pairs.Value; // 1 or 2
}
}
}
For what it's worth, Visual Studio suggests the following C# class structure for the example JSON (which is similar to what #Jawad suggested):
public class Rootobject
{
public object[][] types { get; set; }
}
Hope that helps.
I couldn't figure out your JSON so I created an example with verified JSON.
Try this:
JSON:
{
"Items": [
{
"Name": "tour",
"Attributes": [
{
"Name": "groups",
"Value": 1
},
{
"Name": "individual",
"Value": 2
}
]
},
{
"Name": "demo",
"Attributes": [
{
"Name": "this is demo",
"Value": 3
},
{
"Name": "design pattern",
"Value": 99
}
]
}
]
}
Types foo = JsonSerializer.Deserialize<Types>(jsonString);
public class TypeAttribute
{
public string Name { get; set; }
public int Value { get; set; }
}
public class Type
{
private readonly ICollection<TypeAttribute> _attributes;
public Type()
{
_attributes = new Collection<TypeAttribute>();
}
public void AddAttributes(IEnumerable<TypeAttribute> attrs)
{
foreach(TypeAttribute ta in attrs)
{
_attributes.Add(ta);
}
}
public string Name { get; set; }
public IEnumerable<TypeAttribute> Attributes
{
get { return _attributes; }
set
{
foreach(TypeAttribute ta in value)
{
_attributes.Add(ta);
}
}
}
}
public class Types
{
ICollection<Type> _items;
public Types()
{
_items = new Collection<Type>();
}
public void AddItems(IEnumerable<Type> tps)
{
foreach (Type t in tps)
{
_items.Add(t);
}
}
public IEnumerable<Type> Items
{
get { return _items; }
set
{
foreach (Type t in value)
{
_items.Add(t);
}
}
}
}
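A short usage sketch of this model, assuming the JSON above is held in jsonString:
Types foo = JsonSerializer.Deserialize<Types>(jsonString);
foreach (var t in foo.Items)
{
    Console.WriteLine(t.Name);                              // "tour", "demo"
    foreach (var attr in t.Attributes)
        Console.WriteLine($"  {attr.Name} = {attr.Value}"); // e.g. "groups = 1"
}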
Using JSON.NET I am reading JSON objects in an array from a large file.
As the JSON object is read, it is conditionally converted to the destination class, and returned as an item in an IEnumerable.
I use an IEnumerable to allow me to "pull" objects from the file and process them as they are read, avoiding having to read all objects into memory.
I use a similar technique when reading rows from a CSV file, where I use CsvHelper ShouldSkipRecord() to conditionally process the row in the CSV file.
I have not found a way to filter the JSON objects as they are read from the array, so I end up using LINQ Where to filter the objects before they are converted and added to the IEnumerable. The problem is that the Where clause reads all the objects into memory, defeating the purpose of using IEnumerable.
I know I can manually read each object and then process it, but I am looking for a more elegant approach: a form of callback that lets me pull records while the callback filters out the records I do not want.
E.g. how I filter rows in a CSV file:
internal static bool ShouldSkipRecord(string[] fields)
{
// Skip rows with incomplete data
// 2019-01-24 20:46:57 UTC,63165,4.43,6.23,6.80,189,-18,81.00,16.00,6.23
// 2019 - 01 - 24 20:47:40 UTC,63166,4.93,5.73,5.73,0,-20,,,5.73
if (fields.Length < 10)
return true;
// Temperature and humidity is optional, air quality is required
if (string.IsNullOrEmpty(fields[9]))
return true;
return false;
}
E.g. how I filter JSON objects:
internal static PurpleAirData Convert(Feed jsonData)
{
PurpleAirData data = new PurpleAirData()
{
TimeStamp = jsonData.CreatedAt.DateTime,
AirQuality = Double.Parse(jsonData.Field8)
};
// Temperature and humidity is optional
if (double.TryParse(jsonData.Field6, out double val))
data.Temperature = val;
if (double.TryParse(jsonData.Field7, out val))
data.Humidity = val;
return data;
}
internal static IEnumerable<PurpleAirData> Load(JsonTextReader jsonReader)
{
// Deserialize objects in parts
jsonReader.SupportMultipleContent = true;
JsonSerializer serializer = new JsonSerializer();
// Read Channel
// TODO : Add format checking
jsonReader.Read();
jsonReader.Read();
jsonReader.Read();
Channel channel = serializer.Deserialize<Channel>(jsonReader);
// Read the Feeds
jsonReader.Read();
jsonReader.Read();
// TODO : The Where results in a full in-memory iteration defeating the purpose of the streaming iteration
return serializer.Deserialize<List<Feed>>(jsonReader).Where(feed => !string.IsNullOrEmpty(feed.Field8)).Select(Convert);
}
Example JSON:
{
"channel":{
"id":622370,
"name":"AirMonitor_e81a",
"latitude":"0.0",
"longitude":"0.0",
"field1":"PM1.0 (ATM)",
"field2":"PM2.5 (ATM)",
"field3":"PM10.0 (ATM)",
"field4":"Uptime",
"field5":"RSSI",
"field6":"Temperature",
"field7":"Humidity",
"field8":"PM2.5 (CF=1)",
"created_at":"2018-11-09T00:35:34Z",
"updated_at":"2018-11-09T00:35:35Z",
"last_entry_id":65435
},
"feeds":[
{
"created_at":"2019-01-10T23:56:09Z",
"entry_id":56401,
"field1":"1.00",
"field2":"1.80",
"field3":"1.80",
"field4":"369",
"field5":"-30",
"field6":"66.00",
"field7":"59.00",
"field8":"1.80"
},
{
"created_at":"2019-01-10T23:57:29Z",
"entry_id":56402,
"field1":"1.08",
"field2":"2.44",
"field3":"3.33",
"field4":"371",
"field5":"-32",
"field6":"66.00",
"field7":"59.00",
"field8":"2.44"
},
{
"created_at":"2019-01-26T00:14:04Z",
"entry_id":64400,
"field1":"0.27",
"field2":"0.95",
"field3":"1.25",
"field4":"213",
"field5":"-27",
"field6":"72.00",
"field7":"40.00",
"field8":"0.95"
}
]
}
Example JSON:
[
{
"monthlyrainin": 0.01,
"humidityin": 42,
"eventrainin": 0,
"humidity": 29,
"maxdailygust": 20.13,
"dateutc": 1549476900000,
"battout": "1",
"lastRain": "2019-02-05T19:21:00.000Z",
"dailyrainin": 0,
"tempf": 52.2,
"winddir": 286,
"totalrainin": 0.01,
"dewPoint": 20.92,
"baromabsin": 29.95,
"hourlyrainin": 0,
"feelsLike": 52.2,
"yearlyrainin": 0.01,
"uv": 1,
"weeklyrainin": 0.01,
"solarradiation": 157.72,
"windspeedmph": 0,
"tempinf": 73.8,
"windgustmph": 0,
"battin": "1",
"baromrelin": 30.12,
"date": "2019-02-06T18:15:00.000Z"
},
{
"dewPoint": 20.92,
"tempf": 52.2,
"maxdailygust": 20.13,
"humidityin": 42,
"windspeedmph": 4.03,
"eventrainin": 0,
"tempinf": 73.6,
"feelsLike": 52.2,
"dateutc": 1549476600000,
"windgustmph": 4.92,
"hourlyrainin": 0,
"monthlyrainin": 0.01,
"battin": "1",
"humidity": 29,
"totalrainin": 0.01,
"baromrelin": 30.12,
"winddir": 314,
"lastRain": "2019-02-05T19:21:00.000Z",
"yearlyrainin": 0.01,
"baromabsin": 29.94,
"dailyrainin": 0,
"battout": "1",
"uv": 1,
"solarradiation": 151.86,
"weeklyrainin": 0.01,
"date": "2019-02-06T18:10:00.000Z"
}]
Is there a way in JSON.NET to filter objects as they are read?
What you can do is adopt the basic approaches of Issues parsing a 1GB json file using JSON.NET and Deserialize json array stream one item at a time, which is to stream through the array and yield return each item, but in addition apply a where expression to filter out incomplete items, or a select clause to transform some intermediate deserialized object (such as a JObject or a DTO) into your final data model. By applying the where clause during streaming, unwanted objects never get added to the list being deserialized, and thus get cleaned up by the garbage collector as streaming proceeds. Filtering array contents while streaming can be done at the root level when the root JSON container is an array, or as part of a custom JsonConverter for List<T> when the array to be deserialized is nested inside some outer JSON.
As a concrete example, consider your first JSON example. You would like to deserialize it to a data model that looks like:
public class PurpleAirData
{
public PurpleAirData(DateTime createdAt, double airQuality)
{
this.CreatedAt = createdAt;
this.AirQuality = airQuality;
}
// Required properties
public DateTime CreatedAt { get; set; }
public double AirQuality { get; set; }
// Optional properties, thus nullable
public double? Temperature { get; set; }
public double? Humidity { get; set; }
}
public class RootObject
{
public Channel channel { get; set; } // Define this using http://json2csharp.com/
public List<PurpleAirData> feeds { get; set; }
}
To do this, first introduce the following extension methods:
public static partial class JsonExtensions
{
public static IEnumerable<T> DeserializeArrayItems<T>(this JsonSerializer serializer, JsonReader reader)
{
if (reader.MoveToContent().TokenType == JsonToken.Null)
yield break;
if (reader.TokenType != JsonToken.StartArray)
throw new JsonSerializationException(string.Format("Current token {0} is not an array at path {1}", reader.TokenType, reader.Path));
// Process the collection items
while (reader.Read())
{
switch (reader.TokenType)
{
case JsonToken.EndArray:
yield break;
case JsonToken.Comment:
break;
default:
yield return serializer.Deserialize<T>(reader);
break;
}
}
// Should not come here.
throw new JsonReaderException(string.Format("Unclosed array at path {0}", reader.Path));
}
public static JsonReader MoveToContent(this JsonReader reader)
{
if (reader.TokenType == JsonToken.None)
reader.Read();
while (reader.TokenType == JsonToken.Comment && reader.Read())
;
return reader;
}
}
Next, introduce the following JsonConverter for List<PurpleAirData>:
class PurpleAirListConverter : JsonConverter
{
class PurpleAirDataDTO
{
// Required properties
[JsonProperty("created_at")]
public DateTime? CreatedAt { get; set; }
[JsonProperty("Field8")]
public double? AirQuality { get; set; }
// Optional properties
[JsonProperty("Field6")]
public double? Temperature { get; set; }
[JsonProperty("Field7")]
public double? Humidity { get; set; }
}
public override bool CanConvert(Type objectType)
{
return objectType == typeof(List<PurpleAirData>);
}
public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
{
if (reader.MoveToContent().TokenType == JsonToken.Null)
return null;
var list = existingValue as List<PurpleAirData> ?? new List<PurpleAirData>();
var query = from dto in serializer.DeserializeArrayItems<PurpleAirDataDTO>(reader)
where dto != null && dto.CreatedAt != null && dto.AirQuality != null
select new PurpleAirData(dto.CreatedAt.Value, dto.AirQuality.Value) { Humidity = dto.Humidity, Temperature = dto.Temperature };
list.AddRange(query);
return list;
}
public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
{
throw new NotImplementedException();
}
}
The purpose of this converter is to stream through the "feeds" array, deserialize each JSON item to an intermediate PurpleAirDataDTO, check for the presence of required members, then convert the DTO to the final model.
Finally, deserialize the entire file as follows:
static RootObject DeserializePurpleAirDataFile(TextReader textReader)
{
var settings = new JsonSerializerSettings
{
Converters = { new PurpleAirListConverter() },
NullValueHandling = NullValueHandling.Ignore,
};
var serializer = JsonSerializer.CreateDefault(settings);
using (var reader = new JsonTextReader(textReader) { CloseInput = false })
{
return serializer.Deserialize<RootObject>(reader);
}
}
Demo fiddle here.
When the array to be filtered is the root container in the JSON file, the extension method JsonExtensions.DeserializeArrayItems() can be used directly, e.g. as follows:
static bool IsValid(WeatherData data)
{
// Return false if certain fields are missing
// Otherwise return true;
return true;
}
static List<WeatherData> DeserializeFilteredWeatherData(TextReader textReader)
{
var serializer = JsonSerializer.CreateDefault();
using (var reader = new JsonTextReader(textReader) { CloseInput = false })
{
var query = from data in serializer.DeserializeArrayItems<WeatherData>(reader)
where IsValid(data)
select data;
return query.ToList();
}
}
Notes:
Nullable types can be used to track whether value-type members were actually encountered during deserialization.
Here the conversion from DTO to the final data model is done manually, but for more complicated models something like AutoMapper could be used instead.
I have an API that returns a JSON object from a MongoDB in which one of the properties is an "open-ended" document, meaning it can be any valid JSON for that property. I don't know what the names of the properties are ahead of time, they can be any string. I only know that this particular property needs to be serialized exactly how it is stored in the database. So if the property name that was originally stored was "Someproperty", the serialized response in JSON needs to be "Someproperty", NOT "someProperty".
We have this configuration:
ContractResolver = new CamelCasePropertyNamesContractResolver();
in our CustomJsonSerializer, but it is messing with the formatting of the response when returning the "open ended" JSON. It is camel-casing all of these properties when in fact we want the response to be exactly how they are stored in MongoDB (BSON). I know the values are maintaining their case when storing/retrieving via the database, so that is not the issue.
How can I tell JSON.net to essentially bypass the CamelCasePropertyNameResolver for all of the child properties of a particular data point?
EDIT:
Just to give a bit more info, and share what I have already tried:
I thought about overriding the PropertyNameResolver like so:
protected override string ResolvePropertyName(string propertyName)
{
if (propertyName.ToLower().Equals("somedocument"))
{
return propertyName;
}
else return base.ResolvePropertyName(propertyName);
}
However, if I have a JSON structure like this:
{
"Name" : "MyObject",
"DateCreated" : "11/14/2016",
"SomeDocument" :
{
"MyFirstProperty" : "foo",
"mysecondPROPERTY" : "bar",
"another_random_subdoc" :
{
"evenmoredata" : "morestuff"
}
}
}
then I would need all of the child properties' names to remain exactly as they are. The override I posted above would (I believe) only skip renaming on an exact match to "somedocument", and would still camel-case all of the child properties.
What you can do is, for the property in question, create a custom JsonConverter that serializes the property value in question using a different JsonSerializer created with a different contract resolver, like so:
public class AlternateContractResolverConverter : JsonConverter
{
[ThreadStatic]
static Stack<Type> contractResolverTypeStack;
static Stack<Type> ContractResolverTypeStack { get { return contractResolverTypeStack = (contractResolverTypeStack ?? new Stack<Type>()); } }
readonly IContractResolver resolver;
JsonSerializerSettings ExtractAndOverrideSettings(JsonSerializer serializer)
{
var settings = serializer.ExtractSettings();
settings.ContractResolver = resolver;
settings.CheckAdditionalContent = false;
if (settings.PreserveReferencesHandling != PreserveReferencesHandling.None)
{
// Log an error throw an exception?
Debug.WriteLine(string.Format("PreserveReferencesHandling.{0} not supported", serializer.PreserveReferencesHandling));
}
return settings;
}
public AlternateContractResolverConverter(Type resolverType)
{
if (resolverType == null)
throw new ArgumentNullException("resolverType");
resolver = (IContractResolver)Activator.CreateInstance(resolverType);
if (resolver == null)
throw new ArgumentNullException(string.Format("Resolver type {0} not found", resolverType));
}
public override bool CanRead { get { return ContractResolverTypeStack.Count == 0 || ContractResolverTypeStack.Peek() != resolver.GetType(); } }
public override bool CanWrite { get { return ContractResolverTypeStack.Count == 0 || ContractResolverTypeStack.Peek() != resolver.GetType(); } }
public override bool CanConvert(Type objectType)
{
throw new NotImplementedException("This contract resolver is intended to be applied directly with [JsonConverter(typeof(AlternateContractResolverConverter), typeof(SomeContractResolver))] or [JsonProperty(ItemConverterType = typeof(AlternateContractResolverConverter), ItemConverterParameters = ...)]");
}
public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
{
using (ContractResolverTypeStack.PushUsing(resolver.GetType()))
return JsonSerializer.CreateDefault(ExtractAndOverrideSettings(serializer)).Deserialize(reader, objectType);
}
public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
{
using (ContractResolverTypeStack.PushUsing(resolver.GetType()))
JsonSerializer.CreateDefault(ExtractAndOverrideSettings(serializer)).Serialize(writer, value);
}
}
internal static class JsonSerializerExtensions
{
public static JsonSerializerSettings ExtractSettings(this JsonSerializer serializer)
{
// There is no built-in API to extract the settings from a JsonSerializer back into JsonSerializerSettings,
// so we have to fake it here.
if (serializer == null)
throw new ArgumentNullException("serializer");
var settings = new JsonSerializerSettings
{
CheckAdditionalContent = serializer.CheckAdditionalContent,
ConstructorHandling = serializer.ConstructorHandling,
ContractResolver = serializer.ContractResolver,
Converters = serializer.Converters,
Context = serializer.Context,
Culture = serializer.Culture,
DateFormatHandling = serializer.DateFormatHandling,
DateFormatString = serializer.DateFormatString,
DateParseHandling = serializer.DateParseHandling,
DateTimeZoneHandling = serializer.DateTimeZoneHandling,
DefaultValueHandling = serializer.DefaultValueHandling,
EqualityComparer = serializer.EqualityComparer,
// No Get access to the error event, so it cannot be copied.
// Error = += serializer.Error
FloatFormatHandling = serializer.FloatFormatHandling,
FloatParseHandling = serializer.FloatParseHandling,
Formatting = serializer.Formatting,
MaxDepth = serializer.MaxDepth,
MetadataPropertyHandling = serializer.MetadataPropertyHandling,
MissingMemberHandling = serializer.MissingMemberHandling,
NullValueHandling = serializer.NullValueHandling,
ObjectCreationHandling = serializer.ObjectCreationHandling,
ReferenceLoopHandling = serializer.ReferenceLoopHandling,
// Copying the reference resolver doesn't work in the default case, since the
// actual BidirectionalDictionary<string, object> mappings are held in the
// JsonSerializerInternalBase.
// See https://github.com/JamesNK/Newtonsoft.Json/blob/master/Src/Newtonsoft.Json/Serialization/DefaultReferenceResolver.cs
ReferenceResolverProvider = () => serializer.ReferenceResolver,
PreserveReferencesHandling = serializer.PreserveReferencesHandling,
StringEscapeHandling = serializer.StringEscapeHandling,
TraceWriter = serializer.TraceWriter,
TypeNameHandling = serializer.TypeNameHandling,
// Changes in Json.NET 10.0.1
//TypeNameAssemblyFormat was obsoleted and replaced with TypeNameAssemblyFormatHandling in Json.NET 10.0.1
//TypeNameAssemblyFormat = serializer.TypeNameAssemblyFormat,
TypeNameAssemblyFormatHandling = serializer.TypeNameAssemblyFormatHandling,
//Binder was obsoleted and replaced with SerializationBinder in Json.NET 10.0.1
//Binder = serializer.Binder,
SerializationBinder = serializer.SerializationBinder,
};
return settings;
}
}
public static class StackExtensions
{
public struct PushValue<T> : IDisposable
{
readonly Stack<T> stack;
public PushValue(T value, Stack<T> stack)
{
this.stack = stack;
stack.Push(value);
}
// By using a disposable struct we avoid the overhead of allocating and freeing an instance of a finalizable class.
public void Dispose()
{
if (stack != null)
stack.Pop();
}
}
public static PushValue<T> PushUsing<T>(this Stack<T> stack, T value)
{
if (stack == null)
throw new ArgumentNullException();
return new PushValue<T>(value, stack);
}
}
Then use it like so:
public class RootObject
{
public string Name { get; set; }
public DateTime DateCreated { get; set; }
[JsonProperty(NamingStrategyType = typeof(DefaultNamingStrategy))]
[JsonConverter(typeof(AlternateContractResolverConverter), typeof(DefaultContractResolver))]
public SomeDocument SomeDocument { get; set; }
}
public class SomeDocument
{
public string MyFirstProperty { get; set; }
public string mysecondPROPERTY { get; set; }
public AnotherRandomSubdoc another_random_subdoc { get; set; }
}
public class AnotherRandomSubdoc
{
public string evenmoredata { get; set; }
public DateTime DateCreated { get; set; }
}
(Here I am assuming you want the "SomeDocument" property name to be serialized verbatim, even though it wasn't entirely clear from your question. To do that, I'm using JsonPropertyAttribute.NamingStrategyType from Json.NET 9.0.1. If you're using an earlier version, you'll need to set the property name explicitly.)
Then the resulting JSON will be:
{
"name": "Question 40597532",
"dateCreated": "2016-11-14T05:00:00Z",
"SomeDocument": {
"MyFirstProperty": "my first property",
"mysecondPROPERTY": "my second property",
"another_random_subdoc": {
"evenmoredata": "even more data",
"DateCreated": "2016-11-14T05:00:00Z"
}
}
}
Note that this solution does NOT work well with preserving object references. If you need them to work together, you may need to consider a stack-based approach similar to the one from Json.NET serialize by depth and attribute
Demo fiddle here.
Incidentally, have you considered storing this JSON as a raw string literal, as in the answer to this question?
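If you did store it that way, a rough sketch of such a converter (hypothetical, not the linked answer verbatim) could hold the document as a string and write it back out untouched, so no contract resolver ever sees its property names:
public class RawJsonConverter : JsonConverter
{
    public override bool CanConvert(Type objectType) { return objectType == typeof(string); }
    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        // Emit the stored JSON text verbatim, bypassing property-name resolution.
        writer.WriteRawValue((string)value);
    }
    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        // Capture the incoming subtree as a raw JSON string.
        return JToken.ReadFrom(reader).ToString(Formatting.None);
    }
}
It would be applied as [JsonConverter(typeof(RawJsonConverter))] on a string property holding the raw document.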
I think you should look at this backwards.
Instead of trying to NOT touch the properties you don't know, let that be the default behavior and touch the ones you DO know.
In other words, don't use the CamelCasePropertyNamesContractResolver. Deal with the properties you know appropriately and let the other ones pass through transparently.
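For example, a sketch of that approach, assuming the open-ended part is held in a JObject so its keys pass through untouched, and renaming only the members you control:
public class RootObject
{
    [JsonProperty("name")]
    public string Name { get; set; }
    [JsonProperty("dateCreated")]
    public DateTime DateCreated { get; set; }
    // No attribute and no camel-case resolver: the keys inside this
    // free-form document are written exactly as they came from the database.
    public JObject SomeDocument { get; set; }
}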
I would like to have a serialization format that is nearly identical to JSON, except that key-values are represented as <key>="<value>" instead of "<key>":"<value>".
With Newtonsoft I made a custom JsonConverter called TsonConverter that works fairly well, except that it can't "see" an embedded dictionary. Given the following type:
public class TraceyData
{
[Safe]
public string Application { get; set; }
[Safe]
public string SessionID { get; set; }
[Safe]
public string TraceID { get; set; }
[Safe]
public string Workflow { get; set; }
[Safe]
public Dictionary<string, string> Tags {get; set; }
[Safe]
public string[] Stuff {get; set;}
}
And the following code:
TsonConverter weird = new TsonConverter();
JsonSerializerSettings settings = new JsonSerializerSettings();
settings.NullValueHandling = NullValueHandling.Ignore;
settings.Converters.Add(weird);
var tracey = new TraceyData();
tracey.TraceID = Guid.NewGuid().ToString();
tracey.SessionID = "5";
tracey.Tags["Referrer"] = "http://www.sky.net/deals";
tracey.Stuff = new string[] { "Alpha", "Bravo", "Charlie" };
tracey.Application = "Responsive";
string stuff = JsonConvert.SerializeObject(tracey, settings);
I get this:
[Application="Responsive" SessionID="5" TraceID="082ef853-92f8-4ce8-9f32-8e4f792fb022" Tags={"Referrer":"http://www.sky.net/deals"} Stuff=["Alpha","Bravo","Charlie"]]
Obviously I have also overridden the StartObject/EndObject notation, replacing { } with [ ]. Otherwise the results are not bad.
However, there is still the problem of the internal dictionary. In order
to convert the dictionary as well to use my <key>="<value>" format, it looks like I must make a deep dictionary converter.
I'm wondering if there is an easier way to do this.
Perhaps the Newtonsoft tool has a "property generator" and "key-value" generator property that I can set that globally handles this for me?
Any suggestions?
And while we're here, I wonder if there is a StartObject/EndObject formatter property override I can set, which would handle the other customization I've shown above. It would be nice to "skip" making JsonConverter tools for these kinds of simple alterations.
Incidentally:
My custom JsonConverter is choosing properties to serialize based on the [Safe] attribute shown in my sample. This is another nice-to-have. It would be wonderful if the JSON settings could expose an "attribute handler" property that lets me override the usual JSON attributes in favor of my own.
I have no need to de-serialize this format. It is intended as a one-way operation. If someone also wishes to explain how to de-serialize my custom format, that is an interesting bonus, but definitely not necessary to answer this question.
Appendix
Below is the TraceConverter I had made. It references a FieldMetaData class that simply holds property info.
public class TsonConverter : JsonConverter
{
public override bool CanRead
{
get
{
return false;
}
}
public override bool CanConvert(Type ObjectType)
{
return DataClassifier.TestForUserType(ObjectType);
}
public override void WriteJson(
JsonWriter writer, object value, JsonSerializer serializer)
{
Type objType = value.GetType();
var props = objType.GetProperties(BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic);
var propMap = from p in props
from a in p.GetCustomAttributes(typeof(ProfileAttribute), false)
select new FieldMetaData(p, (ProfileAttribute)a);
//writer.WriteStartObject();
writer.WriteStartArray();
bool loopStarted = true;
foreach(var prop in propMap){
object rawValue = prop.GetValue(value);
if (rawValue != null || serializer.NullValueHandling == NullValueHandling.Include)
{
string jsonValue = JsonConvert.SerializeObject(prop.GetValue(value), this);
if (loopStarted)
{
loopStarted = false;
writer.WriteRaw(String.Format("{0}={1}", prop.Name, jsonValue));
}
else
{
writer.WriteRaw(String.Format(" {0}={1}", prop.Name, jsonValue));
}
}
//writer.WriteRaw(String.Format("{0}={1}", prop.Name, prop.GetValue(value)));
//writer.WritePropertyName(prop.Name, false);
//writer.WriteValue(prop.GetValue(value));
}
writer.WriteEndArray();
}
public override object ReadJson(
JsonReader reader, Type objectType,
object existingValue, JsonSerializer serializer)
{
throw new NotImplementedException();
}
}
Rather than creating your own converter, you're going to need to create your own subclass of JsonWriter that writes to your custom file format. (This is how Json.NET implements its BsonWriter.) In your case, your file format is close enough to JSON that you can inherit from JsonTextWriter:
public class TsonTextWriter : JsonTextWriter
{
TextWriter _writer;
public TsonTextWriter(TextWriter textWriter)
: base(textWriter)
{
if (textWriter == null)
throw new ArgumentNullException("textWriter");
QuoteName = false;
_writer = textWriter;
}
public override void WriteStartObject()
{
SetWriteState(JsonToken.StartObject, null);
_writer.Write('[');
}
protected override void WriteEnd(JsonToken token)
{
switch (token)
{
case JsonToken.EndObject:
_writer.Write(']');
break;
default:
base.WriteEnd(token);
break;
}
}
public override void WritePropertyName(string name)
{
WritePropertyName(name, true);
}
public override void WritePropertyName(string name, bool escape)
{
SetWriteState(JsonToken.PropertyName, name);
var escaped = name;
if (escape)
{
escaped = JsonConvert.ToString(name, '"', StringEscapeHandling);
escaped = escaped.Substring(1, escaped.Length - 2);
}
// Maybe also escape the space character if it appears in a name?
_writer.Write(escaped.Replace("=", @"\u003d")); // Replace "=" with a Unicode escape sequence.
_writer.Write('=');
}
/// <summary>
/// Writes the JSON value delimiter. (Remove this override if you want to retain the comma separator.)
/// </summary>
protected override void WriteValueDelimiter()
{
_writer.Write(' ');
}
/// <summary>
/// Writes an indent space.
/// </summary>
protected override void WriteIndentSpace()
{
// Do nothing.
}
}
Having done this, now all classes will be serialized to your custom format when you use this writer, for instance:
var tracey = new TraceyData();
tracey.TraceID = Guid.NewGuid().ToString();
tracey.SessionID = "5";
tracey.Tags["Referrer"] = "http://www.sky.net/deals";
tracey.Stuff = new string[] { "Alpha", "Bravo", "Charlie" };
tracey.Application = "Responsive";
JsonSerializerSettings settings = new JsonSerializerSettings();
settings.NullValueHandling = NullValueHandling.Ignore;
using (var sw = new StringWriter())
{
using (var jsonWriter = new TsonTextWriter(sw))
{
JsonSerializer.CreateDefault(settings).Serialize(jsonWriter, tracey);
}
Debug.WriteLine(sw.ToString());
}
Produces the output
[Application="Responsive" SessionID="5" TraceID="2437fe67-9788-47ba-91ce-2e5b670c2a34" Tags=[Referrer="http://www.sky.net/deals"] Stuff=["Alpha" "Bravo" "Charlie"]]
As far as deciding whether to serialize properties based on the presence of a [Safe] attribute, that's sort of a second question. You will need to create your own ContractResolver and override CreateProperty, for instance as shown here: Using JSON.net, how do I prevent serializing properties of a derived class, when used in a base class context?
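A minimal sketch of such a resolver for the [Safe] attribute from the question (it needs System.Reflection and Newtonsoft.Json.Serialization, and is a starting point rather than a drop-in):
public class SafePropertiesContractResolver : DefaultContractResolver
{
    protected override JsonProperty CreateProperty(MemberInfo member, MemberSerialization memberSerialization)
    {
        JsonProperty property = base.CreateProperty(member, memberSerialization);
        // Only emit members decorated with [Safe]; everything else is skipped.
        property.ShouldSerialize = instance => member.GetCustomAttributes(typeof(SafeAttribute), true).Length > 0;
        return property;
    }
}
You would then assign it to JsonSerializerSettings.ContractResolver instead of filtering properties inside the converter.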
Update
If you want to retain the comma separator for arrays but not objects, modify WriteValueDelimiter as follows:
/// <summary>
/// Writes the JSON value delimiter. (Remove this override if you want to retain the comma separator.)
/// </summary>
protected override void WriteValueDelimiter()
{
if (WriteState == WriteState.Array)
_writer.Write(',');
else
_writer.Write(' ');
}