I am using Newtonsoft.Json 11.0.2 in .NET Core 2.0.
If I use JObject, I am able to SelectToken like so:
JObject.Parse("{\"context\":{\"id\":42}}").SelectToken("context.id")
Returns
42
However, if I use JRaw, I get null for the same path:
new JRaw("{\"context\":{\"id\":42}}").SelectToken("context.id")
returns
null
Due to how my code is set up, my model is already in JRaw, and converting it to JObject just to select this token seems like a waste of RAM (this call is on the hot path).
UPDATE
OK, my actual data comes down in a model where only one of the properties is JRaw, so I need something like the below to work:
JsonConvert.DeserializeObject<Dictionary<string, JRaw>>(
"{\"a\":{\"context\":{\"id\":42}}}")["a"].SelectToken("context.id")
The above returns null again.
Title might be a bit misleading, but basically what the OP needs is a way to parse an existing (and large) JRaw object without consuming too much memory.
I ran some tests and I was able to find a solution using a JsonTextReader.
I don't know the exact structure of the OP's json strings, so I'll assume something like this:
[
{
"context": {
"id": 10
}
},
{
"context": {
"id": 20
}
},
{
"context": {
"id": 30
}
}
]
Result would be an integer array with the id values (10, 20, 30).
Parsing method
So this is the method that takes a JRaw object as a parameter and extracts the Ids, using a JsonTextReader.
private static IEnumerable<int> GetIds(JRaw raw)
{
using (var stringReader = new StringReader(raw.Value.ToString()))
using (var textReader = new JsonTextReader(stringReader))
{
while (textReader.Read())
{
if (textReader.TokenType == JsonToken.PropertyName && textReader.Value.Equals("id"))
{
int? id = textReader.ReadAsInt32();
if (id.HasValue)
{
yield return id.Value;
}
}
}
}
}
In the above example I'm assuming there is one and only one type of object with an id property.
There are other ways to extract the information we need - e.g. we can check the token type and the path as follows:
if (textReader.TokenType == JsonToken.Integer && textReader.Path.EndsWith("context.id"))
{
int id = Convert.ToInt32(textReader.Value);
yield return id;
}
Testing the code
I created the following C# classes that match the above json structure, for testing purposes:
public class Data
{
[JsonProperty("context")]
public Context Context { get; set; }
public Data(int id)
{
Context = new Context
{
Id = id
};
}
}
public class Context
{
[JsonProperty("id")]
public int Id { get; set; }
}
Creating a JRaw object and extracting the Ids:
class Program
{
static void Main(string[] args)
{
JRaw rawJson = CreateRawJson();
List<int> ids = GetIds(rawJson).ToList();
Console.Read();
}
// Instantiates 1 million Data objects and then creates a JRaw object
private static JRaw CreateRawJson()
{
var data = new List<Data>();
for (int i = 1; i <= 1_000_000; i++)
{
data.Add(new Data(i));
}
string json = JsonConvert.SerializeObject(data);
return new JRaw(json);
}
}
Memory Usage
Using Visual Studio's Diagnostic tools I took the following snapshots, to check the memory usage:
Snapshot #1 was taken at the beginning of the console application (low memory as expected)
Snapshot #2 was taken after creating the JRaw object
JRaw rawJson = CreateRawJson();
Snapshot #3 was taken after extracting the ids
List<int> ids = GetIds(rawJson).ToList();
I am rather new to strongly typed languages and I am working on an Umbraco controller that outputs some JSON with a list of dates.
"meetingTimes": ["10:30", "11:30"]
That works pretty well. Now I want to output an extra field along with the time, containing a unique key.
So it should be like
meetingTimes: [{ time: "10:30", key: "abcd-1234-efgh-5678" }, { time: "11:30", key: "defg-1234-sktg-5678" }]
But I can't figure out how to do it.
The part of my current code that handles this is:
try {
IPublishedContent content = Umbraco.Content(Guid.Parse("ff3e93f6-b34f-4664-a08b-d2eae2a0adbd"));
var meetingDatesAvailabled = content.Value<IEnumerable<IPublishedElement>>("meetingDatesAvailable");
var items = new List<object>();
foreach(var meetingDate in meetingDatesAvailabled)
{
if (meetingDate.Value("meetingItemDay").ToString().Substring(0, 8) == theDate) {
var times = meetingDate.Value<IEnumerable<IPublishedElement>>("meetingItemDayTimes");
foreach (var time in times)
{
items.Add(time.Value("meetingdateTimeItem").ToString());
}
}
}
return new { dateChosen = theDate, meetingTimes = items };
}
Initially, we have to create a class that abstract the meeting time:
public class MeetingTime
{
public string Time { get; }
public Guid Key { get; }
public MeetingTime(string time, Guid key)
{
Time = time;
Key = key;
}
}
Then, instead of creating an empty list of object (var items = new List<object>();), we are going to create an empty list of MeetingTime:
var items = new List<MeetingTime>();
Then, inside your foreach:
foreach (var time in times)
{
items.Add(new MeetingTime(time.Value("meetingdateTimeItem").ToString(),
Guid.NewGuid())
);
}
Note: I am not aware of the way you serialize your objects (e.g. with https://www.newtonsoft.com), but quite probably you will have to decorate both properties, Time and Key, with an attribute so that the proper name is used during serialization. If you don't, the default names Time and Key will be used, whereas the JSON you shared uses time and key.
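For example, if Json.NET happens to be the serializer in play, the class could be annotated like this (just a sketch under that assumption; with a different serializer the attribute would be different):
using Newtonsoft.Json;
public class MeetingTime
{
    // Map the C# property names to the lowercase names used in the JSON output.
    [JsonProperty("time")]
    public string Time { get; }
    [JsonProperty("key")]
    public Guid Key { get; }
    public MeetingTime(string time, Guid key)
    {
        Time = time;
        Key = key;
    }
}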
I have over 128 documents in my Raven database of type Foo:
class Foo {
public string Id {get; set;}
public string Name {get; set;}
}
For two documents, the Name property has value "MyName".
With an IDocumentSession session, if I perform session.Query<Foo>().Where(f => f.Name.Equals("MyName")), I get zero results. This appears to be because the two documents that match "MyName" are not returned in the 128 documents returned from the RavenDB server (which is the default client-side page size). So, the client API filters by Name=="MyName" on the 128 documents returned, but since my two matching documents were not among those first 128, no matching documents are found. I verified this hypothesis by 1. looking at RavenDB Studio in my browser and verifying that these two documents exist, and 2. implementing an unbounded, streaming query and successfully retrieving these two documents:
var results = new List<Foo>();
var query = session.Query<Foo>().Where(f => f.Name.Equals("MyName"));
using (var enumerator = session.Advanced.Stream(query)){
while (enumerator.MoveNext()){
results.Add(enumerator.Current.Document);
}
}
However, the streaming solution is not ideal for me. My question is the following: is there a way to ask RavenDB to perform the filter on Name on the server, before returning 128 documents to the client? I want to search through all documents in my database for my given Where filter, but once the filter is applied, I am perfectly content to have the server return <= 128 documents to the client API.
Your assumption is not correct. The default page size applies to the result of the query, not to the document collection you are querying (if it worked the way you assume, it would cause ugly problems left and right, as you would have no control over what comes first and what comes last in the collection).
Are you actually executing the query (i.e. calling query.ToList() or something similar)? - If you do, please provide further code showing your query and assigning the result.
EDIT
So this here works as expected on my machine:
[TestFixture]
public class UnitTest3
{
public class Foo
{
public string Id { get; set; }
public string Name { get; set; }
}
private readonly IDocumentStore _documentStore;
public UnitTest3()
{
_documentStore = new EmbeddableDocumentStore
{
Configuration =
{
RunInUnreliableYetFastModeThatIsNotSuitableForProduction = true,
RunInMemory = true,
}
}.Initialize();
}
public void InsertDummies()
{
using (IDocumentSession session = _documentStore.OpenSession())
{
for (int i = 0; i < 1000; i++)
{
Foo foo = new Foo { Name = "Foo" + i };
session.Store(foo);
}
Foo fooA = new Foo { Name = "MyName"};
session.Store(fooA);
Foo fooB = new Foo { Name = "MyName" };
session.Store(fooB);
session.SaveChanges();
}
}
[Test]
public void Query()
{
List<Foo> result;
InsertDummies();
using (IDocumentSession session = _documentStore.OpenSession())
{
result = session.Query<Foo>().Where(f => f.Name.Equals("MyName")).ToList();
}
Assert.AreEqual(2, result.Count);
}
}
Did you check whether the index might be stale? - https://ravendb.net/docs/article-page/3.0/csharp/indexes/stale-indexes
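If the index does turn out to be stale, a common workaround in tests is to ask the query to wait for non-stale results before materializing the list. A sketch of how the Query() test above could be customized (written against the RavenDB 3.x client; in production code you would normally design around staleness instead):
using (IDocumentSession session = _documentStore.OpenSession())
{
    // Block until the index has caught up, then run the filtered query.
    result = session.Query<Foo>()
        .Customize(x => x.WaitForNonStaleResults())
        .Where(f => f.Name.Equals("MyName"))
        .ToList();
}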
I'm looking for a way to do deserialization from Json to be version dependent using the data within the Json itself.
I'm targeting to use ServiceStack.Text.JsonDeserializer, but can switch to another library.
For example, I'd like to define a data in JSON for v1.0 to be:
{
"version": "1.0",
"condition": "A < B"
}
and then, a next version of the data (say 2.0) to be:
{
"version": "2.0",
"condition": ["A < B", "B = C", "B < 1"]
}
At the end, I want to be able to validate version of the data to know how to deserialize the JSON correctly.
UPDATE:
It looks like there is no implicit support for version-dependent JSON (de)serialization in the known products.
The right solution seems to be to split the task: (de)serialize only the version part first, and then use implicit (de)serialization for the correct type(s).
Thanks to everyone who shared knowledge and thoughts on the problem.
What you can do is the following:
Create a base class for the data objects you want to deserialize that contains a version field and nothing else.
Make the data classes for your different versions be derived classes of this base class.
When you deserialize your data object, first, deserialize it as an instance of your base class - so now you have a POCO object that contains the version number. You can use this to decide which of your derived data classes you should use to deserialize your data (in the simplest case, you can do a switch/case and handle each version individually)
An example (using System.Web.Script.Serialization.JavaScriptSerializer):
class BaseClass
{
public int version { get; set; }
}
class FirstVersion: BaseClass
{
public string condition { get; set; }
}
class SecondVersion: BaseClass
{
public IEnumerable<string> condition { get; set; }
}
public void Deserialize (string jsonString)
{
JavaScriptSerializer serializer = new JavaScriptSerializer();
BaseClass myData = serializer.Deserialize<BaseClass>(jsonString);
switch (myData.version)
{
case 1:
FirstVersion firstVersion = serializer.Deserialize<FirstVersion>(jsonString);
// ...
break;
case 2:
SecondVersion secondVersion = serializer.Deserialize<SecondVersion>(jsonString);
// ...
break;
}
}
As you can see, this code deserializes the data twice - that may be a problem for you if you are working with large data structures. If you want to avoid that at all costs, you either have to give up static typing or modify the data model of your application.
And here is how it looks with dynamic:
public void Deserialize (string jsonString)
{
JavaScriptSerializer serializer = new JavaScriptSerializer();
dynamic myData = serializer.Deserialize<object>(jsonString);
if (myData ["version"] == 1) {
...
}
}
There is also the option to write your own custom JavaScriptConverter. That is a lot more work, but I'm pretty sure you can achieve what you want and it will look nicer.
One more piece of advice to consider: never remove properties from your JSON structure. If you need to modify a property, keep the old one and add a new one instead - this way, old code can always read data written by newer code. Of course, this can get out of hand pretty quickly if you modify your data structures a lot...
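To illustrate that additive approach (a made-up example, not code from the question): a version 2 payload can keep the old scalar property next to the new list, so version 1 readers keep working:
class ConditionData : BaseClass
{
    // Kept for old readers that only understand a single condition.
    public string condition { get; set; }
    // Added in version 2; newer readers prefer this when it is present.
    public IEnumerable<string> conditions { get; set; }
}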
In Java, you could use Google's GSON library, as it has a built-in support for versioning. I haven't looked into it, but it is open source and if it's really important to you, I guess you can port the implementation to a different language.
I suggest that you use Json.NET, as it allows you to add custom type converters which can be used for versioning.
The problem is not serialization, as it will always use the current schema. The problem is when the client uses a different type version than the server that receives the object.
What you need to do is to check the version programmatically in your type converter and then convert the value yourself (in this case, convert the string to an array).
Documentation: http://www.newtonsoft.com/json/help/html/CustomJsonConverter.htm
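A rough sketch of such a converter (my own illustration rather than code from the linked documentation; it normalizes condition to a list of strings whether the incoming JSON holds a single string or an array, which covers both versions):
using System;
using System.Collections.Generic;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
public class ConditionConverter : JsonConverter
{
    public override bool CanConvert(Type objectType)
    {
        return objectType == typeof(List<string>);
    }
    public override object ReadJson(JsonReader reader, Type objectType,
        object existingValue, JsonSerializer serializer)
    {
        JToken token = JToken.Load(reader);
        // Version 1.0 stored a single string; later versions store an array.
        return token.Type == JTokenType.Array
            ? token.ToObject<List<string>>()
            : new List<string> { token.ToString() };
    }
    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        // Always write the newest shape (an array).
        serializer.Serialize(writer, value);
    }
}
The converter can then be applied to the condition property with [JsonConverter(typeof(ConditionConverter))].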
You might want to use the NewtonSoft.Json NuGET package.
This is kind of a standard within the .NET community. It is also often referred to as Json.NET
You can use it like this (example from official website):
Product product = new Product();
product.Name = "Apple";
product.ExpiryDate = new DateTime(2008, 12, 28);
product.Price = 3.99M;
product.Sizes = new string[] { "Small", "Medium", "Large" };
string output = JsonConvert.SerializeObject(product);
//{
// "Name": "Apple",
// "ExpiryDate": "2008-12-28T00:00:00",
// "Price": 3.99,
// "Sizes": [
// "Small",
// "Medium",
// "Large"
// ]
//}
Product deserializedProduct = JsonConvert.DeserializeObject<Product>(output);
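For completeness, the Product class assumed by that snippet would look something like this (shape inferred from the serialized output above):
public class Product
{
    public string Name { get; set; }
    public DateTime ExpiryDate { get; set; }
    public decimal Price { get; set; }
    public string[] Sizes { get; set; }
}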
If you are willing to switch to JSON.net, then there is a simpler way of doing it. You don't have to use a BaseClass containing version and you don't have to parse twice. The trick is to use JObject and then query JSON for the version:
JObject obj = JObject.Parse(json);
string version = obj.SelectToken("$.Version")?.ToString();
Then you can proceed as Sándor did with the bonus part that you can use JObject to get your dto instead of re-reading json:
ConditionsDto v1Dto = obj.ToObject<ConditionsDto>(readSerializer);
Putting it all together:
public static ConditionsBusinessObject Parse(string json)
{
JObject obj = JObject.Parse(json);
string version = obj.SelectToken("$.Version")?.ToString();
JsonSerializer readSerializer = JsonSerializer.CreateDefault(/*You might want to place your settings here*/);
switch (version)
{
case null: //let's assume that there are some old files out there with no version at all
//and that these are equivalent to the version 1
case "1":
ConditionsDto v1Dto = obj.ToObject<ConditionsDto>(readSerializer);
if (v1Dto == null) return null; //or throw
List<string> convertedConditions = new List<string> {v1Dto.Condition}; //See what I've done here?
return new ConditionsBusinessObject(convertedConditions);
case "2":
ConditionsDtoV2 v2Dto = obj.ToObject<ConditionsDtoV2>(readSerializer);
return v2Dto == null ? null //or throw
: new ConditionsBusinessObject(v2Dto.Condition);
default:
throw new Exception($"Unsupported version {version}");
}
}
For reference here are the classes that I have:
public class ConditionsDto
{
public string Version { get; set; }
public string Condition { get; set; }
}
public class ConditionsDtoV2
{
public string Version { get; set; }
public List<string> Condition { get; set; }
}
public class ConditionsBusinessObject
{
public ConditionsBusinessObject(List<string> conditions)
{
Conditions = conditions;
}
public List<string> Conditions { get; }
}
and a couple of tests to wrap it up:
[Test]
public void TestV1()
{
string v1 = @"{
Version: ""1"",
Condition: ""A < B""
}";
//JsonHandler is where I placed Parse()
ConditionsBusinessObject fromV1 = JsonHandler.Parse(v1);
Assert.AreEqual(1, fromV1.Conditions.Count);
Assert.AreEqual("A < B", fromV1.Conditions[0]);
}
[Test]
public void TestV2()
{
string v2 = @"{
Version: ""2"",
Condition: [""A < B"", ""B = C"", ""B < 1""]
}";
ConditionsBusinessObject fromV2 = JsonHandler.Parse(v2);
Assert.AreEqual(3, fromV2.Conditions.Count);
Assert.AreEqual("A < B", fromV2.Conditions[0]);
Assert.AreEqual("B = C", fromV2.Conditions[1]);
Assert.AreEqual("B < 1", fromV2.Conditions[2]);
}
In a normal real-world application, the //See what I've done here? part is where you will have to do all your conversion chores. I didn't do anything smart there; I just wrapped the single condition in a list to make it compatible with the current business object. As you can guess, though, this can explode as the application evolves. This answer on the Software Engineering SE site has more details on the theory behind versioned JSON data, so you might want to have a look to know what to expect.
One final word about the performance impact of reading into a JObject and then converting to the DTO: I haven't done any measurements, but I expect it to be better than parsing twice. If I find out that this is not true, I will update the answer accordingly.
Take a look at
System.Web.Script.Serialization.JavaScriptSerializer
Sample
var ser = new JavaScriptSerializer();
var result = (IReadOnlyDictionary<string, object>)ser.DeserializeObject(json);
if((string)result["version"] == "1.0")
{
// You expect a string for result["condition"]
}
else
{
// You expect an IEnumerable<string> for result["condition"]
}
This question already has answers here:
How to handle both a single item and an array for the same property using JSON.net
(9 answers)
Closed 4 years ago.
We're dealing with a JSON API result. Just to make our lives difficult, the provider of the API returns one of the items as an array if there are multiple objects, or as a single object if there is only one.
e.g.
If there is only one object...
{
propertyA: {
first: "A",
second: "B"
}
}
Or if there are multiple:
{
propertyA: [
{
first: "A",
second: "B"
},
{
first: "A",
second: "B"
}
]
}
Does anybody have a good way of dealing with this scenario?
Ideally we'd like to serialize both to
public class ApiResult{
public ApiItem[] PropertyA {get;set;}
}
This works for the second example, but if you encounter the first example you get:
A first chance exception of type 'System.MissingMethodException' occurred in System.Web.Extensions.dll
Additional information: No parameterless constructor defined for type of 'ApiItem[]'.
I assume the class definition is as below
public class ApiResult
{
public ApiItem[] PropertyA { get; set; }
}
public class ApiItem
{
public string First { get; set; }
public string Second { get; set; }
}
You can deserialize the json into a dynamic variable, then check the type of d.propertyA. If it's a JArray, then propertyA is an array, so you can deserialize the json into an ApiResult. If it's a JObject, then propertyA is a single object, so you need to manually construct an ApiItem and assign it to PropertyA of ApiResult. Consider the method below:
public ApiResult Deserialize(string json)
{
ApiResult result = new ApiResult();
dynamic d = JsonConvert.DeserializeObject(json);
if (d.propertyA.GetType() == typeof (Newtonsoft.Json.Linq.JObject))
{
// propertyA is a single object, construct ApiItem manually
ApiItem item = new ApiItem();
item.First = d.propertyA.first;
item.Second = d.propertyA.second;
// assign item to result.PropertyA[0]
result.PropertyA = new ApiItem[1];
result.PropertyA[0] = item;
}
else if (d.propertyA.GetType() == typeof (Newtonsoft.Json.Linq.JArray))
{
// propertyA is an array, deserialize json into ApiResult
result = JsonConvert.DeserializeObject<ApiResult>(json);
}
return result;
}
The code above will return an instance of ApiResult for both json examples.
Working demo: https://dotnetfiddle.net/wBQKrp
Building upon ekad's answer, I made the code:
Shorter: you no longer have to map every property inside ApiItem one by one
Probably faster when propertyA is already an array (by not calling both versions of JsonConvert.DeserializeObject with the same json string)
Explicit about how to introduce other properties
Able to handle the error case of an unknown type for our propertyA (the one that can be either an array or an object)
Notice that instead of JsonConvert.DeserializeObject, I call JObject.Parse, and then ToObject<> for only the part I need in that particular case:
static ApiResult Deserialize(string json)
{
JObject j = JObject.Parse(json);
var propA = j["propertyA"];
switch (propA.Type.ToString())
{
case "Object":
return new ApiResult {
PropertyA = new[]{propA.ToObject<ApiItem>()},
SomethingElse = j["somethingElse"].ToObject<string>(),
};
case "Array":
return j.ToObject<ApiResult>();
default:
throw new Exception("Invalid json with propertyA of type " + propA.Type.ToString());
}
}
The API is pretty much the same, but I've added SomethingElse (for showing how other properties can be easily handled with this approach):
public class ApiResult
{
public ApiItem[] PropertyA { get; set; }
public string SomethingElse { get; set; }
}
public class ApiItem
{
public string First { get; set; }
public string Second { get; set; }
}
Working demo: https://dotnetfiddle.net/VLbTMu
JSON# has a very lightweight tool that allows you to achieve this. It will retrieve embedded JSON, whether the embedded JSON is an object or an array, from within larger JSON objects:
const string schoolMetadata = @"{ ""school"": {...";
var jsonParser = new JsonObjectParser();
using (var stream =
new MemoryStream(Encoding.UTF8.GetBytes(schoolMetadata))) {
Json.Parse(jsonParser, stream, "teachers");
}
Here we retrieve a "teachers" object from within a larger "school" object.
The best way to serialize/deserialize to/from JSON is Json.NET:
Popular high-performance JSON framework for .NET
Product product = new Product();
product.Name = "Apple";
product.Expiry = new DateTime(2008, 12, 28);
product.Sizes = new string[] { "Small" };
string json = JsonConvert.SerializeObject(product);
//{
// "Name": "Apple",
// "Expiry": "2008-12-28T00:00:00",
// "Sizes": [
// "Small"
// ]
//}
string json = @"{
'Name': 'Bad Boys',
'ReleaseDate': '1995-4-7T00:00:00',
'Genres': [
'Action',
'Comedy'
]
}";
Movie m = JsonConvert.DeserializeObject<Movie>(json);
string name = m.Name;
// Bad Boys
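For reference, the Movie class assumed by that last snippet would be something like this (shape inferred from the JSON above):
public class Movie
{
    public string Name { get; set; }
    public DateTime ReleaseDate { get; set; }
    public string[] Genres { get; set; }
}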
(This is all background to give you context around my problem. You can skip down to "The Problem/Question" and read that, and then maybe come back up and skim the background if you want. Sorry it's a wall of text!)
I've got a bunch of terrible, terrible JSON I need to store in a database. Essentially, someone took a large XML file and serialized it to one big, flat JSON object by simply using the XML's XPath. Here's an example of what I mean:
Original XML:
<statistics>
<sample>
<date>2012-5-10</date>
<duration>11.2</duration>
</sample>
<sample>
<date>2012-6-10</date>
<duration>13.1</duration>
</sample>
<sample>
<date>2012-7-10</date>
<duration>10.0</duration>
</sample>
</statistics>
The Horrible JSON I Have to Work With: (basically just the XPath from above)
{
"statistics":"",
"statistics/sample":"",
"statistics/sample/date":"2012-5-10",
"statistics/sample/duration":"11.2",
"statistics/sample#1":"",
"statistics/sample/date#1":"2012-6-10",
"statistics/sample/duration#1":"13.1",
"statistics/sample#2":"",
"statistics/sample/date#2":"2012-7-10",
"statistics/sample/duration#2":"10.0",
}
And now I need to put it in a database that contains a statistics table with date and duration columns.
How I'm Currently Doing It: (or at least a simple example of how)
Tuple<string, string>[] jsonToColumn = // maps JSON value to SQL table column
{
new Tuple<string, string>("statistics/sample/date", "date"),
new Tuple<string, string>("statistics/sample/duration", "duration")
};
// Parse the JSON text
// jsonText is just a string holding the raw JSON
JavaScriptSerializer serializer = new JavaScriptSerializer();
Dictionary<string, object> json = serializer.DeserializeObject(jsonText) as Dictionary<string, object>;
// Duplicate JSON fields have some "#\d+" string appended to them, so we can
// find these and use them to help uniquely identify each individual sample.
List<string> sampleIndices = new List<string>();
foreach (string k in json.Keys)
{
Match m = Regex.Match(k, "^statistics/sample(#\\d*)?$");
if (m.Success)
{
sampleIndices.Add(m.Groups[1].Value);
}
}
// And now take each "index" (of the form "#\d+" (or "" for the first item))
// and generate a SQL query for its sample.
foreach (string index in sampleIndices)
{
List<string> values = new List<string>();
List<string> columns = new List<string>();
foreach (Tuple<string, string> t in jsonToColumn)
{
object result;
if (json.TryGetValue(t.Item1 + index, out result))
{
columns.Add(t.Item2);
values.Add(result.ToString());
}
}
string statement = "INSERT INTO statistics(" + string.Join(", ", columns) + ") VALUES(" + string.Join(", ", values) + ");";
// And execute the statement...
}
However, I'd like to use an ADO.NET Entity Data Model (or something LINQ-ish) rather than this hackery, because I need to start performing some queries before inserting and apply some updates, and creating and executing my own SQL statements is just... cumbersome. I created an ADO.NET Entity Data Model (.edmx) file and set things up, and now I can easily use this model to interact with and write to my database.
The Problem/Question
The problem is that I'm not sure how best to map from my JSON to my ADO.NET Entity Data Model Statistic object (which represents a sample/record in the statistics table). The easiest would be to change my Tuple list to use something like pointers-to-members (a la Tuple<string, Statistic::*Duration>("statistics/sample/duration", &Statistic::Duration) if this were C++), but a) I don't even think this is possible in C#, and b) even if it were, it would make my Tuples all have different types.
What are some of my options here? How can I best map the JSON to my Statistic objects? I'm kinda new to the LINQ world, and am wondering if there's a way (through LINQ or something else) to map these values.
It's a sub-optimal position I'm in (working with such poor JSON), and I recognize it's possible that maybe my current method is better than anything else given my situation, and if that's the case I'd even accept that as my answer. But I would really like to explore what options there are for mapping this JSON to a C# object (and ultimately to the SQL database).
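For illustration, the closest C# analogue to a pointer-to-member would be a setter delegate; a purely hypothetical sketch of what I mean (Statistic stands in for my generated entity, assuming it has a DateTime Date and a double Duration, and reusing the json dictionary and sampleIndices from my code above):
// Hypothetical: map each JSON key prefix to a delegate that writes the value
// onto the generated Statistic entity.
var jsonToSetter = new[]
{
    Tuple.Create<string, Action<Statistic, string>>(
        "statistics/sample/date", (s, v) => s.Date = DateTime.Parse(v)),
    Tuple.Create<string, Action<Statistic, string>>(
        "statistics/sample/duration", (s, v) => s.Duration = double.Parse(v))
};
foreach (string index in sampleIndices)
{
    var statistic = new Statistic();
    foreach (var t in jsonToSetter)
    {
        object result;
        if (json.TryGetValue(t.Item1 + index, out result))
        {
            t.Item2(statistic, result.ToString());
        }
    }
    // statistic could now be added to the entity context instead of building SQL by hand.
}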
If the whole problem is mapping that "JSON" you have to POCO entities, here's an example of how to deserialize it using a custom JavaScriptConverter:
Your POCO entities:
public class Statistics
{
public Statistics()
{
Samples = new List<Sample>();
}
public List<Sample> Samples { get; set; }
}
public class Sample
{
public DateTime Date { get; set; }
public float Duration { get; set; }
}
Your StatsConverter:
public class StatsConverter : JavaScriptConverter
{
public override object Deserialize(IDictionary<string, object> dictionary, Type type, JavaScriptSerializer serializer)
{
if (dictionary == null)
throw new ArgumentNullException("dictionary");
else if (type == typeof(Statistics))
{
Statistics statistics = null;
Sample sample = null;
{
foreach (var item in dictionary.Keys)
{
if (dictionary[item] is string && item.Contains("duration"))
sample.Duration = float.Parse(dictionary[item].ToString());
else if (dictionary[item] is string && item.Contains("date"))
sample.Date = DateTime.Parse((dictionary[item].ToString()));
else if (dictionary[item] is string && item.Contains("sample"))
{
sample = new Sample();
statistics.Samples.Add(sample);
}
else if (dictionary[item] is string && item.Contains("statistics"))
statistics = new Statistics();
}
}
return statistics;
}
return null;
}
public override IDictionary<string, object> Serialize(object obj, JavaScriptSerializer serializer)
{
throw new NotImplementedException();
}
public override IEnumerable<Type> SupportedTypes
{
get { return new ReadOnlyCollection<Type>(new List<Type>(new Type[] { typeof(Statistics)})); }
}
}
Now a sample on how to Deserialize it:
string json = @"{
""statistics"":"""",
""statistics/sample"":"""",
""statistics/sample/date"":""2012-5-10"",
""statistics/sample/duration"":""11.2"",
""statistics/sample#1"":"""",
""statistics/sample/date#1"":""2012-6-10"",
""statistics/sample/duration#1"":""13.1"",
""statistics/sample#2"":"""",
""statistics/sample/date#2"":""2012-7-10"",
""statistics/sample/duration#2"":""10.0""
}";
//These are the only 4 lines you'll require on your code
JavaScriptSerializer serializer = new JavaScriptSerializer();
StatsConverter sc = new StatsConverter();
serializer.RegisterConverters(new JavaScriptConverter[] { sc });
Statistics stats = serializer.Deserialize<Statistics>(json);
The stats object above will deserialize into a Statistics object with 3 Sample objects in its Samples collection.
If you use an ORM (e.g. Entity Framework) then you are just working against a data model, so assuming you would then have a model class defined something like...
public class Sample
{
public string Date { get; set; }
public double Duration { get; set; }
}
Then you could do something like this...
List<Sample> samples = new List<Sample>();
JavaScriptSerializer ser = new JavaScriptSerializer();
Dictionary<string, object> data = ser.DeserializeObject(json) as Dictionary<string, object>;
var keys = data.Keys.ToList();
for (int i = 0; i <keys.Count; i++)
{
string k = keys[i];
if (Regex.IsMatch(k, "^statistics/sample(#\\d*)?$"))
{
samples.Add(new Sample
{
Date = (string)data[keys[i + 1]],
Duration = double.Parse((string)data[keys[i + 2]])
});
}
}
I just populate a list for the example, and again, if you are using something like Entity Framework, you could just be adding the instances directly to your repository/data context.
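For instance, with a hypothetical Entity Framework context that exposes a Samples set (StatsContext and the Samples property are made-up names here), the populated list could be pushed straight into the database:
// Hypothetical DbContext mapped to the statistics table.
using (var db = new StatsContext())
{
    db.Samples.AddRange(samples);
    db.SaveChanges();
}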