I am rather new to strongly typed languages and I am working on an Umbraco controller that outputs some JSON with a list of dates.
"meetingTimes": ["10:30", "11:30"]
That works pretty well. Now I want to output an extra field alongside the time, containing a unique key.
So it should be like
meetingTimes: [{ time: "10:30", key: "abcd-1234-efgh-5678" }, { time: "11:30", key: "defg-1234-sktg-5678" }]
But I can't figure out how to do it.
The part of my current code that handles this is:
try {
IPublishedContent content = Umbraco.Content(Guid.Parse("ff3e93f6-b34f-4664-a08b-d2eae2a0adbd"));
var meetingDatesAvailabled = content.Value<IEnumerable<IPublishedElement>>("meetingDatesAvailable");
var items = new List<object>();
foreach(var meetingDate in meetingDatesAvailabled)
{
if (meetingDate.Value("meetingItemDay").ToString().Substring(0, 8) == theDate) {
var times = meetingDate.Value<IEnumerable<IPublishedElement>>("meetingItemDayTimes");
foreach (var time in times)
{
items.Add(time.Value("meetingdateTimeItem").ToString());
}
}
}
return new { dateChosen = theDate, meetingTimes = items };
}
First, we have to create a class that abstracts the meeting time:
public class MeetingTime
{
public string Time { get; }
public Guid Key { get; }
public MeetingTime(string time, Guid key)
{
Time = time;
Key = key;
}
}
Then, instead of creating an empty list of object (var items = new List<object>();), we create an empty list of MeetingTime:
var items = new List<MeetingTime>();
Then, inside your foreach:
foreach (var time in times)
{
items.Add(new MeetingTime(time.Value("meetingdateTimeItem").ToString(),
Guid.NewGuid())
);
}
Note: I am not aware of how you serialize your objects (e.g. with https://www.newtonsoft.com), but quite probably you will have to decorate both properties, Time and Key, with an attribute so that a proper name is used during serialization. If you don't, the default names, Time and Key, are used; I mention this because the JSON you shared uses time and key.
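For instance, with Json.NET the mapping could look like this (a sketch; it assumes Newtonsoft.Json is the serializer in use):

```csharp
using System;
using Newtonsoft.Json;

public class MeetingTime
{
    // [JsonProperty] controls the name used in the JSON output.
    [JsonProperty("time")]
    public string Time { get; }

    [JsonProperty("key")]
    public Guid Key { get; }

    public MeetingTime(string time, Guid key)
    {
        Time = time;
        Key = key;
    }
}
```

JsonConvert.SerializeObject(new MeetingTime("10:30", Guid.NewGuid())) then emits lowercase time and key properties.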
Related
I have a ConcurrentDictionary that I am storing key/value pairs in. The complexity here is that the values are not single objects like strings or integers; they are collections of items added using a class model. Looking at the debugger, it seems I am able to add items to the dictionary successfully, but the issue is that I don't know how to read these items when iterating through the dictionary from another thread. I simply want to read all the key/value pairs, including all the items, and add them to an array that I can pass back to the user through JsonResult/Ajax.
My AlarmEngine Class:
public static ConcurrentDictionary<string, object> concurrentDictionary = new ConcurrentDictionary<string, object>();
public class Alarm
{
public int AlarmId { get; set; }
public string AlarmName { get; set; }
public string AlarmDescription { get; set; }
public string ApplicationRoleId { get; set; }
public DateTime DateTimeActivated { get; set; }
}
An extract from the method where I add items to the dictionary:
concurrentDictionary.TryAdd(DateTime.Now.ToString("MM/dd/yyyy HH:mm:ss"), new Alarm
{
DateTimeActivated = DateTime.Now,
ApplicationRoleId = applicationRoleId,
AlarmId = alarmId,
AlarmName = alarmName,
AlarmDescription = description,
}); // I'm using the datetime as the key, as it is unique for each key/value pair added.
My AlarmList MVC Controller:
[HttpGet]
public Task<IActionResult> FetchAlarmsListAsync()
{
// Enumeration is thread-safe in ConcurrentDictionary.
foreach (var item in AlarmEngine.concurrentDictionary)
{
// How do I access the key and value pairs here?
// Note each value contains a subset of items
// i want to access all items stored
}
return new JsonResult(Array containing all keys and value items);
}
If it's better to read through the whole dictionary and store all keys/values (with sub-items) in a list before returning it to the user as a JSON array, that would be an acceptable solution as well; just keep in mind I want to do this as efficiently as possible, without converting from one form to another too many times. Items will be added to and removed from the dictionary by other threads, but this particular part of my application, i.e. iterating through the dictionary, is read-only. It doesn't really matter to me if the contents change while I'm reading; the goal is just to put together a list of the alarms relevant/active at the time of the query.
You need to store the data as the right class; do not use object:
public static ConcurrentDictionary<string, Alarm> concurrentDictionary = new ConcurrentDictionary<string, Alarm>();
// ^^^
public class Alarm
{
public int AlarmId { get; set; }
public string AlarmName { get; set; }
public string AlarmDescription { get; set; }
public string ApplicationRoleId { get; set; }
public DateTime DateTimeActivated { get; set; }
}
Now you can access the properties of the alarm.
// Enumeration is thread-safe in ConcurrentDictionary.
foreach (var item in AlarmEngine.concurrentDictionary)
{
Trace.WriteLine(item.Key); // which is the datetime
Trace.WriteLine(item.Value.AlarmName);
// How do I access the key and value pairs here?
// Note each value contains a subset of items
// i want to access all items stored
}
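Concretely, the whole controller loop can collapse into a single LINQ projection over the dictionary; here is a minimal, self-contained sketch (the Alarm class is trimmed to three properties for brevity):

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;

public class Alarm
{
    public int AlarmId { get; set; }
    public string AlarmName { get; set; }
    public string AlarmDescription { get; set; }
}

public static class AlarmEngine
{
    public static ConcurrentDictionary<string, Alarm> concurrentDictionary =
        new ConcurrentDictionary<string, Alarm>();

    // Snapshot the dictionary into a flat, serializable array;
    // the result can be handed straight to new JsonResult(...).
    public static object[] Snapshot() =>
        concurrentDictionary
            .Select(kvp => new
            {
                Activated = kvp.Key,   // the datetime string used as the key
                kvp.Value.AlarmId,
                kvp.Value.AlarmName,
                kvp.Value.AlarmDescription
            })
            .ToArray();
}
```

In the controller this becomes return new JsonResult(AlarmEngine.Snapshot());.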
One more thing you should change: do not store the DateTime as a string.
public static ConcurrentDictionary<DateTime, Alarm> concurrentDictionary = new ConcurrentDictionary<DateTime, Alarm>();
If these resources are shared among multiple clients/threads, and the clients are mostly reading, I would suggest a ReaderWriterLockSlim instead. It allows multiple readers and one writer.
This way the JSON responses can be generated in parallel. A ConcurrentX collection is useful for passing resources to other threads (like passing ownership) or for sharing simply typed data.
private static ReaderWriterLockSlim cacheLock = new ReaderWriterLockSlim();
[HttpGet]
public Task<IActionResult> FetchAlarmsListAsync()
{
cacheLock.EnterReadLock();
try
{
// Safe to enumerate the plain Dictionary here: the read lock keeps writers out.
foreach (var item in AlarmEngine.normalDictionary)
{
// How do I access the key and value pairs here?
// Note each value contains a subset of items
// i want to access all items stored
}
return new JsonResult(Array containing all keys and value items);
}
finally
{
cacheLock.ExitReadLock();
}
}
cacheLock.EnterWriteLock();
try
{
normalDictionary.TryAdd(DateTime.Now.ToString("MM/dd/yyyy HH:mm:ss"), new Alarm
{
DateTimeActivated = DateTime.Now,
ApplicationRoleId = applicationRoleId,
AlarmId = alarmId,
AlarmName = alarmName,
AlarmDescription = description,
}); // I'm using the datetime as the key, as it is unique for each key/value pair added.
}
finally
{
cacheLock.ExitWriteLock();
}
It's even possible to cache the JSON result while the dictionary is unchanged, but you should only do such a thing when performance is actually a problem.
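As an illustrative sketch of that caching idea (all names here are made up, not from the original code): keep the serialized JSON next to a dirty flag, mark it dirty on every write, and rebuild the string only on the next read. Thread-safety details are elided; in the setup above, Get/Invalidate would piggyback on the existing read/write locks.

```csharp
using System;

// Rebuild the cached JSON only after a write has invalidated it.
public class CachedJson
{
    private readonly Func<string> _serialize;
    private string _cached;
    private bool _dirty = true;

    public CachedJson(Func<string> serialize) => _serialize = serialize;

    public void Invalidate() => _dirty = true;   // call after every write

    public string Get()
    {
        if (_dirty)
        {
            _cached = _serialize();
            _dirty = false;
        }
        return _cached;
    }
}
```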
I am working on a .NET project. I have my Product model below.
class Product
{
public IEnumerable<OptionData> Options { get; set; }
}
Then I have OptionData model below.
public class OptionData
{
public Colour PrimaryColour { get; set; }
public Colour SecondaryColour { get; set; }
public IEnumerable<SizeData> Sizes { get; set; }
}
Then I have SizeData model below.
public class SizeData
{
public string KeycodeNumber { get; set; }
public Size Size { get; set; }
}
Then I have my size model below.
public class Size
{
public string Name { get; set; }
}
Then I am sending data using these models to a messaging system; in my case it is Confluent Kafka.
Options = new Option[]
{
new Option
{
PrimaryColour = new CodeNamePair
{
Name = "White",
},
SecondaryColour = new CodeNamePair
{
Name = "red",
},
Sizes = new SizeElement[]
{
new SizeElement
{
Size = new KafkaProductEvent.Size
{
Name = "10"
},
KeycodeNumber = 232
}
}
}
}
Then through consumer I am extracting data. I am able to get PrimaryColour or SecondaryColour as below.
IEnumerable<object> options = (IEnumerable<object>)((GenericRecord)response.Message.Value["Product"])["Options"];
foreach (var data in options)
{
OptionData optionData = new OptionData()
{
PrimaryColour = new Colour()
{
Name = (string)((GenericRecord)((GenericRecord)data)["PrimaryColour"])["Name"],
},
SecondaryColour = new Colour()
{
Name = (string)((GenericRecord)((GenericRecord)data)["SecondaryColour"])["Name"]
}
};
}
Then I want to get Sizes data as well. I tried something like this.
Sizes = new SizeData[]
{
new SizeData()
{
Size = new ProductEvents.Size()
{
Name = ""
}
}
}
I am not sure how to get the size name from the above. Can someone help me figure it out? Any help would be appreciated. Thanks.
The main challenge I see in the code you posted is the kind of API exposed by the client adapter you are using for deserializing a complex data model with multiple aggregated objects. You typecast every record in every hierarchy to GenericRecord and then to the actual .NET type, which means that as the aggregated hierarchy grows, deserializing the actual object becomes extremely complex, especially with aggregated collections.
Also a point related to deserializing Options :
class Product
{
public IEnumerable<OptionData> Options { get; set; }
}
Your code is:
IEnumerable<object> options = (IEnumerable<object>)((GenericRecord)response.Message.Value["Product"])["Options"];
What I am wondering is why you cannot typecast directly to IEnumerable<OptionData> to make this a little simpler, and, assuming that's not possible, why each item cannot still be typecast to OptionData while enumerating the IEnumerable<object> options. The challenge with the adapter/approach you are using is that it needs full awareness of the object hierarchy and property names to deserialize, when ideally, once you fill the top-level object (Product in this case), the rest should fill in recursively. A good example is Newtonsoft Json.NET: it automatically fills an object of any complexity, leaves anything unavailable as null/default, and requires minimal deserialization code.
What you could actually do is develop your own adapter that reads property details via reflection and fills in whatever data is available in the input, discarding the rest. For now, assuming this is all you have as an API, the following would be the approach:
IEnumerable<object> options = (IEnumerable<object>)((GenericRecord)response.Message.Value["Product"])["Options"];
foreach (var data in options)
{
OptionData optionData = new OptionData()
{
PrimaryColour = new Colour()
{
Name = (string)((GenericRecord)((GenericRecord)data)["PrimaryColour"])["Name"],
},
SecondaryColour = new Colour()
{
Name = (string)((GenericRecord)((GenericRecord)data)["SecondaryColour"])["Name"]
},
Sizes = new List<SizeData>() // Initialize Collection Here
};
IEnumerable<object> SizesEnumerable = (IEnumerable<object>)(((GenericRecord)data)["Sizes"]);
foreach (var size in SizesEnumerable)
{
var sizeValue = new SizeData
{
KeycodeNumber = (string)((GenericRecord)size)["KeycodeNumber"],
Size = new Size
{
Name = (string)((GenericRecord)((GenericRecord)size)["Size"])["Name"]
}
};
((List<SizeData>)optionData.Sizes).Add(sizeValue); // Add Data here
}
}
What's the difference?
You were trying to use an object initializer to fill in the IEnumerable<SizeData> Sizes collection, but that doesn't provide the option to do the further processing required in your case.
Also note that I have filled IEnumerable<SizeData> Sizes with a List<SizeData>: since we cannot use an object initializer, we cannot use an array, because we don't know the size in advance.
Beyond that, I used the same logic as yours to fill in the data.
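If you stay with the GenericRecord API, a pair of tiny extension methods can at least contain the cast noise in one place. This is only a sketch: the GenericRecord below is a minimal dictionary-backed stand-in so the example is self-contained, but the extensions rely only on the string indexer that the Avro GenericRecord in the question's code also exposes.

```csharp
using System.Collections.Generic;

// Dictionary-backed stand-in for Avro's GenericRecord, just so this
// sketch compiles on its own; only the string indexer is used.
public class GenericRecord
{
    private readonly Dictionary<string, object> _fields = new Dictionary<string, object>();
    public object this[string name]
    {
        get { return _fields[name]; }
        set { _fields[name] = value; }
    }
}

public static class GenericRecordExtensions
{
    // Centralize the double casts from the code above.
    public static GenericRecord Record(this object value, string field)
        => (GenericRecord)((GenericRecord)value)[field];

    public static string String(this object value, string field)
        => (string)((GenericRecord)value)[field];
}
```

With these, (string)((GenericRecord)((GenericRecord)data)["PrimaryColour"])["Name"] becomes data.Record("PrimaryColour").String("Name").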
Scenario:
I have to export an Excel file containing a list of Parts. We let the user select columns, so the exported file contains only the selected columns' data. Since this is a dynamic report, I am not mapping it with a concrete class, as that would export empty column headers, which is unnecessary. I am using Dynamic LINQ to deal with this scenario.
I have a list of dynamic objects fetched from dynamic linq.
[
{"CleanPartNo":"Test","Description":"test","AliasPartNo":["258","145","2313","12322"]},
{"CleanPartNo":"Test1","Description":"test1","AliasPartNo":[]}
]
How can I get 4 rows (one per alias) out of this JSON?
Please note that I cannot use a strongly typed object to deserialize/map it using JSON.Net.
Update
Following is the code:
public class Part
{
public int Id { get; set; }
public string CleanPartNo { get; set; }
public string Description { get; set; }
public List<PartAlias> AliasPartNo { get; set; }
}
public class PartAlias
{
public int PartId { get; set; }
public int PartAliasId { get; set; }
public string AliasPartNo { get; set; }
}
var aliases = new List<PartAlias> {
new PartAlias{AliasPartNo="258" },
new PartAlias{AliasPartNo="145" },
new PartAlias{AliasPartNo="2313" },
new PartAlias{AliasPartNo="12322" }
};
List<Part> results = new List<Part> {
new Part{CleanPartNo="Test", Description= "test", AliasPartNo=aliases },
new Part{CleanPartNo="Test1", Description= "test1" }
};
var filters = "CleanPartNo, Description, AliasPartNo.Select(AliasPartNo) as AliasPartNo";
var dynamicObject = JsonConvert.SerializeObject(results.AsQueryable().Select($"new ({filters})"));
In the dynamicObject variable I get the JSON mentioned above.
Disclaimer: The following relies on anonymous classes, which is not exactly the same as dynamic LINQ (not at all), but I figured that it may help anyway, depending on your needs, hence I decided to post it.
To flatten your list, you could go with a nested Select followed by a SelectMany. (Disclaimer: this assumes that every part has at least one alias; see below for the full code.)
var flattenedResult = result.Select(part => part.AliasPartNumber.Select(alias => new
{
    CleanPartNo = part.CleanPartNo,
    Description = part.Description,
    AliasPartNo = alias.AliasPartNo
}))
.SelectMany(part => part);
You first project the items from result (the outer Select). The projection maps each item to an IEnumerable of an anonymous type, in which each element corresponds to an alias part number. Since the outer Select yields an IEnumerable<IEnumerable<T>> (or something alike), we use SelectMany to flatten the nested IEnumerables into a single one. You can now serialize this IEnumerable of anonymous-class instances with JsonConvert:
var json = JsonConvert.SerializeObject(flattenedResult);
Handling parts without aliases
If there are no aliases, the inner select will yield an empty IEnumerable, hence we will have to introduce a special case
var flattenedResult = result.SelectMany(part => part.AliasPartNumber?.Any() == true
    ? part.AliasPartNumber.Select(alias => new
    {
        CleanPartNo = part.CleanPartNo,
        Description = part.Description,
        AliasPartNo = alias.AliasPartNo
    })
    : new[]
    {
        new
        {
            CleanPartNo = part.CleanPartNo,
            Description = part.Description,
            AliasPartNo = (string)null
        }
    });
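For reference, here is a self-contained sketch of the whole flattening using the question's model (class shapes assumed from the post; parts without aliases yield one row with a null alias):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class PartAlias
{
    public string AliasPartNo { get; set; }
}

public class Part
{
    public string CleanPartNo { get; set; }
    public string Description { get; set; }
    public List<PartAlias> AliasPartNo { get; set; }
}

public static class Flattener
{
    // One output row per alias; parts without aliases still yield one row.
    public static IEnumerable<object> Flatten(IEnumerable<Part> parts) =>
        parts.SelectMany(part => part.AliasPartNo?.Any() == true
            ? part.AliasPartNo.Select(alias => new
              {
                  part.CleanPartNo,
                  part.Description,
                  AliasPartNo = alias.AliasPartNo
              })
            : new[]
              {
                  new
                  {
                      part.CleanPartNo,
                      part.Description,
                      AliasPartNo = (string)null
                  }
              });
}
```

Serializing Flatten(results) with JsonConvert.SerializeObject then yields one flat JSON object per row.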
From the JSON you provided, you can get the values grouped by their name in this way:
var array = JArray.Parse(json);
var lookup = array.SelectMany(x => x.Children<JProperty>()).ToLookup(x => x.Name, x => x.Value);
Then filling the Excel columns is just a matter of a simple loop over the lookup.
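A sketch of that loop, wrapped in a helper so the lookup step is reusable (the JSON shape is taken from the question; each grouping is one Excel column, each grouped token one cell):

```csharp
using System;
using System.Linq;
using Newtonsoft.Json.Linq;

public static class LookupDemo
{
    // Group every property across all objects by its name:
    // each grouping is one column, each value one cell.
    public static ILookup<string, JToken> ToColumns(string json) =>
        JArray.Parse(json)
              .SelectMany(x => x.Children<JProperty>())
              .ToLookup(p => p.Name, p => p.Value);
}
```

Usage: foreach (var column in LookupDemo.ToColumns(json)) gives column.Key as the header and the grouped tokens as the cells; array-valued cells still need flattening.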
However, I would suggest doing the flattening before producing the JSON. I tried for some time to make it happen even without knowing the names of the columns that are arrays, but I failed, and since it's your job, I won't try anymore :P
I think the best way here would be to implement a custom converter that multiplies the objects for properties that are arrays. If you do it well, you get arbitrarily many levels completely for free.
I am using Newtonsoft.Json 11.0.2 in .NET Core 2.0.
If I use JObject, I am able to SelectToken like so:
JObject.Parse("{\"context\":{\"id\":42}}").SelectToken("context.id")
Returns
42
However, if I use JRaw, I get null for the same path:
new JRaw("{\"context\":{\"id\":42}}").SelectToken("context.id")
returns
null
Due to how my code is set up, my model is already in JRaw, and converting it to JObject just to select this token seems like a waste of RAM (this call is on the hot path).
UPDATE
OK, my actual data comes down in a model where only one of the properties is JRaw, so I need something like the below to work:
JsonConvert.DeserializeObject<Dictionary<string, JRaw>>(
"{\"a\":{\"context\":{\"id\":42}}}")["a"].SelectToken("context.id")
The above returns null again.
The title might be a bit misleading, but basically what the OP needs is a way to parse an existing (and large) JRaw object without consuming too much memory.
I ran some tests and I was able to find a solution using a JsonTextReader.
I don't know the exact structure of the OP's json strings, so I'll assume something like this:
[
{
"context": {
"id": 10
}
},
{
"context": {
"id": 20
}
},
{
"context": {
"id": 30
}
}
]
The result would be an integer array with the id values (10, 20, 30).
Parsing method
So this is the method that takes a JRaw object as a parameter and extracts the Ids, using a JsonTextReader.
private static IEnumerable<int> GetIds(JRaw raw)
{
using (var stringReader = new StringReader(raw.Value.ToString()))
using (var textReader = new JsonTextReader(stringReader))
{
while (textReader.Read())
{
if (textReader.TokenType == JsonToken.PropertyName && textReader.Value.Equals("id"))
{
int? id = textReader.ReadAsInt32();
if (id.HasValue)
{
yield return id.Value;
}
}
}
}
}
In the above example I'm assuming there is one and only one type of object with an id property.
There are other ways to extract the information we need - e.g. we can check the token type and the path as follows:
if (textReader.TokenType == JsonToken.Integer && textReader.Path.EndsWith("context.id"))
{
int id = Convert.ToInt32(textReader.Value);
yield return id;
}
Testing the code
I created the following C# classes that match the above json structure, for testing purposes:
public class Data
{
[JsonProperty("context")]
public Context Context { get; set; }
public Data(int id)
{
Context = new Context
{
Id = id
};
}
}
public class Context
{
[JsonProperty("id")]
public int Id { get; set; }
}
Creating a JRaw object and extracting the Ids:
class Program
{
static void Main(string[] args)
{
JRaw rawJson = CreateRawJson();
List<int> ids = GetIds(rawJson).ToList();
Console.Read();
}
// Instantiates 1 million Data objects and then creates a JRaw object
private static JRaw CreateRawJson()
{
var data = new List<Data>();
for (int i = 1; i <= 1_000_000; i++)
{
data.Add(new Data(i));
}
string json = JsonConvert.SerializeObject(data);
return new JRaw(json);
}
}
Memory Usage
Using Visual Studio's Diagnostic tools I took the following snapshots, to check the memory usage:
Snapshot #1 was taken at the beginning of the console application (low memory as expected)
Snapshot #2 was taken after creating the JRaw object
JRaw rawJson = CreateRawJson();
Snapshot #3 was taken after extracting the ids
List<int> ids = GetIds(rawJson).ToList();
(This is all kind of background to give you context around my problem. You can skip down to "The Problem" and read that, and then maybe come back up and skim the background if you want to get straight to the point. Sorry it's a wall of text!)
I've got a bunch of terrible, terrible JSON I need to store in a database. Essentially, someone took a large XML file and serialized it to one big, flat JSON object by simply using the XML's XPaths. Here's an example of what I mean:
Original XML:
<statistics>
<sample>
<date>2012-5-10</date>
<duration>11.2</duration>
</sample>
<sample>
<date>2012-6-10</date>
<duration>13.1</duration>
</sample>
<sample>
<date>2012-7-10</date>
<duration>10.0</duration>
</sample>
</statistics>
The Horrible JSON I Have to Work With: (basically just the XPath from above)
{
"statistics":"",
"statistics/sample":"",
"statistics/sample/date":"2012-5-10",
"statistics/sample/duration":"11.2",
"statistics/sample#1":"",
"statistics/sample/date#1":"2012-6-10",
"statistics/sample/duration#1":"13.1",
"statistics/sample#2":"",
"statistics/sample/date#2":"2012-7-10",
"statistics/sample/duration#2":"10.0"
}
And now I need to put it in a database that contains a statistics table with date and duration columns.
How I'm Currently Doing It: (or at least a simple example of how)
Tuple<string, string>[] jsonToColumn = // maps JSON value to SQL table column
{
new Tuple<string, string>("statistics/sample/date", "date"),
new Tuple<string, string>("statistics/sample/duration", "duration")
};
// Parse the JSON text
// jsonText is just a string holding the raw JSON
JavaScriptSerializer serializer = new JavaScriptSerializer();
Dictionary<string, object> json = serializer.DeserializeObject(jsonText) as Dictionary<string, object>;
// Duplicate JSON fields have some "#\d+" string appended to them, so we can
// find these and use them to help uniquely identify each individual sample.
List<string> sampleIndices = new List<string>();
foreach (string k in json.Keys)
{
Match m = Regex.Match(k, "^statistics/sample(#\\d*)?$");
if (m.Success)
{
sampleIndices.Add(m.Groups[1].Value);
}
}
// And now take each "index" (of the form "#\d+" (or "" for the first item))
// and generate a SQL query for its sample.
foreach (string index in sampleIndices)
{
List<string> values = new List<string>();
List<string> columns = new List<string>();
foreach (Tuple<string, string> t in jsonToColumn)
{
object result;
if (json.TryGetValue(t.Item1 + index, out result))
{
columns.Add(t.Item2);
values.Add(result.ToString());
}
}
string statement = "INSERT INTO statistics(" + string.Join(", ", columns) + ") VALUES(" + string.Join(", ", values) + ");";
// And execute the statement...
}
However, I'd like to use an ADO.NET Entity Data Model (or something LINQ-ish) rather than this hackery, because I need to start performing some queries before inserting and apply some updates, and creating and executing my own SQL statements is just... cumbersome. I created an ADO.NET Entity Data Model (.edmx) file and set things up, and now I can easily use this model to interact with and write to my database.
The Problem/Question
The problem is I'm not sure how to best map from my JSON to my ADO.NET Entity Data Model Statistic object (that represents a sample/record in the statistics table). The easiest would be to change my Tuple list to use something like pointers-to-members (a la Tuple<string, Statistic::*Duration>("statistics/sample/duration", &Statistic::Duration) if this were C++), but a) I don't even think this is possible in C#, and b) even if it was it makes my Tuples all have different types.
What are some of my options here? How can I best map the JSON to my Statistic objects? I'm kinda new to the LINQ world, and am wondering if there's a way (through LINQ or something else) to map these values.
It's a sub-optimal position I'm in (working with such poor JSON), and I recognize it's possible that maybe my current method is better than anything else given my situation, and if that's the case I'd even accept that as my answer. But I would really like to explore what options there are for mapping this JSON to a C# object (and ultimately to the SQL database).
If the whole problem is mapping that "JSON" you have to POCO entities, here's an example of how to deserialize it using a custom JavaScriptConverter:
Your POCO entities:
public class Statistics
{
public Statistics()
{
Samples = new List<Sample>();
}
public List<Sample> Samples { get; set; }
}
public class Sample
{
public DateTime Date { get; set; }
public float Duration { get; set; }
}
Your StatsConverter:
public class StatsConverter : JavaScriptConverter
{
public override object Deserialize(IDictionary<string, object> dictionary, Type type, JavaScriptSerializer serializer)
{
if (dictionary == null)
throw new ArgumentNullException("dictionary");
else if (type == typeof(Statistics))
{
Statistics statistics = null;
Sample sample = null;
{
foreach (var item in dictionary.Keys)
{
if (dictionary[item] is string && item.Contains("duration"))
sample.Duration = float.Parse(dictionary[item].ToString());
else if (dictionary[item] is string && item.Contains("date"))
sample.Date = DateTime.Parse((dictionary[item].ToString()));
else if (dictionary[item] is string && item.Contains("sample"))
{
sample = new Sample();
statistics.Samples.Add(sample);
}
else if (dictionary[item] is string && item.Contains("statistics"))
statistics = new Statistics();
}
}
return statistics;
}
return null;
}
public override IDictionary<string, object> Serialize(object obj, JavaScriptSerializer serializer)
{
throw new NotImplementedException();
}
public override IEnumerable<Type> SupportedTypes
{
get { return new ReadOnlyCollection<Type>(new List<Type>(new Type[] { typeof(Statistics)})); }
}
}
Now a sample on how to Deserialize it:
string json = @"{
""statistics"":"""",
""statistics/sample"":"""",
""statistics/sample/date"":""2012-5-10"",
""statistics/sample/duration"":""11.2"",
""statistics/sample#1"":"""",
""statistics/sample/date#1"":""2012-6-10"",
""statistics/sample/duration#1"":""13.1"",
""statistics/sample#2"":"""",
""statistics/sample/date#2"":""2012-7-10"",
""statistics/sample/duration#2"":""10.0""
}";
// These are the only lines you'll need in your code
JavaScriptSerializer serializer = new JavaScriptSerializer();
serializer.RegisterConverters(new JavaScriptConverter[] { new StatsConverter() });
Statistics stats = serializer.Deserialize<Statistics>(json);
The stats object above deserializes into a Statistics object with 3 Sample objects in its Samples collection.
If you use an ORM (e.g. Entity Framework), then you are just working against a data model, so assume you have a model class defined something like...
public class Sample
{
public string Date { get; set; }
public double Duration { get; set; }
}
Then you could do something like this...
List<Sample> samples = new List<Sample>();
JavaScriptSerializer ser = new JavaScriptSerializer();
Dictionary<string, object> data = ser.DeserializeObject(json) as Dictionary<string, object>;
var keys = data.Keys.ToList();
for (int i = 0; i <keys.Count; i++)
{
string k = keys[i];
if (Regex.IsMatch(k, "^statistics/sample(#\\d*)?$"))
{
samples.Add(new Sample
{
Date = (string)data[keys[i + 1]],
Duration = double.Parse((string)data[keys[i + 2]])
});
}
}
I just populate a list for the example; again, if you are using something like Entity Framework, you could add the instances directly to your repository/data context.