Add a JsonElement to a JsonPatchDocument - C#

I'm working on creating JsonPatchDocuments (MSDN) in .NET (as opposed to just accepting them through an endpoint). I would really like to be able to use System.Text.Json for this as opposed to Newtonsoft.Json, but I can't figure out equivalent functionality. Microsoft's guidance on replacing JObject does not seem to cover this behavior from what I can tell.
At this point in the code I have worked through a JsonDocument, and identified a JsonElement that is new and needs to be added to the patch.
Pulling from the RFC (RFC 6902), here are two example documents and their JSON patch:
Start: { "foo": "bar" }
Patch: [{ "op": "add", "path": "/child", "value": { "grandchild": { } } }]
End: {"foo": "bar","child": {"grandchild": { } } }
I tried a number of different variations and didn't save them all, but here are some ways that do not successfully accomplish this:
using System.Text.Json;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using Microsoft.AspNetCore.JsonPatch;
using netjson = System.Text.Json; // alias used below to disambiguate from Newtonsoft's JsonSerializer

// JsonElement element is the change we want to see added in the patch
string path = "/child";
JsonPatchDocument patchDocument = new JsonPatchDocument();

patchDocument.Add(path, element); // Only the JsonElement property comes through. Gives [{"value":{"ValueKind":1},"path":"/child","op":"add"}]
patchDocument.Add(path, element.GetRawText()); // Obviously treated as a string, so it fails and gives [{"value":"{\n \"grandchild\": {\n }\n }","path":"/child","op":"add"}]
patchDocument.Add(path, element.GetString()); // Similar to the above attempt, but fails because element is an Object and throws an exception. Attempting to replicate Newtonsoft's ToObject gives the next example
patchDocument.Add(path, netjson.JsonSerializer.Deserialize<object>(element.GetRawText())); // Just a more convoluted version of our first attempt. Gives [{"value":{"ValueKind":1},"path":"/child","op":"add"}]
Here is working code that uses Newtonsoft.Json to take my JsonElement and produce the correct patch output (but it is very gross):
JsonPatchDocument patchDocument = new JsonPatchDocument();
JsonDocument newDocument = JsonDocument.Parse("{ \"foo\": \"bar\", \"child\": {\"grandchild\": { } } }");
JsonElement parentObject = newDocument.RootElement;
JsonElement element = parentObject.GetProperty("child");
// This is more dynamic in real code, but hardcoded for example
patchDocument.Add("/", JObject.Parse(parentObject.GetRawText()).Property("child").Value); // Correct answer [{\"value\":{\"grandchild\":{}},\"path\":\"/\",\"op\":\"add\"}]
I can (and for the moment am) proceeding just using Newtonsoft.Json. But as that isn't really supposed to be the way of the future, is there a way to successfully provide the correct value to JsonPatchDocument.Add using System.Text.Json?
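For anyone who lands here: as far as I can tell, Microsoft.AspNetCore.JsonPatch is still built on Newtonsoft's JToken types, so the value has to end up as a JToken for the operation to serialize correctly. A slightly cleaner bridge than re-parsing the whole parent document might be the following sketch (still Newtonsoft under the hood, using the same usings as above; the "/child" path is just the example path from earlier):

// Sketch: parse only the element's own raw text into a JToken and hand that to the patch.
JsonPatchDocument patchDocument = new JsonPatchDocument();
patchDocument.Add("/child", JToken.Parse(element.GetRawText()));
// The value should now serialize as a real JSON object instead of {"ValueKind":1}.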

Related

Create Kafka Producer that uses a schema, but without schema registry url

Good afternoon,
I have only recently started working with Kafka and have a question about the producer in connection with the schema.
Initially I tried to build a simple producer without a schema in C#. This works so far; the code is given below in a shortened version.
Code of producer without schema:
var config = new ProducerConfig
{
    BootstrapServers = "localhost:9092",
    BrokerAddressFamily = BrokerAddressFamily.V4,
};

using (var producer = new ProducerBuilder<Null, string>(config).Build())
{
    producer.Flush();
    for (int i = 0; i < 3; i++)
    {
        producer.Produce("topic", new Message<Null, string> { Value = "Value: " + i + "..." });
    }
    producer.Flush();
}
But the schema causes me problems (see next section).
Suppose I have a consumer, say in Python, that uses the following schema to receive integer numbers:
{"type": "record",
"name": "Sample",
"fields": [{"name": "Value", "type": ["int"]}]
}
I now want to create a C# producer that uses this schema and sends messages to the Python consumer. The messages should contain only numbers, according to the schema.
I tried to build a producer that uses the schema, but unfortunately most tutorials require a schema registry URL to run the producer. Here is what I have, but unfortunately I cannot avoid the use of the schema registry URL...
Code of producer that uses the schema:
using Avro;
using Avro.Generic;
using Confluent.Kafka;
using Confluent.Kafka.SyncOverAsync;
using Confluent.SchemaRegistry.Serdes;
using Confluent.SchemaRegistry;
using System;
using System.Threading;
using System.Text;
using System.Threading.Tasks;
using System.Diagnostics;
using System.IO;

namespace producer
{
    class Program
    {
        static async Task Main(string[] args)
        {
            // This is not further specified, nothing is set up here
            string schemaRegistryUrl = "localhost:8081";

            var schema = (RecordSchema)RecordSchema.Parse(
                @"{
                    ""type"": ""record"",
                    ""name"": ""Sample"",
                    ""fields"": [
                        {""name"": ""Value"", ""type"": [""int""]}
                    ]
                }"
            );

            var config = new ProducerConfig
            {
                BootstrapServers = "localhost:9092",
                BrokerAddressFamily = BrokerAddressFamily.V4,
            };

            using (var schemaRegistry = new CachedSchemaRegistryClient(new SchemaRegistryConfig { Url = schemaRegistryUrl }))
            using (var producer = new ProducerBuilder<Null, GenericRecord>(config)
                .SetValueSerializer(new AvroSerializer<GenericRecord>(schemaRegistry))
                .Build())
            {
                for (int i = 0; i < 3; i++)
                {
                    var record = new GenericRecord(schema);
                    record.Add("Value", i);
                    await producer.ProduceAsync("topic", new Message<Null, GenericRecord> { Value = record });
                }
                producer.Flush();
            }
        }
    }
}
Here are my two questions: how can one build a producer without the schema registry URL? I have already found something like this, but unfortunately in Java. What would it look like in C#? I'm interested in a solution with a producer that sends numbers to the Python consumer using the schema from above, preferably without using the schema registry URL. However, I would also be interested in how to get the schema registry URL to work.
Just as a hint: if the producer tries to send a message to the Python consumer without a schema, the consumer registers it, but it cannot display the sent number, because the simple producer does not use a schema. I am referring to the code of the very first producer!
I hope my questions are understandable; I am looking forward to receiving answers!
The confluent_kafka Python library requires that the data adhere to the Confluent Schema Registry wire format.
This means you're required to have the producer match that contract as well, so you cannot bypass the registry without writing your own implementation of the AvroSerializer you've referenced in the code.
If the producer tries to send a message to the Python consumer without a schema, the consumer registers this
This doesn't happen in the confluent_kafka library. The consumer fetches a schema from the registry based on the ID embedded in the message.
Therefore, unless you're using some other deserializer, you'll need to write your own implementation there as well.
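To make the wire-format point concrete, here is a rough sketch (not production code; the class name and the hard-coded schema ID are made up for illustration) of what "writing your own implementation" could look like with the Apache Avro C# library. The Confluent framing is a 0x00 magic byte, a 4-byte big-endian schema ID, and then the Avro binary payload; without a registry you would have to agree on that ID out of band, and the Python side would then also need a custom deserializer instead of the stock registry-backed one.

using System;
using System.IO;
using Avro;
using Avro.Generic;
using Avro.IO;
using Confluent.Kafka;

// Hypothetical serializer that writes the Confluent wire format without talking to a registry.
public class NoRegistryAvroSerializer : ISerializer<GenericRecord>
{
    private readonly RecordSchema schema;
    private readonly int schemaId; // normally assigned by the registry; here agreed on manually

    public NoRegistryAvroSerializer(RecordSchema schema, int schemaId)
    {
        this.schema = schema;
        this.schemaId = schemaId;
    }

    public byte[] Serialize(GenericRecord data, SerializationContext context)
    {
        using (var stream = new MemoryStream())
        {
            stream.WriteByte(0x00); // magic byte
            var idBytes = BitConverter.GetBytes(System.Net.IPAddress.HostToNetworkOrder(schemaId));
            stream.Write(idBytes, 0, idBytes.Length); // 4-byte schema ID, big-endian

            var writer = new GenericWriter<GenericRecord>(schema);
            writer.Write(data, new BinaryEncoder(stream)); // Avro binary payload
            return stream.ToArray();
        }
    }
}

// Usage sketch: .SetValueSerializer(new NoRegistryAvroSerializer(schema, 1))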

Convert json file into string

I'm trying to capture the names and values of all nodes coming from a random JSON file whose structure I don't know (it is uploaded by the user).
I can loop through a JSON string and get the info I need, but how do I start from a file? If I want to deserialize it, I believe I need a class to hold that data (and I don't know the file structure).
Here's how I loop through elements from a json string:
string json = #"{
'CPU': 'Intel',
'PSU': '500W',
'Drives': [
'DVD read/writer'
/*(broken)*/,
'500 gigabyte hard drive',
'200 gigabype hard drive'
]
}";
JsonTextReader reader = new JsonTextReader(new StringReader(json));
while (reader.Read())
{
    if (reader.Value != null)
    {
        Console.WriteLine("Token: {0}, Value: {1}", reader.TokenType, reader.Value);
    }
    else
    {
        Console.WriteLine("Token: {0}", reader.TokenType);
    }
}
How do I read from a file and set it as a string that can be handled by the code above? It sounds like a basic question, but I'm still struggling with this after several hours. Everything I've seen expects you to know the structure of the JSON file, and I don't in this case.
This is the exact use case that dynamic and ExpandoObject were made for! Here you can deserialize the JSON to an object, then traverse the object's properties (look up online how to work with ExpandoObjects).
var expandoConverter = new ExpandoObjectConverter();
dynamic obj = JsonConvert.DeserializeObject<ExpandoObject>(json, expandoConverter);
Or, if you were just looking to read the JSON from disk as a string, then use string json = File.ReadAllText(filePathAndName);
The above code snippet requires installing the Newtonsoft.Json package.
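If you also need to walk the result without knowing its shape, here is a minimal sketch (assuming the usual using directives for System and System.Collections.Generic). With the ExpandoObjectConverter, nested objects come back as ExpandoObject and arrays as List<object>, so a recursive walk over IDictionary<string, object> covers everything:

// Recursively print every node name and value from the deserialized object.
static void Walk(object node, string path)
{
    if (node is IDictionary<string, object> dict)   // JSON object
    {
        foreach (var kvp in dict)
            Walk(kvp.Value, path + "/" + kvp.Key);
    }
    else if (node is IList<object> list)            // JSON array
    {
        for (int i = 0; i < list.Count; i++)
            Walk(list[i], path + "[" + i + "]");
    }
    else                                            // primitive value
    {
        Console.WriteLine("{0} = {1}", path, node);
    }
}

// Usage: Walk(obj, "");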
Erik's answer was great in mentioning ExpandoObject, dynamic, and the File call. It got my +1.
I'm adding to Erik's answer to include the package and the details required for a minimal running program.
This code was in my Program.cs file for a Console App project type in Visual Studio 2017. You should also run Install-Package Newtonsoft.Json in the Package Manager Console. This command will load the Newtonsoft.Json 11.0.1 package. The .NET Framework version was 4.6.1.
using Newtonsoft.Json;
using Newtonsoft.Json.Converters;
using System.Dynamic;
using System.IO;

namespace ReadJson
{
    class Program
    {
        static void Main(string[] args)
        {
            string filePath = "<full path to json file>";
            string json = File.ReadAllText(filePath);
            var expandoConverter = new ExpandoObjectConverter();
            dynamic obj = JsonConvert.DeserializeObject<ExpandoObject>(json, expandoConverter);
            // do more with your dynamic object here...
        }
    }
}

Newtonsoft Json.NET InvalidCastException

This is more of a tech support question, but the newtonsoft.com website says stackoverflow is the best place to ask questions.
I have a string that was serialized with 6.0.3, and fails to deserialize with 6.0.8. It throws InvalidCastException. When I downgrade to 6.0.3, it deserializes fine; when I upgrade again to 6.0.8, the exception is repeatable.
Edit:
Rather than post the actual string, which is 10KB long and contains sensitive information, I was able to create a simple reproducible test case that demonstrates the problem.
The line that throws exception is:
this.SomeStrings = (string[])infoEnum.Current.Value;
The InvalidCastException says "Unable to cast object of type 'Newtonsoft.Json.Linq.JObject' to type 'System.String[]'"
As mentioned in the comments below, I serialized an instance of FooClass with 6.0.3, and then hard-coded the string into the asdf() method and attempt to deserialize. Deserialization succeeds on 6.0.3, and fails with InvalidCastException on 6.0.8.
Obviously, in the trivial repro case below, there is no point in implementing ISerializable on FooClass, but in real life I need to use ISerializable in a complex data type that serializes and deserializes itself as a string array; the following is just to academically illustrate reproduction of the bug behavior.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Newtonsoft.Json;
using System.Runtime.Serialization;

namespace CBClasses
{
    [Serializable]
    public class FooClass : ISerializable
    {
        public string[] SomeStrings { get; set; }

        public FooClass() { }

        protected FooClass(SerializationInfo info, StreamingContext context)
        {
            if (info == null)
                throw new ArgumentNullException();
            SerializationInfoEnumerator infoEnum = info.GetEnumerator();
            while (infoEnum.MoveNext()) {
                SerializationEntry entry = infoEnum.Current;
                switch (entry.Name) {
                    case "SomeStrings":
                        this.SomeStrings = (string[])infoEnum.Current.Value;
                        break;
                    default:
                        throw new SerializationException("Deserialization failed on unhandled member '" + entry.Name + "'");
                }
            }
        }

        public virtual void GetObjectData(SerializationInfo info, StreamingContext context)
        {
            info.AddValue("SomeStrings", this.SomeStrings, this.SomeStrings.GetType());
        }
    }

    public class NewtonsoftDebugTest
    {
        private static JsonSerializerSettings settings = new JsonSerializerSettings() {
            TypeNameHandling = TypeNameHandling.All,
            Formatting = Formatting.Indented,
            PreserveReferencesHandling = PreserveReferencesHandling.Objects,
            ReferenceLoopHandling = ReferenceLoopHandling.Serialize
        };

        public static void asdf()
        {
            /* FooClass foo = new FooClass() { SomeStrings = new string[0] };
             * string serializedBy603 = JsonConvert.SerializeObject(foo, settings);
             * Resulted in:
             *
             * {
             *   "$id": "1",
             *   "$type": "CBClasses.FooClass, CBClasses",
             *   "SomeStrings": {
             *     "$type": "System.String[], mscorlib",
             *     "$values": []
             *   }
             * }
             *
             * Now hard-coded below:
             */
            string serializedBy603 =
                "{\n" +
                "  \"$id\": \"1\",\n" +
                "  \"$type\": \"CBClasses.FooClass, CBClasses\",\n" +
                "  \"SomeStrings\": {\n" +
                "    \"$type\": \"System.String[], mscorlib\",\n" +
                "    \"$values\": []\n" +
                "  }\n" +
                "}\n";

            FooClass deserialized = (FooClass)JsonConvert.DeserializeObject(serializedBy603, settings);
            System.Diagnostics.Debugger.Break();
        }
    }
}
I did some investigation into this and can confirm that the problem first appeared in version 6.0.7 and is still reproducible with the latest release (9.0.1 as of this writing). The change appears to have been made as part of the commit to "Support reference preservation for ISerializable objects" from November 4, 2014. According to the source code diffs, the following code in the CreateISerializable() method of the JsonSerializerInternalReader class was changed from this:
if (reader.TokenType == JsonToken.StartObject)
{
    // this will read any potential type names embedded in json
    object o = CreateObject(reader, null, null, null, contract, member, null);
    serializationInfo.AddValue(memberName, o);
}
else
{
    serializationInfo.AddValue(memberName, JToken.ReadFrom(reader));
}
to this:
serializationInfo.AddValue(memberName, JToken.ReadFrom(reader));
It seems pretty clear that the former code used to handle the embedded type metadata, whereas the replacement code does not. And, in fact, I can confirm that reverting this one section of code back to its original state fixes the problem. However, without knowing the intent of this change (maybe the metadata was supposed to be handled somewhere else in the code?), I can't recommend blindly reverting it, as that might break something else. The current serialization code still adds the type metadata as before (I get the same JSON as posted in the question), so this definitely seems to be a regression on the deserialization end. If you haven't already, you might want to report an issue on GitHub. (And yes, I do realize this question is over a year and a half old; I'm just trying to provide some closure here. ;-))
As a workaround, you can extract the string array from the SerializationEntry like this:
this.SomeStrings = ((JObject)entry.Value)["$values"].ToObject<string[]>();
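For completeness, here is a sketch of how that workaround might slot into the deserialization constructor from the question. The type check is my own addition so the same case would keep working on older versions where the entry still arrives as a string[], and it assumes a using Newtonsoft.Json.Linq; directive:

case "SomeStrings":
    // On 6.0.7+ the entry value is a JObject wrapping the type metadata;
    // on 6.0.3 and earlier it was already a string[].
    if (entry.Value is JObject jObj)
        this.SomeStrings = jObj["$values"].ToObject<string[]>();
    else
        this.SomeStrings = (string[])entry.Value;
    break;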

How to write a JSON file in C#?

I need to write the following data into a text file using JSON format in C#. The brackets are important for it to be valid JSON.
[
  {
    "Id": 1,
    "SSN": 123,
    "Message": "whatever"
  },
  {
    "Id": 2,
    "SSN": 125,
    "Message": "whatever"
  }
]
Here is my model class:
public class data
{
    public int Id { get; set; }
    public int SSN { get; set; }
    public string Message { get; set; }
}
Update 2020: It's been 7 years since I wrote this answer. It still seems to be getting a lot of attention. In 2013 Newtonsoft Json.Net was THE answer to this problem. It's still a good answer to this problem, but it's no longer the only viable option. To add some up-to-date caveats to this answer:
.NET Core now has the spookily similar System.Text.Json serializer (see below)
The days of the JavaScriptSerializer have thankfully passed and this class isn't even in .NET Core. This invalidates a lot of the comparisons run by Newtonsoft.
It's also recently come to my attention, via some vulnerability-scanning software we use at work, that Json.Net hasn't had an update in some time. Updates in 2020 have dried up, and the latest version, 12.0.3, is over a year old (as of 2021).
The speed tests (previously quoted below, but now removed as they are so out of date that they seem irrelevant) compared an older version of Json.Net (version 6.0, and like I said the latest is 12.0.3) with an outdated .NET Framework serializer.
One advantage the System.Text.Json serializer has over Newtonsoft is its support for async/await.
Are Json.Net's days numbered? It's still used a LOT and it's still used by MS libraries, so probably not. But this does feel like the beginning of the end for a library that may well have just run its course.
.NET Core 3.0+ and .NET 5+
A new kid on the block since writing this is System.Text.Json, which was added in .NET Core 3.0. Microsoft makes several claims about how this is now better than Newtonsoft, including that it is faster. I'd advise you to test this yourself.
Examples:
using System.Collections.Generic;
using System.IO;
using System.Text.Json;
using System.Text.Json.Serialization;

List<data> _data = new List<data>();
_data.Add(new data()
{
    Id = 1,
    SSN = 2,
    Message = "A Message"
});

string json = JsonSerializer.Serialize(_data);
File.WriteAllText(@"D:\path.json", json);
or
using System.Collections.Generic;
using System.IO;
using System.Text.Json;
using System.Text.Json.Serialization;

List<data> _data = new List<data>();
_data.Add(new data()
{
    Id = 1,
    SSN = 2,
    Message = "A Message"
});

await using FileStream createStream = File.Create(@"D:\path.json");
await JsonSerializer.SerializeAsync(createStream, _data);
Documentation
Newtonsoft Json.Net (.Net framework and .Net Core)
Another option is Json.Net, see example below:
List<data> _data = new List<data>();
_data.Add(new data()
{
    Id = 1,
    SSN = 2,
    Message = "A Message"
});

string json = JsonConvert.SerializeObject(_data.ToArray());
// write string to file
System.IO.File.WriteAllText(@"D:\path.txt", json);
Or the slightly more efficient version of the above code (doesn't use a string as a buffer):
// open file stream
using (StreamWriter file = File.CreateText(@"D:\path.txt"))
{
    JsonSerializer serializer = new JsonSerializer();
    // serialize object directly into file stream
    serializer.Serialize(file, _data);
}
Documentation: Serialize JSON to a file
The example in Liam's answer saves the file as a string on a single line. I prefer to add formatting: someone in the future may want to change some value manually in the file, and if you add formatting that is easier to do.
The following adds basic JSON indentation:
string json = JsonConvert.SerializeObject(_data.ToArray(), Formatting.Indented);
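For what it's worth, the System.Text.Json examples above can produce indented output as well; a small sketch using JsonSerializerOptions with the same _data list:

var options = new JsonSerializerOptions { WriteIndented = true };
string json = JsonSerializer.Serialize(_data, options);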
There is built-in functionality for this using the JavaScriptSerializer class (System.Web.Script.Serialization); note that Serialize is an instance method:
var json = new JavaScriptSerializer().Serialize(data);
var responseData = //Fetch Data
string jsonData = JsonConvert.SerializeObject(responseData, Formatting.None);
System.IO.File.WriteAllText(Server.MapPath("~/JsonData/jsondata.txt"), jsonData);

Parsing the JSON with multilingual text using System.Web.Script.Serialization or System.Web.* library

I had asked this question before. The solution using Newtonsoft worked great until I deployed the website on a web hosting server, where it's giving me nightmares. Therefore, I am planning to use the System.Web libraries so that I don't have to deal with *.dlls and such, and I can easily deploy my website.
Can someone help me parse the same JSON output using System.Web.Script.Serialization or any System library? Thanks a lot.
I hope your real JSON string is valid, since the one you posted is corrupted, but Json.Net tolerates it.
The only trick is deserializing to this funny type: List<Dictionary<string,object>>.
Here is an example with corrected JSON (removed the trailing commas):
string json = #"[ { ""ew"" : ""bharat"", ""hws"" : [ ""\u092D\u093E\u0930\u0924"",""\u092D\u0930\u0924"",""\u092D\u0930\u093E\u0924"",""\u092D\u093E\u0930\u093E\u0924"",""\u092C\u0939\u0930\u0924"" ] }, { ""ew"" : ""india"", ""hws"" : [ ""\u0907\u0902\u0921\u093F\u092F\u093E"",""\u0907\u0928\u094D\u0921\u093F\u092F\u093E"",""\u0907\u0923\u094D\u0921\u093F\u092F\u093E"",""\u0908\u0928\u094D\u0921\u093F\u092F\u093E"",""\u0907\u0928\u0921\u093F\u092F\u093E"" ] } ]";
var list = new JavaScriptSerializer().Deserialize<List<Dictionary<string, object>>>(json);

foreach (var dict in list)
{
    var ew = (string)dict["ew"];
    var firstValOfHws = ((ArrayList)dict["hws"])[0];
}
--EDIT--
OK, this should work:
var serializer = new DataContractJsonSerializer(typeof(List<Result>));
var result = (List<Result>)serializer.ReadObject(new MemoryStream(Encoding.UTF8.GetBytes(json)));

public class Result
{
    public string ew;
    public List<string> hws;
}
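A quick usage sketch for the result (assuming the same json variable as above and the usual System, System.IO, System.Text, and System.Runtime.Serialization.Json using directives):

foreach (var r in result)
{
    // r.ew is the "ew" value; r.hws is the list of "hws" strings
    Console.WriteLine(r.ew + " -> " + string.Join(", ", r.hws));
}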
