How to convert UpdateResult from MongoDB C# Driver to JSON

Providing some contextual information: I want to integrate some technology with MongoDB using the official C# driver, but due to some limitations I need to integrate it using only JSON strings. So, I'm building a simple wrapper around the native functions to call them by passing and receiving JSON as plain strings.
Example of the find method:
public string find(string json)
{
    BsonDocument query = BsonDocument.Parse(json);
    var list = Collection.Find(query).ToListAsync().Result;
    return list.ToJson();
}
P.S.: I know the performance implications of using an async method as a synchronous one, but I have no choice.
This works pretty well; the problem is with the update/replace method:
public string updateMany(string jsonFilter, string jsonUpdate)
{
    BsonDocument filter = BsonDocument.Parse(jsonFilter);
    BsonDocument update = BsonDocument.Parse(jsonUpdate);
    UpdateResult r = Collection.UpdateManyAsync(filter, update).Result;
    return r.ToJson();
}
This returns the string { "_t" : "Acknowledged" }, which only tells me the class of the UpdateManyAsync() result. That class exposes some properties like MatchedCount and ModifiedCount that I would like to put in the JSON too, but the default serializer is ignoring them for some reason (those properties are read-only, so they should be ignored when deserializing, but not when serializing).
I tried to use r.ToJson<UpdateResult>(); and r.ToBsonDocument<UpdateResult>();, but got the same result.
I saw that ToJson() has some overloads that accept a JsonWriterSettings, an IBsonSerializer and a BsonSerializationArgs, so maybe one of them holds a relevant setting, but I had no luck searching for it.
I think that I could import the Json.NET DLL to see if it serializes all the properties, but I would like to solve this without another dependency.
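A workaround I'm considering, sketched below (untested): copy the interesting UpdateResult properties into a BsonDocument by hand and serialize that instead.
public string updateMany(string jsonFilter, string jsonUpdate)
{
    BsonDocument filter = BsonDocument.Parse(jsonFilter);
    BsonDocument update = BsonDocument.Parse(jsonUpdate);
    UpdateResult r = Collection.UpdateManyAsync(filter, update).Result;

    // Build the result document manually from the read-only properties.
    var result = new BsonDocument
    {
        { "acknowledged", r.IsAcknowledged },
        { "matchedCount", r.IsAcknowledged ? r.MatchedCount : -1 },
        { "modifiedCount", r.IsAcknowledged && r.IsModifiedCountAvailable ? r.ModifiedCount : -1 }
    };
    return result.ToJson();
}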

Related

How to cast Azure DocumentDB Document class to my POCO class?

Is there a way to cast the Microsoft.Azure.Documents.Document object to my class type?
I've written an Azure Function with a CosmosDBTrigger. The trigger receives an array of Microsoft.Azure.Documents.Document. I like having that Document class so that I can access the metadata about the record itself, but I would also like to interact with my data through my own class type in a statically typed way.
I see the JSON representation of my data when I call ToString. Should I manually convert that JSON to my class type using Newtonsoft?
If you need to map your Document to your POCO in the function then the easiest way to do that is what you suggested.
Call the document.Resource.ToString() method and use DeserializeObject from JSON.NET or the JSON library you prefer. JSON.NET is recommended, however, as Microsoft's CosmosDB libraries use it as well.
Your mapping call will look like this:
var yourPoco = JsonConvert.DeserializeObject<YourPocoType>(document.Resource.ToString());
While the solution offered by Nick Chapsas works, I would like to offer a few better options.
Preferred solution - improve your model
First, if you are interested in the extra meta fields, you can always include the chosen properties in your data access model and they will be filled in. For example:
public class Model
{
    public String id { get; set; }
    public String _etag { get; set; }
    // etc.
}
Then you can use the existing API for deserializing, which is explicit and familiar to all. For example:
var explicitResult = await client.ReadDocumentAsync<Model>(documentUri);
Model explicitModel = explicitResult.Document;
If you want the next-layer model (e.g. a domain model) to NOT have those storage-specific meta fields, then you need to transform to another model, but that is no longer a CosmosDB-level issue and there are plenty of generic mappers to convert between POCOs.
This is, IMHO, the cleanest and recommended way to handle data access in CosmosDB if you work with strongly typed document models.
Alternative: dynamic
Another trick is to use dynamic as the intermediate casting step. This is short and elegant in a way, but personally using dynamic always feels a bit dirty:
var documentResult = await client.ReadDocumentAsync(documentUri);
Model dynamicModel = (dynamic)documentResult.Resource;
Alternative: read JObject
Another alternative is to read the document as Newtonsoft's JObject. This would also include all the meta fields, and you could cast it further yourself without all the extra hopping between string representations. Example:
var jObjectResult = await client.ReadDocumentAsync<JObject>(documentUri);
Model JObjectResult = jObjectResult.Document.ToObject<Model>();
Alternative: Document + JObject at the same time
Should you really want to avoid the document-level meta fields in your model AND still access them, you could use a little reflection trick to get the JObject from the Document instance:
var documentResult = await client.ReadDocumentAsync(documentUri);
Document documentModel = documentResult.Resource;
var propertyBagMember = documentResult.Resource.GetType()
    .GetField("propertyBag", BindingFlags.NonPublic | BindingFlags.Instance);
Model reflectionModel = ((JObject)propertyBagMember.GetValue(documentResult.Resource))
    .ToObject<Model>();
Beware that the reflection trick is relying on the internal implementation details and it is not subject to backwards compatibility guarantees by library authors.
You can simply call .ToString() on the Microsoft.Azure.Documents.Document class.
This class inherits from the Microsoft.Azure.Documents.JsonSerializable class that overrides the .ToString() method.
Below is an example of deserializing the Document class to my Car.cs POCO using the new high-performance System.Text.Json namespace:
Car car = JsonSerializer.Deserialize<Car>(document.ToString());

How do I serialize properties of type JToken or JObject in Elasticsearch NEST?

I'm introducing Elasticsearch into a C# API project. I'd like to leverage existing API models as search documents, many of which allow for adding custom data points. These are implemented using the JObject type from Json.NET. For example:
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public JObject ExtraProps { get; set; }
}
This allows users to send JSON request bodies like this, which works great:
{
    "Id": 123,
    "Name": "Thing",
    "ExtraProps": {
        "Color": "red",
        "Size": "large"
    }
}
However, if I use this as a document type in NEST, those extra properties are losing their values somehow, serializing as:
{
    "Id": 123,
    "Name": "Thing",
    "ExtraProps": {
        "Color": [],
        "Size": []
    }
}
Adding a [Nest.Object] attribute to ExtraProps didn't change the behavior. As I understand it, NEST uses Json.NET internally, so I wouldn't expect it to have problems with Json.NET types. Is there a relatively simple fix for this?
Here are some options I'm weighing:
1. Use custom serialization. I started down this path, it got to feeling way more complicated than it should be, and I never did get it working.
2. Map JObjects to Dictionary<string, object>s. I have verified this works, but if there are nested objects (which there could be), I'll need to enhance it with recursion (see the sketch after this list). And, ideally, I'd like this to work with the more general JToken type. This is the option I'm leaning toward, but again, it feels more complicated than it should be.
3. Use the "Low Level" client or even raw HTTP calls. Admittedly I haven't explored this, but if it's really simpler/cleaner than the alternatives, I'm open to it.
4. Report this as a bug. I'll probably do this regardless. I just have a hunch this should work with JObject or any JToken out of the box, unless there is some reason that this is intended behavior.
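For reference, here is the kind of recursive mapping I have in mind for option 2 (a rough sketch, not production code; it needs System.Linq and Newtonsoft.Json.Linq):
// Recursively convert a JToken into plain CLR types: Dictionary<string, object>
// for objects, List<object> for arrays, and the raw value for everything else.
public static object ToPlainObject(JToken token)
{
    switch (token.Type)
    {
        case JTokenType.Object:
            return token.Children<JProperty>()
                        .ToDictionary(p => p.Name, p => ToPlainObject(p.Value));
        case JTokenType.Array:
            return token.Select(ToPlainObject).ToList();
        default:
            return ((JValue)token).Value; // string, long, bool, null, ...
    }
}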
This is expected behaviour with NEST 6.x.
NEST uses Json.NET for serialization. In NEST 6.x, however, this dependency was internalized within the NEST assembly by:
- IL-merging all Json.NET types into the NEST assembly
- renamespacing the types within Newtonsoft.Json to Nest.Json
- marking all types internal
There's a blog post with further details explaining the motivations behind this change.
When it comes to handling Json.NET types such as Newtonsoft.Json.Linq.JObject, Json.NET has special handling for these types for serialization/deserialization. With NEST 6.x, the internalized Json.NET does not know how to specially handle Newtonsoft.Json.Linq.JObject because all types within the internalized Json.NET have been renamespaced to the Nest.Json namespace.
To support Json.NET types, a serializer that uses Json.NET to serialize your documents needs to be hooked up. The NEST.JsonNetSerializer NuGet package was created to help with this. Simply add a reference to NEST.JsonNetSerializer in your project, then hook up the serializer as follows:
// choose the appropriate IConnectionPool for your use case
var pool = new SingleNodeConnectionPool(new Uri("http://localhost:9200"));
var connectionSettings =
    new ConnectionSettings(pool, JsonNetSerializer.Default);
var client = new ElasticClient(connectionSettings);
With this in place, documents with JObject properties will be serialized as expected.
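For example, with the serializer above in place, indexing the Product document from the question should keep the nested JObject values intact (the "products" index name here is just an illustration):
var product = new Product
{
    Id = 123,
    Name = "Thing",
    ExtraProps = JObject.FromObject(new { Color = "red", Size = "large" })
};

// index into an example "products" index
var indexResponse = client.Index(product, i => i.Index("products"));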

Send raw bulk index query in NEST or serialize ONLY the document in a specific way

I need to enable Objects TypeNameHandling for the JsonSerializer that is used for the bulk index query. However, when I change the serializer settings for NEST, the bulk query as a whole is serialized incorrectly.
The serializer I used:
public class SearchJsonNetSerializer : JsonNetSerializer
{
    public SearchJsonNetSerializer(IConnectionSettingsValues settings)
        : base(settings)
    {
    }

    protected override void ModifyJsonSerializerSettings(JsonSerializerSettings settings)
    {
        settings.Formatting = Formatting.None;
        settings.TypeNameHandling = TypeNameHandling.Objects;
    }
}
The output I got:
{"index":{"$type":"Nest.BulkIndexOperation`1[[TestProject.TestDTO, TestProject]], Nest","_type":"testdto","_id":"146949756709543936"}}
{"$type":"TestProject.TestDTO, TestProject","Id":146949756709543936,"Title":"test","TitleRaw":"test"}
The second line is correct; however, NEST used the serializer settings to serialize the initial line as well, in a way that totally destroys the request.
Is there a way to apply the changed serialization only to the actual object? If not, is there a way to send a raw, prepared JSON string as a request for a bulk query? I've seen that functionality in older versions, but in the current one (2.0) I just can't find a way to do it...
This is related to https://github.com/elastic/elasticsearch-net/issues/1155
Sadly you cannot do the following in JSON.NET:
[JsonObject(TypeNameHandling = TypeNameHandling.Objects)]
public class MyPoco {}
That would solve the issue at hand by enabling that TypeNameHandling only for your specific types. Sadly, it can only be specified on properties. It would make a great feature request there.
You have two options: either write a custom serializer for your type, or pre-serialize the documents and send them using the low-level client, but then you need to add the metadata lines manually as well.
var client = new ElasticClient().LowLevel.Bulk<BulkResponse>("index", "type", new[]
{
    "",
});
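If you go the pre-serialization route, a rough sketch could look like the following (the TestDTO shape, the index/type names and the metadata line are assumptions on my part); TypeNameHandling is applied only to the document line, while the action/metadata line stays plain JSON:
var client = new ElasticClient();
var doc = new TestDTO { Id = 146949756709543936, Title = "test", TitleRaw = "test" };

// settings used only for the document line
var typedSettings = new JsonSerializerSettings { TypeNameHandling = TypeNameHandling.Objects };

var bulkResponse = client.LowLevel.Bulk<BulkResponse>("index", "testdto", new[]
{
    // action/metadata line: no $type here
    JsonConvert.SerializeObject(new { index = new { _id = doc.Id.ToString() } }),
    // document line: $type included via TypeNameHandling
    JsonConvert.SerializeObject(doc, typedSettings),
});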
NEST does provide several ways to get true covariant search results without having to index $type:
https://www.elastic.co/guide/en/elasticsearch/client/net-api/current/covariant-search-results.html

Best way to convert C# Class to JSON representation

I've got C# objects that I need to reference in JavaScript for default population. Currently I'm maintaining two different objects, which is not very maintainable.
For example (simplified for demo purposes):
C#
public class Text
{
    public string Name { get; set; }
}
JSON
{
    'text': {
        name: undefined
    }
}
I know there is a number of ways to accomplish this but wondering if anyone has a recommended solution. Thanks!
I personally recommend Json.NET. Getting the JSON of any object is as simple as:
using Newtonsoft.Json;
string json = JsonConvert.SerializeObject(new Text { Name = "test" });
There are a lot of other options, but I've been using it since before there was JSON serialization support in .NET and I strongly prefer it over what is there now. In fact, I think it's better in every way: if you want a big, robust data layer I like it more, and it's vastly superior for one-off serializations.
If you are using .NET 4.0 or above, you can use the DataContractJsonSerializer class.
I recommend you look at this benchmark: http://theburningmonk.com/2013/09/binary-and-json-serializer-benchmarks-updated/
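For illustration, here is a minimal DataContractJsonSerializer sketch reusing the Text class from the question (the method name and usage line are just examples):
// requires System.Runtime.Serialization.Json, System.IO and System.Text
public static string ToJson(Text value)
{
    var serializer = new DataContractJsonSerializer(typeof(Text));
    using (var ms = new MemoryStream())
    {
        serializer.WriteObject(ms, value);
        return Encoding.UTF8.GetString(ms.ToArray());
    }
}

// string json = ToJson(new Text { Name = "test" });  // {"Name":"test"}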

Build a WCF service with a generic type

I want to build a service that will pass data read from the database to the client in JSON format. I don't know the table schema or the types.
I thought about implementing the WCF service over a Dictionary, but the JSON it produces is very complicated and contains objects like "key = ...; value = ..." when I want just "key=value", and I need to return a list of Dictionary objects. Sometimes I will receive a comma-separated array from the database, so I will insert into my Dictionary a key whose value is a new Dictionary.
My boss said that in PHP this can be done with associative arrays. Please help me with some ideas or links, because I don't know where to start looking.
If there is something that you didn't understand, please comment and I will try another explanation.
Edits:
I need it to be a REST service, so JSON is mandatory.
How can I load data from the table? What type can I use?
Edit #2: This is what I want to get: CorectJSON
Edit #3: This is my current JSON:
stdClass Object
(
    [areaGetStreetTypesResult] => stdClass Object
        (
            [responseMessage] => [{"name":"IMPASSE","street_type":"IMP"}{"name":"LOTISSEMENT","street_type":"LOT"}{"name":"ROUTE","street_type":"RTE"}{"name":"RUE","street_type":"RUE"}]
            [response_status] => stdClass Object
                (
                    [message] => Success : JSON created into the responseMessage variable !
                    [status] => 0
                )
        )
)
It is missing the commas in between, so it cannot be decoded by PHP. What should I do?
This is my method Code
I think that doing everything as a dictionary in a web service API is bad practice, and I hate when I need to work with APIs like this. If it is WCF, it produces WSDL, and WSDL describes the data going in and out, so if everything is a dictionary, the WSDL cannot provide anything meaningful and your data contracts tell you nothing about the data.
If you simply need to forward database data through a web service, WCF has Data Services (http://msdn.microsoft.com/en-us/data/bb931106), although I think you should create an API that fits your business needs and is not a simple proxy between the database and your client.
What is the reason why you need to pass JSON? If you want to create a WCF REST service, it is sufficient to tell WCF to create JSON messages as described here: http://www.codeproject.com/Articles/327420/WCF-REST-Service-with-JSON
If you access the service from a C# application, you don't need to care about how data is passed back and forth. Just take "normal" method parameters and use return values like you'd do locally and you're set.
Example:
string[] GetResultStrings(List<Rectangle> sourceRectangles);
If you really need to pass JSON strings, just pass strings and use the JSON serializer and deserializer to encode the reply and decode the parameters.
For example:
string GetJSONString(string jsonRequest);
The following information may help on using the JSON serializer and deserializer: http://www.codeproject.com/Articles/272335/JSON-Serialization-and-Deserialization-in-ASP-NET
EDIT
I'm using the following method to serialize serializable objects to JSON:
public static string SerializeJSON(this object obj)
{
    DataContractJsonSerializer serializer = new DataContractJsonSerializer(obj.GetType());
    using (MemoryStream ms = new MemoryStream())
    {
        serializer.WriteObject(ms, obj);
        return Encoding.UTF8.GetString(ms.ToArray());
    }
}
This works just fine for any DataContract class like:
[DataContract]
public class MyJSONReturnableClass
{
    [DataMember]
    public string ThisBecomesANamedString;

    [DataMember]
    public MyJSONReturnableClass[] AndWorksAlsoForNestedArrays;
}
Populate your dictionary, then serialize it to JSON.
Pass it to your client using WCF or RabbitMQ...
JsonConvert.SerializeObject(yourDict);
Download the Newtonsoft.Json package and add the using directive:
using Newtonsoft.Json;
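For the nested-dictionary case from the question, a quick sketch of what that could look like (the field names beyond those shown in the question are illustrative):
// a row where one column held a comma-separated array becomes a nested dictionary
var row = new Dictionary<string, object>
{
    { "name", "IMPASSE" },
    { "street_type", "IMP" },
    { "extra", new Dictionary<string, object> { { "0", "a" }, { "1", "b" } } }
};

var rows = new List<Dictionary<string, object>> { row };
string json = JsonConvert.SerializeObject(rows);
// [{"name":"IMPASSE","street_type":"IMP","extra":{"0":"a","1":"b"}}]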
