I'm trying to deserialize a list of heavy objects from a JSON file. I don't want to deserialize it the classic way, directly into a list, because that would expose me to an OutOfMemory exception. So I'm looking for a way to handle the objects one by one, storing each in the database as I go, so as to stay memory safe.
I already handle the serialization and it works well, but I'm facing some difficulties with deserialization.
Any ideas?
Thanks in advance
// Serialization
using (var fileStream = new FileStream(DirPath + "/TPV.Json", FileMode.Create))
{
    using (var sw = new StreamWriter(fileStream))
    using (var jw = new JsonTextWriter(sw))
    {
        jw.WriteStartArray();
        // Create the serializer once rather than on every iteration.
        var ser = new JsonSerializer();
        using (var _Database = new InspectionBatimentsDataContext(TheBrain.DBClient.ConnectionString))
        {
            foreach (var TPVId in TPVIds)
            {
                // Operators.ConditionalCompareObjectEqual comes from
                // Microsoft.VisualBasic.CompilerServices (code translated from VB).
                var pic = (from p in _Database.TPV
                           where Operators.ConditionalCompareObjectEqual(p.Release, TPVId.Release, false)
                               & Operators.ConditionalCompareObjectEqual(p.InterventionId, TPVId.InterventionId, false)
                           select p).FirstOrDefault(); // note: FirstOrDefault needs the () call
                ser.Serialize(jw, pic);
                jw.Flush(); // keep memory flat by flushing each object to disk
            }
        }
        jw.WriteEndArray();
    }
}
I finally found a way to do it by using a custom separator between each object during serialization. Then, for deserialization, I simply read the JSON file as a string until I find my custom separator, deserialize the string read so far, and repeat in a loop. It's not the perfect answer because I'm breaking the JSON format in my files, but that's not a constraint in my case.
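For completeness, the array can also be read back one object at a time without breaking the JSON format. Here is a minimal sketch using JsonTextReader; the TPV type and the SaveToDatabase helper are assumptions for illustration:
using (var fileStream = new FileStream(DirPath + "/TPV.Json", FileMode.Open))
using (var sr = new StreamReader(fileStream))
using (var jr = new JsonTextReader(sr))
{
    var serializer = new JsonSerializer();
    while (jr.Read())
    {
        // The array tokens are skipped; each object is materialized alone.
        if (jr.TokenType == JsonToken.StartObject)
        {
            var pic = serializer.Deserialize<TPV>(jr);
            SaveToDatabase(pic); // hypothetical helper: persist, then let the object be collected
        }
    }
}
Because Deserialize consumes exactly one object per StartObject token, memory usage stays flat no matter how long the array is.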
I'm working on a project to integrate with Yahoo, and they require that each object in the list in the JSON file I send be on its own row, with no enclosing "[]" and no commas between objects:
{"urn": "e12b6e8135d73...","att": {"id": "MyId","val": 2607}}
{"urn": "6c6355c27642e...","att": {"id": "MyId","val": 2607}}
{"urn": "d415821e454c7...","att": {"id": "MyId","val": 2607}}
Because these files contain millions of objects, I'm using a TextWriter and JsonSerializer to build the file.
using (TextWriter writer = File.CreateText(dataFileLocation))
{
var serializer = new JsonSerializer();
serializer.Serialize(writer, data);
}
I know I could do a character replacement and add a newline after each object once the file is built, but I feel that would be more work than needed and wouldn't be very flexible for any changes that may come down the road.
I'm using Newtonsoft for my JSON manipulation, but I can use something else if it has a formatter for this specific format. I've looked for other options but couldn't find anything, so either I'm blind or this has to be done in a custom way.
Looks like it was a little easier than I thought, and this just hit me... The answer is:
using (TextWriter writer = File.CreateText(dataFileLocation))
{
    var serializer = new JsonSerializer();
    foreach (var item in data)
    {
        serializer.Serialize(writer, item);
        writer.WriteLine();
    }
}
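Worth noting: each Serialize call emits a single compact line only because Json.NET's default is Formatting.None. If you want that guarantee to be explicit rather than implicit (a small hedge against a global default being changed elsewhere), the serializer can be constructed like this:
var serializer = new JsonSerializer { Formatting = Formatting.None }; // compact, one line per object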
I need to deserialize an XML response from an external service containing more than 100,000 rows, but I have a problem unescaping numeric character references in various places. Since the DOM is complex, I need a global solution that applies to the whole document, not to specific elements. I have the following situation:
<text>First &amp;#38; Second</text>
I use the following XmlSerializer implementation:
public T DeserializeXmlReader<T>(string path) where T : class
{
    XmlSerializer serializer = new XmlSerializer(typeof(T));
    using (FileStream fileStream = new FileStream(path, FileMode.Open))
    using (XmlReader xmlReader = XmlReader.Create(fileStream))
    {
        return (T)serializer.Deserialize(xmlReader);
    }
}
After deserialization, I get the result "First &#38; Second" instead of "First & Second". Is there an additional step I need to take to get the "&" deserialized correctly?
Note: After doing some research, I believe the problem might be similar to this one, but I'm not sure if it's applicable since it concerns PHP: How to deserialize a xml string along with NCR unescaping?
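One hedged approach (a sketch under the assumption that the double escaping follows the exact pattern "&amp;#...;"): pre-process the raw XML so the doubly-escaped numeric references become ordinary ones before the XmlReader parses them. For files too large to read at once, the same replacement could be applied inside a streaming TextReader instead.
public T DeserializeXmlReaderUnescaped<T>(string path) where T : class
{
    // Collapse the double escaping so "&amp;#38;" is parsed as "&".
    string raw = File.ReadAllText(path);
    string fixedXml = raw.Replace("&amp;#", "&#");

    XmlSerializer serializer = new XmlSerializer(typeof(T));
    using (var stringReader = new StringReader(fixedXml))
    using (XmlReader xmlReader = XmlReader.Create(stringReader))
    {
        return (T)serializer.Deserialize(xmlReader);
    }
}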
I'm trying to convert a huge JSON file (2 GB) to an XML file, and I'm having trouble reading the huge JSON file.
I've been researching how to read huge JSON files.
I found these:
Out of memory exception while loading large json file from disk
How to parse huge JSON file as stream in Json.NET?
Parsing large json file in .NET
It may seem that I'm duplicating my question, but I have some troubles that aren't solved in those posts.
So, I need to load the huge JSON file, and the community proposes something like this:
using (StreamReader sr = new StreamReader("foo.json"))
using (JsonTextReader reader = new JsonTextReader(sr))
{
    var serializer = new JsonSerializer();
    reader.SupportMultipleContent = true;
    while (reader.Read())
    {
        if (reader.TokenType == JsonToken.StartObject)
        {
            // Deserialize each object from the stream individually and process it
            var o = serializer.Deserialize<MyObject>(reader);
            // Do something with the object
        }
    }
}
So we can read the file in parts and deserialize the objects one by one.
I'll show you my code:
JsonSerializer serializer = new JsonSerializer();
string hugeJson = "hugJSON.json";
using (FileStream s = File.Open(hugeJson, FileMode.Open))
{
    using (StreamReader sr = new StreamReader(s))
    using (JsonReader reader = new JsonTextReader(sr))
    {
        reader.SupportMultipleContent = true;
        while (reader.Read())
        {
            if (reader.TokenType == JsonToken.StartObject)
            {
                var jsonObject = serializer.Deserialize(reader);
                string xmlString = "";
                XmlDocument doc = JsonConvert.DeserializeXmlNode(jsonObject.ToString(), "json");
                using (var stringWriter = new StringWriter())
                using (var xmlTextWriter = XmlWriter.Create(stringWriter))
                {
                    doc.WriteTo(xmlTextWriter);
                    xmlTextWriter.Flush();
                    xmlString = stringWriter.GetStringBuilder().ToString();
                }
            }
        }
    }
}
But when I try doc.WriteTo(xmlTextWriter), I get "Exception of type System.OutOfMemoryException was thrown".
I've been trying with BufferedStream. That class lets me manage big files, but I have another problem:
I'm reading in byte[] format, and when I convert to string the JSON gets split, so I can't parse it to an XML file because characters are missing;
for example:
{ foo:[{
foo:something,
foo1:something,
foo2:something
},
{
foo:something,
foo:som
it gets cut off.
Is there any way to read a huge JSON file and convert it to XML without loading the JSON in parts? Or I could load and convert it in parts, but I don't know how to do that.
Any ideas?
UPDATE:
I have been trying with this code:
static void Main(string[] args)
{
    string json = "";
    string xmlString = ""; // was missing from the original snippet
    string pathJson = "foo.json";
    // Read the file
    string temp = "";
    using (FileStream fs = new FileStream(pathJson, FileMode.Open))
    using (BufferedStream bf = new BufferedStream(fs))
    {
        byte[] array = new byte[70000];
        int bytesRead;
        while ((bytesRead = bf.Read(array, 0, array.Length)) != 0)
        {
            // Decode only the bytes actually read; note that a fixed-size
            // buffer can still split a multi-byte UTF-8 character at a
            // chunk boundary.
            json = Encoding.UTF8.GetString(array, 0, bytesRead);
            temp = String.Concat(temp, json);
        }
    }
    XmlDocument doc = JsonConvert.DeserializeXmlNode(temp, "json");
    using (var stringWriter = new StringWriter())
    using (var xmlTextWriter = XmlWriter.Create(stringWriter))
    {
        doc.WriteTo(xmlTextWriter);
        xmlTextWriter.Flush();
        xmlString = stringWriter.GetStringBuilder().ToString();
    }
    File.WriteAllText("outputPath", xmlString);
}
This code converts a JSON file to an XML file, but when I try it on a big JSON file (2 GB), I can't: the process takes a lot of time, and a string doesn't have the capacity to store all the JSON. How can I store it? Is there any way to do this conversion without using the string data type?
UPDATE:
The json format is:
[{
    "key": [some things],
    "data": [some things],
    "data1": [A LOT OF ENTRIES],
    "data2": [A LOT OF ENTRIES],
    "data3": [some things],
    "data4": [some things]
}]
Out-of-memory exceptions in .Net can be caused by several problems including:
Allocating too much total memory.
If this might be happening, check whether you are running in 64-bit mode as described here. If not, rebuild in 64-bit mode as described here and re-test (a quick runtime check is sketched just after this list).
Allocating too many objects on the large object heap causing memory fragmentation.
Allocating a single object that is larger than the .Net object size limit.
Failing to dispose of unmanaged memory (not applicable here).
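A trivial way to verify the first point at runtime, using only standard APIs:
// True in a 64-bit process; false means the ~2 GB address-space limit applies.
Console.WriteLine(Environment.Is64BitProcess);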
In your case, you may be trying to allocate too much total memory but are definitely allocating three very large objects: the in-memory temp JSON string, the in-memory xmlString XML string and the in-memory stringWriter.
You can substantially reduce your memory footprint and completely eliminate these objects by constructing an XDocument or XmlDocument directly via a streaming translation from the JSON file. Then afterward, write the document directly to the XML file using XDocument.Save() or XmlDocument.Save().
To do this, you will need to allocate your own XmlNodeConverter, then construct a JsonSerializer using it and deserialize as shown in Deserialize JSON from a file. The following method(s) do the trick:
public static partial class JsonExtensions
{
    public static XDocument LoadXNode(string pathJson, string deserializeRootElementName)
    {
        using (var stream = File.OpenRead(pathJson))
            return LoadXNode(stream, deserializeRootElementName);
    }

    public static XDocument LoadXNode(Stream stream, string deserializeRootElementName)
    {
        // Let caller dispose the underlying streams.
        using (var textReader = new StreamReader(stream, Encoding.UTF8, true, 1024, true))
            return LoadXNode(textReader, deserializeRootElementName);
    }

    public static XDocument LoadXNode(TextReader textReader, string deserializeRootElementName)
    {
        var settings = new JsonSerializerSettings
        {
            Converters = { new XmlNodeConverter { DeserializeRootElementName = deserializeRootElementName } },
        };
        using (var jsonReader = new JsonTextReader(textReader) { CloseInput = false })
            return JsonSerializer.CreateDefault(settings).Deserialize<XDocument>(jsonReader);
    }

    public static void StreamJsonToXml(string pathJson, string pathXml, string deserializeRootElementName, SaveOptions saveOptions = SaveOptions.None)
    {
        var doc = LoadXNode(pathJson, deserializeRootElementName);
        doc.Save(pathXml, saveOptions);
    }
}
Then use them as follows:
JsonExtensions.StreamJsonToXml(pathJson, outputPath, "json");
Here I am using XDocument instead of XmlDocument because I believe (but have not checked personally) that it uses less memory, e.g. as reported in Some hard numbers about XmlDocument, XDocument and XmlReader (x86 versus x64) by Ken Lassesen.
This approach eliminates the three large objects mentioned previously and substantially reduces the chance of running out of memory due to problems #2 or #3.
Demo fiddle here.
If you are still running out of memory even after ensuring you are running in 64-bit mode and streaming directly from and to your file(s) using the methods above, then it may simply be that your XML is too large to fit in your computer's virtual memory space using XDocument or XmlDocument. If that is so, you will need to adopt a pure streaming solution that transforms from JSON to XML on the fly as it streams. Unfortunately, Json.NET does not provide this functionality out of the box, so you will need a more complex solution.
So, what are your options?
You could fork your own version of XmlNodeConverter.cs and rewrite ReadElement(JsonReader reader, IXmlDocument document, IXmlNode currentNode, string propertyName, XmlNamespaceManager manager) to write directly to an XmlWriter instead of an IXmlDocument.
While probably doable with a couple of days' effort, the difficulty would seem to exceed that of a single Stack Overflow answer.
You could use the reader returned by JsonReaderWriterFactory to translate JSON to XML on the fly, and pass that reader directly to XmlWriter.WriteNode(XmlReader). The readers and writers returned by this factory are used internally by DataContractJsonSerializer but can be used directly as well.
If your JSON has a fixed schema (which is unclear from your question) you have many more straightforward options. Incrementally deserializing to some C# data model as shown in Parsing large json file in .NET and re-serializing that model to XML is likely to use much less memory than loading into some generic DOM such as XDocument; a sketch of this appears below, after the transformation links.
Option #2 can be implemented very simply, as follows:
using (var stream = File.OpenRead(pathJson))
using (var jsonReader = JsonReaderWriterFactory.CreateJsonReader(stream, XmlDictionaryReaderQuotas.Max))
using (var xmlWriter = XmlWriter.Create(outputPath))
{
    xmlWriter.WriteNode(jsonReader, true);
}
However, the XML thereby produced is much less pretty than the XML generated by XmlNodeConverter. For instance, given the simple input JSON
{"Root":[{
"key":["a"],
"data": [1, 2]
}]}
XmlNodeConverter will create the following XML:
<json>
  <Root>
    <key>a</key>
    <data>1</data>
    <data>2</data>
  </Root>
</json>
While JsonReaderWriterFactory will create the following (indented for clarity):
<root type="object">
<Root type="array">
<item type="object">
<key type="array">
<item type="string">a</item>
</key>
<data type="array">
<item type="number">1</item>
<item type="number">2</item>
</data>
</item>
</Root>
</root>
The exact format of the XML generated can be found in Mapping Between JSON and XML.
Still, once you have valid XML, there are streaming XML-to-XML transformation solutions that will allow you to transform the generated XML to your final, desired format, including:
C# XSLT Transforming Large XML Files Quickly.
How to: Perform Streaming Transform of Large XML Documents (C#).
Combining the XmlReader and XmlWriter classes for simple streaming transformations.
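If the JSON does turn out to have a fixed schema, the incremental option #3 mentioned above can be sketched roughly as follows. The MyObject model, its Key property, and the element names are hypothetical placeholders; the point is that each array entry is deserialized and written out individually, so neither document is ever fully in memory:
// Hypothetical fixed-schema model for one array entry.
public class MyObject
{
    public string Key { get; set; }
}

public static void StreamJsonArrayToXml(string pathJson, string pathXml)
{
    var serializer = JsonSerializer.CreateDefault();
    using (var jsonReader = new JsonTextReader(new StreamReader(pathJson)))
    using (var xmlWriter = XmlWriter.Create(pathXml))
    {
        xmlWriter.WriteStartElement("json"); // single root element
        while (jsonReader.Read())
        {
            if (jsonReader.TokenType == JsonToken.StartObject)
            {
                var item = serializer.Deserialize<MyObject>(jsonReader);
                xmlWriter.WriteStartElement("item");
                xmlWriter.WriteElementString("key", item.Key);
                xmlWriter.WriteEndElement();
            }
        }
        xmlWriter.WriteEndElement();
    }
}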
Is it possible to go the other way?
Unfortunately,
JsonReaderWriterFactory.CreateJsonWriter(stream).WriteNode(xmlReader, true);
isn't really suited to converting arbitrary XML to JSON, as it only allows conversion of XML with the precise schema specified by Mapping Between JSON and XML.
Furthermore, when converting from arbitrary XML to JSON the problem of array recognition exists: JSON has arrays, XML doesn't, it only has repeating elements. To recognize repeating elements (or tuples of elements where identically named elements may not be adjacent) and convert them to JSON array(s) requires buffering either the XML input or the JSON output (or a complex two-pass algorithm). Mapping Between JSON and XML avoids the problem by requiring type="object" or type="array" attributes.
I have tried to serialize data using XmlSerializer. I found a very useful post: XML Serializable Generic Dictionary.
But in fact I need to put the result of the serialization not into a file but into a string variable. How can I do this?
Instead of a StreamWriter that points to a file, you can use the StringWriter class.
using (StringWriter writer = new StringWriter())
{
    XmlSerializer serializer = new XmlSerializer(typeof(YourType));
    serializer.Serialize(writer, yourObject);
    string xml = writer.ToString(); // the serialized XML as a string
}
The XmlWriter.Create() method also has an overload that takes a StringBuilder; try using that.
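A minimal sketch of that overload, reusing the hypothetical YourType and yourObject from above:
// Alternative: let XmlWriter.Create(StringBuilder) collect the output.
var sb = new StringBuilder();
using (XmlWriter xmlWriter = XmlWriter.Create(sb))
{
    new XmlSerializer(typeof(YourType)).Serialize(xmlWriter, yourObject);
}
string xml = sb.ToString();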