Hashtable serialization without BinaryFormatter? - c#

I used BinaryFormatter, which is now obsolete, to (de)serialize Hashtable objects.
Hashtables are tricky as they can wrap anything, including other Hashtables:
[ "fruits": [ "apple": 10, "peach": 4, "raspberry": 5 ],
  "vegetables": [ "lettuce": 5 ],
  "animals": [ "with4legs": [ "cat": 15, "dog": 2 ], "withwings": [ "bird": 51 ] ]
]
BinaryFormatter could perfectly serialize these Hashtables (under .NET Framework 4.8):
Hashtable main = new Hashtable();
Hashtable fruits = new Hashtable()
{
{ "apple", 10 },
{ "peach", 4 },
{ "raspberry", 5 }
};
Hashtable vegetables = new Hashtable()
{
{ "lettuce", 5 }
};
Hashtable with4legs = new Hashtable()
{
{ "cat", 15 },
{ "dog", 2 }
};
Hashtable withwings = new Hashtable()
{
{ "bird", 51 }
};
Hashtable animals = new Hashtable()
{
{ "with4legs", with4legs },
{ "withwings", withwings }
};
main.Add("fruits", fruits);
main.Add("vegetables", vegetables);
main.Add("animals", animals);
BinaryFormatter binaryFormatter = new BinaryFormatter();
using (Stream stream = new FileStream("Test.file", FileMode.Create))
{
binaryFormatter.Serialize(stream, main);
}
The file is somewhat messy:
However, when reading it back with the BinaryFormatter:
Hashtable newMain = new Hashtable();
using (Stream stream = new FileStream("Test.file", FileMode.OpenOrCreate))
{
newMain = (Hashtable)binaryFormatter.Deserialize(stream);
}
It could perfectly reassemble the Hashtable:
Now, with modern .NET, BinaryFormatter is obsolete, and XmlSerializer or System.Text.Json's JsonSerializer is recommended instead:
using (Stream stream = new FileStream("Test.file", FileMode.Create))
{
JsonSerializer.Serialize(stream, main);
}
The file is in JSON format now:
And unfortunately when deserializing it:
Hashtable newMain = new Hashtable();
using (Stream stream = new FileStream("Test.file", FileMode.OpenOrCreate))
{
newMain = JsonSerializer.Deserialize<Hashtable>(stream);
}
The Hashtable loses its structure (the nested values come back as JsonElement objects):
I also tried MessagePack (https://msgpack.org), but I can't make it go below one level either:
Now, I know there may be more efficient or robust solutions than Hashtable, but still: is there a recommended replacement for BinaryFormatter that can handle saving and reloading this structure?
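For reference, one possible workaround (a sketch, not a definitive answer): serialize with `JsonSerializer.Serialize(stream, main)` as above, then on the read side parse into a `JsonDocument` and rebuild the nested Hashtable recursively instead of asking the serializer for a Hashtable directly. The `Load`, `ToHashtable`, and `ToPlainObject` names below are hypothetical helpers, and the number handling assumes int values as in the example:

```csharp
using System.Collections;
using System.IO;
using System.Text.Json;

static class HashtableJson
{
    public static Hashtable Load(string path)
    {
        using FileStream stream = File.OpenRead(path);
        using JsonDocument doc = JsonDocument.Parse(stream);
        return ToHashtable(doc.RootElement);
    }

    // Recursively rebuild a Hashtable from a parsed JSON object.
    static Hashtable ToHashtable(JsonElement element)
    {
        var table = new Hashtable();
        foreach (JsonProperty prop in element.EnumerateObject())
            table[prop.Name] = ToPlainObject(prop.Value);
        return table;
    }

    static object ToPlainObject(JsonElement element) => element.ValueKind switch
    {
        JsonValueKind.Object => ToHashtable(element), // nested Hashtable
        JsonValueKind.Number => element.GetInt32(),   // assumes int values, as in the example
        JsonValueKind.String => element.GetString(),
        JsonValueKind.True => true,
        JsonValueKind.False => false,
        _ => null
    };
}
```

The recursion bottoms out at leaf values, so arbitrarily deep nesting round-trips; the trade-off is that non-string keys and types beyond those handled in the switch would need extra cases.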

Related

Why is StreamWriter writing into the middle of the file instead of overwriting

Here's my original file
[
{
"Quantity": 34,
"ShopOrderId": "51e400ff-76b8-4be4-851a-86e2681db960",
"Id": "ae7664cb-135e-4c01-b353-5ecf09ac56af",
"Direction": 2
},
{
"Accepted": true,
"Id": "7bfc2163-2274-4a0e-83b9-203cb376a8f8",
"Direction": 1
}
]
Now I want to load its content, remove one item, and overwrite the whole file
// items loaded from file
var result = new List<QueueItem>();
using (var fs = new FileStream(Path, FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.Read))
{
// reading from file
using (var streamReader = new StreamReader(fs))
{
var json = streamReader.ReadToEnd();
result = JsonConvert.DeserializeObject<List<QueueItem>>(json) ?? new List<QueueItem>();
}
}
// removing unwanted items
result = result.Where(x => !IdsToRemove.Contains(x.Id)).ToList();
// overwriting whole file
using (var fs = new FileStream(Path, FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.None))
{
using (var streamWriter = new StreamWriter(fs))
{
var json = JsonConvert.SerializeObject(result, Formatting.Indented);
streamWriter.Write(json);
}
}
The value of json in streamWriter.Write(json) is
[
{
"Accepted": true,
"Id": "7bfc2163-2274-4a0e-83b9-203cb376a8f8",
"Direction": 1
}
]
so it's valid.
But for some reason, after performing Write, the file contents are a mess:
[
{
"Accepted": true,
"Id": "7bfc2163-2274-4a0e-83b9-203cb376a8f8",
"Direction": 1
}
]-135e-4c01-b353-5ecf09ac56af",
"Direction": 2
},
{
"Accepted": true,
"Id": "7bfc2163-2274-4a0e-83b9-203cb376a8f8",
"Direction": 1
}
]
How can I make my streamWriter actually overwrite the file? And why does this happen when I'm opening a new FileStream? Shouldn't it be unaware of the previous operations with streamReader? I guess that's the reason it doesn't start writing at index 0.
Or maybe there is an easier way to read/write the file while holding locks (preventing other programs from modifying it)?
Sounds like you need to use FileMode.Truncate (or FileMode.Create) when you open the file to overwrite it. FileMode.OpenOrCreate keeps the existing contents, so when the new JSON is shorter than the old, the tail of the previous data is left in place, which is exactly the garbage you see after the closing bracket.
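A minimal sketch of that fix: if you want to keep FileMode.OpenOrCreate for the create-if-missing behavior, truncate explicitly before writing (this assumes the `Path` and `json` variables from the question):

```csharp
using (var fs = new FileStream(Path, FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.None))
{
    fs.SetLength(0); // drop any leftover bytes from the previous, longer contents
    using (var streamWriter = new StreamWriter(fs))
    {
        streamWriter.Write(json);
    }
}
```

Alternatively, `FileMode.Create` truncates an existing file for you; `FileMode.Truncate` does too, but throws if the file does not exist.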

Is it possible to compress JSON object in memory

I would like to keep a medium-sized JSON file from the hard disk in memory (it is 14 MB). I found an easy way to do it:
public static void Load(string path = CacheFilePath)
{
path = Path.GetFullPath(path);
if (!File.Exists(path))
return;
JsonSerializer serializer = new JsonSerializer();
using (FileStream s = File.Open(path, FileMode.Open))
using (StreamReader sr = new StreamReader(s))
using (JsonReader red = new JsonTextReader(sr))
{
while (red.Read())
{
JObject Data = serializer.Deserialize<JObject>(red);
}
}
}
And everything works, but this JObject takes almost 360 MB of RAM. Is there any JSON setting to reduce this value?
Or maybe there is another way to keep the data as JSON in memory?
A quick look at the file (an example; the real one has about 230k entries):
{
"OrderValidationData":
{
"Count":2,
"Name":"OrderValidationData",
"Keys":[
"WAL22999-96",
"HEL5DA 193 175-111",
**n**
],
"Data":
{
"WAL22999-96":
{
"MinimalNetPrice":10.00,
"VatRate":12.00
},
"HEL5DA 193 175-111":
{
"MinimalNetPrice":10.00,
"VatRate":12.00
}
, **{n}**
}
}
}
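One way to cut the memory footprint (a sketch with trade-offs, not a JSON setting): keep the raw JSON gzip-compressed as a byte array and decompress on demand, instead of holding a fully parsed JObject. Repetitive JSON like the sample above usually compresses well, but every access then pays a decompress-and-reparse cost. The `JsonCache` class and its method names are hypothetical:

```csharp
using System.IO;
using System.IO.Compression;
using System.Text;

static class JsonCache
{
    // Compress the JSON text once; keep only the compressed bytes resident.
    public static byte[] Compress(string json)
    {
        using var output = new MemoryStream();
        using (var gzip = new GZipStream(output, CompressionLevel.Optimal))
        {
            byte[] bytes = Encoding.UTF8.GetBytes(json);
            gzip.Write(bytes, 0, bytes.Length);
        }
        return output.ToArray();
    }

    // Decompress and return the JSON text only when it is actually needed.
    public static string Decompress(byte[] compressed)
    {
        using var input = new MemoryStream(compressed);
        using var gzip = new GZipStream(input, CompressionMode.Decompress);
        using var reader = new StreamReader(gzip, Encoding.UTF8);
        return reader.ReadToEnd();
    }
}
```

If the data is only ever looked up by key (as the "Keys"/"Data" shape suggests), deserializing into a typed `Dictionary<string, T>` with a small record type is likely to use far less memory than a generic JObject tree, with or without compression.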

DataContractJsonSerializer stringifies DateTime values inside dictionaries during deserialization

I need a JSON serializer/deserializer that comes with .NET. I cannot use Newtonsoft Json.NET.
As far as I know, that leaves me with JavaScriptSerializer and DataContractJsonSerializer. JavaScriptSerializer didn't work properly, because it insists on a silly format for datetimes which loses the UTC offset after deserialization; never mind that.
I am using DataContractJsonSerializer, and the test method below shows what is wrong.
I am specifying a custom (ISO) date format for both serialization and deserialization, and the serialized string looks OK. Additionally, Json.NET reads it back correctly.
How can I efficiently and generically workaround this issue, without doing any regexes and manual conversions?
[Test]
public void SerializerFailure()
{
TestObject testObject = new TestObject() {Dict = new Dictionary<string, object>() {{"DateTest", DateTime.UtcNow}, {"DecimalTest", 66.6M}}};
Assert.IsInstanceOf<DateTime>(testObject.Dict["DateTest"]);
string serialized = this.Serialize(testObject);
//output is OK...
//{"Dict":{"DateTest":"2019-01-07T23:16:59.5142225Z","DecimalTest":66.6}}
TestObject deserialized = this.Deserialize<TestObject>(serialized);
Assert.IsInstanceOf<string>(deserialized.Dict["DateTest"]);
TestObject newtonDeserialized = JsonConvert.DeserializeObject<TestObject>(serialized);
testObject.ShouldBeEquivalentTo(newtonDeserialized); //passes OK
testObject.ShouldBeEquivalentTo(deserialized); //Fails
//ERROR: Expected member Dict[DateTest] to be
// "2019-01-07T23:27:23.0758967Z" with a length of 28, but
// "07.01.2019 23:27:23" has a length of 19.
// Expected member Dict[DateTest] to be
// "2019-01-07T23:27:23.0758967Z", but
// "07.01.2019 23:27:23" differs near "07."(index 0).
}
Serialization:
public string Serialize(object objectToPost)
{
using (MemoryStream stream = new System.IO.MemoryStream())
{
var settings = new DataContractJsonSerializerSettings() { DateTimeFormat = new DateTimeFormat("O"), UseSimpleDictionaryFormat = true };
DataContractJsonSerializer serializer
= new DataContractJsonSerializer(objectToPost.GetType(), settings);
serializer.WriteObject(stream, objectToPost);
stream.Flush();
stream.Seek(0, SeekOrigin.Begin);
using (StreamReader reader = new StreamReader(stream))
{
return reader.ReadToEnd();
}
}
}
Deserialization:
public T Deserialize<T>(string stringContent)
{
try
{
using (MemoryStream ms = new MemoryStream(Encoding.UTF8.GetBytes(stringContent)))
{
var settings = new DataContractJsonSerializerSettings() { DateTimeFormat = new DateTimeFormat("O"), UseSimpleDictionaryFormat = true };
DataContractJsonSerializer serializer = new DataContractJsonSerializer(typeof(T), settings);
return (T)serializer.ReadObject(ms);
}
}
catch (Exception ex)
{
throw new InvalidOperationException($"Error while deserializing string as {typeof(T).Name}", ex);
}
}
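One generic workaround (a sketch only; `RestoreDateTimes` is a hypothetical helper, and this does not fix the serializer itself): after `ReadObject` returns, walk the `Dictionary<string, object>` and convert any round-trip ("O") formatted strings back into DateTime values:

```csharp
using System;
using System.Collections.Generic;
using System.Globalization;
using System.Linq;

static class DictionaryDateFixer
{
    public static void RestoreDateTimes(IDictionary<string, object> dict)
    {
        foreach (string key in dict.Keys.ToList())
        {
            // DataContractJsonSerializer left the value as a string; parse it back,
            // preserving the UTC kind via RoundtripKind.
            if (dict[key] is string s &&
                DateTime.TryParseExact(s, "O", CultureInfo.InvariantCulture,
                                       DateTimeStyles.RoundtripKind, out DateTime dt))
            {
                dict[key] = dt;
            }
        }
    }
}
```

The obvious caveat: any legitimate string value that merely looks like an ISO-8601 timestamp would also be converted, so this is only safe if such values cannot occur in your data.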

Serializing into JSON using MemoryStream, while adding newlines C#

I have to serialize some C# class data into JSON. For this purpose I use a MemoryStream and a DataContractJsonSerializer.
MemoryStream stream1 = new MemoryStream();
DataContractJsonSerializer ser = new DataContractJsonSerializer(typeof(Person));
ser.WriteObject(stream1, p);
using (FileStream file = new FileStream("Write.json", FileMode.Create, FileAccess.ReadWrite))
{
stream1.Position = 0;
stream1.CopyTo(file); // copy the serialized JSON into the file
stream1.Close();
file.Close();
}
Running the application like this produces this output:
{"age":42,"arr":[{},{},{}],"name":"John","value":null,"w":{}}
However, for my task I have to produce a JSON file where each entry is entered in a new line. Example:
"SomeData": [
{
"SomeData": "Value",
"SomeData": 0,
"SomeData": [
{
"SomeData": "Value",
"SomeData": 0
}
]
}
], etc. etc.
Any ideas as to how I can do this? Thanks in advance!
Ok, so if you want to do that, just add the code from the answer to this question:
question here
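For the record, DataContractJsonSerializer can also produce indented output on its own via JsonReaderWriterFactory, with no third-party code. A sketch, assuming the `Person` type and the `p` instance from the question:

```csharp
using System.IO;
using System.Runtime.Serialization.Json;
using System.Text;

var ser = new DataContractJsonSerializer(typeof(Person));
using (var file = new FileStream("Write.json", FileMode.Create))
// indent: true makes the writer put each member on its own line
using (var writer = JsonReaderWriterFactory.CreateJsonWriter(
           file, Encoding.UTF8, ownsStream: false, indent: true))
{
    ser.WriteObject(writer, p);
    writer.Flush();
}
```

This also removes the MemoryStream round-trip, since the serializer writes straight into the file stream.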

Converting code to use JSON.net instead of System.Runtime.Serialization.DataContractSerializer

How do I do the equivalent in JSON.net?
public SerializedResults SerializeResults(Type queryType, IEnumerable entities)
{
var results = SerializeDynamicType(queryType);
var objList = AnonymousFns.DeconstructMany(entities, false, queryType).ToList();
var ms = new MemoryStream();
var type = objList.GetType();
var serializer = new DataContractSerializer(type);
using (ms)
{
using (GZipStream compress = new GZipStream(ms, CompressionMode.Compress, CompressionLevel.BestCompression))
{
serializer.WriteObject(compress, objList);
}
}
results.ByteArray = ms.ToArray();
return results;
}
I am confused with this line in particular:
var serializer = new DataContractSerializer(type);
How do you do that in JSON.NET??
THANKS :-)
With JSON.NET, you don't need the type when serializing; I'm assuming it works out the type of what you pass in on its own.
So, you can get rid of this completely:
var type = objList.GetType();
var serializer = new DataContractSerializer(type);
And change this:
serializer.WriteObject(compress, objList);
To:
var json = JsonConvert.SerializeObject(objList);
Here are the JSON.Net docs for JsonConvert.
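Note that JsonConvert.SerializeObject returns a string rather than writing to the stream, so the GZip plumbing from the original needs a StreamWriter to push the JSON through. A sketch, assuming the `objList` and `results` variables from the question (and the BCL GZipStream rather than the Ionic one the original constructor signature suggests):

```csharp
using System.IO;
using System.IO.Compression;
using Newtonsoft.Json;

var ms = new MemoryStream();
using (ms)
{
    using (var compress = new GZipStream(ms, CompressionMode.Compress))
    using (var writer = new StreamWriter(compress))
    {
        // Serialize to a JSON string, then stream it through the compressor.
        writer.Write(JsonConvert.SerializeObject(objList));
    }
}
results.ByteArray = ms.ToArray();
```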
I believe you can use the BsonWriter to write to a stream. I'm not sure it will give you the exact same binary format you had before, but in concept it is the same.
public SerializedResults SerializeResults(Type queryType, IEnumerable entities)
{
var results = SerializeDynamicType(queryType);
var objList = AnonymousFns.DeconstructMany(entities, false, queryType).ToList();
var ms = new MemoryStream();
using (ms)
{
using (GZipStream compress = new GZipStream(ms, CompressionMode.Compress, CompressionLevel.BestCompression))
{
using (BsonWriter writer = new BsonWriter(compress))
{
JsonSerializer serializer = new JsonSerializer();
serializer.Serialize(writer, objList);
}
}
}
results.ByteArray = ms.ToArray();
return results;
}
