I know there were already many discussions on that topic, like this one:
BinaryFormatter and Deserialization Complex objects
but this looks awfully complicated. What I'm looking for is an easier way to serialize and deserialize a generic List of objects into/from one file. This is what I've tried:
public void SaveFile(string fileName)
{
    List<object> objects = new List<object>();
    // Add all tree nodes
    objects.Add(treeView.Nodes.Cast<TreeNode>().ToList());
    // Add dictionary (Type: Dictionary<int, Tuple<List<string>, List<string>>>)
    objects.Add(dictionary);
    using (Stream file = File.Open(fileName, FileMode.Create))
    {
        BinaryFormatter bf = new BinaryFormatter();
        bf.Serialize(file, objects);
    }
}
public void LoadFile(string fileName)
{
    ClearAll();
    using (Stream file = File.Open(fileName, FileMode.Open))
    {
        BinaryFormatter bf = new BinaryFormatter();
        object obj = bf.Deserialize(file);
        // Error: ArgumentNullException in System.Core.dll
        TreeNode[] nodeList = (obj as IEnumerable<TreeNode>).ToArray();
        treeView.Nodes.AddRange(nodeList);
        dictionary = obj as Dictionary<int, Tuple<List<string>, List<string>>>;
    }
}
The serialization works, but the deserialization fails with an ArgumentNullException. Does anyone know how to pull the dictionary and the tree nodes out and cast them back, maybe with a different approach, but one that is also nice and simple? Thanks!
You have serialized a single List<object> whose first item is the list of nodes and whose second is the dictionary, so when deserializing you get that same outer list back. Your LoadFile casts the outer list itself with obj as IEnumerable<TreeNode>, which yields null (a List<object> is not an IEnumerable<TreeNode>), and that null is what makes ToArray() throw the ArgumentNullException.
The result of deserializing is a List<object> where the first element is a List<TreeNode> and the second element is a Dictionary<int, Tuple<List<string>, List<string>>>.
Something like this:
public static void LoadFile(string fileName)
{
    ClearAll();
    using (Stream file = File.Open(fileName, FileMode.Open))
    {
        BinaryFormatter bf = new BinaryFormatter();
        object obj = bf.Deserialize(file);
        var objects = obj as List<object>;
        // You may want to run some checks (objects is not null and contains 2 elements, for example)
        var nodes = objects[0] as List<TreeNode>;
        var dictionary = objects[1] as Dictionary<int, Tuple<List<string>, List<string>>>;
        // Use nodes and dictionary
    }
}
You can give it a try on this fiddle.
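If you want the checks hinted at in the comment, a minimal sketch might look like this (the exception type is just an example):

var objects = bf.Deserialize(file) as List<object>;
if (objects == null || objects.Count != 2)
    throw new InvalidDataException("File does not contain the expected data.");
var nodes = objects[0] as List<TreeNode>;
var dictionary = objects[1] as Dictionary<int, Tuple<List<string>, List<string>>>;
if (nodes == null || dictionary == null)
    throw new InvalidDataException("Unexpected element types in the file.");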
Related
I am simply trying to serialize and deserialize a string array in BSON format using Json.NET, but the following code fails:
var jsonSerializer = new JsonSerializer();
var array = new string[] { "A", "B" };

// Serialization
byte[] bytes;
using (var ms = new MemoryStream())
using (var bson = new BsonWriter(ms))
{
    jsonSerializer.Serialize(bson, array, typeof(string[]));
    bytes = ms.ToArray();
}

// Deserialization
using (var ms = new MemoryStream(bytes))
using (var bson = new BsonReader(ms))
{
    // Exception here
    array = jsonSerializer.Deserialize<string[]>(bson);
}
Exception message:
Cannot deserialize the current JSON object (e.g. {"name":"value"}) into type 'System.String[]' because the type requires a JSON array (e.g. [1,2,3]) to deserialize correctly.
To fix this error either change the JSON to a JSON array (e.g. [1,2,3]) or change the deserialized type so that it is a normal .NET type (e.g. not a primitive type like integer, not a collection type like an array or List) that can be deserialized from a JSON object. JsonObjectAttribute can also be added to the type to force it to deserialize from a JSON object.
How can I get this to work?
Set ReadRootValueAsArray to true on BsonReader
http://james.newtonking.com/projects/json/help/index.html?topic=html/P_Newtonsoft_Json_Bson_BsonReader_ReadRootValueAsArray.htm
This setting is required because the BSON data spec doesn't save metadata about whether the root value is an object or an array.
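Applied to the code in the question, that is a one-line change when constructing the reader:

// Deserialization
using (var ms = new MemoryStream(bytes))
using (var bson = new BsonReader(ms) { ReadRootValueAsArray = true })
{
    array = jsonSerializer.Deserialize<string[]>(bson);
}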
Hmmm, from where I sit, your code should work, but Json.Net seems to think that your serialized array of strings is a dictionary. This could be because, according to the BSON specification, arrays actually do get serialized as a list of key-value pairs just like objects do. The keys in this case are simply the string representations of the array index values.
In any case, I was able to work around the issue in a couple of different ways:
Deserialize to a Dictionary and then manually convert it back to an array.
var jsonSerializer = new JsonSerializer();
var array = new string[] { "A", "B" };

// Serialization
byte[] bytes;
using (var ms = new MemoryStream())
using (var bson = new BsonWriter(ms))
{
    jsonSerializer.Serialize(bson, array);
    bytes = ms.ToArray();
}

// Deserialization
using (var ms = new MemoryStream(bytes))
using (var bson = new BsonReader(ms))
{
    var dict = jsonSerializer.Deserialize<Dictionary<string, string>>(bson);
    // Order numerically; string ordering would put "10" before "2" for longer arrays
    array = dict.OrderBy(kvp => int.Parse(kvp.Key)).Select(kvp => kvp.Value).ToArray();
}
Wrap the array in an outer object.
class Wrapper
{
    public string[] Array { get; set; }
}
Then serialize and deserialize using the wrapper object.
var jsonSerializer = new JsonSerializer();
var obj = new Wrapper { Array = new string[] { "A", "B" } };

// Serialization
byte[] bytes;
using (var ms = new MemoryStream())
using (var bson = new BsonWriter(ms))
{
    jsonSerializer.Serialize(bson, obj);
    bytes = ms.ToArray();
}

// Deserialization
using (var ms = new MemoryStream(bytes))
using (var bson = new BsonReader(ms))
{
    obj = jsonSerializer.Deserialize<Wrapper>(bson);
}
Hope this helps.
As explained in this answer by James Newton-King, the BSON format doesn't save metadata about whether the root value is a collection, making it necessary to set BsonDataReader.ReadRootValueAsArray appropriately before beginning to deserialize.
One easy way to do this, when deserializing to some known POCO type (rather than dynamic or JToken), is to initialize the reader based on whether the root type will be serialized using an array contract. The following extension methods do this:
public static partial class BsonExtensions
{
    public static T DeserializeFromFile<T>(string path, JsonSerializerSettings settings = null)
    {
        using (var stream = new FileStream(path, FileMode.Open))
            return Deserialize<T>(stream, settings);
    }

    public static T Deserialize<T>(byte[] data, JsonSerializerSettings settings = null)
    {
        using (var stream = new MemoryStream(data))
            return Deserialize<T>(stream, settings);
    }

    public static T Deserialize<T>(byte[] data, int index, int count, JsonSerializerSettings settings = null)
    {
        using (var stream = new MemoryStream(data, index, count))
            return Deserialize<T>(stream, settings);
    }

    public static T Deserialize<T>(Stream stream, JsonSerializerSettings settings = null)
    {
        // Use BsonReader in Json.NET 9 and earlier.
        using (var reader = new BsonDataReader(stream) { CloseInput = false }) // Let caller dispose the stream
        {
            var serializer = JsonSerializer.CreateDefault(settings);
            // https://www.newtonsoft.com/json/help/html/DeserializeFromBsonCollection.htm
            if (serializer.ContractResolver.ResolveContract(typeof(T)) is JsonArrayContract)
                reader.ReadRootValueAsArray = true;
            return serializer.Deserialize<T>(reader);
        }
    }
}
Now you can simply do:
var newArray = BsonExtensions.Deserialize<string[]>(bytes);
Notes:
BSON support was moved to its own package, Newtonsoft.Json.Bson, in Json.NET 10.0.1. In this version and later versions BsonDataReader replaces the now-obsolete BsonReader.
The same extension methods can be used to deserialize a dictionary, e.g.:
var newDictionary = BsonExtensions.Deserialize<SortedDictionary<int, string>>(bytes);
By checking the contract type, ReadRootValueAsArray is set appropriately in both cases.
Demo fiddle here.
In general, you could check the data type first before setting ReadRootValueAsArray to true, like this:
if (type != typeof(string) && typeof(IEnumerable).IsAssignableFrom(type))
    bSonReader.ReadRootValueAsArray = true;
(The extra check matters because string itself implements IEnumerable but is not written as a BSON array.)
I know this is an old thread, but I discovered an easy way to deserialize using the power of MongoDB.Driver.
You can use BsonDocument.Parse(jsonString) to deserialize a JSON object, so to deserialize a string array use this:
string jsonArray = "[\"value1\", \"value2\", \"value3\"]";
BsonArray deserializedArray = BsonDocument.Parse("{\"arr\":" + jsonArray + "}")["arr"].AsBsonArray;
deserializedArray can then be used like any other array, for example in a foreach loop.
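For instance, a quick sketch of iterating it:

foreach (BsonValue value in deserializedArray)
    Console.WriteLine(value.AsString); // prints value1, value2, value3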
I am trying to store a collection of lists (each containing over 20,000 ints) and was hoping to use a nested list for this, since each day a new list will be added.
Eventually I need to access the data in the following way:
"Take the first value of each list and compile a new list".
Ideally I'd like to serialise a List<List<int>>, but this does not seem to work (I can serialise a List<int>). Is there a trick to doing this (preferably without any add-ons)?
If not, how would you advise me to store such data efficiently and quickly?
The way I try it now:
static void saveFunction(List<int> data, string name)
{
    using (Stream stream = File.Open(name + ".bin", FileMode.OpenOrCreate))
    {
        BinaryFormatter bin = new BinaryFormatter();
        if (stream.Length == 0)
        {
            List<List<int>> List = new List<List<int>>();
            List.Add(data);
            bin.Serialize(stream, List);
        }
        else
        {
            List<List<int>> List = (List<List<int>>)bin.Deserialize(stream);
            List.Add(data);
            bin.Serialize(stream, List);
        }
    }
}
Strangely, list.Count remains 1 and the number of ints in the list stays the same, while the file size keeps increasing.
You need to rewind the stream and clear the previous data between reading and writing:
static void saveFunction(List<int> data, string name)
{
    using (Stream stream = File.Open(name + ".bin", FileMode.OpenOrCreate))
    {
        BinaryFormatter bin = new BinaryFormatter();
        if (stream.Length == 0)
        {
            var List = new List<List<int>>();
            List.Add(data);
            bin.Serialize(stream, List);
        }
        else
        {
            var List = (List<List<int>>)bin.Deserialize(stream);
            List.Add(data);
            stream.SetLength(0); // Clear the old data from the file
            bin.Serialize(stream, List);
        }
    }
}
What you are doing now is appending the new list to the end of the file while leaving the old list as-is -- which BinaryFormatter will happily read as the (first) object in the file when it is re-opened.
As for your second question, "how would you advice me to store such data efficiently and quick?", since your plan is to "take the first value of each list and compile a new list", it appears you're going to need to re-read the preceding lists when writing a new list. If that were not true, however, and each new list was independent of the preceding lists, BinaryFormatter does support writing multiple root objects to the same file. See here for details: Serializing lots of different objects into a single file
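For that independent-lists case, a minimal sketch (with hypothetical method names) could append each day's list as its own root object and read them all back in a loop:

static void AppendList(List<int> data, string name)
{
    // Each call appends one more root object to the end of the file
    using (Stream stream = File.Open(name + ".bin", FileMode.Append))
        new BinaryFormatter().Serialize(stream, data);
}

static List<List<int>> ReadAllLists(string name)
{
    var result = new List<List<int>>();
    var bin = new BinaryFormatter();
    using (Stream stream = File.Open(name + ".bin", FileMode.Open))
        while (stream.Position < stream.Length) // One root object per Deserialize call
            result.Add((List<int>)bin.Deserialize(stream));
    return result;
}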
What is the simplest way to create a deep copy of an OrderedDictionary? I tried making a new variable like this:
var copy = dict[x] as OrderedDictionary;
But if I update the values/keys in copy, the dictionary in dict[x] gets updated as well.
Edit: dict is another OrderedDictionary.
You should be able to use a generic deep cloning method. Here is an example of deep cloning from MSDN Magazine:
Object DeepClone(Object original)
{
    // Construct a temporary memory stream
    MemoryStream stream = new MemoryStream();

    // Construct a serialization formatter that does all the hard work
    BinaryFormatter formatter = new BinaryFormatter();

    // This line is explained in the "Streaming Contexts" section
    formatter.Context = new StreamingContext(StreamingContextStates.Clone);

    // Serialize the object graph into the memory stream
    formatter.Serialize(stream, original);

    // Seek back to the start of the memory stream before deserializing
    stream.Position = 0;

    // Deserialize the graph into a new set of objects
    // and return the root of the graph (deep copy) to the caller
    return formatter.Deserialize(stream);
}
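Usage on the question's dictionary is then a cast away, assuming everything stored in it is itself serializable:

var copy = (OrderedDictionary)DeepClone(dict[x]);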
What type of objects are you storing in your dictionary?
You'll need to iterate over the content of the Dictionary and clone/duplicate the contents in some way.
If your object implements ICloneable, you could do something like this:
Dictionary<int, MyObject> original = new Dictionary<int, MyObject>();
// ... code to populate original ...

Dictionary<int, MyObject> deepCopy = new Dictionary<int, MyObject>();
foreach (var v in original)
{
    MyObject clone = (MyObject)v.Value.Clone();
    deepCopy.Add(v.Key, clone);
}
I can't tell from your question whether dict is a dictionary of dictionaries. The simplest way to make a deep copy of a collection is to iterate through its members and clone each one.
If your value implements ICloneable:
OrderedDictionary newDict = new OrderedDictionary();
foreach (DictionaryEntry entry in OriginalDictionary)
{
    // entry.Value is typed as object, so cast to ICloneable to call Clone()
    newDict[entry.Key] = ((ICloneable)entry.Value).Clone();
}
If your values can't be Clone()d, you'll have to copy them another way.
OrderedDictionary newDict = new OrderedDictionary();
foreach (DictionaryEntry entry in OriginalDictionary)
{
    MyClass source = (MyClass)entry.Value;
    MyClass x = new MyClass();
    x.myProp1 = source.myProp1; // Copy each primitive property by hand
    x.myProp2 = source.myProp2;
    newDict[entry.Key] = x;
}
I have a Dictionary<string, object>, which holds string keys and objects as values. I need to save and later load such a dictionary. What would be the best method to do that?
You can use this serializable Dictionary<TKey, TVal> (tested):
http://www.dacris.com/blog/2010/07/31/c-serializable-dictionary-a-working-example/
Dictionary<String, Object> otherDictionary = new Dictionary<String, Object>();
otherDictionary.Add("Foo", new List<String>() { "1st Foo", "2nd Foo", "3rd Foo" });
var dict = new SerializableDictionary<String, Object>(otherDictionary);
Write it to a file:
using (FileStream fileStream = new FileStream("test.binary", FileMode.Create))
{
    IFormatter bf = new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter();
    bf.Serialize(fileStream, dict);
}
Read it from a file:
using (FileStream fileStream = new FileStream("test.binary", FileMode.Open))
{
    IFormatter bf = new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter();
    dict = (SerializableDictionary<String, Object>)bf.Deserialize(fileStream);
}
Note: Of course you don't need to create a second Dictionary; you can use the SerializableDictionary in the first place. The example above just demonstrates how to use it with an already existing Dictionary.
You can use Newtonsoft Json.NET: http://james.newtonking.com/projects/json-net.aspx.
It is flexible and very friendly.
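For example, a minimal sketch using JsonConvert (note that with Dictionary<string, object>, the object values come back as basic CLR types or JToken instances rather than their original types):

// using Newtonsoft.Json;
var json = JsonConvert.SerializeObject(dictionary);
File.WriteAllText("dict.json", json);

var loaded = JsonConvert.DeserializeObject<Dictionary<string, object>>(File.ReadAllText("dict.json"));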
How about serializing it to XML?
http://support.microsoft.com/kb/815813
object yourType = new Object();
System.Xml.Serialization.XmlSerializer x = new System.Xml.Serialization.XmlSerializer(yourType.GetType());
x.Serialize(Console.Out, yourType);
If you don't mind adding in an extra dependency, you can use protobuf.net. This has the benefit of being fast (definitely faster than the XML serializer approach), and it works on the bog standard Dictionary.
Reference it in your assembly, then the following shows how to use it with a MemoryStream:
class Program
{
    static void Main(string[] args)
    {
        Dictionary<int, string> dict = new Dictionary<int, string>();
        for (int i = 0; i < 10; i++)
        {
            dict[i] = i.ToString();
        }

        using (var ms = new MemoryStream())
        {
            ProtoBuf.Serializer.Serialize(ms, dict);
            ms.Seek(0, SeekOrigin.Begin);
            var dict2 = ProtoBuf.Serializer.Deserialize<Dictionary<int, string>>(ms);
        }
    }
}
So long as both the key and value type for the dictionary are serializable by protobuf.net, this will work.
EDIT:
If you want to use this approach, you have to make your key and value objects' classes serializable by protobuf.net. In short, attach [ProtoContract] to the class, and attach e.g. [ProtoMember(1)] to each property you want serialized. See the website for more details. Note that both string and int are serializable out of the box.
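For illustration, a hypothetical value class marked up that way might look like this:

[ProtoContract]
class MyValue
{
    [ProtoMember(1)]
    public string Name { get; set; }

    [ProtoMember(2)]
    public int Count { get; set; }
}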
I got a List<List<CustomClass>>, where CustomClass is a reference type.
I need to make a full deep copy of this matrix into a new one. Since I want a deep copy, each CustomClass object in the matrix has to be copied into the new matrix.
How would you do that in an efficient way?
For a CustomClass that implements ICloneable, this isn't very difficult:
var myList = new List<List<CustomClass>>();
// populate myList

var clonedList = new List<List<CustomClass>>();

// Here's the beef
foreach (var sublist in myList)
{
    var newSubList = new List<CustomClass>();
    clonedList.Add(newSubList);
    foreach (var item in sublist)
        newSubList.Add((CustomClass)item.Clone());
}
You can make this work in a similar way with any "DeepCopy"-type method, if you feel you don't want to implement ICloneable (I would recommend using the built-in interface though).
An easier way is to serialize the whole object and then deserialize it again; try this extension method:
public static T DeepClone<T>(this T source)
{
    if (!typeof(T).IsSerializable)
    {
        throw new ArgumentException("The type must be serializable.", "source");
    }

    // Don't serialize a null object; simply return the default for that type
    if (Object.ReferenceEquals(source, null))
    {
        return default(T);
    }

    IFormatter formatter = new BinaryFormatter();
    using (Stream stream = new MemoryStream())
    {
        formatter.Serialize(stream, source);
        stream.Seek(0, SeekOrigin.Begin);
        return (T)formatter.Deserialize(stream);
    }
}
USAGE
List<List<CustomClass>> foobar = GetListOfListOfCustomClass();
List<List<CustomClass>> deepClonedObject = foobar.DeepClone();
There are two possibilities:
Implement the ICloneable interface on your CustomClass, then you can clone your objects.
If the class can be serialized, serialize it to a memory stream and deserialize it from there. That will create a copy of it.
I would prefer the first alternative, because I think serializing/deserializing is slower than cloning via ICloneable.
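A minimal sketch of the first option, with hypothetical Name and Value members standing in for whatever CustomClass actually contains:

class CustomClass : ICloneable
{
    public string Name { get; set; }
    public int Value { get; set; }

    // A shallow copy is enough here because both members are value/immutable types
    public object Clone() => new CustomClass { Name = Name, Value = Value };
}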
Assuming you have a Copy method which can duplicate CustomClass objects:
var newMatrix = oldMatrix
.Select(subList => subList.Select(custom => Copy(custom)).ToList())
.ToList();