Assume you have two classes, one inheriting the other, and the child needs to be serialized/deserialized with XmlSerializer. However, the parent contains a member that is not serializable, say a dictionary.
public class Parent {
public Dictionary<string, int> dictionary;
}
The parent class is part of a library used by many other scripts, so it cannot be modified. The child class contains only serializable members:
public class Child : Parent {
[XmlElement]
public int foo;
}
When I try to call the serializer, I receive an error saying that the dictionary is not serializable. When serializing to JSON, I managed to get away with it at the price of a warning: I just created another member with the same name and type, and marked it with ScriptIgnore:
public class Child : Parent {
public int foo;
[ScriptIgnore]
public Dictionary<string, int> dictionary;
}
I tried the same trick here (using XmlIgnore), but it didn't work; the error was the same. The only way I managed to get through this was to create separate classes that serve only for XML de/serialization, and then copy the values back into the appropriate place.
Does anyone know a better way around this? Can I make XmlSerializer forget about the parent dictionary in any way?
The very first thing I would say, and always say: if serializing an existing model gets tricky - even remotely awkward, then stop doing that. Take 2 minutes to create a separate DTO model, i.e. a model created solely for the purposes of serialization (and indeed, perhaps even tailored to a specific serializer). Now you put the exact right types, right members, right attributes, and right layout. All you need to do is add some conversion methods - static conversion operators work great here. So what I would say is: create a ParentDto and ChildDto (your names may vary); it'll take 3 minutes, and it'll work great.
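To make that concrete, here is a minimal sketch of the DTO approach for the Child class from the question (ChildDto, ChildXml, and the operator-based mapping are illustrative names and choices, not anything from the original code):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

// The original (unmodifiable) model, as in the question.
public class Parent {
    public Dictionary<string, int> dictionary;
}

public class Child : Parent {
    public int foo;
}

// Hypothetical DTO: it mirrors only the state we actually want on the wire,
// so XmlSerializer never even sees the dictionary.
public class ChildDto {
    [XmlElement]
    public int Foo { get; set; }

    // Static conversion operators keep the mapping logic in one place.
    public static explicit operator ChildDto(Child source) {
        return source == null ? null : new ChildDto { Foo = source.foo };
    }

    public static explicit operator Child(ChildDto dto) {
        return dto == null ? null : new Child { foo = dto.Foo };
    }
}

public static class ChildXml {
    static readonly XmlSerializer serializer = new XmlSerializer(typeof(ChildDto));

    public static string Save(Child child) {
        using (var sw = new StringWriter()) {
            serializer.Serialize(sw, (ChildDto)child);
            return sw.ToString();
        }
    }

    public static Child Load(string xml) {
        using (var sr = new StringReader(xml)) {
            return (Child)(ChildDto)serializer.Deserialize(sr);
        }
    }
}
```

The serializer only ever touches ChildDto, so the non-serializable dictionary on Parent is simply never in play.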
Now, back to the question...
XmlSerializer takes its input from the declaring class; as for attributes and conditional serialization: no, we can't add those into the type model at this point. But there is another option - you can use XmlAttributeOverrides to pretend that there was an [XmlIgnore] on the dictionary member. However, there are some important caveats:
the XmlAttributeOverrides API is a bit of a faff to use (see MSDN for an example)
it is critical that you only do this once, and then store and re-use the XmlSerializer that you create this way; basically, if you don't do this, it will create a new dynamic assembly every time you new a serializer, and assemblies never unload, so you will haemorrhage memory; note that the simple usage (new XmlSerializer(someType) etc) has an inbuilt cache for this; but the XmlAttributeOverrides usage does not
But again, all this messing with XmlAttributeOverrides is a lot more work than just creating a basic DTO
Example of using XmlAttributeOverrides:
using System;
using System.Collections.Generic;
using System.Xml.Serialization;
public class Parent {
public Dictionary<string, int> WantToIgnoreThis { get; set; }
}
public class Child : Parent {
public int Foo { get; set; }
}
static class Program
{
static readonly XmlSerializer customSerializer;
static Program()
{
var xao = new XmlAttributeOverrides();
xao.Add(typeof(Parent), "WantToIgnoreThis", new XmlAttributes {
XmlIgnore = true
});
customSerializer = new XmlSerializer(typeof(Child), xao);
}
static void Main()
{
//var ser = new XmlSerializer(typeof(Child));
// ^^ this would fail
customSerializer.Serialize(Console.Out, new Child {
Foo = 123
});
}
}
Note in particular how the static field is used to cache the serializer.
You could implement IXmlSerializable yourself and handle the specifics in ReadXml(XmlReader reader) and WriteXml(XmlWriter writer). XmlSerializer will call these methods, instead of generating its own serialization logic, if your class implements the interface.
public class Child : Parent, IXmlSerializable
{
    public int Foo { get; set; }
    public Dictionary<string, int> Dictionary { get; set; }

    public XmlSchema GetSchema() { return null; } // required by the interface; null is fine

    public void WriteXml(XmlWriter writer)
    {
        writer.WriteStartElement("Foo");
        writer.WriteValue(this.Foo);
        writer.WriteEndElement();
    }

    public void ReadXml(XmlReader reader)
    {
        var wasEmpty = reader.IsEmptyElement;
        reader.Read(); // move past the wrapper element's start tag
        if (wasEmpty)
        {
            return;
        }
        this.Foo = reader.ReadElementContentAsInt("Foo", "");
        reader.ReadEndElement(); // consume the wrapper element's end tag
    }
}
Related
I'm using protobuf-net in an application that does a lot of binary serialization of objects from (for all intents and purposes) 3rd party dlls. As a result, I can't use the [Proto-] attributes on the contracts themselves, and I'm instead using the RuntimeTypeModel to prepare the serializer at runtime as it encounters new types. Example serializer wrapper:
public static class ProtobufSerializer
{
    public static byte[] Serialize<T>(T obj)
    {
        PrepareSerializer(typeof(T));
        using (var memoryStream = new MemoryStream())
        {
            Serializer.Serialize(memoryStream, obj);
            return memoryStream.ToArray();
        }
    }
    public static T Deserialize<T>(byte[] bytes)
    {
        PrepareSerializer(typeof(T));
        using (var memoryStream = new MemoryStream(bytes))
        {
            return Serializer.Deserialize<T>(memoryStream);
        }
    }
}
Where we can safely assume that PrepareSerializer is capable of preparing RuntimeTypeModel to serialize any given type. I'm having some issues with the deserialization of objects where I have to leverage DynamicType=true though. For instance, given the following interface:
public interface IFoo
{
string Name {get;}
}
And the implementation:
public class Foo : IFoo
{
public string Name {get;set;}
public Bar Bar {get;set;}
[OnDeserializing]
public void OnDeserializing(Type t)
{
PrepareSerializer(typeof(Foo));
}
}
public class Bar
{
public int Baz {get;set;}
}
The PrepareSerializer method essentially would use a surrogate and generate a model roughly equivalent to:
// registered surrogate for IFoo
[ProtoContract]
public class IFooSurrogate
{
    [ProtoMember(1, DynamicType = true)]
    public object Value { get; set; }

    [OnSerializing]
    public void OnSerializing(Type t)
    {
        PrepareSerializer(this.Value.GetType());
    }
}
Where Value is set by the implicit converters to equal the instance of IFoo. This works fine during serialize (where the event is fired and gives me a chance to prepare the serializer for the specific interface implementation type). It would also work fine in a non-distributed system where I would have to run through a serialize method before ever trying to deserialize that type. During deserialization in a distributed system though, where the current node has never seen Foo before, the ProtoBuf.Serializer throws an InvalidOperationException complaining about a lack of serializer for type Bar before it runs the Foo.OnDeserializing event (giving me a chance to tell it how to deserialize Bar).
Is there any way to attach a hook to ensure my code is given a chance to know about 'Foo' before protobuf-net complains about a lack of serializers?
I haven't tried exactly this situation, but: in order to allow some flexibility, all Type storage and rehydration goes via the TypeModel.DynamicTypeFormatting event; so, you could in theory hook this event on RuntimeTypeModel.Default, with something like:
RuntimeTypeModel.Default.DynamicTypeFormatting += (sender, args) => {
if (args.FormattedName != null) { // meaning: rehydrating
lock(SomeSyncLock) {
if(NotYetKnown(args.FormattedName))
Prepare(args.FormattedName);
}
}
};
The intent of this API is to allow you to control how types are resolved, but... I guess it would work for this too?
I can, however, get behind the idea of an event that is more deliberately targeted at the first time a new Type is seen, essentially replacing / supplementing the "apply default behaviour" code. I don't think it exists today, though.
I am facing a problem of inconsistency after deserialization using protobuf-net.
The class I would like to serialize/deserialize looks like:
[ProtoContract]
public class TSS
{
[ProtoMember(1, AsReference = true)]
public EventManager eventManager { get; private set; }
[ProtoMember(2)]
public DateTime referenceDateTime { get; private set; }
[ProtoMember(3)]
public Mode mode;
}
And inside EventManager class, it looks like:
[ProtoContract]
public class EventManager
{
[ProtoMember(1)]
public InputQueue inputQueue = new InputQueue();
[ProtoMember(2)]
public InputQueue InputQueue
{
get { return this.inputQueue; }
set { this.inputQueue = value; }
}
[ProtoMember(7, AsReference = true)]
public TSS tss;
}
The tss member in class EventManager is a reference to the TSS object, and eventManager in class TSS is a reference to the EventManager object. That is why I put AsReference = true there (is this the right way?).
I do serialization like:
public void StateSaving(int time, TSS tss)
{
Stream memoryStream = new MemoryStream();
Serializer.Serialize(memoryStream, tss);
states.Add(time, memoryStream);
}
and do deserialization like:
public void Deserialize(int time, ref TSS tss)
{
Stream memoryStream = states[time];
memoryStream.Position = 0;
tss = Serializer.Deserialize<TSS>(memoryStream);
}
The problem is that whenever I deserialize, data structures like inputQueue in the EventManager are populated with null values instead of the actual values at that point in time. I am a newbie to protobuf-net, so please point out any mistakes (I believe there are a lot).
Thanks in advance!!
(from comments)
I have located the problem: there is a list that needs to be deserialized. It is a list of events whose constructors take parameters, and during deserialization the program runs the parameterless constructors (which I added manually to eliminate the exceptions) instead of the right ones (with parameters). I know this is how serialization/deserialization works, but is there a way to serialize and deserialize this list correctly?
Ah, indeed. There are various approaches when it comes to object construction:
use the parameterless constructor
look for a constructor which matches all the defined members
skip the constructor completely
use a custom object factory
use a surrogate object and custom conversion
Things like XmlSerializer use the first; things like DataContractSerializer and BinaryFormatter use the third. The good news is that protobuf-net supports all 5. I suggest that in your case the best option is the third, which you can enable for this type via:
[ProtoContract(SkipConstructor=true)]
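As a quick illustrative sketch of what SkipConstructor buys you (TimedEvent is a made-up type, not the asker's actual event class):

```csharp
using System;
using System.IO;
using ProtoBuf; // protobuf-net

// Hypothetical event type: its only constructor takes parameters, and we
// don't want to add a parameterless one just to satisfy the serializer.
[ProtoContract(SkipConstructor = true)]
public class TimedEvent
{
    [ProtoMember(1)]
    public int Time { get; private set; }

    public TimedEvent(int time)
    {
        if (time < 0) throw new ArgumentOutOfRangeException("time");
        Time = time;
    }
}

public static class SkipCtorDemo
{
    public static TimedEvent RoundTrip(TimedEvent evt)
    {
        using (var ms = new MemoryStream())
        {
            Serializer.Serialize(ms, evt);
            ms.Position = 0;
            // SkipConstructor=true: protobuf-net creates the object without
            // running TimedEvent(int), then assigns the members directly.
            return Serializer.Deserialize<TimedEvent>(ms);
        }
    }
}
```

Without SkipConstructor (and without a parameterless constructor), the Deserialize call would fail at object-construction time.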
I have developed an application that is meant to send data from client to server and back etc. using serialized objects.
For this application, I decided that protobuf-net would be a good option (especially as it handles variable-length objects so well).
However, when sending an object from client to server or vice versa, all I know is that the object will be some child class of ReplicableObject. Hence, I am using:
Serializer.SerializeWithLengthPrefix(stream, ro, PrefixStyle.Base128);
Where 'ro' is an object of a type that subclasses from ReplicableObject.
However, I get this exception:
An unhandled exception of type 'ProtoBuf.ProtoException' occurred in
protobuf-net.dll
Additional information: Unexpected type found during serialization;
types must be included with ProtoIncludeAttribute; found MessageObject
passed as ReplicableObject
In this particular instance, I'm trying to send a MessageObject.
As there is precious little documentation for protobuf-net, I am stuck on what to do. I've tried a few attributes here and there to no avail.
Any help appreciated.
Edit: I should make it clear that the subclasses might not even be ones that I've written.
Protobuf is a contract-based serialization format, designed to be platform independent. As such, no type metadata is included on the wire as it would not apply between platforms. Even inheritance is not part of the core protobuf spec.
protobuf-net as a specific implementation introduces support for inheritance (via some smoke and mirrors), but ideally it should still be possible to define the expected types in advance - exactly the same as other serializers such as XmlSerializer or DataContractSerializer. This can be done by using [ProtoInclude(...)] to specify the anticipated concrete types.
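For example, if MessageObject is known up front, the base type can declare it like this (the field number 10 and the Id/Text members are illustrative; the ProtoInclude number just must not collide with the base type's own member numbers):

```csharp
using ProtoBuf; // protobuf-net

[ProtoContract]
[ProtoInclude(10, typeof(MessageObject))] // one ProtoInclude per anticipated subclass
public class ReplicableObject
{
    [ProtoMember(1)]
    public int Id { get; set; }
}

[ProtoContract]
public class MessageObject : ReplicableObject
{
    [ProtoMember(1)] // member numbers restart for each type in the hierarchy
    public string Text { get; set; }
}
```

With that in place, serializing a MessageObject through a ReplicableObject-typed reference works, and the concrete type survives the round trip.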
If you genuinely can't tell the actual types in advance, there is also a DynamicType option, which writes the AssemblyQualifiedName into the stream. If you are interested in this route, then note that the "cross-platform" features of the format start to break down, but it is very useful for .NET-to-.NET purposes.
At the simplest, a wrapper such as:
[ProtoContract]
public class SomeWrapper {
[ProtoMember(1, DynamicType = true)]
public object Value {get;set;}
}
Wrap your object in that and it should behave (in v2 at least; DynamicType did not exist in v1). Full example:
[TestFixture]
public class SO7218127
{
[Test]
public void Test()
{
var orig = new SomeWrapper {Value = new SubType { Foo = 123, Bar = "abc"}};
var clone = Serializer.DeepClone(orig);
Assert.AreEqual(123, orig.Value.Foo);
Assert.AreEqual("abc", ((SubType) clone.Value).Bar);
}
[ProtoContract]
public class SomeWrapper
{
[ProtoMember(1, DynamicType = true)]
public BaseType Value { get; set; }
}
[ProtoContract]
public class BaseType
{
[ProtoMember(1)]
public int Foo { get; set; }
}
[ProtoContract]
public class SubType : BaseType
{
[ProtoMember(2)]
public string Bar { get; set; }
}
}
I'm new at the C# thing.... (.net 3.5)
I want a Dictionary to hold two different types of object, one of which is generic. While iterating through the dictionary, I will call methods like Add and Clone.
I have tried it with a base class and subclasses....
namespace ConsoleApplication1 {
class Element{
}
class Child1 : Element {
public Child1 Clone() { return new Child1(); /* copy state here */ }
}
class Child2<T> : Element {
public Child2<T> Clone() { return new Child2<T>(); /* copy state here */ }
}
class Program {
static void Main(string[] args) {
Dictionary<string, Element> d = new Dictionary<string, Element>();
d.Add("c1", new Child1());
d.Add("c2s", new Child2<string>());
d.Add("c2i", new Child2<int>());
foreach (KeyValuePair<string, Element> kvp in d) {
Element e = kvp.Value.Clone();
}
}
}
}
Is there a way or solution for my needs?
Thanks!
Anna
You could make Clone either abstract or virtual on the base type (Element) and override it in the derived types, but you can't change the return type when overriding, so it would have to be Element (nothing more specific). You can re-declare methods (new ...), but that gets messy, and you can't both override and new a method with the same name/signature in the same type.
But if you're happy for the return type to be Element...
abstract class Element{
public abstract Element Clone();
}
class Child1 : Element {
public override Element Clone() { return /* your code */; }
}
class Child2<T> : Element {
public override Element Clone() { return /* your code */; }
}
Since the type of .Value you get out of your dictionary is Element, you need to make sure Element defines all operations it should have, like your Clone method.
I would:
Make Clone virtual, and add it to Element (or make Element abstract, and Clone abstract instead of virtual)
Override Clone in both Child1 and Child2
This way, the code kvp.Value.Clone() would call the right Clone method depending on the object returned from the dictionary.
Don't create a class hierarchy just for the sake of being able to add different objects to one dictionary though.
If the classes don't have a decent enough hierarchical relationship, you would be better off using an interface like ICloneable, which is already available in the .NET framework.
Then, simply instantiate your Dictionary like:
Dictionary<string, ICloneable> d = new Dictionary<string, ICloneable>();
It's more flexible. Creating a hierarchy for the sake of the commonality of being able to perform Clone(), is not the right solution IMO.
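Applied to the classes from the question, that could look like the following sketch (the Clone bodies are placeholders, and CloneAll is just a hypothetical helper to show the iteration):

```csharp
using System;
using System.Collections.Generic;

class Child1 : ICloneable
{
    public object Clone() { return new Child1(); } // real copy logic goes here
}

class Child2<T> : ICloneable
{
    public object Clone() { return new Child2<T>(); }
}

static class CloneDemo
{
    // Iterate the dictionary and clone each value: no artificial
    // base class is needed, the interface is the shared contract.
    public static Dictionary<string, object> CloneAll(Dictionary<string, ICloneable> d)
    {
        var result = new Dictionary<string, object>();
        foreach (KeyValuePair<string, ICloneable> kvp in d)
        {
            result[kvp.Key] = kvp.Value.Clone();
        }
        return result;
    }
}
```

The trade-off is that Clone returns object, so callers must cast if they need the concrete type, which is the same limitation as the virtual-method approach above.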
Though I agree with Wim that implementing ICloneable is probably the better solution, rather than trying to enforce a non-existent class hierarchy, please be aware that ICloneable is considered a "bad API" because it does not specify whether it uses shallow- or deep-copy semantics (see for instance http://pro-thoughts.blogspot.com/2009/02/write-deep-clone-forget-about.html, or do a Google search for "ICloneable C# bad API").
Is there a way to have XmlSerializer ignore all members by default, unless I say otherwise?
I have a base class and several derived classes with lots of members, but most I do not want to be serialized. Only a select few are acceptable for serialization.
No, you cannot do this.
XmlSerializer uses an "opt-out" process - it will serialize everything (all public properties) unless you explicitly opt out by using the [XmlIgnore] attribute. There's no way of changing this behavior.
The .NET 3.5 DataContractSerializer on the other hand is taking the other approach - opt-in. It will not serialize anything, unless you specifically tell it to, by decorating your members with [DataMember].
So maybe the DataContractSerializer would work for you? It has a few more advantages (it doesn't require a parameterless constructor, it can serialize internal and private properties, too, and it can serialize fields instead of properties, if needed), and it's tuned for speed. There are some downsides, too - it doesn't support attributes on XML nodes - so you'll have to pick based on your requirements.
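To illustrate the opt-in behaviour (Person, Name, and Secret are made-up names for this sketch):

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;
using System.Text;

[DataContract]
public class Person
{
    [DataMember]                        // opted in: will be serialized
    public string Name { get; set; }

    public string Secret { get; set; }  // no [DataMember]: skipped entirely
}

public static class OptInDemo
{
    public static string ToXml(Person p)
    {
        var dcs = new DataContractSerializer(typeof(Person));
        using (var ms = new MemoryStream())
        {
            dcs.WriteObject(ms, p);
            return Encoding.UTF8.GetString(ms.ToArray());
        }
    }
}
```

Only the decorated member appears in the output; everything else is ignored by default, which is exactly the inverse of XmlSerializer's behaviour.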
There's a good comparison of the two by Dan Rigsby - check it out!
Marc
You could implement IXmlSerializable and determine yourself what you want serialized. Here is an example of object serialization. Check out this SO post about the proper way to implement IXmlSerializable. Here is an example of using IXmlSerializable for some collections.
It would look something like this:
using System;
using System.Xml;
using System.Xml.Schema;
using System.Xml.Serialization;
namespace ConsoleApplicationCSharp
{
public class ObjectToSerialize : IXmlSerializable
{
public string Value1;
public string Value2;
public string Value3;
public string ValueToSerialize;
public string Value4;
public string Value5;
public ObjectToSerialize() { }
public void WriteXml(System.Xml.XmlWriter writer)
{
writer.WriteElementString("Val", ValueToSerialize);
}
public void ReadXml(System.Xml.XmlReader reader)
{
    reader.MoveToContent();
    bool wasEmpty = reader.IsEmptyElement;
    reader.ReadStartElement(); // past the root element written by the serializer
    if (!wasEmpty)
    {
        // matches WriteXml, which writes <Val>...</Val> as a child element
        ValueToSerialize = reader.ReadElementContentAsString("Val", "");
        reader.ReadEndElement(); // consume the root element's end tag
    }
}
public XmlSchema GetSchema() { return (null); }
public static void Main(string[] args)
{
ObjectToSerialize t = new ObjectToSerialize();
t.ValueToSerialize = "Hello";
System.Xml.Serialization.XmlSerializer x = new XmlSerializer(typeof(ObjectToSerialize));
x.Serialize(Console.Out, t);
return;
}
}
}