I have hit a small snag when I use BinaryFormatter to serialize objects. The whole point of the serialization is so that the values can be passed into a hash function, which requires a byte array.
The process I have is: read the file, use Newtonsoft's JsonConvert to turn the JSON file into a POCO object, do some checks, update values as required, and save back to the same file in JSON format.
The checks include verifying that the hash value stored in the file matches the one regenerated at the beginning of the process. The steps I take are: read the file, convert to a POCO, serialize it with BinaryFormatter, generate the hash value, compare both values, and if they match, update the data and save both the new hash value and the object as JSON back into the file.
However, I have hit a snag when I serialize the object using BinaryFormatter. If the object has properties whose values are the same, the byte output from the serializer differs between when the data is read in from the file and when it is written out. Because the byte arrays differ, so do the hash values. Conversely, if the property values are different, the same hash values are generated both times and there is no issue.
My question is: why does having the same property values cause the byte output to differ between reading the object and writing it to the file?
[Serializable]
public class UserAuthorisationData
{
public string surname { get; set; }
public string forename { get; set; }
public string initials { get; set; }
public UserAuthorisationData()
{
surname = "";
forename = "";
initials = "";
}
}
E.g.
var objectA = new UserAuthorisationData();
objectA.surname = "Fred";
objectA.forename = "John";
objectA.initials = "FJ";
var objectB = new UserAuthorisationData();
objectB.surname = "John";
objectB.forename = "John";
objectB.initials = "JJ";
In the example above, objectA produces the same byte array, and therefore the same hash value, both when the data is written out and when the file is read back in.
For objectB, however, the byte arrays differ.
Method below to convert object to byte:
protected virtual byte[] ObjectToByteArray(object objectToSerialize)
{
    BinaryFormatter formatter = new BinaryFormatter();
    // using ensures the stream is disposed; the original try block had no catch/finally
    using (MemoryStream fs = new MemoryStream())
    {
        lock (locker)
        {
            formatter.Serialize(fs, objectToSerialize);
            Logger.Debug($"Successfully converted object to bytes for serialization.");
        }
        File.WriteAllBytes(@"C:\ali.kamal\User1.txt", fs.ToArray());
        return fs.ToArray();
    }
}
Call the method on the object
ObjectToByteArray(objectA);
ObjectToByteArray(objectB);
Update 1
Thanks Hadi. The hash is generated using Microsoft's HMACSHA256.ComputeHash method.
protected override string ComputeHash(byte[] objectAsBytes, byte[] key)
{
    byte[] hashValue = null;
    try
    {
        using (HMACSHA256 hmac = new HMACSHA256(key))
        {
            hashValue = hmac.ComputeHash(objectAsBytes);
        }
    }
    catch (Exception ex)
    {
        EventLogger.LogEvent($"Could not generate SHA256 hash value, exception:{ex.Message}", EventEntryType.Error);
        // return early so a logged failure does not fall through to Convert.ToBase64String(null)
        return null;
    }
    return Convert.ToBase64String(hashValue);
}
e.g.
string hashvalue = ComputeHash(ObjectToByteArray(objectA), ObjectToByteArray("abcd"));
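Since the surrounding pipeline already round-trips the data through Newtonsoft's JsonConvert, one way to sidestep BinaryFormatter's byte-level instability altogether is to hash the UTF-8 bytes of the serialized JSON instead. This is only a sketch of that idea, not code from the question; it assumes the same serializer settings are used on the read and write side so the JSON text comes out identical both times:
using System;
using System.Security.Cryptography;
using System.Text;
using Newtonsoft.Json;

// Sketch: hash the JSON representation rather than the BinaryFormatter output.
// For the same object graph and the same serializer settings, the JSON string
// is the same on every run, so the HMAC is stable across read/write cycles.
static string ComputeJsonHash(object obj, byte[] key)
{
    string json = JsonConvert.SerializeObject(obj);
    byte[] payload = Encoding.UTF8.GetBytes(json);
    using (var hmac = new HMACSHA256(key))
    {
        return Convert.ToBase64String(hmac.ComputeHash(payload));
    }
}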
I would suggest comparing the bytes in a different way rather than via a hash code: Array1.SequenceEqual(Array2). It's not clear from your question how you are comparing the hash codes of the two arrays, but I will assume you are using the GetHashCode method; Array1.GetHashCode() == Array2.GetHashCode() is not suitable for your case unless you override GetHashCode. Below is a sample console application using the SequenceEqual method:
using System;
using System.IO;
using System.Linq;
using System.Runtime.Serialization.Formatters.Binary;
namespace ConsoleApp1
{
class Program
{
static void Main(string[] args)
{
var a = new UserData();
var b = new UserData();
var aS = ObjectToByte(a);
var bS = ObjectToByte(b);
Console.WriteLine("A : {0}", a);
Console.WriteLine("B : {0}", b);
// result for empty objects are equal
Console.WriteLine("A == B ? => {0} \n\n", aS.SequenceEqual(bS));
a = new UserData()
{
ForeName = "A",
Initials = "CC",
SurName = "B",
};
b = new UserData()
{
ForeName = "AX",
Initials = "CC",
SurName = "B",
};
aS = ObjectToByte(a);
bS = ObjectToByte(b);
Console.WriteLine("A : {0}", a);
Console.WriteLine("B : {0}", b);
// objects with different property values serialize to different bytes, so not equal
Console.WriteLine("A == B ? => {0} \n\n", aS.SequenceEqual(bS));
a = new UserData()
{
ForeName = "AX",
Initials = "CC",
SurName = "B",
};
b = a;
aS = ObjectToByte(a);
bS = ObjectToByte(b);
Console.WriteLine("A : {0}", a);
Console.WriteLine("B : {0}", b);
// b references the same object as a, so the serialized bytes are equal
Console.WriteLine("A == B ? => {0} \n\n", aS.SequenceEqual(bS));
}
static byte[] ObjectToByte(object item)
{
var formatter = new BinaryFormatter();
using (var memory = new MemoryStream())
{
formatter.Serialize(memory, item);
return memory.ToArray();
}
}
}
[Serializable]
public class UserData
{
public string SurName { get; set; }
public string ForeName { get; set; }
public string Initials { get; set; }
public override string ToString()
{
return string.Format("{{SurName: {0} , ForeName:{1}, Initials:{2}}}", SurName ?? "Empty", ForeName ?? "Empty", Initials ?? "Empty");
}
}
}
Here is a working Demo.
Hope this will help you
Related
I am looking for a fast and simple way to implement this paradigm:
MyByteArray mb = new MyByteArray();
mb.Add<byte>(bytevalue);
mb.Add<float>(floatvalue);
mb.Add<string>(str);
mb.Add<MyClass>(object);
And then get a byte[] from mb to send it as a byte packet via an RPC call (to be decoded on the other side using the same technique).
I've found MemoryStream, but it looks like too much overhead for this simple operation.
Can you help me? Thank you.
What you are looking for is BinaryWriter. It still needs a Stream to write to, for purely logical reasons, and the only Stream that fits your need is MemoryStream.
Worried about performance overhead? You can create your MemoryStream from an existing byte array:
byte [] buffer = new byte[1024];
using (var memoryStream = new MemoryStream(buffer))
{
using (var binaryWriter = new BinaryWriter(memoryStream))
{
binaryWriter.Write(1.2F); // float
binaryWriter.Write(1.9D); // double
binaryWriter.Write(1); // integer
binaryWriter.Write("str"); // string
}
}
// buffer is filled with your data now.
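Since the packet is meant "to be decoded on the other side using the same technique", the mirror operation is a BinaryReader over a MemoryStream wrapping the received bytes. A short sketch (the reads must be in the same order and of the same types as the writes above):
using (var memoryStream = new MemoryStream(buffer))
using (var binaryReader = new BinaryReader(memoryStream))
{
    float f = binaryReader.ReadSingle();   // 1.2F
    double d = binaryReader.ReadDouble();  // 1.9D
    int i = binaryReader.ReadInt32();      // 1
    string s = binaryReader.ReadString();  // "str"
}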
A tricky way to achieve this is to use a combination of built-in classes in .NET:
class Program
{
static void Main(string[] args)
{
Program program = new Program();
var listBytes = new List<byte>();
listBytes.AddRange(program.CastToBytes("test"));
listBytes.AddRange(program.CastToBytes(5));
}
Note
For a custom object you have to define an implicit operator (or other rules) for how the properties, or the whole object, should be converted.
public byte[] CastToBytes<T>(T value)
{
//this will cover most of primitive types
if (typeof(T).IsValueType)
{
return BitConverter.GetBytes((dynamic)value);
}
if (typeof(T) == typeof(string))
{
return Encoding.UTF8.GetBytes((dynamic) value);
}
//for a custom object you have to define the rules
else
{
var formatter = new BinaryFormatter();
using (var memoryStream = new MemoryStream())
{
    formatter.Serialize(memoryStream, value);
    // ToArray (not GetBuffer) so trailing unused buffer capacity is not included
    return memoryStream.ToArray();
}
}
}
}
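As an illustration of the note above, a custom type could expose an implicit conversion to byte[]. The Point3 class here is made up purely for the example; the layout (three little-endian ints) is an arbitrary choice:
using System;
using System.Collections.Generic;

// Hypothetical type, only to illustrate the "implicit operator" idea.
public class Point3
{
    public int X;
    public int Y;
    public int Z;

    // Defines how this object is converted to bytes.
    public static implicit operator byte[](Point3 p)
    {
        var bytes = new List<byte>();
        bytes.AddRange(BitConverter.GetBytes(p.X));
        bytes.AddRange(BitConverter.GetBytes(p.Y));
        bytes.AddRange(BitConverter.GetBytes(p.Z));
        return bytes.ToArray();
    }
}

// Usage (the conversion kicks in on assignment):
//   byte[] raw = new Point3 { X = 1, Y = 2, Z = 3 };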
This looks like a case for Protocol Buffers; you could look at protobuf-net.
First, let's decorate the classes.
[ProtoContract]
class User
{
[ProtoMember(1)]
public int Id { get; set; }
[ProtoMember(2)]
public string Name { get; set; }
}
[ProtoContract]
class Message
{
[ProtoMember(1)]
public byte Type { get; set; }
[ProtoMember(2)]
public float Value { get; set; }
[ProtoMember(3)]
public User Sender { get; set; }
}
Then we create our message.
var msg = new Message
{
Type = 1,
Value = 1.1f,
Sender = new User
{
Id = 8,
Name = "user"
}
};
And now, we can use ProtoBuf's serializer to do all our work.
// memory
using (var mem = new MemoryStream())
{
Serializer.Serialize<Message>(mem, msg);
var bytes = mem.ToArray(); // ToArray rather than GetBuffer, so unused buffer capacity is not included
}
// file
using (var file = File.Create("message.bin")) Serializer.Serialize<Message>(file, msg);
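Reading the message back is symmetric. A brief sketch using Serializer.Deserialize<T>, assuming the same Message/User contracts as above:
// round-trip from a byte[] (e.g. the bytes captured above)
using (var mem = new MemoryStream(bytes))
{
    Message received = Serializer.Deserialize<Message>(mem);
}

// or straight from the file written above
using (var file = File.OpenRead("message.bin"))
{
    Message fromFile = Serializer.Deserialize<Message>(file);
}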
Previously, we serialized a property as a List<byte>.
Now we want to change it to be a byte[].
It was our understanding that you should be able to swap out collection types freely between versions, but we get a ProtoBuf.ProtoException:
[TestFixture, Category("Framework")]
class CollectionTypeChange
{
[Test]
public void TestRoundTrip()
{
var bytes = new List<byte>() {1,2,4};
var a = new ArrayHolder(bytes);
var aCopy = Deserialize<ArrayHolder>(Serialize(a));
//Passes
Assert.That(aCopy.CollectionOfBytes, Is.EquivalentTo(a.CollectionOfBytes));
}
[Test]
public void TestChangeArrayToList()
{
var bytes = new List<byte>() { 1, 2, 4 };
var a = new ArrayHolder(bytes);
var aCopy = Deserialize<ListHolder>(Serialize(a));
//Passes
Assert.That(aCopy.CollectionOfBytes, Is.EquivalentTo(a.CollectionOfBytes));
}
[Test]
public void TestChangeListToArray()
{
var bytes = new List<byte>() { 1, 2, 4 };
var a = new ListHolder(bytes);
//Throws: ProtoBuf.ProtoException : Invalid wire-type; this usually means you have over-written a file without truncating or setting the length; see http://stackoverflow.com/q/2152978/23354
var aCopy = Deserialize<ArrayHolder>(Serialize(a));
Assert.That(aCopy.CollectionOfBytes, Is.EquivalentTo(a.CollectionOfBytes));
}
public static byte[] Serialize<T>(T obj)
{
using (var stream = new MemoryStream())
{
Serializer.Serialize(stream, obj);
return stream.ToArray();
}
}
public static T Deserialize<T>(byte[] buffer)
{
using (var stream = new MemoryStream(buffer))
{
return Serializer.Deserialize<T>(stream);
}
}
}
[ProtoContract]
internal class ArrayHolder
{
private ArrayHolder()
{
CollectionOfBytes = new byte[0] {};
}
internal ArrayHolder(IEnumerable<byte> bytesToUse )
{
CollectionOfBytes = bytesToUse.ToArray();
}
[ProtoMember(1)]
public byte[] CollectionOfBytes { get; set; }
}
[ProtoContract]
internal class ListHolder
{
private ListHolder()
{
CollectionOfBytes = new List<byte>();
}
internal ListHolder(IEnumerable<byte> bytesToUse)
{
CollectionOfBytes = bytesToUse.ToList();
}
[ProtoMember(1)]
public List<byte> CollectionOfBytes { get; set; }
}
Is there something special about arrays, or bytes, that means this doesn't work the way we expected?
This looks to be a problem specifically with byte[] properties. If I change the property types to int [] and List<int> the behavior is not reproducible. The problem arises from the fact that there are two ways to encode an array in a Protocol Buffer: as repeated key/value pairs or "packed" as a single key with a length-delimited block of values.
For byte arrays, protobuf-net uses a special serializer, BlobSerializer, which simply writes the byte array length then block-copies the contents into the output buffer as a packed repeated field. It does the reverse operation when reading -- not handling the case when the data is actually in repeated key/value format.
On the other hand, List<byte> is serialized using the general-purpose ListDecorator. Its Read() method tests to see the format currently in the input buffer and reads it appropriately -- either packed or unpacked. Its Write() method, however, writes the byte array unpacked by default. Subsequently, when reading the buffer into a byte [] array, BlobSerializer throws an exception because the format is not as expected. Arguably this is a bug with protobuf-net's BlobSerializer.
There is, however, a straightforward workaround: state that the List<byte> should be serialized in packed format by setting IsPacked = true:
[ProtoContract]
internal class ListHolder
{
private ListHolder()
{
CollectionOfBytes = new List<byte>();
}
internal ListHolder(IEnumerable<byte> bytesToUse)
{
CollectionOfBytes = bytesToUse.ToList();
}
[ProtoMember(1, IsPacked = true)]
public List<byte> CollectionOfBytes { get; set; }
}
This should be a more compact representation for your list of bytes as well.
Unfortunately, the above workaround fails when the byte collection contains bytes with the high bit set. Protobuf-net serializes a packed List<byte> as a length-delimited sequence of Base 128 Varints. Thus when a byte with its high bit set is serialized, it is encoded as two bytes. On the other hand a byte [] member is serialized like a string as a length-delimited sequence of raw bytes. Thus one byte in the byte array is always encoded as byte in the encoding - which is incompatible with the encoding for List<byte>.
As a workaround, one could use a private surrogate List<byte> property in the ArrayHolder type:
[ProtoContract]
internal class ArrayHolder
{
private ArrayHolder()
{
CollectionOfBytes = new byte[0] { };
}
internal ArrayHolder(IEnumerable<byte> bytesToUse)
{
CollectionOfBytes = bytesToUse.ToArray();
}
[ProtoIgnore]
public byte[] CollectionOfBytes { get; set; }
[ProtoMember(1, OverwriteList = true)]
List<byte> ListOfBytes
{
get
{
if (CollectionOfBytes == null)
return null;
return new List<byte>(CollectionOfBytes);
}
set
{
if (value == null)
return;
CollectionOfBytes = value.ToArray();
}
}
}
Sample fiddle.
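Using the Serialize/Deserialize helpers from the test fixture above, a round trip through the surrogate-property version could look like this sketch (0xFF has the high bit set, the case that broke the IsPacked approach):
var holder = new ArrayHolder(new byte[] { 1, 2, 0xFF });
ArrayHolder copy = Deserialize<ArrayHolder>(Serialize(holder));
// copy.CollectionOfBytes is { 1, 2, 255 }; on the wire, field 1 carries the
// private ListOfBytes surrogate property rather than the byte[] itself.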
Alternatively, one could replace the ArrayHolder with a ListHolder during (de)serialization by using MetaType.SetSurrogate() as shown for instance in this answer.
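A rough sketch of that surrogate-type route follows. This is the general pattern, not code from the linked answer; the ListHolderSurrogate class and its conversion operators are assumptions for the example:
using System.Collections.Generic;
using ProtoBuf;
using ProtoBuf.Meta;

[ProtoContract]
internal class ListHolderSurrogate
{
    [ProtoMember(1)]
    public List<byte> CollectionOfBytes { get; set; }

    // Conversions protobuf-net uses when swapping the real type for the surrogate.
    public static implicit operator ArrayHolder(ListHolderSurrogate s)
        => s == null ? null : new ArrayHolder(s.CollectionOfBytes ?? new List<byte>());

    public static implicit operator ListHolderSurrogate(ArrayHolder a)
        => a == null ? null : new ListHolderSurrogate { CollectionOfBytes = new List<byte>(a.CollectionOfBytes ?? new byte[0]) };
}

internal static class ProtoConfig
{
    // Call once at startup, before any serialization takes place.
    public static void RegisterSurrogate()
    {
        RuntimeTypeModel.Default.Add(typeof(ArrayHolder), false)
            .SetSurrogate(typeof(ListHolderSurrogate));
    }
}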
I am using MultiValueDictionary<string, string> in my project (C# - VS2012 - .NET 4.5), which is a great help if you want to have multiple values for each key, but I can't serialize this object with protobuf-net.
I have serialized Dictionary<string, string> with protobuf-net with ease and speed, and MultiValueDictionary inherits from that generic type, so logically there should be no problem serializing it with the same protocol.
Does anyone know a workaround?
This is the error message when I execute my code:
System.InvalidOperationException: Unable to resolve a suitable Add
method for System.Collections.Generic.IReadOnlyCollection
Do you really need a dictionary?
If you have fewer than 10,000 items in your dictionary, you can also use a modified list of a data type:
public class YourDataType
{
public string Key;
public string Value1;
public string Value2;
// add some various data here...
}
public class YourDataTypeCollection : List<YourDataType>
{
public YourDataType this[string key]
{
get
{
return this.FirstOrDefault(o => o.Key == key);
}
set
{
YourDataType old = this[key];
if (old != null)
{
int index = this.IndexOf(old);
this.RemoveAt(index);
this.Insert(index, value);
}
else
{
this.Add(value);
}
}
}
}
Use the list like this:
YourDataTypeCollection data = new YourDataTypeCollection();
// add some values like this:
data.Add(new YourDataType() { Key = "key", Value1 = "foo", Value2 = "bar" });
// or like this:
data["key2"] = new YourDataType() { Key = "key2", Value1 = "hello", Value2 = "world" };
// or implement your own method to adding data in the YourDataTypeCollection class
XmlSerializer xser = new XmlSerializer(typeof(YourDataTypeCollection));
// to export data
using (FileStream fs = File.Create("YourFile.xml"))
{
xser.Serialize(fs, data);
}
// to import data
using (FileStream fs = File.Open("YourFile.xml", FileMode.Open))
{
data = (YourDataTypeCollection)xser.Deserialize(fs);
}
string value1 = data["key"].Value1;
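Since the original question was about protobuf-net, the same flattened shape can also be serialized with it directly; lists of a contract type are supported out of the box. A sketch (the attribute numbering and file name are arbitrary here):
[ProtoContract]
public class YourDataType
{
    [ProtoMember(1)] public string Key;
    [ProtoMember(2)] public string Value1;
    [ProtoMember(3)] public string Value2;
}

// A plain List<YourDataType> serializes as a repeated field; no dictionary support needed.
var protoData = new List<YourDataType>
{
    new YourDataType { Key = "key", Value1 = "foo", Value2 = "bar" }
};

using (var fs = File.Create("YourFile.bin"))
{
    Serializer.Serialize(fs, protoData);
}

List<YourDataType> roundTripped;
using (var fs = File.OpenRead("YourFile.bin"))
{
    roundTripped = Serializer.Deserialize<List<YourDataType>>(fs);
}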
I'm using JSON to store certain settings within my application. Some of the settings contain sensitive information (e.g. passwords) while other settings are not sensitive. Ideally I'd like to be able to serialize my objects where the sensitive properties are encrypted automatically while keeping the non-sensitive settings readable. Is there a way to do this using Json.Net? I did not see any setting related to encryption.
Json.Net does not have built-in encryption. If you want to be able to encrypt and decrypt during the serialization process, you will need to write some custom code. One approach is to use a custom IContractResolver in conjunction with an IValueProvider. The value provider gives you a hook where you can transform values within the serialization process, while the contract resolver gives you control over when and where the value provider gets applied. Together, they can give you the solution you are looking for.
Below is an example of the code you would need. First off, you'll notice I've defined a new [JsonEncrypt] attribute; this will be used to indicate which properties you want to be encrypted. The EncryptedStringPropertyResolver class extends the DefaultContractResolver provided by Json.Net. I've overridden the CreateProperties() method so that I can inspect the JsonProperty objects created by the base resolver and attach an instance of my custom EncryptedStringValueProvider to any string properties which have the [JsonEncrypt] attribute applied. The EncryptedStringValueProvider later handles the actual encryption/decryption of the target string properties via the respective GetValue() and SetValue() methods.
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Reflection;
using System.Security.Cryptography;
using System.Text;
using Newtonsoft.Json;
using Newtonsoft.Json.Serialization;
[AttributeUsage(AttributeTargets.Property)]
public class JsonEncryptAttribute : Attribute
{
}
public class EncryptedStringPropertyResolver : DefaultContractResolver
{
private byte[] encryptionKeyBytes;
public EncryptedStringPropertyResolver(string encryptionKey)
{
if (encryptionKey == null)
throw new ArgumentNullException("encryptionKey");
// Hash the key to ensure it is exactly 256 bits long, as required by AES-256
using (SHA256Managed sha = new SHA256Managed())
{
this.encryptionKeyBytes =
sha.ComputeHash(Encoding.UTF8.GetBytes(encryptionKey));
}
}
protected override IList<JsonProperty> CreateProperties(Type type, MemberSerialization memberSerialization)
{
IList<JsonProperty> props = base.CreateProperties(type, memberSerialization);
// Find all string properties that have a [JsonEncrypt] attribute applied
// and attach an EncryptedStringValueProvider instance to them
foreach (JsonProperty prop in props.Where(p => p.PropertyType == typeof(string)))
{
PropertyInfo pi = type.GetProperty(prop.UnderlyingName);
if (pi != null && pi.GetCustomAttribute(typeof(JsonEncryptAttribute), true) != null)
{
prop.ValueProvider =
new EncryptedStringValueProvider(pi, encryptionKeyBytes);
}
}
return props;
}
class EncryptedStringValueProvider : IValueProvider
{
PropertyInfo targetProperty;
private byte[] encryptionKey;
public EncryptedStringValueProvider(PropertyInfo targetProperty, byte[] encryptionKey)
{
this.targetProperty = targetProperty;
this.encryptionKey = encryptionKey;
}
// GetValue is called by Json.Net during serialization.
// The target parameter has the object from which to read the unencrypted string;
// the return value is an encrypted string that gets written to the JSON
public object GetValue(object target)
{
string value = (string)targetProperty.GetValue(target);
byte[] buffer = Encoding.UTF8.GetBytes(value);
using (MemoryStream inputStream = new MemoryStream(buffer, false))
using (MemoryStream outputStream = new MemoryStream())
using (AesManaged aes = new AesManaged { Key = encryptionKey })
{
byte[] iv = aes.IV; // first access generates a new IV
outputStream.Write(iv, 0, iv.Length);
outputStream.Flush();
ICryptoTransform encryptor = aes.CreateEncryptor(encryptionKey, iv);
using (CryptoStream cryptoStream = new CryptoStream(outputStream, encryptor, CryptoStreamMode.Write))
{
inputStream.CopyTo(cryptoStream);
}
return Convert.ToBase64String(outputStream.ToArray());
}
}
// SetValue gets called by Json.Net during deserialization.
// The value parameter has the encrypted value read from the JSON;
// target is the object on which to set the decrypted value.
public void SetValue(object target, object value)
{
byte[] buffer = Convert.FromBase64String((string)value);
using (MemoryStream inputStream = new MemoryStream(buffer, false))
using (MemoryStream outputStream = new MemoryStream())
using (AesManaged aes = new AesManaged { Key = encryptionKey })
{
byte[] iv = new byte[16];
int bytesRead = inputStream.Read(iv, 0, 16);
if (bytesRead < 16)
{
throw new CryptographicException("IV is missing or invalid.");
}
ICryptoTransform decryptor = aes.CreateDecryptor(encryptionKey, iv);
using (CryptoStream cryptoStream = new CryptoStream(inputStream, decryptor, CryptoStreamMode.Read))
{
cryptoStream.CopyTo(outputStream);
}
string decryptedValue = Encoding.UTF8.GetString(outputStream.ToArray());
targetProperty.SetValue(target, decryptedValue);
}
}
}
}
Once you have the resolver in place, the next step is to apply the custom [JsonEncrypt] attribute to the string properties within your classes that you wish to be encrypted during serialization. For example, here is a contrived class that might represent a user:
public class UserInfo
{
public string UserName { get; set; }
[JsonEncrypt]
public string UserPassword { get; set; }
public string FavoriteColor { get; set; }
[JsonEncrypt]
public string CreditCardNumber { get; set; }
}
The last step is to inject the custom resolver into the serialization process. To do that, create a new JsonSerializerSettings instance, then set the ContractResolver property to a new instance of the custom resolver. Pass the settings to the JsonConvert.SerializeObject() or DeserializeObject() methods and everything should just work.
Here is a round-trip demo:
public class Program
{
public static void Main(string[] args)
{
try
{
UserInfo user = new UserInfo
{
UserName = "jschmoe",
UserPassword = "Hunter2",
FavoriteColor = "atomic tangerine",
CreditCardNumber = "1234567898765432",
};
// Note: in production code you should not hardcode the encryption
// key into the application-- instead, consider using the Data Protection
// API (DPAPI) to store the key. .Net provides access to this API via
// the ProtectedData class.
JsonSerializerSettings settings = new JsonSerializerSettings();
settings.Formatting = Formatting.Indented;
settings.ContractResolver = new EncryptedStringPropertyResolver("My-Sup3r-Secr3t-Key");
Console.WriteLine("----- Serialize -----");
string json = JsonConvert.SerializeObject(user, settings);
Console.WriteLine(json);
Console.WriteLine();
Console.WriteLine("----- Deserialize -----");
UserInfo user2 = JsonConvert.DeserializeObject<UserInfo>(json, settings);
Console.WriteLine("UserName: " + user2.UserName);
Console.WriteLine("UserPassword: " + user2.UserPassword);
Console.WriteLine("FavoriteColor: " + user2.FavoriteColor);
Console.WriteLine("CreditCardNumber: " + user2.CreditCardNumber);
}
catch (Exception ex)
{
Console.WriteLine(ex.GetType().Name + ": " + ex.Message);
}
}
}
Output:
----- Serialize -----
{
"UserName": "jschmoe",
"UserPassword": "sK2RvqT6F61Oib1ZittGBlv8xgylMEHoZ+1TuOeYhXQ=",
"FavoriteColor": "atomic tangerine",
"CreditCardNumber": "qz44JVAoJEFsBIGntHuPIgF1sYJ0uyYSCKdYbMzrmfkGorxgZMx3Uiv+VNbIrbPR"
}
----- Deserialize -----
UserName: jschmoe
UserPassword: Hunter2
FavoriteColor: atomic tangerine
CreditCardNumber: 1234567898765432
Fiddle: https://dotnetfiddle.net/trsiQc
Though Brian's solution is quite clever, I don't like the complexity of a custom ContractResolver. I converted Brian's code to a JsonConverter, so your code would become
public class UserInfo
{
public string UserName { get; set; }
[JsonConverter(typeof(EncryptingJsonConverter), "My-Sup3r-Secr3t-Key")]
public string UserPassword { get; set; }
public string FavoriteColor { get; set; }
[JsonConverter(typeof(EncryptingJsonConverter), "My-Sup3r-Secr3t-Key")]
public string CreditCardNumber { get; set; }
}
I've posted the (quite lengthy) EncryptingJsonConverter as a Gist and also blogged about it.
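The Gist is the authoritative version; purely to illustrate the shape such a converter takes, here is a skeleton (this is not the Gist's code, and the two private methods are placeholders for AES logic like the GetValue/SetValue code in the resolver answer above):
using System;
using Newtonsoft.Json;

public class EncryptingJsonConverter : JsonConverter
{
    private readonly string _encryptionKey;

    // Json.NET passes the second [JsonConverter] argument ("My-Sup3r-Secr3t-Key") here.
    public EncryptingJsonConverter(string encryptionKey)
    {
        _encryptionKey = encryptionKey ?? throw new ArgumentNullException(nameof(encryptionKey));
    }

    public override bool CanConvert(Type objectType) => objectType == typeof(string);

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        writer.WriteValue(Encrypt((string)value));
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        return Decrypt(reader.Value as string);
    }

    // Placeholders: the real implementation encrypts/decrypts with AES using _encryptionKey.
    private string Encrypt(string plainText) => throw new NotImplementedException();
    private string Decrypt(string cipherText) => throw new NotImplementedException();
}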
My solution:
public string PasswordEncrypted { get; set; }
[JsonIgnore]
public string Password
{
get
{
var encrypted = Convert.FromBase64String(PasswordEncrypted);
var data = ProtectedData.Unprotect(encrypted, AdditionalEntropy, DataProtectionScope.LocalMachine);
var res = Encoding.UTF8.GetString(data);
return res;
}
set
{
var data = Encoding.UTF8.GetBytes(value);
var encrypted = ProtectedData.Protect(data, AdditionalEntropy, DataProtectionScope.LocalMachine);
PasswordEncrypted = Convert.ToBase64String(encrypted);
}
}
(can be made less verbose)
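For context, this is what the property pair could look like wrapped in a minimal containing class. The AdditionalEntropy value is an arbitrary example (it just has to be the same for Protect and Unprotect), the class and its Server property are invented for illustration, and ProtectedData needs a reference to System.Security:
using System;
using System.Security.Cryptography;
using System.Text;
using Newtonsoft.Json;

public class ConnectionSettings
{
    // Arbitrary extra entropy for DPAPI; any constant byte[] works.
    private static readonly byte[] AdditionalEntropy = { 4, 8, 15, 16, 23, 42 };

    public string Server { get; set; }

    public string PasswordEncrypted { get; set; }

    [JsonIgnore]
    public string Password
    {
        get
        {
            var encrypted = Convert.FromBase64String(PasswordEncrypted);
            var data = ProtectedData.Unprotect(encrypted, AdditionalEntropy, DataProtectionScope.LocalMachine);
            return Encoding.UTF8.GetString(data);
        }
        set
        {
            var data = Encoding.UTF8.GetBytes(value);
            var encrypted = ProtectedData.Protect(data, AdditionalEntropy, DataProtectionScope.LocalMachine);
            PasswordEncrypted = Convert.ToBase64String(encrypted);
        }
    }
}

// Only Server and PasswordEncrypted appear in the JSON:
// var json = JsonConvert.SerializeObject(new ConnectionSettings { Server = "db1", Password = "Hunter2" });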
I want to be able to save any C# object in a single column of a SQL database table. I am not clear how to convert the object into a varbinary or get it back from a varbinary. My SystemContextObjects table has an OptionValue column that is Varbinary(max).
var dc1 = new DataContextDataContext();
var optionUpdate = dc1.SystemContextObjects.SingleOrDefault(o => o.OptionId == OptionId && o.OptionKey == OptionKey);
if (optionUpdate != null)
{
optionUpdate.OptionValue = Value; <===== NEED HELP HERE...
optionUpdate.DateCreated = DateTime.Now;
optionUpdate.PageName = PageName;
var ChangeSet = dc1.GetChangeSet();
if (ChangeSet.Updates.Count > 0)
{
dc1.SubmitChanges();
return;
}
}
You can use a binary serializer for this, e.g. the BinaryFormatter - but your classes must be serializable and marked as such. Here is a simple example:
You have a simple Person class and mark it as serializable:
[Serializable]
public class Person
{
public string Name { get; set; }
public string Address { get; set; }
}
You can then serialize it using a memory stream to extract a byte array representing the object:
Person p = new Person() { Name = "Fred Fish", Address = "2 Some Place" };
using (MemoryStream ms = new MemoryStream())
{
BinaryFormatter formatter = new BinaryFormatter();
formatter.Serialize(ms, p);
ms.Position = 0;
byte[] personData = ms.ToArray(); // here's your data!
}
To re-create a Person object from a byte array you use deserialization, which works similarly:
byte[] personData = ...
using (MemoryStream ms = new MemoryStream(personData))
{
BinaryFormatter formatter = new BinaryFormatter();
Person p = (Person)formatter.Deserialize(ms);
}
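Tying this back to the snippet in the question, the byte array from the serializer is what gets assigned to the varbinary(max) column. A sketch stitching the two snippets together; it assumes the Linq to SQL designer mapped OptionValue as System.Data.Linq.Binary, which converts implicitly from byte[]:
byte[] personData;
using (var ms = new MemoryStream())
{
    new BinaryFormatter().Serialize(ms, p);
    personData = ms.ToArray();
}

// Binary has an implicit conversion from byte[], so the assignment just works.
optionUpdate.OptionValue = personData;
optionUpdate.DateCreated = DateTime.Now;
dc1.SubmitChanges();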
I wound up using JSON to accomplish this. I serialize/deserialize the class to/from a string and store that. Works fine.