What is the best way to copy a BindingList?
Just use ForEach()? Or are there better ways?
BindingList<T> has a constructor which can take an IList<T>, and BindingList<T> itself implements IList<T>. So you can just do the following:
BindingList<T> newBL = new BindingList<T>(oldBL);
Of course that creates a second list that just points at the same objects. If you actually want to clone the objects in the list then you have to do more work.
A foreach loop is pretty much the easiest way, and the performance overhead is minimal, if any.
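For example, a minimal sketch of the foreach approach (MyItem is an illustrative element type; oldBL is the source list from above):
BindingList<MyItem> copy = new BindingList<MyItem>();
foreach (MyItem item in oldBL)
{
    copy.Add(item); // copies references only; the elements themselves are shared
}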
From a deleted answer:
Serialize the object then de-serialize
to get a deep cloned non referenced
copy
Which is a valid option if the OP wants a deep copy.
We use the Serialize / De-serialize route to get a deep copy of the list. It works well but it does slow performance down in larger lists, such as for search screens, so I'd avoid using it on lists with 5000+ items.
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

namespace ProjectName.LibraryName.Namespace
{
    internal static class ObjectCloner
    {
        /// <summary>
        /// Clones an object by using the <see cref="BinaryFormatter"/>.
        /// </summary>
        /// <param name="obj">The object to clone.</param>
        /// <remarks>
        /// The object to be cloned must be serializable.
        /// </remarks>
        public static object Clone(object obj)
        {
            using (MemoryStream buffer = new MemoryStream())
            {
                BinaryFormatter formatter = new BinaryFormatter();
                formatter.Serialize(buffer, obj);
                buffer.Position = 0;
                object temp = formatter.Deserialize(buffer);
                return temp;
            }
        }
    }
}
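A hypothetical usage for the original question would be to deep-copy each element and rebuild the list (MyItem stands in for an illustrative [Serializable] element type):
BindingList<MyItem> deepCopy = new BindingList<MyItem>();
foreach (MyItem item in oldBL)
{
    deepCopy.Add((MyItem)ObjectCloner.Clone(item)); // each element is a fully independent copy
}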
Related
I've seen this type of question asked before, but I'm not sure what the root cause of the problem is or how to fix it.
I am modifying an existing class so that it can load data into its member variables from flash. Right now, the class loads data from a file through the Load function. This function has been overloaded to take in a byte array.
The data read back from the flash is put into this byte array.
The error that is thrown (it happens at the line ... = formatter.Deserialize(stream)) is:
The input stream is not a valid binary format. The starting contents (in bytes) are: 93-E3-E6-3F-C3-F5-E4-41-00-C0-8D-C3-14-EE-4A-C3-00 ...
The interesting thing here is that the contents are exactly the contents of the byte array that is being passed into the stream. In other words, this is the data from the flash and this is exactly what I want serialized. I'm not sure why the error is being thrown.
Or, a better question: what is a valid binary format for a BinaryFormatter? Does it need a certain size? Is there a specific end value needed? Are certain values invalid? The current size of the byte array input is 24 bytes.
Code:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Drawing;
using System.Drawing.Drawing2D;
using System.Drawing.Imaging;
using System.Windows.Media.Imaging;
using System.IO;
using Galileo.Data;
using System.Xml.Serialization;
using System.Reflection;
using System.Runtime.InteropServices;
using UComm;
using System.Runtime.Serialization.Formatters.Binary;
using ULog;
public void Load(byte[] spCalfromPrimary)
{
    try
    {
        Type settingsType = this.GetType();
        object tmp = Activator.CreateInstance(settingsType);
        Stream stream = new MemoryStream();
        stream.Write(spCalfromPrimary, 0, spCalfromPrimary.Length);
        stream.Position = 0;
        BinaryFormatter formatter = new BinaryFormatter();
        //tmp = formatter.Deserialize(stream);
        formatter.Deserialize(stream); // <--- throws error here
        // Use reflection to copy all public properties from the temporary object into this one.
        PropertyInfo[] properties = settingsType.GetProperties();
        foreach (PropertyInfo property in properties)
        {
            object value = property.GetValue(tmp, null);
            if (value == null)
            {
                throw new FileFormatException("Null value encountered in settings file");
            }
            else
            {
                property.SetValue(this, value, null);
            }
        }
    }
    catch (Exception ex)
    {
        _logger.DebugException("Failed to load spatial cal value from FW", ex);
        Console.WriteLine(ex.Message);
    }
}
/// <summary>
/// Loads the settings from file.
/// </summary>
public void Load()
{
    Type settingsType = this.GetType();
    XmlSerializer xser = new XmlSerializer(settingsType);
    object tmp = Activator.CreateInstance(settingsType);
    using (StreamReader reader = new StreamReader(_filename)) { tmp = xser.Deserialize(reader); }
    // Use reflection to copy all public properties from the temporary object into this one.
    PropertyInfo[] properties = settingsType.GetProperties();
    foreach (PropertyInfo property in properties)
    {
        object value = property.GetValue(tmp, null);
        if (value == null)
        {
            throw new FileFormatException("Null value encountered in settings file");
        }
        else
        {
            property.SetValue(this, value, null);
        }
    }
}
Note that I have also tried a "convert byte array to object" function that I found on Stack Overflow. When I used this function, an exception was still thrown at .Deserialize(memStream).
// Convert a byte array to an Object
private Object ByteArrayToObject(byte[] arrBytes)
{
    MemoryStream memStream = new MemoryStream();
    BinaryFormatter binForm = new BinaryFormatter();
    memStream.Write(arrBytes, 0, arrBytes.Length);
    memStream.Seek(0, SeekOrigin.Begin);
    Object obj = (Object)binForm.Deserialize(memStream);
    return obj;
}
Apparently I left out some important information.
Serialization happens in a different application from deserialization. Serialization uses a BitConverter to take the data, convert it to a byte array, and upload it to flash. Let me explain. The data that is being serialized/deserialized and stored in flash is calibration data. Calibration is performed at the factory by production using Application1, which uses a BitConverter to put every field into a stream and then serializes the stream.
CFlatInterface.FloatToStream(bData, ref i, rtsMappingData.ScaleTrackingDMD);
CFlatInterface.FloatToStream(bData, ref i, rtsMappingData.RotationAngle);
CFlatInterface.FloatToStream(bData, ref i, rtsMappingData.CenterOfRotation.dx);
where the function FloatToStream is defined as:
public static void FloatToStream(byte[] buf, ref int index, float val)
{
    Buffer.BlockCopy(BitConverter.GetBytes(val), 0, buf, index, sizeof(float));
    index += sizeof(float);
}
So every field that makes up the Calibration is put into the stream this way. The data is put into the stream and a byte array is constructed and sent to flash.
On the other side, once the product is out of the factory and in use, Application2 (the user application) has a Calibration object that holds all the calibration fields. It reads the flash and gets the data that was written by Application1. Application2 is trying to deserialize the calibration data using BinaryFormatter and the code above. I am coming to the conclusion that this is not possible (thanks Rotem). The correct course of action is to use the same formatter for both serialization and deserialization; I will implement it this way and report back whether that makes a difference.
Following your update, the obvious issue is that you are serializing and deserializing with different formatters.
BinaryFormatter serializes more than just field data. It also serializes type information and metadata so it knows how to deserialize the objects, so it is expecting more than just raw data bytes as input.
Either use a BinaryFormatter on the serializing end as well, or use a manual deserialization technique on the receiving end.
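As a rough sketch of the manual route, under the assumption that the flash image is nothing more than consecutive little-endian floats written by FloatToStream in a fixed order, the receiving side could read the fields back with BitConverter instead of BinaryFormatter (the field names are taken from the snippets above; the enclosing type and helper name are hypothetical):
public static float FloatFromStream(byte[] buf, ref int index)
{
    // Mirror of FloatToStream: read one 4-byte float and advance the index.
    float val = BitConverter.ToSingle(buf, index);
    index += sizeof(float);
    return val;
}

public void Load(byte[] spCalfromPrimary)
{
    int i = 0;
    // Fields must be read back in exactly the order Application1 wrote them.
    ScaleTrackingDMD = FloatFromStream(spCalfromPrimary, ref i);
    RotationAngle = FloatFromStream(spCalfromPrimary, ref i);
    // ... and so on for the remaining calibration fields
}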
I'm creating a program which has to send data between a client and server efficiently. To organize packets clearly, I'm using serialization. However, when I serialize these packets the data is unnecessarily large. I'll explain what I'm doing so that you can understand what I need.
My packet classes work like this. I have a Packet object:
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

[Serializable]
public class Packet
{
    public static byte[] Serialize(Object o)
    {
        MemoryStream ms = new MemoryStream();
        BinaryFormatter bf = new BinaryFormatter();
        bf.Serialize(ms, o);
        return ms.ToArray();
    }

    public static Object Deserialize(byte[] bt)
    {
        MemoryStream ms = new MemoryStream();
        BinaryFormatter bf = new BinaryFormatter();
        ms.Write(bt, 0, bt.Length);
        ms.Position = 0;
        object obj = bf.Deserialize(ms);
        ms.Close();
        return obj;
    }
}
I can then create other classes that inherit from the Packet class, here's an example:
using System;

[Serializable]
public class PacketUserInfo : Packet
{
    public string Name;
    public int Age;
}
Then, it's very simple to put this into a byte array and send it (Of course the above packet is merely an example). However, the size of the resulting array is at least 10 times larger than it would be if I was to use a BinaryWriter and manually write the information.
Why is the serialized data so large? Is there any way to decrease it while still keeping everything organized with packets as their own classes?
Note: I'm only intending to serialize simple properties like this, nothing fancy.
Where you say "Why is the serialized data [...] larger than it would be if I was to use a BinaryWriter and manually write the information", by "information" you mean the property values. The serializer you are using, however, serializes not only the data but also information about the class, so it knows how to reconstruct the objects. You can see this by viewing the serialized data in a text editor.
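To make the contrast concrete, here is a rough sketch of the manual BinaryWriter route the question alludes to, using the PacketUserInfo fields from above (the helper name is hypothetical; no type metadata is written, so both ends must agree on the field order):
public static byte[] SerializeUserInfo(PacketUserInfo p)
{
    using (MemoryStream ms = new MemoryStream())
    using (BinaryWriter writer = new BinaryWriter(ms))
    {
        writer.Write(p.Name); // length-prefixed string
        writer.Write(p.Age);  // 4 bytes
        writer.Flush();
        return ms.ToArray();
    }
}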
Is there any way to decrease it while still keeping everything organized with packets as their own classes?
Use a more specialized serialization format, like protobuf, or the library suggested by @Piotr.
Also I think your serialization code should not reside in the Packet base class, but rather in a separate class, like PacketEncoder.
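As a hedged illustration of both suggestions, here is a sketch assuming the protobuf-net library (the attributes and Serializer calls are protobuf-net's; the field numbers and the PacketEncoder name are just for illustration):
using System.IO;
using ProtoBuf;

[ProtoContract]
public class PacketUserInfo
{
    [ProtoMember(1)] public string Name;
    [ProtoMember(2)] public int Age;
}

public static class PacketEncoder
{
    public static byte[] Serialize<T>(T packet)
    {
        using (MemoryStream ms = new MemoryStream())
        {
            Serializer.Serialize(ms, packet);
            return ms.ToArray();
        }
    }

    public static T Deserialize<T>(byte[] data)
    {
        using (MemoryStream ms = new MemoryStream(data))
        {
            return Serializer.Deserialize<T>(ms);
        }
    }
}
The wire format here is just tagged field values, so the output is typically far smaller than what BinaryFormatter produces for the same packet.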
So I'm trying to find a generic extension method that creates a deep copy of an object using reflection and that would work in Silverlight. Deep copy via serialization is not so great in Silverlight, since it runs in partial trust and BinaryFormatter does not exist there. I also know that reflection would be faster than serialization for cloning.
It would be nice to have a method that copies public, private and protected fields, is recursive so that it can copy objects within objects, and can also handle collections, arrays, etc.
I have searched online, and can only find shallow copy implementations using reflection. I don't understand why, since you can just use MemberwiseClone, so to me, those implementations are useless.
Thank You.
For data contract objects we have used the following helper method for deep cloning within Silverlight:
public static T Clone<T>(T source)
{
    DataContractSerializer serializer = new DataContractSerializer(typeof(T));
    using (MemoryStream ms = new MemoryStream())
    {
        serializer.WriteObject(ms, source);
        ms.Seek(0, SeekOrigin.Begin);
        return (T)serializer.ReadObject(ms);
    }
}
Used like this:
var clone = CloneHelper.Clone<MyDTOType>(dtoVar);
Required Namespaces:
using System.Reflection;
using System.Collections.Generic;
using System.Threading; // for Monitor.Enter / Monitor.Exit
Method:
private readonly static object _lock = new object();

public static T cloneObject<T>(T original, List<string> propertyExcludeList)
{
    try
    {
        Monitor.Enter(_lock);
        T copy = Activator.CreateInstance<T>();
        PropertyInfo[] piList = typeof(T).GetProperties(BindingFlags.Public | BindingFlags.NonPublic | BindingFlags.Instance);
        foreach (PropertyInfo pi in piList)
        {
            if (!propertyExcludeList.Contains(pi.Name))
            {
                if (pi.GetValue(copy, null) != pi.GetValue(original, null))
                {
                    pi.SetValue(copy, pi.GetValue(original, null), null);
                }
            }
        }
        return copy;
    }
    finally
    {
        Monitor.Exit(_lock);
    }
}
This is not specific to Silverlight in any way - it is just plain Reflection.
As written it will only work with objects that have a parameterless constructor. To use objects that require constructor parameters, you will need to pass in an object[] with the parameters, and use a different overload of the Activator.CreateInstance method e.g.
T copy = (T)Activator.CreateInstance(typeof(T), initializationParameters);
The propertyExcludeList parameter is a list of property names that you wish to exclude from the copy, if you want to copy all properties just pass an empty list e.g.
new List<string>()
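A hypothetical usage sketch (Person is an illustrative type with a parameterless constructor and public read/write properties; the call assumes the method is accessible in scope):
Person original = new Person { Name = "Ann", Age = 30 };
Person fullCopy = cloneObject(original, new List<string>());            // copy every property
Person partialCopy = cloneObject(original, new List<string> { "Age" }); // copy everything except Age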
Can't you just use regular .NET serialization? Serialize your object to a MemoryStream and then deserialize it back. This will create a deep copy (ultimately using reflection) and will require hardly any code on your part:
T DeepCopy<T>(T instance)
{
    BinaryFormatter formatter = new BinaryFormatter();
    using (var stream = new MemoryStream())
    {
        formatter.Serialize(stream, instance);
        stream.Position = 0;
        return (T)formatter.Deserialize(stream);
    }
}
I need a semi-shallow copy of an object. Under my original design I used MemberwiseClone to catch all the simple stuff and then I specifically copied the classes to the extent that they needed to be copied. (Some of them are inherently static and most of the rest are containers holding static items.) I didn't like the long list of copies but there's no way around that.
Now, however, I find myself needing to create a descendent object--do I now have to go back and copy all those fields that previously I was copying with MemberwiseClone?
Or am I missing some better workaround for this?
The easiest way to clone, I find, is to use serialization. This obviously only works with classes that are [Serializable] or that implement ISerializable.
Here is a general generic extension you can use to make any serializable class' objects cloneable:
public static T Clone<T>(this T source)
{
    // An unconstrained type parameter cannot be compared with ==, so use object.Equals.
    if (object.Equals(source, default(T)))
    {
        return default(T);
    }
    else
    {
        IFormatter formatter = new BinaryFormatter();
        using (Stream ms = new MemoryStream())
        {
            formatter.Serialize(ms, source);
            ms.Seek(0, SeekOrigin.Begin);
            return (T)formatter.Deserialize(ms);
        }
    }
}
I've got a third-party component that does PDF file manipulation. Whenever I need to perform operations I retrieve the PDF documents from a document store (database, SharePoint, filesystem, etc.). To make things a little consistent I pass the PDF documents around as a byte[].
This 3rd party component expects a MemoryStream[] (MemoryStream array) as a parameter to one of the main methods I need to use.
I am trying to wrap this functionality in my own component so that I can use this functionality for a number of areas within my application. I have come up with essentially the following:
public class PdfDocumentManipulator : IDisposable
{
    List<MemoryStream> pdfDocumentStreams = new List<MemoryStream>();

    public void AddFileToManipulate(byte[] pdfDocument)
    {
        using (MemoryStream stream = new MemoryStream(pdfDocument))
        {
            pdfDocumentStreams.Add(stream);
        }
    }

    public byte[] ManipulatePdfDocuments()
    {
        byte[] outputBytes = null;
        using (MemoryStream outputStream = new MemoryStream())
        {
            ThirdPartyComponent component = new ThirdPartyComponent();
            component.Manipulate(this.pdfDocumentStreams.ToArray(), outputStream);
            // move to beginning
            outputStream.Seek(0, SeekOrigin.Begin);
            // convert the memory stream to a byte array
            outputBytes = outputStream.ToArray();
        }
        return outputBytes;
    }

    #region IDisposable Members
    public void Dispose()
    {
        for (int i = this.pdfDocumentStreams.Count - 1; i >= 0; i--)
        {
            MemoryStream stream = this.pdfDocumentStreams[i];
            this.pdfDocumentStreams.RemoveAt(i);
            stream.Dispose();
        }
    }
    #endregion
}
The calling code to my "wrapper" looks like this:
byte[] manipulatedResult = null;
using (PdfDocumentManipulator manipulator = new PdfDocumentManipulator())
{
    manipulator.AddFileToManipulate(file1bytes);
    manipulator.AddFileToManipulate(file2bytes);
    manipulatedResult = manipulator.ManipulatePdfDocuments();
}
A few questions about the above:
Is the using clause in the AddFileToManipulate() method redundant and unnecessary?
Am I cleaning up things OK in my object's Dispose() method?
Is this an "acceptable" usage of MemoryStream? I am not anticipating very many files in memory at once...Likely 1-10 total PDF pages, each page about 200KB. App designed to run on server supporting an ASP.NET site.
Any comments/suggestions?
Thanks for the code review :)
AddFileToManipulate scares me.
public void AddFileToManipulate(byte[] pdfDocument)
{
    using (MemoryStream stream = new MemoryStream(pdfDocument))
    {
        pdfDocumentStreams.Add(stream);
    }
}
This code is adding a disposed stream to your pdfDocumentStream list. Instead you should simply add the stream using:
pdfDocumentStreams.Add(new MemoryStream(pdfDocument));
And dispose of it in the Dispose method.
Also you should look at implementing a finalizer to ensure stuff gets disposed in case someone forgets to dispose the top level object.
Is the using clause in the AddFileToManipulate() method redundant and unnecessary?
Worse, it's destructive. You're basically closing your memory stream as soon as it's added, before it's ever used. See the other answers for details, but basically: dispose at the end, and not at any other time. Every using block causes a Dispose to happen when the block ends, even if the object has been "passed off" to other objects via methods.
Am I cleaning up things OK in my object's Dispose() method?
Yes, but you're making life more difficult than it needs to be. Try this:
foreach (var stream in this.pdfDocumentStreams)
{
    stream.Dispose();
}
this.pdfDocumentStreams.Clear();
This works just as well, and is much simpler. Disposing an object does not delete it; it just tells it to free its internal, unmanaged resources. Calling Dispose on an object in this way is fine; the object stays uncollected, in the collection. You can do this and then clear the list in one shot.
Is this an "acceptable" usage of MemoryStream? I am not anticipating very many files in memory at once...Likely 1-10 total PDF pages, each page about 200KB. App designed to run on server supporting an ASP.NET site.
This depends on your situation. Only you can determine whether the overhead of having these files in memory is going to cause you problems. This is going to be a fairly heavy-weight object, though, so I'd use it carefully.
Any comments/suggestions?
Implement a finalizer. It's a good idea whenever you implement IDisposable. Also, you should rework your Dispose implementation to the standard one, or mark your class as sealed. For details on how this should be done, see this article. In particular, you should have a method declared as protected virtual void Dispose(bool disposing) that your Dispose method and your finalizer both call.
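A rough sketch of that standard pattern, applied to the class in the question (the pdfDocumentStreams field comes from the question; the rest follows the usual Dispose(bool) template):
public class PdfDocumentManipulator : IDisposable
{
    private readonly List<MemoryStream> pdfDocumentStreams = new List<MemoryStream>();
    private bool disposed;

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);
    }

    protected virtual void Dispose(bool disposing)
    {
        if (disposed)
        {
            return;
        }
        if (disposing)
        {
            // Managed resources (the streams) are only released on an explicit Dispose call.
            foreach (MemoryStream stream in pdfDocumentStreams)
            {
                stream.Dispose();
            }
            pdfDocumentStreams.Clear();
        }
        // Unmanaged resources, if there were any, would be released here as well.
        disposed = true;
    }

    ~PdfDocumentManipulator()
    {
        Dispose(false);
    }
}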
It looks to me like you misunderstand what Using does.
It's just syntactic sugar to replace
MemoryStream ms = new MemoryStream();
try
{
    // ... use ms ...
}
finally
{
    if (ms != null)
    {
        ms.Dispose();
    }
}
Your usage in AddFileToManipulate is redundant. I'd set up the list of MemoryStreams in the constructor of PdfDocumentManipulator, then have PdfDocumentManipulator's Dispose method call Dispose on all the MemoryStreams.
Side note. This really seems like it calls for an extension method.
public static void DisposeAll<T>(this IEnumerable<T> enumerable)
    where T : IDisposable
{
    foreach (var cur in enumerable)
    {
        cur.Dispose();
    }
}
Now your Dispose method becomes
public void Dispose()
{
    // List<T>.Reverse() returns void, so go through the LINQ Reverse() to get a reversed sequence.
    pdfDocumentStreams.AsEnumerable().Reverse().DisposeAll();
    pdfDocumentStreams.Clear();
}
EDIT
You don't need the .NET 3.5 framework in order to have extension methods. They will happily work with the 3.0 compiler down-targeted to 2.0:
http://blogs.msdn.com/jaredpar/archive/2007/11/16/extension-methods-without-3-5-framework.aspx