Having trouble deserializing an object when it is sent as a packet - c#

I am trying to learn UDP sockets. I created two programs, a server and a client: the client sends a packet to the server, and the server bounces it back.
This is the code I use in both programs for converting the data to and from a byte[], but I am getting an error when converting from byte[]:
public static Packet Open(byte[] b)
{
    MemoryStream memStream = new MemoryStream();
    BinaryFormatter binForm = new BinaryFormatter();
    memStream.Write(b, 0, b.Length);
    memStream.Seek(0, SeekOrigin.Begin);
    object obj = new object();
    try
    {
        // this line here is where the error is occurring
        obj = (object)binForm.Deserialize(memStream);
    }
    catch (Exception er)
    {
        MessageBox.Show(er.Message);
    }
    if (obj is Packet)
        return (Packet)obj;
    else
        return null;
}
public byte[] Bundle()
{
    BinaryFormatter bf = new BinaryFormatter();
    MemoryStream ms = new MemoryStream();
    bf.Serialize(ms, this);
    return ms.ToArray();
}
If I do all of this from one program, it works:
Packet p = new Packet();
p.Message = "hello";
byte[] data = p.Bundle();
Packet p2 = Packet.Open(data);
MessageBox.Show(p2.Message);
The error I am receiving is "unable to find assembly", followed by the name of my client program.
Any ideas?

It sounds to me like you are serializing a type that is not shared via a reference between both ends. Note that it is not sufficient to have the same class compiled into both programs: BinaryFormatter includes the full type name, including the assembly, so the two copies still count as unrelated types. The common fix there (and I use the word "fix" loosely) is to move the DTO into its own assembly and reference that assembly from both the client and the server. Even then, this approach still has many issues.
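For example, a minimal sketch of that layout (the PacketLib project name is hypothetical): the DTO lives in one class library that both executables reference, so BinaryFormatter resolves the identical assembly-qualified type on each end.
// PacketLib/Packet.cs -- hypothetical shared class library,
// referenced by BOTH the client and the server projects
using System;

namespace PacketLib
{
    [Serializable]
    public class Packet
    {
        public string Message;
    }
}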
For info, there are other serializers that are compatible with just having a similar class at each end. I'm biased, but I would suggest having a look at protobuf-net; the output is usually significantly smaller, and it isn't tied to the type, meaning the class just has to be broadly similar at each end (it is very version tolerant). Plus it is faster (CPU-wise) too!
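As a rough sketch of that alternative (assuming protobuf-net; the member numbers are arbitrary), the Packet from the question could look like this, and each end then only needs a broadly similar class rather than a reference to the identical assembly:
using System.IO;
using ProtoBuf;

[ProtoContract]
public class Packet
{
    [ProtoMember(1)]
    public string Message { get; set; }

    public byte[] Bundle()
    {
        using (var ms = new MemoryStream())
        {
            Serializer.Serialize(ms, this);
            return ms.ToArray();
        }
    }

    public static Packet Open(byte[] b)
    {
        using (var ms = new MemoryStream(b))
        {
            return Serializer.Deserialize<Packet>(ms);
        }
    }
}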

Related

Serializing Protobuf Object and Sending with ØMQ/ZMQ

I have a protobuf object that I am sending from a C# application (using clrZmq) to a C++ service (using the zmq C++ bindings) on a local machine (for testing). I attempt to send my object from C# using the following:
Taurus.Odds odds = Util.GetFakeOdds();
using (var context = ZmqContext.Create())
using (var socket = context.CreateSocket(SocketType.REQ))
{
    byte[] buffer = null;
    socket.Connect(TARGET); // TARGET = "tcp://127.0.0.1:6500"
    Taurus.FeedMux mux = new Taurus.FeedMux();
    mux.type = Taurus.FeedMux.Type.ODDS;
    mux.odds = odds;
    SendStatus status = socket.Send(mux.ToByteArray());
    if (status == SendStatus.Sent)
    {
        int i;
        byte[] arr = socket.Receive(buffer, SocketFlags.None, out i);
        Taurus.Bet bet = buffer.ToObject<Taurus.Bet>();
    }
    ...
}
where I am serializing my Taurus.Odds object to byte[] via the extension method:
public static byte[] ToByteArray(this object o)
{
    if (o == null)
        return null;
    BinaryFormatter bf = new BinaryFormatter();
    using (MemoryStream ms = new MemoryStream())
    {
        bf.Serialize(ms, o);
        return ms.ToArray();
    }
}
I can see in my C++ application that the message is received, but the C++ ZMQ classes fail to deserialize it correctly. I have some Java code that sends to the C++ code in the same way without issue. My question is: am I sending my object via ZMQ correctly in the above, and if not, what am I doing wrong?
Thanks for your time.
Here's your error:
I am serializing my Taurus.Odds object to byte[] via the extension method
...
BinaryFormatter bf = new BinaryFormatter();
...
You seem to be unaware of what BinaryFormatter is. It is in no way related to ProtoBuf. The docs say the following:
Serializes and deserializes an object, or an entire graph of connected objects, in binary format.
This binary format is a .NET-specific implementation detail, and a very rigid one at that, with poor versioning support. It was mainly used in the .NET Remoting days, and it's generally considered a bad idea to use it today, as there are much better serializers around.
As you can see, there's no way your C++ app could be able to read that, as it's not in protobuf format.
So throw this method away and replace it with proper protobuf serializing code, as explained in the protobuf-net docs. You'll need to add [ProtoContract] and [ProtoMember] attributes to your objects. Then you could write something like:
public static byte[] ToByteArray<T>(this T o)
{
    if (o == null)
        return null;
    using (MemoryStream ms = new MemoryStream())
    {
        ProtoBuf.Serializer.Serialize(ms, o);
        return ms.ToArray();
    }
}
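The receive side needs the same treatment; here is a minimal sketch of a matching protobuf-based ToObject<T> extension (assuming the reply coming back over the socket is also protobuf-encoded):
public static T ToObject<T>(this byte[] buffer)
{
    if (buffer == null)
        return default(T);
    using (MemoryStream ms = new MemoryStream(buffer))
    {
        return ProtoBuf.Serializer.Deserialize<T>(ms);
    }
}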

Random serialization exceptions using Socket

I'm trying to send data back and forth between only two computers using a Socket. The data is in the form of serialized Packet objects.
When testing the program on another computer on my local network, I'm getting random SerializationExceptions so that no data goes through.
The program consistently sends different data, so when it makes another pass at sending it again, it will sometimes go through and sometimes hit the same SerializationException again. If I catch the exception and leave it running, all data eventually makes it through, but it takes several tries.
The exception says: "The input stream is not a valid binary format. The starting contents (in bytes) are [byte data]"
Not sure exactly where my problem lies. The larger amounts of data I send (~100kb max) always go through. The smaller ones (50-70 bytes) have trouble. Here's everything to do with my serialization and reading/writing data.
Socket defined as such:
SocketMain = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
Send and read methods. I'm aware this is probably a horrible way to do it and might end up being my issue. Suggestions?
public void SendPacket(Packet P)
{
    using (MemoryStream MS = new MemoryStream())
    {
        BinaryFormatter BF = new BinaryFormatter();
        BF.Serialize(MS, P);
        SocketMain.Send(MS.ToArray());
    }
}

public void ReadPacket()
{
    byte[] BufferArray = new byte[131072];
    int BytesReceived = SocketMain.Receive(BufferArray);
    byte[] ActualData = new byte[BytesReceived];
    Buffer.BlockCopy(BufferArray, 0, ActualData, 0, BytesReceived);
    using (MemoryStream MS = new MemoryStream(ActualData))
    {
        BinaryFormatter BF = new BinaryFormatter();
        HandlePacket((Packet)BF.Deserialize(MS));
    }
}
Example Packet object. This is one of my smaller ones. I think this might be the one that is causing the issue, but I don't know how I could tell.
[Serializable()]
public class Packet4BlockVerify : Packet, ISerializable
{
    public byte Index;
    public string MD5Hash;

    public Packet4BlockVerify(int Index, string MD5Hash) : base(4)
    {
        this.Index = (byte)Index;
        this.MD5Hash = MD5Hash;
    }

    protected Packet4BlockVerify(SerializationInfo info, StreamingContext context)
    {
        this.ID = info.GetByte("ID");
        this.Index = info.GetByte("Index");
        this.MD5Hash = info.GetString("MD5Hash");
    }

    public override void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("ID", this.ID);
        info.AddValue("Index", this.Index);
        info.AddValue("MD5Hash", this.MD5Hash);
    }
}
Does anyone see anything wrong?
You are not reading all the bytes you sent. Your receive call:
int BytesReceived = SocketMain.Receive(BufferArray);
returns however many bytes happen to be available. You will need to prepend the bytes you send with the size of the remaining bytes, read that size first, then keep reading until you have all of your bytes before trying to deserialize.
TCP sends a continuous byte stream, so your receive call reads arbitrarily sized chunks. One of the Receive overloads lets you specify the number of bytes you want, so once you have read the size you can request exactly that many bytes. For example:
// Warning untested! (but you get the idea)
// when sending
var payload = MS.ToArray();
mySocket.Send(BitConverter.GetBytes(payload.Length)); // 4-byte length prefix
mySocket.Send(payload);

// when receiving: loop, because a single Receive may return fewer bytes than requested
static void ReceiveExactly(Socket socket, byte[] buffer, int count)
{
    int read = 0;
    while (read < count)
        read += socket.Receive(buffer, read, count - read, SocketFlags.None);
}

ReceiveExactly(mySocket, myBuffer, sizeof(int));
int payloadSize = BitConverter.ToInt32(myBuffer, 0);
ReceiveExactly(mySocket, myBuffer, payloadSize);
// now myBuffer[0 .. payloadSize) contains the payload you sent

Why does my protobuf-net stream not work?

I have an object that can be serialized and deserialized, but upon deserialization it throws an error:
Invalid field in source data: 0
I don't know why this is happening.
Code for deserialization and receiving:
public void listenUDP()
{
    EndPoint ep = (EndPoint)groupEP;
    //BinaryFormatter bf = new BinaryFormatter();
    recieving_socket.Bind(ep);
    while (true)
    {
        byte[] objData = new byte[65535];
        recieving_socket.ReceiveFrom(objData, ref ep);
        MemoryStream ms = new MemoryStream();
        ms.Write(objData, 0, objData.Length);
        ms.Seek(0, SeekOrigin.Begin);
        messageHandle(ProtoBuf.Serializer.Deserialize<SimplePacket>(ms));
        ms.Dispose();
    }
}
Code for serialization:
public void sendDataUDP(Vec2f[] data)
{
    SimplePacket packet = new SimplePacket(DateTime.UtcNow, data);
    //IFormatter formatter = new BinaryFormatter();
    MemoryStream stream = new MemoryStream();
    System.Diagnostics.Stopwatch st = System.Diagnostics.Stopwatch.StartNew();
    //formatter.Serialize(stream, data);
    ProtoBuf.Serializer.Serialize<SimplePacket>(stream, packet);
    //Console.WriteLine(st.ElapsedTicks);
    stream.Close();
    st.Restart();
    sending_socket.SendTo(stream.ToArray(), sending_end_point);
    //Console.WriteLine(st.ElapsedTicks);
    st.Stop();
}
The root object in a protobuf message, as defined by the Google specification, does not include any notion of the end of the message. This is intentional, so that concatenation is identical to merging two fragments. Consequently, the consuming code needs to restrict itself to a single message. This is identical between all protobuf implementations, and is not specific to protobuf-net.
What is happening is that your buffer is oversized, with garbage at the end. Right now (because you are reading one message into a fresh buffer) that garbage is most likely all zeros, and a zero is not a valid marker for a field. However, when re-using the buffer the garbage could be... anything.
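For UDP specifically, one minimal fix (a sketch reusing the question's variables) is to bound the stream by the byte count that ReceiveFrom returns, rather than wrapping the entire 64 KB buffer:
int bytesRead = recieving_socket.ReceiveFrom(objData, ref ep);
// wrap only the bytes that actually arrived, not the zero-padded tail
using (var ms = new MemoryStream(objData, 0, bytesRead))
{
    messageHandle(ProtoBuf.Serializer.Deserialize<SimplePacket>(ms));
}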
In your case, probably the easiest way to do this is to use the SerializeWithLengthPrefix / DeserializeWithLengthPrefix methods, which handle all this for you by prepending the payload length at the start of the message, and only processing that much data.
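A sketch of that approach, plugged into the question's send and receive paths:
// sending: the payload length is written before the payload itself
using (var stream = new MemoryStream())
{
    ProtoBuf.Serializer.SerializeWithLengthPrefix(stream, packet, ProtoBuf.PrefixStyle.Base128);
    sending_socket.SendTo(stream.ToArray(), sending_end_point);
}

// receiving: deserialization stops at the encoded length,
// so any trailing garbage in the buffer is never consumed
using (var ms = new MemoryStream(objData))
{
    SimplePacket packet = ProtoBuf.Serializer.DeserializeWithLengthPrefix<SimplePacket>(
        ms, ProtoBuf.PrefixStyle.Base128);
    messageHandle(packet);
}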
As a final thought: it is not clear to me that your code guarantees that it has read an entire message; a single receive could (on TCP, at least) return part of a message, or two-and-a-bit messages, etc.: TCP is stream-based, not message-based.

c# - deserializing large object

I have a chat application with a file-sharing system that I built by slightly modifying MonoTorrent.
When a user shares a file, the client serializes the MonoTorrent.Common.Torrent object (which represents a .torrent file) and sends it to the server inside another object, and the server deserializes it. This works only when the shared file is small (roughly less than 1 MB). When it is larger, the server throws the following exception:
Binary stream '0' does not contain a valid BinaryHeader. Possible causes are invalid stream or object version change between serialization and deserialization.
This is my deserialization code:
public CommendData ByteArrayToCommendData(byte[] arrBytes)
{
using (MemoryStream memStream = new MemoryStream(arrBytes))
{
BinaryFormatter binForm = new BinaryFormatter();
memStream.Seek(0, SeekOrigin.Begin);
CommendData obj = (CommendData)binForm.Deserialize(memStream);
return obj;
}
}
(CommendData contains the MonoTorrent.Common.Torrent object in this instance.)

Deserializing from SQL Server Compact throws a "There is an unclosed literal string." error

I am trying to store some objects in a SQL Server Compact (SQL CE) database by serializing them with a SOAP formatter. Serializing seems to work just fine, but when I try to deserialize the object I get an error saying:
There is an unclosed literal string. Line 53, position 72.
Furthermore, after restarting the application, attempting to fill the dataset produces the following error:
Failed to enable constraints. One or more rows contain values violating non-null, unique, or foreign-key constraints.
All my columns (except the ID) allow for null values and are non-unique, so I have no idea where this comes from. Here is the code of my serializer:
public static class Serializer
{
    static public string Serialize(AssessmentReport theObject)
    {
        MemoryStream mStream = new MemoryStream();
        SoapFormatter formatter = new SoapFormatter();
        formatter.Serialize(mStream, theObject);
        byte[] buffer = mStream.ToArray();
        mStream.Close();
        string value = Encoding.UTF8.GetString(buffer);
        return value;
    }

    static public AssessmentReport Deserialize(string value)
    {
        byte[] buffer = Encoding.UTF8.GetBytes(value);
        MemoryStream mStream = new MemoryStream(buffer);
        SoapFormatter formatter = new SoapFormatter();
        mStream.Position = 0;
        AssessmentReport theReport = (AssessmentReport)formatter.Deserialize(mStream);
        mStream.Close();
        return theReport;
    }
}
Here is how I call the serializer (theReport is an instance of the object to be serialized):
examTableAdapter.UpdateAsmFile(Serializer.Serialize(theReport), examID);
And here is how I am calling the deserializing method:
string value = Convert.ToString(examTableAdapter.GetAsmFile(2));
AssessmentReport theReport = Serializer.Deserialize(value);
The field in the SQL Server Compact database where the string is saved is of type nvarchar with a limit of 3500.
I tried using a binary formatter, but when serializing it always seems to return an empty byte[] buffer. I really need deep serialization, which is why the XML serializer is out of the question.
OK, so this is one I had been trying to figure out for more than two months.
While I could not find a logical explanation for the problem, changing the encoding to UTF-7 fixed it. I can't think of a reason why this would be the issue, and it seems to be specific to my machine (I finally had the opportunity to run the code on another computer, and it worked perfectly with UTF-8).
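For reference, the change described amounts to swapping the encoding in both methods of the Serializer class above:
// in Serialize: was Encoding.UTF8.GetString(buffer)
string value = Encoding.UTF7.GetString(buffer);

// in Deserialize: was Encoding.UTF8.GetBytes(value)
byte[] buffer = Encoding.UTF7.GetBytes(value);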
