I have an object that can be serialized and deserialized, but upon deserialization it throws an error:
Invalid field in source data: 0
I don't know why this is happening.
Code for deserialization and receiving:
public void listenUDP()
{
    EndPoint ep = (EndPoint)groupEP;
    //BinaryFormatter bf = new BinaryFormatter();
    recieving_socket.Bind(ep);
    while (true)
    {
        byte[] objData = new byte[65535];
        recieving_socket.ReceiveFrom(objData, ref ep);
        MemoryStream ms = new MemoryStream();
        ms.Write(objData, 0, objData.Length);
        ms.Seek(0, SeekOrigin.Begin);
        messageHandle(ProtoBuf.Serializer.Deserialize<SimplePacket>(ms));
        ms.Dispose();
    }
}
Code for serialization:
public void sendDataUDP(Vec2f[] data)
{
    SimplePacket packet = new SimplePacket(DateTime.UtcNow, data);
    //IFormatter formatter = new BinaryFormatter();
    MemoryStream stream = new MemoryStream();
    System.Diagnostics.Stopwatch st = System.Diagnostics.Stopwatch.StartNew();
    //formatter.Serialize(stream, data);
    ProtoBuf.Serializer.Serialize<SimplePacket>(stream, packet);
    //Console.WriteLine(st.ElapsedTicks);
    stream.Close();
    st.Restart();
    sending_socket.SendTo(stream.ToArray(), sending_end_point);
    //Console.WriteLine(st.ElapsedTicks);
    st.Stop();
}
The root object in a protobuf message, as defined by the Google specification, does not include any notion of the end of the message. This is intentional, so that concatenation is identical to merging two fragments. Consequently, the consuming code needs to restrict itself to a single message. This is the same across all protobuf implementations, and is not specific to protobuf-net.
What is happening is that your buffer is oversized, with garbage at the end. At the moment (because you are reading only one message) that garbage is most likely all zeros, and a zero is not a valid marker for a field. However, when re-using the buffer the garbage could be... anything.
In your case, probably the easiest way to do this is to use the SerializeWithLengthPrefix / DeserializeWithLengthPrefix methods, which handle all this for you by prepending the payload length at the start of the message, and only processing that much data.
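As a rough sketch only (it reuses the SimplePacket type and socket fields from your code, and assumes each datagram carries exactly one message), that could look like:
// sender: prefix the payload with its length, then send the datagram
using (var ms = new MemoryStream())
{
    ProtoBuf.Serializer.SerializeWithLengthPrefix(ms, packet, ProtoBuf.PrefixStyle.Base128);
    sending_socket.SendTo(ms.ToArray(), sending_end_point);
}

// receiver: only wrap the bytes actually received, and let the prefix
// tell the deserializer where the message ends
int received = recieving_socket.ReceiveFrom(objData, ref ep);
using (var ms = new MemoryStream(objData, 0, received))
{
    var packet = ProtoBuf.Serializer.DeserializeWithLengthPrefix<SimplePacket>(ms, ProtoBuf.PrefixStyle.Base128);
    messageHandle(packet);
}
Note that even without the prefix, wrapping only the received byte count (rather than the whole 65535-byte buffer) avoids feeding trailing garbage to the deserializer.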
As a final thought: it is not clear to me that your code guarantees that it has read an entire message; a single receive could (on TCP, at least) return part of a message, or two and a bit messages, etc.: TCP is stream-based, not message-based.
I have a protobuf object that I am sending from a C# application (using clrZmq) to a C++ service (using the zmq C++ bindings) on a local machine (for testing). I attempt to send my object from C# using the following
Taurus.Odds odds = Util.GetFakeOdds();
using (var context = ZmqContext.Create())
using (var socket = context.CreateSocket(SocketType.REQ))
{
    byte[] buffer = null;
    socket.Connect(TARGET); // TARGET = "tcp://127.0.0.1:6500"
    Taurus.FeedMux mux = new Taurus.FeedMux();
    mux.type = Taurus.FeedMux.Type.ODDS;
    mux.odds = odds;
    SendStatus status = socket.Send(mux.ToByteArray());
    if (status == SendStatus.Sent)
    {
        int i;
        byte[] arr = socket.Receive(buffer, SocketFlags.None, out i);
        Taurus.Bet bet = buffer.ToObject<Taurus.Bet>();
    }
    ...
}
Where I am serializing my Taurus.Odds object to byte[] via the extension method:
public static byte[] ToByteArray(this object o)
{
    if (o == null)
        return null;
    BinaryFormatter bf = new BinaryFormatter();
    using (MemoryStream ms = new MemoryStream())
    {
        bf.Serialize(ms, o);
        return ms.ToArray();
    }
}
I see in my C++ application that the code receives the message, but the C++ ZMQ classes fail to deserialize it correctly. I have some Java code that sends to the C++ code in the same way without issue. My question is: am I sending my object via ZMQ correctly in the above, and if not, what am I doing wrong?
Thanks for your time.
Here's your error:
I am serializing my Taurus.Odds object to byte[] via the extension method
...
BinaryFormatter bf = new BinaryFormatter();
...
You seem to be unaware of what BinaryFormatter is. It is in no way related to ProtoBuf. The docs say the following:
Serializes and deserializes an object, or an entire graph of connected objects, in binary format.
This binary format is a .NET-specific implementation detail. And it's very rigid at that, with poor versioning support. It was mainly used in the .NET remoting days, and it's generally considered a bad idea to use it today, as there are much better serializers around.
As you can see, there's no way your C++ app could be able to read that, as it's not in protobuf format.
So throw this method away and replace it with some proper protobuf serializing code, as explained in the protobuf-net docs. You'll need to add [ProtoContract] and [ProtoMember] attributes in your objects. Then you could write something like:
public static byte[] ToByteArray<T>(this T o)
{
    if (o == null)
        return null;
    using (MemoryStream ms = new MemoryStream())
    {
        ProtoBuf.Serializer.Serialize(ms, o);
        return ms.ToArray();
    }
}
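For illustration only, a hypothetical DTO decorated with those attributes might look like the following (the type name, members, and field numbers are made up for the example, not the actual Taurus types):
[ProtoContract]
public class Bet
{
    [ProtoMember(1)]
    public int Id { get; set; }

    [ProtoMember(2)]
    public string Selection { get; set; }
}
With the generic ToByteArray<T> above, the resulting bytes are in protobuf wire format, which the C++ side can parse with its generated classes.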
I'm trying to send data back and forth between only two computers using a Socket. The data is in the form of serialized Packet objects.
When testing the program on another computer on my local network, I'm getting random SerializationExceptions so that no data goes through.
The program consistently sends different data, so when it makes another pass at sending it again, it will sometimes go through and sometimes hit the same SerializationException again. If I catch the exception and leave it running, all data eventually makes it through, but it takes several tries.
The exception says: "The input stream is not a valid binary format. The starting contents (in bytes) are [byte data]"
Not sure exactly where my problem lies. The larger amounts of data I send (~100kb max) always go through. The smaller ones (50-70 bytes) have trouble. Here's everything to do with my serialization and reading/writing data.
Socket defined as such:
SocketMain = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
Send & Read methods. I'm aware this is probably a horrible way to do it and might end up being my issue. Suggestions?
public void SendPacket(Packet P)
{
    using (MemoryStream MS = new MemoryStream())
    {
        BinaryFormatter BF = new BinaryFormatter();
        BF.Serialize(MS, P);
        SocketMain.Send(MS.ToArray());
    }
}

public void ReadPacket()
{
    byte[] BufferArray = new byte[131072];
    int BytesReceived = SocketMain.Receive(BufferArray);
    byte[] ActualData = new byte[BytesReceived];
    Buffer.BlockCopy(BufferArray, 0, ActualData, 0, BytesReceived);
    using (MemoryStream MS = new MemoryStream(ActualData))
    {
        BinaryFormatter BF = new BinaryFormatter();
        HandlePacket((Packet)BF.Deserialize(MS));
    }
}
Example Packet object. This is one of my smaller ones. I think this might be the one that is causing the issue, but I don't know how I could tell.
[Serializable()]
public class Packet4BlockVerify : Packet, ISerializable
{
    public byte Index;
    public string MD5Hash;

    public Packet4BlockVerify(int Index, string MD5Hash) : base(4)
    {
        this.Index = (byte)Index;
        this.MD5Hash = MD5Hash;
    }

    protected Packet4BlockVerify(SerializationInfo info, StreamingContext context)
    {
        this.ID = info.GetByte("ID");
        this.Index = info.GetByte("Index");
        this.MD5Hash = info.GetString("MD5Hash");
    }

    public override void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("ID", this.ID);
        info.AddValue("Index", this.Index);
        info.AddValue("MD5Hash", this.MD5Hash);
    }
}
Does anyone see anything wrong?
You are not reading all the bytes you sent. Your receive call:
int BytesReceived = SocketMain.Receive(BufferArray);
returns however many bytes happen to be available. You will need to prepend the bytes you send with the size of the payload, read that first, then continue reading until you have all your bytes before trying to deserialize.
TCP sends a continuous byte stream, so your receive call reads arbitrarily sized chunks. One of the Receive overloads lets you specify the number of bytes you want to receive, so after reading the length prefix you can keep receiving until you have the number of bytes you are expecting. e.g.
// Warning untested! (but you get the idea)
// when sending
var payload = MS.ToArray();
var payloadSize = payload.Length;
mySocket.Send(BitConverter.GetBytes(payloadSize));
mySocket.Send(payload);

// when receiving
mySocket.Receive(myBuffer, sizeof(int), SocketFlags.None);
var payloadSize = BitConverter.ToInt32(myBuffer, 0);
mySocket.Receive(myBuffer, payloadSize, SocketFlags.None);
// now myBuffer from index 0 to payloadSize contains the payload you sent
// (strictly, Receive may still return fewer bytes than requested, so loop
// until the full count has arrived - see the sketch below)
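A minimal sketch of that read-until-complete loop (the helper name is mine, not part of the original answer) could be:
// Read exactly 'count' bytes from the socket into 'buffer', looping because
// Socket.Receive may return fewer bytes than requested.
static void ReceiveExactly(Socket socket, byte[] buffer, int count)
{
    int offset = 0;
    while (offset < count)
    {
        int read = socket.Receive(buffer, offset, count - offset, SocketFlags.None);
        if (read == 0)
            throw new EndOfStreamException("Socket closed before the full payload arrived.");
        offset += read;
    }
}

// usage:
// ReceiveExactly(mySocket, myBuffer, sizeof(int));
// int payloadSize = BitConverter.ToInt32(myBuffer, 0);
// ReceiveExactly(mySocket, myBuffer, payloadSize);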
I am trying to learn UDP sockets, etc. I created two programs, a server and a client. The client sends a packet to the server, and the server bounces it back.
This is the code I use in both programs for converting the data to and from a byte[], but I am getting an error when converting from byte[]:
public static Packet Open(byte[] b)
{
    MemoryStream memStream = new MemoryStream();
    BinaryFormatter binForm = new BinaryFormatter();
    memStream.Write(b, 0, b.Length);
    memStream.Seek(0, SeekOrigin.Begin);
    object obj = new object();
    try
    {
        // this line here is where the error is occurring
        obj = (object)binForm.Deserialize(memStream);
    }
    catch (Exception er)
    {
        MessageBox.Show(er.Message);
    }
    if (obj is Packet)
        return (Packet)obj;
    else
        return null;
}

public byte[] Bundle()
{
    BinaryFormatter bf = new BinaryFormatter();
    MemoryStream ms = new MemoryStream();
    bf.Serialize(ms, this);
    return ms.ToArray();
}
If I do this, all from one program, it works:
Packet p =new Packet();
p.Message="hello";
byte[] data = p.Bundle();
Packet p2 = Packet.Open(data);
MessageBox.Show(p2.Message);
The error I am receiving is "unable to find assembly", followed by the name of my client program.
Any ideas?
It sounds to me like you are serializing a type that is not shared via a reference between both ends. Note: it is not sufficient to have the same class compiled into both, since BinaryFormatter includes the full type name, including the assembly; so it will still count as an unrelated type. The common fix there (and I use the word "fix" very loosely) is to write an assembly for the DTO and reference that assembly from both client and server. This approach still has many issues, though.
For info, there are other serializers that are compatible with just having a similar class at each end. I'm biased, but I would suggest having a look at protobuf-net; the output is usually significantly smaller, and it isn't tied to the type, meaning the class just has to be broadly similar at each end (it is very version tolerant). Plus it is faster (CPU-wise) too!
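As an illustration of that approach (a sketch only; the [ProtoMember] number and the Message property are assumptions based on the question, not a definitive design), the Packet class could be marked up and round-tripped with protobuf-net like this:
[ProtoContract]
public class Packet
{
    [ProtoMember(1)]
    public string Message { get; set; }

    public byte[] Bundle()
    {
        using (MemoryStream ms = new MemoryStream())
        {
            ProtoBuf.Serializer.Serialize(ms, this);
            return ms.ToArray();
        }
    }

    public static Packet Open(byte[] b)
    {
        using (MemoryStream ms = new MemoryStream(b))
        {
            return ProtoBuf.Serializer.Deserialize<Packet>(ms);
        }
    }
}
Because the payload no longer embeds a .NET assembly name, a broadly similar class on the other side can read it without sharing an assembly.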
I am trying to send large objects (>30MB) to an MSMQ queue. Due to the large amount of data we are trying to send, the idea was to GZip the objects prior to sending them, then unzip them on the receiving end.
However, writing the compressed stream to the message.BodyStream property seems to work, but not reading it out from there.
I don't know what's wrong.
Message l_QueueMessage = new Message();
l_QueueMessage.Priority = priority;
using (MessageQueue l_Queue = CreateQueue())
{
    GZipStream stream = new GZipStream(l_QueueMessage.BodyStream, CompressionMode.Compress);
    Formatter.Serialize(stream, message);
    l_Queue.Send(l_QueueMessage);
}
The Formatter is a global property of type BinaryFormatter. This is used to serialize/deserialize to the type of object we want to send/receive, e.g. "ProductItem".
The receiving end looks like this:
GZipStream stream = new GZipStream(l_Message.BodyStream, CompressionMode.Decompress);
object decompressedObject = Formatter.Deserialize(stream);
ProductItem l_Item = decompressedObject as ProductItem;
m_ProductReceived(sender, new MessageReceivedEventArgs<ProductItem>(l_Item));
l_ProductQueue.BeginReceive();
I get an EndOfStreamException ("Unable to read beyond the end of the stream.") when trying to deserialize:
at System.IO.BinaryReader.ReadByte()
By using the message.BodyStream property I actually circumvent the message.Formatter, which I don't initialize to anything, because I'm using my own serialization/deserialization mechanism with the GZipStream. However, I am not sure if that's the correct way of doing this.
What am I missing?
Thanks!
In your original code, the problem is that you need to close the GZipStream in order for the GZip footer to be written correctly; only then can you send it. If you don't, you end up sending bytes that cannot be deserialized. That's also why your new code, where sending is done later, works.
OK, I made this work. The key was to convert the decompressed stream on the receiver to a byte[] array. Then the deserialization started working.
The sender code (notice the stream is closed before sending the message):
using (MessageQueue l_Queue = CreateQueue())
{
    using (GZipStream stream = new GZipStream(l_QueueMessage.BodyStream, CompressionMode.Compress, true))
    {
        Formatter.Serialize(stream, message);
    }
    l_Queue.Send(l_QueueMessage);
}
The receiving end (notice how I convert the stream to a byte[] then deserialize):
using (GZipStream stream = new GZipStream(l_QueueMessage.BodyStream, CompressionMode.Decompress))
{
    byte[] bytes = ReadFully(stream);
    using (MemoryStream ms = new MemoryStream(bytes))
    {
        decompressedObject = Formatter.Deserialize(ms);
    }
}
Still, I don't know why this works using the ReadFully() function and not Stream.CopyTo().
Does anyone?
Btw, ReadFully() is a function that creates a byte[] out of a Stream. I have to credit Jon Skeet for this at http://www.yoda.arachsys.com/csharp/readbinary.html. Thanks!
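For completeness, a minimal sketch of such a ReadFully helper (not necessarily identical to the linked version) is:
public static byte[] ReadFully(Stream input)
{
    byte[] buffer = new byte[16 * 1024];
    using (MemoryStream ms = new MemoryStream())
    {
        int read;
        // copy the source stream in chunks until it reports end-of-stream
        while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            ms.Write(buffer, 0, read);
        }
        return ms.ToArray();
    }
}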
Try to separate compressing and sending:
byte[] binaryBuffer = null;
using (MemoryStream compressedBody = new MemoryStream())
{
    // leaveOpen: true so the MemoryStream is still usable after the GZipStream is closed
    using (GZipStream stream = new GZipStream(compressedBody, CompressionMode.Compress, true))
    {
        // serialize through the GZipStream so the data is actually compressed;
        // disposing the GZipStream flushes the GZip footer
        Formatter.Serialize(stream, message);
    }
    // ToArray returns only the bytes written (GetBuffer would include unused capacity)
    binaryBuffer = compressedBody.ToArray();
}

using (MessageQueue l_Queue = CreateQueue())
{
    l_QueueMessage.BodyStream.Write(binaryBuffer, 0, binaryBuffer.Length);
    l_QueueMessage.BodyStream.Seek(0, SeekOrigin.Begin);
    l_Queue.Send(l_QueueMessage);
}
I am currently working on code that serializes in one app (C++) and needs to deserialize it in another (C#). I am trying to use google proto + protobuf-net but something is failing.
Both the .cc and the .cs message definition files were generated with their respective compilers, from the same .proto file.
The data is being sent via UDP, and the messages (~40B) easily fit into a single datagram.
On the C++ side, boost::asio is being used to transmit the data, the relevant code being:
ProtocolBufferdata data;
...
boost::asio::streambuf b;
std::ostream os(&b);
data.SerializeToOstream(&os);
m_Socket.send_to(b.data(), m_Endpoint);
I am fairly sure this is working correctly, since using Wireshark I can at least see all the strings I expect in the datagram. On the C# side, using Begin/EndReceive, we have the following in the callback:
byte[] buffer ....

public void ReceiveData(IAsyncResult iar)
{
    try
    {
        Socket remote = (Socket)iar.AsyncState;
        int recv = remote.EndReceive(iar);
        using (MemoryStream memStream = new MemoryStream())
        {
            memStream.Write(buffer, 0, recv);
            ProtoData data = ProtoBuf.Serializer.Deserialize<ProtoData>(memStream);
            onReceive(data);
        }
    }
    catch (Exception ex)
    {
        ...
    }
    finally
    {
        socket.BeginReceive(buffer, 0, buffer.Length, SocketFlags.None, new AsyncCallback(ReceiveData), socket);
    }
}
The buffer does have the expected number of bytes in it, and has the tell-tale strings. The protobuf-net container has all default values.
I am a bit puzzled about what is going on here, and it is almost impossible to attach a debugger to the client application since it is deployed as a plug-in to another application that does not play well with a remote debugger. I would appreciate any advice, this has me stumped.
Rewind your stream - it is at the end, so Read will return no data:
memStream.Write(buffer, 0, recv);
memStream.Position = 0; // <===========here
ProtoData data = ProtoBuf.Serializer.Deserialize<ProtoData>(memStream);
Alternatively, use the overloaded constructor to prepare the stream:
using (MemoryStream memStream = new MemoryStream(buffer, 0, recv))
{
    ProtoData data = ProtoBuf.Serializer.Deserialize<ProtoData>(memStream);
    onReceive(data);
}