I am attempting to send an IPEndPoint through protobuf-net, and what I observed is that when deserializing the 4-byte array into the IPv4 address, the setter receives a value of 8 bytes: four bytes containing the original address, and four more bytes containing the address that was serialized. By stepping through the code I have been able to confirm that when Deserialize is called, it first reads the bytes and then sets the bytes.
After doing some reading I learned about OverwriteList, and as can be seen in the example below, I have set it to true. However, the setter is still handed an 8-byte value.
Does anyone have a clue what I am doing wrong?
This sample code should throw an exception when run with protobuf-net r480, built in Visual Studio 2010 as a .NET 4.0 console application.
using ProtoBuf;
using System.Net;
using System.IO;
namespace ConsoleApplication1
{
[ProtoContract]
class AddressOWner
{
private IPEndPoint endpoint;
public AddressOWner()
{ endpoint = new IPEndPoint(new IPAddress(new byte[] {8,8,8,8}), 0); }
public AddressOWner(IPEndPoint newendpoint)
{ this.endpoint = newendpoint; }
[ProtoMember(1, OverwriteList=true)]
public byte[] AddressBytes
{
get { return endpoint.Address.GetAddressBytes(); }
set { endpoint.Address = new IPAddress(value); }
}
}
class Program
{
static void Main(string[] args)
{
AddressOWner ao = new AddressOWner(new IPEndPoint(new IPAddress(new byte[] { 192, 168, 1, 1 }), 80));
MemoryStream ms = new MemoryStream();
Serializer.Serialize(ms, ao);
byte[] messageData = ms.GetBuffer();
ms = new MemoryStream(messageData);
AddressOWner aoCopy = Serializer.Deserialize<AddressOWner>(ms);
}
}
}
It looks like this is actually a bug, specific to byte[], which is handled as a distinct protobuf primitive (bytes). Other arrays/lists are mapped to repeated (in protobuf terms) and handle the OverwriteList option correctly. I will tweak the byte[] handling to support this option.
Edit: this is fixed in r484, with supporting integration test
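For anyone verifying the fix, a minimal round-trip check along these lines should pass on r484 or later (a sketch that assumes it sits alongside the AddressOWner class above):
using System.Diagnostics;
using System.IO;
using System.Net;
using ProtoBuf;
static class OverwriteListCheck
{
    public static void Verify()
    {
        var original = new AddressOWner(new IPEndPoint(new IPAddress(new byte[] { 192, 168, 1, 1 }), 80));
        using (var ms = new MemoryStream())
        {
            Serializer.Serialize(ms, original);
            ms.Position = 0;
            AddressOWner copy = Serializer.Deserialize<AddressOWner>(ms);
            // With OverwriteList=true and the byte[] fix, the setter receives
            // exactly 4 bytes, so the address round-trips unchanged.
            Debug.Assert(copy.AddressBytes.Length == 4);
            Debug.Assert(new IPAddress(copy.AddressBytes).ToString() == "192.168.1.1");
        }
    }
}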
I'm trying to send a packet to a Tattile traffic camera. The Tattile camera uses its own TCP packet protocol called TOP (Tattile Object Protocol). So far, from what I can see from reading the documentation, I need IP > TCP > TOP HEADER > VCR PAYLOAD.
To create the TOP Header, here are the requirements.
24 bytes are required, I believe. Here is the command header. From the image above, is the Header Dimension part asking for the TOP Header that requires 24 bytes? Here is the Header Constructor, which I don't understand why it is there if there is already a Command Header with the same information, from what I can see. Here is an example of building a message. For the command code, at this stage, until I get a better understanding, all I want to do is send data, not receive. With that being said, here is the Start Engine command code.
Here is what I have code-wise. So far it connects and "sends the message", however the engine doesn't start. As for the enum, in the future when I get a better understanding, I should be able to add more of the commands with their command codes.
using System.Linq;
using System.Net.Sockets;
class Command
{
public enum Codes
{
START_ENGINE
}
private static readonly byte[] HeaderDimension = new byte[24];
private static byte[] CommandCode;
private static readonly byte[] Sender = new byte[4] { 0xFF, 0xFF, 0xFF, 0xFF };
private static readonly byte[] Receiver = Sender;
private static readonly byte[] Error = new byte[] { 0 };
private static readonly byte[] DataDimension = new byte[] {0};
public static void Execute(Codes code)
{
if (code == Codes.START_ENGINE)
{
CommandCode = new byte[4]{ 0x35, 0x0, 0x0, 0x4};
}
using (TcpClient tcpClient = new TcpClient("192.168.1.21", 31000))
{
NetworkStream networkStream = tcpClient.GetStream();
byte[] bytesTosend = HeaderDimension.Concat(CommandCode)
.Concat(Sender)
.Concat(Receiver)
.Concat(Error)
.Concat(DataDimension).ToArray();
networkStream.Write(bytesTosend, 0, bytesTosend.Length);
}
}
}
Here is how I'm calling it
static void Main()
{
Command.Execute(Command.Codes.START_ENGINE);
Console.ReadKey();
}
The header has a total of 24 bytes, containing 6 x 4-byte values. The first 4 bytes contain the length, which is 24 (0x18). In C# these are Int32 values; however, keep the byte order in mind. Network protocols usually use network byte order (big-endian), which is likely different from the little-endian order C# will give you on most machines. Use the System.BitConverter class to test and convert if needed.
Your HeaderDimension should be a 4-byte array that contains the value 24, not a 24-byte array.
Also the Error and DataDimension should always be 4 bytes in length.
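A sketch of assembling that header as six big-endian Int32 fields (the field order and the 0xFF sender/receiver bytes follow the question's code; whether the command code itself needs the same byte swap is an assumption to verify against the Tattile documentation):
using System;
using System.Linq;
static class TopHeader
{
    // Int32 -> 4 bytes in network (big-endian) byte order.
    static byte[] ToBigEndian(int value)
    {
        byte[] bytes = BitConverter.GetBytes(value);
        if (BitConverter.IsLittleEndian)
            Array.Reverse(bytes);
        return bytes;
    }
    // Builds the 24-byte TOP header: six 4-byte fields, the first being the
    // header length itself (24 = 0x18).
    public static byte[] Build(int commandCode)
    {
        return ToBigEndian(24)                             // header dimension
            .Concat(ToBigEndian(commandCode))              // command code
            .Concat(new byte[] { 0xFF, 0xFF, 0xFF, 0xFF }) // sender
            .Concat(new byte[] { 0xFF, 0xFF, 0xFF, 0xFF }) // receiver
            .Concat(ToBigEndian(0))                        // error
            .Concat(ToBigEndian(0))                        // data dimension (no payload)
            .ToArray();
    }
}
The resulting array would take the place of bytesTosend in the question's Execute method.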
I'm new to C# and sockets so I apologize if my questions are out of line. I started building a socket interface using the example in this link:
https://code.msdn.microsoft.com/High-Performance-NET-69c2df2f
I want to be able to transfer binary files across the socket so I made an assumption (maybe the wrong one) that I should not use StringBuilder. I changed the OSUserToken from the original to use a MemoryStream and BinaryWriter (commenting out the original code).
Elsewhere in the code (from the link above), SocketAsyncEventArgs is initialized with SetBuffer(new Byte[_bufferSize], 0, _bufferSize);. I'm concerned this will not mesh well with my MemoryStream and BinaryWriter, but it seems to work.
using System;
using System.Diagnostics;
using System.IO;
using System.Net.Sockets;
sealed class UserToken : IDisposable
{
private Socket _ownerSocket;
public Socket ownerSocket { get { return _ownerSocket; } }
private MemoryStream _memoryStream;
private BinaryWriter _binaryWriter;
//private StringBuilder stringbuilder;
private int totalByteCount;
public String LastError;
public UserToken(Socket readSocket, int bufferSize)
{
_ownerSocket = readSocket;
_memoryStream = new MemoryStream();
_binaryWriter = new BinaryWriter(_memoryStream);
//stringbuilder = new StringBuilder(bufferSize);
}
// Do something with the received data, then reset the token for use by another connection.
// This is called when all of the data have been received for a read socket.
public void ProcessData(SocketAsyncEventArgs args)
{
String received = System.Text.Encoding.ASCII.GetString(_memoryStream.ToArray());
//String received = stringbuilder.ToString();
Debug.Write("Received: \"" + received + "\". The server has read " + received.Length + " bytes.");
_memoryStream.SetLength(0);
//stringbuilder.Length = 0;
totalByteCount = 0;
}
public bool ReadSocketData(SocketAsyncEventArgs readSocket)
{
int byteCount = readSocket.BytesTransferred;
/*
if ((totalByteCount + byteCount) > stringbuilder.Capacity)
{
LastError = "Receive Buffer cannot hold the entire message for this connection.";
return false;
}
else
{
*/
//stringbuilder.Append(Encoding.ASCII.GetString(readSocket.Buffer, readSocket.Offset, byteCount));
_binaryWriter.Write(readSocket.Buffer,readSocket.Offset,byteCount);
totalByteCount += byteCount;
return true;
/*}*/
}
public void Dispose()
{
_memoryStream.Dispose();
_binaryWriter.Dispose();
try
{
_ownerSocket.Shutdown(SocketShutdown.Both);
}
catch
{
//Nothing to do here, connection is closed already
}
finally
{
_ownerSocket.Close();
}
}
}
When I run this, it seems to work without an issue. Even if I set the protected const int DEFAULT_BUFFER_SIZE = 1 it will accept a stream of >1 bytes:
17:11:20:433 - Debug - Initializing the listener on port 5000...
17:11:20:439 - Debug - Starting the listener...
17:11:20:444 - Debug - Server started.
17:11:31:856 - Debug - Received: "listener". The server has read 8 bytes.
17:11:33:264 - Debug - Received: "l". The server has read 1 bytes.
17:11:33:268 - Debug - Received: "istener". The server has read 7 bytes.
17:11:36:744 - Debug - Received: "l". The server has read 1 bytes.
17:11:36:744 - Debug - Received: "i". The server has read 1 bytes.
17:11:36:746 - Debug - Received: "stener". The server has read 6 bytes.
My questions are these:
Am I right that StringBuilder wouldn't work for binary files and I should use MemoryStream and BinaryWriter?
Do I need to be concerned with a buffer overflow if elsewhere in the program, the SocketAsyncEventArgs is initialized with SetBuffer(new Byte[_bufferSize], 0, _bufferSize);?
If I have to obey the buffer size limitation, do I need to put the same buffer restriction on my client sending data?
I found answers to my questions
StringBuilder works fine. Just encode the strings in base64 before sending and decode after receiving. This should be done regardless of whether you are sending text or binary data. See the class I wrote below.
Still don't know the answer to this one, but seeing as StringBuilder and base64 work with binary data, this question is no longer relevant.
I think the answer to this question is yes. The client should have a maximum message length. I'm controlling it through the header portion of my protocol, where I define the length of the message. The header is fixed-length and my maximum message length is 0xFFFFF.
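As a rough sketch of that fixed-length-header idea (the exact header layout isn't shown in the post, so the 5-hex-digit ASCII length field here is my assumption):
using System;
using System.Text;
static class LengthHeader
{
    const int MaxMessageLength = 0xFFFFF; // cap mentioned above
    const int HeaderDigits = 5;           // 0xFFFFF fits in 5 hex digits (assumed format)
    // Prefix a (base64) payload with a fixed-length ASCII hex length field.
    public static byte[] Build(string payload)
    {
        if (payload.Length > MaxMessageLength)
            throw new ArgumentException("Message exceeds the maximum length.");
        string header = payload.Length.ToString("X").PadLeft(HeaderDigits, '0');
        return Encoding.ASCII.GetBytes(header + payload);
    }
    // Read the length back out of the first HeaderDigits bytes.
    public static int ParseLength(byte[] message)
    {
        string header = Encoding.ASCII.GetString(message, 0, HeaderDigits);
        return Convert.ToInt32(header, 16);
    }
}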
Class for encoding/decoding base64:
public static class Base64
{
public static string EncodeBase64(string text)
{
return System.Convert.ToBase64String(System.Text.Encoding.UTF8.GetBytes(text));
}
public static string EncodeBase64(byte[] array)
{
return System.Convert.ToBase64String(array);
}
public static string DecodeBase64ToString(string base64String)
{
return System.Text.Encoding.UTF8.GetString(System.Convert.FromBase64String(base64String));
}
public static Byte[] DecodeBase64ToBinary(string base64String)
{
Byte[] bytes = System.Convert.FromBase64String(base64String);
return bytes;
}
}
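For what it's worth, a usage sketch of the class above (the data values are just illustrative):
static void Base64Demo()
{
    // Encode arbitrary binary data before appending it to a StringBuilder/sending it,
    // then decode it back on the receiving side.
    byte[] original = { 0x00, 0xFF, 0x10, 0x20 };
    string wireText = Base64.EncodeBase64(original);
    byte[] roundTripped = Base64.DecodeBase64ToBinary(wireText);
    // The string overloads work the same way for text payloads.
    string encoded = Base64.EncodeBase64("hello");
    string decoded = Base64.DecodeBase64ToString(encoded); // "hello"
}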
I'm trying to send data back and forth between only two computers using a Socket. The data is in the form of serialized Packet objects.
When testing the program on another computer on my local network, I'm getting random SerializationExceptions, so no data goes through.
The program continually sends different data, so when it makes another pass at sending it, it will sometimes go through and sometimes hit the same SerializationException again. If I catch the exception and leave it running, all data eventually makes it through, but it takes several tries.
The exception says: "The input stream is not a valid binary format. The starting contents (in bytes) are [byte data]"
Not sure exactly where my problem lies. The larger amounts of data I send (~100kb max) always go through. The smaller ones (50-70 bytes) have trouble. Here's everything to do with my serialization and reading/writing data.
Socket defined as such:
SocketMain = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
Send & Read methods. I'm aware this is probably a horrible way to do so and might end up being my issue. Suggestions?:
public void SendPacket(Packet P)
{
using (MemoryStream MS = new MemoryStream())
{
BinaryFormatter BF = new BinaryFormatter();
BF.Serialize(MS, P);
SocketMain.Send(MS.ToArray());
}
}
public void ReadPacket()
{
byte[] BufferArray = new byte[131072];
int BytesReceived = SocketMain.Receive(BufferArray);
byte[] ActualData = new byte[BytesReceived];
Buffer.BlockCopy(BufferArray, 0, ActualData, 0, BytesReceived);
using (MemoryStream MS = new MemoryStream(ActualData))
{
BinaryFormatter BF = new BinaryFormatter();
HandlePacket((Packet)BF.Deserialize(MS));
}
}
Example Packet object. This is one of my smaller ones. I think this might be the one that is causing the issue, but I don't know how I could tell.
[Serializable()]
public class Packet4BlockVerify : Packet, ISerializable
{
public byte Index;
public string MD5Hash;
public Packet4BlockVerify(int Index, string MD5Hash): base(4)
{
this.Index = (byte)Index;
this.MD5Hash = MD5Hash;
}
protected Packet4BlockVerify(SerializationInfo info, StreamingContext context)
{
this.ID = info.GetByte("ID");
this.Index = info.GetByte("Index");
this.MD5Hash = info.GetString("MD5Hash");
}
public override void GetObjectData(SerializationInfo info, StreamingContext context)
{
info.AddValue("ID", this.ID);
info.AddValue("Index", this.Index);
info.AddValue("MD5Hash", this.MD5Hash);
}
}
Does anyone see anything wrong?
You are not reading all the bytes you sent. Your receive call:
int BytesReceived = SocketMain.Receive(BufferArray);
returns any number of bytes. You will need to prepend the bytes you send with the size of the remaining bytes, read that size first, then continue reading until you have all your bytes before trying to deserialize.
TCP delivers a continuous byte stream, so your receive call reads arbitrarily sized chunks. With one of the Receive overloads you can specify the number of bytes you want, so once you know how many bytes to expect you can use that. e.g.
// Warning untested! (but you get the idea)
// when sending
var payload = MS.ToArray();
var payloadSize = payload.Length;
mySocket.Send(BitConverter.GetBytes(payloadSize));
mySocket.Send(payload);
// when receiving
mySocket.Receive(myBuffer, sizeof(int), SocketFlags.None);
var payloadSize = BitConverter.ToInt32(myBuffer, 0);
mySocket.Receive(myBuffer, payloadSize, SocketFlags.None);
// now myBuffer from index 0 to payloadSize contains the payload you sent
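Since Receive can legitimately return fewer bytes than requested, a safer variant loops until the expected count has arrived; a sketch (helper names are mine, not from the answer above):
using System;
using System.Net.Sockets;
static class SocketHelpers
{
    // Reads exactly 'count' bytes, looping because Socket.Receive may return fewer.
    static byte[] ReceiveExactly(Socket socket, int count)
    {
        byte[] buffer = new byte[count];
        int offset = 0;
        while (offset < count)
        {
            int read = socket.Receive(buffer, offset, count - offset, SocketFlags.None);
            if (read == 0)
                throw new SocketException((int)SocketError.ConnectionReset); // peer closed the connection
            offset += read;
        }
        return buffer;
    }
    // Receives one length-prefixed message: a 4-byte length, then that many payload bytes.
    public static byte[] ReceiveMessage(Socket socket)
    {
        int payloadSize = BitConverter.ToInt32(ReceiveExactly(socket, 4), 0);
        return ReceiveExactly(socket, payloadSize);
    }
}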
I have a project where I'm trying to send a serialized object to the server, then wait for an "OK" or "ERROR" message to come back.
I seem to be having a similar problem to the poster of: TcpClient send/close problem
The issue is that the only way I seem to be able to send the original object is to close the connection, but then (of course) I can't wait to see if the object was processed successfully by the server.
private void button4_Click(object sender, EventArgs e)
{
RequestPacket req = new RequestPacket();
/// ... Fill out request packet ...
/// Connect to the SERVER to send the message...
TcpClient Client = new TcpClient("localhost", 10287);
using (NetworkStream ns = Client.GetStream())
{
XmlSerializer xml = new XmlSerializer(typeof(RequestPacket));
xml.Serialize(ns, req);
/// NOTE: This doesn't seem to do anything....
/// The server doesn't get the object I just serialized.
/// However, if I use ns.Close() it does...
/// but then I can't get the response.
ns.Flush();
// Get the response. It should be "OK".
ResponsePacket resp;
XmlSerializer xml2 = new XmlSerializer(typeof(ResponsePacket));
resp = (ResponsePacket)xml2.Deserialize(ns);
/// ... EVALUATE RESPONSE ...
}
Client.Close();
}
UPDATE: In response to one commenter, I don't think the server can be at fault. It is simply waiting for the object, and the object never arrives until I close the socket... however, if I'm wrong, I'll GLADLY eat crow publicly. =) Here's the server:
static void Main(string[] args)
{
// Read the port from the command line, use 10287 for default
CMD cmd = new CMD(args);
int port = 10287;
if (cmd.ContainsKey("p")) port = Convert.ToInt32(cmd["p"]);
TcpListener l = new TcpListener(port);
l.Start();
while (true)
{
// Wait for a socket connection.
TcpClient c = l.AcceptTcpClient();
Thread T = new Thread(ProcessSocket);
T.Start(c);
}
}
static void ProcessSocket(object c)
{
TcpClient C = (TcpClient)c;
try
{
RequestPacket rp;
//// Handle the request here.
using (NetworkStream ns = C.GetStream())
{
XmlSerializer xml = new XmlSerializer(typeof(RequestPacket));
rp = (RequestPacket)xml.Deserialize(ns);
}
ProcessPacket(rp);
}
catch
{
// not much to do except ignore it and go on.
}
}
Yeah.... it's that simple.
Uh oh, you can blame Nagle's algorithm. It has nothing to do with C#, though; it is default behavior for the TCP/IP stack. Enable the NoDelay socket option using the SetSocketOption method. But be careful: disabling Nagle's algorithm will reduce throughput.
I'm also not sure about that stream you are using on top of the socket, as I am not a C# developer at all, but try disposing of it so the write definitely goes out :-)
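For reference, a quick sketch of what enabling NoDelay looks like on the TcpClient from the question (both lines set the same underlying option, so either one is enough):
TcpClient Client = new TcpClient("localhost", 10287);
// Disable Nagle's algorithm so small writes are pushed out immediately
// (at the cost of some throughput when sending many small packets).
Client.NoDelay = true;
// or, equivalently, through SetSocketOption:
Client.Client.SetSocketOption(SocketOptionLevel.Tcp, SocketOptionName.NoDelay, true);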
The short version is, apparently, that when using XmlSerializer (or any other big blob) to shove data down a NetworkStream, it will simply hold the line open indefinitely, waiting for more information to be written; it only completes once you close the connection. This creates a situation where the approach is great for sending but not receiving, or vice versa. It becomes one-way communication, and useless for continued back-and-forth over the same connection.
It's kind of crappy that I had to work around something that seemed so elegant on the surface, but dropping back to my old C days, I've resorted to sending a "number of bytes" packet first, then the actual packet. This lets the other end READ the exact number of bytes, so I never get caught in a blocking pattern.
To simplify my life, I created a class that holds some static methods for both sending and receiving. This class can send ANY XML-serializable class across the network, so it does what I need it to do.
If anyone has a more elegant solution, I'd be open to hearing it.
using System;
using System.IO;
using System.Net.Sockets;
using System.Text;
using System.Xml.Serialization;
public class PacketTransit
{
public static void SendPacket(TcpClient C, object Packet)
{
MemoryStream ms = new MemoryStream();
XmlSerializer xml = new XmlSerializer(Packet.GetType());
xml.Serialize(ms, Packet);
ms.Position = 0;
byte[] b = ms.ToArray(); // use ToArray rather than GetBuffer: GetBuffer returns the whole internal buffer, including unused trailing bytes
ms.Dispose();
byte [] sizePacket = BitConverter.GetBytes(b.Length);
// Send the 4-byte size packet first.
C.Client.Send(sizePacket, sizePacket.Length, SocketFlags.None);
C.Client.Send(b, b.Length, SocketFlags.None);
}
/// The string is the XML file that needs to be converted.
public static string ReceivePacket(TcpClient C, Type PacketType)
{
int size = 0;
byte[] sizePacket = BitConverter.GetBytes(size);
// Get the size packet
int sp = C.Client.Receive(sizePacket, sizePacket.Length, SocketFlags.None);
if (sp <= 0) return "";
size = BitConverter.ToInt32(sizePacket, 0);
// read until "size" is met
StringBuilder sb = new StringBuilder();
while (size > 0)
{
byte[] b = new byte[1024];
int x = size;
if (x > 1024) x = 1024;
int r = C.Client.Receive(b, x, SocketFlags.None);
size -= r;
sb.Append(UTF8Encoding.UTF8.GetString(b, 0, r)); // append only the bytes actually read this pass
}
return sb.ToString();
}
/// The XML data that needs to be converted back to the appropriate type.
public static object Decode(string PacketData, Type PacketType)
{
MemoryStream ms = new MemoryStream(UTF8Encoding.UTF8.GetBytes(PacketData));
XmlSerializer xml = new XmlSerializer(PacketType);
object obj = xml.Deserialize(ms);
ms.Dispose();
return obj;
}
public static RequestPacket GetRequestPacket(TcpClient C)
{
string str = ReceivePacket(C, typeof(RequestPacket));
if (str == "") return new RequestPacket();
RequestPacket req = (RequestPacket) Decode(str, typeof(RequestPacket));
return req;
}
public static ResponsePacket GetResponsePacket(TcpClient C)
{
string str = ReceivePacket(C, typeof(ResponsePacket));
if (str == "") return new ResponsePacket();
ResponsePacket res = (ResponsePacket)Decode(str, typeof(ResponsePacket));
return res;
}
}
To use this class, I simply need to call PacketTransit.SendPacket(myTcpClient, SomePacket) to send any given XML-Serializable object. I can then use PacketTransit.GetResponsePacket or PacketTransit.GetRequestPacket to receive it at the other end.
For me, this is working very well, but it was a lot more of a workout than originally expected.
You should use a StreamWriter/Reader linked to your network stream; .Flush does nothing on a NetworkStream. See here:
http://www.c-sharpcorner.com/UploadFile/dottys/SocketProgDTRP11222005023030AM/SocketProgDTRP.aspx
I believe the real problem here may be that the XML deserializer may not return until it has read EOS from the stream. You may need to shut down the sending side of the connection to force this to happen.
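If that is the cause, one way to produce that EOS without giving up the response is a half-close: shut down only the sending side of the socket after serializing the request. A sketch against the question's client code (untested, and it assumes the server writes its response before closing):
TcpClient Client = new TcpClient("localhost", 10287);
NetworkStream ns = Client.GetStream();
new XmlSerializer(typeof(RequestPacket)).Serialize(ns, req);
// Half-close: tells the peer no more data is coming, so its Deserialize
// call sees end-of-stream, while this side can still read the response.
Client.Client.Shutdown(SocketShutdown.Send);
ResponsePacket resp = (ResponsePacket)new XmlSerializer(typeof(ResponsePacket)).Deserialize(ns);
Client.Close();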
I have a client server application in which the server and the client need to send and receive objects of a custom class over the network. I am using the TcpClient class for transmitting the data. I am serializing the object on the sender side and sending the resulting stream of bytes to the receiver. But on the receiver side, when I try to deserialize the bytes received, it throws a SerializationException with the following details:
The input stream is not a valid binary format. The starting contents (in bytes) are: 0D-0A-00-01-00-00-00-FF-FF-FF-FF-01-00-00-00-00-00 ...
My server code that serializes the object is:
byte[] userDataBytes;
MemoryStream ms = new MemoryStream();
BinaryFormatter bf1 = new BinaryFormatter();
bf1.Serialize(ms, new DataMessage());
userDataBytes = ms.ToArray();
netStream.Write(userDataBytes, 0, userDataBytes.Length);
The client code that de-serializes it is:
readNetStream.Read(readMsgBytes, 0, (int)tcpServer.ReceiveBufferSize);
MemoryStream ms = new MemoryStream(readMsgBytes);
BinaryFormatter bf1 = new BinaryFormatter();
ms.Position = 0;
object rawObj = bf1.Deserialize(ms);
DataMessage msgObj = (DataMessage)rawObj;
Please help me solve this problem, and possibly suggest any other method to transmit objects of custom classes across the network using TcpClient in C#.
Thanks,
Rakesh.
Have a look at this code. It takes a slightly different approach.
Example given by the link above (note: there was another problem he was facing, which he solved here with keep-alive; it's in the link after the initial sample code):
Object class to send (remember the [Serializable]):
[Serializable]
public class Person {
private string fn;
private string ln;
private int age;
...
public string FirstName {
get {
return fn;
}
set {
fn=value;
}
}
...
...
public Person (string firstname, string lastname, int age) {
this.fn=firstname;
...
}
}
Class to send object:
using System;
using System.Net;
using System.Net.Sockets;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;
class DataSender
{
public static void Main()
{
Person p=new Person("Tyler","Durden",30); // create my serializable object
string serverIp="192.168.0.1";
TcpClient client = new TcpClient(serverIp, 9050); // have my connection established with a Tcp Server
IFormatter formatter = new BinaryFormatter(); // the formatter that will serialize my object on my stream
NetworkStream strm = client.GetStream(); // the stream
formatter.Serialize(strm, p); // the serialization process
strm.Close();
client.Close();
}
}
Class to receive object:
using System;
using System.Net;
using System.Net.Sockets;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;
class DataRcvr
{
public static void Main()
{
TcpListener server = new TcpListener(9050);
server.Start();
TcpClient client = server.AcceptTcpClient();
NetworkStream strm = client.GetStream();
IFormatter formatter = new BinaryFormatter();
Person p = (Person)formatter.Deserialize(strm); // you have to cast the deserialized object
Console.WriteLine("Hi, I'm "+p.FirstName+" "+p.LastName+" and I'm "+p.age+" years old!");
strm.Close();
client.Close();
server.Stop();
}
}
When receiving on the client side, you do not know how much data you want to read. You are relying only on ReceiveBufferSize, while your data can be larger or smaller than that.
I think the best approach here is to send 4 bytes that tell your client the length of the incoming data:
byte[] userDataLen = BitConverter.GetBytes((Int32)userDataBytes.Length);
netStream.Write(userDataLen, 0, 4);
netStream.Write(userDataBytes, 0, userDataBytes.Length);
and on the receiving end you first read the data length and then read exactly that amount of data:
byte[] readMsgLen = new byte[4];
readNetStream.Read(readMsgLen, 0, 4);
int dataLen = BitConverter.ToInt32(readMsgLen, 0);
byte[] readMsgData = new byte[dataLen];
readNetStream.Read(readMsgData, 0, dataLen);
In fact, I just realized that you might have to do a little more to make sure you read all the data (just an idea, because I haven't tried it, but in case you run into the problem again you can try this).
The NetworkStream.Read() method returns a number indicating the amount of data it has read. It is possible that the incoming data is larger than the receive buffer. In that case you have to loop until you have read all of the data, something like this:
void SafeRead(byte[] userData, int len)
{
    int dataRead = 0;
    do
    {
        // keep reading into the caller's buffer, continuing from where the previous read stopped
        dataRead += readNetStream.Read(userData, dataRead, len - dataRead);
    } while (dataRead < len);
}
TCP is a stream-based protocol (as opposed to a datagram protocol), so it's possible to receive only part of the sent data from a single Read call.
To solve this problem you may use a DataLength field (as cornerback84 suggested), or you may use your own "application-level packet" structure.
For example, you may use something like this
|-------|------------|-----------|-----|
| Begin | DataLength |    Data   | End |
|  4b   |     4b     | 1..MaxLen | 4b  |
|-------|------------|-----------|-----|
where
Begin - start packet identifier (for example 0x0A, 0x0B, 0x0C, 0x0D)
DataLength - Data field length (for example, from 0 to MaxLength)
Data - actual data (serialized Person class or some other data)
End - end packet identifier (for example 0x01, 0x05, 0x07, 0x0F).
That is, on the client side you would not simply wait for incoming data; after receiving data you would scan for complete application-level packets, and you would deserialize the Data part only after receiving a valid packet.
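A sketch of building and validating such a packet (the marker bytes are just the example values above, and BitConverter writes the machine's native byte order, so both ends must agree on it):
using System;
using System.IO;
static class FramedPacket
{
    static readonly byte[] Begin = { 0x0A, 0x0B, 0x0C, 0x0D };
    static readonly byte[] End   = { 0x01, 0x05, 0x07, 0x0F };
    // Wraps the payload as Begin | DataLength | Data | End.
    public static byte[] Wrap(byte[] data)
    {
        using (var ms = new MemoryStream())
        {
            ms.Write(Begin, 0, 4);
            ms.Write(BitConverter.GetBytes(data.Length), 0, 4); // DataLength
            ms.Write(data, 0, data.Length);                     // Data
            ms.Write(End, 0, 4);
            return ms.ToArray();
        }
    }
    // Returns the Data part if the buffer holds one well-formed packet, otherwise null.
    public static byte[] TryUnwrap(byte[] packet)
    {
        if (packet.Length < 12) return null;
        for (int i = 0; i < 4; i++)
            if (packet[i] != Begin[i]) return null;
        int dataLength = BitConverter.ToInt32(packet, 4);
        if (packet.Length != 12 + dataLength) return null;
        for (int i = 0; i < 4; i++)
            if (packet[8 + dataLength + i] != End[i]) return null;
        byte[] data = new byte[dataLength];
        Buffer.BlockCopy(packet, 8, data, 0, dataLength);
        return data;
    }
}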