I'm trying to send a packet to a Tattile traffic camera. The Tattile camera uses its own TCP packet protocol called TOP (Tattile Object Protocol). From what I can see in the documentation, I need to send IP > TCP > TOP header > VCR payload.
Here are the requirements for creating the TOP header.
I believe 24 bytes are required. Here is the command header; in the image above, is the Header Dimension part asking for the TOP header that requires 24 bytes? Here is the header constructor, which I don't understand the purpose of, since there is already a command header with the same information from what I can see. Here is an example of building a message. As for the command code: at this stage, until I get a better understanding, all I want to do is send data, not receive. With that said, here is the Start Engine command code.
Here is what I have code-wise so far. It connects and "sends the message", but the engine doesn't start. As for the enum: in the future, when I have a better understanding, I should be able to add more commands with their command codes.
class Command
{
    public enum Codes
    {
        START_ENGINE
    }

    private static readonly byte[] HeaderDimension = new byte[24];
    private static byte[] CommandCode;
    private static readonly byte[] Sender = new byte[4] { 0xFF, 0xFF, 0xFF, 0xFF };
    private static readonly byte[] Receiver = Sender;
    private static readonly byte[] Error = new byte[] { 0 };
    private static readonly byte[] DataDimension = new byte[] { 0 };

    public static void Execute(Codes code)
    {
        if (code == Codes.START_ENGINE)
        {
            CommandCode = new byte[4] { 0x35, 0x0, 0x0, 0x4 };
        }

        using (TcpClient tcpClient = new TcpClient("192.168.1.21", 31000))
        {
            NetworkStream networkStream = tcpClient.GetStream();
            byte[] bytesTosend = HeaderDimension.Concat(CommandCode)
                .Concat(Sender)
                .Concat(Receiver)
                .Concat(Error)
                .Concat(DataDimension).ToArray();
            networkStream.Write(bytesTosend, 0, bytesTosend.Length);
        }
    }
}
Here is how I'm calling it
static void Main()
{
    Command.Execute(Command.Codes.START_ENGINE);
    Console.ReadKey();
}
The header has a total of 24 bytes, containing six 4-byte values. The first 4 bytes contain the length, which is 24 (0x18). In C# these are Int32 values; however, keep in mind the byte order. Network protocols usually use network byte order (big-endian), which is likely different from your C# host order. Use the System.BitConverter class to test, and reverse the bytes if needed.
Your HeaderDimension should be a 4-byte array that contains the value 24, not a 24-byte array.
Also, the Error and DataDimension fields should each be 4 bytes long.
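Putting this answer together, a minimal sketch of a corrected header builder might look like the following. The field order (dimension, command code, sender, receiver, error, data dimension) is taken from the question; big-endian network order is an assumption here, so verify both against the Tattile documentation.

```csharp
using System;
using System.Linq;

static class TopHeader
{
    // Convert an Int32 to network (big-endian) byte order.
    static byte[] ToBigEndian(int value)
    {
        byte[] bytes = BitConverter.GetBytes(value);
        if (BitConverter.IsLittleEndian)
            Array.Reverse(bytes);
        return bytes;
    }

    // Build the 24-byte header: six 4-byte fields, the first being
    // the length of the header itself (24).
    public static byte[] Build(byte[] commandCode, int sender, int receiver)
    {
        return ToBigEndian(24)
            .Concat(commandCode)           // e.g. the Start Engine code from the question
            .Concat(ToBigEndian(sender))
            .Concat(ToBigEndian(receiver))
            .Concat(ToBigEndian(0))        // error
            .Concat(ToBigEndian(0))        // data dimension (no payload)
            .ToArray();
    }
}
```

The command code is passed through as raw bytes since the question already lists it byte-by-byte; whether it too needs byte-swapping is something the documentation would have to confirm.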
I'm new to C# and sockets, so I apologize if my questions are out of line. I started building a socket interface using the example in this link:
https://code.msdn.microsoft.com/High-Performance-NET-69c2df2f
I want to be able to transfer binary files across the socket, so I made an assumption (maybe the wrong one) that I should not use StringBuilder. I changed the OSUserToken from the original to use a MemoryStream and BinaryWriter (commenting out the original code).
Elsewhere in the code (from the link above), SocketAsyncEventArgs is initialized with SetBuffer(new Byte[_bufferSize], 0, _bufferSize);. I'm concerned this will not mesh well with my MemoryStream and BinaryWriter, but it seems to work.
sealed class UserToken : IDisposable
{
    private Socket _ownerSocket;
    public Socket ownerSocket { get { return _ownerSocket; } }
    private MemoryStream _memoryStream;
    private BinaryWriter _binaryWriter;
    //private StringBuilder stringbuilder;
    private int totalByteCount;
    public String LastError;

    public UserToken(Socket readSocket, int bufferSize)
    {
        _ownerSocket = readSocket;
        _memoryStream = new MemoryStream();
        _binaryWriter = new BinaryWriter(_memoryStream);
        //stringbuilder = new StringBuilder(bufferSize);
    }

    // Do something with the received data, then reset the token for use by another connection.
    // This is called when all of the data has been received for a read socket.
    public void ProcessData(SocketAsyncEventArgs args)
    {
        String received = System.Text.Encoding.ASCII.GetString(_memoryStream.ToArray());
        //String received = stringbuilder.ToString();
        Debug.Write("Received: \"" + received + "\". The server has read " + received.Length + " bytes.");
        _memoryStream.SetLength(0);
        //stringbuilder.Length = 0;
        totalByteCount = 0;
    }

    public bool ReadSocketData(SocketAsyncEventArgs readSocket)
    {
        int byteCount = readSocket.BytesTransferred;
        /*
        if ((totalByteCount + byteCount) > stringbuilder.Capacity)
        {
            LastError = "Receive Buffer cannot hold the entire message for this connection.";
            return false;
        }
        else
        {
        */
        //stringbuilder.Append(Encoding.ASCII.GetString(readSocket.Buffer, readSocket.Offset, byteCount));
        _binaryWriter.Write(readSocket.Buffer, readSocket.Offset, byteCount);
        totalByteCount += byteCount;
        return true;
        /*}*/
    }

    public void Dispose()
    {
        // dispose the writer first; it owns the underlying stream
        _binaryWriter.Dispose();
        _memoryStream.Dispose();
        try
        {
            _ownerSocket.Shutdown(SocketShutdown.Both);
        }
        catch
        {
            // Nothing to do here, connection is closed already
        }
        finally
        {
            _ownerSocket.Close();
        }
    }
}
When I run this, it seems to work without issue. Even if I set protected const int DEFAULT_BUFFER_SIZE = 1, it will accept a stream of more than 1 byte:
17:11:20:433 - Debug - Initializing the listener on port 5000...
17:11:20:439 - Debug - Starting the listener...
17:11:20:444 - Debug - Server started.
17:11:31:856 - Debug - Received: "listener". The server has read 8 bytes.
17:11:33:264 - Debug - Received: "l". The server has read 1 bytes.
17:11:33:268 - Debug - Received: "istener". The server has read 7 bytes.
17:11:36:744 - Debug - Received: "l". The server has read 1 bytes.
17:11:36:744 - Debug - Received: "i". The server has read 1 bytes.
17:11:36:746 - Debug - Received: "stener". The server has read 6 bytes.
My questions are these:
Am I right that StringBuilder wouldn't work for binary files and I should use MemoryStream and BinaryWriter?
Do I need to be concerned with a buffer overflow if elsewhere in the program, the SocketAsyncEventArgs is initialized with SetBuffer(new Byte[_bufferSize], 0, _bufferSize);?
If I have to obey the buffer size limitation, do I need to put the same buffer restriction on my client sending data?
I found answers to my questions:
StringBuilder works fine. Just encode the strings in base64 before sending and decode after receiving. This should be done whether you're sending text or binary data. See the class I wrote below.
Still don't know the answer to this question, but seeing as StringBuilder and base64 work with binary, this question is no longer relevant.
I think the answer to this question is yes. The client should have a max message length. I'm controlling it based on the header portion of my socket protocol, where I define the length of the message. The header is fixed-length and my max message length is 0xFFFFF.
Class for encoding/decoding base64:
public static class Base64
{
    public static string EncodeBase64(string text)
    {
        return System.Convert.ToBase64String(System.Text.Encoding.UTF8.GetBytes(text));
    }

    public static string EncodeBase64(byte[] array)
    {
        return System.Convert.ToBase64String(array);
    }

    public static string DecodeBase64ToString(string base64String)
    {
        return System.Text.Encoding.UTF8.GetString(System.Convert.FromBase64String(base64String));
    }

    public static Byte[] DecodeBase64ToBinary(string base64String)
    {
        return System.Convert.FromBase64String(base64String);
    }
}
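As a quick sanity check of this approach, arbitrary bytes survive a base64 round trip unchanged, which is why the text-oriented StringBuilder path can carry binary data. A minimal sketch using the framework's Convert methods directly:

```csharp
using System;

class Base64RoundTrip
{
    static void Main()
    {
        // bytes that would be mangled by a plain ASCII decode
        byte[] payload = { 0x00, 0xFF, 0x10, 0x7F };
        string encoded = Convert.ToBase64String(payload);
        byte[] decoded = Convert.FromBase64String(encoded);

        Console.WriteLine(encoded);                          // AP8Qfw==
        Console.WriteLine(decoded.Length == payload.Length); // True
    }
}
```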
I am relatively new to C#. In my TCP client, I have the following function, which sends data to the server and returns the response:
private static TcpClient tcpint = new TcpClient(); // Already initiated and set up
private static NetworkStream stm;                  // Already initiated and set up

private static String send(String data)
{
    // Send data to the server
    ASCIIEncoding asen = new ASCIIEncoding();
    byte[] ba = asen.GetBytes(data);
    stm.Write(ba, 0, ba.Length);

    // Read data from the server
    byte[] bb = new byte[100];
    int k = stm.Read(bb, 0, 100);

    // Construct the response from byte array to string
    StringBuilder sb = new StringBuilder();
    for (int i = 0; i < k; i++)
    {
        sb.Append(bb[i].ToString());
    }

    // Return server response
    return sb.ToString();
}
As you can see here, when I am reading the response from the server, I am reading it into a fix byte[] array of length 100 bytes.
byte[] bb = new byte[100];
int k = stm.Read(bb, 0, 100);
What do i do if the response from the server is more than 100 bytes? How can I read the data without me knowing what the max length of data form the server will be?
Typically, where there is not some specific intrinsic size for something, TCP protocols explicitly send the length of the objects they are sending. One possible method, for illustration:
size_t data_len = strlen(some_data_blob);
char lenstr[32];
sprintf(lenstr, "%zu\n", data_len);
send(socket, lenstr, strlen(lenstr), 0);
send(socket, some_data_blob, data_len, 0);
then when the receiver reads the length string, it knows exactly how much data should follow (good programming practice is to trust but verify, though -- if more or less data is really sent, say by an 'evil actor', you need to be prepared to handle that).
Not with respect to C#, but a general answer on writing TCP applications:
TCP is a stream-based protocol. It does not maintain message boundaries. So applications using TCP should take care to choose the right method of data exchange between server and client. This becomes even more important if multiple messages are sent and received on one connection.
One widely used method is to prepend the data message with the length bytes.
Ex:
[2 byte -length field][Actual Data].
The receiver of such data (be it server or client) needs to decode the length field and wait until that many bytes have been received, or raise an alarm on timeout and give up.
Another approach is to have the applications themselves maintain message boundaries.
Ex:
[START-of-MSG][Actual Data][END-of-MSG]
The receiver has to parse the data for the start byte and end byte (predefined by the application protocol) and treat anything in between as the data of interest.
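A minimal sketch of that second scheme, assuming single-byte STX/ETX markers (0x02/0x03); a real application protocol would define its own markers and would also need to escape any marker bytes appearing inside the data:

```csharp
using System;
using System.Collections.Generic;

static class DelimiterFraming
{
    const byte Stx = 0x02; // start-of-message marker (assumed)
    const byte Etx = 0x03; // end-of-message marker (assumed)

    // Feed received chunks in; complete messages come out. Bytes of a
    // message whose ETX has not arrived yet stay in 'partial'.
    public static List<byte[]> Extract(byte[] chunk, List<byte> partial)
    {
        var messages = new List<byte[]>();
        foreach (byte b in chunk)
        {
            if (b == Stx)
                partial.Clear();                 // a new message begins
            else if (b == Etx)
            {
                messages.Add(partial.ToArray()); // message complete
                partial.Clear();
            }
            else
                partial.Add(b);
        }
        return messages;
    }
}
```

Because `partial` persists between calls, a message split across two Read calls is reassembled correctly.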
Hello, I solved it with a list. I don't know the size of the complete package, but I can read it in parts:
List<byte> bigbuffer = new List<byte>();
byte[] tempbuffer = new byte[254];
// the buffer can be another size, like 1024, etc.,
// depending on the data you are sending from the client;
// I recommend a small size for a correct read of the package
NetworkStream stream = client.GetStream();
int bytesRead;
while ((bytesRead = stream.Read(tempbuffer, 0, tempbuffer.Length)) > 0)
{
    // only copy the bytes actually read, not the whole buffer
    bigbuffer.AddRange(tempbuffer.Take(bytesRead)); // Take requires using System.Linq
}
// now you can convert to a native byte array
byte[] completedbuffer = bigbuffer.ToArray();
// do something with the data
string decodedmsg = Encoding.ASCII.GetString(completedbuffer);
I do this with images and it looks good. I think this is the way to go when you don't know the size of the data and the purpose is to read a complete source of unknown size.
I was looking around for an answer to this and noticed that the Available property was added to TcpClient. It returns the number of bytes available to read.
I'm assuming it was added after most of the replies were written, so I wanted to share it for others who may stumble onto this question.
https://learn.microsoft.com/en-us/dotnet/api/system.net.sockets.tcpclient.available?view=netframework-4.8
I'm trying to send data back and forth between only two computers using a Socket. The data is in the form of serialized Packet objects.
When testing the program on another computer on my local network, I'm getting random SerializationExceptions so that no data goes through.
The program consistently sends different data, so when it makes another pass at sending it again, it will sometimes go through and sometimes hit the same SerializationException again. If I catch the exception and leave it running, all data eventually makes it through, but it takes several tries.
The exception says: "The input stream is not a valid binary format. The starting contents (in bytes) are [byte data]"
Not sure exactly where my problem lies. The larger amounts of data I send (~100kb max) always go through. The smaller ones (50-70 bytes) have trouble. Here's everything to do with my serialization and reading/writing data.
Socket defined as such:
SocketMain = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
Send and read methods. I'm aware this is probably a horrible way to do it and it might end up being my issue. Suggestions?
public void SendPacket(Packet P)
{
    using (MemoryStream MS = new MemoryStream())
    {
        BinaryFormatter BF = new BinaryFormatter();
        BF.Serialize(MS, P);
        SocketMain.Send(MS.ToArray());
    }
}

public void ReadPacket()
{
    byte[] BufferArray = new byte[131072];
    int BytesReceived = SocketMain.Receive(BufferArray);
    byte[] ActualData = new byte[BytesReceived];
    Buffer.BlockCopy(BufferArray, 0, ActualData, 0, BytesReceived);
    using (MemoryStream MS = new MemoryStream(ActualData))
    {
        BinaryFormatter BF = new BinaryFormatter();
        HandlePacket((Packet)BF.Deserialize(MS));
    }
}
Example Packet object. This is one of my smaller ones. I think this might be the one causing the issue, but I don't know how I could tell.
[Serializable()]
public class Packet4BlockVerify : Packet, ISerializable
{
    public byte Index;
    public string MD5Hash;

    public Packet4BlockVerify(int Index, string MD5Hash) : base(4)
    {
        this.Index = (byte)Index;
        this.MD5Hash = MD5Hash;
    }

    protected Packet4BlockVerify(SerializationInfo info, StreamingContext context)
    {
        this.ID = info.GetByte("ID");
        this.Index = info.GetByte("Index");
        this.MD5Hash = info.GetString("MD5Hash");
    }

    public override void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("ID", this.ID);
        info.AddValue("Index", this.Index);
        info.AddValue("MD5Hash", this.MD5Hash);
    }
}
Does anyone see anything wrong?
You are not reading all the bytes you sent. Your receive call:
int BytesReceived = SocketMain.Receive(BufferArray);
returns any number of bytes. You will need to prepend the bytes you send with the size of the remaining bytes, read that, then continue reading until you have all your bytes before trying to deserialize.
TCP sends a continuous byte stream, so your receive call reads arbitrarily sized chunks. One of the overloads lets you specify the number of bytes you want to receive, so after reading the number of bytes you are expecting, you could use that, e.g.
// Warning: untested! (but you get the idea)

// when sending
var payload = MS.ToArray();
var payloadSize = payload.Length;
mySocket.Send(BitConverter.GetBytes(payloadSize));
mySocket.Send(payload);

// when receiving
mySocket.Receive(myBuffer, sizeof(int), SocketFlags.None);
var payloadSize = BitConverter.ToInt32(myBuffer, 0);
mySocket.Receive(myBuffer, payloadSize, SocketFlags.None);
// now myBuffer from index 0 to payloadSize contains the payload you sent
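One caveat with the sketch above: Receive, like Read, may return fewer bytes than requested, so each of those calls really needs a loop. A hedged sketch of such a helper (the extension-method name here is illustrative, not a framework API):

```csharp
using System;
using System.Net.Sockets;

static class SocketExtensions
{
    // Keep calling Receive until exactly 'count' bytes have arrived,
    // or throw if the peer closes the connection first.
    public static void ReceiveExactly(this Socket socket, byte[] buffer, int count)
    {
        int offset = 0;
        while (offset < count)
        {
            int read = socket.Receive(buffer, offset, count - offset, SocketFlags.None);
            if (read == 0)
                throw new SocketException((int)SocketError.ConnectionReset);
            offset += read;
        }
    }
}
```

With this in place, both the 4-byte size prefix and the payload can be read reliably regardless of how TCP chunks the stream.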
What I am doing is attempting to send an IPEndPoint through protobuf-net, and what I observed is that when deserializing the array of 4 bytes into the IPv4 address, the set code receives a value of 8 bytes: four bytes containing the original address, and 4 more bytes containing the address that was serialized. By stepping through the code I have confirmed that when Deserialize is called, it first reads the bytes and then sets the bytes.
After doing some reading I learned about OverwriteList, and as can be seen in the example below, I have set it to true. However, the setter is still given an 8-byte value.
Does anyone have a clue what I am doing wrong?
This sample code should throw an exception when used with protobuf-net r480, Visual Studio 2010 as a .Net 4.0 console application.
using ProtoBuf;
using System.Net;
using System.IO;

namespace ConsoleApplication1
{
    [ProtoContract]
    class AddressOWner
    {
        private IPEndPoint endpoint;

        public AddressOWner()
        { endpoint = new IPEndPoint(new IPAddress(new byte[] { 8, 8, 8, 8 }), 0); }

        public AddressOWner(IPEndPoint newendpoint)
        { this.endpoint = newendpoint; }

        [ProtoMember(1, OverwriteList = true)]
        public byte[] AddressBytes
        {
            get { return endpoint.Address.GetAddressBytes(); }
            set { endpoint.Address = new IPAddress(value); }
        }
    }

    class Program
    {
        static void Main(string[] args)
        {
            AddressOWner ao = new AddressOWner(new IPEndPoint(new IPAddress(new byte[] { 192, 168, 1, 1 }), 80));
            MemoryStream ms = new MemoryStream();
            Serializer.Serialize(ms, ao);
            byte[] messageData = ms.GetBuffer();
            ms = new MemoryStream(messageData);
            AddressOWner aoCopy = Serializer.Deserialize<AddressOWner>(ms);
        }
    }
}
It looks like this is actually a bug, specific to byte[], which is handled as a particular protobuf primitive. Other arrays/lists are mapped to repeated (in protobuf terms), and handle the OverwriteList option correctly. I will tweak the byte[] handling to support this option.
Edit: this is fixed in r484, with supporting integration test
Server side code:
byte[] size = new byte[4];
size = BitConverter.GetBytes(fileData.Length);
stream.Write(size, 0, 4);
Client side code:
byte[] size = new byte[4];
ReadWholeArray(s, size);
int fileSize = BitConverter.ToInt32(size, 0);
Definition of ReadWholeArray method:
public static void ReadWholeArray(Stream stream, byte[] data)
{
    int offset = 0;
    int remaining = data.Length;
    while (remaining > 0)
    {
        int read = stream.Read(data, offset, remaining);
        if (read <= 0)
            throw new EndOfStreamException(
                String.Format("End of stream reached with {0} bytes left to read", remaining));
        remaining -= read;
        offset += read;
    }
}
The program sends fileData.Length (which in this instance is 2422) from the server. On receiving this data at the client side, the value received is -772097985.
Why is the sent data not received without alteration in value? What is the problem?
Okay, simple diagnostics to start with: log the individual contents of the byte array at both ends, so you can see what's going on there.
That way you can see whether it's the binary data that's getting corrupted in your communication protocol, or whether BitConverter is causing your problem. For example, you could have a big-endian BitConverter at one end and a little-endian BitConverter at the other. That seems unlikely, but it's possible, particularly if one of your server or client is running Mono rather than .NET itself.
If that does turn out to be the problem, you might want to use my EndianBitConverter class from MiscUtil, which lets you specify the endianness.
If the problem is in the communications layer, you quite possibly want to install Wireshark to see what's happening at the network level. Are you sure you've read all the data you're meant to have read so far, for example? (If you've only read 15 bytes before this, and the size is written at offset 16, then obviously you'll get the "extra" byte first.)
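A quick way to do that logging, using the question's value (2422 is 0x976, so on a little-endian machine the four wire bytes should be 76-09-00-00):

```csharp
using System;

class ByteDump
{
    static void Main()
    {
        // log what BitConverter produces on this machine before sending
        byte[] size = BitConverter.GetBytes(2422);
        Console.WriteLine(BitConverter.IsLittleEndian);  // True on x86/x64
        Console.WriteLine(BitConverter.ToString(size));  // 76-09-00-00 on little-endian
    }
}
```

Running the same two lines on the receiving machine against the bytes actually read shows immediately whether the corruption happens on the wire or in the conversion.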
This works fine:
private void button2_Click(object sender, RoutedEventArgs e)
{
    MemoryStream ms = new MemoryStream();
    byte[] original = BitConverter.GetBytes((int)2224); // 176, 8, 0, 0
    ms.Write(original, 0, original.Length);
    ms.Seek(0, SeekOrigin.Begin);
    byte[] data = new byte[4];
    int count = ms.Read(data, 0, 4);              // count is 4, data is 176, 8, 0, 0
    int fileSize = BitConverter.ToInt32(data, 0); // is 2224
    return;
}
Can you use Wireshark or something to intercept the bytes? What kind of connection are you using? Could more data be being sent (i.e. telnet control characters at the start of the stream)? Can you debug each end and verify these values, or write the byte array contents to a log file? By the way, writing byte[] size = new byte[4]; is wasteful because BitConverter.GetBytes() returns a new array.