I'm using protobuf3, with C# for my client and C++ for my server, where the .proto files are compiled by the corresponding protoc3 compiler. I'm new to Google Protocol Buffers, and I'm currently trying to figure out how to parse the bytes my client sends back into the originating google::protobuf::Message object on my C++ server.
My client is sending a CodedOutputStream in c# :
public PacketHandler()
{
    m_client = GetComponent<ClientObject>();
    m_packet = new byte[m_client.GetServerConnection().GetMaxPacketSize()];
    m_ostream = new BinaryWriter(new MemoryStream(m_packet));
}
public void DataToSend(IMessage message)
{
    CodedOutputStream output = new CodedOutputStream(m_ostream.BaseStream, true);
    output.WriteMessage(message);
    output.Flush();
    m_client.GetServerConnection().GetClient().Send(m_packet, message.CalculateSize());
}
This seems to be working. The message being sent right now is a simple Ping that looks like this:
// Ping.proto
syntax = "proto3";
package server;

import "BaseMessage.proto";

message Ping {
    base.BaseMessage base = 1;
}

message Pong {
    base.BaseMessage base = 1;
}
My BaseMessage looks like this:
// BaseMessage.proto
syntax = "proto3";
package base;

message BaseMessage {
    uint32 size = 1;
}
My plan is to eventually have all of my messages carry a BaseMessage, to allow identification of the particular message, once I get the parsing and re-parsing figured out.
The received message that I am getting on my c++ server side looks like this : \u0004\n\u0002\b
When receiving the message on the server, I attempt to re-parse it with a CodedInputStream over the received bytes:
PacketHandler::PacketHandler(QByteArray& packet, const Manager::ClientPtr client) :
    m_packet(packet),
    m_client(client)
{
    //unsigned char buffer[512] = { 0 };
    // note: a variable-length array is not standard C++, so copy into a vector instead
    std::vector<unsigned char> data(packet.size(), 0);
    memcpy(data.data(), packet.data(), packet.size());
    google::protobuf::uint32 msgSize;
    google::protobuf::io::CodedInputStream inputStream(data.data(), packet.size());
    //inputStream.ReadVarint32(&msgSize);
    //inputStream.ReadRaw(buffer, packet.size());
    server::Ping pingMsg;
    pingMsg.ParseFromCodedStream(&inputStream);
    qDebug() << pingMsg.base().size();
}
This is where I'm unsure of the process needed to turn the bytes back into the particular message. I believe that if every message carries a BaseMessage, I'll be able to identify which concrete message to create. However, in this current test, where I know it will be a Ping message, ParseFromCodedStream doesn't seem to recreate the original message: in the qDebug() call, pingMsg.base().size() is not the value that was set during the sending phase in my C# client.
I was able to get this solved by incrementing my sent byte count by +1.
public void DataToSend(IMessage message)
{
    CodedOutputStream output = new CodedOutputStream(m_ostream.BaseStream, true);
    output.WriteMessage(message);
    output.Flush();
    (m_ostream.BaseStream as MemoryStream).SetLength(0); // reset stream for next packet(s)
    m_client.GetServerConnection().GetClient().Send(m_packet, message.CalculateSize() + 1);
}
I was initially suspicious as to why I needed to add one; my first thought was a null terminator. The actual reason is that CodedOutputStream.WriteMessage writes the message's length as a varint before the message body, so the bytes on the wire are CalculateSize() plus the prefix (a single byte while the message stays under 128 bytes). Without the addition, my message arrived truncated: #\u0012!\n\u001Ftype.googleapis.com/server.Pin. With the addition, the final byte is included and the message is complete: #\u0012!\n\u001Ftype.googleapis.com/server.Ping.
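For what it's worth, the hard-coded +1 can be avoided: since WriteMessage writes the varint prefix into the stream, the exact number of bytes to send can be read back from the stream position after the flush. A minimal sketch reusing the field names from the snippet above (assuming m_packet is the array backing the MemoryStream):

public void DataToSend(IMessage message)
{
    var stream = (MemoryStream)m_ostream.BaseStream;
    stream.SetLength(0);                     // reset before writing this packet
    CodedOutputStream output = new CodedOutputStream(stream, true);
    output.WriteMessage(message);            // varint length prefix + message body
    output.Flush();
    int bytesWritten = (int)stream.Position; // prefix + body, no hard-coded +1
    m_client.GetServerConnection().GetClient().Send(m_packet, bytesWritten);
}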
Also, during this process I figured out how to classify my messages using the protobuf3 Any type. My BaseMessage is now defined as:
syntax = "proto3";
package base;

import "google/protobuf/any.proto";

message BaseMessage {
    uint32 size = 1;
    google.protobuf.Any msg = 2;
}
This allows me to place any google::protobuf::Message into the msg field; after ParseFromCodedStream, I can check which particular message it holds.
PacketHandler::PacketHandler(QByteArray& packet, const Manager::ClientPtr client) :
    m_packet(packet),
    m_client(client)
{
    std::vector<unsigned char> data(packet.size(), 0);
    memcpy(data.data(), packet.data(), packet.size());
    google::protobuf::io::CodedInputStream inputStream(data.data(), packet.size());

    // read the prefixed length of the message & discard
    google::protobuf::uint32 msgSize;
    inputStream.ReadVarint32(&msgSize);

    // collect the BaseMessage & execute functionality based on type_url
    base::BaseMessage msg;
    if (!msg.ParseFromCodedStream(&inputStream))
    {
        qDebug() << msg.DebugString().c_str();
        return;
    }
    if (msg.msg().Is<server::Ping>())
    {
        server::Ping pingMsg;
        msg.msg().UnpackTo(&pingMsg);
    }
}
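For completeness, the matching C# send side would pack the concrete message into the Any field before serializing. A rough sketch, assuming protoc's default C# namespaces for these packages (Base, Server) and the DataToSend helper from earlier:

using Google.Protobuf.WellKnownTypes;

var ping = new Server.Ping();
var baseMsg = new Base.BaseMessage
{
    Msg = Any.Pack(ping) // sets type_url to type.googleapis.com/server.Ping
};
// caveat: writing into size changes the serialized size, so set it last if at all
baseMsg.Size = (uint)baseMsg.CalculateSize();
DataToSend(baseMsg);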
Related
I want to create the same message and send it with C# as I do with C++, where everything works. Note that I have a C# client (where I have trouble), a C++ client (where everything works fine), and a C++ server that should read messages from both the C# and C++ clients.
Here is how I send the message from C++:
void ConnectAuthserverCommand::SendLogin(tcp::socket &s, const flatbuffers::FlatBufferBuilder &builder) const {
    ClientOpcode opc = CLIENT_LOGIN_REQUEST;
    flatbuffers::FlatBufferBuilder builder2;
    auto email = builder2.CreateString("test#abv.bg");
    auto password = builder2.CreateString("test");
    auto loginRequest = Vibranium::CreateLoginRequest(builder2, email, password);
    builder2.FinishSizePrefixed(loginRequest);
    size_t size2 = builder2.GetSize();
    uint8_t *buf2 = builder2.GetBufferPointer();
    uint8_t *actualBuffer2 = new uint8_t[size2 + 2];
    actualBuffer2[1] = (opc >> 8);   // opcode, little-endian
    actualBuffer2[0] = (opc & 0xFF);
    memcpy(actualBuffer2 + 2, buf2, size2);
    boost::asio::write(s, boost::asio::buffer(actualBuffer2, size2 + 2));
    delete[] actualBuffer2; // free the temporary buffer
}
ClientOpcode is as follows:
enum ClientOpcode : uint16_t {
    CLIENT_AUTH_CONNECTION = 0x001,
    CLIENT_LOGIN_REQUEST   = 0x002,
    CLIENT_NUM_MSG_TYPES   = 0x003,
};
What I do is the following: I take a ClientOpcode, which I want to put in front of the FlatBuffers message, so I create a uint8_t array extended by exactly 2 bytes (because the size of uint16_t is 2 bytes). Then on the server I read the first 2 bytes to get the header. Here is how I do that:
void Vibranium::Client::read_header() {
    auto self(shared_from_this());
    _packet.header_buffer.resize(_packet.header_size);
    boost::asio::async_read(socket,
        boost::asio::buffer(_packet.header_buffer.data(), _packet.header_size),
        [this, self](boost::system::error_code ec, std::size_t bytes_transferred)
        {
            if ((boost::asio::error::eof == ec) || (boost::asio::error::connection_reset == ec))
            {
                Disconnect();
            }
            else
            {
                assert(_packet.header_buffer.size() >= sizeof(_packet.headerCode));
                std::memcpy(&_packet.headerCode, &_packet.header_buffer[0], sizeof(_packet.headerCode));
                if (_packet.headerCode)
                    read_size();
                else
                    Logger::Log("UNKNOWN HEADER CODE", Logger::FatalError);
            }
        });
}
So far so good. However, I am not able to send the same, correctly formatted message from the C# client, even though I send exactly the same data. Take a look:
Client authClient = GameObject.Find("Client").GetComponent<AuthClient>().client; // This is how I get Client class instance.
ClientOpcode clientOpcode = ClientOpcode.CLIENT_LOGIN_REQUEST;
var builder = new FlatBuffers.FlatBufferBuilder(1);
var email = builder.CreateString("test#abv.bg");
var password = builder.CreateString("test");
var loginRequest = LoginRequest.CreateLoginRequest(builder, email, password);
builder.FinishSizePrefixed(loginRequest.Value);
authClient.Send(builder, clientOpcode);
And here is how I actually prepend the header and send the data in C#:
public static Byte[] PrependClientOpcode(FlatBufferBuilder byteBuffer, ClientOpcode code)
{
    var originalArray = byteBuffer.SizedByteArray();
    byte[] buffer = new byte[originalArray.Length + 2];
    buffer[1] = (byte)((ushort)code / 0x0100); // high byte
    buffer[0] = (byte)code;                    // low byte
    Array.Copy(originalArray, 0, buffer, 2, originalArray.Length);
    return buffer;
}

public void Send(FlatBufferBuilder builder, ClientOpcode opcode)
{
    byte[] buffer = builder.SizedByteArray(); // (unused)
    var bufferToSend = PrependClientOpcode(builder, opcode);
    if (bufferToSend.Length > MaxMessageSize)
    {
        Logger.LogError("Client.Send: message too big: " + bufferToSend.Length + ". Limit: " + MaxMessageSize);
        return;
    }
    if (Connected)
    {
        // respect max message size to avoid allocation attacks.
        if (bufferToSend.Length <= MaxMessageSize)
        {
            // add to send queue and return immediately.
            // calling Send here would be blocking (sometimes for long times
            // if other side lags or wire was disconnected)
            sendQueue.Enqueue(bufferToSend);
            sendPending.Set(); // interrupt SendThread WaitOne()
        }
    }
    else
    {
        Logger.LogWarning("Client.Send: not connected!");
    }
}
ClientOpcode enum on C# is as follows:
public enum ClientOpcode : ushort
{
CLIENT_AUTH_CONNECTION = 0x001,
CLIENT_LOGIN_REQUEST = 0x002,
CLIENT_NUM_MSG_TYPES = 0x003,
}
I think I can use ushort as the C# replacement for uint16_t; that is why ClientOpcode is ushort.
When I send the message, the server logs UNKNOWN HEADER CODE. If you look at the C++ server code that reads the header, you'll see this message is printed when the server is unable to read the header code. So somehow I am failing to place the ClientOpcode header correctly in front of the TCP message sent from the C# client.
In order to find out what the differences are, I installed Wireshark on the host and captured both messages. (Wireshark screenshots omitted.) As the TCP dumps below show, the C# message is longer: the C++ message has a length of 58 while the C# message has a length of 62. Why?
The C++ client is sending data:
0200340000000c00000008000c00040008000800000014000000040000000400000074657374000000000b00000074657374406162762e626700
When the C# client is sending:
0000003a0200340000000c00000008000c00040008000800000014000000040000000400000074657374000000000b00000074657374406162762e626700
The C# client is adding 0000003a in front of its message. That is a big-endian 32-bit value, 0x3A = 58, exactly the length of the remainder of the message, so it looks like something in my C# send path is prepending its own length header. If I remove those bytes, the messages are identical and everything should work. Why is my C# client adding this extra data in front, and how can I fix it?
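One way to confirm where the extra bytes come from (a hypothetical diagnostic, not part of my code): check whether the first four bytes, read as a big-endian 32-bit integer, equal the length of the remainder of the frame. For the captured C# dump they do, which points at the send path prepending its own framing header rather than FlatBufferBuilder misbehaving:

static bool HasExtraLengthHeader(byte[] frame)
{
    if (frame.Length < 4) return false;
    int prefix = (frame[0] << 24) | (frame[1] << 16) | (frame[2] << 8) | frame[3];
    return prefix == frame.Length - 4; // true for the captured C# dump (0x3A == 58)
}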
I am new to Modbus communication, and I found some related threads here, but unfortunately they were for other languages or used TCP rather than an RTU connection for Modbus.
So I have this segment of C# code that I can use to send data:
byte address = Convert.ToByte(txtSlaveID.Text);
ushort start = Convert.ToUInt16(txtWriteRegister.Text);
short[] value = new short[1];
if (Int16.TryParse(txtWriteValue.Text, out short numberValue))
{
    value[0] = numberValue; // This part works!
}
else
{
    value = new short[3] { 0x52, 0x4E, 0x56 }; // This is where I am trying to send letters/ASCII ('R', 'N', 'V')
}
try
{
    mb.SendFc16(address, start, (ushort)value.Length, value);
}
catch (Exception err)
{
    WriteLog("Error in write function: " + err.Message);
}
WriteLog(mb.modbusStatus);
WriteLog(mb.modbusStatus);
So when I want to send down a single value, this code works. It takes the short array and does the following to build the packet:
// Put write values into message prior to sending:
for (int i = 0; i < registers; i++)
{
    message[7 + 2 * i] = (byte)(values[i] >> 8); // high byte
    message[8 + 2 * i] = (byte)(values[i]);      // low byte
}
So, as you can see, I attempted to put the hex values in the array and send them down to the registers.
How can I modify the first sample of code to send down hex values and write characters into the register space?
I think the problem is that you are not writing the characters to the device's registers correctly.
You don't need to store hex values in the short array. Simply store the characters in the array, and convert them to bytes before writing them into the registers.
Note: whatever data is written into the device's registers must be in bytes.
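A minimal sketch of what this answer describes, assuming the common Modbus convention of packing two ASCII characters per 16-bit register, high byte first (which matches the message-building loop in the question):

static short[] PackAscii(string text)
{
    byte[] ascii = System.Text.Encoding.ASCII.GetBytes(text);
    short[] regs = new short[(ascii.Length + 1) / 2]; // two chars per register
    for (int i = 0; i < ascii.Length; i++)
    {
        if (i % 2 == 0)
            regs[i / 2] = (short)(ascii[i] << 8); // high byte
        else
            regs[i / 2] |= (short)ascii[i];       // low byte
    }
    return regs;
}

// usage, reusing the names from the question:
// short[] value = PackAscii("RNV");
// mb.SendFc16(address, start, (ushort)value.Length, value);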
I wrote a C# chat program that uses a new (at least for me) system that I call a request system. I don't know if it has been created before, but for now I think of it as my creation :P
Anyhow, this system works like this:
the socket receives a signal
it checks the signal
if the data it just received is the number 2, the client knows the server is about to send a chat message; if the number is 3, the client knows the server is about to send the member list; and so on.
The problem is this: when I step through the code in VS2012, it works fine and the chat behaves properly. When I run it in Debug mode or just run it on my desktop, data seems to go missing, and it shouldn't, because the same code works when stepped through...
Example of the code for sending & receiving a message on the client:
public void RecieveSystem()
{
    while (true)
    {
        byte[] req = new byte[1];
        soc.Receive(req);
        int requestID = int.Parse(Encoding.UTF8.GetString(req));
        if (requestID == 3)
        {
            byte[] textSize = new byte[5];
            soc.Receive(textSize);
            byte[] text = new byte[int.Parse(Encoding.UTF8.GetString(textSize))];
            soc.Receive(text);
            Dispatcher.Invoke(() => { ChatBox.Text += Encoding.UTF8.GetString(text) + "\r\n"; });
        }
    }
}

public void OutSystem(string inputText)
{
    byte[] req = Encoding.UTF8.GetBytes("3");
    soc.Send(req);
    byte[] textSize = Encoding.UTF8.GetBytes(Encoding.UTF8.GetByteCount(inputText).ToString());
    soc.Send(textSize);
    byte[] text = Encoding.UTF8.GetBytes(inputText);
    soc.Send(text);
    Thread.CurrentThread.Abort();
}
and on the server:
public void UpdateChat(string text)
{
    byte[] req = Encoding.UTF8.GetBytes("3");
    foreach (User user in onlineUsers)
        user.UserSocket.Send(req);
    byte[] textSize = Encoding.UTF8.GetBytes(Encoding.UTF8.GetByteCount(text).ToString());
    foreach (User user in onlineUsers)
        user.UserSocket.Send(textSize);
    byte[] data = Encoding.UTF8.GetBytes(text);
    foreach (User user in onlineUsers)
        user.UserSocket.Send(data);
}

public void RequestSystem(Socket soc)
{
    ~~~
        else if (request == 3)
        {
            byte[] dataSize = new byte[5];
            soc.Receive(dataSize);
            byte[] data = new byte[int.Parse(Encoding.UTF8.GetString(dataSize))];
            soc.Receive(data);
            UpdateChat(Encoding.UTF8.GetString(data));
        }
    }
    catch
    {
        if (!soc.Connected)
        {
            Dispatcher.Invoke(() => { OnlineMembers.Items.Remove(decodedName + " - " + soc.RemoteEndPoint); Status.Text += soc.RemoteEndPoint + " Has disconnected"; });
            onlineUsers.Remove(user);
            Thread.CurrentThread.Abort();
        }
    }
}
What could be the problem?
You're assuming that you'll have one packet for each Send call. That's not stream-oriented - that's packet-oriented. You're sending multiple pieces of data which I suspect are coalesced into a single packet, and then you'll get them all in a single Receive call. (Even if there are multiple packets involved, a single Receive call could still receive all the data.)
If you're using TCP/IP, you should be thinking in a more stream-oriented fashion. I'd also encourage you to change the design of your protocol, which is odd to say the least. It's fine to use a length prefix before each message, but why would you want to encode it as text when you've got a perfectly good binary connection between the two computers?
I suggest you look at BinaryReader and BinaryWriter: use TcpClient and TcpListener rather than Socket (or at least use NetworkStream), and use the reader/writer pair to make it easier to read and write pieces of data (either payloads or primitives such as the length of messages). (BinaryWriter.Write(string) even performs the length-prefixing for you, which makes things a lot easier.)
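For illustration, a minimal sketch of that framing, assuming a NetworkStream obtained from a TcpClient (hypothetical helpers, not from the question's code). BinaryWriter.Write(string) length-prefixes the string for you, and BinaryReader.ReadString() reads the prefix and then blocks until the whole payload has arrived:

// requires System.IO and System.Net.Sockets
static void SendChatMessage(NetworkStream stream, string text)
{
    var writer = new BinaryWriter(stream);
    writer.Write((byte)3); // request id: chat message
    writer.Write(text);    // 7-bit-encoded length prefix + UTF-8 bytes
    writer.Flush();
}

static string ReceiveChatMessage(NetworkStream stream)
{
    var reader = new BinaryReader(stream);
    byte requestId = reader.ReadByte(); // blocks until the byte arrives
    if (requestId != 3)
        throw new InvalidDataException("unexpected request id: " + requestId);
    return reader.ReadString(); // reads the length prefix, then the payload
}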
I've been breaking my head over a bug in this system I've been building. Basically, I use sockets to communicate between two C# applications, or rather a Unity C# script server and a C# client application.
With manual tests, the system works perfectly fine, with no anomalies whatsoever.
In order to test performance and multi-user functionality, I wrote a tester class which launches multiple threads (clients) and has each fire X messages at the server. Here's where my problem occurs... sometimes.
When a Socket sends or receives, it returns an integer containing the number of bytes that were sent/received. When the problem occurs, I can see that the correct number of bytes arrived at the server. However, after converting the bytes into a string, I'm suddenly left with an empty string instead of the message I'd normally see.
I'm at a loss as to what's causing this problem. I'm using Encoding.Default.GetString() to translate the bytes into a string.
Any help is appreciated!
David
public void ReceiveFromClient(Socket handlerSocket)
{
    serverBuffer = new byte[iBufferSize]; // iBufferSize = 8192;
    int i = handlerSocket.Receive(serverBuffer);
    Debug.Log("Bytes received: " + i);
    string message = Encoding.UTF8.GetString(serverBuffer, 0, i);
    Debug.Log("Message received: " + message);
    // Do stuff with the message
}

bool SendMessageToUnity(string input)
{
    // returns a bool saying whether the message was sent or not
    if (clientSocket != null)
    {
        if (clientSocket.Connected)
        {
            byte[] bytes = Encoding.UTF8.GetBytes(input + "|");
            txtOutput.BeginInvoke(new Action(() => txtOutput.AppendText("Sending message: " + Encoding.UTF8.GetString(bytes) + Environment.NewLine)));
            int i = clientSocket.Send(bytes);
            txtOutput.BeginInvoke(new Action(() => txtOutput.AppendText("Sending " + i + " bytes. " + Environment.NewLine)));
            return true;
        }
    }
    return false;
}
Look for a zero value ('\0') in your array of bytes before converting it to a string.
private string GetString(byte[] data)
{
    data = data.Where(b => b != 0).ToArray();
    return Encoding.UTF8.GetString(data);
}
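If the zeros are only trailing padding from a fixed-size receive buffer, a safer variant (a sketch, same assumptions as the helper above) trims only the tail, so genuine data bytes are never dropped:

private static string GetStringTrimEnd(byte[] data, int count)
{
    int end = count;
    while (end > 0 && data[end - 1] == 0)
        end--; // drop trailing NUL padding only
    return System.Text.Encoding.UTF8.GetString(data, 0, end);
}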
If you are getting the byte array correctly, then the problem is in the encoding.
Check which encoding the sender uses; it's usually UTF-8, but you have to verify.
Then: var inputStr = Encoding.UTF8.GetString(InputByteArray);
^^
I'm trying to communicate between C# and C++ with varying amounts of success.
I am able to send a message between the two using request/reply, but the doubles I receive back are not correct.
For debugging purposes and understanding, I am currently running the following:
Clrzmq 3.0 rc1, Google ProtocolBuffer 2.5, Protobuf-csharp-port-2.4, ZeroMQ-3.2.3
.proto
package InternalComm;

message Point
{
    optional double x = 1;
    optional double y = 2;
    optional string label = 3;
}
server.cpp (the relevant part)
while (true) {
    zmq::message_t request;

    // Wait for next request from client
    socket.recv (&request);

    // Echo the request back unchanged
    zmq::message_t reply (request.size());
    memcpy ((void*)reply.data(), request.data(), request.size());
    socket.send (reply);
}
client.cs (the relevant part)
public static Point ProtobufPoint(Point point)
{
    Point rtn = new Point(0, 0);
    using (var context = ZmqContext.Create())
    {
        using (ZmqSocket requester = context.CreateSocket(SocketType.REQ))
        {
            requester.Connect("tcp://localhost:5555");
            var p = InternalComm.Point.CreateBuilder().SetX(point.X).SetY(point.Y).Build().ToByteArray();
            requester.Send(p);
            string reply = requester.Receive(Encoding.ASCII);
            Console.WriteLine("Input: {0}", point);
            byte[] bytes = System.Text.Encoding.ASCII.GetBytes(reply);
            var message = InternalComm.Point.ParseFrom(bytes);
            rtn.X = message.X;
            rtn.Y = message.Y;
            Console.WriteLine("Output: {0}", rtn);
        }
    }
    return rtn;
}
On the C# side, Point is a very simple struct. Just x and y properties.
Here is what I'm getting from my unit tests as a result of running the above code.
Input (1.31616874365468, 4.55516872325469)
Output (0.000473917985115791, 4.55516872323627)
Input (274.120398471829, 274.128936418736)
Output (274.077917334613, 274.128936049925)
Input (150.123798461987, 2.345E-12)
Output (145.976459594794, 1.11014954927532E-13)
Input (150, 0)
Output (145.96875, 0)
I am thinking that the problem is that my protobuf code is incorrect (it's doubtful this is a bug on Skeet's side). I am also running under the assumption that server.cpp does nothing to the message and returns it as is.
Thoughts?
The requester.Receive(Encoding.ASCII) call is designed to receive a string, not a block of bytes. You are asking the ZmqSocket instance to return the message as an ASCII string, which is highly likely to modify the content. If you're sending a byte array, receive a byte array.
Try this:
int readSize;
byte[] reply = requester.Receive(null, out readSize);
var message = InternalComm.Point.ParseFrom(reply);
The readSize variable will contain the actual number of valid bytes in the received block, which may vary from the size of the reply array, so you may need to slice up the array to make it palatable to ProtoBuf.
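If a slice is needed, something along these lines should work: copy the valid bytes into an exact-size buffer before parsing.

if (readSize != reply.Length)
{
    var exact = new byte[readSize];
    Array.Copy(reply, exact, readSize);
    reply = exact; // now sized to the valid bytes only
}
var message = InternalComm.Point.ParseFrom(reply);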
Why the ASCII --> bytes --> parsing step? If you're parsing bytes, you should read bytes. If you're parsing text, you should read that.
Unnecessary charset conversions like this are very likely to be erroneous.
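A quick round trip shows why the ASCII detour corrupts the doubles: ASCII decoding maps every byte above 0x7F to '?' (0x3F), so the original bits cannot survive. (The literal below is just the first input value from the unit tests above.)

byte[] original = BitConverter.GetBytes(1.31616874365468);
string asAscii = System.Text.Encoding.ASCII.GetString(original);
byte[] roundTripped = System.Text.Encoding.ASCII.GetBytes(asAscii);
// prints a different value: high bytes were flattened to 0x3F
Console.WriteLine(BitConverter.ToDouble(roundTripped, 0));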