PHP Byte Array Packet - C#

I have a C# library which starts listening on a TCP/IP server and accepts a buffer of a certain size.
I need to send this packet from PHP over the socket as a byte array or equivalent.
The packet is constructed so that, for example, byte[1] (a flag) is a number from 0 to 255, and byte[6] to byte[11] contain a float number in string format, for example:
005.70, which takes 6 bytes, one per character.
I managed to send the flag, but when I try to send the float number it does not convert on the other side (C#).
So my question is: how can I send a byte array to C# using PHP?
From the C# part the conversion is being handled as follows:
float.Parse(System.Text.Encoding.Default.GetString(Data, 6, 6));

Just after I posted the question I worked out the answer myself. I am not 100% sure this is the right way, but it converts correctly.
Here is the answer:
I created an array of characters and escaped the flag (4) so it becomes the actual byte value 4 (PHP treats "\4" in a double-quoted string as an octal escape), but I did not escape the float value:
$string = array (0=>"\0", 1=>"\4", 2=>"\0", 3=>"\0", 4=>"\0", 5=>"\0", 6=>"5", 7=>".", 8=>"7", 9=>"\0", 10=>"\0");
Imploded it all together with an empty string as glue:
$arrByte = implode("", $string);
and sent it over the open socket:
$success = @fwrite($this->socket, $arrByte);
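For reference, the same byte layout can be sketched in Python (the 12-byte total and the field positions are assumptions read from the question, not the author's exact protocol):

```python
flag = 4                      # byte[1]: raw flag value, 0-255
value = "005.70"              # byte[6]..byte[11]: ASCII float string, 6 chars

packet = bytearray(12)        # zero-filled buffer
packet[1] = flag              # store the flag as a raw byte, not the character '4'
packet[6:12] = value.encode("ascii")

# The receiver recovers the fields the same way the C# side does:
recovered_flag = packet[1]
recovered_value = float(packet[6:12].decode("ascii"))
```

The key point, in either language, is that the flag travels as a raw byte value while the float travels as ASCII text.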

Related

How does deserialising byte arrays to UTF-8 know where each character starts/ends?

I am a bit confused about how networking handles this. I have a string in C# and I serialise it to UTF-8. But in UTF-8 each character takes up anywhere from 1 to 4 bytes.
So if my server receives this byte array over the net and deserialises it, knowing it's a UTF-8 string of some size, how does it know how many bytes each character is, so it can convert it properly?
Will I have to include the byte length of each character in the protocol, e.g.:
[message length][char byte length=1][2][char byte length=2][56][123][ ... etc...]
Or is this unnecessary?
UTF-8 encodes the number of bytes required in the bits that make up the character. Read the description on Wikipedia: only single-byte code points start with a zero bit, only the lead byte of a two-byte code point starts with the bits 110, and only continuation bytes inside a multi-byte code point start with 10.
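That self-describing property can be demonstrated directly; here is a small Python sketch that reads the sequence length out of each lead byte (illustration only, not production UTF-8 validation):

```python
def utf8_char_len(lead: int) -> int:
    """Number of bytes in a UTF-8 sequence, read from the lead byte's high bits."""
    if lead & 0b10000000 == 0:            # 0xxxxxxx -> single byte
        return 1
    if lead & 0b11100000 == 0b11000000:   # 110xxxxx -> two bytes
        return 2
    if lead & 0b11110000 == 0b11100000:   # 1110xxxx -> three bytes
        return 3
    if lead & 0b11111000 == 0b11110000:   # 11110xxx -> four bytes
        return 4
    raise ValueError("continuation byte (10xxxxxx) cannot start a character")

# 'a' is 1 byte, 'é' is 2, '€' is 3, '😀' is 4
data = "aé€😀".encode("utf-8")
lengths = []
i = 0
while i < len(data):
    step = utf8_char_len(data[i])
    lengths.append(step)
    i += step
```

So no per-character length field is needed in the protocol; a total message length is still useful for framing, but the characters themselves delimit one another.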

Sending binary data via TCP (C#)

Good morning,
I'm new to network programming but have been doing research and got the basics of setting up a server/client application. I would like to send binary data via TCP from the server to the client to parse and print out integers based on certain field lengths.
I'm basically creating a dummy server to send network data and would like for my client to parse it.
My idea is to create a byte array: byte[] data = {1,0,0,0,1,0,0,1} to represent 8 bytes being set. For example, the client would read the first 2 bytes and print a 2, followed by the next 6 bytes and print a 9.
This is a simple example; the byte array I would like to send would be 864 bytes. I would parse the first 96, 48, 48, etc.
Would this be a good way of doing this? If not, how should I send 1s and 0s? I found many examples sending strings, but I would like to send binary data.
Thanks.
You seem to be confusing bits and bytes.
A byte is composed of 8 bits, which can represent integer values from 0 to 255.
So, instead of sending {1,0,0,0,1,0,0,1}, splitting the byte array and parsing the bytes as bits to get 2 and 9, you could simply create your array as:
byte[] data = { 2, 9 };
To send other primitive data types (int, long, float, double, ...), you can convert them to a byte array:
int x = 96;
byte[] data = BitConverter.GetBytes(x);
The byte array can then be written to the stream:
stream.Write(data, 0, data.Length);
On the client side, parse the byte array back:
int x = BitConverter.ToInt32(data, startIndex);
MSDN has great references on TCP clients and listeners.
https://msdn.microsoft.com/en-us/library/system.net.sockets.tcplistener(v=vs.110).aspx
https://msdn.microsoft.com/en-us/library/system.net.sockets.tcpclient(v=vs.110).aspx
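The BitConverter round trip above can be mirrored in Python's struct module, which makes the byte layout explicit (assuming a little-endian machine, which is what BitConverter produces on typical x86 hardware):

```python
import struct

x = 96
data = struct.pack("<i", x)            # like BitConverter.GetBytes(x): 4 bytes,
                                       # least significant byte first
# data == b"\x60\x00\x00\x00"

(x2,) = struct.unpack_from("<i", data, 0)  # like BitConverter.ToInt32(data, 0)
```

Whichever language each end is written in, both sides just have to agree on the width and byte order of each field.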

Textbox data to Byte then sent to serialport & vice versa

I am very new to C# and to using VS, but need a little help.
I have a textbox where a user can put in a value, for example "658". I want to convert this into bytes first (max 3 bytes) before sending it to the serial port, so the first byte sent is 0x02 and the second byte sent is 0x92.
The second thing I am having issues with is the same but in reverse: I receive data in bytes, for example "0x0B, 0xC7, 0x14", and then I need to convert them into a decimal value and display it in a different textbox.
I have tried a number of conversions that did not seem to work (Parse, ToByte and even a binary converter), so I am in need of help.
Thanks
This should get you started:
Convert From Numeric to Bytes:
var textInput = "658";
// validate...
var numericInput = Convert.ToInt32(textInput);
var convertedToBytes = BitConverter.GetBytes(numericInput);
// if your system is little endian (see below), reverse array output.
Convert From Bytes to Numeric:
// fourth octet is required to convert to an int32, which requires 4 bytes.
var bytesInput = new byte[] { 0x0, 0x0B, 0xC7, 0x14 };
// if your system is little endian (see below), reverse array.
var convertedFromBytes = BitConverter.ToInt32(bytesInput, 0);
Note, you want to pay attention to endian-ness. See this: https://msdn.microsoft.com/en-us/library/bb384066.aspx
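The same round trips can be checked in Python, where the byte order is spelled out explicitly (the 3-byte width and high-byte-first order are assumptions taken from the question's examples):

```python
# Textbox value -> bytes, high byte first: "658" -> 0x00, 0x02, 0x92
n = int("658")
outgoing = n.to_bytes(3, "big")            # b"\x00\x02\x92"

# Received bytes -> decimal value for display
incoming = bytes([0x0B, 0xC7, 0x14])
value = int.from_bytes(incoming, "big")    # 0x0BC714
```

This confirms the question's expectation: 658 is 0x0292, so the meaningful bytes on the wire are 0x02 then 0x92.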
You can use Encoding.GetBytes and Encoding.GetString to convert string to byte[] and back.
https://msdn.microsoft.com/ru-ru/library/ds4kkd55(v=vs.110).aspx
https://msdn.microsoft.com/ru-ru/library/744y86tc(v=vs.110).aspx
That should not be a problem, as both the sending and the receiving serial port will accept/return a byte array. So the question comes down to how you create a byte array from a string.
byte[] bytes = Encoding.ASCII.GetBytes(textBox1.Text);
The way back is:
string s = Encoding.ASCII.GetString(bytes);

Socket message header building

I'm working on a protocol which will transfer blocks of XML data via a TCP socket. Say I need to read all the bytes from an XML file and build a memory buffer. Then, before sending the actual data bytes, I need to send a header to the other peer. Say my protocol uses the header type below.
MessageID=100,Size=232,CRC=190
string strHeader = "100,232,190";
Now I would like to know how I can make this header length fixed (a fixed header length is required for the other peer to identify it as a header) for any amount of XML data. Currently, say I have an XML file sized 283637 bytes, so the message header will look like:
string strHeader = "100,283637,190";
How can I make it generic for any size of data? The code is being written in both C++ and C#.
There are a number of ways to do it.
Fixed Length
You can pad the numbers with leading zeroes so you know exactly the length of the text you need to work with: 000100,000232,000190
Use Bytes instead of strings
If you are using integers, you can read the bytes as integers instead of manipulating the string. Look into the BinaryReader class. If you need to do this on the C++ side, the concept is still the same; I am sure there are many ways to convert 4 bytes into an int.
Specify the length at the beginning
Usually when working with dynamic-length strings, there is an indicator of how many bytes need to be read in order to get the entire string. You could specify the first 4 bytes of your message as the length of the string and then read up to that point.
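A minimal sketch of that length-prefix approach, in Python for illustration (the 4-byte little-endian prefix is one common choice, not the only one):

```python
import struct

def frame(payload: bytes) -> bytes:
    """Prefix the payload with its length as a 4-byte little-endian unsigned int."""
    return struct.pack("<I", len(payload)) + payload

def unframe(buf: bytes) -> bytes:
    """Read the 4-byte length, then return exactly that many payload bytes."""
    (length,) = struct.unpack_from("<I", buf, 0)
    return buf[4:4 + length]

wire = frame(b"<doc>...</doc>")
```

The receiver first reads exactly 4 bytes, decodes the length, then keeps reading until that many payload bytes have arrived.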
The best approach for you is to implement this as a struct, like:
typedef struct _msg_hdr {
    int messageID;
    int size;
    int crc;
} msg_hdr;
This will always be 12 bytes long (assuming 4-byte ints and no padding). Now when sending your message, first send the header to the receiver. The receiver should read it into the same structure. This is the best and easiest way.
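The fixed 12-byte header can be sketched with Python's struct module (the field order and 4-byte little-endian ints are assumptions matching the struct above; a real C struct's layout also depends on compiler padding and platform byte order, so both peers must pin these down):

```python
import struct

# Three 4-byte little-endian ints: messageID, size, crc -> always 12 bytes
header = struct.pack("<iii", 100, 283637, 190)

# The receiver decodes the same fixed layout
message_id, size, crc = struct.unpack("<iii", header)
```

Because the header is always exactly 12 bytes, the receiver can read it unconditionally before deciding how many payload bytes follow.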

How do I send an ASCII character in a C# program

For my C# program to work as a serial port interface, I need to send "ENQ" and expect back "ACK".
From what I understood in replies to my earlier questions in this forum, I think the system expects me to send '5' and get back a '6' (the ASCII equivalents of ENQ and ACK).
So instead of sending a string which is "ENQ", can I send a character which is '5'?
Thanks for all the answers, which pointed out that '5' is not ASCII 5 in C#.
Okay, I read the comments now; I will try:
serialPort.Write(new byte[] { 5 }, 0, 1);
This sends a single unsigned byte with the value 5 through the serial port.
You will need to check to verify, but everything on the wire is an unsigned byte, so values range from 0 to 255.
If you're using the SerialPort object and the WriteLine method, try this:
SerialPort serialPort = ...;
...
string message = ((char) 5).ToString();
serialPort.WriteLine(message);
To address the comments, you can change the SerialPort.NewLine property to be your "end transmission" character which, in this case, may be char 5. I've developed against a few machines that used ETX as its "end of message" character. If you're continually using Write and then need to write the char 5 to end your message, consider this route.
You can't send a char that is 5; in C# a char holds a 16-bit Unicode (UTF-16) code unit.
You can send an unsigned byte with the value 5, though.
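The distinction between the digit character '5' and the byte value 5 is easy to check; a Python illustration (the same fact holds in C#, where (byte)'5' is 53):

```python
# The ASCII code of the digit character '5' is 53 (0x35), not 5.
digit = b"5"            # what you would transmit by sending the string "5"
enq = bytes([5])        # ENQ is the single raw byte 0x05

# Sending "5" would put 0x35 on the wire; the device expects the raw byte 0x05.
```

So to get an ACK back, the raw control byte must be written, not the printable digit.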
