How do I send an ASCII character in a C# program?

For my C# program to work as a serial port interface, I need to send "ENQ" and expect back "ACK".
From what I understood in replies to my earlier questions in this forum, I think the system expects me to send '5' and give back a '6' (ASCII equivalents of ENQ and ACK).
So instead of sending a string which is "ENQ", can I send a character which is '5'?
Thanks for all the answers, which pointed out that '5' is not ASCII 5 in C#.
Okay, I read the comments now; I will try:
serialPort.Write(new byte[] { 5 }, 0, 1);

Send an unsigned byte through the serial port with the value 5.
You will need to check to be sure, but everything is probably unsigned, so you can send values from 0 to 255.
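For instance, a minimal sketch of that exchange (the port name and settings are assumptions; adjust them to your device):

using System;
using System.IO.Ports;

class EnqAckDemo
{
    static void Main()
    {
        // "COM1" and the settings below are assumptions; use your device's parameters.
        using (var serialPort = new SerialPort("COM1", 9600, Parity.None, 8, StopBits.One))
        {
            serialPort.ReadTimeout = 2000;                 // give the device 2 seconds to answer
            serialPort.Open();

            serialPort.Write(new byte[] { 0x05 }, 0, 1);   // 0x05 = ENQ

            int response = serialPort.ReadByte();          // throws TimeoutException if nothing arrives
            Console.WriteLine(response == 0x06 ? "Got ACK" : "Unexpected byte: " + response);
        }
    }
}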

If you're using the SerialPort object and the WriteLine method, try this:
SerialPort serialPort = ...;
...
string message = ((char) 5).ToString();
serialPort.WriteLine(message);
To address the comments, you can change the SerialPort.NewLine property to be your "end transmission" character, which in this case may be char 5. I've developed against a few machines that used ETX as their "end of message" character. If you're continually using Write and then need to write char 5 to end your message, consider this route.
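A quick sketch of that approach (the port name and payload are placeholders):

SerialPort serialPort = new SerialPort("COM1", 9600);   // port name and baud rate are assumptions
serialPort.NewLine = ((char)5).ToString();              // WriteLine/ReadLine now use 0x05 as the terminator
serialPort.Open();
serialPort.WriteLine("your message here");              // the 0x05 terminator is appended automatically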

You can't send a character that's 5; in C#, chars hold 16-bit Unicode characters.
You can send an unsigned byte with the value 5, though.

Related

Should I check endianness when using System.Text.Encoding.Unicode.GetBytes(aString)?

I'm writing something that converts data to byte[], transfers it over the internet, then converts it back to what it was, for a Unity game project.
I use BitConverter to convert int, float, etc., as the following example shows:
float aFloat = 312321f;
var bytes = BitConverter.GetBytes(aFloat);
if (BitConverter.IsLittleEndian) Array.Reverse(bytes);
// sending through the internet
byte[] bytes = GetByteArrayFromTheInternet();
if (BitConverter.IsLittleEndian) Array.Reverse(bytes);
float aFloat = BitConverter.ToSingle(bytes, 0);
I do the endianness check before and after sending the data to make sure they're the same. Do I need to do this for string?
string aString = "testing";
var bytes = System.Text.Encoding.Unicode.GetBytes(aString);
// if (BitConverter.IsLittleEndian) Array.Reverse(bytes); // Do I need this line?
// sending through the internet
byte[] bytes = GetByteArrayFromTheInternet();
// if (BitConverter.IsLittleEndian) Array.Reverse(bytes); // Do I need this too?
string aString = System.Text.Encoding.Unicode.GetString(bytes);
Thanks in advance!
I do the endianness check before and after sending the data to make sure they're the same. Do I need to do this for string?
That depends on who you're talking to on the network. What endianness are they using?
In your first example, you are assuming that the network protocol always sends float types (32-bit floating point) as big-endian. Which is fine; traditionally, "network byte order" has always been big-endian, so it's a good choice for a network protocol.
But there's no requirement that a network protocol comply with that, nor that it be internally self-consistent, and you haven't provided any information about what protocol you're implementing.
Note: by "network protocol", I'm referring to the application-level protocol. This would be something like HTTP, SMTP, FTP, POP, etc. I.e. whatever your application chooses for the format of bytes on the network.
So, you'll have to consult the specification of the protocol you're using to find out what endianness the Unicode-encoded (UTF-16) data uses. I would guess that it's big-endian, since your float values are, too. But I can't say that for sure.
Note that if the network protocol does encode text as big-endian UTF16, then you don't need to swap the bytes for each character yourself. Just use the BigEndianUnicode encoding object to encode and decode the text. It will handle the endianness for you.
Note also that it's not really optional to use the right encoder. All that checking the BitConverter.IsLittleEndian field tells you is the current CPU architecture. But if the text on the network protocol is encoded as big-endian, then even if you are running on a big-endian CPU, you still need to use the BigEndianUnicode encoding. Just like that one will always reliably decode big-endian text, the Unicode encoding always decodes the text as if it's little-endian, even when running on a big-endian CPU.
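For instance, a minimal sketch of the question's string example, assuming the protocol does specify big-endian UTF-16:

string aString = "testing";

// Sending side: encode as big-endian UTF-16 (assuming that is what the protocol specifies).
byte[] bytes = System.Text.Encoding.BigEndianUnicode.GetBytes(aString);
// ...send bytes through the internet...

// Receiving side: decode with the same encoding; no manual Array.Reverse is needed.
string decoded = System.Text.Encoding.BigEndianUnicode.GetString(bytes);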

Python to C#: how to format data for sockets?

I am translating a Python communication library into C#, and am having trouble interpreting how the string gets formatted before being sent over TCP.
The relevant portion of the code is as follows:
struct.pack(
    '!HHBH'+str(var_name_len)+'s',
    self.msg_id,
    req_len,
    flag,
    var_name_len,
    self.varname
)
Then it gets sent with: sendall()
I have looked at the Python documentation (https://docs.python.org/2/library/struct.html) but am still drawing a blank regarding the format string: '!HHBH'+str(var_name_len)+'s'. I understand this is where the formatting is set, but what it is being formatted to is beyond me.
The python code that I am translating can be found at the following link:
https://github.com/linuxsand/py_openshowvar/blob/master/py_openshowvar.py
Any python and C# vigilantes out there that can help me build this bridge?
Edit: Based on jas' answer, I have written the following C# struct:
public struct messageFormat
{
    ushort messageId;
    ushort reqLength;
    char functionType;
    ushort varLengthHex;
    string varname;
    ...
Once I populate it, I will need to send it over TCP. I have an open socket, but I need to convert it to a byte[], I assume, so I can use Socket.Send(byte[])?
Thanks
What's being formatted are the five arguments following the format string. Each argument has a corresponding element in the format string.
For the sake of the explanation, let's assume that var_name_len has the value 12 (presumably because var_name is a string of length 12 in this hypothetical case).
So the format string will be
!HHBH12s
Breaking that down according to the docs:
! Big-endian byte ordering will be used
H self.msg_id will be packed as a two-byte unsigned short
H req_len will be packed as above
B flag will be packed as a one-byte unsigned char
H var_name_len will be packed as a two-byte unsigned short
12s self.varname will be packed as a 12-byte string
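For reference, a rough C# counterpart of that packing. This is only a sketch: PackMessage and WriteUInt16BigEndian are names invented here, ASCII variable names are assumed, and req_len is left as a parameter because its calculation depends on the library.

using System.IO;
using System.Text;

// Sketch: pack the '!HHBH<len>s' layout by hand, big-endian as '!' requires.
static byte[] PackMessage(ushort messageId, ushort reqLen, byte flag, string varname)
{
    byte[] nameBytes = Encoding.ASCII.GetBytes(varname);   // assumes ASCII variable names
    ushort varNameLen = (ushort)nameBytes.Length;

    using (var ms = new MemoryStream())
    using (var writer = new BinaryWriter(ms))
    {
        WriteUInt16BigEndian(writer, messageId);   // H
        WriteUInt16BigEndian(writer, reqLen);      // H
        writer.Write(flag);                        // B (single byte)
        WriteUInt16BigEndian(writer, varNameLen);  // H
        writer.Write(nameBytes);                   // <len>s
        return ms.ToArray();
    }
}

// BinaryWriter writes multi-byte values little-endian, so emit the two bytes
// explicitly to get network (big-endian) byte order.
static void WriteUInt16BigEndian(BinaryWriter writer, ushort value)
{
    writer.Write((byte)(value >> 8));
    writer.Write((byte)value);
}

The resulting byte[] can then be passed to Socket.Send, which roughly corresponds to Python's sendall() (Send may need to be called in a loop to guarantee everything is written).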

Php Byte Array Packet

I have a C# library which basically starts listening as a TCP/IP server and accepts a buffer of a certain size.
I need to send this packet from PHP over the socket in the form of a byte array or equivalent.
The packet is constructed so that, for example, byte[1] (a flag) is a number from 0 to 255, and byte[6] to byte[11] contain a float number in string format, for example:
005.70, which takes 6 bytes, one per character.
I managed to send the flag, but when I try to send the float number it does not convert on the other side (C#).
So my question is: how can I send a byte array to C# using PHP?
From the C# part the conversion is being handled as follows:
float.Parse(System.Text.Encoding.Default.GetString(Data, 6, 6));
Just after I posted the question, I worked out my own answer. I am not 100% sure if this is the right way, but it converts correctly.
Here is the answer:
I created an array of characters and escaped the flag (4) so that it is the actual byte value 4, but I didn't escape the money value:
$string = array (0=>"\0", 1=>"\4", 2=>"\0", 3=>"\0", 4=>"\0", 5=>"\0", 6=>"5", 7=>".", 8=>"7", 9=>"\0", 10=>"\0");
Imploded all together with nothing as glue:
$arrByte = implode("", $string);
and sent over the opened socket:
$success = #fwrite($this->socket, $arrByte);
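For completeness, a small C# sketch of reading that layout on the receiving side (ParsePacket is a name invented here; data is the byte[] received from the socket, laid out as described in the question):

using System.Globalization;
using System.Text;

static (byte flag, float value) ParsePacket(byte[] data)
{
    byte flag = data[1];                                                // flag byte, 0-255
    string text = Encoding.ASCII.GetString(data, 6, 6).TrimEnd('\0');   // e.g. "005.70", minus padding NULs
    return (flag, float.Parse(text, CultureInfo.InvariantCulture));     // parse independent of locale
}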

Socket message header building

I'm working on a protocol which will transfer blocks of XML data via a TCP socket. Now say I need to read all the bytes from an XML file and build a memory buffer. Then, before sending the actual data bytes, I need to send a header to the other peer. Say my protocol uses the header format below.
MessageID=100,Size=232,CRC=190
string strHeader = "100,232,190"
Now I would like to know how I can make this header a fixed length (a fixed header length is required for the other peer to identify it as a header) for any amount of XML data. Say I currently have an XML file of 283637 bytes, so the message header will look like:
string strHeader = "100,283637,190"
How can I make it generic for any size of data? The code is being written in both C++ and C#.
There are a number of ways to do it.
Fixed Length
You can pad the numbers with leading zeroes so you know exactly what length of text you need to work with: 000100,000232,000190
Use Bytes instead of strings
If you are using integers, you can read the bytes as integers instead of manipulating the string. Look into the BinaryReader class. If you need to do this on the C++ side, the concept is still the same; I am sure there are many ways to convert 4 bytes into an int.
Specify the length at the beginning
Usually when working with dynamic-length strings, there is an indicator of how many bytes need to be read in order to get the entire string. You could specify the first 4 bytes of your message as the length of your string and then read up to that point, as sketched below.
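For instance, here is a C# sketch of the length-prefix framing (the method names are invented for illustration, and the C++ side would have to read the same layout and byte order):

using System.IO;
using System.Text;

static byte[] FrameHeader(string header)
{
    byte[] headerBytes = Encoding.ASCII.GetBytes(header);   // e.g. "100,283637,190"
    using (var ms = new MemoryStream())
    using (var writer = new BinaryWriter(ms))
    {
        writer.Write(headerBytes.Length);   // 4-byte length prefix (little-endian; agree on byte order with the C++ side)
        writer.Write(headerBytes);          // the variable-length header text
        return ms.ToArray();
    }
}

static string ReadFramedHeader(Stream stream)
{
    using (var reader = new BinaryReader(stream, Encoding.ASCII, leaveOpen: true))
    {
        int length = reader.ReadInt32();                // read the 4-byte prefix
        byte[] headerBytes = reader.ReadBytes(length);  // then exactly that many bytes
        return Encoding.ASCII.GetString(headerBytes);
    }
}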
The best approach for you is to implement this as a struct like
typedef struct _msg_hdr {
    int messageID;
    int size;
    int crc;
} msg_hdr;
With 4-byte ints and no structure padding, this header is always 12 bytes long. Now when sending your message, first send the header to the receiver; the receiver should read it into the same structure. This is the best and easiest way.
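A C# counterpart of that header could look like the sketch below (assuming 4-byte little-endian ints on both sides; the C++ reader must use the same layout and byte order):

using System.IO;

static byte[] PackHeader(int messageId, int size, int crc)
{
    using (var ms = new MemoryStream())
    using (var writer = new BinaryWriter(ms))
    {
        writer.Write(messageId);   // 4 bytes
        writer.Write(size);        // 4 bytes
        writer.Write(crc);         // 4 bytes
        return ms.ToArray();       // always 12 bytes
    }
}

static (int messageId, int size, int crc) UnpackHeader(byte[] header)
{
    using (var reader = new BinaryReader(new MemoryStream(header)))
    {
        return (reader.ReadInt32(), reader.ReadInt32(), reader.ReadInt32());
    }
}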

TextReader.ReadToEnd vs. Unix wc Command

Another question re. Unicode, terminals and now C# and wc. If I write this simple piece of code
int i = 0;
foreach (char c in Console.In.ReadToEnd())
{
    if (c != '\n') i++;
}
Console.WriteLine("{0}", i);
and feed it only the character "€" (3 bytes in UTF-8), wc returns 3 characters (maybe using wint_t, though I haven't checked), but ReadToEnd() returns 1 (one character). What exactly is the behavior of ReadToEnd in this case? How do I know what ReadToEnd is doing behind the scenes?
I'm running xterm initialized with utf-8.en.US, running Ubuntu Linux and Mono.
Thank you.
wc and most Unix-like commands deal with characters in terms of the C char data type, which is usually an unsigned 8-bit integer. wc simply reads the bytes from the standard input one by one with no conversion and determines that there are 3 characters.
.NET deals with characters in terms of its own Char data type, which is a 16-bit unsigned integer and represents a UTF-16 character. The Console class has received the 3 bytes of input, determined that the console it is attached to is UTF-8, and has properly converted them to a single UTF-16 euro character.
You need to take the character encoding into consideration. Currently you are merely counting chars, and chars and bytes are not necessarily the same size.
Encoding encoding = Encoding.UTF8;
string s = "€";
int byteCount = encoding.GetByteCount(s);
Console.WriteLine(byteCount); // prints "3" on the console
byte[] bytes = new byte[byteCount];
encoding.GetBytes(s, 0, s.Length, bytes, 0);
int charCount = encoding.GetCharCount(bytes);
Console.WriteLine(charCount); // prints "1" on the console
ReadToEnd returns a string. All strings in .NET are Unicode. They're not just an array of bytes.
Apparently, wc is returning the number of bytes. The number of bytes and the number of characters used to be the same thing.
wc, by default, returns the number of lines, words and bytes in a file. If you want it to return the number of characters according to the active locale's encoding, rather than just the number of bytes, then you should look at the -m or --chars option which modern versions of wc have.
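To see both numbers from the C# side, here is a small sketch, assuming the terminal really does feed UTF-8 as in the xterm setup described in the question:

using System;
using System.Text;

class CharVsByteCount
{
    static void Main()
    {
        string input = Console.In.ReadToEnd();        // already decoded to UTF-16 chars

        int charCount = 0;
        foreach (char c in input)
            if (c != '\n') charCount++;               // same count the question's loop produces

        // Re-encode to count the bytes wc -c would see (this includes any newline bytes).
        int byteCount = Encoding.UTF8.GetByteCount(input);

        Console.WriteLine("chars: {0}, bytes: {1}", charCount, byteCount);
    }
}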
