Send text as binary data from socket - c#

I have a device emulator which accepts data as text over a socket. The code below works fine as long as I send values from 0x00 to 0x7F, i.e. within the ASCII range (0 to 127).
The issue arises when I try to send anything beyond that limit, such as 0x80 or 0x81. The emulator receives 0x3F ('?') instead, which changes the whole meaning of the command because the emulator cannot understand it.
What is a possible solution for sending data beyond the ASCII limit?
Send data code:
//string data = textBox1.Text;
string d1 = ConvertHex("35"); //getting exact byte in socket
byte[] buffer = Encoding.ASCII.GetBytes(d1);
clientStream.Write(buffer, 0, buffer.Length);
clientStream.Flush();
ConvertHex function:
public static string ConvertHex(String hexString)
{
    try
    {
        string ascii = string.Empty;
        for (int i = 0; i < hexString.Length; i += 2)
        {
            String hs = string.Empty;
            hs = hexString.Substring(i, 2);
            uint decval = System.Convert.ToUInt32(hs, 16);
            char character = System.Convert.ToChar(decval);
            ascii += character;
        }
        return ascii;
    }
    catch (Exception ex) { Console.WriteLine(ex.Message); }
    return string.Empty;
}

But when I send anything above 7F, I get 3F in the emulator.
7F is in fact the upper bound, because that's 127 in decimal, the highest code point supported by the ASCII encoding. Characters above that range get encoded as a question mark, which has code point 63, or 3F in hexadecimal.
That's because you're using text to transmit binary data. Don't do that. See How can I convert a hex string to a byte array? for a proper implementation of "hex string to byte array".
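As a rough sketch of that idea, assuming the hex string always has an even number of digits, you could convert it straight into a byte array (the helper name HexToBytes is mine) and write those bytes to the stream, skipping strings and Encoding.ASCII entirely:
// Sketch: send the raw bytes instead of ASCII-encoding a string.
// Assumes an even-length hex string and an already connected clientStream.
public static byte[] HexToBytes(string hexString)
{
    byte[] bytes = new byte[hexString.Length / 2];
    for (int i = 0; i < bytes.Length; i++)
    {
        // Parse each pair of hex digits directly into one byte (0x00 to 0xFF).
        bytes[i] = Convert.ToByte(hexString.Substring(i * 2, 2), 16);
    }
    return bytes;
}

// Usage:
// byte[] buffer = HexToBytes("80");   // 0x80 arrives as 0x80, not as 0x3F
// clientStream.Write(buffer, 0, buffer.Length);
// clientStream.Flush();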

Related

Can't convert my string to hex representation

I'm receiving data using a serial port, and I use the following code to convert the string to its hex representation and show it in richTextBox5:
string hex = "";
foreach (char c in RXstring)
{
uint tmp = c;
hex += String.Format("{0:X2}", (uint)System.Convert.ToInt16(tmp.ToString())) ;
}
richTextBox5.AppendText(hex + " <= Hex");
where RXstring is where I store the data from the serial port.
The problem is:
When I send data like 127 (decimal) => 01111111 (binary) => 7F (hex), it is converted correctly. But when I send data like 191 or 167, whose most significant bit is 1 (they are all 8-bit values), the output is 3F regardless of the other bits (the representation of any 8-bit value starting with 1 is 3F). What's wrong with my code?
Can you help? Thanks.
This is an example of using bytes - and it seems to work as you want:
string hex = "";
byte[] RXstring = { 0xFF, 0xCF, 0xB8, 167,191 };
foreach (byte c in RXstring)
{
uint tmp = c;
hex += String.Format("{0:X2}", (uint)System.Convert.ToInt16(tmp.ToString()));
}
System.Console.WriteLine("{0} <= Hex", hex);
I just added serialPort1.Encoding = Encoding.Default; to my serial port and it worked perfectly.
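If you would rather avoid string encodings on the serial port altogether, a minimal sketch (assuming an already open serialPort1 and the same richTextBox5) is to read the raw bytes and format those:
// Sketch: read raw bytes from the port so values above 0x7F are preserved.
// serialPort1 is assumed to be an open System.IO.Ports.SerialPort instance.
byte[] buffer = new byte[serialPort1.BytesToRead];
int read = serialPort1.Read(buffer, 0, buffer.Length);

string hex = "";
for (int i = 0; i < read; i++)
{
    hex += buffer[i].ToString("X2");   // e.g. 191 => "BF", 167 => "A7"
}
richTextBox5.AppendText(hex + " <= Hex");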

c# 2 byte calculation checksum

I need help with a checksum calculation.
This is not my code; it comes from the specification:
http://www.leupamed.at/?wpdmact=process&did=NC5ob3RsaW5r
private void CalcCheckSum(string msg, out byte checksum1, out byte checksum2)
{
    byte cs1 = 0;
    byte cs2 = 0;
    // Always use "\n" as line break when calculating the checksum.
    msg = msg.Replace("\r\n", "\n"); // Find and replace CR LF with LF
    msg = msg.Replace("\r", "\n");   // Find and replace CR with LF.
    for (int i = 0; i < msg.Length; i++)
    {
        cs1 += (byte) msg[i];
        cs2 += cs1;
    }
    checksum1 = cs1;
    checksum2 = cs2;
}
I must create a packet like this:
<!--:Begin:Chksum:1:--><!--:Ack:Msg:3:0:--><!--:End:Chksum:1:184:62:-->
The string <!--:Ack:Msg:3:0:--> is the actual data; I must calculate two checksum bytes (184 and 62) and insert them into the final packet (as seen above).
But my result from the calculation is 10 and 62:
var msg = "<!--:Ack:Msg:3:0:-->";
byte checksum1 = 0;
byte checksum2 = 0;
CalcCheckSum(msg, out checksum1, out checksum2);
I don't know how to calculate the correct checksum values.
This is the checksum for the response, not for validating the request.
I can't upload an image due to low reputation, so look at the last line in the specification: https://drive.google.com/file/d/0B_Gs9q9SJteadVRwSVc1a2FmUTg/edit?usp=sharing
The acknowledge message is independent of the request; it only has to be a response to request message ID 3.
Solution?
After calculating the checksum:
checksum1 = 256 - (10 + 62) = 184
checksum2 = 62
The device is communicating without problems now.
Probably this question is too specific and no one has experience with this type of checksum calculation.
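A minimal sketch of that adjustment, wrapping the specification's CalcCheckSum (the helper name AdjustedCheckSum is mine), might look like this:
// Sketch: apply the 256 - (cs1 + cs2) correction described above.
// Assumes CalcCheckSum from the specification is available in the same class.
private void AdjustedCheckSum(string msg, out byte checksum1, out byte checksum2)
{
    byte cs1, cs2;
    CalcCheckSum(msg, out cs1, out cs2);    // e.g. cs1 = 10, cs2 = 62 for the Ack message
    checksum1 = (byte)(256 - (cs1 + cs2));  // 256 - (10 + 62) = 184
    checksum2 = cs2;                        // the second byte is used as calculated
}

// Usage:
// byte c1, c2;
// AdjustedCheckSum("<!--:Ack:Msg:3:0:-->", out c1, out c2);  // c1 == 184, c2 == 62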

Are there any rules for the XOR cipher?

I have the following method which takes the plain text and the key text. It is supposed to return a string encrypted with the XOR method as ASCII.
public static string encryptXOREng(string plainText, string keyText)
{
    StringBuilder chiffreText = new StringBuilder();
    byte[] binaryPlainText = System.Text.Encoding.ASCII.GetBytes(plainText);
    byte[] binaryKeyText = System.Text.Encoding.ASCII.GetBytes(keyText);
    for (int i = 0; i < plainText.Length; i++)
    {
        int result = binaryPlainText[i] ^ binaryKeyText[i];
        chiffreText.Append(Convert.ToChar(result));
    }
    return chiffreText.ToString();
}
For some characters it runs just fine. But, for example, if it performs XOR on 'G' and 'M', which is 71 XOR 77, it returns 10, and 10 stands for line feed. This is then not represented by a visible character in my output. In some cases this leads to a plain text being encrypted to a cipher string that appears only 2 characters long. I suppose this would make decryption impossible, even with the key? Or are the ASCII characters 0 - 31 there but simply not visible?
To avoid non-printable characters, use Convert.ToBase64String:
public static string encryptXOREng(string plainText, string keyText)
{
    List<byte> chiffreText = new List<byte>();
    byte[] binaryPlainText = System.Text.Encoding.ASCII.GetBytes(plainText);
    byte[] binaryKeyText = System.Text.Encoding.ASCII.GetBytes(keyText);
    for (int i = 0; i < plainText.Length; i++)
    {
        int result = binaryPlainText[i] ^ binaryKeyText[i % binaryKeyText.Length];
        chiffreText.Add((byte)result);
    }
    return Convert.ToBase64String(chiffreText.ToArray());
}
PS: In your code you assume keyText is not shorter than plainText; I fixed that as well.
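For completeness, a hedged sketch of the matching decryption (the method name decryptXOREng is my own) would reverse the Base64 step and XOR with the same key:
// Sketch: decode the Base64 cipher text, XOR with the repeated key,
// and interpret the result as ASCII again.
public static string decryptXOREng(string cipherTextBase64, string keyText)
{
    byte[] cipherBytes = Convert.FromBase64String(cipherTextBase64);
    byte[] binaryKeyText = System.Text.Encoding.ASCII.GetBytes(keyText);
    for (int i = 0; i < cipherBytes.Length; i++)
    {
        cipherBytes[i] ^= binaryKeyText[i % binaryKeyText.Length];
    }
    return System.Text.Encoding.ASCII.GetString(cipherBytes);
}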
As far as I know, there are no rules specific to XOR ciphers. Cryptographic functions often output values that are not printable, which makes sense: the result is not supposed to be readable. Instead, you may want to use the output bytes directly, or a Base64-encoded result.
I would do something like:
public static byte[] XORCipher(string plainText, string keyText)
{
    byte[] binaryPlainText = System.Text.Encoding.ASCII.GetBytes(plainText);
    byte[] binaryKeyText = System.Text.Encoding.ASCII.GetBytes(keyText);
    for (int i = 0; i < plainText.Length; i++)
    {
        binaryPlainText[i] ^= binaryKeyText[i];
    }
    return binaryPlainText;
}

How to convert a string to byte array

I have a string and want to convert it to a byte array of hex values using C#.
For example, "Hello World!" should become byte[] val = new byte[] { 0x48, 0x65, 0x6C, 0x6C, 0x6F, 0x20, 0x57, 0x6F, 0x72, 0x6C, 0x64, 0x21 };
I saw the following code in Converting string value to hex decimal:
string input = "Hello World!";
char[] values = input.ToCharArray();
foreach (char letter in values)
{
// Get the integral value of the character.
int value = Convert.ToInt32(letter);
// Convert the decimal value to a hexadecimal value in string form.
string hexOutput = String.Format("0x{0:X}", value);
Console.WriteLine("Hexadecimal value of {0} is {1}", letter, hexOutput);
}
I want this value in a byte array, but I can't write it like this:
byte[] yy = new byte[values.Length];
yy[i] = Convert.ToByte(Convert.ToInt32(hexOutput));
I tried this code, referenced from How to convert a String to a Hex Byte Array?, where I passed the hex value 48656C6C6F20576F726C6421, but I got the decimal value, not hex.
public byte[] ToByteArray(String HexString)
{
    int NumberChars = HexString.Length;
    byte[] bytes = new byte[NumberChars / 2];
    for (int i = 0; i < NumberChars; i += 2)
    {
        bytes[i / 2] = Convert.ToByte(HexString.Substring(i, 2), 16);
    }
    return bytes;
}
I also tried the code from How can I convert a hex string to a byte array?
But once I use Convert.ToByte or byte.Parse, the value changes to its decimal representation.
What should I do?
Thanks in advance.
I want to send 0x80 (i.e. 128) to the serial port, but when I copy and paste the character equivalent of 128 into the variable 'input' and convert it to a byte, I get 63 (0x3F). So I thought I needed to send a hex array, but I think I got the wrong idea. Please see the screenshot.
For now, I solved this by combining byte arrays.
string input = "Hello World!";
byte[] header = new byte[] { 2, 48, 128 };
byte[] body = Encoding.ASCII.GetBytes(input);
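A minimal sketch of stitching those arrays into a single buffer before sending (the packet variable and the final Write call are assumptions, since the question doesn't show the send code):
// Sketch: concatenate the raw header bytes and the ASCII-encoded body
// so values such as 128 (0x80) are transmitted unchanged.
byte[] packet = new byte[header.Length + body.Length];
Buffer.BlockCopy(header, 0, packet, 0, header.Length);
Buffer.BlockCopy(body, 0, packet, header.Length, body.Length);
// e.g. serialPort1.Write(packet, 0, packet.Length);   // assuming an open SerialPort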
Hexadecimal has nothing to do with this; your desired result is nothing more nor less than an array of bytes containing the ASCII codes.
Try Encoding.ASCII.GetBytes(s).
There's something strange about your requirement:
I have a string and want to convert it to a byte array of hex values using C#.
A byte is just an 8-bit value. You can present it as decimal (e.g. 16) or hexadecimal (e.g. 0x10).
So, what do you really want?
In case you really want to get a string which contains the hex representation of an array of bytes, here's how you can do that:
public static string BytesAsString(byte[] bytes)
{
    string hex = BitConverter.ToString(bytes); // This puts "-" between each value.
    return hex.Replace("-", "");               // So we remove "-" here.
}
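For example, assuming the "Hello World!" string from the question, usage might look like this:
// Sketch: ASCII-encode the string, then render the bytes as one hex string.
byte[] val = System.Text.Encoding.ASCII.GetBytes("Hello World!");
string hex = BytesAsString(val);   // "48656C6C6F20576F726C6421"
Console.WriteLine(hex);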
It seems like you're mixing up converting to an array and displaying array data.
When you have an array of bytes, it's just an array of bytes, and you can represent it in any possible way: binary, decimal, hexadecimal, octal, whatever… but that only applies when you want to represent it visually.
Here is code that manually converts a string to a byte array and then to an array of strings in hex format.
string s1 = "Stack Overflow :)";
byte[] bytes = new byte[s1.Length];
for (int i = 0; i < s1.Length; i++)
{
bytes[i] = Convert.ToByte(s1[i]);
}
List<string> hexStrings = new List<string>();
foreach (byte b in bytes)
{
hexStrings.Add(Convert.ToInt32(b).ToString("X"));
}

Specifying amount of bytes to move over to string in C#

This is what I am trying to do: I am taking byte data from an incoming socket program. By default, the bytes passed in to be encoded and appended to my string will be 1500 bytes, since that is the size I defined for the byte array. My question is: how can I pass in part of the byte array instead of the whole 1500 bytes?
IPAddress localAddr = IPAddress.Parse(args[0]);
System.Console.WriteLine("The local IP is {0}", localAddr);
Int32 port = int.Parse(args[1]);
System.Console.WriteLine("The port is {0}", port);
TcpListener myListener = new TcpListener(localAddr, port);
byte[] bytes = new byte[1500];
string sem = "";
do
{
    int flag = 0;
    int rec = 1;
    Console.Write("Waiting");
    myListener.Start();
    Socket mySocket = myListener.AcceptSocket();
    // receiving the hl7 message
    StringBuilder sbb = new StringBuilder();
    do
    {
        // bytes = null;
        rec = mySocket.Receive(bytes, SocketFlags.None);
        // rec = mySocket.Receive(bytes);
        Console.WriteLine("rec = {0} ", rec);
        for (int i = 0; i < bytes.Length; i++)
        {
            if (bytes[i] == 0x1C)
            {
                flag = 1;
            }
        }
        sbb.Append(Encoding.ASCII.GetString(bytes));
    } while (flag == 0);
Firstly, let's be clear about what the code shown does:
it creates an empty byte array
it decodes from this array using an encoding, creating a new string
it passes this string to append to a StringBuilder
What it doesn't do is "copy bytes to a string" - not least because a string is essentially "char" data (16 bits each), not byte data. If you wanted to treat byte data as char data, it would just about work for UTF-16 (depending on the endianness), but not for ASCII.
Re choosing how much to append:
Encoding.GetString has an overload to specify the offset and count of byte data to consider
StringBuilder.Append has an overload to specify the offset and count of char data to consider
Either or both may be useful here; see the sketch below. However, I don't think the code does what you think it does, and there are easier ways to initialise a StringBuilder.
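As a rough sketch of the first option, assuming rec holds the number of bytes the last Receive call actually filled, the decode line would consider only that many bytes:
// Sketch: decode only the bytes received in this call, not the whole 1500-byte buffer.
rec = mySocket.Receive(bytes, SocketFlags.None);
sbb.Append(Encoding.ASCII.GetString(bytes, 0, rec));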
