I'm receiving data from a serial port, and I'm using the following code to convert the string to its hex representation and show it in richTextBox5:
string hex = "";
foreach (char c in RXstring)
{
uint tmp = c;
hex += String.Format("{0:X2}", (uint)System.Convert.ToInt16(tmp.ToString())) ;
}
richTextBox5.AppendText(hex + " <= Hex");
where RXstring is where I store the data from the serial port.
The problem is:
When I send a value like 127 (decimal) => 01111111 (binary) => 7F (hex), it is converted correctly. But when I send a value like 191 or 167, where the most significant bit of the 8-bit value is 1, the output is always 3F regardless of the other bits. What's wrong with my code?
Can you help? Thanks.
This is an example of using bytes - and it seems to work as you want:
string hex = "";
byte[] RXstring = { 0xFF, 0xCF, 0xB8, 167,191 };
foreach (byte c in RXstring)
{
uint tmp = c;
hex += String.Format("{0:X2}", (uint)System.Convert.ToInt16(tmp.ToString()));
}
System.Console.WriteLine("{0} <= Hex", hex);
I just added serialPort1.Encoding = Encoding.Default; to my serial port and it worked perfectly.
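For reference, the underlying issue is that SerialPort decodes incoming bytes as ASCII by default, so every value above 0x7F turns into '?' (0x3F). Another option is to skip the string entirely and read raw bytes. A minimal sketch, assuming the serialPort1 and richTextBox5 controls from the question and that this runs in a DataReceived handler (requires using System; and using System.Linq;):

private void serialPort1_DataReceived(object sender, System.IO.Ports.SerialDataReceivedEventArgs e)
{
    int count = serialPort1.BytesToRead;
    byte[] buffer = new byte[count];
    int read = serialPort1.Read(buffer, 0, count);

    // Format each received byte as two hex digits; no text decoding is involved.
    string hex = string.Concat(buffer.Take(read).Select(b => b.ToString("X2")));

    // The DataReceived event fires on a worker thread, so marshal back to the UI thread.
    this.BeginInvoke((Action)(() => richTextBox5.AppendText(hex + " <= Hex")));
}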
Related
I have a device emulator which accepts data as text from a socket. The code below works fine as long as I send bytes from 0x00 to 0x7F, i.e. within the ASCII range (0 to 127).
The issue arises when I try to send anything beyond the ASCII limit, like 0x80 or 0x81. The emulator receives 0x3F ('?') instead, which changes the whole meaning of the command because the emulator cannot understand it.
So what is a possible solution for sending data beyond the ASCII limit?
Send data code:
//string data = textBox1.Text;
string d1 = ConvertHex("35"); //getting exact byte in socket
byte[] buffer = Encoding.ASCII.GetBytes(d1);
clientStream.Write(buffer, 0, buffer.Length);
clientStream.Flush();
ConvertHex function:
public static string ConvertHex(String hexString)
{
    try
    {
        string ascii = string.Empty;
        for (int i = 0; i < hexString.Length; i += 2)
        {
            String hs = hexString.Substring(i, 2);
            uint decval = System.Convert.ToUInt32(hs, 16);
            char character = System.Convert.ToChar(decval);
            ascii += character;
        }
        return ascii;
    }
    catch (Exception ex) { Console.WriteLine(ex.Message); }
    return string.Empty;
}
But when I send anything greater than 7F, I get 3F in the emulator.
7F is in fact the upper bound: that's 127 in decimal, the highest code point supported by the ASCII encoding. Any code point higher than that gets encoded as a question mark, which has code point 63, or 3F in hexadecimal.
That's because you're using text to transmit binary data. Don't do that. See How can I convert a hex string to a byte array? for a proper implementation of "hex string to byte array".
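A minimal sketch of that approach, assuming the clientStream from the question; HexStringToBytes is just an illustrative name, not the exact code from the linked answer:

// Convert the hex text straight to bytes and skip the ASCII round-trip entirely.
public static byte[] HexStringToBytes(string hexString)
{
    byte[] bytes = new byte[hexString.Length / 2];
    for (int i = 0; i < bytes.Length; i++)
    {
        bytes[i] = Convert.ToByte(hexString.Substring(i * 2, 2), 16);
    }
    return bytes;
}

// Usage: every byte value from 0x00 to 0xFF reaches the emulator unchanged.
byte[] buffer = HexStringToBytes("3580"); // 0x35, 0x80
clientStream.Write(buffer, 0, buffer.Length);
clientStream.Flush();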
I'm writing a C# application that will receive serial data from 3 different COM ports, configured as 8-bit UART with no parity. The other devices send and receive binary-encoded hex, e.g. AF01h = 10101111 00000001, two hex characters for each byte. I have set up virtual COM ports and a simple application for testing purposes, and am sending data back and forth before I hook the devices up. I found that data is ASCII-encoded by default on transmission and reception, but I need both to be binary-encoded hex. I do not see that option in the encoding class, and would rather not have the application use a completely different encoding than the 3 other devices. Right now I am using this code to convert the string when it is sent:
string binarystring = String.Join(String.Empty, hexstring.Select(c => Convert.ToString(Convert.ToInt32(c.ToString(), 16), 2).PadLeft(4, '0')));
sport.Write(binarystring);
txtReceive.AppendText("[" + dtn + "] " + "Sent: " + binarystring + "\n");
This works for testing transmission for now, but I will eventually change the code to place each two-digit hex number directly into a byte array.
This code lets me enter AF01h = 1010111100000001, but on the receiving end of the application I get 16 bytes of ASCII-encoded characters. Is there a way I can get the app on the same page as the other devices?
Figured out a way to do it. I just needed to convert the long hex string into bytes, two hex characters at a time:
string hex = txtDatatoSend.Text; //"F1AAAF1234BA01"
int numOfChars = hex.Length;     // character count; two hex characters per byte
byte[] outbuffer = new byte[numOfChars / 2];
for (int i = 0; i < numOfChars; i += 2)
{
    outbuffer[i / 2] = Convert.ToByte(hex.Substring(i, 2), 16);
}
sport.Write(outbuffer, 0, outbuffer.Length);
sport.DiscardOutBuffer();
The only caveat is that you have to enter an even number of hex characters.
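If odd-length input is a concern, a small guard before the conversion loop is enough; this is just a sketch, assuming the same txtDatatoSend text box as above:

string hex = txtDatatoSend.Text.Trim();
if (hex.Length % 2 != 0)
{
    // Pad with a leading zero (or reject the input) so Substring(i, 2) never runs past the end.
    hex = "0" + hex;
}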
On the other end the data goes right back into a byte[], and I can decode it like this:
byte[] inbuffer = new byte[sport.BytesToRead];
sport.Read(inbuffer, 0, inbuffer.Length);
txtReceive.AppendText("[" + dtn + "] " + "Received: " + inbuffer.Length + " bytes ");
for (int i = 0; i < inbuffer.Length; i++)
{
    string hexValue = inbuffer[i].ToString("X2");
    txtReceive.AppendText(inbuffer[i] + " is " + hexValue + "HEX ");
}
txtReceive.AppendText("\n");
sport.DiscardInBuffer();
I need help with a checksum calculation.
This is not my code, but comes from the specification:
http://www.leupamed.at/?wpdmact=process&did=NC5ob3RsaW5r
private void CalcCheckSum(string msg, out byte checksum1, out byte checksum2)
{
    byte cs1 = 0;
    byte cs2 = 0;
    // Always use "\n" as line break when calculating the checksum.
    msg = msg.Replace("\r\n", "\n"); // Find and replace CR LF with LF.
    msg = msg.Replace("\r", "\n");   // Find and replace CR with LF.
    for (int i = 0; i < msg.Length; i++)
    {
        cs1 += (byte)msg[i];
        cs2 += cs1;
    }
    checksum1 = cs1;
    checksum2 = cs2;
}
I must create a packet like this:
<!--:Begin:Chksum:1:--><!--:Ack:Msg:3:0:--><!--:End:Chksum:1:184:62:-->
The string <!--:Ack:Msg:3:0:--> is the actual data; I must calculate the two checksum bytes (184 and 62) and insert them into the final packet (as seen above).
But my result from the calculation is 10 and 62.
var msg = "<!--:Ack:Msg:3:0:-->";
byte checksum1 = 0;
byte checksum2 = 0;
CalcCheckSum(msg, out checksum1, out checksum2);
I don't know how to calculate the correct checksum values.
This is the checksum for the response, not for validating the request.
I can't upload an image due to low reputation, so look at the last line in the specification: https://drive.google.com/file/d/0B_Gs9q9SJteadVRwSVc1a2FmUTg/edit?usp=sharing
This acknowledgement message is independent of the request; it only has to be a response to the request with message ID 3.
Solution:
After calculating the checksum:
checksum1 = 256 - (10 + 62) = 184
checksum2 = 62
The device is communicating without problems now.
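For reference, a small wrapper that applies that adjustment on top of the CalcCheckSum routine above; this is only a sketch of the formula that worked for me, not code taken from the specification:

private void CalcFinalCheckSum(string msg, out byte checksum1, out byte checksum2)
{
    byte cs1, cs2;
    CalcCheckSum(msg, out cs1, out cs2);   // raw sums, e.g. 10 and 62 for "<!--:Ack:Msg:3:0:-->"
    checksum1 = (byte)(256 - (cs1 + cs2)); // 256 - (10 + 62) = 184
    checksum2 = cs2;                       // 62
}

// Usage: build the packet with the adjusted check bytes.
byte checksum1, checksum2;
CalcFinalCheckSum("<!--:Ack:Msg:3:0:-->", out checksum1, out checksum2);
string packet = "<!--:Begin:Chksum:1:--><!--:Ack:Msg:3:0:--><!--:End:Chksum:1:" + checksum1 + ":" + checksum2 + ":-->";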
I am running into problems with hex values in Python. I am trying to write a function which performs a bytewise XOR and returns a hex value.
Basically I am trying to convert this C# code to Python:
private byte[] AddParity(string _in)
{
    byte parity = 0x7f;
    List<byte> _out = new List<byte>();
    ASCIIEncoding asc = new ASCIIEncoding();
    byte[] bytes = asc.GetBytes(_in + '\r');
    foreach (byte bt in bytes)
    {
        parity ^= bt;
        _out.Add(bt);
    }
    _out.Add(parity);
    return _out.ToArray();
}
Can someone point me in the right direction?
parity = 0x7f
parities = [int(item,16) ^ parity for item in "4e 7f 2b".split()]
#or maybe
parities = [ord(item) ^ parity for item in "somestring"]
I guess you are using this as some sort of checksum:
parity = 0x7f
bits = []
for bit in "somestring":
    parity ^= ord(bit)
    parity &= 0xFF  # ensure width
    bits.append(bit)
bits.append(parity)
To do the checksum more pythonically you could do the following (this is the answer you want):
# Python 2 code: in Python 3, reduce lives in functools and you would work with bytes rather than str.
bytestring = "vTest\r"
bits = chr(0x7f) + bytestring
checksum = reduce(lambda x, y: chr((ord(x) ^ ord(y)) & 0xff), bits)
message = bytestring + checksum
print map(lambda x: hex(ord(x)), message)
# Result: ['0x76', '0x54', '0x65', '0x73', '0x74', '0xd', '0x32']
# ser.write(message)
if you want to see the hex values
print map(hex,parities)
or to see the binary
print map(bin,parities)
I have a string and want to convert it to a byte array of hex values using C#.
For example, "Hello World!" to byte[] val = new byte[] { 0x48, 0x65, 0x6C, 0x6C, 0x6F, 0x20, 0x57, 0x6F, 0x72, 0x6C, 0x64, 0x21 };
I see the following code in Converting string value to hex decimal
string input = "Hello World!";
char[] values = input.ToCharArray();
foreach (char letter in values)
{
    // Get the integral value of the character.
    int value = Convert.ToInt32(letter);
    // Convert the decimal value to a hexadecimal value in string form.
    string hexOutput = String.Format("0x{0:X}", value);
    Console.WriteLine("Hexadecimal value of {0} is {1}", letter, hexOutput);
}
I want these values in a byte array, but I can't write it like this:
byte[] yy = new byte[values.Length];
yy[i] = Convert.ToByte(Convert.ToInt32(hexOutput));
I tried the code below, referenced from How to convert a String to a Hex Byte Array?, where I passed the hex value 48656C6C6F20576F726C6421, but I got decimal values, not hex.
public byte[] ToByteArray(String HexString)
{
    int NumberChars = HexString.Length;
    byte[] bytes = new byte[NumberChars / 2];
    for (int i = 0; i < NumberChars; i += 2)
    {
        bytes[i / 2] = Convert.ToByte(HexString.Substring(i, 2), 16);
    }
    return bytes;
}
and I also tried the code from How can I convert a hex string to a byte array?
But once I use Convert.ToByte or byte.Parse, the values show up as decimal.
What should I do?
Thanks in advance.
I want to send 0x80 (i.e. 128) to the serial port, but when I copy and paste the character equivalent of 128 into the variable 'input' and convert it to a byte, I get 63 (0x3F). So I think I need to send a hex array. I think I got the wrong idea. Please see the screenshot.
For now, I solve this by combining byte arrays.
string input = "Hello World!";
byte[] header = new byte[] { 2, 48, 128 };
byte[] body = Encoding.ASCII.GetBytes(input);
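To actually combine and send them, something like the following should work; this is a sketch, assuming a SerialPort instance named port (the question does not name one):

byte[] frame = new byte[header.Length + body.Length];
Buffer.BlockCopy(header, 0, frame, 0, header.Length);
Buffer.BlockCopy(body, 0, frame, header.Length, body.Length);
port.Write(frame, 0, frame.Length); // raw bytes go out unchanged, so 0x80 is preserved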
Hexadecimal has nothing to do with this; your desired result is nothing more nor less than an array of bytes containing the ASCII codes.
Try Encoding.ASCII.GetBytes(s)
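For the example string from the question, that one call already produces exactly the array shown above:

// Requires using System.Text;
byte[] val = Encoding.ASCII.GetBytes("Hello World!");
// val is { 0x48, 0x65, 0x6C, 0x6C, 0x6F, 0x20, 0x57, 0x6F, 0x72, 0x6C, 0x64, 0x21 }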
There's something strange with your requirement:
I have a string and want to convert it to a byte array of hex values using C#.
A byte is just an 8-bit value. You can present it as decimal (e.g. 16) or hexadecimal (e.g. 0x10).
So, what do you really want?
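A one-liner illustrates the point; the byte is the same either way, only the formatting differs:

byte b = 0x10;
Console.WriteLine(b);                // prints 16 (decimal formatting)
Console.WriteLine(b.ToString("X2")); // prints 10 (hexadecimal formatting of the same value)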
In case you really want to get a string containing the hex representation of an array of bytes, here's how you can do that:
public static string BytesAsString(byte[] bytes)
{
    string hex = BitConverter.ToString(bytes); // This puts "-" between each value.
    return hex.Replace("-", "");               // So we remove "-" here.
}
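For example, combined with Encoding.ASCII.GetBytes this round-trips the question's sample string:

string hex = BytesAsString(Encoding.ASCII.GetBytes("Hello World!"));
// hex == "48656C6C6F20576F726C6421"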
It seems like you're mixing up converting to an array and displaying array data.
When you have an array of bytes, it's just an array of bytes, and you can represent it in any possible way: binary, decimal, hexadecimal, octal, whatever… but that only matters when you want to display it.
Here is code that manually converts a string to a byte array and then to an array of strings in hex format:
string s1 = "Stack Overflow :)";
byte[] bytes = new byte[s1.Length];
for (int i = 0; i < s1.Length; i++)
{
    bytes[i] = Convert.ToByte(s1[i]);
}
List<string> hexStrings = new List<string>();
foreach (byte b in bytes)
{
    hexStrings.Add(Convert.ToInt32(b).ToString("X"));
}