I have a string and want to convert it to a byte array of hex values using C#.
For example, "Hello World!" should become byte[] val = new byte[] { 0x48, 0x65, 0x6C, 0x6C, 0x6F, 0x20, 0x57, 0x6F, 0x72, 0x6C, 0x64, 0x21 };
I saw the following code in Converting string value to hex decimal:
string input = "Hello World!";
char[] values = input.ToCharArray();
foreach (char letter in values)
{
    // Get the integral value of the character.
    int value = Convert.ToInt32(letter);
    // Convert the decimal value to a hexadecimal value in string form.
    string hexOutput = String.Format("0x{0:X}", value);
    Console.WriteLine("Hexadecimal value of {0} is {1}", letter, hexOutput);
}
I want to put these values into a byte array, but I can't write it like this:
byte[] yy = new byte[values.Length];
yy[i] = Convert.ToByte(Convert.ToInt32(hexOutput));
I tried the code below, referenced from How to convert a String to a Hex Byte Array?, where I passed the hex value 48656C6C6F20576F726C6421, but I got the decimal value, not hex.
public byte[] ToByteArray(String HexString)
{
    int NumberChars = HexString.Length;
    byte[] bytes = new byte[NumberChars / 2];
    for (int i = 0; i < NumberChars; i += 2)
    {
        bytes[i / 2] = Convert.ToByte(HexString.Substring(i, 2), 16);
    }
    return bytes;
}
and I also tried the code from How can I convert a hex string to a byte array?
But once I use Convert.ToByte or byte.Parse, the value changes to a decimal value.
How should I do this?
Thanks in advance.
I want to send 0x80 (i.e., 128) to a serial port, but when I copy and paste the character equivalent to 128 into the variable input and convert it to a byte, I get 63 (0x3F). So I thought I needed to send a hex array; I think I had the wrong idea. Please see the screenshot.
For now, I have solved this by combining byte arrays:
string input = "Hello World!";
byte[] header = new byte[] { 2, 48, 128 };
byte[] body = Encoding.ASCII.GetBytes(input);
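A minimal sketch of the combining step (Buffer.BlockCopy is one option; the message layout is just header followed by body):
byte[] message = new byte[header.Length + body.Length];
Buffer.BlockCopy(header, 0, message, 0, header.Length);
Buffer.BlockCopy(body, 0, message, header.Length, body.Length);
// message = { 0x02, 0x30, 0x80, 0x48, 0x65, ... }, ready for SerialPort.Write(message, 0, message.Length)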
Hexadecimal has nothing to do with this; your desired result is nothing more nor less than an array of bytes containing the ASCII codes.
Try Encoding.ASCII.GetBytes(s).
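For example, with the input from the question:
string s = "Hello World!";
byte[] val = Encoding.ASCII.GetBytes(s); // needs using System.Text;
// val is { 0x48, 0x65, 0x6C, 0x6C, 0x6F, 0x20, 0x57, 0x6F, 0x72, 0x6C, 0x64, 0x21 }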
There's something strange with your requirement:
I have a string and want to convert it to a byte array of hex value
using C#.
A byte is just an 8-bit value. You can present it as decimal (e.g. 16) or hexadecimal (e.g. 0x10).
So, what do you really want?
In case you really want a string containing the hex representation of an array of bytes, here's how you can do that:
public static string BytesAsString(byte[] bytes)
{
    string hex = BitConverter.ToString(bytes); // This puts "-" between each value.
    return hex.Replace("-", ""); // So we remove "-" here.
}
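For example (a usage sketch, reusing the ASCII bytes of the question's input):
byte[] data = Encoding.ASCII.GetBytes("Hello World!");
Console.WriteLine(BytesAsString(data)); // prints 48656C6C6F20576F726C6421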
It seems like you're mixing up converting a string to an array and displaying the array data.
When you have an array of bytes, it's just an array of bytes, and you can represent it in any base: binary, decimal, hexadecimal, octal, whatever. But that only matters when you want to display the values.
Here is code that manually converts a string to a byte array and then to an array of strings in hex format:
string s1 = "Stack Overflow :)";
byte[] bytes = new byte[s1.Length];
for (int i = 0; i < s1.Length; i++)
{
    // Works here because every char in s1 is in the ASCII range and fits in a byte.
    bytes[i] = Convert.ToByte(s1[i]);
}
// The hex part is purely presentation: format each byte as a hex string.
List<string> hexStrings = new List<string>();
foreach (byte b in bytes)
{
    hexStrings.Add(b.ToString("X"));
}
I'm using the .NET port of libsodium. The hash generation function has two forms, one that accepts byte arrays and one that accepts strings:
public static byte[] ArgonHashBinary(string password, string salt, long opsLimit, int memLimit, long outputLength = ARGON_SALTBYTES)
public static byte[] ArgonHashBinary(byte[] password, byte[] salt, long opsLimit, int memLimit, long outputLength = ARGON_SALTBYTES)
What I'm having an issue with is getting both forms to produce the same hash when the input values are identical.
var saltAsBytes = PasswordHash.ArgonGenerateSalt();
var saltAsString = Encoding.UTF8.GetString(saltAsBytes);
var tmp = Encoding.UTF8.GetBytes(saltAsString);
var hash1 = PasswordHash.ArgonHashBinary(password, saltAsString, 6, 134217728, 16);
var hash2 = PasswordHash.ArgonHashBinary( Encoding.UTF8.GetBytes(password), saltAsBytes, 6, 134217728, 16);
Anything with "PasswordHash." is libsodium and not my code.
In the code above, when I convert the salt to a string and then back to a byte array, the resulting byte array is always a different length. ArgonGenerateSalt() produces a byte array with a length of 16; when I convert it back from a string, the length is generally ~30 (different every time because different salts are produced).
Why am I converting to UTF-8? Because that's what they do internally:
https://github.com/adamcaudill/libsodium-net/blob/master/libsodium-net/PasswordHash.cs
public static byte[] ArgonHashBinary(string password, string salt, StrengthArgon limit = StrengthArgon.Interactive, long outputLength = ARGON_SALTBYTES)
{
return ArgonHashBinary(Encoding.UTF8.GetBytes(password), Encoding.UTF8.GetBytes(salt), limit, outputLength);
}
When I convert the salt to a UTF8 string, the hashing function fails because they check the length of the byte array to make sure it's 16 bytes. If I convert it to an ASCII string, it works but produces a different hash (which is expected).
To clarify: the hashing piece in this code is not the issue. Figuring out why tmp is different from saltAsBytes is the key.
I think the problem here is that the ArgonGenerateSalt method doesn't return a UTF8-encoded string; it returns completely random bytes.
You can't decode random bytes as a UTF8 string and expect them to round-trip. A trivial example of where this blows up:
var data = new byte[] { 128 };
var dataAsString = Encoding.UTF8.GetString( data );
var dataAsBytes = Encoding.UTF8.GetBytes( dataAsString );
After this, dataAsBytes will be 3 bytes (specifically 239, 191, 189).
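If you need the salt in string form at some point (an assumption about your setup, not something the library requires), Base64 round-trips arbitrary bytes losslessly:
var salt = PasswordHash.ArgonGenerateSalt();               // 16 random bytes
var saltAsString = Convert.ToBase64String(salt);           // safe string form
var roundTripped = Convert.FromBase64String(saltAsString); // the same 16 bytes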
Converting a byte array to string and then back again produced different results
Binary data cannot be converted to a string and then back to a byte array
using Encoding.[AnyEncoding].GetString and Encoding.[AnyEncoding].GetBytes.
Instead, use Convert.ToBase64String and Convert.FromBase64String.
You can easily test it:
var bytes = new byte[] { 255, 255, 255 };
var buf = Encoding.UTF8.GetString(bytes);
var newbytes = Encoding.UTF8.GetBytes(buf);
newbytes will have a length of 9: each 0xFF byte is invalid on its own in UTF-8, so it decodes to the replacement character U+FFFD, which encodes back as three bytes.
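For contrast, a minimal sketch of the Base64 round trip, which preserves the bytes exactly:
var bytes = new byte[] { 255, 255, 255 };
var base64 = Convert.ToBase64String(bytes);        // "////"
var roundTrip = Convert.FromBase64String(base64);  // { 255, 255, 255 }, length 3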
Edit: This is the test case for @Theo:
var bytes = new byte[] { 0, 216 }; //any new byte[] { X, 216 };
var buf = Encoding.Unicode.GetString(bytes);
var newbytes = Encoding.Unicode.GetBytes(buf); //253,255
I want to create a byte array that contains 64 bits. How can I get the value of a particular bit, say the 17th bit, and how can I get the hex value of the bytes at that index? I did it like this; is this correct?
byte[] _byte = new byte[8];
var bit17 = (_byte[2] >> 1) & 0x01; // bit 17 (zero-based) is bit 1 of byte 2
string hex = BitConverter.ToString(_byte, 2, 4).Replace("-", string.Empty);
You could use a BitArray:
var bits = new BitArray(64);
bool bit17 = bits[17];
I'm not sure what you mean by the "hex value of that bit" - it will be 0 or 1, because it's a bit.
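If the bits should come from your existing byte array rather than start out all zero, BitArray also accepts a byte[] (a sketch under that assumption; it needs using System.Collections):
byte[] data = new byte[8];     // your 64 bits
var bits = new BitArray(data); // bits[17] maps to bit 1 of data[2], LSB first
bool bit17 = bits[17];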
If you have the index of a bit in a byte (between 0 and 7 inclusive) then you can convert that to a hex string as follows:
int bitNumber = 7; // For example.
byte value = (byte)(1 << bitNumber);
string hex = value.ToString("x");
Console.WriteLine(hex);
You can just use the ToString() method:
byte[] arr= new byte[8];
int index = 0;
string hexValue = arr[index].ToString("X");
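One caveat worth noting: the "X" format does not zero-pad, so bytes below 0x10 come out one digit long; "X2" keeps every byte two digits wide:
byte b = 10;
Console.WriteLine(b.ToString("X"));  // A
Console.WriteLine(b.ToString("X2")); // 0A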
I have a string text = "0a00...4c617374736e6e41". This string actually contains hex values as chars. What I am trying to do is the following conversion, without changing e.g. the char 'a' to 0x41:
text = "0a...4c617374736e6e41";
--> byte[] bytes = { 0x0a, ..., 0x4c, 0x61, 0x73, 0x74, 0x73, 0x6e, 0x6e, 0x41 };
This is what I tried to implement so far:
...
string text = "0a00...4c617374736e6e41";
var storage = StringToByteArray(text);
...
Console.ReadKey();
public static byte[] StringToByteArray(string text)
{
    char[] buffer = new char[text.Length / 2];
    byte[] bytes = new byte[text.Length / 2];
    using (StringReader sr = new StringReader(text))
    {
        int c = 0;
        while (c < text.Length)
        {
            sr.Read(buffer, 0, 2);
            Console.WriteLine(buffer);
            //How do I store the blocks in the byte array in the needed format?
            c += 2;
        }
    }
    return bytes; // still unfilled; storing the parsed bytes is the open question
}
The Console.WriteLine(buffer) gives me the two chars I need, but I have no idea how to put them into the desired format.
Here are some links I already found on the topic; however, I was not able to transfer them to my problem:
How would I read an ascii string of hex values in to a byte array?
How do you convert Byte Array to Hexadecimal String, and vice versa?
Try this
string text = "0a004c617374736e6e41";
List<byte> output = new List<byte>();
for (int i = 0; i < text.Length; i += 2)
{
    output.Add(byte.Parse(text.Substring(i, 2), System.Globalization.NumberStyles.HexNumber));
}
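If a plain array is needed afterwards, output.ToArray() does it; and on .NET 5 or later (an assumption about your target framework), Convert.FromHexString replaces the whole loop:
byte[] bytes = output.ToArray();
// or, on .NET 5+:
byte[] bytes2 = Convert.FromHexString(text);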
So, I have a string whose characters I want to convert to hex values and then put in a byte array to be sent through a COM port.
I can convert the individual characters to the hex that I need to send, but I can't get that array of strings into a byte array correctly.
example:
string beforeConverting = "HELLO";
String[] afterConverting = {"0x48", "0x45", "0x4C", "0x4C", "0x4F"};
should become
byte[] byteData = new byte[]{0x48, 0x45, 0x4C, 0x4C, 0x4F};
I've tried several different things from several different posts, but I can't get the right combination working. If anyone could point me in the right direction or give me a snippet of example code, that would be awesome!
If your final aim is to send byte[], then you can actually skip the middle step and do the conversion from string to byte[] immediately using Encoding.ASCII.GetBytes (provided that you are sending ASCII characters):
string beforeConverting = "HELLO";
byte[] byteData = Encoding.ASCII.GetBytes(beforeConverting);
//will give you {0x48, 0x45, 0x4C, 0x4C, 0x4F};
If you don't send ASCII, you can find the appropriate Encoding type (like Unicode or UTF32), depending on your needs.
That being said, if you still want to convert the hex strings to a byte array, you could do something like this:
/// <summary>
/// To convert a hex data string (e.g. 0x01455687) to bytes
/// </summary>
/// <param name="hexString"></param>
/// <returns></returns>
public static byte[] HexStringToBytes(string hexString) {
    try {
        if (hexString.Length >= 3) //must have a minimum length of 3
            if (hexString[0] == '0' && (hexString[1] == 'x' || hexString[1] == 'X'))
                hexString = hexString.Substring(2);
        int dataSize = (hexString.Length + 1) / 2; //round up so an odd-length string gets padded
        int expectedStringLength = 2 * dataSize;
        while (hexString.Length < expectedStringLength)
            hexString = "0" + hexString; //zero padding in the front
        int NumberChars = hexString.Length / 2;
        byte[] bytes = new byte[NumberChars];
        using (var sr = new StringReader(hexString)) {
            for (int i = 0; i < NumberChars; i++)
                bytes[i] = Convert.ToByte(new string(new char[2] { (char)sr.Read(), (char)sr.Read() }), 16);
        }
        return bytes;
    } catch {
        return null;
    }
}
And then use it like this:
byte[] byteData = afterConverting.Select(x => HexStringToBytes(x)[0]).ToArray();
The method above is more general; it can handle an input string like 0x05163782 and give byte[4]. For your use, you only need to take the first byte (the byte[] will always be byte[1]), hence the [0] index in the LINQ Select.
The core method used in the custom method above is Convert.ToByte():
bytes[i] = Convert.ToByte(new string(new char[2] { (char)sr.Read(), (char)sr.Read() }), 16);
To convert just the hexadecimal string to a number, you could use the System.Convert class like so
string hex = "0x3B";
byte b = Convert.ToByte(hex.Substring(2), 16);
// b is now 0x3B
Substring is used to skip the 0x prefix.
I have the following method, which takes the plain text and the key text. It is supposed to return the plain text encrypted with the XOR method, as an ASCII string.
public static string encryptXOREng(string plainText, string keyText)
{
    StringBuilder chiffreText = new StringBuilder();
    byte[] binaryPlainText = System.Text.Encoding.ASCII.GetBytes(plainText);
    byte[] binaryKeyText = System.Text.Encoding.ASCII.GetBytes(keyText);
    for (int i = 0; i < plainText.Length; i++)
    {
        int result = binaryPlainText[i] ^ binaryKeyText[i];
        chiffreText.Append(Convert.ToChar(result));
    }
    return chiffreText.ToString();
}
For some characters it runs just fine. But, for example, if it performs XOR on 'G' and 'M', which is 71 XOR 77, it returns 10, and 10 stands for line feed. This is then not represented by a visible character in my output, so a plain text of some length can be encrypted to a cipher string that appears only 2 characters long in some cases. I suppose this would make decryption impossible, even with the key? Or are the ASCII characters 0 - 31 there but simply not visible?
To avoid non-printable chars, use Convert.ToBase64String:
public static string encryptXOREng(string plainText, string keyText)
{
    List<byte> chiffreText = new List<byte>();
    byte[] binaryPlainText = System.Text.Encoding.ASCII.GetBytes(plainText);
    byte[] binaryKeyText = System.Text.Encoding.ASCII.GetBytes(keyText);
    for (int i = 0; i < plainText.Length; i++)
    {
        int result = binaryPlainText[i] ^ binaryKeyText[i % binaryKeyText.Length];
        chiffreText.Add((byte)result);
    }
    return Convert.ToBase64String(chiffreText.ToArray());
}
PS: In your code you assume keyText is not shorter than plainText; I fixed that as well.
As far as I know there are no rules specific to XOR ciphers. Cryptographic functions often output values that are not printable, which makes sense: the result is not supposed to be readable. Instead, you may want to use the output bytes directly, or a Base64-encoded result.
I would do something like:
public static byte[] XORCipher(string plainText, string keyText)
{
    byte[] binaryPlainText = System.Text.Encoding.ASCII.GetBytes(plainText);
    byte[] binaryKeyText = System.Text.Encoding.ASCII.GetBytes(keyText);
    for (int i = 0; i < plainText.Length; i++)
    {
        // Assumes keyText is at least as long as plainText.
        binaryPlainText[i] ^= binaryKeyText[i];
    }
    return binaryPlainText;
}
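A usage sketch (illustrative names): since XOR is its own inverse, applying the same key again restores the plain text, and Base64 gives a printable form for transport:
byte[] cipher = XORCipher("HELLO", "WORLD");
Console.WriteLine(Convert.ToBase64String(cipher)); // printable ciphertext

byte[] key = System.Text.Encoding.ASCII.GetBytes("WORLD");
for (int i = 0; i < cipher.Length; i++)
    cipher[i] ^= key[i]; // XOR with the same key undoes the cipher
Console.WriteLine(System.Text.Encoding.ASCII.GetString(cipher)); // HELLO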