Converting String to Hex then into Byte Array - c#

So,
I have a string where I want to convert each character to its hex value and then put the values in a byte array to be sent through a COM port.
I can convert the individual characters to the hex strings I need to send, but I can't get that array of strings into a byte array correctly.
example:
string beforeConverting = "HELLO";
String[] afterConverting = {"0x48", "0x45", "0x4C", "0x4C", "0x4F"};
should become
byte[] byteData = new byte[]{0x48, 0x45, 0x4C, 0x4C, 0x4F};
I've tried several different things from several different posts but I can't get the right combination of things together. If anyone could point me in the right direction or give me a snippet of example code that would be awesome!

If your final aim is to send byte[], then you can actually skip the middle step and immediately do the conversion from string to byte[] using Encoding.ASCII.GetBytes (provided that you are sending ASCII characters):
string beforeConverting = "HELLO";
byte[] byteData = Encoding.ASCII.GetBytes(beforeConverting);
//will give you {0x48, 0x45, 0x4C, 0x4C, 0x4F};
If you aren't sending ASCII, you can find the appropriate Encoding type (like Unicode or UTF32), depending on your needs.
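For instance, a minimal sketch of how the byte output differs between encodings (the byte values shown follow directly from the encodings' definitions):
using System;
using System.Text;

string s = "HELLO";
byte[] ascii = Encoding.ASCII.GetBytes(s);   // { 0x48, 0x45, 0x4C, 0x4C, 0x4F } - 5 bytes
byte[] utf16 = Encoding.Unicode.GetBytes(s); // 10 bytes - each char becomes 2 bytes (UTF-16LE)
byte[] utf32 = Encoding.UTF32.GetBytes(s);   // 20 bytes - each char becomes 4 bytes

Console.WriteLine(BitConverter.ToString(utf16)); // 48-00-45-00-4C-00-4C-00-4F-00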
That being said, if you still want to convert the hex strings to a byte array, you could do something like this:
/// <summary>
/// Converts a hex data string (e.g. "0x01455687") to bytes.
/// </summary>
/// <param name="hexString">Hex string, with or without the "0x"/"0X" prefix</param>
/// <returns>The decoded bytes, or null if the input is not valid hex</returns>
public static byte[] HexStringToBytes(string hexString) {
    try {
        if (hexString.Length >= 3) // must have a minimum length of 3 to carry a "0x" prefix
            if (hexString[0] == '0' && (hexString[1] == 'x' || hexString[1] == 'X'))
                hexString = hexString.Substring(2);
        int dataSize = (hexString.Length + 1) / 2; // round up so odd-length input isn't truncated
        int expectedStringLength = 2 * dataSize;
        while (hexString.Length < expectedStringLength)
            hexString = "0" + hexString; // zero padding in the front
        int numberChars = hexString.Length / 2;
        byte[] bytes = new byte[numberChars];
        using (var sr = new StringReader(hexString)) {
            for (int i = 0; i < numberChars; i++)
                bytes[i] = Convert.ToByte(new string(new char[2] { (char)sr.Read(), (char)sr.Read() }), 16);
        }
        return bytes;
    } catch {
        return null;
    }
}
And then use it like this:
byte[] byteData = afterConverting.Select(x => HexStringToBytes(x)[0]).ToArray();
The method above is more general: it can handle an input string like 0x05163782 and give back byte[4]. For your use, each input string is always exactly one byte (byte[1]), hence the [0] index in the LINQ Select.
The core method used in the custom method above is Convert.ToByte():
bytes[i] = Convert.ToByte(new string(new char[2] { (char)sr.Read(), (char)sr.Read() }), 16);

To convert just the hexadecimal string to a number, you can use the System.Convert class like so:
string hex = "0x3B";
byte b = Convert.ToByte(hex.Substring(2), 16);
// b is now 0x3B
Substring is used to skip the characters 0x.
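Putting it together for the afterConverting array from the original question, a one-liner sketch (assuming each entry carries the 0x prefix and using System.Linq):
string[] afterConverting = { "0x48", "0x45", "0x4C", "0x4C", "0x4F" };

// Strip the "0x" prefix from each entry and parse the remaining pair of hex digits as one byte.
byte[] byteData = afterConverting
    .Select(hex => Convert.ToByte(hex.Substring(2), 16))
    .ToArray();

Console.WriteLine(BitConverter.ToString(byteData)); // 48-45-4C-4C-4F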

Related

How to put chars that actually represent hex values in a byte array

I have a string text = 0a00...4c617374736e6e41. This string actually contains hex values as chars. What I am trying to do is the following conversion, without, e.g., turning the char 'a' into its character code:
text = 0a...4c617374736e6e41;
--> byte[] bytes = {0x0a, ..., 0x4c, 0x61, 0x73, 0x74, 0x73, 0x6e, 0x6e, 0x41};
This is what I tried to implement so far:
...
string text = "0a00...4c617374736e6e41";
var storage = StringToByteArray(text);
...
Console.ReadKey();

public static byte[] StringToByteArray(string text)
{
    char[] buffer = new char[text.Length / 2];
    byte[] bytes = new byte[text.Length / 2];
    using (StringReader sr = new StringReader(text))
    {
        int c = 0;
        while (c <= text.Length)
        {
            sr.Read(buffer, 0, 2);
            Console.WriteLine(buffer);
            // How do I store the blocks in the byte array in the needed format?
            c += 2;
        }
    }
}
The Console.WriteLine(buffer) gives me the two chars I need. But I have NO idea how to put them in the desired format.
Here are some links I already found on the topic; however, I was not able to transfer them to my problem:
How would I read an ascii string of hex values in to a byte array?
How do you convert Byte Array to Hexadecimal String, and vice versa?
Try this:
string text = "0a004c617374736e6e41";
List<byte> output = new List<byte>();
for (int i = 0; i < text.Length; i += 2)
{
    output.Add(byte.Parse(text.Substring(i, 2), System.Globalization.NumberStyles.HexNumber));
}
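To get the byte[] the question asks for and sanity-check the round trip, a short continuation of the snippet above (assumes System.Linq is imported):
byte[] bytes = output.ToArray(); // { 0x0A, 0x00, 0x4C, 0x61, ... }

// Round-trip check: re-hex the bytes and compare against the source string.
string roundTrip = string.Concat(bytes.Select(b => b.ToString("x2")));
Console.WriteLine(roundTrip == text); // True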

How to convert a large hexadecimal string to a byte array?

Hi, I need to do some file handling, and for that I used a method for converting a hexadecimal string into a byte array.
public static byte[] StringToByteArray(string hex)
{
    return Enumerable.Range(0, hex.Length)
                     .Where(x => x % 2 == 0)
                     .Select(x => Convert.ToByte(hex.Substring(x, 2), 16))
                     .ToArray();
}
My problem is, when I give a small hexadecimal string as a parameter to this function it produces the right output, but when I use a large hexadecimal string as a parameter the output is not what I expected.
For your clear understanding:
I used a hexadecimal string which had been converted from a byte array of length [26246026]. When I convert that hex string back into a byte array it should return [26246026] bytes, but it's returning only partial bytes, i.e. [262144].
I can't get the exact bytes back from the hex string. How can I get that? Please, someone, help me to get the expected output.
My input string for that method contains the hexadecimal content of a 25 MB file; it should return [26246026] bytes, but it's returning only [262144] bytes.
When I'm using a small hex string (from a minimum-size file) it works fine, but when I work on big files I can't get the original file bytes. Please suggest what to do.
My input parameter string content is as follows, as asked in the comments.
It's 524288 characters in length in total.
It looks like this:
3026b2758e66cf11a6d900aa0062ce6c301600000000000008000000010240a4d0d207e3d21197f000a0c95ea850cc0000000000000004001c00530066004f0072006900670069006e0061006c00460050005300000003000400b49204001c0057004d004600530044004b00560065007200730069006f006e00000000001e00310031002e0030002e0036003000300031002e00370030003000300000001a0057004d004600530044004b004e006500650064006500640000000000160030002e0030002e0030002e00300030003000300000000c0049007300560042005200000002000400000000003326b2758e66cf11a6..........................................................................................................................................
d900aa0062ce6c54010000000000001e0000003a00da000000570dcb8b495848cea4609eca906bc24db442394f0ddac5eb0604fb99820bcc30ff0f1736eefd74cd4317a21a369e208c580dbb02f90e888f0a35901e08439ec6087c61d241bc3c476c24d311291a678596a98792a9000b68adf213906e0f00097c8d989e517ee532fcd6cb70e520ec9dd4fad8a1a37668bbd678bea11c1fcf2d187c4c4c6c09c3c2c53d3e64016cfebc34eace85d45a4c08cd78d05d3934e05b72ec194304848165a8c1a585c78423
/// <summary>
/// Parses a continuous hex stream from a string.
/// </summary>
public static byte[] ParseHexBytes(this string s)
{
    if (s == null)
        throw new ArgumentNullException("s");
    if (s.Length == 0)
        return new byte[0];
    if (s.Length % 2 != 0)
        throw new ArgumentException("Source length error", "s");

    int length = s.Length >> 1;
    byte[] result = new byte[length];
    for (int i = 0; i < length; i++)
    {
        result[i] = Byte.Parse(s.Substring(i * 2, 2), NumberStyles.HexNumber);
    }
    return result;
}
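Usage would look something like this (a sketch; "input.hex" is a hypothetical stand-in for wherever the hex text actually comes from):
// Hypothetical source of the hex text; any string source works.
string hexText = System.IO.File.ReadAllText("input.hex").Trim();
byte[] bytes = hexText.ParseHexBytes();
Console.WriteLine(bytes.Length); // always exactly hexText.Length / 2
Note that the output is always exactly half the character count, so the 524288-character string above can never produce more than 262144 bytes; if [26246026] bytes are expected, the hex string itself must be incomplete.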

How to convert a string to byte array

I have a string and want to convert it to a byte array of hex values using C#.
For example, "Hello World!" should become byte[] val = new byte[] { 0x48, 0x65, 0x6C, 0x6C, 0x6F, 0x20, 0x57, 0x6F, 0x72, 0x6C, 0x64, 0x21 };.
I see the following code in Converting string value to hex decimal
string input = "Hello World!";
char[] values = input.ToCharArray();
foreach (char letter in values)
{
// Get the integral value of the character.
int value = Convert.ToInt32(letter);
// Convert the decimal value to a hexadecimal value in string form.
string hexOutput = String.Format("0x{0:X}", value);
Console.WriteLine("Hexadecimal value of {0} is {1}", letter, hexOutput);
}
I want this value in a byte array, but I can't write it like this:
byte[] yy = new byte[values.Length];
yy[i] = Convert.ToByte(Convert.ToInt32(hexOutput));
I tried the code referenced from How to convert a String to a Hex Byte Array?, where I passed the hex value 48656C6C6F20576F726C6421, but I got decimal values, not hex.
public byte[] ToByteArray(String HexString)
{
    int NumberChars = HexString.Length;
    byte[] bytes = new byte[NumberChars / 2];
    for (int i = 0; i < NumberChars; i += 2)
    {
        bytes[i / 2] = Convert.ToByte(HexString.Substring(i, 2), 16);
    }
    return bytes;
}
and I also try code from How can I convert a hex string to a byte array?
But once I use Convert.ToByte or byte.Parse, the value changes to a decimal value.
What should I do?
Thanks in advance
I want to send 0x80 (i.e., 128) to a serial port, but when I copy and paste the character equivalent to 128 into the variable 'input' and convert it to a byte, I get 63 (0x3F). So I thought I needed to send a hex array; I think I got the wrong idea. Please see the screenshot.
For now, I solved this by combining byte arrays.
string input = "Hello World!";
byte[] header = new byte[] { 2, 48, 128 };
byte[] body = Encoding.ASCII.GetBytes(input);
Hexadecimal has nothing to do with this; your desired result is nothing more nor less than an array of bytes containing the ASCII codes.
Try Encoding.ASCII.GetBytes(s)
There's something strange with your requirement:
I have a string and want to convert it to a byte array of hex value
using C#.
A byte is just an 8-bit value. You can present it as decimal (e.g. 16) or hexadecimal (e.g. 0x10).
So, what do you really want?
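For instance, the same 8-bit value prints either way; only the formatting changes:
byte b = 0x10;                    // exactly the same value as decimal 16
Console.WriteLine(b);             // prints "16"   (decimal formatting)
Console.WriteLine("0x{0:X2}", b); // prints "0x10" (hexadecimal formatting)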
In case you are really wanting to get a string which contains the hex representation of an array of bytes, here's how you can do that:
public static string BytesAsString(byte[] bytes)
{
    string hex = BitConverter.ToString(bytes); // This puts "-" between each value.
    return hex.Replace("-", "");               // So we remove "-" here.
}
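For example (illustrative input):
byte[] data = { 0x48, 0x65, 0x6C, 0x6C, 0x6F };
Console.WriteLine(BytesAsString(data)); // prints "48656C6C6F"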
It seems like you're mixing up converting to an array and displaying array data.
When you have an array of bytes, it's just an array of bytes, and you can represent it in any possible way: binary, decimal, hexadecimal, octal, whatever... but that only matters when you want to represent it visually.
Here is code that manually converts a string to a byte array and then converts that to an array of strings in hex format.
string s1 = "Stack Overflow :)";
byte[] bytes = new byte[s1.Length];
for (int i = 0; i < s1.Length; i++)
{
bytes[i] = Convert.ToByte(s1[i]);
}
List<string> hexStrings = new List<string>();
foreach (byte b in bytes)
{
hexStrings.Add(Convert.ToInt32(b).ToString("X"));
}

Base-N encoding of a byte array

A couple of days ago I came across this CodeReview for Base-36 encoding a byte array. However, the answers that followed didn't touch on decoding back into a byte array, or possibly reusing the answer to perform encodings of different bases (radix).
The answer for the linked question uses BigInteger. So as far as implementation goes, the base and its digits could be parametrized.
The problem with BigInteger though, is that we're treating our input as an assumed integer. However, our input, a byte array, is just an opaque series of values.
If the byte array ends in a series of zero bytes, e.g. {0xFF,0x7F,0x00,0x00}, those bytes will be lost when using the algorithm in the answer (it would only encode {0xFF,0x7F}).
If the last non-zero byte has the sign bit set, then the zero byte that follows it is consumed, since it's treated as the BigInt's sign delimiter. So {0xFF,0xFF,0x00,0x00} would encode only as {0xFF,0xFF,0x00}.
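Both edge cases are easy to reproduce with System.Numerics.BigInteger directly; a minimal sketch:
using System;
using System.Numerics;

// Little-endian input ending in zero bytes: the value is 0x7FFF,
// so ToByteArray() hands back only { 0xFF, 0x7F }.
var pos = new BigInteger(new byte[] { 0xFF, 0x7F, 0x00, 0x00 });
Console.WriteLine(BitConverter.ToString(pos.ToByteArray())); // FF-7F

// Last non-zero byte 0xFF has its sign bit set, so a single zero byte
// survives as the sign delimiter: { 0xFF, 0xFF, 0x00 }.
var neg = new BigInteger(new byte[] { 0xFF, 0xFF, 0x00, 0x00 });
Console.WriteLine(BitConverter.ToString(neg.ToByteArray())); // FF-FF-00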
How could a .NET programmer use BigInteger to create a reasonably efficient and radix-agnostic encoder, with decoding support, plus the ability to handle endian-ness, and with the ability to 'work around' the ending zero bytes being lost?
edit [2020/01/26]: FWIW, the code below along with its unit test live alongside my open source libraries on Github.
edit [2016/04/19]: If you're fond of exceptions, you may wish to change some of the Decode implementation code to throw InvalidDataException instead of just returning null.
edit [2014/09/14]: I've added a 'HACK' to Encode() to handle cases where the last byte in the input is signed (if you were to convert to sbyte). Only sane solution I could think of right now is to just Resize() the array by one. Additional unit tests for this case passed, but I didn't rerun perf code to account for such cases. If you can help it, always have your input to Encode() include a dummy 0 byte at the end to avoid additional allocations.
Usage
I've created a RadixEncoding class (found in the "Code" section) which initializes with three parameters:
The radix digits as a string (length determines the actual radix of course),
The assumed byte ordering (endian) of input byte arrays,
And whether or not the user wants the encode/decode logic to acknowledge ending zero bytes.
To create a Base-36 encoding, with little-endian input, and with no respect given to ending zero bytes:
const string k_base36_digits = "0123456789abcdefghijklmnopqrstuvwxyz";
var base36_no_zeros = new RadixEncoding(k_base36_digits, EndianFormat.Little, false);
And then to actually perform encoding/decoding:
const string k_input = "A test 1234";
byte[] input_bytes = System.Text.Encoding.UTF8.GetBytes(k_input);
string encoded_string = base36_no_zeros.Encode(input_bytes);
byte[] decoded_bytes = base36_no_zeros.Decode(encoded_string);
Performance
Timed with Diagnostics.Stopwatch, run on an i7 860 @ 2.80 GHz. The timing EXE was run by itself, not under a debugger.
Encoding was initialized with the same k_base36_digits string from above, EndianFormat.Little, and with ending zero bytes acknowledged (even though the UTF8 bytes don't have any extra ending zero bytes)
To encode the UTF8 bytes of "A test 1234" 1,000,000 times takes 2.6567905 secs
To decode the same string the same number of times takes 3.3916248 secs
To encode the UTF8 bytes of "A test 1234. Made slightly larger!" 100,000 times takes 1.1577325 secs
To decode the same string the same number of times takes 1.244326 secs
Code
If you don't have a CodeContracts generator, you will have to reimplement the contracts with if/throw code.
using System;
using System.Collections.Generic;
using System.Numerics;
using Contract = System.Diagnostics.Contracts.Contract;

public enum EndianFormat
{
    /// <summary>Least Significant Bit order (lsb)</summary>
    /// <remarks>Right-to-Left</remarks>
    /// <see cref="BitConverter.IsLittleEndian"/>
    Little,
    /// <summary>Most Significant Bit order (msb)</summary>
    /// <remarks>Left-to-Right</remarks>
    Big,
};

/// <summary>Encodes/decodes bytes to/from a string</summary>
/// <remarks>
/// Encoded string is always in big-endian ordering
///
/// <p>Encode and Decode take an <b>includeProceedingZeros</b> parameter which acts as a work-around
/// for an edge case with our BigInteger implementation.
/// MSDN says BigInteger byte arrays are in LSB->MSB ordering. So a byte buffer with zeros at the
/// end will have those zeros ignored in the resulting encoded radix string.
/// If such a loss in precision absolutely cannot occur, pass true to <b>includeProceedingZeros</b>
/// and for a tiny bit of extra processing it will handle the padding of zero digits (encoding)
/// or bytes (decoding).</p>
/// <p>Note: doing this for decoding <b>may</b> add an extra byte more than what was originally
/// given to Encode.</p>
/// </remarks>
// Based on the answers from http://codereview.stackexchange.com/questions/14084/base-36-encoding-of-a-byte-array/
public class RadixEncoding
{
    const int kByteBitCount = 8;

    readonly string kDigits;
    readonly double kBitsPerDigit;
    readonly BigInteger kRadixBig;
    readonly EndianFormat kEndian;
    readonly bool kIncludeProceedingZeros;

    /// <summary>Numerical base of this encoding</summary>
    public int Radix { get { return kDigits.Length; } }
    /// <summary>Endian ordering of bytes input to Encode and output by Decode</summary>
    public EndianFormat Endian { get { return kEndian; } }
    /// <summary>True if we want ending zero bytes to be encoded</summary>
    public bool IncludeProceedingZeros { get { return kIncludeProceedingZeros; } }

    public override string ToString()
    {
        return string.Format("Base-{0} {1}", Radix.ToString(), kDigits);
    }

    /// <summary>Create a radix encoder using the given characters as the digits in the radix</summary>
    /// <param name="digits">Digits to use for the radix-encoded string</param>
    /// <param name="bytesEndian">Endian ordering of bytes input to Encode and output by Decode</param>
    /// <param name="includeProceedingZeros">True if we want ending zero bytes to be encoded</param>
    public RadixEncoding(string digits,
        EndianFormat bytesEndian = EndianFormat.Little, bool includeProceedingZeros = false)
    {
        Contract.Requires<ArgumentNullException>(digits != null);
        int radix = digits.Length;

        kDigits = digits;
        kBitsPerDigit = System.Math.Log(radix, 2);
        kRadixBig = new BigInteger(radix);
        kEndian = bytesEndian;
        kIncludeProceedingZeros = includeProceedingZeros;
    }

    // Number of characters needed for encoding the specified number of bytes
    int EncodingCharsCount(int bytesLength)
    {
        return (int)Math.Ceiling((bytesLength * kByteBitCount) / kBitsPerDigit);
    }
    // Number of bytes needed to decode the specified number of characters
    int DecodingBytesCount(int charsCount)
    {
        return (int)Math.Ceiling((charsCount * kBitsPerDigit) / kByteBitCount);
    }

    /// <summary>Encode a byte array into a radix-encoded string</summary>
    /// <param name="bytes">byte array to encode</param>
    /// <returns>The bytes encoded into a radix-encoded string</returns>
    /// <remarks>If <paramref name="bytes"/> is zero length, returns an empty string</remarks>
    public string Encode(byte[] bytes)
    {
        Contract.Requires<ArgumentNullException>(bytes != null);
        Contract.Ensures(Contract.Result<string>() != null);

        // Don't really have to do this, our code will build this result (empty string),
        // but why not catch the condition before doing work?
        if (bytes.Length == 0) return string.Empty;

        // if the array ends with zeros, having the capacity set to this will help us know how much
        // 'padding' we will need to add
        int result_length = EncodingCharsCount(bytes.Length);
        // List<> has a(n in-place) Reverse method. StringBuilder doesn't. That's why.
        var result = new List<char>(result_length);

        // HACK: BigInteger uses the last byte as the 'sign' byte. If that byte's MSB is set,
        // we need to pad the input with an extra 0 (ie, make it positive)
        if ((bytes[bytes.Length - 1] & 0x80) == 0x80)
            Array.Resize(ref bytes, bytes.Length + 1);

        var dividend = new BigInteger(bytes);
        // IsZero's computation is less complex than evaluating "dividend > 0"
        // which invokes BigInteger.CompareTo(BigInteger)
        while (!dividend.IsZero)
        {
            BigInteger remainder;
            dividend = BigInteger.DivRem(dividend, kRadixBig, out remainder);
            int digit_index = System.Math.Abs((int)remainder);
            result.Add(kDigits[digit_index]);
        }

        if (kIncludeProceedingZeros)
            for (int x = result.Count; x < result.Capacity; x++)
                result.Add(kDigits[0]); // pad with the character that represents 'zero'

        // orientate the characters in big-endian ordering
        if (kEndian == EndianFormat.Little)
            result.Reverse();
        // If we didn't end up adding padding, ToArray will end up returning a TrimExcess'd array,
        // so nothing wasted
        return new string(result.ToArray());
    }

    void DecodeImplPadResult(ref byte[] result, int padCount)
    {
        if (padCount > 0)
        {
            int new_length = result.Length + DecodingBytesCount(padCount);
            Array.Resize(ref result, new_length); // new bytes will be zero, just the way we want it
        }
    }

    #region Decode (Little Endian)
    byte[] DecodeImpl(string chars, int startIndex = 0)
    {
        var bi = new BigInteger();
        for (int x = startIndex; x < chars.Length; x++)
        {
            int i = kDigits.IndexOf(chars[x]);
            if (i < 0) return null; // invalid character
            bi *= kRadixBig;
            bi += i;
        }

        return bi.ToByteArray();
    }

    byte[] DecodeImplWithPadding(string chars)
    {
        int pad_count = 0;
        for (int x = 0; x < chars.Length; x++, pad_count++)
            if (chars[x] != kDigits[0]) break;

        var result = DecodeImpl(chars, pad_count);
        DecodeImplPadResult(ref result, pad_count);

        return result;
    }
    #endregion

    #region Decode (Big Endian)
    byte[] DecodeImplReversed(string chars, int startIndex = 0)
    {
        var bi = new BigInteger();
        for (int x = (chars.Length - 1) - startIndex; x >= 0; x--)
        {
            int i = kDigits.IndexOf(chars[x]);
            if (i < 0) return null; // invalid character
            bi *= kRadixBig;
            bi += i;
        }

        return bi.ToByteArray();
    }

    byte[] DecodeImplReversedWithPadding(string chars)
    {
        int pad_count = 0;
        for (int x = chars.Length - 1; x >= 0; x--, pad_count++)
            if (chars[x] != kDigits[0]) break;

        var result = DecodeImplReversed(chars, pad_count);
        DecodeImplPadResult(ref result, pad_count);

        return result;
    }
    #endregion

    /// <summary>Decode a radix-encoded string into a byte array</summary>
    /// <param name="radixChars">radix string</param>
    /// <returns>The decoded bytes, or null if an invalid character is encountered</returns>
    /// <remarks>
    /// If <paramref name="radixChars"/> is an empty string, returns a zero length array
    ///
    /// Using <paramref name="IncludeProceedingZeros"/> has the potential to return a buffer with an
    /// additional zero byte that wasn't in the input. So if a 4-byte buffer was encoded, this could
    /// end up returning a 5-byte buffer, with the extra byte being zero.
    /// </remarks>
    public byte[] Decode(string radixChars)
    {
        Contract.Requires<ArgumentNullException>(radixChars != null);

        if (kEndian == EndianFormat.Big)
            return kIncludeProceedingZeros ? DecodeImplReversedWithPadding(radixChars) : DecodeImplReversed(radixChars);
        else
            return kIncludeProceedingZeros ? DecodeImplWithPadding(radixChars) : DecodeImpl(radixChars);
    }
};
Basic Unit Tests
using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;

static bool ArraysCompareN<T>(T[] input, T[] output)
    where T : IEquatable<T>
{
    if (output.Length < input.Length) return false;
    for (int x = 0; x < input.Length; x++)
        if (!output[x].Equals(input[x])) return false;
    return true;
}
static bool RadixEncodingTest(RadixEncoding encoding, byte[] bytes)
{
    string encoded = encoding.Encode(bytes);
    byte[] decoded = encoding.Decode(encoded);
    return ArraysCompareN(bytes, decoded);
}
[TestMethod]
public void TestRadixEncoding()
{
    const string k_base36_digits = "0123456789abcdefghijklmnopqrstuvwxyz";
    var base36 = new RadixEncoding(k_base36_digits, EndianFormat.Little, true);
    var base36_no_zeros = new RadixEncoding(k_base36_digits, EndianFormat.Little, false);

    byte[] ends_with_zero_neg = { 0xFF, 0xFF, 0x00, 0x00 };
    byte[] ends_with_zero_pos = { 0xFF, 0x7F, 0x00, 0x00 };
    byte[] text = System.Text.Encoding.ASCII.GetBytes("A test 1234");

    Assert.IsTrue(RadixEncodingTest(base36, ends_with_zero_neg));
    Assert.IsTrue(RadixEncodingTest(base36, ends_with_zero_pos));
    Assert.IsTrue(RadixEncodingTest(base36_no_zeros, text));
}
Interestingly, I was able to port Kornman's techniques across to Java and got the expected output up to and including base 36. Whereas when running his code from C# using C:\Windows\Microsoft.NET\Framework\v4.0.30319 csc, the output was not as expected.
For example, trying to base16-encode the MD5 hashBytes obtained for the string "hello world" below using Kornman's RadixEncoding Encode, I could see that the two hex characters making up each byte were in the wrong order.
Rather than 5eb63bbbe01eeed093cb22bb8f5acdc3
I saw something like e56bb3bb0ee1....
This was on Windows 7.
const string input = "hello world";

public static void Main(string[] args)
{
    using (System.Security.Cryptography.MD5 md5 = System.Security.Cryptography.MD5.Create())
    {
        byte[] inputBytes = System.Text.Encoding.ASCII.GetBytes(input);
        byte[] hashBytes = md5.ComputeHash(inputBytes);

        // Convert the byte array to a hexadecimal string
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < hashBytes.Length; i++)
        {
            sb.Append(hashBytes[i].ToString("X2"));
        }
        Console.WriteLine(sb.ToString());
    }
}
Java code is below for anyone interested. As mentioned above, it only works up to base 36.
private static final char[] BASE16_CHARS = "0123456789abcdef".toCharArray();
private static final BigInteger BIGINT_16 = BigInteger.valueOf(16);

private static final char[] BASE36_CHARS = "0123456789abcdefghijklmnopqrstuvwxyz".toCharArray();
private static final BigInteger BIGINT_36 = BigInteger.valueOf(36);

public static String toBaseX(byte[] bytes, BigInteger base, char[] chars)
{
    if (bytes == null) {
        return null;
    }
    final int bitsPerByte = 8;
    double bitsPerDigit = Math.log(chars.length) / Math.log(2);

    // Number of chars to encode specified bytes
    int size = (int) Math.ceil((bytes.length * bitsPerByte) / bitsPerDigit);

    StringBuilder sb = new StringBuilder(size);
    for (BigInteger value = new BigInteger(bytes); !value.equals(BigInteger.ZERO);) {
        BigInteger[] quotientAndRemainder = value.divideAndRemainder(base);
        sb.insert(0, chars[Math.abs(quotientAndRemainder[1].intValue())]);
        value = quotientAndRemainder[0];
    }
    return sb.toString();
}

How to convert UTF-8 byte[] to string

I have a byte[] array that is loaded from a file that I happen to known contains UTF-8.
In some debugging code, I need to convert it to a string. Is there a one-liner that will do this?
Under the covers it should be just an allocation and a memcopy, so even if it is not implemented, it should be possible.
string result = System.Text.Encoding.UTF8.GetString(byteArray);
There are at least four different ways of doing this conversion.
1. Encoding's GetString, but you won't be able to get the original bytes back if those bytes contain non-ASCII characters.
2. BitConverter.ToString: the output is a "-"-delimited string, but there's no .NET built-in method to convert the string back to a byte array.
3. Convert.ToBase64String: you can easily convert the output string back to a byte array using Convert.FromBase64String. Note: the output string can contain '+', '/' and '='. If you want to use the string in a URL, you need to explicitly encode it.
4. HttpServerUtility.UrlTokenEncode: you can easily convert the output string back to a byte array using HttpServerUtility.UrlTokenDecode. The output string is already URL friendly! The downside is that it needs the System.Web assembly if your project is not a web project.
A full example:
byte[] bytes = { 130, 200, 234, 23 }; // A byte array containing non-ASCII (non-printable) values

string s1 = Encoding.UTF8.GetString(bytes); // ���
byte[] decBytes1 = Encoding.UTF8.GetBytes(s1); // decBytes1.Length == 10 !!
// decBytes1 is not the same as bytes
// Using UTF-8 or another Encoding object will give similar results

string s2 = BitConverter.ToString(bytes); // 82-C8-EA-17
String[] tempAry = s2.Split('-');
byte[] decBytes2 = new byte[tempAry.Length];
for (int i = 0; i < tempAry.Length; i++)
    decBytes2[i] = Convert.ToByte(tempAry[i], 16);
// decBytes2 is the same as bytes

string s3 = Convert.ToBase64String(bytes); // gsjqFw==
byte[] decByte3 = Convert.FromBase64String(s3);
// decByte3 is the same as bytes

string s4 = HttpServerUtility.UrlTokenEncode(bytes); // gsjqFw2
byte[] decBytes4 = HttpServerUtility.UrlTokenDecode(s4);
// decBytes4 is the same as bytes
A general solution for converting from a byte array to a string when you don't know the encoding (StreamReader detects a byte-order mark and otherwise falls back to UTF-8):
static string BytesToStringConverted(byte[] bytes)
{
    using (var stream = new MemoryStream(bytes))
    {
        using (var streamReader = new StreamReader(stream))
        {
            return streamReader.ReadToEnd();
        }
    }
}
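A short usage sketch (the helper name is the one defined above; the explicit-encoding variant is just an alternative for when you do know the encoding, and assumes System.IO/System.Text imports):
byte[] data = Encoding.UTF8.GetBytes("héllo");
string detected = BytesToStringConverted(data); // BOM sniffing, UTF-8 fallback

// When the encoding is known up front, pass it explicitly instead of relying on detection:
using (var stream = new MemoryStream(data))
using (var reader = new StreamReader(stream, Encoding.UTF8))
{
    string explicitResult = reader.ReadToEnd();
}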
Definition:
public static string ConvertByteToString(this byte[] source)
{
    return source != null ? System.Text.Encoding.UTF8.GetString(source) : null;
}
Using:
string result = input.ConvertByteToString();
Converting a byte[] to a string seems simple, but any kind of encoding is likely to mess up the output string. This little function avoids decoding altogether by mapping each byte directly to a char code (effectively Latin-1):
private string ToString(byte[] bytes)
{
    string response = string.Empty;

    foreach (byte b in bytes)
        response += (Char)b;

    return response;
}
I saw some answers in this post, and they could be considered complete background knowledge, because I have several approaches in C# to solve the same problem. The only thing that needs to be considered is the difference between pure UTF-8 and UTF-8 with a BOM.
Last week, at my job, I needed to develop functionality that outputs CSV files with a BOM and other CSV files in pure UTF-8 (without a BOM). Each CSV file encoding type would be consumed by a different, non-standardized API: one API reads UTF-8 with a BOM and the other reads it without a BOM. I researched the references for this concept, reading the "What's the difference between UTF-8 and UTF-8 without BOM?" Stack Overflow question and the Wikipedia article "Byte order mark", to build my approach.
Finally, my C# code for both UTF-8 encoding types (with BOM and pure) needed to be similar to the example below:
// For UTF-8 with BOM, equals shared by Zanoni (at top)
string result = System.Text.Encoding.UTF8.GetString(byteArray);
//for Pure UTF-8 (without B.O.M.)
string result = (new UTF8Encoding(false)).GetString(byteArray);
Using b.ToString("x2") for each byte outputs the array as a hex string, e.g. b4b5dfe475e58b67:
public static class Ext {

    public static string ToHexString(this byte[] hex)
    {
        if (hex == null) return null;
        if (hex.Length == 0) return string.Empty;

        var s = new StringBuilder();
        foreach (byte b in hex) {
            s.Append(b.ToString("x2"));
        }
        return s.ToString();
    }

    public static byte[] ToHexBytes(this string hex)
    {
        if (hex == null) return null;
        if (hex.Length == 0) return new byte[0];

        int l = hex.Length / 2;
        var b = new byte[l];
        for (int i = 0; i < l; ++i) {
            b[i] = Convert.ToByte(hex.Substring(i * 2, 2), 16);
        }
        return b;
    }

    public static bool EqualsTo(this byte[] bytes, byte[] bytesToCompare)
    {
        if (bytes == null && bytesToCompare == null) return true; // ?
        if (bytes == null || bytesToCompare == null) return false;
        if (object.ReferenceEquals(bytes, bytesToCompare)) return true;
        if (bytes.Length != bytesToCompare.Length) return false;

        for (int i = 0; i < bytes.Length; ++i) {
            if (bytes[i] != bytesToCompare[i]) return false;
        }
        return true;
    }
}
There is also the UnicodeEncoding class, which is quite simple to use:
UnicodeEncoding ByteConverter = new UnicodeEncoding();
string stringDataForEncoding = "My Secret Data!";
byte[] dataEncoded = ByteConverter.GetBytes(stringDataForEncoding);
Console.WriteLine("Data after decoding: {0}", ByteConverter.GetString(dataEncoded));
In addition to the selected answer, if you're using .NET 3.5 or .NET 3.5 CE, you have to specify the index of the first byte to decode, and the number of bytes to decode:
string result = System.Text.Encoding.UTF8.GetString(byteArray, 0, byteArray.Length);
Alternatively:
var byteStr = Convert.ToBase64String(bytes);
The BitConverter class can be used to convert a byte[] to a string.
var convertedString = BitConverter.ToString(byteArray);
Documentation for the BitConverter class can be found on MSDN.
To my knowledge, none of the given answers guarantees correct behavior with null termination. Until someone shows me differently, I wrote my own static class for handling this, with the following methods:
// Mimics the functionality of strlen() in C/C++
// Needed because neither StringBuilder nor Encoding.*.GetString() handle \0 well
static int StringLength(byte[] buffer, int startIndex = 0)
{
    int strlen = 0;
    while
    (
        (startIndex + strlen + 1) < buffer.Length // Make sure incrementing won't break any bounds
        && buffer[startIndex + strlen] != 0       // The typical null-termination check
    )
    {
        ++strlen;
    }
    return strlen;
}

// This is messy, but I haven't found a built-in way in C# that guarantees null termination
public static string ParseBytes(byte[] buffer, out int strlen, int startIndex = 0)
{
    strlen = StringLength(buffer, startIndex);
    byte[] c_str = new byte[strlen];
    Array.Copy(buffer, startIndex, c_str, 0, strlen);
    return Encoding.UTF8.GetString(c_str);
}
The reason for startIndex is that in the example I was working on, I specifically needed to parse a byte[] as an array of null-terminated strings. It can be safely ignored in the simple case.
A LINQ one-liner for converting a byte array byteArrFilename read from a file to a pure ASCII C-style zero-terminated string would be this; it's handy for reading things like file index tables in old archive formats.
String filename = new String(byteArrFilename.TakeWhile(x => x != 0)
                                            .Select(x => x < 128 ? (Char)x : '?')
                                            .ToArray());
I use '?' as the default character for anything not pure ASCII here, but that can be changed, of course. If you want to be sure you can detect it, just use '\0' instead, since the TakeWhile at the start ensures that a string built this way cannot possibly contain '\0' values from the input source.
Try this console application:
static void Main(string[] args)
{
    //Encoding _UTF8 = Encoding.UTF8;
    string[] _mainString = { "Hello, World!" };
    Console.WriteLine("Main String: " + _mainString[0]);

    // Convert a string to UTF-8 bytes.
    byte[] _utf8Bytes = Encoding.UTF8.GetBytes(_mainString[0]);

    // Convert UTF-8 bytes back to a string.
    string _stringUnicode = Encoding.UTF8.GetString(_utf8Bytes);
    Console.WriteLine("String Unicode: " + _stringUnicode);
}
Here is a solution where you don't have to bother with encoding. I used it in my network class to send binary objects as strings.
public static byte[] String2ByteArray(string str)
{
    char[] chars = str.ToArray();
    byte[] bytes = new byte[chars.Length * 2];

    for (int i = 0; i < chars.Length; i++)
        Array.Copy(BitConverter.GetBytes(chars[i]), 0, bytes, i * 2, 2);

    return bytes;
}

public static string ByteArray2String(byte[] bytes)
{
    char[] chars = new char[bytes.Length / 2];

    for (int i = 0; i < chars.Length; i++)
        chars[i] = BitConverter.ToChar(bytes, i * 2);

    return new string(chars);
}
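Worth noting (an observation, not part of the original answer): on a little-endian machine, BitConverter emits each char as a little-endian UTF-16 code unit, so this pair behaves like Encoding.Unicode (UTF-16LE). A quick check:
string s = "Hello 世界";
byte[] manual = String2ByteArray(s);               // helper from above
byte[] viaEncoding = Encoding.Unicode.GetBytes(s); // UTF-16LE

Console.WriteLine(manual.Length == viaEncoding.Length); // True
Console.WriteLine(ByteArray2String(viaEncoding) == s);  // True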
string result = ASCIIEncoding.UTF8.GetString(byteArray);
(ASCIIEncoding.UTF8 is just the static Encoding.UTF8 property accessed through a derived type, so this is equivalent to Encoding.UTF8.GetString(byteArray).)
