Floating point to hex in C#

Googling around, I found there is not much information "to the point" on how to convert numbers to hexadecimal floating point, single precision. There are four clear steps: 1. Convert the integer part to binary. 2. Add the radix point and convert the fractional part to binary. 3. Put the result in scientific notation (normalize it). 4. Pack the result into the IEEE-754 32-bit format. That gives you binary; then turn it into hexadecimal. Since all of this is a bummer to do by hand, I wrote the code below hoping it would solve it for me ;-) Greetings.
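As a sanity check of those steps: 4.5 = 100.1 in binary = 1.001 × 2², so the sign bit is 0, the biased exponent is 2 + 127 = 129 = 10000001, and the mantissa is 00100000000000000000000, which packs to 0x40900000. A quick sketch (not part of the manual algorithm) that checks this against the runtime's own bits:

```csharp
using System;

// ask the runtime for the raw IEEE-754 bits of 4.5f (BitConverter.SingleToInt32Bits, .NET Core 2.0+)
int bits = BitConverter.SingleToInt32Bits(4.5f);
Console.WriteLine(bits.ToString("X8")); // 40900000
```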
private String Float2Hex(String value) {
    // NOTE: this relies on the current culture using ',' as the decimal
    // separator (while the input uses '.'); it also only handles values >= 1.
    bool negative = value.StartsWith("-");
    if (negative) value = value.Substring(1);
    String[] aux = value.Split('.');
    // integer part in binary
    String number = Convert.ToString(int.Parse(aux[0]), 2);
    int exp = number.Length - 1;          // position of the leading 1
    String mantissa = number.Substring(1); // drop the implicit leading 1
    // fractional part: multiply by 2, take the integer digit each round
    while ((aux.Length > 1) && (mantissa.Length < 23)) {
        Double div = Double.Parse("0," + aux[1]) * 2;
        aux = div.ToString().Split(',');
        mantissa += aux[0];
    }
    while (mantissa.Length < 23) // single precision = 23 bits
        mantissa += "0";
    // biased exponent, padded to its full 8 bits
    String exponent = Convert.ToString(exp + 127, 2).PadLeft(8, '0');
    number = (negative ? "1" : "0") + exponent + mantissa;
    return Bin2Hex(number);
}
I use the Bin2Hex function from another answer: Binary to Hexadecimal

Another example:
float val = float.Parse(StringValue);
byte[] b = BitConverter.GetBytes(val);
StringBuilder sb = new StringBuilder();
foreach (byte by in b)
    sb.Append(by.ToString("X2"));
return sb.ToString();
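One caveat with this approach: BitConverter.GetBytes returns the bytes in the machine's byte order, which on most platforms is little-endian, so the hex string comes out byte-reversed ("00009040" for 4.5f instead of the conventional "40900000"). A sketch that normalizes to big-endian first:

```csharp
using System;
using System.Linq;

byte[] b = BitConverter.GetBytes(4.5f);
// reverse on little-endian machines so the hex reads most significant byte first
if (BitConverter.IsLittleEndian) Array.Reverse(b);
string hex = string.Concat(b.Select(x => x.ToString("X2")));
Console.WriteLine(hex); // 40900000 on little- and big-endian machines alike
```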

Could you not use a stream to write the float value?
string a = Float2Hex(4.5f);
The function
public string Float2Hex(float fNum)
{
    MemoryStream ms = new MemoryStream(sizeof(float));
    // BinaryWriter writes the raw bytes of the float; a StreamWriter
    // would write its text representation instead
    BinaryWriter bw = new BinaryWriter(ms);
    bw.Write(fNum);
    bw.Flush();
    // Re-read the stream
    ms.Seek(0, SeekOrigin.Begin);
    byte[] buffer = new byte[4];
    ms.Read(buffer, 0, 4);
    // Convert the buffer to hex
    StringBuilder sb = new StringBuilder();
    foreach (byte b in buffer)
        sb.AppendFormat("{0:X2}", b);
    bw.Close();
    return sb.ToString();
}

Related

Conversion of Hexadecimal to text [duplicate]

I need to check for a string located inside a packet that I receive as a byte array. If I use BitConverter.ToString(), I get the bytes as a string with dashes (e.g.: 00-50-25-40-A5-FF).
I tried most of the functions I found after a quick googling, but most of them take a string as the input parameter, and if I call them with the dashed string, they throw an exception.
I need a function that turns hex (as string or as bytes) into the string the hexadecimal values represent (e.g.: 0x31 = "1"). If the input parameter is a string, the function should recognize dashes (example: "47-61-74-65-77-61-79-53-65-72-76-65-72"), because BitConverter doesn't convert that correctly.
Like so?
static void Main()
{
    byte[] data = FromHex("47-61-74-65-77-61-79-53-65-72-76-65-72");
    string s = Encoding.ASCII.GetString(data); // GatewayServer
}

public static byte[] FromHex(string hex)
{
    hex = hex.Replace("-", "");
    byte[] raw = new byte[hex.Length / 2];
    for (int i = 0; i < raw.Length; i++)
    {
        raw[i] = Convert.ToByte(hex.Substring(i * 2, 2), 16);
    }
    return raw;
}
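On .NET 5 and newer, Convert.FromHexString does the parsing for you; you still have to strip the dashes yourself. A sketch of the same GatewayServer example:

```csharp
using System;
using System.Text;

byte[] data = Convert.FromHexString("47-61-74-65-77-61-79-53-65-72-76-65-72".Replace("-", ""));
string s = Encoding.ASCII.GetString(data);
Console.WriteLine(s); // GatewayServer
```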
For Unicode support:
public class HexadecimalEncoding
{
    public static string ToHexString(string str)
    {
        var sb = new StringBuilder();
        var bytes = Encoding.Unicode.GetBytes(str);
        foreach (var t in bytes)
        {
            sb.Append(t.ToString("X2"));
        }
        // Encoding.Unicode is UTF-16LE, so every ASCII character becomes two
        // bytes: "480065006C006C006F00200077006F0072006C006400" for "Hello world"
        return sb.ToString();
    }

    public static string FromHexString(string hexString)
    {
        var bytes = new byte[hexString.Length / 2];
        for (var i = 0; i < bytes.Length; i++)
        {
            bytes[i] = Convert.ToByte(hexString.Substring(i * 2, 2), 16);
        }
        return Encoding.Unicode.GetString(bytes); // "Hello world" for the string above
    }
}
string str = "47-61-74-65-77-61-79-53-65-72-76-65-72";
string[] parts = str.Split('-');
foreach (string val in parts)
{
    int x;
    // parse each part as hex, then emit the character it stands for
    if (int.TryParse(val, NumberStyles.HexNumber, CultureInfo.InvariantCulture, out x))
    {
        Console.Write((char)x);
    }
}
Console.WriteLine(); // prints: GatewayServer
You can split the string at the -
Convert each part to an int (int.TryParse with NumberStyles.HexNumber)
Output each int as the character it represents ((char)x)
string hexString = "8E2";
int num = Int32.Parse(hexString, System.Globalization.NumberStyles.HexNumber);
Console.WriteLine(num);
//Output: 2274
From https://msdn.microsoft.com/en-us/library/bb311038.aspx
Your reference to "0x31 = 1" makes me think you're actually trying to convert ASCII values to strings - in which case you should be using something like Encoding.ASCII.GetString(Byte[])
If you need the result as a byte array, you should pass it along directly, without changing it to a string and then back to bytes.
In your example, (e.g.: 0x31 = "1") these are ASCII codes. In that case, to convert a string of hex values to ASCII text, use:
Encoding.ASCII.GetString(byte[])
byte[] data = new byte[] { 0x31, 0x32, 0x33, 0x34, 0x35, 0x36, 0x37, 0x38, 0x39, 0x30 };
string ascii = Encoding.ASCII.GetString(data);
Console.WriteLine(ascii);
The console will display: 1234567890
My .NET 5 solution, which also trims NULL characters at the end:
hex = ConvertFromHex(hex.AsSpan(), Encoding.Default);

static string ConvertFromHex(ReadOnlySpan<char> hexString, Encoding encoding)
{
    // scan backwards for the last non-NULL byte to find the real length
    int realLength = 0;
    for (int i = hexString.Length - 2; i >= 0; i -= 2)
    {
        byte b = byte.Parse(hexString.Slice(i, 2), NumberStyles.HexNumber, CultureInfo.InvariantCulture);
        if (b != 0) // not a NULL character
        {
            realLength = i + 2;
            break;
        }
    }
    var bytes = new byte[realLength / 2];
    for (var i = 0; i < bytes.Length; i++)
    {
        bytes[i] = byte.Parse(hexString.Slice(i * 2, 2), NumberStyles.HexNumber, CultureInfo.InvariantCulture);
    }
    return encoding.GetString(bytes);
}
One-liners:
var input = "Hallo Hélène and Mr. Hörst";
var ConvertStringToHexString = (string input) => String.Join("", Encoding.UTF8.GetBytes(input).Select(b => $"{b:X2}"));
var ConvertHexToString = (string hexInput) => Encoding.UTF8.GetString(Enumerable.Range(0, hexInput.Length / 2).Select(_ => Convert.ToByte(hexInput.Substring(_ * 2, 2), 16)).ToArray());
Assert.AreEqual(input, ConvertHexToString(ConvertStringToHexString(input)));

How to return a byte from a byte array with the Full Hex Value?

So I am creating a program that reads a byte array and returns the value
byte[] buffer1 = File.ReadAllBytes(path).Skip(startPos).Take(lengthToExtract).ToArray();
byte[] reversed = buffer1.Reverse().ToArray();
string buffer2 = "";
foreach (var i in reversed)
{
    buffer2 = buffer2 + i.ToString("X") + " ";
}
MessageBox.Show(buffer2);
int size = int.Parse(buffer2.Replace(" ", string.Empty), System.Globalization.NumberStyles.HexNumber);
return size;
But the message box shows the hex string with the "unimportant" zeros taken out, which messes up my value. For example, if the byte read is 0x00 it is shown as just 0, and since I am reading the resulting bytes backwards, 0x04080 (0 4 0 80) is different from 0x00040080 (00 04 00 80). I need help please, this messes up my entire program.
Here are two functions you can use. They solve both problems - the dropped digits and the byte order:
public static string HexStringFromArrayChangeEndian(byte[] data)
{
    StringBuilder sdata = new StringBuilder();
    for (int s = data.Length - 1; s >= 0; s--)
        sdata.Append(string.Format("{0:X}", data[s]).PadLeft(2, '0'));
    return sdata.ToString();
}

public static string HexStringFromArraySameEndian(byte[] data)
{
    StringBuilder sdata = new StringBuilder();
    for (int s = 0; s < data.Length; s++)
        sdata.Append(string.Format("{0:X}", data[s]).PadLeft(2, '0'));
    return sdata.ToString();
}
Use i.ToString("X2"). It forces two-digit output, "00" instead of the simplified "0".
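To see the difference the format string makes - "X" drops the leading zero of small bytes, while "X2" keeps it:

```csharp
using System;

byte b = 0x04;
Console.WriteLine(b.ToString("X"));  // 4
Console.WriteLine(b.ToString("X2")); // 04
```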

Simple Byte Encryption Not Working

When the message is decrypted, the characters come out one less than the original. Example: H becomes G.
I have tried to debug the code by printing out values, and all goes well until dividing by 100000 and multiplying by the date.
Here is the code I used (the Main method is not included):
public static string encrypt(string input)
{
    string final;
    string date = DateTime.Now.Date.ToShortDateString().ToString();
    var datetime = int.Parse(date.Replace("/", ""));
    List<int> semi = new List<int>();
    var bytes = Encoding.UTF8.GetBytes(input.ToCharArray());
    for (int i = 0; i < bytes.Length; i++)
    {
        int y = bytes[i] * datetime / 100000;
        semi.Add(y);
        Console.WriteLine(y);
    }
    Console.WriteLine(string.Join("", bytes));
    final = string.Join(":", semi.ToArray()) + ":" + date;
    return final;
}

public static string decrypt(string input)
{
    string final;
    string[] raw = input.Split(':');
    int date = int.Parse(raw[raw.Length - 1].Replace("/", ""));
    var dump = new List<string>(raw);
    dump.RemoveAt(raw.Length - 1);
    string[] stringbytes = dump.ToArray();
    List<byte> bytes = new List<byte>();
    for (int i = 0; i < stringbytes.Length; i++)
    {
        int x = int.Parse(stringbytes[i]);
        Console.WriteLine(x);
        x = x * 100000 / date;
        byte finalbytes = Convert.ToByte(x);
        bytes.Add(finalbytes);
    }
    Console.WriteLine(string.Join("", bytes.ToArray()));
    Console.WriteLine(date);
    var bytearray = bytes.ToArray();
    final = Encoding.UTF8.GetString(bytearray);
    return final;
}
It's likely a rounding error from integer division. When doing integer math it is very possible that ((x * date / 100000) * 100000 / date) != x; in fact it is only guaranteed to be == x when date % 100000 == 0.
Fix the rounding errors introduced by your int division and it should fix your problem.
P.S. I would also be very hesitant to call this "encryption": there is no secret key, and all the information required to decrypt the message is in the message itself. You are relying only on the algorithm being secret, which is practically impossible to achieve with C#. I would rather call what you are doing "encoding", because to decode something that is encoded all you need to know is the algorithm.
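To make the truncation concrete, here is a sketch with a made-up date value (962016 is hypothetical, chosen only for illustration) showing a byte that does not survive the round trip:

```csharp
using System;

int date = 962016;            // hypothetical "date" divisor
int x = 72;                   // the byte for 'H'
int y = x * date / 100000;    // 69265152 / 100000 -> 692, the .65 is truncated
int back = y * 100000 / date; // 69200000 / 962016 -> 71, i.e. 'G'
Console.WriteLine($"{x} -> {y} -> {back}");
```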
You are using the low-precision datatype int to store the result of the division. I changed the type to double and it works:
public static string encrypt(string input)
{
    string final;
    string date = DateTime.Now.Date.ToString("MMddyyyy");
    var datetime = int.Parse(date);
    List<double> semi = new List<double>();
    var bytes = Encoding.UTF8.GetBytes(input);
    for (int i = 0; i < bytes.Length; i++)
    {
        double y = bytes[i] * datetime / 100000;
        semi.Add(y);
        Console.WriteLine(y);
    }
    Console.WriteLine(string.Join("", bytes));
    final = string.Join(":", semi.ToArray()) + ":" + date;
    return final;
}

public static string decrypt(string input)
{
    string final;
    string[] raw = input.Split(':');
    int date = int.Parse(raw[raw.Length - 1].Replace("/", ""));
    var dump = new List<string>(raw);
    dump.RemoveAt(raw.Length - 1);
    string[] stringbytes = dump.ToArray();
    List<byte> bytes = new List<byte>();
    for (int i = 0; i < stringbytes.Length; i++)
    {
        var x = double.Parse(stringbytes[i]);
        Console.WriteLine(x);
        x = x * 100000 / date;
        byte finalbytes = Convert.ToByte(x);
        bytes.Add(finalbytes);
    }
    Console.WriteLine(string.Join("", bytes.ToArray()));
    Console.WriteLine(date);
    var bytearray = bytes.ToArray();
    final = Encoding.UTF8.GetString(bytearray);
    return final;
}
Here is a fully working console app http://ideone.com/Rjc13A
I believe this is a number truncation issue. In your decrypt method, the division mathematically produces a fractional result, and since x is an int the decimal places are cut off.
The following should work:
for (int i = 0; i < stringbytes.Length; i++)
{
    var x = double.Parse(stringbytes[i]);
    Console.WriteLine(x);
    x = Math.Round((x * 100000) / date, 0);
    byte finalbytes = Convert.ToByte(x);
    bytes.Add(finalbytes);
}
Also, as a side note why are you creating your own encryption algorithm? Could you not use one that already exists?
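For reference, a minimal sketch of real symmetric encryption with the framework's built-in AES. The EncryptCbc/DecryptCbc one-shot helpers used here exist on .NET 6+; on older versions you would use CreateEncryptor and a CryptoStream instead:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

using var aes = Aes.Create(); // generates a random Key and IV
byte[] plain = Encoding.UTF8.GetBytes("Hello");
byte[] cipher = aes.EncryptCbc(plain, aes.IV);
byte[] round = aes.DecryptCbc(cipher, aes.IV);
Console.WriteLine(Encoding.UTF8.GetString(round)); // Hello
```

Unlike the scheme above, decrypting requires knowing the Key and IV, not just the algorithm.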

String of bits to Unicode

I have a string of bits, like this: string str = "0111001101101000". It's the letters "sh".
I need to make Unicode letters out of it. I'm doing the following:
BitArray bn = new BitArray(str.Length); // creating new bitarray
for (int kat = 0; kat < str.Length; kat++)
{
    if (str[kat].ToString() == "0") // adding boolean values into the array
    {
        bn[kat] = false;
    }
    else
        bn[kat] = true;
}
byte[] bytes = new byte[bn.Length]; // converting to bytes
bn.CopyTo(bytes, 0);
string output = Encoding.Unicode.GetString(bytes); // encoding
textBox2.Text = output; // result in textbox
But the output text is just complete mess. How to do it right?
There are a couple of problems with your code.
First, BitArray reverses the bit order within each byte - it's easier to use Convert.ToByte.
Second, your input string contains two bytes (one per character), but you're decoding it with Encoding.Unicode, which is UTF-16 (two bytes per character); you need to use Encoding.UTF8.
Working code:
string str = "0111001101101000";
int numOfBytes = str.Length / 8;
byte[] bytes = new byte[numOfBytes];
for (int i = 0; i < numOfBytes; ++i)
{
    bytes[i] = Convert.ToByte(str.Substring(8 * i, 8), 2);
}
string output = Encoding.UTF8.GetString(bytes);
A) Your string is ASCII, not UNICODE: 8 bits per character
B) The most significant bit of every byte is on the left, so the strange math used in bn[...]
C) The commented part is useless because "false" is the default state of a BitArray
D) The length of the byte array was wrong. 8 bits == 1 byte! :-)
string str = "0111001101101000";
BitArray bn = new BitArray(str.Length); // creating new bitarray
for (int kat = 0; kat < str.Length; kat++)
{
    if (str[kat] == '0') // adding boolean values into the array
    {
        //bn[(kat / 8 * 8) + 7 - (kat % 8)] = false;
    }
    else
    {
        bn[(kat / 8 * 8) + 7 - (kat % 8)] = true;
    }
}
// 8 bits in a byte
byte[] bytes = new byte[bn.Length / 8]; // converting to bytes
bn.CopyTo(bytes, 0);
string output = Encoding.ASCII.GetString(bytes); // encoding
Probably better:
string str = "0111001101101000";
byte[] bytes = new byte[str.Length / 8];
for (int ix = 0, weight = 128, ix2 = 0; ix < str.Length; ix++)
{
    if (str[ix] == '1')
    {
        bytes[ix2] += (byte)weight;
    }
    weight /= 2;
    // every 8 bits we "reset" the weight and increment ix2
    if (weight == 0)
    {
        ix2++;
        weight = 128;
    }
}
string output = Encoding.ASCII.GetString(bytes); // encoding
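Going the other way (text back to a bit string) is the mirror image - a short sketch, assuming ASCII input:

```csharp
using System;
using System.Linq;
using System.Text;

string bits = string.Concat(Encoding.ASCII.GetBytes("sh")
    .Select(b => Convert.ToString(b, 2).PadLeft(8, '0')));
Console.WriteLine(bits); // 0111001101101000
```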

hex number of length 128 which is to be converted to binary in c#

private static string GetSHA512(string strPlain)
{
    UnicodeEncoding UE = new UnicodeEncoding();
    byte[] HashValue, MessageBytes = UE.GetBytes(strPlain);
    SHA512Managed SHhash = new SHA512Managed();
    string strHex = "";
    HashValue = SHhash.ComputeHash(MessageBytes);
    foreach (byte b in HashValue)
    {
        strHex += String.Format("{0:x2}", b);
    }
    int len = strHex.Length;
    // This strHex of 128 characters is to be converted to binary
    // (actually the 512-bit output in binary is required).
    return strHex;
}
Please see if anyone can help.
If you really want to convert the hex string representation of the hash to a binary string representation:
int len = strHex.Length;
StringBuilder sb = new StringBuilder("");
for (int i = 0; i < len; i++)
{
    sb.Append(Convert.ToString(Convert.ToByte(strHex.Substring(i, 1), 16), 2).PadLeft(4, '0'));
}
Assuming you're asking how to convert a byte array to a base-2 representation string:
byte b = 123;
string s = Convert.ToString(b, 2); // second argument is base
Console.WriteLine(s); // prints '1111011'
Now just walk through your byte array to create your string byte by byte.
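That walk over the array can be a one-liner; note the PadLeft(8, '0'), since Convert.ToString drops leading zero bits (123 gives "1111011", only 7 characters). A sketch with a short stand-in array instead of the real 64-byte SHA-512 hash:

```csharp
using System;
using System.Linq;

byte[] hash = { 0x7B, 0x00, 0xFF }; // stand-in for the real hash bytes
string binary = string.Concat(hash.Select(b => Convert.ToString(b, 2).PadLeft(8, '0')));
Console.WriteLine(binary); // 011110110000000011111111
```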
