C# Get CheckSum XOR Error in hexadecimal conversion [duplicate] - c#

This question already has answers here:
Convert string "0x32" into a single byte
(3 answers)
Closed 5 months ago.
Sorry for the rudimentary question.
I am writing code to calculate a checksum as follows
public void GetCheckSum()
{
    //test data1
    //The answer is 61
    //var hexInput = "43-33-30-30-20-20-20-20-20-20-20-32-03";
    //test data2
    //The answer is 42
    var hexInput = "54-33-30-30-20-20-20-20-38-2E-30-03";
    var splitByte = hexInput.Split("-").Select(hex => Convert.ToByte(hex));
    byte[] byteData = splitByte.ToArray();
    byte checkSumTest = new byte();
    for (int i = 0; i < byteData.Length; i++)
    {
        checkSumTest = (byte)(checkSumTest ^ byteData[i]);
    }
    Console.WriteLine(checkSumTest);
}
For test data1, I get the desired value of 61.
However, test data2 gives an error with the byte conversion of "2E".
ERROR : Input string was not in a correct format
How should I handle hexadecimal numbers like "2E"?

You should be using Convert.ToByte(hex, 16) to indicate that the input represents a number in base 16.
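A minimal sketch of the corrected method using the question's own test data (the helper signature and class name are mine; the question's method returned void):

```csharp
using System;
using System.Linq;

class CheckSumDemo
{
    static byte GetCheckSum(string hexInput)
    {
        // Parse each "-"-separated token as a base-16 byte; without the 16,
        // Convert.ToByte assumes base 10 and fails on tokens like "2E".
        byte[] byteData = hexInput.Split('-')
                                  .Select(hex => Convert.ToByte(hex, 16))
                                  .ToArray();
        byte checkSum = 0;
        foreach (byte b in byteData)
            checkSum = (byte)(checkSum ^ b);
        return checkSum;
    }

    static void Main()
    {
        // Test data 2 from the question.
        var result = GetCheckSum("54-33-30-30-20-20-20-20-38-2E-30-03");
        Console.WriteLine(result.ToString("X2")); // prints "42"
    }
}
```

With the base specified, test data 1 yields 0x61 and test data 2 yields 0x42, matching the expected values in the question.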

Related

How to convert From Hex To Dump in C#

I'm converting my hex dump to get special characters (symbols), but when I try to convert "0x18" I get the control character "\u0018". Can anyone give me a solution to this?
Here is my code:
public static string FromHexDump(string sText)
{
    Int32 lIdx;
    string prValue = "";
    for (lIdx = 1; lIdx < sText.Length; lIdx += 2)
    {
        // VB's Mid is 1-based; the C# equivalent of Mid(sText, lIdx, 2) is sText.Substring(lIdx - 1, 2)
        string prString = "0x" + sText.Substring(lIdx - 1, 2);
        string prUniCode = Convert.ToChar(Convert.ToInt64(prString, 16)).ToString();
        prValue = prValue + prUniCode;
    }
    return prValue;
}
I'm using VB. My database stores an already-encrypted password value such as BAA37D40186D, so I loop through it in steps of 2 to get 0xBA,0xA3,0x7D,0x40,0x18,0x6D, and the VB result comes out like º£}#m.
You can use this code:
char myHex = '\x0633';
string formattedString = string.Format(@"\x{0:x4}", (int)myHex);
Or you can use this code from MSDN (https://learn.microsoft.com/en-us/dotnet/csharp/programming-guide/types/how-to-convert-between-hexadecimal-strings-and-numeric-types):
string hexValues = "48 65 6C 6C 6F 20 57 6F 72 6C 64 21";
string[] hexValuesSplit = hexValues.Split(' ');
foreach (string hex in hexValuesSplit)
{
// Convert the number expressed in base-16 to an integer.
int value = Convert.ToInt32(hex, 16);
// Get the character corresponding to the integral value.
string stringValue = Char.ConvertFromUtf32(value);
char charValue = (char)value;
Console.WriteLine("hexadecimal value = {0}, int value = {1}, char value = {2} or {3}",
hex, value, stringValue, charValue);
}
The question is unclear - what is the database column's type? Does it contain 6 bytes, or 12 characters with the hex encoding of the bytes? In any case, this has nothing to do with special characters or encodings.
First, 0x18 is the byte value of the Cancel character in the Latin-1 codepage, not the pound sign (the pound sign is 0xA3). It seems that the byte values in the question are just the Latin-1 bytes of the string, in hex.
.NET strings are Unicode (specifically UTF-16LE). There's no UTF-8 string or Latin-1 string. Encodings and codepages apply when converting bytes to strings or vice versa, which is done using the Encoding class, e.g. Encoding.GetBytes.
In this case, this code will convert the bytes to the expected string form, including the unprintable character:
var dbBytes = new byte[] {0xBA, 0xA3, 0x7D, 0x40, 0x18, 0x6D};
var latinEncoding=Encoding.GetEncoding(1252);
var result=latinEncoding.GetString(dbBytes);
The result is :
º£}#m
With the Cancel character between # and m.
If the database column contains the byte values as strings:
- it takes double the required space, and
- the hex values have to be converted back to bytes before converting to strings.
The "x" format specifier converts numbers or bytes to their hex form. For each byte value, ToString("x2") returns a two-digit hex string (the "2" pads single-digit values with a leading zero).
The hex string can be produced from the original buffer with:
var dbBytes = new byte[] {0xBA, 0xA3, 0x7D, 0x40, 0x18, 0x6D};
var hexString = String.Join("", dbBytes.Select(c => c.ToString("x2")));
There are many questions that show how to parse a hex string into a byte array. I'll just steal Jared Parsons's LINQ answer:
public static byte[] StringToByteArray(string hex) {
return Enumerable.Range(0, hex.Length)
.Where(x => x % 2 == 0)
.Select(x => Convert.ToByte(hex.Substring(x, 2), 16))
.ToArray();
}
With that, we can parse the hex string into a byte array and convert it to the original string :
var bytes=StringToByteArray(hexString);
var latinEncoding=Encoding.GetEncoding(1252);
var result=latinEncoding.GetString(bytes);
First of all, you don't need a dump but Unicode; I would recommend reading about Unicode, encodings, etc. and why they are tricky with strings.
PS: solution : StackOverflow

Does C# have a way of casting a double array to a string similar to the C++ cast to a char*?

I have inherited C++ code that casts a double array to a char* as shown below. In C#, I have not been able to generate a string from an array of doubles that matches the string generated by the C++ cast. In C#, is there someway to generate a string from a double array that would match the string created in C++ where a simple cast to char* is done? The result of the C++ cast appears to be some kind of binary data.
I want to replace the C++ code that creates the string with C# code that will generate the same string and store it in a database memo field. I want to keep the C++ code that retrieves the string from the database memo field and converts it to a double array for use in calculations.
C++ code that casts the double array Darray to a char *:
char* s = (char*)Darray;
I have tried several things in C# that didn't create the desired string including (obvious compile error):
string s = (string) Darray;
C# code that didn't create identical string to C++ code:
int length = Darray.Length * sizeof(double);
IntPtr pnt = Marshal.AllocHGlobal(length );
Marshal.Copy(Darray, 0, pnt, Darray.Length);
byte[] Barray = new byte[length];
Marshal.Copy(pnt, Barray, 0, length);
string theString = BitConverter.ToString(Barray);
C# code that also didn't create identical string to C++ code:
BinaryFormatter formatter = new BinaryFormatter();
using (MemoryStream m = new MemoryStream())
{
formatter.Serialize(m, Darray);
m.Position = 0;
StreamReader sr = new StreamReader(m);
string theString = sr.ReadToEnd();
}
C# code that also didn't create identical string to C++ code:
byte[] theBytesData = new byte[numBytesReqd];
Buffer.BlockCopy(Darray, 0, theBytesData, 0, numBytesReqd);
string theString = Encoding.ASCII.GetString(theBytesData, 0, theBytesData.Length);
Maybe there is no solution to this problem other than a mixed language program.
For the following C++ code:
double Darray[] = { 1.0, 2.0, 3.0 };
char* DarrayCp = (char*)Darray;
int blockSize = sizeof(Darray); // 24 bytes for three 8-byte doubles
for (int i = 0; i < blockSize; i++)
{
cout << i << "\tDarrayCp: " << DarrayCp[i] << endl;
}
I get the following output which I'd like to reproduce with C# code:
0 DarrayCp:
1 DarrayCp:
2 DarrayCp:
3 DarrayCp:
4 DarrayCp:
5 DarrayCp:
6 DarrayCp: ð
7 DarrayCp: ?
8 DarrayCp:
9 DarrayCp:
10 DarrayCp:
11 DarrayCp:
12 DarrayCp:
13 DarrayCp:
14 DarrayCp:
15 DarrayCp: #
16 DarrayCp:
17 DarrayCp:
18 DarrayCp:
19 DarrayCp:
20 DarrayCp:
21 DarrayCp:
22 DarrayCp:
23 DarrayCp: #
Since you're just reinterpreting the raw bytes in the C++ code, the output characters you're seeing depend on the encoding used by the console or whatever other output method you're using to test the C++ code.
It looks like ISO-8859-1 gives you the same output as the sample you posted:
var Darray = new double[] { 1.0, 2.0, 3.0 };
var bytes = new byte[Darray.Length * sizeof(double)];
Buffer.BlockCopy(Darray, 0, bytes, 0, bytes.Length);
var str = Encoding.GetEncoding("ISO-8859-1").GetString(bytes);
but it's unclear to me if this conversion is actually useful for whatever you're trying to accomplish. string and char in C# use UTF-16 characters, so the resulting str has a completely different byte representation; bytes already represents the same data as DarrayCp in your C++ source, without any conversion.
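Since this approach leans on ISO-8859-1 mapping every byte value 0x00-0xFF one-to-one onto code points U+0000-U+00FF, a quick round-trip check (a sketch I'm adding, not from the original answer) can confirm the conversion is lossless:

```csharp
using System;
using System.Linq;
using System.Text;

class RoundTripDemo
{
    static void Main()
    {
        var Darray = new double[] { 1.0, 2.0, 3.0 };
        var bytes = new byte[Darray.Length * sizeof(double)];
        Buffer.BlockCopy(Darray, 0, bytes, 0, bytes.Length);

        var latin1 = Encoding.GetEncoding("ISO-8859-1");
        var str = latin1.GetString(bytes);
        var roundTripped = latin1.GetBytes(str);

        // ISO-8859-1 is a bijection over single bytes, so the round trip
        // should recover the original raw bytes exactly.
        Console.WriteLine(bytes.SequenceEqual(roundTripped)); // prints "True"
    }
}
```

Note that this guarantee is specific to ISO-8859-1; encodings such as ASCII or UTF-8 would mangle byte values above 0x7F.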

Convert Python to C# [duplicate]

This question already has answers here:
Understanding slicing
(38 answers)
Closed 4 years ago.
Hi, I'm trying to convert some Python code to C# and I'm stuck understanding this line of code:
n = int(e[2:10], 16)
e is a string looking like this:
0100000180a6fa85de8dd3381cc277b046d7e3856307519d03da4e3ff5dca52de833c56951ab3e539a161df98454be311fd242407b25bf7b8e84c322f06f913d712393922bd1477d2cf3a9d2ba14bb00f8b2d7a203376afed0e1782e49ea55d43cee8e3bb8331f3f8aa81955bae8fcd118f640b4cd49d787bd8a12d57f424b371d07f08de67ab8f40bf5894288920adfe9480cfbec7deef073c3f137d71dff9d4ab967d9178648961cd2def00d376cf01dca6a4c6428243cef23eeab9791f5cd7d66f5293879b7ed83abf600f78426491c57c8a61e
n = int(e[2:10], 16) takes the characters at indices 2 through 9 of e and interprets them as a hexadecimal integer.
That is, for your input,
>>> e = '0100000180a6fa85de8dd3...'
>>> f = e[2:10]
>>> f
'00000180'
>>> int(f, 16)
384
so you should be able to do the same with something like Convert.ToInt32(e.Substring(2, 8), 16) in C#.
First, [2:10] slices the string (the characters at indices 2 through 9). Then int(..., 16) converts that hexadecimal string to a (decimal) int.
Which will result in n = 384.
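For comparison, a minimal C# sketch of the same slice-and-parse (the shortened input string and class name are mine):

```csharp
using System;

class SliceDemo
{
    static void Main()
    {
        // Only the first ten characters of the long input matter here.
        string e = "0100000180a6fa85";

        // Python's e[2:10] is the 8 characters at indices 2 through 9,
        // which in C# is e.Substring(2, 8) -> "00000180".
        int n = Convert.ToInt32(e.Substring(2, 8), 16);
        Console.WriteLine(n); // prints "384"
    }
}
```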

Convert hexadecimal string to its numerical values in C# [duplicate]

This question already has answers here:
Convert integer to hexadecimal and back again
(11 answers)
Closed 9 years ago.
I have a text box on my form. I want to write "0x31" as a string into my textbox, and then when I click a button, I want to convert this string to the hexadecimal value 0x31.
How can I convert this string to a hexadecimal value?
int i = Convert.ToInt32("0x31", 16);
Console.WriteLine("0x" + i.ToString("X2"));
string hexValues = "48 65 6C 6C 6F 20 57 6F 72 6C 64 21";
string[] hexValuesSplit = hexValues.Split(' ');
foreach (String hex in hexValuesSplit)
{
// Convert the number expressed in base-16 to an integer.
int value = Convert.ToInt32(hex, 16);
// Get the character corresponding to the integral value.
string stringValue = Char.ConvertFromUtf32(value);
char charValue = (char)value;
Console.WriteLine("hexadecimal value = {0}, int value = {1}, char value = {2} or {3}",
hex, value, stringValue, charValue);
}
Example From: http://msdn.microsoft.com/en-us/library/bb311038.aspx
Hexadecimal is just a representation of a value, it is not a value itself.
This page will tell you everything you need to know about parsing and displaying hex in C#
http://msdn.microsoft.com/en-us/library/bb311038.aspx
First, to clear up: the string is in hexadecimal format; once you convert it, it's just a numeric value, not "hexadecimal".
Use the Int32.Parse method with the NumberStyles.HexNumber specifier:
string input = "0x31";
int n;
if (input.StartsWith("0x")) {
n = Int32.Parse(input.Substring(2), NumberStyles.HexNumber);
} else {
n = Int32.Parse(input);
}
The string hex value is a representation of a value. The actual string value can be converted to whatever you like (float, int, etc.).
There are several ways to do the conversion. Simple example:
// convert to int from base 16
int value = Convert.ToInt32(hex, 16);
Note that hex is just a representation of a value, so what you are really asking is how to parse the value from the string. Note that NumberStyles.HexNumber does not accept a "0x" prefix, so strip it first:
int val = int.Parse("31", NumberStyles.HexNumber);
val now contains an int with the hex value 0x31.

How can we convert binary number into its octal number using c#?

Hey, I was working on an application which converts a number in any base (2, 8, 10, 16, etc.) to the user's desired base system. I am having a problem converting a binary number to its octal equivalent; can anyone help me out?
I tried everything, like:
// I take a binary number in convertnumber and then convert it to base 8
Int32 value = int.Parse(convertnumber);
Console.WriteLine(Convert.ToString(value, 8));
For example:
value = 10011
The answer should be "23", but using the above code I am getting "23433".
"23433" is the correct answer when converting "10011" in base 10 to base 8.
You may have meant to interpret "10011" as a binary number, in which case you want:
int value = Convert.ToInt32(convertnumber, 2);
Edit: in response to comments, here's almost-complete code:
string val = "10011";
int convertnumber = Convert.ToInt32(val, 2);
Console.WriteLine(Convert.ToString(convertnumber, 8)); // prints "23"
string binary = "10011";
int integer = Convert.ToInt32(binary, 2);
Console.WriteLine(Convert.ToString(integer, 8));
Output: 23
In this example we convert the binary string representation to an integer and from an integer to the octal string representation.
int value = Convert.ToInt32(convertnumber, 2);
Console.WriteLine(Convert.ToString(value, 8));
You are taking the base-10 number 10011 and converting it to base 8, which is 23433.
If you want to do this manually (so you understand what is going on) here is a suggestion:
First, pad the binary string so its length is divisible by 3 (3 bits = 1 octal digit); skip the padding when the length already divides evenly:
string binary = "10011";
int pad = binary.Length % 3;
if (pad > 0)
    binary = new string('0', 3 - pad) + binary;
Then process each group of three bits into one octal digit:
int n = binary.Length / 3;
char[] bin_digits = binary.ToCharArray();
char[] oct_digits = new char[n];
for (int i = 0; i < n; i++)
{
int digit = bin_digits.Skip(3 * i).Take(3).Aggregate(0,
(x, v) => (int)v - (int)'0' + 2 * x);
// x is the value accumulation
// v is a char '0' or '1' representing a bit and is converted to int 0, 1
oct_digits[i] = (char)(digit + (int)'0');
// convert int to char digit
}
Convert the digits array into a string
string oct_value = new string(oct_digits);
Example results:
"10011" -> "23"
"11000" -> "30"
"1011011" -> "133"
Naturally, int.Parse parses a decimal number. If your input is binary, then you'll need to first do a conversion from binary to integer.
Int32 value = Convert.ToInt32( "10011", 2 );
Console.WriteLine(Convert.ToString(value, 8));
That's because int.Parse is converting 10011 to, well, 10011 in decimal. It is not converting it from 10011 binary to 23 octal (19 decimal) as you want it to.
