What does {1:X} mean in C#?

I don't quite get what {1:X} is in this piece of code:
ushort secretKey = 0x0088; // The ciphering key.
char character = 'A'; // The symbol to be ciphered.
Console.WriteLine("Initial symbol: {0}, its code in the symbols' table: {1:X}", character, (byte)character);
I mean, I realize that {0:X} means a lowercase hexadecimal, but does that mean that {1:X} is a decimal? Thanks for explaining.

It is known as composite formatting. The 1 refers to the second argument (indexes are zero-based), and X is the format specifier for uppercase hexadecimal; a lowercase x would produce lowercase hex digits. The specifier means the same thing regardless of which argument index it is attached to. Here is the list of standard numeric format strings.
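For example, a quick illustration of how the argument index and the format specifier combine:
char character = 'A';
// {0} prints the first argument as-is; {1:X} prints the second as uppercase hex.
Console.WriteLine("Symbol: {0}, code: {1:X}", character, (byte)character);
// Output: Symbol: A, code: 41
// {1:x} would print lowercase hex instead: 'z' (122) formats as 7a rather than 7A.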

Related

C# byte array calculation with BigInteger not working properly

So, I need to do calculations on byte arrays in my program, and I noticed a weird thing:
string aaa = "F8F9FAFBFCFD";
string aaaah = "10101010101";
BigInteger dsa = BigInteger.Parse(aaa, NumberStyles.HexNumber) + BigInteger.Parse(aaaah, NumberStyles.HexNumber);
MessageBox.Show(dsa.ToString("X"));
When I add aaa + aaaah, it displays 9FAFBFCFDFE, but it should display F9FAFBFCFDFE. When I subtract (aaa - aaaah), it correctly displays F7F8F9FAFBFC. Everything in my code looks right to me.
BigInteger.Parse interprets "F8F9FAFBFCFD" as the negative number -7,722,435,347,203 (using two's complement) and not 273,752,541,363,453 as you were probably expecting.
From the documentation for BigInteger.Parse:
If value is a hexadecimal string, the Parse(String, NumberStyles) method interprets value as a negative number stored by using two's complement representation if its first two hexadecimal digits are greater than or equal to 0x80. In other words, the method interprets the highest-order bit of the first byte in value as the sign bit.
To get the result you are expecting, prefix aaa with a 0 to force it to be interpreted as a positive value:
string aaa = "0F8F9FAFBFCFD";
string aaaah = "10101010101";
BigInteger dsa = BigInteger.Parse(aaa, NumberStyles.HexNumber)
+ BigInteger.Parse(aaaah, NumberStyles.HexNumber);
MessageBox.Show(dsa.ToString("X")); // outputs 0F9FAFBFCFDFE
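To see the two's complement rule in isolation, here is a minimal illustration of the documented behavior:
using System.Globalization;
using System.Numerics;
// The first hex digit of "F8" is >= 8, so Parse treats the value as negative.
BigInteger negative = BigInteger.Parse("F8", NumberStyles.HexNumber); // -8
// A leading 0 clears the sign bit and forces a positive interpretation.
BigInteger positive = BigInteger.Parse("0F8", NumberStyles.HexNumber); // 248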

Char array is returning incorrect values

I have a char array, chars[], with the values {'#', '$', '1'}. I want to take the '1' out and place it into another variable, val, but when I do, it gives me 49 (I don't know why). I tried debugging it, and the debugger shows the elements of chars as follows:
char[0] = 35 '#'
char[1] = 36 '$'
char[2] = 49 '1'
Which in turn makes
int val = chars[2];
become
val = 49
I'm not sure why this is, but it's throwing my plans off. Does anyone know what the problem is and what I can do to fix it?
You should use
char val = chars[2];
With int, you are getting the character's numeric (ASCII) value as an integer.
See also http://hu.wikipedia.org/wiki/ASCII
49 is the ASCII representation for the char '1'
Just go for: charArray[x].ToString();
This gives you the character as a string rather than its numeric value.
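Putting the answers together, a minimal sketch of the three options:
char[] chars = { '#', '$', '1' };
int code = chars[2]; // 49: the character's ASCII/Unicode value
char val = chars[2]; // '1': the character itself
int digit = chars[2] - '0'; // 1: the numeric value of the digit character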

Problems parsing through string in C#

I am trying to parse the first three characters of a string.
public List<string> sortModes(List<string> allModesNonSorted)
{
    foreach (string s in allModesNonSorted)
    {
        char firstNumber = s[0];
        char secondNumber = s[1];
        char thirdNumber = s[2];
        char.IsDigit(firstNumber);
        char.IsDigit(secondNumber);
        char.IsDigit(thirdNumber);
        combinedNumbers = Convert.ToInt16(firstNumber) + Convert.ToInt16(secondNumber) + Convert.ToInt16(thirdNumber);
    }
    return allModesNonSorted;
}
It recognizes each character correctly, but shows an extra value of 53 or 55 alongside it. When I add the numbers, the 53 and 55 are included in the sum. Why is it doing this?
53 is the Unicode value of '5', and 55 is the Unicode value of '7'. It's showing you both the numeric and character versions of the data.
You'll notice with secondNumber you see the binary value 0 and the character value '\0' as well.
If you want to interpret a string as an integer, you can use
int myInteger = int.Parse(myString);
Specifically if you know you always have the format
input = "999 Hz Bla bla"
you can do something like:
int firstSeparator = input.IndexOf(' ');
string frequency = input.Substring(0, firstSeparator); // the digits before the first space
int numericFrequency = int.Parse(frequency);
That will work no matter how many digits are in the frequency, as long as the digits are followed by a space character.
53 is the ASCII value for the character '5'.
55 is the ASCII value for the character '7'.
This is just Visual Studio showing you extra detail about the actual values.
You can proceed with your code.
Because you're treating them as characters: the character '5' has code 53 in the ASCII table.
The simplest solution is to subtract the character '0' from each of them; that gives you the numeric value of a single digit character (see the sketch after these answers).
53 and 55 are the ASCII values of the '5' and '7' characters (the way the characters are stored in memory).
If you need to convert them to integers, take a look at this SO post.
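Tying the answers together, here is a minimal sketch of the character arithmetic done correctly. The method name is hypothetical, and it assumes (as the question implies) that each string starts with three digit characters and that the goal is the three-digit number, not the sum of code units:
public static int LeadingNumber(string s)
{
    // char.IsDigit returns a bool; the original code discarded the result.
    if (s.Length < 3 || !char.IsDigit(s[0]) || !char.IsDigit(s[1]) || !char.IsDigit(s[2]))
        throw new FormatException("String does not start with three digits.");
    // Subtracting '0' turns each digit character into its numeric value,
    // so "999 Hz ..." yields 999 instead of 57 + 57 + 57 = 171.
    return (s[0] - '0') * 100 + (s[1] - '0') * 10 + (s[2] - '0');
}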

Conversion of a Char variable to an integer

Why is the integer equivalent of '8' 56 in C#? I want to convert it to the integer 8, not any other number.
You'll need to subtract the offset from '0'.
int zero = (int)'0'; // 48
int eight = (int)'8'; // 56
int value = eight - zero; // 8
56 is the Unicode value for the character '8'. Use:
Int32.Parse(myChar.ToString());
Or this:
char myChar = '8';
int value = Convert.ToInt32(myChar.ToString()); // Convert.ToInt32(myChar) itself would return 56, the code point
The right way to convert Unicode digit characters in C#/.NET is to use the corresponding Char methods, IsDigit and GetNumericValue (http://msdn.microsoft.com/en-us/library/e7k33ktz.aspx).
If you are absolutely sure that there will be no non-ASCII digits in your input, then ChaosPandion's suggestion is fine too.
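A minimal side-by-side of the approaches mentioned above:
char myChar = '8';
int viaOffset = myChar - '0'; // 8: works for ASCII digits '0'..'9'
int viaParse = int.Parse(myChar.ToString()); // 8
int viaNumeric = (int)char.GetNumericValue(myChar); // 8: works for any Unicode digit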

Guid to Base34 encoder/decoder

Does anyone have a nice code snippet for a Guid to Base34 encoder/decoder? I've googled for one previously and never really found any good sources.
This Number base conversion class in C# could fairly easily be extended to do base 34 (or other bases, if you think people will confuse S and 5, b and 6, i and j, B and 8, 9 and g, or whatever).
Here's a simplified version (in Ruby). It takes a string, calculates the MD5 hash, extracts the first four bytes as an unsigned long (effectively mapping the string to a 4-byte number), converts that to base 36, and then swaps out the "oh" and "zero" chars for "X" and "Y", leaving a 34-character alphabet. Finally, it ensures the resulting string is exactly six chars, padding with "Z" chars if needed.
require 'digest/md5'
# create an easy-to-read 6-digit unique idno
idno = original # starting string
idno = Digest::MD5.digest(idno).unpack("N").first # digest as unsigned long
idno = idno.to_s(36).upcase.tr("0O","XY") # convert to base34 (no "oh" or "zero")
idno = idno[0,6].ljust(6,"Z") # final 6-digit unique idno (pad with "Z" chars)
The key methods here are Guid.ToByteArray and the Guid(byte[]) constructor.
Encode:
string encodedGuid = Convert.ToBase64String(guid.ToByteArray());
Decode:
Guid guid = new Guid(Convert.FromBase64String(encodedGuid));
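For completeness, here is a rough sketch of what an actual base-34 Guid encoder/decoder might look like, using BigInteger for the base conversion. The 34-character alphabet shown (digits plus uppercase letters, minus I and O) is an assumption; substitute whichever alphabet you prefer:
using System;
using System.Numerics;
using System.Text;

// Hypothetical alphabet: 0-9 and A-Z without I and O (34 characters).
const string Alphabet = "0123456789ABCDEFGHJKLMNPQRSTUVWXYZ";

static string Encode(Guid guid)
{
    // Append a zero byte so BigInteger interprets the 16 bytes as positive.
    byte[] bytes = guid.ToByteArray();
    Array.Resize(ref bytes, bytes.Length + 1);
    BigInteger value = new BigInteger(bytes);
    var sb = new StringBuilder();
    while (value > 0)
    {
        sb.Insert(0, Alphabet[(int)(value % 34)]);
        value /= 34;
    }
    return sb.Length > 0 ? sb.ToString() : "0";
}

static Guid Decode(string encoded)
{
    BigInteger value = 0;
    foreach (char c in encoded)
        value = value * 34 + Alphabet.IndexOf(c);
    byte[] bytes = value.ToByteArray();
    Array.Resize(ref bytes, 16); // a Guid needs exactly 16 bytes
    return new Guid(bytes);
}
Round-tripping works because Guid.ToByteArray and the Guid(byte[]) constructor use the same 16-byte layout.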
