Conversion of a char variable to an integer - C#

Why is the integer equivalent of '8' 56 in C#? I want to convert it to the integer 8 and not any other number.

You'll need to subtract the offset from '0':
int zero = (int)'0'; // 48
int eight = (int)'8'; // 56
int value = eight - zero; // 8

56 is the Unicode code point for the character '8'. To get the numeric value 8, use:
Int32.Parse(myChar.ToString());
or:
char myChar = '8';
int value = Convert.ToInt32(myChar.ToString());
(Note that Convert.ToInt32(myChar) without the ToString() would give you the character code 56, not 8.)

The right way to convert Unicode characters in C#/.NET is to use the corresponding Char methods, IsDigit and GetNumericValue (http://msdn.microsoft.com/en-us/library/e7k33ktz.aspx).
If you are absolutely sure that there will be no non-ASCII digits in your input, then ChaosPandion's suggestion is fine too.
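For example, a minimal sketch of that approach (the variable names are illustrative):
char myChar = '8';
if (char.IsDigit(myChar))
{
    int value = (int)char.GetNumericValue(myChar); // 8
}
char.GetNumericValue returns a double (it can also handle characters like '½'), so the cast truncates it to an int; unlike subtracting '0', it works for digit characters from any Unicode script, not just ASCII '0'-'9'.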

Related

Strange FormatException when converting to octal

So I have an integer number, 208. I don't expect many to understand why I am doing this, but the end result of what I am trying to do is get the base-10 representation of the octal number 208 (two-zero-eight). I expect the confusing thing (for people who will try to answer this question) is that while 208 is an integer, I am using it more like a string containing the characters two, zero, and eight. Please let me know if there are any more questions on this, as I think it will cause some confusion.
Anyway, to get the base-10 representation of "208" here is what I do:
Convert int 208 into string "208".
Take the string "208", and parse from octal to decimal.
Then, here is the corresponding source code:
public byte OctalToDecimal(int octalDigits)
{
    byte decimalValue = 0;
    string octalString = string.Empty;

    // first, get a string representation of the integer number
    octalString = octalDigits.ToString();

    // now, get the decimal value of the octal string
    decimalValue = Convert.ToByte(octalString, 8);

    // return the decimal value
    return decimalValue;
}
I get a FormatException when octalDigits = 208, with a message about there being additional characters in octalString's value. Why would that be? All I do is convert from int to string; it's very short and simple, and it's not like I append anything. What is going on?
You should know that the digits of octal numbers are in the range 0 to 7.
Here are some helpful links:
the octal representations of bytes range from 000 to 377?
http://www.asciitable.com/
Octal numbers cannot contain the digit 8, just as a base-10 representation can't contain a "digit" 10 and binary can't contain a digit 2.
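To illustrate (a minimal sketch; the input values are just examples):
string valid = "207";                  // every digit is in the range 0-7
byte ok = Convert.ToByte(valid, 8);    // 2*64 + 0*8 + 7 = 135
string invalid = "208";                // '8' is not an octal digit
byte bad = Convert.ToByte(invalid, 8); // throws FormatException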

Why are ASCII values of a byte different when cast as Int32?

I'm in the process of creating a program that will scrub extended ASCII characters from text documents. I'm trying to understand how C# interprets the different character sets and codes, and I'm noticing some oddities.
Consider:
using System;
using System.Text;

namespace ASCIITest
{
    class Program
    {
        static void Main(string[] args)
        {
            string value = "Slide™1½”C4®";
            byte[] asciiValue = Encoding.ASCII.GetBytes(value); // byte array
            char[] array = value.ToCharArray();                 // char array

            Console.WriteLine("CHAR\tBYTE\tINT32");
            for (int i = 0; i < array.Length; i++)
            {
                char letter = array[i];
                byte byteValue = asciiValue[i];
                Int32 int32Value = array[i];

                Console.WriteLine("{0}\t{1}\t{2}", letter, byteValue, int32Value);
            }
            Console.ReadLine();
        }
    }
}
Output from the program:
CHAR    BYTE    INT32
S       83      83
l       108     108
i       105     105
d       100     100
e       101     101
™       63      8482   <- trademark symbol
1       49      49
½       63      189    <- fraction
”       63      8221   <- smart quote
C       67      67
4       52      52
®       63      174    <- registered trademark symbol
In particular, I'm trying to understand why the extended ASCII characters (the ones with my notes to the right of the third column) show up with the correct value when cast as Int32, but all show up as 63 when converted to bytes. What's going on here?
ASCII.GetBytes replaces every character outside the ASCII range (0-127) with a question mark (code 63).
So, since your string contains characters outside that range, your asciiValue has ? in place of all the interesting symbols like ™, whose char (Unicode) representation is 8482 - indeed outside the 0-127 range.
Converting the string to a char array does not modify the values of the characters, so you still have the original Unicode code units (char is essentially a 16-bit integer); casting one to the wider type Int32 does not change the value.
Below are possible conversions of that character into a byte/integers:
var value = "™";
var ascii = Encoding.ASCII.GetBytes(value)[0]; // 63 ('?') - outside the 0-127 range
var castToByte = (byte)(value[0]);             // 34 = 8482 % 256
var asInt16 = (Int16)value[0];                 // 8482
var asInt32 = (Int32)value[0];                 // 8482
Details are available in the documentation for the ASCIIEncoding class:
ASCIIEncoding corresponds to the Windows code page 20127. Because ASCII is a 7-bit encoding, ASCII characters are limited to the lowest 128 Unicode characters, from U+0000 to U+007F. If you use the default encoder returned by the Encoding.ASCII property or the ASCIIEncoding constructor, characters outside that range are replaced with a question mark (?) before the encoding operation is performed.
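Incidentally, if the goal is to scrub the non-ASCII characters out rather than replace them with ?, a simple filter on the character code works (a minimal sketch, assuming outright removal is acceptable):
using System.Linq;

string input = "Slide™1½”C4®";
string asciiOnly = new string(input.Where(c => c <= 127).ToArray()); // "Slide1C4"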

Char array is returning incorrect values

I have a char array, chars[], with the values {'#', '$', '1'} contained in it. I want to remove the 1 and place it into another variable, val, but when I do, it gives me 49 (I don't know why). I tried debugging it, and the info shows that the elements of chars are as follows:
char[0] = 35 '#'
char[1] = 36 '$'
char[2] = 49 '1'
Which in turn makes
int val = chars[2];
become
val = 49
I'm not sure why this is, but it's throwing my plans off. Does anyone know what the problem is and what I can do to fix it?
You should use
char val = chars[2];
With int, you are getting the character's ASCII code as an integer.
see also http://hu.wikipedia.org/wiki/ASCII
49 is the ASCII code for the character '1' (look it up in any ASCII table).
Just go for: charArray[x].ToString();
This gives you the character as a string ("1") instead of its numeric code.
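If you want the numeric value 1 out of the character '1', subtracting '0' works as well (a minimal sketch):
char[] chars = { '#', '$', '1' };
char c = chars[2];   // '1'
int code = c;        // 49, the character code
int digit = c - '0'; // 1, the numeric value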

Problems parsing through string in C#

I am trying to parse through the first three characters of a string.
public List<string> sortModes(List<string> allModesNonSorted)
{
    foreach (string s in allModesNonSorted)
    {
        char firstNumber = s[0];
        char secondNumber = s[1];
        char thirdNumber = s[2];
        char.IsDigit(firstNumber);
        char.IsDigit(secondNumber);
        char.IsDigit(thirdNumber);
        combinedNumbers = Convert.ToInt16(firstNumber) + Convert.ToInt16(secondNumber) + Convert.ToInt16(thirdNumber);
    }
    return allModesNonSorted;
}
It recognizes each character correctly, but adds an extra value, 53 or 55. When I add the numbers below, the 53 and 55 are included. Why is it doing this?
53 is the Unicode value of '5', and 55 is the Unicode value of '7'. It's showing you both the numeric and character versions of the data.
You'll notice with secondNumber you see the binary value 0 and the character value '\0' as well.
If you want to interpret a string as an integer, you can use
int myInteger = int.Parse(myString);
Specifically if you know you always have the format
input = "999 Hz Bla bla"
you can do something like:
int firstSeparator = input.IndexOf(' ');
string frequency = input.Substring(0, firstSeparator); // take only the digits before the first space
int numericFrequency = int.Parse(frequency);
That will work no matter how many digits are in the frequency as long as the digits are followed by a space character.
53 is the ASCII value of the character '5'.
55 is the ASCII value of the character '7'.
This is just Visual Studio showing you extra details about the actual values.
You can proceed with your code.
Because you're treating them as characters: the character '5' has code 53 in ASCII.
The simplest solution is to subtract the character '0' from each of them; that gives you the numeric value of a single digit character.
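For example, to combine the first three digit characters into a single number (a minimal sketch, assuming the string always starts with three digits):
string s = "999 Hz Bla bla";
int combinedNumbers = (s[0] - '0') * 100
                    + (s[1] - '0') * 10
                    + (s[2] - '0'); // 999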
53 and 55 are the ASCII values of the '5' and '7' characters (the way the characters are stored in memory).
If you need to convert them to integers, take a look at this SO post.

adding 48 to string number

I have a string in C# like this:
string only_number;
I assigned it the value "40".
When I check only_number[0], I get 52.
When I check only_number[1], I get 48.
Why is it adding 48 to the character at the current position? Please advise.
A string is basically a char[], so what you are seeing are the ASCII values of the characters '4' and '0'.
Proof: the difference between 4 and 0 equals the difference between 52 and 48.
Since it is a string, you didn't assign it 40; you assigned it "40".
What you see is the ASCII code of '4' and '0'.
It's not adding 48 to the character. What you see is the character code, and the characters for digits start at 48 in Unicode:
'0' = 48
'1' = 49
'2' = 50
'3' = 51
'4' = 52
'5' = 53
'6' = 54
'7' = 55
'8' = 56
'9' = 57
A string is a sequence of char values, and each char value is a 16-bit integer representing a UTF-16 code unit.
When you read only_number[0], you get a char value that is '4', and the character code for that is 52. So what you have done is read a character from the string and then convert it to an integer before displaying it.
So:
char c = only_number[0];
Console.WriteLine(c); // displays 4
int n = (int)only_number[0]; // cast to integer
Console.WriteLine(n); // displays 52
int m = only_number[0]; // the cast is not needed, but the value is cast anyway
Console.WriteLine(m); // displays 52
You are indexing into the string, and it is outputting the ASCII character codes for each of your two characters, '4' and '0' - please see here:
http://www.theasciicode.com.ar/ascii-control-characters/null-character-ascii-code-0.html
A string is an array of chars; that's why you received these results. It basically displays the ASCII codes of '4' and '0'.
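If what you actually want is the number 40 rather than the individual character codes, parse the whole string (a minimal sketch):
string only_number = "40";
int n = int.Parse(only_number);        // 40
int firstDigit = only_number[0] - '0'; // 4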
