How to get ASCII value of characters in C#

Below is my string in C#, which I am converting to a character array; I need to get the ASCII value of each character in the string.
static void Main(string[] args)
{
    string s = "Test";
    var arr = s.ToCharArray();
    foreach (var a in arr)
    {
        var n = Encoding.ASCII.GetByteCount(a.ToString());
        Console.WriteLine(a);
        Console.WriteLine(n);
    }
}
This outputs:
T
1
e
1
s
1
t
1
On googling I found a number of links, but none of them met my need.
How to get ASCII value of string in C#
https://www.codeproject.com/Questions/516802/ConvertingpluscharsplustoplusASCIIplusinplusC
I need to get the ASCII value of each character in the string.
Any help or suggestion is highly appreciated.

A string can be enumerated directly as an IEnumerable<char>, and each char can be cast to an integer to see its Unicode "value" (code point). UTF-16 maps the 128 ASCII characters (0-127) to the Unicode code points 0-127 (see for example https://en.wikipedia.org/wiki/Code_point), so you can print this number directly.
string s = "Test";
foreach (char a in s)
{
if (a > 127)
{
throw new Exception(string.Format(#"{0} (code \u{1:X04}) is not ASCII!", a, (int)a));
}
Console.WriteLine("{0}: {1}", a, (int)a);
}

GetByteCount returns the number of bytes used, so for each ASCII character it will be 1.
Try GetBytes instead:
static void Main(string[] args)
{
    string s = "Test";
    var n = ASCIIEncoding.ASCII.GetBytes(s);
    for (int i = 0; i < s.Length; i++)
    {
        Console.WriteLine($"Char {s[i]} - byte {n[i]}");
    }
}

Every ASCII character has a value between 0 and 127. Converting each char to an integer gives you its ASCII value.
static void Main(string[] args)
{
    string s = "Test";
    for (int i = 0; i < s.Length; i++)
    {
        // Convert each letter of the string to its ASCII value, one by one.
        int value = s[i];
        Console.WriteLine(value);
    }
}

You're asking for the byte count when you should be asking for the bytes themselves. Use Encoding.ASCII.GetBytes instead of Encoding.ASCII.GetByteCount. Like in this answer: https://stackoverflow.com/a/400777/3129333
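A minimal sketch of that approach (the printed values assume the question's "Test" string):
using System;
using System.Text;

class Program
{
    static void Main()
    {
        string s = "Test";
        // GetBytes returns the encoded bytes themselves, not just their count.
        byte[] bytes = Encoding.ASCII.GetBytes(s);
        foreach (byte b in bytes)
        {
            Console.WriteLine(b); // 84, 101, 115, 116
        }
    }
}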

Console.WriteLine(a);
Console.WriteLine(((int)a).ToString("X"));
You need to convert to int first and then format it as hex.
GetByteCount will return the count of bytes used, so for each character it will be 1.
You can read also: Need to convert string/char to ascii values

Related

turning a string with binary into binary in C#

Let's say I have a string that contains binary, like this:
string test = "01001010";
so I want to do something like this:
someFunc(test);
and this function would return exactly what the test variable says, but in byte form instead of string.
example:
using System;

class Program
{
    static void Main()
    {
        Console.WriteLine(Convert.ToChar(someFunc(Console.ReadLine())));
    }
}
This program prompts you to enter a byte using Console.ReadLine (which returns a string), turns it into a byte, then turns it into a char.
How could I do this?
You could write it in this way:
using System;

class Program
{
    static byte someFunc(string text)
    {
        byte t = 0;
        for (int i = 0; i < 8; i++)
            t = (byte)(t * 2 + (text[i] - '0'));
        return t;
    }

    static void Main()
    {
        Console.WriteLine(Convert.ToChar(someFunc(Console.ReadLine())));
    }
}
But before calling someFunc() it would be useful to check that the string is valid (for example, to show an error message if the input is "10102010").
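A minimal validation sketch, assuming the input must be exactly 8 characters, each '0' or '1' (the IsBinaryByte name is just for illustration):
static bool IsBinaryByte(string text)
{
    // Accept only strings of exactly 8 characters, each of which is '0' or '1'.
    if (text == null || text.Length != 8)
        return false;
    foreach (char c in text)
    {
        if (c != '0' && c != '1')
            return false;
    }
    return true;
}
Calling this before someFunc() lets you reject input such as "10102010" with a friendly message instead of silently computing a wrong byte.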
Use Convert.ToInt32(string, int) where string is the string you want to convert, and int is the base of the number system you want to convert from, in your case base 2 or binary. Or if you really desperately need it to be a byte, then you can use Convert.ToByte(string, int). Like so:
using System;

class Program
{
    public static void Main()
    {
        var input = Console.ReadLine(); // 01001010
        var number = Convert.ToInt32(input, 2);
        Console.WriteLine(number); // prints '74'
    }
}
Be warned that Convert.ToXyz() will throw a FormatException if the given input string contains characters that are illegal for the given base. For base 2 that would be any character that is not a 0 or 1. You might want to catch such exceptions, or check beforehand that every character in the input string is either '0' or '1'.
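If you prefer to catch the exception rather than validate up front, a sketch along these lines (variable names are illustrative):
using System;

class Program
{
    static void Main()
    {
        var input = Console.ReadLine();
        try
        {
            // Throws FormatException if input contains anything other than '0' or '1'.
            byte value = Convert.ToByte(input, 2);
            Console.WriteLine(Convert.ToChar(value));
        }
        catch (FormatException)
        {
            Console.WriteLine("Please enter only 0s and 1s.");
        }
        catch (OverflowException)
        {
            Console.WriteLine("The value does not fit in a byte.");
        }
    }
}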
Edited:
Take the characters one by one; for each one, add its value to the result byte and multiply by 2 to convert from binary to decimal (except for the last character, which should just be added).
Then return the byte as a char.
public static char someFunc(string bs)
{
    byte result = 0;
    for (int i = 0; i < bs.Length - 1; i++)
    {
        if (bs[i].Equals('1'))
        {
            result += 1;
        }
        result *= 2;
    }
    if (bs[bs.Length - 1].Equals('1'))
    {
        result++;
    }
    return (char)result;
}
returns J for "01001010"
Hi, this is an implementation I use in Java, but it will work for C# as well; you may need some minor syntax changes.
static int someFunc(String s) {
    int binary = 0x00;
    for (int i = 0; i < 8; i++) {
        if (s.charAt(i) == '1')
            binary = (binary << 1) | 0x01;
        else if (s.charAt(i) == '0')
            binary = (binary << 1) | 0x00;
    }
    return binary;
}
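For reference, a direct C# adaptation of that sketch; only the parameter type and the string indexing change:
static int someFunc(string s)
{
    int binary = 0x00;
    for (int i = 0; i < 8; i++)
    {
        // s.charAt(i) in Java becomes the s[i] indexer in C#.
        if (s[i] == '1')
            binary = (binary << 1) | 0x01;
        else if (s[i] == '0')
            binary = (binary << 1) | 0x00;
    }
    return binary;
}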

Getting the sum of all numbers in a character array

I have converted the string to a char[], but when I try to get the total of all the numbers in the array, I get the wrong output. The goal is that if the user enters a number as a string, e.g. 12, the output should be 3, i.e. 1 + 2; another example: 123 should give 1+2+3 = 6.
I am new to coding. My apologies for any inconvenience.
static void Main(string[] args)
{
    int sum = 0;
    String num = Console.ReadLine();
    char[] sep = num.ToCharArray();
    for (int i = 0; i < sep.Length; i++)
    {
        sum += (sep[i]);
    }
    Console.WriteLine(sum);
    Console.ReadLine();
}
You are currently adding ASCII values. The ASCII value of '1' is 49 and that of '2' is 50... You need to use int.TryParse to convert from char to int.
int value;
for (int i = 0; i < sep.Length; i++)
{
    if (int.TryParse(sep[i].ToString(), out value))
        sum += value;
}
If you want to calculate the sum of the digits, you need to convert each char to an int first. Each char needs to be converted to a string and then parsed into an int. Your original code contains an implicit conversion, which converts '1' and '2' into 49 and 50 (their ASCII values), so the sum ends up being 99.
Try this code instead:
static void Main(string[] args)
{
    int sum = 0;
    String num = Console.ReadLine();
    char[] sep = num.ToCharArray();
    for (int i = 0; i < sep.Length; i++)
    {
        sum += int.Parse(sep[i].ToString());
    }
    Console.WriteLine(sum);
    Console.ReadLine();
}
Just for fun, here is a LINQ solution:
var sum = num.Select(c => int.Parse(c.ToString())).Sum();
This solution takes advantage of the fact that a string is also an IEnumerable<char> and therefore can be treated as a list of characters.
The Select statement iterates over the characters and converts each one to an integer by supplying a lambda expression (that's the => stuff) that maps each character onto its integer equivalent. The symbol is typically pronounced "goes to". You might pronounce the whole expression "c goes to whatever integer can be parsed from it."
Then we call Sum() to convert the resulting list of integers into a numeric sum.
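If the input is guaranteed to contain only the digits '0'-'9', a character-arithmetic variant avoids parsing altogether; a small sketch under that assumption:
using System;
using System.Linq;

class Program
{
    static void Main()
    {
        string num = "123";
        // Subtracting '0' maps '0'..'9' to 0..9; this works only for ordinary decimal digits.
        int sum = num.Sum(c => c - '0');
        Console.WriteLine(sum); // 6
    }
}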

How to parse a binary string to a binary literal in C#

Here is my specific problem. I need to represent an integer (like 1,2,3,..) as a binary literal with exactly 128 bits.
This is my string representing 1 in binary:
string = "000...0001"; // 128 characters. all zeros until the last 1
Intended result:
bx000...0001;
The issue is that 128 bits is larger than normal types like int, double, decimal, etc. Thus, I believe you must use the BigInteger class to hold this binary value?
Another way to frame this: How can I make sure my BigInteger value is 16 bytes big?
BigInteger val = new BigInteger(1); // but must be 16 bytes exactly.
You would have to specify the number of bytes and pad whatever is missing with zeros; then you can use a BitArray to get the bit values. Something like this:
public static string GetBitString(BigInteger val, int bytes)
{
    byte[] arrayBytes = new byte[bytes];
    var valBytes = val.ToByteArray();
    for (var i = 0; i < valBytes.Length; i++)
    {
        arrayBytes[i] = valBytes[i];
    }
    var arr = new BitArray(arrayBytes);
    return $"bx{string.Join("", arr.Cast<bool>().Reverse().Select(c => c ? "1" : "0"))}";
}
Another option is to just resize the created array to be 16 bytes. Something like this:
public static string GetBitString(BigInteger val, int bytes)
{
    var valBytes = val.ToByteArray();
    Array.Resize(ref valBytes, bytes);
    return $"bx{string.Join("", new BitArray(valBytes).Cast<bool>().Reverse().Select(c => c ? "1" : "0"))}";
}
Using the ToBinaryString extension method from this answer modified to skip leading zeros and not force a sign zero, you can just use PadLeft to ensure you have leading zeroes:
public static string ToBinaryString(this BigInteger bigint) {
    var bytes = bigint.ToByteArray();
    // Create a StringBuilder having appropriate capacity.
    var base2 = new StringBuilder(bytes.Length * 8);
    // Skip leading zero bytes.
    var idx = bytes.Length - 1;
    for (; idx > 0 && bytes[idx] == 0; --idx)
        ;
    // Convert remaining bytes, adding leading zeros within each byte.
    for (; idx >= 0; --idx)
        base2.Append(Convert.ToString(bytes[idx], 2).PadLeft(8, '0'));
    return base2.ToString();
}
Then with the prefix and the left padding:
var ans = "bx"+val.ToBinaryString().PadLeft(128, '0');

Error converting string to int [duplicate]

I am working on a project in C# which requires using Arabic numerals, but they must then be stored as integers in the database. I need a solution to convert Arabic numerals into int in C#.
Any solution or help, please?
Thanks in advance.
From comments:
I have arabic numbers like ١،٢،٣،٤... and must convert to 1,2,3, or ٢٣٤ convert to 234
Updated: You can use StringBuilder for memory optimization.
private static string ToEnglishNumbers(string input)
{
    StringBuilder sbEnglishNumbers = new StringBuilder(string.Empty);
    for (int i = 0; i < input.Length; i++)
    {
        if (char.IsDigit(input[i]))
        {
            sbEnglishNumbers.Append(char.GetNumericValue(input, i));
        }
        else
        {
            sbEnglishNumbers.Append(input[i].ToString());
        }
    }
    return sbEnglishNumbers.ToString();
}
Original answer: use this method:
private string toEnglishNumber(string input)
{
    string EnglishNumbers = "";
    for (int i = 0; i < input.Length; i++)
    {
        if (Char.IsDigit(input[i]))
        {
            EnglishNumbers += char.GetNumericValue(input, i);
        }
        else
        {
            EnglishNumbers += input[i].ToString();
        }
    }
    return EnglishNumbers;
}
Unfortunately it is not yet possible to parse the complete string representation by passing in an appropriate IFormatProvider (maybe in an upcoming version). However, the char type has a GetNumericValue method which converts any numeric Unicode character to a double. For example:
double two = char.GetNumericValue('٢');
Console.WriteLine(two); // prints 2
You could use it to convert one digit at a time.
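A minimal sketch of converting a whole string that way, assuming the input contains only digit characters (the example value is the ٢٣٤ from the question):
using System;

class Program
{
    static void Main()
    {
        string input = "٢٣٤";
        int result = 0;
        foreach (char c in input)
        {
            // GetNumericValue returns the digit's value regardless of which script it is written in.
            result = result * 10 + (int)char.GetNumericValue(c);
        }
        Console.WriteLine(result); // 234
    }
}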
Arabic-Indic digits like ١،٢،٣،٤ are encoded in Unicode as characters in the range 1632 to 1641. Subtract the code of the Arabic-Indic zero (1632) from the code of each digit character to get its digit value. Multiply each digit value by its place value and sum the results to get the integer.
Alternatively, use Regex.Replace to convert the string with Arabic-Indic digits into a string with decimal digits, then use int.Parse to convert the result into an integer.
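A sketch of that Regex.Replace route; the character range and the match evaluator are my own illustration, not part of the original answer:
using System;
using System.Text.RegularExpressions;

class Program
{
    static void Main()
    {
        string input = "٢٣٤";
        // Replace each Arabic-Indic digit (U+0660..U+0669) with the matching '0'..'9'.
        string western = Regex.Replace(input, "[\u0660-\u0669]",
            m => ((char)(m.Value[0] - '\u0660' + '0')).ToString());
        int value = int.Parse(western);
        Console.WriteLine(value); // 234
    }
}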
A simple way to convert Arabic numerals into an integer:
string EnglishNumbers = "";
for (int i = 0; i < arabicnumbers.Length; i++)
{
    EnglishNumbers += char.GetNumericValue(arabicnumbers, i);
}
int convertednumber = Convert.ToInt32(EnglishNumbers);
This is my solution:
public static string arabicNumToEnglish(string input)
{
    string[] map = { "٠", "١", "٢", "٣", "٤", "٥", "٦", "٧", "٨", "٩" };
    for (int i = 0; i <= 9; i++)
    {
        input = input.Replace(map[i], i.ToString());
    }
    return input;
}
To get the value of a digit, subtract the zero character from it; e.g. with ordinary decimal digits, '1' - '0' = 1, '2' - '0' = 2, etc.
For a multi-digit number you can use something like this:
int result = 0;
foreach (char digit in number)
{
    result *= 10;          // shift the digits: multiply by ten for each new digit
    result += digit - '0'; // add the int value of the current digit
}
Just replace the '0' with the Arabic-Indic zero ('٠') if your number uses Arabic digit characters. This works for any set of numeric symbols, as long as 0-9 in that symbol system are encoded consecutively.
I know this question is a bit old; however, I faced a similar case in one of my projects, came across this question, and decided to share the solution that worked perfectly for me, in the hope that it will serve others the same way.
private string ConvertToWesternArabicNumerals(string input)
{
    var result = new StringBuilder(input.Length);
    foreach (char c in input.ToCharArray())
    {
        // Check whether the character is recognized as a Unicode numeric value.
        if (char.IsNumber(c))
        {
            // char.GetNumericValue() converts a numeric Unicode character to a
            // double-precision floating point number (the numeric value of the char);
            // append it to the final string holder.
            result.Append(char.GetNumericValue(c));
        }
        else
        {
            // Append non-numeric chars to recreate the original string with the converted numbers.
            result.Append(c);
        }
    }
    return result.ToString();
}
Now you can simply call the function to get the string back with Western Arabic numerals.
Try this extension:
public static class Extension
{
    public static string ToEnglishNumbers(this string s)
    {
        return s.Replace("۰", "0").Replace("۱", "1").Replace("۲", "2").Replace("۳", "3").Replace("۴", "4")
                .Replace("۵", "5").Replace("۶", "6").Replace("۷", "7").Replace("۸", "8").Replace("۹", "9");
    }

    public static int ToNumber(this string s)
    {
        if (int.TryParse(s.ToEnglishNumbers(), out var result))
        {
            return result;
        }
        return -1;
    }

    public static string ToArabicNumbers(this string s)
    {
        return s.Replace("0", "۰").Replace("1", "۱").Replace("2", "۲").Replace("3", "۳").Replace("4", "۴")
                .Replace("5", "۵").Replace("6", "۶").Replace("7", "۷").Replace("8", "۸").Replace("9", "۹");
    }
}

How could I encode a long number using uppercase letters and numbers to make it shorter to type?

Is there a way I could encode a long number (e.g. 12349874529768521) as lower-case letters AND numbers for the purposes of reducing its length? The idea is that a user might have a long number on a piece of paper.
It seems to me that if there are more symbols available, that the resulting number could be made shorter. So I'm looking for something like hexadecimal but using the larger symbol space of A-Z instead of just A-F.
This would be in C# (if it matters)
Base32 encoding is designed to produce an unambiguous, compact, human-readable (and non-obscene!) representation. From Wikipedia:
Base32 has a number of advantages over Base64:
The resulting character set is all one case, which can often be beneficial when using a case-insensitive filesystem, spoken language, or human memory.
The result can be used as a file name because it can not possibly contain the '/' symbol, which is the Unix path separator.
The alphabet can be selected to avoid similar-looking pairs of different symbols, so the strings can be accurately transcribed by hand. (For example, the RFC 4648 symbol set omits the digits for one, eight and zero, since they could be confused with the letters 'I', 'B', and 'O'.)
A result excluding padding can be included in a URL without encoding any characters.
Base32 also has advantages over hexadecimal/Base16: Base32 representation takes roughly 20% less space. (1000 bits takes 200 characters, compared to 250 for Base16)
Douglas Crockford's original article on Base32 encoding is also well worth a read.
EDIT: here's a bit of C# that'll do base-N encoding of integers:
class Program {
    private const string BINARY = "01";
    private const string DECIMAL = "0123456789";
    private const string HEX = "0123456789abcdef";
    private const string BASE32 = "0123456789abcdefghjkmnpqrstvwxyz";

    static string EncodeInt32(string alphabet, int value) {
        var sb = new StringBuilder();
        while (value > 0) {
            sb.Insert(0, alphabet[value % alphabet.Length]);
            value = value / alphabet.Length;
        }
        return sb.ToString();
    }

    static int DecodeInt32(string alphabet, string value) {
        int result = 0;
        int b = alphabet.Length;
        int pow = 0;
        for (var i = value.Length - 1; i >= 0; i--) {
            result += (int)(Math.Pow(b, pow++)) * alphabet.IndexOf(value[i]);
        }
        return (result);
    }

    static void Main(string[] args) {
        for (var i = 0; i < 1234567890; i += 1234567) {
            Console.WriteLine("{0} {1} {2}", i, EncodeInt32(BASE32, i), DecodeInt32(BASE32, EncodeInt32(BASE32, i)));
        }
        Console.ReadKey(false);
    }
}
Example output showing typical reduction in string length:
1227159598 14j9y1e 1227159598
1228394165 14kfknn 1228394165
1229628732 14mn99w 1229628732
1230863299 14ntyy3 1230863299
1232097866 14q0mja 1232097866
1233332433 14r6a6h 1233332433
1234567000 14sbztr 1234567000
How about a BaseN method to encode/decode your long into a string using characters you define yourself?
public static class BaseN
{
    private const string CharList = "0123456789abcdefghijklmnopqrstuvwxyz";

    public static String Encode(long input)
    {
        if (input < 0) throw new ArgumentOutOfRangeException("input", input, "input cannot be negative");
        var result = new System.Collections.Generic.Stack<char>();
        while (input != 0)
        {
            result.Push(CharList[(int)(input % CharList.Length)]);
            input /= CharList.Length;
        }
        return new string(result.ToArray());
    }

    public static long Decode(string input)
    {
        long result = 0, pos = 0;
        foreach (char c in input.Reverse())
        {
            result += CharList.IndexOf(c) * (long)Math.Pow(CharList.Length, pos);
            pos++;
        }
        return result;
    }
}
Usage:
long number = 12349874529768521;
string result = BaseN.Encode(number);
Sample:
https://dotnetfiddle.net/odwFlk
Here's a similar approach to the others, using a Base-N conversion:
using System;
using System.Text;

namespace ConsoleApp3
{
    class Program
    {
        static void Main()
        {
            long n = 12349874529768521;
            string baseChars = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz##";
            var encoded = AsBaseN(n, baseChars.ToCharArray());
            Console.WriteLine(encoded); // Prints "9HXNyK2uh"
            long decoded = AsLong(encoded, baseChars.ToCharArray());
            Console.WriteLine(decoded); // Prints "12349874529768521"
        }

        public static string AsBaseN(long value, char[] baseChars)
        {
            var result = new StringBuilder();
            int targetBase = baseChars.Length;
            do
            {
                result.Append(baseChars[value % targetBase]);
                value /= targetBase;
            }
            while (value > 0);
            return result.ToString();
        }

        public static long AsLong(string number, char[] baseChars)
        {
            long result = 0;
            int numberBase = baseChars.Length;
            long multiplier = 1;
            foreach (char c in number)
            {
                result += multiplier * Array.IndexOf(baseChars, c);
                multiplier *= numberBase;
            }
            return result;
        }
    }
}
If you want a different set of allowable characters, just change baseChars as appropriate. For example, if you just want 0-9 and A-Z:
string baseChars = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ";
This gives a result of T3OPA1YNLD3 (base 36) instead of 9HXNyK2uh (base 64).
I presume you mean you want to represent the number with fewer characters.
Base 36 will do this (0-9, a-z).
You can use a base 36 encoder.
Base36 is a binary-to-text encoding scheme that represents binary data in an ASCII string format by translating it into a radix-36 representation. The choice of 36 is convenient in that the digits can be represented using the Arabic numerals 0–9 and the Latin letters A–Z (the ISO basic Latin alphabet).
Here's an example of one, but any should work: https://github.com/thewindev/csharpbase36
Example Usage
// Encoding
Base36.Encode(10); // returns "A"
Base36.Encode(10000); // returns "7PS"
// Decoding
Base36.Decode("Z"); // returns 35L
Base36.Decode("10"); // returns 36L
Base36.Decode("7PS"); // returns 10000L
By default uppercase letters are used. If you really want lowercase, then a simple string.ToLowerInvariant() can change that.
However, uppercase is usually easier to read, which is why it's used by default, so you might want to consider sticking with uppercase rather than lowercase.
You could look to Base64 encoding. It uses 0-9, A-Z, a-z, + and / characters. Or Base36, if you're interested only in 0-9 and A-Z.
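For completeness, a small sketch of the Base64 route; it encodes the long's eight raw bytes rather than its decimal digits, so the alphabet includes '+', '/' and '=' padding:
using System;

class Program
{
    static void Main()
    {
        long number = 12349874529768521;
        // Encode the raw 8 bytes of the long as Base64 text.
        string encoded = Convert.ToBase64String(BitConverter.GetBytes(number));
        Console.WriteLine(encoded);
        // Decode back to the original value.
        long decoded = BitConverter.ToInt64(Convert.FromBase64String(encoded), 0);
        Console.WriteLine(decoded); // 12349874529768521
    }
}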
