What is the idiomatic way to convert a standard hexadecimal string (like "0x0123") to a BigInteger in C#?
What I tried requires removing the hex prefix manually:
using System;
using System.Numerics;
using System.Globalization;
namespace TestHex
{
    class Program
    {
        static void Main(string[] args)
        {
            BigInteger A;
            // this does not work
            // A = BigInteger.Parse("0x0123");
            // this works, but only without the hex prefix
            A = BigInteger.Parse("123", NumberStyles.AllowHexSpecifier);
            Console.WriteLine(A);
            Console.ReadLine();
        }
    }
}
According to the MSDN documentation, the idiom is to accept only hexadecimal strings without the 0x prefix as input, but then to lie to the user by printing them back with a 0x prefix:
public class Example
{
    public static void Main()
    {
        string[] hexStrings = { "80", "E293", "F9A2FF", "FFFFFFFF",
                                "080", "0E293", "0F9A2FF", "0FFFFFFFF",
                                "0080", "00E293", "00F9A2FF", "00FFFFFFFF" };
        foreach (string hexString in hexStrings)
        {
            BigInteger number = BigInteger.Parse(
                hexString,
                NumberStyles.AllowHexSpecifier);
            Console.WriteLine("Converted 0x{0} to {1}.", hexString, number);
        }
    }
}
// The example displays the following output:
// Converted 0x80 to -128.
// Converted 0xE293 to -7533.
// Converted 0xF9A2FF to -417025.
// Converted 0xFFFFFFFF to -1.
// Converted 0x080 to 128.
// Converted 0x0E293 to 58003.
// Converted 0x0F9A2FF to 16360191.
// Converted 0x0FFFFFFFF to 4294967295.
// Converted 0x0080 to 128.
// Converted 0x00E293 to 58003.
// Converted 0x00F9A2FF to 16360191.
// Converted 0x00FFFFFFFF to 4294967295.
That's a really rubbish idiom. I'd suggest inventing your own idiom that fits your use case.
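One such idiom, sketched here as a small helper (the name ParseHex is my own): strip an optional "0x" prefix, then prepend a "0" so that BigInteger.Parse never sign-extends the top nibble into a negative value:

```csharp
using System;
using System.Globalization;
using System.Numerics;

static class HexParser
{
    // Strip an optional "0x"/"0X" prefix, then prepend "0" so the top
    // nibble is never sign-extended into a negative value.
    public static BigInteger ParseHex(string s)
    {
        if (s.StartsWith("0x", StringComparison.OrdinalIgnoreCase))
            s = s.Substring(2);
        return BigInteger.Parse("0" + s, NumberStyles.AllowHexSpecifier);
    }

    static void Main()
    {
        Console.WriteLine(ParseHex("0x0123"));     // 291
        Console.WriteLine(ParseHex("0xFFFFFFFF")); // 4294967295, not -1
    }
}
```

With the leading zero, "0xFFFFFFFF" parses as 4294967295 rather than -1, matching the second group of the MSDN output above.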
Related
I want to get input from the user and print the type of the input the user gave.
I have tried this:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
class Solution
{
    static void Main(String[] args)
    {
        var userObj = Console.ReadLine();
        // if input is 5 it should print that it is of type int.
        // if input is 5.4 it should print that it is of type double.
        Console.WriteLine(userObj.GetType()); // prints only System.String
    }
}
I also tried this, but the condition is always false:
using System;
class Solution
{
    static void Main(String[] args)
    {
        var userObj = Console.ReadLine();
        if (string.Format(userObj) == string.Format("0"))
        {
            Console.WriteLine("it is of type integer");
        }
    }
}
You're misunderstanding how var works in C#. C# is a strongly-typed language, so its var is not like the var of dynamically-typed languages such as JavaScript: a variable declared with var already has its type fixed at compile time.
Console.ReadLine() returns a string, so the variable userObj WILL be a string. You will never get anything but a string type.
You can, however, try to convert it to another type. For example:
var userInput = Console.ReadLine();
int x;
if(int.TryParse(userInput, out x))
{
Console.WriteLine("That's an int!");
}
Try parsing with different numeric data types, from smallest to largest; I assume you want to store the number in the smallest type possible:
int var1;
float var2;
double var3;
if (int.TryParse(userInput, out var1))
    Console.WriteLine("Int");
else if (float.TryParse(userInput, out var2))
    Console.WriteLine("Float");
else if (double.TryParse(userInput, out var3))
    Console.WriteLine("Double");
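Putting the answers together, a minimal complete sketch (class and method names are my own) that classifies a line of input by the narrowest common type that can represent it:

```csharp
using System;

class TypeProbe
{
    // Report the narrowest common type that can represent the input.
    public static string Classify(string input)
    {
        if (int.TryParse(input, out _))
            return "int";
        if (double.TryParse(input, out _))
            return "double";
        return "string";
    }

    static void Main()
    {
        foreach (var sample in new[] { "5", "5.4", "hello" })
            Console.WriteLine($"{sample} => {Classify(sample)}");
    }
}
```

Note that TryParse honors the current culture's decimal separator, so "5.4" classifies as double only in cultures that use "." for decimals.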
This question already has answers here:
Convert char to int in C#
(20 answers)
Closed 9 months ago.
I was trying to learn explicit conversion in C#.
First I explicitly converted a string to an int:
using System;

namespace MyfirstCSharpCode
{
    class Program
    {
        static void Main(string[] args)
        {
            string str = "1234";
            int i = Convert.ToInt16(str);
            Console.WriteLine(i);
        }
    }
}
It worked, and the result is 1234.
Now I tried to convert a char to an int:
using System;

namespace MyfirstCSharpCode
{
    class Program
    {
        static void Main(string[] args)
        {
            char str = '1';
            int i = Convert.ToInt16(str);
            Console.WriteLine(i);
        }
    }
}
And now the result is 49, the character code of '1'. How do I get the numeric value 1 instead of the character code? Please don't be too harsh, it's my first day in C# ;)..
You hit the Convert.ToInt16(char) overload; a char is a UTF-16 code unit, and the code unit representing '1' is 49. You can use Int16.TryParse to safely parse the string (into a 16-bit signed number) and get the valid integer:
string str = "1"; // use a string if your number has more digits, e.g. 122
short number;     // the 16-bit signed integer equivalent
if (Int16.TryParse(str, out number))
{
    Console.WriteLine(number);
}
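If you are starting from a char rather than a string, two other common options are subtracting '0' (digit characters are contiguous in UTF-16) or char.GetNumericValue; a small sketch:

```csharp
using System;

class DigitDemo
{
    static void Main()
    {
        char c = '1';
        int codeUnit = c;                                     // 49: the UTF-16 code unit
        int bySubtraction = c - '0';                          // 1: digits '0'..'9' are contiguous
        int byGetNumericValue = (int)char.GetNumericValue(c); // 1: also handles other numeric chars
        Console.WriteLine($"{codeUnit} {bySubtraction} {byGetNumericValue}"); // 49 1 1
    }
}
```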
I have a reference program which I got online. In it, a variable data64 is defined as type ulong, which is converted to hex and displayed in a textbox like this:
TextBox1.AppendText(Rxmsg.data64.ToString("X"));
The actual value of data64 is 12288092688749907974, but the textbox displays "AA88133200035006". I thought it was a simple decimal-to-hex conversion, so I converted the data64 value to hex myself, but I got something different. Can anyone clarify how the above conversion was made? It's related to one of my projects, and understanding it would be very useful for me to proceed further.
The reason is the endianness of the display:
IsLittleEndian Yes: 06-50-03-00-32-13-88-AA
IsLittleEndian No: AA-88-13-32-00-03-50-06
(fiddle demo and Wikipedia link)
using System;

namespace ConsoleApplication1
{
    public class Program
    {
        public static void Main(string[] args)
        {
            var value = 12288092688749907974u;
            var bytes = BitConverter.GetBytes(value);
            Console.Write(BitConverter.IsLittleEndian ? "IsLittleEndian Yes" : "IsLittleEndian No");
            Console.WriteLine(" Value " + BitConverter.ToString(bytes));
            Array.Reverse(bytes);
            Console.Write(BitConverter.IsLittleEndian ? "IsLittleEndian No" : "IsLittleEndian Yes");
            Console.WriteLine(" Value " + BitConverter.ToString(bytes));
            Console.Read();
        }
    }
}
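Note that ToString("X") itself writes the most significant digit first, so it agrees with the reversed (big-endian) byte order above; a quick check:

```csharp
using System;

class HexCheck
{
    static void Main()
    {
        ulong value = 12288092688749907974UL;
        // ToString("X") prints most-significant hex digit first.
        Console.WriteLine(value.ToString("X"));           // AA88133200035006
        // The hex literal round-trips back to the same number.
        Console.WriteLine(0xAA88133200035006UL == value); // True
    }
}
```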
I am trying to take output from an array and convert it to a double for use in a calculation.
This is what I am trying to do:
Console.WriteLine(product[1]);
double units = Convert.ToDouble(Console.ReadLine());
Have been trying few other thing but getting no where; any easy solution?
There's no need to write it to the console and read it back; simply:
var units = Convert.ToDouble(product[1]);
You might also consider using Double.TryParse() to check whether the value can be converted into a double and isn't a string of letters.
Your line could throw an exception if the user types an invalid double:
double units = Convert.ToDouble(Console.ReadLine());
You should do this instead:
double units;
if (!double.TryParse(Console.ReadLine(), out units))
{
    // units is not a double
}
else
{
    // units is a double
}
If you need to convert the whole array to doubles, you could do this:
using System.Linq;
var doubleProduct = product.Select(p => double.Parse(p)).ToArray();
Edit
You can also use Array.ConvertAll() which is apparently more efficient (Thanks #PetSerAl for the tip). It also means you don't need Linq:
var doubleProduct = Array.ConvertAll(product, p => double.Parse(p));
using System;

public class Example
{
    public static void Main()
    {
        string[] values = { "-1,035.77219", "1AFF", "1e-35",
                            "1,635,592,999,999,999,999,999,999", "-17.455",
                            "190.34001", "1.29e325" };
        double result;
        foreach (string value in values)
        {
            try
            {
                result = Convert.ToDouble(value);
                Console.WriteLine("Converted '{0}' to {1}.", value, result);
            }
            catch (FormatException)
            {
                Console.WriteLine("Unable to convert '{0}' to a Double.", value);
            }
            catch (OverflowException)
            {
                Console.WriteLine("'{0}' is outside the range of a Double.", value);
            }
        }
    }
}
// The example displays the following output:
// Converted '-1,035.77219' to -1035.77219.
// Unable to convert '1AFF' to a Double.
// Converted '1e-35' to 1E-35.
// Converted '1,635,592,999,999,999,999,999,999' to 1.635593E+24.
// Converted '-17.455' to -17.455.
// Converted '190.34001' to 190.34001.
// '1.29e325' is outside the range of a Double.
Read MSDN
Console.WriteLine Method (String, Object)
Console.ReadLine Method
Please try the following:
using System;

public class Program
{
    public static void Main()
    {
        string[] products = { "10.5", "20.5", "50.5" };
        foreach (var product in products)
        {
            Console.WriteLine(Convert.ToDouble(product));
        }
    }
}
Live Demo
I am using NLua as the script interface for my application.
I want to send keyboard input from the Lua side to my C# code.
I do it with this C# code:
using (Lua lua = new Lua())
{
    lua.LoadCLRPackage();
    lua.RegisterFunction("keypressC", null, typeof(TestNLUA).GetMethod("keypressC"));
    lua.RegisterFunction("keypressS", null, typeof(TestNLUA).GetMethod("keypressS"));
    lua["Key"] = new SpecialKey();
}

public class SpecialKey
{
    public static readonly char EnterC = '\uE007';
    public static readonly string EnterS = Convert.ToString(EnterC);
}

public class TestNLUA
{
    public static void keypressC(char key)
    {
        // key = 57351 => OK
    }

    public static void keypressS(string key)
    {
        char[] akey = key.ToCharArray();
        // akey[0] = 63 = ? (question mark) => KO
    }
}
And in the Lua script I do:
keypressC(Key.EnterC)
keypressS(Key.EnterS)
In keypressC, NLua passes the value 57351 to the key parameter. It's OK.
In keypressS, NLua passes the value "?" to the key parameter. It's KO.
I have no idea why there is the character "?". It looks like a marshaling error in NLua (i.e. LuaInterface)?
Can you help me?
This is a marshaling problem in nLua/LuaInterface.
It uses Marshal.StringToHGlobalAnsi to marshal a string from C# to Lua.
It uses Marshal.PtrToStringAnsi to marshal a string from Lua to C#.
If you round-trip your example string through these functions, you can see it reproduces your problem:
string test = "\uE007";
Console.WriteLine(test);
Console.WriteLine("{0}: {1}", test[0], (int) test[0]);
IntPtr ptr = Marshal.StringToHGlobalAnsi(test);
string roundTripped = Marshal.PtrToStringAnsi(ptr, test.Length);
Console.WriteLine(roundTripped);
Console.WriteLine("{0}: {1}", roundTripped[0], (int) roundTripped[0]);
Output:
?
?: 57351
?
?: 63
Your problem goes away if you change the marshaling functions to use the Uni variants instead of Ansi, but you'd need to build NLua/LuaInterface from source.
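For illustration, the same round trip through the Uni counterparts preserves the code unit (a sketch only; the actual fix still has to happen inside NLua's sources):

```csharp
using System;
using System.Runtime.InteropServices;

class UniRoundTrip
{
    static void Main()
    {
        string test = "\uE007";
        // Marshal as UTF-16 instead of ANSI: no lossy '?' substitution.
        IntPtr ptr = Marshal.StringToHGlobalUni(test);
        string roundTripped = Marshal.PtrToStringUni(ptr, test.Length);
        Marshal.FreeHGlobal(ptr);
        Console.WriteLine((int)roundTripped[0]); // 57351
    }
}
```

Marshal.StringToHGlobalUni and Marshal.PtrToStringUni are the UTF-16 counterparts of the Ansi calls named above; since Lua treats strings as raw byte arrays, the bytes survive the trip unchanged.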