I have to convert a string to bytes (base 16) in JavaScript. I can do this in .NET with the following code, but I have to change it for an old classic ASP app that uses JavaScript.
string strShared_Key = "6fc2e550abc4ea333395346123456789";
int nLength = strShared_Key.Length;
byte[] keyMAC = new byte[nLength / 2];
for (int i = 0; i < nLength; i += 2)
    keyMAC[i / 2] = Convert.ToByte(strShared_Key.Substring(i, 2), 16);
This is my JavaScript function, but it doesn't return the same output as the .NET code above.
function String2Bin16bit(inputString) {
    var arr = []; // byte array
    for (var i = 0; i < inputString.length; i += 2) {
        // get a chunk of two characters and parse it as a base-16 number
        arr.push(parseInt(inputString.substr(i, 2), 16));
    }
    return arr;
}
You want parseInt(x, 16), which reads x and parses it as a number, bearing in mind that it's in base 16.
var str = "aabbcc"; // string
var arr = []; // byte array
for(var i = 0; i < str.length; i += 2) {
arr.push(parseInt(str.substr(i, 2), 16)); // get chunk of two characters and parse to number
}
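For example, with str = "aabbcc" the loop leaves arr as [170, 187, 204] (0xAA, 0xBB, 0xCC), which is the same byte sequence the .NET Convert.ToByte loop produces for that input.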
I use Modbus/TCP to get data, and the data type is LReal, but I want to convert the LReal to a number I can use.
This is my data: the static_13 value is 4.232, but what I get is [80] 16400, [81] -2098, [82] -9962, [83] -30933.
I don't know how to turn these into a double.
Based on this experimental Python code,
>>> import struct
>>> x = [16400, -2098, -9962, -30933]
>>> struct.unpack(">d", struct.pack(">4h", *x))
(4.242,)
it looks like you'd need to concatenate those four 16-bit integers in big-endian format, then interpret the resulting 8 bytes as a single big-endian double.
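To check by hand: 16400 = 0x4010, -2098 = 0xF7CE, -9962 = 0xD916 and -30933 = 0x872B, so the concatenated buffer is 40 10 F7 CE D9 16 87 2B, and those 8 bytes read as a big-endian IEEE 754 double give roughly 4.242.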
In .NET 6:
using System.Buffers.Binary;
using System;
short[] values = {16400, -2098, -9962, -30933};
byte[] buf = new byte[values.Length * sizeof(short)];
for (int i = 0; i < values.Length; i++)
{
    BinaryPrimitives.WriteInt16BigEndian(buf.AsSpan(i * sizeof(short)), values[i]);
}
double result = BinaryPrimitives.ReadDoubleBigEndian(buf);
Console.WriteLine(result);
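This prints 4.242, the same value the Python snippet produced.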
In .NET 4:
using System;

public class Program
{
    public static void Main()
    {
        short[] values = { 16400, -2098, -9962, -30933 };
        byte[] buf = new byte[8];
        for (int i = 0; i < 4; i++)
        {
            byte[] sh_buf = BitConverter.GetBytes(values[i]);
            if (BitConverter.IsLittleEndian)
            {
                // Flip the bytes around if we're little-endian
                buf[(3 - i) * 2] = sh_buf[0];
                buf[(3 - i) * 2 + 1] = sh_buf[1];
            }
            else
            {
                buf[i * 2] = sh_buf[0];
                buf[i * 2 + 1] = sh_buf[1];
            }
        }
        double result = BitConverter.ToDouble(buf, 0);
        Console.WriteLine(result);
    }
}
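Both versions assume the device delivers the registers most-significant word first. Some Modbus devices send the words in the reverse order, in which case you would need to swap the four shorts before assembling the double.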
I'm trying to convert a hex value to a string, but it's not working.
I have the following value 0x01BB92E7F716F55B144768FCB2EA40187AE6CF6B2E52A64F7331D0539507441F7D770112510D679F0B310116B0D709E049A19467672FFA532A7C30DFB72
The result I hope for would be this, but executing the function below displays this result:
»’ Ç ÷ õ [Ghü²ê # zæÏk.R¦Os1ÐS • D} w Q gŸ 1 ° × àI¡ ”gg / úS * | 0ß ·) = ¤
Any idea how I can extract the information as expected?
public static string Hex2String(string input)
{
    var builder = new StringBuilder();
    for (int i = 0; i < input.Length; i += 2)
    {
        // throws an exception if not properly formatted
        string hexdec = input.Substring(i, 2);
        int number = Int32.Parse(hexdec, NumberStyles.HexNumber);
        char charToAdd = (char)number;
        builder.Append(charToAdd);
    }
    return builder.ToString();
}
Your expected result is base64-encoded. Base64 is a way of taking a byte array and turning it into human-readable characters.
Your code tries to take the raw bytes and cast them to chars, but not all byte values are valid printable characters: some are control characters, some can't be printed, and so on.
Instead, let's turn the hex string into a byte array, and then turn that byte array into a base64 string.
string input = "01BB92E7F716F55B144768FCB2EA40187AE6CF6B2E52A64F7331D0539507441F7D770112510D679F0B310116B0D709E049A19467672FFA532A7C30DFB72";
byte[] bytes = new byte[input.Length / 2];
for (int i = 0; i < bytes.Length; i++)
{
    bytes[i] = byte.Parse(input.Substring(i * 2, 2), NumberStyles.HexNumber);
}
string result = Convert.ToBase64String(bytes);
This results in:
AbuS5/cW9VsUR2j8supAGHrmz2suUqZPczHQU5UHRB99dwESUQ1nnwsxARaw1wngSaGUZ2cv+lMqfDDftw==
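As an aside, on .NET 5 or later Convert.FromHexString can replace the manual parsing loop; a minimal sketch, assuming the same input variable as above:
string result = Convert.ToBase64String(Convert.FromHexString(input));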
I have a string of bits, like this: string str = "0111001101101000". It's the letters "sh".
I need to make Unicode letters out of it. I'm doing the following:
BitArray bn = new BitArray(str.Length); //creating new bitarray
for (int kat = 0; kat < str.Length; kat++)
{
    if (str[kat].ToString() == "0") //adding boolean values into array
    {
        bn[kat] = false;
    }
    else
        bn[kat] = true;
}
byte[] bytes = new byte[bn.Length]; //converting to bytes
bn.CopyTo(bytes, 0);
string output = Encoding.Unicode.GetString(bytes); //encoding
textBox2.Text = output; // result in textbox
But the output text is just a complete mess. How do I do it right?
There are a couple of problems with your code.
First, BitArray reverses the bit order; it's easier to use Convert.ToByte.
Second, your input string contains two bytes (one per character of the text), but you're using Encoding.Unicode to decode it, which is UTF-16 (two bytes per character). You need to use Encoding.UTF8.
Working Code
string str = "0111001101101000";
int numOfBytes = str.Length / 8;
byte[] bytes = new byte[numOfBytes];
for (int i = 0; i < numOfBytes; ++i)
{
    bytes[i] = Convert.ToByte(str.Substring(8 * i, 8), 2);
}
string output = Encoding.UTF8.GetString(bytes);
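For the sample input this produces bytes = { 0x73, 0x68 }, and output is "sh".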
A) Your string is ASCII, not Unicode: 8 bits per character.
B) The most significant bit of every byte is on the left, hence the strange math used in the bn[...] index.
C) The commented-out part is useless, because false is the default state of a BitArray.
D) The length of the byte array was wrong: 8 bits == 1 byte! :-)
string str = "0111001101101000";
BitArray bn = new BitArray(str.Length); //creating new bitarray
for (int kat = 0; kat < str.Length; kat++) {
if (str[kat] == '0')//adding boolean values into array
{
//bn[(kat / 8 * 8) + 7 - (kat % 8)] = false;
} else {
bn[(kat / 8 * 8) + 7 - (kat % 8)] = true;
}
}
// 8 bits in a byte
byte[] bytes = new byte[bn.Length / 8];//converting to bytes
bn.CopyTo(bytes, 0);
string output = Encoding.ASCII.GetString(bytes); //encoding
Probably better:
string str = "0111001101101000";
byte[] bytes = new byte[str.Length / 8];
for (int ix = 0, weight = 128, ix2 = 0; ix < str.Length; ix++) {
if (str[ix] == '1') {
bytes[ix2] += (byte)weight;
}
weight /= 2;
// Every 8 bits we "reset" the weight
// and increment the ix2
if (weight == 0) {
ix2++;
weight = 128;
}
}
string output = Encoding.ASCII.GetString(bytes); //encoding
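Either version decodes the sample bit string to "sh".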
For testing purpose, I need to generate a random string, which is then encoded into byte array for transferring over the Web and decoded back to a result string. The test uses NUnit framework to compare the original string with the result string. Since the encoded byte array has to be friendly for Web, it is encoded with UTF-8.
The string is encoded into a byte array by Encoder.GetBytes from UTF8Encoding. The byte array is decoded to string by Decoder.GetChars from UTF8Encoding.
The original string needs to be generated randomly and contain any sequence of characters, which can be encoded/decoded using UTF-8 encoding.
My first attempt to generate the string was:
public static String RandomString(Random rnd, Int32 length) {
    StringBuilder str = new StringBuilder(length);
    for (int i = 0; i < length; i++)
        str.Append((char)rnd.Next(char.MinValue, char.MaxValue));
    return str.ToString();
}
The above code produces strings containing sequences that are invalid to encode, because rnd.Next can return lone UTF-16 surrogate code units (0xD800 to 0xDFFF).
I found some suggestions on the web and improved the code:
public static String RandomString(Random rnd, Int32 length) {
    StringBuilder str = new StringBuilder(length);
    for (int i = 0; i < length; i++) {
        char c = (char)rnd.Next(char.MinValue, char.MaxValue);
        // re-roll until the character is not a lone surrogate
        while (c >= 0xD800 && c <= 0xDFFF)
            c = (char)rnd.Next(char.MinValue, char.MaxValue);
        str.Append(c);
    }
    return str.ToString();
}
The above code has no problem with encoding, but decoding the byte array fails. Furthermore, I am not sure that the code can cover all possible cases.
Any suggestions on how to generate a random string with the given requirements in C#?
UPD: here is how the random string is used in encoding/decoding:
public static Encoder Utf8Encode = new UTF8Encoding(false, true).GetEncoder();
public static Decoder Utf8Decode = new UTF8Encoding(false, true).GetDecoder();

public unsafe void TestString(Random rnd, int length, byte* byteArray, int arrayLenght)
{
    int encodedLen;
    String str = RandomString(rnd, length);
    fixed (char* pStr = str)
    {
        encodedLen = Utf8Encode.GetBytes(pStr, str.Length, byteArray, arrayLenght, true);
    }
    char* buffer = stackalloc char[8192];
    int decodedLen = Utf8Decode.GetChars(byteArray, encodedLen, buffer, 8192, true);
    String res = new String(buffer, 0, decodedLen);
    Assert.AreEqual(str, res);
}
I have used the code below for generating random UTF-8 character byte sequences. I can't guarantee it captures every aspect of the UTF-8 spec, but it was valuable for my testing purposes, so I'm posting it here.
// requires: using System; using System.Linq;
// (number of possible head bytes, fixed prefix bits), indexed by tail-byte count
private static readonly (int, int)[] HeadByteDefinitions =
{
    (1 << 7, 0b0000_0000),
    (1 << 5, 0b1100_0000),
    (1 << 4, 0b1110_0000),
    (1 << 3, 0b1111_0000)
};

static byte[] RandomUtf8Char(Random gen)
{
    // number of bit patterns addressable by 1-, 2-, 3- and 4-byte sequences
    const int totalNumberOfUtf8Chars = (1 << 7) + (1 << 11) + (1 << 16) + (1 << 21);
    int tailByteCnt;
    var rnd = gen.Next(totalNumberOfUtf8Chars);
    if (rnd < (1 << 7))
        tailByteCnt = 0;
    else if (rnd < (1 << 7) + (1 << 11))
        tailByteCnt = 1;
    else if (rnd < (1 << 7) + (1 << 11) + (1 << 16))
        tailByteCnt = 2;
    else
        tailByteCnt = 3;
    var (range, offset) = HeadByteDefinitions[tailByteCnt];
    var headByte = Convert.ToByte(gen.Next(range) + offset);
    // tail bytes have the form 0b10xx_xxxx
    var tailBytes = Enumerable.Range(0, tailByteCnt)
        .Select(_ => Convert.ToByte(gen.Next(1 << 6) + 0b1000_0000));
    return new[] {headByte}.Concat(tailBytes).ToArray();
}
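For completeness, here is a minimal usage sketch of my own (not part of the original answer): it concatenates a handful of generated sequences and decodes them with Encoding.UTF8. Because the generator can emit overlong or out-of-range sequences, the default decoder may replace some of them with U+FFFD:
using System;
using System.Linq;
using System.Text;

var gen = new Random();
byte[] buffer = Enumerable.Range(0, 16)        // 16 random characters
    .SelectMany(_ => RandomUtf8Char(gen))      // each is 1 to 4 bytes
    .ToArray();
string text = Encoding.UTF8.GetString(buffer); // lenient decode
Console.WriteLine(text);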
I have code like this:
decimal prodprice = Convert.ToDecimal(NavigationContext.QueryString["Price"]);
Then what should I put for the following line?
Binary image = Convert.ToByte(NavigationContext.QueryString["Image"]);
I get an error at Convert.ToByte.
I am just going to assume you get your "Image" query string as a hex string (0-F digits). If so, you can convert it into a byte array first. An example (not the most efficient one, but it should work):
string data = "0A0B0C0F1102"; // example data
if (data.Length % 2 != 0) { data = "0" + data; }
byte[] result = new byte[data.Length / 2];
for (int i = 0; i < data.Length; i += 2) {
    result[i / 2] = Convert.ToByte(data.Substring(i, 2), 16);
}
Binary image = result;
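Binary here is assumed to be System.Data.Linq.Binary, which defines an implicit conversion from byte[], so the final assignment compiles as written.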