I have a string of bits, like this: string str = "0111001101101000". It's the letters "sh".
I need to make Unicode letters out of it. I'm doing the following:
BitArray bn = new BitArray(str.Length); //creating new bitarray
for (int kat = 0; kat < str.Length; kat++)
{
if (str[kat].ToString() == "0")//adding boolean values into array
{
bn[kat] = false;
}
else
bn[kat] = true;
}
byte[] bytes = new byte[bn.Length];//converting to bytes
bn.CopyTo(bytes, 0);
string output = Encoding.Unicode.GetString(bytes); //encoding
textBox2.Text = output; // result in textbox
But the output text is just a complete mess. How do I do it right?
There are a couple of problems with your code.
First, BitArray reverses the bit order within each byte - it's easier to use Convert.ToByte instead.
Second, your input string contains two bytes (one per character), but you're using Encoding.Unicode to decode it, which is UTF-16 (two bytes per character). You need Encoding.UTF8.
Working Code
string str = "0111001101101000";
int numOfBytes = str.Length / 8;
byte[] bytes = new byte[numOfBytes];
for (int i = 0; i < numOfBytes; ++i)
{
bytes[i] = Convert.ToByte(str.Substring(8 * i, 8), 2);
}
string output = Encoding.UTF8.GetString(bytes);
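A quick sanity check of the result (a sketch using console output; the question writes to a WinForms textbox instead):

Console.WriteLine(bytes[0]); // 115 = 0b01110011 = 's'
Console.WriteLine(bytes[1]); // 104 = 0b01101000 = 'h'
Console.WriteLine(output);   // prints "sh"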
A) Your string is ASCII, not Unicode: 8 bits per character.
B) BitArray stores the least significant bit at the lowest index, while your string has the most significant bit on the left - hence the strange index math used in bn[...].
C) The commented part is useless, because false is the default state of a BitArray.
D) The length of the byte array was wrong. 8 bits == 1 byte! :-)
string str = "0111001101101000";
BitArray bn = new BitArray(str.Length); //creating new bitarray
for (int kat = 0; kat < str.Length; kat++) {
if (str[kat] == '0')//adding boolean values into array
{
//bn[(kat / 8 * 8) + 7 - (kat % 8)] = false;
} else {
bn[(kat / 8 * 8) + 7 - (kat % 8)] = true;
}
}
// 8 bits in a byte
byte[] bytes = new byte[bn.Length / 8];//converting to bytes
bn.CopyTo(bytes, 0);
string output = Encoding.ASCII.GetString(bytes); //encoding
Probably better:
string str = "0111001101101000";
byte[] bytes = new byte[str.Length / 8];
for (int ix = 0, weight = 128, ix2 = 0; ix < str.Length; ix++) {
if (str[ix] == '1') {
bytes[ix2] += (byte)weight;
}
weight /= 2;
// Every 8 bits we "reset" the weight
// and increment the ix2
if (weight == 0) {
ix2++;
weight = 128;
}
}
string output = Encoding.ASCII.GetString(bytes); //encoding
Not sure if I am going in the right direction.
I can't find info about the tilde (~) operator.
int n = 5;
int m = ~n;
string numAsString = Convert.ToString(~n, 2);
char[] NumAsChar = numAsString.ToCharArray();
long l = Convert.ToInt64(numAsString, 2);
Console.WriteLine(numAsString);
Console.WriteLine(l);
You're probably looking for a simple answer.
int n = 5;
byte[] nbytes = BitConverter.GetBytes(n);
for (int i = 0; i < nbytes.Length; i++)
    nbytes[i] = (byte)~nbytes[i]; // ~ on a byte yields an int, so cast back
n = BitConverter.ToInt32(nbytes, 0);
edit: you actually can't do ~ on a byte[]. You can either do
for (int i = 0; i < nbytes.Length; i++)
    nbytes[i] = (byte)~nbytes[i];
or just not use a byte array at all.
For clarity's sake, do note that you can just do
n = ~n;
and skip doing any of the separation. But you specifically asked for the byte conversion.
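As a quick illustration (a sketch; on a two's complement int, ~n equals -(n + 1)):

int n = 5;                                  // binary ...00000101
Console.WriteLine(~n);                      // -6
Console.WriteLine(Convert.ToString(~n, 2)); // 11111111111111111111111111111010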
Use these 2 methods
static byte[] GetBytes(string str)
{
byte[] bytes = new byte[str.Length * sizeof(char)];
System.Buffer.BlockCopy(str.ToCharArray(), 0, bytes, 0, bytes.Length);
return bytes;
}
static string GetString(byte[] bytes)
{
char[] chars = new char[bytes.Length / sizeof(char)];
System.Buffer.BlockCopy(bytes, 0, chars, 0, bytes.Length);
return new string(chars);
}
And then use them like this:
byte[] bytes = GetBytes(str);
byte[] reversed = bytes.Reverse().ToArray(); // Reverse() requires using System.Linq;
var revStr = GetString(reversed);
I did it like this. Any suggestions on making it simpler?
int n = 100;
//Convert decimal to binary
string numAsString = Convert.ToString(n, 2);
char[] NumAsChar = numAsString.ToCharArray();
Console.WriteLine(numAsString);
//Invert bits
for (int i = 0; i < numAsString.Length; i++)
{
if (NumAsChar[i] == '0')
{
NumAsChar[i] = '1';
}
else
{
NumAsChar[i] = '0';
}
}
string NewNumAsString = new string(NumAsChar);
//Convert inverted binary num to decimal
long l = Convert.ToInt64(NewNumAsString, 2);
Console.WriteLine(NewNumAsString);
Console.WriteLine(l);
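Note that this flips only the bits of the minimal binary representation, so the result differs from ~n, which flips all 32 bits (a quick check):

int n = 100;
Console.WriteLine(Convert.ToString(n, 2)); // 1100100
// flipping those seven characters gives 0011011 = 27
Console.WriteLine(~n);                     // -101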
I have been searching for a solution for two days.
I want to convert my 32- or 24-bit WAV to 16-bit.
This is my code (after reading a few Stack Overflow topics):
byte[] data = Convert.FromBase64String("-- Wav String encoded --"); // 32- or 24-bit samples
int conv = Convert.ToInt16(data);
byte[] intBytes = BitConverter.GetBytes(conv);
if (BitConverter.IsLittleEndian)
Array.Reverse(intBytes);
byte[] result = intBytes;
but when I WriteAllBytes my result, there is nothing to hear...
Here is a method that cuts off the least significant bytes:
byte[] data = ...
var skipBytes = 0;
byte[] data16bit;
int samples;
if( /* data was 32 bit */ ) {
skipBytes = 2;
samples = data.Length / 4;
} else if( /* data was 24 bit */ ) {
skipBytes = 1;
samples = data.Length / 3;
}
data16bit = new byte[samples * 2];
int writeIndex = 0;
int readIndex = 0;
for(var i = 0; i < samples; ++i) {
readIndex += skipBytes; //skip the least significant bytes
//read the two most significant bytes
data16bit[writeIndex++] = data[readIndex++];
data16bit[writeIndex++] = data[readIndex++];
}
This assumes a little-endian byte order (the least significant byte comes first, as is usual for WAV RIFF). If you have big-endian data, you have to put the readIndex += ... after the two read lines, as shown below.
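For the big-endian case, the copy loop becomes (same variables as above; a sketch):

for (var i = 0; i < samples; ++i) {
    // read the two most significant bytes first...
    data16bit[writeIndex++] = data[readIndex++];
    data16bit[writeIndex++] = data[readIndex++];
    readIndex += skipBytes; // ...then skip the least significant bytes
}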
You could implement your own conversion iterator for this task like so:
IEnumerable<byte> ConvertTo16Bit(byte[] data, int skipBytes)
{
int bytesToRead = 0;
int bytesToSkip = skipBytes;
int readIndex = 0;
while (readIndex < data.Length)
{
if (bytesToSkip > 0)
{
readIndex += bytesToSkip;
bytesToSkip = 0;
bytesToRead = 2;
continue;
}
if (bytesToRead == 0)
{
bytesToSkip = skipBytes;
continue;
}
yield return data[readIndex++];
bytesToRead--;
}
}
This way you don't have to create a new array if there is no need for it. And you can simply convert the data array to a new 16-bit array with the IEnumerable&lt;T&gt; extension methods:
var data16bit = ConvertTo16Bit(data, 1).ToArray();
Or if you don't need the array, you can iterate the data skipping the least significant bytes:
foreach (var b in ConvertTo16Bit(data, 1))
{
Console.WriteLine(b);
}
For testing purposes, I need to generate a random string, which is then encoded into a byte array for transferring over the Web and decoded back to a result string. The test uses the NUnit framework to compare the original string with the result string. Since the encoded byte array has to be Web-friendly, it is encoded with UTF-8.
The string is encoded into a byte array by Encoder.GetBytes from UTF8Encoding. The byte array is decoded to string by Decoder.GetChars from UTF8Encoding.
The original string needs to be generated randomly and contain any sequence of characters, which can be encoded/decoded using UTF-8 encoding.
My first attempt to generate the string was:
public static String RandomString(Random rnd, Int32 length) {
StringBuilder str = new StringBuilder(length);
for (int i = 0; i < length; i++)
str.Append((char)rnd.Next(char.MinValue, char.MaxValue));
return str.ToString();
}
The above code produces strings with invalid sequences to encode.
I found some suggestions on the web and improved the code:
public static String RandomString(Random rnd, Int32 length) {
    StringBuilder str = new StringBuilder(length);
    for (int i = 0; i < length; i++) {
        char c = (char)rnd.Next(char.MinValue, char.MaxValue);
        while (c >= 0xD800 && c <= 0xDFFF) // reject UTF-16 surrogate code units
            c = (char)rnd.Next(char.MinValue, char.MaxValue);
        str.Append(c);
    }
    return str.ToString();
}
The above code has no problem with encoding, but decoding the byte array fails. Furthermore, I am not sure that the code can cover all possible cases.
Any suggestions on how to generate a random string with the given requirements in C#?
UPD: using a random string in encoding/decoding:
public static Encoder Utf8Encode = new UTF8Encoding(false, true).GetEncoder();
public static Decoder Utf8Decode = new UTF8Encoding(false, true).GetDecoder();
public unsafe void TestString(Random rnd, int length, byte* byteArray,
int arrayLength) {
int encodedLen;
String str = RandomString(rnd, length);
fixed (char* pStr = str) {
encodedLen = Utf8Encode.GetBytes(pStr, str.Length, byteArray,
arrayLength, true);
}
char* buffer = stackalloc char[8192];
int decodedLen = Utf8Decode.GetChars(byteArray, encodedLen, buffer,
8192, true);
String res = new String(buffer, 0, decodedLen);
Assert.AreEqual(str, res);
}
I have used the code below for generating random UTF-8 character byte sequences. I can't guarantee it captures every aspect of the UTF-8 spec, but it was valuable for my testing purposes, so I'm posting it here.
private static readonly (int, int)[] HeadByteDefinitions =
{
(1 << 7, 0b0000_0000),
(1 << 5, 0b1100_0000),
(1 << 4, 0b1110_0000),
(1 << 3, 0b1111_0000)
};
static byte[] RandomUtf8Char(Random gen)
{
const int totalNumberOfUtf8Chars = (1 << 7) + (1 << 11) + (1 << 16) + (1 << 21);
int tailByteCnt;
var rnd = gen.Next(totalNumberOfUtf8Chars);
if (rnd < (1 << 7))
tailByteCnt = 0;
else if (rnd < (1 << 7) + (1 << 11))
tailByteCnt = 1;
else if (rnd < (1 << 7) + (1 << 11) + (1 << 16))
tailByteCnt = 2;
else
tailByteCnt = 3;
var (range, offset) = HeadByteDefinitions[tailByteCnt];
var headByte = Convert.ToByte(gen.Next(range) + offset);
var tailBytes = Enumerable.Range(0, tailByteCnt)
.Select(_ => Convert.ToByte(gen.Next(1 << 6) + 0b1000_0000));
return new[] {headByte}.Concat(tailBytes).ToArray();
}
I have to convert a string to bytes (16-bit) in JavaScript. I can do this in .NET with the following code, but I have to port it to an old classic ASP app which uses JavaScript.
string strShared_Key = "6fc2e550abc4ea333395346123456789";
int nLength = strShared_Key.Length;
byte[] keyMAC = new byte[nLength / 2];
for (int i = 0; i < nLength; i += 2)
keyMAC[i / 2] = Convert.ToByte(strShared_Key.Substring(i, 2), 16);
This is the JavaScript function, but it doesn't return the same output as the .NET code above.
function String2Bin16bit(inputString) {
var str = ""; // string
var arr = []; // byte array
for (var i = 0; i < inputString.length; i += 2) {
// get chunk of two characters and parse to number
arr.push(parseInt(inputString.substr(i, 2), 16));
}
return arr;
}
You want parseInt(x, 16), which reads x as a string and parses it as a number in base 16.
var str = "aabbcc"; // string
var arr = []; // byte array
for(var i = 0; i < str.length; i += 2) {
arr.push(parseInt(str.substr(i, 2), 16)); // get chunk of two characters and parse to number
}
How would I go about converting a byte array to a bit array?
The obvious way; using the constructor that takes a byte array:
BitArray bits = new BitArray(arrayOfBytes);
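Note that BitArray indexes the bits of each byte least-significant-first; a quick check (a sketch, assuming using System.Collections):

BitArray bits = new BitArray(new byte[] { 1 }); // 0b00000001
Console.WriteLine(bits[0]); // True  - index 0 is the LSB
Console.WriteLine(bits[7]); // False - index 7 is the MSB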
It depends on what you mean by "bit array"... If you mean an instance of the BitArray class, Guffa's answer should work fine.
If you actually want an array of bits, in the form of a bool[] for instance, you could do something like this:
byte[] bytes = ...
bool[] bits = bytes.SelectMany(GetBits).ToArray();
...
IEnumerable<bool> GetBits(byte b)
{
    for (int i = 0; i < 8; i++)
    {
        yield return (b & 0x80) != 0; // test the most significant bit
        b *= 2;                       // shift left by one (wraps within the byte)
    }
}
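For example (a sketch; SelectMany needs using System.Linq):

byte[] bytes = { 0x73 }; // 's'
bool[] bits = bytes.SelectMany(GetBits).ToArray();
// GetBits yields MSB-first: 0,1,1,1,0,0,1,1
Console.WriteLine(string.Join(",", bits.Select(b => b ? 1 : 0)));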
public static byte[] ToByteArray(this BitArray bits)
{
int numBytes = bits.Count / 8;
if (bits.Count % 8 != 0) numBytes++; // round up for a partial final byte
byte[] bytes = new byte[numBytes];
int byteIndex = 0, bitIndex = 0;
for (int i = 0; i < bits.Count; i++) {
if (bits[i])
bytes[byteIndex] |= (byte)(1 << (7 - bitIndex)); // pack MSB-first within each byte
bitIndex++;
if (bitIndex == 8) {
bitIndex = 0;
byteIndex++;
}
}
return bytes;
}
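A usage sketch. Note this extension packs bits MSB-first within each byte, which is the opposite of BitArray's own LSB-first layout, so round-tripping through new BitArray(byte[]) reverses the bit order inside each byte:

var bits = new BitArray(new[] { true, false, false, false, false, false, false, false });
byte[] bytes = bits.ToByteArray();
Console.WriteLine(bytes[0]); // 128 - the first bit landed in the most significant position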
You can use BitArray to create a stream of bits from a byte array. Here is an example:
string testMessage = "This is a test message";
byte[] messageBytes = Encoding.ASCII.GetBytes(testMessage);
BitArray messageBits = new BitArray(messageBytes);
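For example, dumping the first byte's bits (LSB-first; a sketch):

for (int i = 0; i < 8; i++)
    Console.Write(messageBits[i] ? 1 : 0); // 00101010 = 'T' (0x54) read LSB-first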
byte number = 128;
Convert.ToString(number, 2); // "10000000"
public static byte[] ToByteArray(bool[] bits)
{
    // packs bits LSB-first within each byte (same layout as BitArray)
    return bits
        .Select(
            (val, idx) => new { Index = idx / 8, Val = (byte)(val ? 1 << (idx % 8) : 0) }
        )
        .GroupBy(gb => gb.Index)
        .Select(g => (byte)g.Sum(s => s.Val))
        .ToArray();
}
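A usage sketch (this version packs LSB-first, matching BitArray's layout):

bool[] bits = { true, false, false, false, false, false, false, false };
byte[] bytes = ToByteArray(bits);
Console.WriteLine(bytes[0]); // 1 - the first bit is the least significant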