Converting between a bit string and a char string (ASCII) - C#

The problem is to convert non-ASCII characters into binary and vice versa.
String of bits to string of chars:
string result = "";
while (value.Length > 0)
{
    var first8 = value.Substring(0, 8);
    value = value.Substring(8);
    var number = Convert.ToInt64(first8, 2);
    result += (char)number;
}
and
String of chars to string of bits:
string S = "";
byte[] asciiBytes = Encoding.ASCII.GetBytes(value);
for (int i = 0; i < asciiBytes.Length; i++)
{
    for (int j = 0; j < 8; j++)
    {
        S += (asciiBytes[i] & 0x80) > 0 ? "1" : "0"; // test the high bit
        asciiBytes[i] <<= 1; // shift so the next bit moves into the high position
    }
}
return S;
Can you correct my code, or suggest something better?

You could split the string into 8-character chunks, convert each chunk to a byte using Convert.ToByte, and then use Encoding.ASCII.GetString to convert the byte array to a string.
var str = "0011111110110101001111110110111100111111110110110011111101101111";
var byteArray = Enumerable.Range(0, str.Length / 8)
.Select(x => Convert.ToByte(str.Substring(x * 8, 8), 2)).ToArray();
var convertedString = Encoding.ASCII.GetString(byteArray);
For an 8-bit encoding such as Windows-1252:
Encoding enc = Encoding.GetEncoding(1252);
var convertedString = enc.GetString(byteArray);
Output
?µ?o?Û?o
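For the opposite direction (string to bit string), a minimal sketch in the same LINQ style, assuming plain ASCII input:
var text = "Hi";
var bits = string.Concat(
    Encoding.ASCII.GetBytes(text)
        .Select(b => Convert.ToString(b, 2).PadLeft(8, '0')));
// bits == "0100100001101001" ("Hi")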

Related

C# - Fast Method to Convert Byte Array to Hex String

I want to convert a Byte array as fast as possible to a Hex String.
So through my previous question, I found the following code:
private static readonly uint[] _lookup32 = CreateLookup32();
private static uint[] CreateLookup32()
{
var result = new uint[256];
for (int i = 0; i < 256; i++)
{
string s = i.ToString("X2");
result[i] = ((uint)s[0]) + ((uint)s[1] << 16);
}
return result;
}
private static string ByteArrayToHexViaLookup32(byte[] bytes)
{
var lookup32 = _lookup32;
var result = new char[bytes.Length * 2];
for (int i = 0; i < bytes.Length; i++)
{
var val = lookup32[bytes[i]];
result[2 * i] = (char)val;
result[2 * i + 1] = (char)(val >> 16);
}
return new string(result);
}
This works great, but the issue is that the output string looks like this:
output: 0F42000AAD24120024
but I need it like this: 0F 42 00 0A AD 24 12 00 24
As my coding knowledge is kinda meh on "cryptic"-looking algorithms, I don't know where and how to add code so that it inserts a blank space after each byte (Hexoutputstring + " ").
I could loop through the string and add a blank space after every 2 characters, but that would hugely increase the time it takes to get a useful result, since appending strings is slow.
Could someone help me with the code above? Thank you :)
private static string ByteArrayToHexViaLookup32(byte[] bytes)
{
    var lookup32 = _lookup32;
    var byteCount = bytes.Length;
    if (byteCount == 0) return string.Empty; // avoid new char[-1] for empty input
    var result = new char[3 * byteCount - 1];
    for (int i = 0; i < byteCount; i++)
    {
        var val = lookup32[bytes[i]];
        int index = 3 * i;
        result[index] = (char)val;
        result[index + 1] = (char)(val >> 16);
        if (i < byteCount - 1) result[index + 2] = ' ';
    }
    return new string(result);
}
If performance is one of your main concerns, I would approach it something like this:
private static readonly char[] digits = new char[] { '0', '1', '2', '3', '4', '5', '6', '7', '8', '9', 'A', 'B', 'C', 'D', 'E', 'F' };
private static string ByteArrayToHexViaLookup32(byte[] bytes)
{
char[] buffer = new char[bytes.Length * 3];
int index = 0;
for (int i = 0; i < bytes.Length; i++)
{
if (index > 0)
buffer[index++] = ' ';
buffer[index++] = digits[(bytes[i] >> 4) & 0xf];
buffer[index++] = digits[bytes[i] & 0xf];
}
return new string(buffer, 0, index);
}
The following version doesn't require any lookup array, but I'm not sure if it's as fast.
private static string ByteArrayToHexViaLookup32(byte[] bytes)
{
char[] buffer = new char[bytes.Length * 3];
int index = 0;
for (int i = 0; i < bytes.Length; i++)
{
if (index > 0)
buffer[index++] = ' ';
buffer[index++] = GetDigit((bytes[i] >> 4) & 0xf);
buffer[index++] = GetDigit(bytes[i] & 0xf);
}
return new string(buffer, 0, index);
}
private static char GetDigit(int value) // must be static to be called from the static method above
{
    // '0' + 0..9 -> '0'..'9'; '7' + 10..15 -> 'A'..'F'
    if (value < 10)
        return (char)('0' + value);
    return (char)('7' + value);
}
Both versions insert a space between bytes.
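For example (the byte values are just an example):
Console.WriteLine(ByteArrayToHexViaLookup32(new byte[] { 0x0F, 0x42, 0x00, 0x0A }));
// prints: 0F 42 00 0A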
private static string ByteArrayToStringHex(byte[] bytes)
{
string hexValue = BitConverter.ToString(bytes);
hexValue = hexValue.Replace("-", " ");
return hexValue;
}
I think this gives you the same values you want.
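For reference, BitConverter.ToString returns dash-separated hex pairs before the Replace:
Console.WriteLine(BitConverter.ToString(new byte[] { 0x0F, 0x42, 0x00 })); // 0F-42-00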

convert string to int32 with LSB c#

I have a binary string and an Int32 array.
How do I convert each 11 bits of the string to an Int32 value (the 11 LSBs) in the int array?
I tried this:
for (int i = 0; i < str.Length; i++) {
    if (count1 >= str.Length - 21)
        break;
    else
        str = str.Insert(count1, "000000000000000000000");
    count1 += 31;
}
int numOfBytes = str.Length / 32;
int[] ints = new int[numOfBytes];
for (int i = 0; i < numOfBytes; ++i) {
    ints[i] = Convert.ToInt32(str.Substring(32 * i, 32), 2);
}
but it returns wrong values.
Int32[] BinaryStringToInt32Array(string binaryString, int bitCount) // C# has no const parameters
{
Int32[] results = new Int32[binaryString.Length/bitCount];
for (int i = 0; i < results.Length; i++)
{
string str = binaryString.Substring(i * bitCount, bitCount);
results[i] = Convert.ToInt32(str, 2);
}
return results;
}
Note that this function will ignore any leftover bits if binaryString's length is not a multiple of bitCount. In your case, bitCount is 11.
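A quick usage sketch (the bit string is just an example containing two 11-bit values):
var values = BinaryStringToInt32Array("0000000010100000001111", 11);
// values[0] == 5, values[1] == 15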

C# Function to translate Binary-Code

Right now I am trying to write a C# program to translate 8-bit binary into text.
But I guess I am not experienced enough with C# to truly make it work.
I think the code I came up with should, from a logical point of view, somewhat do what I want, but I don't know the syntax well enough to get it right.
This is what I have so far:
using System;
using System.Linq;
using System.Text;
class binaryTranslate
{
public int convertBin(string CodeInput)
{
int [] code = CodeInput.ToArray();
int CodeCount = code.ToString().Length;
int EightBaseSegAmount = CodeCount / 8;
int ByteCapacity = 8;
StringBuilder translated = new StringBuilder();
for (var i = 1; i < EightBaseSegAmount + 1; i++)
{
StringBuilder Byte = new StringBuilder(ByteCapacity);
int ByteStart = (i * 8) - 8;
int ByteEnd = (i * 8) - 1;
int ByteIncrement = 1;
for (var j = ByteStart ; j < ByteEnd + 1; j++)
{
Byte.Append(code[j]);
}
for (var k = 0; k > 7; k++)
{
int BitValue = 128;
if (Byte[k] == 1)
{
if (k > 0)
{
int Squared = Math.Pow(2, k);
ByteIncrement += BitValue / Squared;
}
else
{
ByteIncrement += BitValue;
}
}
}
char toSymbol = Convert.ToChar(ByteIncrement);
translated.Append(toSymbol);
}
return translated;
}
public static int Main()
{
convertBin("010010000110000101101100011011000110111100100001");
}
}
First of all, your code won't compile. Here are the errors/mistakes.
The first one: at the first line of your function, you convert the input string to an array using String.ToArray(), which returns a char[], but you try to assign it to a variable (code) typed int[]. You can solve this by replacing the int[] with either char[] or var.
The second one: inside the second for loop (k = 0; k > 7), you use Math.Pow() and assign its return value to an int variable (Squared). But Math.Pow returns double. You can solve this by casting the return value of Math.Pow to int, like: int Squared = (int)Math.Pow(2, k);
The last one is not as easily solvable as the first two, because your code is not exactly correct. You are trying to return something called translated, which is a variable of type StringBuilder, but your function is defined to return an int.
Those were the compile errors. There are also a bunch of logical and design errors/mistakes; your algorithm isn't quite correct either.
Here is a sample code you can use/examine. I'd be happy to explain further why your code was incorrect and what the design mistakes were, if you want.
class binaryTranslate
{
public enum IncompleteSegmentBehavior
{
Skip = 0,
ZerosToStart = 1,
ZerosToEnd = 2
}
private byte ConvertBinstrToByte(string sequence)
{
if (string.IsNullOrEmpty(sequence))
return 0; // Throw?
if (sequence.Length != sizeof(byte) * 8)
return 0; // Throw?
const char zero = '0';
const char one = '1';
byte value = 0;
for (int i = 0; i < sequence.Length; i++)
{
if (sequence[i] != zero && sequence[i] != one)
return 0; // Throw
value |= (byte)((sequence[i] - zero) << (7 - i));
}
return value;
}
private string HandleIncompleteSegment(string segment, int segmentSize, IncompleteSegmentBehavior behavior)
{
string result = null;
var zeroAppender = new StringBuilder();
for (int i = 0; i < segmentSize - segment.Length; i++)
zeroAppender.Append('0');
var zeros = zeroAppender.ToString();
switch (behavior)
{
case IncompleteSegmentBehavior.Skip:
break;
case IncompleteSegmentBehavior.ZerosToStart:
result = zeros + segment; // keep the actual bits, padded at the start
break;
case IncompleteSegmentBehavior.ZerosToEnd:
result = segment + zeros; // keep the actual bits, padded at the end
break;
default:
break;
}
return result;
}
public byte[] ConvertBinstrToBytes(string binarySequence, IncompleteSegmentBehavior behavior = IncompleteSegmentBehavior.Skip)
{
var segmentSize = sizeof(byte) * 8;
var sequenceLength = binarySequence.Length;
var numberOfBytes = (int)Math.Ceiling((double)sequenceLength / segmentSize);
var bytes = new byte[numberOfBytes];
for (int i = 0; i < numberOfBytes; i++)
{
var charactersLeft = sequenceLength - i * segmentSize;
var segmentLength = (charactersLeft < segmentSize ? charactersLeft : segmentSize);
var segment = binarySequence.Substring(i * segmentSize, segmentLength);
if (charactersLeft < segmentSize)
{
segment = HandleIncompleteSegment(segment, segmentSize, behavior);
if (segment == null)
continue;
}
bytes[i] = ConvertBinstrToByte(segment);
}
return bytes;
}
}
This code passes these assertions.
var bytes = new binaryTranslate()
.ConvertBinstrToBytes("00000000");
Assert.Equal(bytes.Length, 1);
Assert.Equal(bytes[0], 0b00000000);
bytes = new binaryTranslate()
.ConvertBinstrToBytes("10000000");
Assert.Equal(bytes.Length, 1);
Assert.Equal(bytes[0], 0b10000000);
bytes = new binaryTranslate()
.ConvertBinstrToBytes("11111111");
Assert.Equal(bytes.Length, 1);
Assert.Equal(bytes[0], 0b11111111);
bytes = new binaryTranslate()
.ConvertBinstrToBytes("00000001");
Assert.Equal(bytes.Length, 1);
Assert.Equal(bytes[0], 0b00000001);
bytes = new binaryTranslate()
.ConvertBinstrToBytes("1100110000110011");
Assert.Equal(bytes.Length, 2);
Assert.Equal(bytes[0], 0b11001100);
Assert.Equal(bytes[1], 0b00110011);
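To get from the bytes back to text (the original goal), something like this should work:
var helloBytes = new binaryTranslate()
    .ConvertBinstrToBytes("0100100001100001011011000110110001101111");
var text = Encoding.ASCII.GetString(helloBytes); // "Hallo"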
If you are really converting to a string, the code should look like this:
using System.Linq; // needed for .Select

namespace binaryTranslate
{
class Program
{
static void Main(string[] args)
{
//convertBin("01001000 01100001 01101100 01101100 01101111 00100001");
string results = BinaryTranslate.convertBin(new byte[] { 0x48, 0x61, 0x6c, 0x6c, 0x6f, 0x21 });
}
}
public class BinaryTranslate
{
public static string convertBin(byte[] CodeInput)
{
return string.Join("", CodeInput.Select(x => x.ToString("X2")));
}
}
}
This should do the trick:
public static string FromBinary(string binary)
{
    int WordLength = 8;
    binary = binary.Replace(" ", ""); // note: Replace(' ', '') is not valid C#; there is no empty char literal
    while (binary.Length % WordLength != 0)
        binary += "0";
    string output = String.Empty;
    int offset = 0;
    while (offset < binary.Length)
    {
        int tmp = 0;
        string word = binary.Substring(offset, WordLength);
        // most significant bit first
        for (int i = 0; i < WordLength; i++)
            if (word[i] == '1')
                tmp += (int)Math.Pow(2, WordLength - 1 - i);
        output += Convert.ToChar(tmp);
        offset += WordLength;
    }
    return output;
}

convert UTF-8 HEX into emoji representation

How do we convert the UTF-8 hex bytes EE 94 93 into the equivalent emoji representation 1f1e8, 1f1f3,
taken from here: http://www.iemoji.com/view/emoji/175/places/regional-indicator-symbol-letters-cn
public static string ConvertEmoji2UnicodeHex(string emoji)
{
if (string.IsNullOrWhiteSpace(emoji))
return emoji;
byte[] bytes = Encoding.UTF8.GetBytes(emoji);
string firstItem = Convert.ToString(bytes[0], 2);
int iv;
if (bytes.Length == 1)
{
iv = Convert.ToInt32(firstItem, 2);
}
else
{
StringBuilder sbBinary = new StringBuilder();
sbBinary.Append(firstItem.Substring(bytes.Length + 1).TrimStart('0'));
for (int i = 1; i < bytes.Length; i++)
{
string item = Convert.ToString(bytes[i], 2);
item = item.Substring(2);
sbBinary.Append(item);
}
iv = Convert.ToInt32(sbBinary.ToString(), 2);
}
return Convert.ToString(iv, 16).PadLeft(4, '0');
}
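A simpler alternative, if you just need the code points of a UTF-8 byte sequence, is to decode the bytes and walk the string with char.ConvertToUtf32 (a sketch; these example bytes encode U+1F1E8 U+1F1F3):
byte[] utf8 = { 0xF0, 0x9F, 0x87, 0xA8, 0xF0, 0x9F, 0x87, 0xB3 };
string s = Encoding.UTF8.GetString(utf8);
// step by 2 over surrogate pairs (code points above U+FFFF), by 1 otherwise
for (int i = 0; i < s.Length; i += char.IsSurrogatePair(s, i) ? 2 : 1)
    Console.WriteLine(char.ConvertToUtf32(s, i).ToString("x")); // 1f1e8, then 1f1f3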

encryption, String to Byte conversion

I have a string and I converted it to hex values, and I want to store them in a byte array, but it gives me the error "Input string was not in a correct format". Here is my code:
byte[] PlainText = new byte[16];
byte[] MasterKey = new byte[16];
string input = "Hello";
char[] values = input.ToCharArray();
int i =0;
foreach (char letter in values)
{
int value = Convert.ToInt32(letter);
string hexout = String.Format("{0:X}", value);
PlainText[i++] = Convert.ToByte(hexout);
}
Change your initial code to:
byte[] PlainText = new byte[16];
byte[] MasterKey = new byte[16];
string input = "Hello";
char[] values = input.ToCharArray();
string hexout = string.Empty;
foreach (char letter in values)
{
    int value = Convert.ToInt32(letter);
    hexout += String.Format("{0:X2}", value); // X2 keeps every byte two hex digits wide
}
PlainText = StringToByteArray(hexout);
For converting hex to a byte array:
public static byte[] StringToByteArray(String hex)
{
int NumberChars = hex.Length;
byte[] bytes = new byte[NumberChars / 2];
for (int i = 0; i < NumberChars; i += 2)
bytes[i / 2] = Convert.ToByte(hex.Substring(i, 2), 16);
return bytes;
}
Or, for parsing a long string:
public static byte[] StringToByteArray(String hex)
{
int NumberChars = hex.Length/2;
byte[] bytes = new byte[NumberChars];
using (var sr = new StringReader(hex))
{
for (int i = 0; i < NumberChars; i++)
bytes[i] =
Convert.ToByte(new string(new char[2]{(char)sr.Read(), (char)sr.Read()}), 16);
}
return bytes;
}
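A quick round trip with the helper above ("48656C6C6F" is hex for "Hello"):
byte[] bytes = StringToByteArray("48656C6C6F");
string text = Encoding.ASCII.GetString(bytes); // "Hello"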
Or just get the bytes of the string directly:
var bytes = System.Text.Encoding.UTF8.GetBytes(yourString);
