Append byte array to StringBuilder - C#

I am trying to append a byte array to a StringBuilder, but I am not getting the expected result.
I have to return it to the web service in the form of a byte array.
See my code below:
WriteableBitmap wbitmp = new WriteableBitmap((BitmapImage)image1.Source);
MemoryStream ms = new MemoryStream();
wbitmp.SaveJpeg(ms, 400, 400, 0, 100);
byte[] bytearray = ms.ToArray();
Now I am appending the byte array to the StringBuilder:
StringBuilder stb1 = new StringBuilder();
stb1.Append("<image>");
for (int i = 0; i < bytearray.Length; i++)
{
stb1.Append(bytearray[i]);
}
stb1.Append("</image>");
The XML string is built like this:
StringBuilder stb1 = new StringBuilder();
stb1.Append("<Title>");
stb1.Append("</Title>");
stb1.Append("<image>");
stb1.Append("</image>");
Thanks in advance!!

With so little information it's hard to give a useful response. It looks like you're trying to push some image in an HTML/XML-like format. In that case you should probably just convert your binary data into a Base64 string:
var byteString = System.Convert.ToBase64String(byteArray);
But if your web service requires the raw byte array, then you shouldn't be using a StringBuilder in the first place.
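For illustration, a minimal sketch of the Base64 route using the question's own element names (assuming bytearray holds the JPEG bytes from the snippet above):
StringBuilder stb1 = new StringBuilder();
stb1.Append("<image>");
stb1.Append(Convert.ToBase64String(bytearray)); // text-safe representation of the JPEG bytes
stb1.Append("</image>");
On the receiving side the original bytes come back with Convert.FromBase64String.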

Related

Converting EBCDIC to ASCII in C#

I've been trying to convert a string containing EBCDIC characters to ASCII; this is my code so far:
string data = "F2F1F0F2F2F5F4";
Encoding ascii = Encoding.ASCII;
Encoding ebcdic = Encoding.GetEncoding("IBM037");
byte[] ebcdicData = ebcdic.GetBytes(data);
// Convert to ASCII
byte[] ebcdicDataConverted = Encoding.Convert(ebcdic, ascii, ebcdicData);
string sample = ascii.GetString(ebcdicDataConverted);
But I was expecting the variable sample to contain this value: 2102254
Instead, it shows the same value as data: F2F1F0F2F2F5F4
Maybe I'm not understanding how this works, or I'm just burnt out, but this page contains the conversion table that:
translates 8-bit EBCDIC characters to 7-bit ASCII
Is the Encoding that I'm using the right one? Am I doing something wrong?
Thanks
I converted the string to bytes using a for loop like this:
byte[] ebcdicData = new byte[data.Length / 2];
for (int i = 0; i < ebcdicData.Length; i++)
{
string chunk = data.Substring(i * 2, 2);
ebcdicData[i] = Convert.ToByte(chunk, 16);
}
This way it worked. GetBytes() does something different: it encodes each character of the string ('F', '2', and so on) into its EBCDIC byte value rather than parsing the string as hexadecimal, which is why sample came back identical to data.
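Putting the pieces together, a minimal sketch of the full conversion (one caveat: on .NET Core/.NET 5+, code-page encodings such as IBM037 need the System.Text.Encoding.CodePages package and a provider registration; on .NET Framework they are built in):
// Encoding.RegisterProvider(CodePagesEncodingProvider.Instance); // .NET Core/5+ only
string data = "F2F1F0F2F2F5F4";

// Parse the hex string into the raw EBCDIC bytes.
byte[] ebcdicData = new byte[data.Length / 2];
for (int i = 0; i < ebcdicData.Length; i++)
{
    ebcdicData[i] = Convert.ToByte(data.Substring(i * 2, 2), 16);
}

// Decode the EBCDIC bytes straight to a .NET string; no Encoding.Convert needed.
Encoding ebcdic = Encoding.GetEncoding("IBM037");
string sample = ebcdic.GetString(ebcdicData); // "2102254"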

Binary data being mangled when transferred in a string between functions

Hope this is a trivial problem that can be solved easily.
I'm trying to move the contents of a binary file from one location to another, but with a twist: I need to transfer it as a string, and this is where the file ends up a bit different from the source.
The reason for transferring it via a string is that the code that loads a file and the code that saves a file only communicate through a host (this is a C# MEF application), and the interface forces me to send data via a string, nothing else.
So what I'm doing is this (pseudo'ish, only core functionality remains):
// This part loads a binary file
string output = string.Empty; // The data to be transfered
byte[] fileContent; // The binary file content
fileContent = File.ReadAllBytes(fileName);
output = Encoding.Default.GetString(fileContent);
//output = Convert.ToBase64String(fileContent);
//output = Encoding.UTF7.GetString(fileContent);
//output = Encoding.UTF8.GetString(fileContent);
//output = Encoding.UTF32.GetString(fileContent);
//output = Encoding.ASCII.GetString(fileContent);
//output = Encoding.BigEndianUnicode.GetString(fileContent);
//output = Encoding.Unicode.GetString(fileContent);
Then the string is transferred to its destination part:
// This part saves a binary file
string input; // This is the data received
byte[] content = Encoding.Unicode.GetBytes(input);
File.WriteAllBytes(@"c:\test.png", content);
The destination file now differs a bit from the source file (a byte here and there, if I look at the files with an appropriate tool). The encoding in the sending part that works best is Unicode.
What am I missing here?
As was said in the comments, the safest option is using Base-64. But if you want a little more efficiency, any simple 8-bit encoding without gaps should work, as long as you use the same encoding to decode it. By simple I mean not any of the Unicode multi-byte encodings. And I believe ASCII also won't work, since it's 7-bit.
A note on efficiency: each byte is actually being stored in 2 bytes, since strings in C# are stored internally as UTF-16. But with Base-64 you are using 8 bytes for every 3 bytes of the binary.
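For reference, a minimal sketch of the Base-64 route, reusing the variable names from the question:
// Sending side: encode the raw file bytes as a Base-64 string.
byte[] fileContent = File.ReadAllBytes(fileName);
string output = Convert.ToBase64String(fileContent);

// Receiving side: decode the string back to the exact original bytes.
byte[] content = Convert.FromBase64String(input);
File.WriteAllBytes(@"c:\test.png", content);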
I tried using Encoding.GetEncoding(437) and it worked on a local system:
// Build a buffer containing every possible byte value.
var b = new byte[256];
for (int i = 0; i < 256; i++)
    b[i] = (byte)i;

// Round-trip all 256 values through code page 437.
var encoding = System.Text.Encoding.GetEncoding(437);
var s = encoding.GetString(b);
var b2 = encoding.GetBytes(s);

// Verify that every byte survived the round trip.
for (int i = 0; i < 256; i++)
    if (b2[i] != i)
        Console.WriteLine("Error at " + i);
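One caveat: on .NET Core/.NET 5+ (as opposed to .NET Framework), code-page encodings such as 437 are not available until the System.Text.Encoding.CodePages package is referenced and its provider is registered:
// Required once at startup before Encoding.GetEncoding(437) can resolve.
Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);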

Problems with writing the byte format of string data to a text file in C#

I have a text file stored locally. I want to store string data in binary format there and then retrieve the data again. In the following code snippet, I have done the conversion.
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
class ConsoleApplication
{
const string fileName = "AppSettings.dat";
static void Main()
{
string someText = "settings";
byte[] byteArray = Encoding.UTF8.GetBytes(someText);
int byteArrayLength = byteArray.Length;
using (BinaryWriter writer = new BinaryWriter(File.Open(fileName, FileMode.Create)))
{
writer.Write(someText);
}
byte[] x = new byte[byteArrayLength];
if (File.Exists(fileName))
{
using (BinaryReader reader = new BinaryReader(File.Open(fileName, FileMode.Open)))
{
x = reader.ReadBytes(byteArrayLength);
}
string str = Encoding.UTF8.GetString(x);
Console.Write(str);
Console.ReadKey();
}
}
}
In the AppSettings.dat file the bytes are written in the following way
But when I assign some random values to a byte array and save it to a file using BinaryWriter, as I have done in the following code snippet,
const string fileName = "AppSettings.dat";
static void Main()
{
byte[] array = new byte[8];
Random random = new Random();
random.NextBytes(array);
using (BinaryWriter writer = new BinaryWriter(File.Open(fileName, FileMode.Create)))
{
writer.Write(array);
}
}
it actually saves the data in binary format in the text file, as shown in the picture.
I don't understand why, in the first case, the byte data converted from the string shows up in human-readable format, when I want the data saved in non-readable byte format as in the later case. What's the explanation for this?
Is there any way I can store string data in binary format without resorting to brute force?
FYI - I don't want to keep the data in Base64 string format; I want it to be in binary format.
If security isn't a concern, and you just don't want the average user to find your data while meddling in the settings files, a simple XOR will do:
const string fileName = "AppSettings.dat";
static void Main()
{
string someText = "settings";
byte[] byteArray = Encoding.UTF8.GetBytes(someText);
for (int i = 0; i < byteArray.Length; i++)
{
byteArray[i] ^= 255;
}
File.WriteAllBytes(fileName, byteArray);
if (File.Exists(fileName))
{
var x = File.ReadAllBytes(fileName);
for (int i = 0; i < byteArray.Length; i++)
{
x[i] ^= 255;
}
string str = Encoding.UTF8.GetString(x);
Console.Write(str);
Console.ReadKey();
}
}
It takes advantage of an interesting property of character encodings:
In ASCII, the 0-127 range contains the most used characters (a to z, 0 to 9), and in the 8-bit code pages the 128-255 range contains only special symbols and accents.
For compatibility reasons, in UTF-8 the 0-127 range contains the same characters as ASCII, and bytes in the 128-255 range have a special meaning (they tell the decoder that the characters are encoded into multiple bytes).
All I do is flip the high bit of each byte. Therefore, everything in the 0-127 range ends up in the 128-255 range, and vice-versa. Thanks to the property I described, no matter whether the text reader tries to parse it as ASCII or as UTF-8, it will only get gibberish.
Please note that, while it doesn't produce human-readable content, it isn't secure at all. Don't use it to store sensitive data.
Notepad just reads your binary data and interprets it as UTF8 text.
This code snippet would give you the same result:
byte[] randomBytes = new byte[20];
Random rand = new Random();
rand.NextBytes(randomBytes);
Console.WriteLine(Encoding.UTF8.GetString(randomBytes));
If you want to stop people from converting your data back to a string, then you need to encrypt it. Here is a project that can help you with that.
But they would still be able to read the data in a text editor, because it converts your encrypted data to UTF8; they just can't convert it back to usable data unless they have the key to decrypt it.
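A side note on the first snippet in the question: writer.Write(someText) does not write the UTF-8 bytes alone, because BinaryWriter.Write(string) first emits a length prefix. If the intent is to store only the converted bytes, writing the array itself is closer to the mark; a small sketch reusing the question's variables:
using (BinaryWriter writer = new BinaryWriter(File.Open(fileName, FileMode.Create)))
{
    // Write(byte[]) stores just the raw bytes; Write(string) would first
    // emit a 7-bit-encoded length prefix, then the UTF-8 bytes.
    writer.Write(byteArray);
}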

Correct Encoding style for convert bytes[] to string

I am trying to convert a byte array to a string; all I need is to simply convert the numbers in the byte array to a string, e.g. "12459865..."
I am trying to do this with this:
fileInString = Encoding.UTF8.GetString(fileInBytes, 0, fileInBytes.Length);
fileInBytes looks like this: 1212549878563212450045....
But the resultant fileInString looks like this: ID3\0\0\0\0JTENC\0\0\0#\0\0WXXX\0... with a lot of weird characters.
I tried different Encoding styles, including the default, but they all insert some weird characters.
The only option I have left is to loop and cast each member into the string:
while (currbyte != -1)
{
currbyte = fileStream.ReadByte();
//fileInBytes[i++] = (byte)currbyte;
fileInString += currbyte.ToString();
progressBar1.Value = i++;
}
But this is TOO slow. Please tell me how I can convert my byte array into a string using Encoding.....GetString
If you want to encode a byte[] as a string, base64 encoding with Convert's ToBase64String and FromBase64String methods is a good way to do this:
string fileInString = Convert.ToBase64String(fileInBytes);
byte[] asBytesAgain = Convert.FromBase64String(fileInString);
Your encoding using fileInString += currbyte.ToString(); appears to be an ambiguous base 10 encoding. E.g. from what I can tell, the arrays with values { 1, 10, 255 } and { 110, 255 } would be encoded the same: "110255", and so cannot be unambiguously changed back into a byte[].
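If a digit-style dump is genuinely required and must be reversible, a fixed-width rendering avoids that ambiguity. A sketch using two-digit hex (my suggestion, not something from the question):
// Each byte becomes exactly two hex characters, so the mapping is reversible.
string hex = BitConverter.ToString(fileInBytes).Replace("-", "");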
There is no encoding that can achieve what you want.
You need to loop through the bytes as you do.
To make it faster, use a StringBuilder and don't update the ProgressBar too often:
var builder = new StringBuilder();
for (var i = 0; i < fileStream.Length; i++)
{
builder.Append(fileStream.ReadByte().ToString());
if (i % 1000 == 0) //update the ProgressBar every 1000 bytes for example
{
progressBar.Value = i;
}
}
var result = builder.ToString();
Well, since you are referring to a byte ARRAY, fileInBytes really can't look like '1212549878563212450045', but rather like this:
{byte[22]}
[0]: 1
[1]: 2
etc.
If that is indeed the case, you should be able to build fileInString using a StringBuilder as follows:
StringBuilder sb = new StringBuilder();
foreach (var b in fileInBytes)
{
sb.Append(b.ToString());
}
fileInString = sb.ToString();
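On .NET 4 and later, the same result is also available as a one-liner (a shorthand, assuming the same fileInBytes array):
// string.Concat enumerates the array and calls ToString() on each byte.
fileInString = string.Concat(fileInBytes);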

Encrypt Base64+XOR+UTF8 in JAVA, decrypt in C#

I have a simple encrypt function which takes a string, converts it to bytes, XORs them and applies Base64.
JAVA:
String key = "1234";
public String encrypt(String plainText) throws Exception {
byte [] input = plainText.getBytes("UTF8");
byte [] output = new byte[input.length];
for(int i=0; i<input.length; i++)
output[i] = ((byte)(input[i] ^ key.charAt(i % key.length())));
String utf8 = new String(output);
return Utils.encode(utf8);
}
Then I save it to a file and open it in another application in C# using this decrypting method:
C#:
string key="1234";
public string Decrypt(string CipherText)
{
var decoded = System.Convert.FromBase64String(CipherText);
var dexored = xor(decoded, key);
return Encoding.UTF8.GetString(dexored);
}
byte[] xor(byte[] text, string key)
{
byte[] res = new byte[text.Length];
for (int c = 0; c < text.Length; c++)
{
res[c] = (byte)((uint)text[c] ^ (uint)key[c % key.Length]);
}
return res;
}
The problem is that accented characters like ěščřžýáí fail to decode.
Do you have any idea how to determine which part the problem comes from, or how to track it down? It looks to me like it has something to do with UTF-8.
I don't need suggestions for a better encryption. I already have working AES, but I want to switch to XORed Base64 due to performance issues.
Thank you.
This is the problem:
String utf8 = new String(output);
return Utils.encode(utf8);
You should be using base64 here on output, rather than constructing a string out of the now-not-really-text data. It's possible that Utils.encode performs base64 encoding, having converted the input string back to bytes for some reason, but fundamentally you shouldn't be constructing a string from your encrypted bytes with the String constructor.
If your Utils class has an encode(byte[]) method, and if that really does base64-encode the data (it's very frustrating to only have half of the code you're using), then you can just use:
return Utils.encode(output);
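To make the round trip concrete, here is a hypothetical C# counterpart of the corrected encrypt side (assuming Utils.encode is plain Base64; the method below is mine, not from the question), matching the Decrypt method above:
string Encrypt(string plainText)
{
    string key = "1234";
    byte[] input = Encoding.UTF8.GetBytes(plainText);

    // XOR the UTF-8 bytes with the key...
    for (int i = 0; i < input.Length; i++)
    {
        input[i] ^= (byte)key[i % key.Length];
    }

    // ...then Base64-encode the raw bytes directly; no intermediate
    // string is ever built from the XORed data.
    return Convert.ToBase64String(input);
}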
