Converting content in a text file from a Base64 string - C#

I am practicing encryption and decryption.
After I have encrypted some data, I convert the bytes to a Base64 string and store it in a text file.
Some time later I want to decrypt it again, but for that to work I have to convert the content from a Base64 string back to bytes.
I tried this:
string path = @"C:\encrypt.txt";
string myfile = File.ReadAllText(path);
byte[] convertion = Convert.FromBase64String(myfile);
That gives me an error because the text is actually not a Base64 string.
Is there any way to do the conversion?

You can use the following functions for saving and reading Base64 text:
public static void WriteAllBase64Text(string path, string text)
{
    File.WriteAllText(path, Convert.ToBase64String(Encoding.UTF8.GetBytes(text)));
}

public static string ReadAllBase64Text(string path)
{
    var base64 = File.ReadAllText(path);
    var decoded = Convert.FromBase64String(base64);
    return Encoding.UTF8.GetString(decoded);
}

Related

How to convert files to base 64

I am new to C# and I'm really having trouble finding a way to convert files (doc, xlsx, jpg, png, etc.) to Base64.
So far, what I have is a way to retrieve the paths using Directory.GetFiles()... but this is not the result I was expecting.
What I expected is to get the data of the files (not the paths) and convert it to Base64 so that I can display it on my front-end.
Any idea?
Try this:
foreach (string filePath in Directory.GetFiles(@"C:\directory"))
{
    byte[] bytes = File.ReadAllBytes(filePath);
    string file = Convert.ToBase64String(bytes);
}
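As written, the loop overwrites file on every pass, so only the last file's data survives. If the goal is to hand every file's data to the front-end, one option (just a sketch; the folder path and the dictionary shape are assumptions, not part of the question) is to collect the results keyed by file name:
using System;
using System.Collections.Generic;
using System.IO;

class Base64FileReader
{
    // Returns a map of file name -> Base64-encoded contents for every file in the folder.
    public static Dictionary<string, string> ReadAllAsBase64(string folder)
    {
        var result = new Dictionary<string, string>();
        foreach (string filePath in Directory.GetFiles(folder))
        {
            byte[] bytes = File.ReadAllBytes(filePath);
            result[Path.GetFileName(filePath)] = Convert.ToBase64String(bytes);
        }
        return result;
    }
}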

Saving file base64 string in SQL Server database

I'm working on a file upload. On upload, the file is converted to a Base64 string, and this string is sent to an API that saves it in a SQL Server database table.
The type of the Base64 string column in the table is NVARCHAR(MAX), but I noticed that when the file is saved, SQL Server truncates the string: when you decode it with an online Base64-to-image decoder, it produces only part of the whole picture. The Base64 string is about 72,000 characters long before saving, but after saving it is reduced to about 4,000 characters, which is why the decoded image is incomplete.
Why does SQL Server truncate the Base64 string, and what can I do about it?
Here is my base64 conversion code:
public async Task<IActionResult> UploadFile()
{
    string base64string;
    var myFile = Request.Form.Files["claimFile"];
    var filetype = myFile.ContentType;
    var filepath = Path.GetTempFileName();
    using (var stream = System.IO.File.Create(filepath))
    {
        await myFile.CopyToAsync(stream);
    }
    byte[] imageByte = System.IO.File.ReadAllBytes(filepath);
    base64string = Convert.ToBase64String(imageByte);
    HttpContext.Session.SetString("Base64Str", base64string);
    return new EmptyResult();
}
I suspect the problem might be that, by specifying NVARCHAR (16-bit, i.e. 2-byte, characters), you're inadvertently "corrupting" the string.
Two suggestions:
Redefine the column as VARCHAR(MAX).
Save the encoded string, then read the text back and check whether the saved and retrieved values match.
Look here:
https://dba.stackexchange.com/questions/212160/why-do-i-get-incorrect-characters-when-decoding-a-base64-string-to-nvarchar-in-s
Please post back what you find!
Also - out of curiosity - how are you doing the Base64 encoding in the first place?
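Since the saving code isn't shown, one more thing worth ruling out (this is an assumption on my part, not something from the question) is the ADO.NET parameter size: a parameter sized as NVARCHAR(4000) will silently truncate to roughly the length you're seeing even when the column itself is NVARCHAR(MAX). A minimal sketch of passing the string with an explicit MAX-sized parameter (the table and column names are made up):
using System.Data;
using System.Data.SqlClient;

public static void SaveBase64(string connectionString, string base64)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "INSERT INTO dbo.Claims (FileData) VALUES (@fileData)", conn)) // hypothetical table/column
    {
        // Size = -1 maps the parameter to NVARCHAR(MAX), avoiding 4,000-character truncation.
        cmd.Parameters.Add("@fileData", SqlDbType.NVarChar, -1).Value = base64;
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}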
I read from and write to the database just fine, but to an ntext column type, with the following; you can convert from VB.NET to C# if needed.
#Region "write/read files to db"
    Public Function SaveToDB(ByVal fullfilename As String) As String
        Dim filedata = Convert.ToBase64String(File.ReadAllBytes(fullfilename))
        Return filedata
    End Function

    Public Function LoadFromDB(ByVal filedata As String, ByVal fullfilename As String) As String
        Dim filepath As String = fullfilename
        File.WriteAllBytes(filepath, Convert.FromBase64String(filedata))
        Return filepath
    End Function
#End Region
C# version:
class SurroundingClass
{
    public string SaveToDB(string fullfilename)
    {
        var filedata = Convert.ToBase64String(File.ReadAllBytes(fullfilename));
        return filedata;
    }

    public string LoadFromDB(string filedata, string fullfilename)
    {
        string filepath = fullfilename;
        File.WriteAllBytes(filepath, Convert.FromBase64String(filedata));
        return filepath;
    }
}
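A short usage sketch of these two helpers (the paths are placeholders; the actual database write and read of the string is omitted here, just as it is in the answer):
var helper = new SurroundingClass();

// Encode a file to a Base64 string (this is the value that would be stored in the column).
string base64 = helper.SaveToDB(@"C:\temp\input.png");              // example path

// Later, turn the stored string back into a file on disk.
string restored = helper.LoadFromDB(base64, @"C:\temp\output.png"); // example path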

What can cause Base64 decoding to throw a FormatException

I am using C# and .NET to encode and decode base64 string. The following are snippets of my code:
Base64 encoding:
using (var stream = new MemoryStream())
{
    // …
    return Convert.ToBase64String(stream.ToArray());
}
Base64 decoding
byte[] bytes = Convert.FromBase64String(messageBody);
My code fails 99% of the time, with only a 1% chance of succeeding. The stack trace is as follows:
5xx Error Returned: System.FormatException: The input is not a valid Base-64 string as it contains a non-base 64 character, more than two padding characters, or an illegal character among the padding characters.
at System.Convert.FromBase64_ComputeResultLength(Char* inputPtr, Int32 inputLength)
at System.Convert.FromBase64CharPtr(Char* inputPtr, Int32 inputLength)
at System.Convert.FromBase64String(String s)
Does anyone know what can cause Base64 decoding to fail? My encoding and decoding methods are symmetric, so I am really confused about what the root cause of this issue could be.
Thanks for all your replies.
It turned out there were still some old messages in JSON format that had previously failed to be delivered and kept retrying in our system; meanwhile, a new code change was deployed on our receiving side, which now expects messages in protobuf format, resulting in a deserialization failure whenever one of the old JSON messages arrives.
In order to debug an issue like this I usually write some tests or create a console app to watch the variables as they change from function to function.
One of the possible scenarios for Base64 decoding to fail is when the decoder input has been URL-encoded. This is common when you pass an encrypted string in a URL, for example: it is automatically URL-encoded, and depending on which characters the encoded output contains, it sometimes can and sometimes cannot be decoded afterwards.
Here's a simple console app to demonstrate this.
using System;
using System.IO;
using System.Net;
using System.Text;

class Program
{
    static void Main(string[] args)
    {
        TestEncodeDecode("test");
        TestEncodeDecode("testa");
        TestEncodeDecode("testaa");
        Console.ReadLine();
    }

    private static void TestEncodeDecode(string input)
    {
        string encoded = Encode(input);
        Console.WriteLine($"Encoded: {encoded}");

        string htmlEncoded = WebUtility.UrlEncode(encoded);
        Console.WriteLine($"htmlEncoded: {htmlEncoded}");

        string decodedString = Decode(htmlEncoded);
        Console.WriteLine($"Decoded: {decodedString}");
        Console.WriteLine();
    }

    private static string Decode(string htmlEncoded)
    {
        try
        {
            byte[] decoded = Convert.FromBase64String(htmlEncoded);
            return Encoding.ASCII.GetString(decoded);
        }
        catch (Exception)
        {
            return "Decoding failed";
        }
    }

    private static string Encode(string input)
    {
        byte[] bytes = Encoding.ASCII.GetBytes(input);
        using (var stream = new MemoryStream())
        {
            stream.Write(bytes, 0, bytes.Length);
            return Convert.ToBase64String(stream.ToArray());
        }
    }
}
You'll see that the first two arguments ("test" and "testa") fail to decode, but the third ("testaa") will succeed.
In order to "fix" this, change the Decode method as follows:
private static string Decode(string htmlEncoded)
{
    try
    {
        string regularEncodedString = WebUtility.UrlDecode(htmlEncoded);
        byte[] decoded = Convert.FromBase64String(regularEncodedString);
        return Encoding.ASCII.GetString(decoded);
    }
    catch (Exception)
    {
        return "Decoding failed";
    }
}
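Not part of the original answer, but if you happen to be on .NET Core 2.1 or later, another defensive option is to probe the input with Convert.TryFromBase64String before deciding whether it still needs URL-decoding. A sketch, keeping the ASCII assumption and the System.Net/System.Text usings from the console app above:
private static string DecodeLenient(string input)
{
    // Upper bound on the decoded size: 3 bytes for every 4 Base64 characters.
    Span<byte> buffer = new byte[(input.Length * 3) / 4 + 3];

    // Try the string as-is first.
    if (Convert.TryFromBase64String(input, buffer, out int written))
        return Encoding.ASCII.GetString(buffer.Slice(0, written));

    // Otherwise assume it was URL-encoded along the way and undo that before decoding.
    string urlDecoded = WebUtility.UrlDecode(input);
    return Encoding.ASCII.GetString(Convert.FromBase64String(urlDecoded));
}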

Decoding Base64 in C# when encoding in Python

I encode data of images in base64 using python before sending it to the server which is written in C#. The data that is received is identical to the data that is being sent. However, when I decode the encoded string I get a different result.
Here is the code that takes a screenshot and encodes it in base64:
screen_shot_string_io = StringIO.StringIO()
ImageGrab.grab().save(screen_shot_string_io, "PNG")
screen_shot_string_io.seek(0)
return base64.b64encode(screen_shot_string_io.getvalue())
It is sent as-is to the server, and the server receives the encoded string correctly with no data corruption.
Here is the c# code that decodes the string:
byte[] decodedImg = new byte[bytesReceived];
FromBase64Transform transfer = new FromBase64Transform();
transfer.TransformBlock(encodedImg, 0, bytesReceived, decodedImg, 0);
So does anyone know why when the data is decoded the result is incorrect?
If it were me, I would simply use Convert.FromBase64String() and not mess with FromBase64Transform. You don't have all the details here, so I had to improvise.
In Python I took a screenshot, encoded it, and wrote it to a file:
# this example converts a png file to base64 and saves to file
from PIL import ImageGrab
from io import BytesIO
import base64
screen_shot_string_io = BytesIO()
ImageGrab.grab().save(screen_shot_string_io, "PNG")
screen_shot_string_io.seek(0)
encoded_string = base64.b64encode(screen_shot_string_io.read())
with open("example.b64", "wb") as text_file:
    text_file.write(encoded_string)
And in C# I decoded the file contents and wrote the binary:
using System;
using System.IO;

namespace Base64Decode
{
    class Program
    {
        static void Main(string[] args)
        {
            byte[] imagedata = Convert.FromBase64String(File.ReadAllText("example.b64"));
            File.WriteAllBytes("output.png", imagedata);
        }
    }
}
If you have a properly encoded byte array, then convert the array to string and then decode the string.
public static void ConvertByteExample()
{
    byte[] imageData = File.ReadAllBytes("example.b64");
    string encodedString = System.Text.Encoding.UTF8.GetString(imageData); // <-- do this
    byte[] convertedData = Convert.FromBase64String(encodedString);
    File.WriteAllBytes("output2.png", convertedData);
}
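Not part of the original answer, but on .NET Core 2.1 or later the intermediate string can be skipped entirely with System.Buffers.Text.Base64, decoding straight from the ASCII bytes (a sketch, reusing the file names from the example above; note this API is stricter than Convert.FromBase64String and does not skip whitespace):
using System;
using System.Buffers;
using System.Buffers.Text;
using System.IO;

public static void ConvertByteSpanExample()
{
    byte[] imageData = File.ReadAllBytes("example.b64"); // raw ASCII Base64 bytes
    byte[] decoded = new byte[Base64.GetMaxDecodedFromUtf8Length(imageData.Length)];
    OperationStatus status = Base64.DecodeFromUtf8(imageData, decoded, out _, out int written);
    if (status == OperationStatus.Done)
        File.WriteAllBytes("output3.png", decoded.AsSpan(0, written).ToArray());
}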

Encrypt and decrypt with MachineKey in C#

I'm trying to encrypt and decrypt Id with MachineKey.
Here is my code that calls the encrypt and decrypt functions:
var encryptedId = Encryption.Protect(profileId.ToString(), UserId);
var decryptedId = Encryption.UnProtect(encryptedId, UserId);
Here are the functions:
public static string Protect(string text, string purpose)
{
    if (string.IsNullOrEmpty(text))
    {
        return string.Empty;
    }
    byte[] stream = Encoding.Unicode.GetBytes(text);
    byte[] encodedValues = MachineKey.Protect(stream, purpose);
    return HttpServerUtility.UrlTokenEncode(encodedValues);
}

public static string UnProtect(string text, string purpose)
{
    if (string.IsNullOrEmpty(text))
    {
        return string.Empty;
    }
    byte[] stream = HttpServerUtility.UrlTokenDecode(text);
    byte[] decodedValues = MachineKey.Unprotect(stream, purpose);
    return Encoding.UTF8.GetString(decodedValues);
}
The input to the Protect method is 15. The resulting encryptedId variable holds the following string: 6wOttbtJoVBV7PxhVWXGz4AQVYcuyHvTyJhAoPu4Okd2aKhhCGbKlK_T4q3GgirotfOZYZXke0pMdgwSmC5vxg2
To decrypt this, I send the string as a parameter to the UnProtect method.
The result of the decryption should be 15, but is instead: 1\05\0
I can't understand why. I have tried using only integers in these functions, but I still have the same problem: the output of the decryption differs from the input.
You have an encoding mismatch: you encode a buffer containing the UTF-16 (Encoding.Unicode) representation of the string (which interleaves \0 bytes, as you can see, because it uses 2 bytes per character for that string), but you decode it as UTF-8 (Encoding.UTF8). You need to be consistent across both methods.
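A minimal sketch of the fix, changing nothing else from the question: have UnProtect decode with the same encoding Protect used (alternatively, switch both sides to Encoding.UTF8).
public static string UnProtect(string text, string purpose)
{
    if (string.IsNullOrEmpty(text))
    {
        return string.Empty;
    }
    byte[] stream = HttpServerUtility.UrlTokenDecode(text);
    byte[] decodedValues = MachineKey.Unprotect(stream, purpose);
    // Protect used Encoding.Unicode (UTF-16), so decode with the same encoding.
    return Encoding.Unicode.GetString(decodedValues);
}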
