I have an application that generates an AES key (using System.Security.Cryptography). I take that key, convert it to a string, and put it in a cookie like this:
string keyToSend = Encoding.UTF8.GetString(CurrentKey);
HttpCookie sessionKeyCookie = new HttpCookie("SessionKey", JsonConvert.SerializeObject(keyToSend));
keyToSend looks like this: "���K��Ui ����&��Ӂ*��()".
Then, I want to take back that key and use it to decrypt something, and I do this:
string keyString = JsonConvert.DeserializeObject<string>(context.Cookies["SessionKey"].Value);
byte[] ascii = Encoding.ASCII.GetBytes(keyString);
byte[] utf8 = Encoding.UTF8.GetBytes(keyString);
byte[] utf32 = Encoding.UTF32.GetBytes(keyString);
Also, my keyToSend looks like this: "���K��Ui ����&��Ӂ*��()".
And my browser cookie looks like this: "�\u0010��K��Ui �\u0010�\u000f�\u001f\u0005�\u0012\u0018&��Ӂ*��()\u001e"
The initial key should be 256 bits, so 32 entries in that array, but all my variables (ascii, utf8, utf32) have different lengths. Why is that, and how can I retrieve the cookie and convert it back to a byte[32] array?
It sounds like CurrentKey is arbitrary binary data - not a UTF-8 encoded string. If you've got arbitrary data which you need to encode as a string (e.g. an image, or encrypted or compressed data) you're usually best off using Base64 or hex encoding. Base64 is pretty easy:
string keyToSend = Convert.ToBase64String(CurrentKey);
...
byte[] recoveredKey = Convert.FromBase64String(keyString);
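For the cookie scenario in the question, a minimal sketch of the round trip, assuming CurrentKey is the 32-byte key you generated; the JSON wrapping becomes unnecessary because Base64 output is plain ASCII:
string keyToSend = Convert.ToBase64String(CurrentKey);
HttpCookie sessionKeyCookie = new HttpCookie("SessionKey", keyToSend);
...
// Later, recover the original byte[32] exactly:
string keyString = context.Cookies["SessionKey"].Value;
byte[] recoveredKey = Convert.FromBase64String(keyString);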
I wanted to write a simple message encrypter to dip my toes into the subject, but I can't make it work. The problem is that whatever input I start with, sometimes it encrypts fine, but when I try to decrypt it, it just doesn't return the original string. It would be really helpful if you could tell me what I'm doing wrong or guide me in the right direction.
Complete code
These are the sections in charge of encrypting and decrypting.
void Decrypt()
{
    using var crypt = Aes.Create();
    string[] input = ClipboardService.GetText()?.Split(SEPARATOR) ?? Array.Empty<string>();
    byte[] key = input[0].ToBytes();
    byte[] IV = input[^1].ToBytes();
    byte[] value = string.Join(string.Empty, input[1..^1]).ToBytes();
    crypt.IV = IV;
    crypt.Key = key;
    var decryptedValue = crypt.DecryptCbc(value, IV, PaddingMode.Zeros);
    string decryptedValueInText = decryptedValue.ToUnicodeString();
    ClipboardService.SetText(decryptedValueInText);
    LogInfoMessage($"{decryptedValueInText}: {decryptedValue.Length}");
    crypt.Clear();
}
void Encrypt()
{
    using var crypt = Aes.Create();
    crypt.GenerateKey();
    string value = ClipboardService.GetText() ?? string.Empty;
    var encryptedValue = crypt.EncryptCbc(value.ToBytes(), crypt.IV, PaddingMode.Zeros);
    string encryptedValueInText = $"{crypt.Key.ToUnicodeString()}{SEPARATOR}{encryptedValue.ToUnicodeString()}{SEPARATOR}{crypt.IV.ToUnicodeString()}";
    ClipboardService.SetText(encryptedValueInText);
    LogInfoMessage($"{encryptedValueInText}: {encryptedValue.Length}");
    crypt.Clear();
}
There are two extension methods:
public static string ToUnicodeString(this byte[] bytes) => Encoding.Unicode.GetString(bytes);
public static byte[] ToBytes(this string str) => Encoding.Unicode.GetBytes(str);
Example
The input links were:
https://www.youtube.com/
https://www.youtube.com/watch?v=bSA91XTzeuA
I don't think it matters, because the key and IV are autogenerated every time anyway, but still.
Per our discussion...
Using the clipboard to store binary data as Unicode text will fail due to invalid UTF-16 code units. UTF-16 encodes Unicode code points from the supplementary planes as 32-bit surrogate pairs: two 16-bit values where the first must be in the range 0xD800-0xDBFF and the second in the range 0xDC00-0xDFFF. There are plenty of primers on the UTF-16 encoding, but the upshot is that the odds are your encrypted data will break this rule.
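A quick illustration, using arbitrary bytes that happen to decode to a lone surrogate:
byte[] original = { 0x00, 0xD8 };                        // a lone high surrogate in UTF-16LE
string asText = Encoding.Unicode.GetString(original);    // not a valid character
byte[] roundTripped = Encoding.Unicode.GetBytes(asText); // { 0xFD, 0xFF }, i.e. U+FFFD
// The lone surrogate was replaced along the way; the original bytes are gone.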
As noted, if your encrypted binary data must be sent through a text-only transport, you should encode the bytes of the encrypted block using Base64 or similar.
I'd also like to stress that writing methods that take parameters, rather than reading and writing the clipboard directly, makes testing much simpler, including round-trip tests on the various parts of the problem. Proving that the codec works without reference to the clipboard is a good test, and separation of concerns helps you more readily identify the source of problems in the future.
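As a sketch only (not your exact code), here is the codec reworked along those lines: parameters instead of clipboard access, Base64 for the text form, and PKCS7 padding instead of Zeros so the plaintext length is unambiguous:
static (string Cipher, string Key, string IV) Encrypt(string plainText)
{
    using var aes = Aes.Create();
    byte[] cipher = aes.EncryptCbc(Encoding.UTF8.GetBytes(plainText), aes.IV, PaddingMode.PKCS7);
    return (Convert.ToBase64String(cipher),
            Convert.ToBase64String(aes.Key),
            Convert.ToBase64String(aes.IV));
}

static string Decrypt(string cipher, string key, string iv)
{
    using var aes = Aes.Create();
    aes.Key = Convert.FromBase64String(key);
    byte[] ivBytes = Convert.FromBase64String(iv);
    byte[] plain = aes.DecryptCbc(Convert.FromBase64String(cipher), ivBytes, PaddingMode.PKCS7);
    return Encoding.UTF8.GetString(plain);
}
With this shape, a unit test can assert that Decrypt(cipher, key, iv) returns the original string, no clipboard involved.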
I have a string which I need to convert to base64.
The default conversion:
byte[] cipherbytes = rsa.Encrypt(plainbytes, false);
return Convert.ToBase64String(cipherbytes);
I get a string which contains '+', like "pURT+TFG=", and the '+' is converted to a space when the string is sent as a GET, so I can't compare it to the original.
First, it sounds like a bad idea to send large sets of bytes in the query string; short byte arrays should be fine. Make sure this is what you need.
Second, you have to URL encode your base64 encoded string, by calling HttpUtility.UrlEncode or WebUtility.UrlEncode (prefer the latter):
byte[] cipherbytes = rsa.Encrypt(plainbytes, false);
return WebUtility.UrlEncode(Convert.ToBase64String(cipherbytes));
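On the receiving end, reverse both steps. A small sketch; receivedValue stands in for however you read the raw parameter, and note that if your framework already URL-decodes query-string values for you (ASP.NET's Request.QueryString does), you should skip the explicit decode:
// receivedValue: the raw value read from the request
string encoded = WebUtility.UrlDecode(receivedValue);
byte[] cipherbytes = Convert.FromBase64String(encoded);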
Given the string
KlVkeNK76V27D2MSBOhfNC6eNtA=
This looks like it is base64 encoded. However, I tried converting it to base64 with C#, and the result is a garbage string.
public static string Base64Encode(string plainText)
{
    var plainTextBytes = System.Text.Encoding.UTF8.GetBytes(plainText);
    return System.Convert.ToBase64String(plainTextBytes);
}
If I use this:
https://hashkiller.co.uk/sha1-decrypter.aspx
then it gives a nice SHA1 hash:
2a556478d2bbe95dbb0f631204e85f342e9e36d0
Can anyone show me how to decrypt it with C#?
Thanks a lot.
They simply print the hex value of the base64-decoded string:
byte[] bytes = Convert.FromBase64String("KlVkeNK76V27D2MSBOhfNC6eNtA=");
string hexString = new SoapHexBinary(bytes).ToString().ToLowerInvariant();
(where SoapHexBinary is a .NET class that converts byte[] to hex string)
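SoapHexBinary lives in the legacy System.Runtime.Remoting.Metadata.W3cXsd2001 namespace. On .NET 5 or later you can get the same hex string without it:
byte[] bytes = Convert.FromBase64String("KlVkeNK76V27D2MSBOhfNC6eNtA=");
string hexString = Convert.ToHexString(bytes).ToLowerInvariant();
// On older frameworks: BitConverter.ToString(bytes).Replace("-", "").ToLowerInvariant()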
I'm trying to convert a byte array to a string, then at a later time convert that string back to a byte array, but I'm getting some inconsistent results.
var salt = System.Text.Encoding.UTF8.GetString(encryptedPassword.Salt);
var key = System.Text.Encoding.UTF8.GetString(encryptedPassword.Key);
...
var saltBytes = System.Text.Encoding.UTF8.GetBytes(salt);
var keyBytes = System.Text.Encoding.UTF8.GetBytes(key);
In this case, the original salt and key are both byte[20], but the recovered ones are not equal (the salt comes back as a byte[36] and the key as a byte[41], both with totally different values).
Basically what @DourHighArch said. You can go string->binary->string, but you can't expect to go binary->string->binary using a text encoding.
For what you are doing, you probably want to use something like base64 encoding. So you could write it like this:
var salt = Convert.ToBase64String(encryptedPassword.Salt);
var key = Convert.ToBase64String(encryptedPassword.Key);
...
var saltBytes = Convert.FromBase64String(salt);
var keyBytes = Convert.FromBase64String(key);
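A quick sanity check (using System.Linq) confirms this round trip is lossless:
bool saltOk = encryptedPassword.Salt.SequenceEqual(saltBytes); // true
bool keyOk = encryptedPassword.Key.SequenceEqual(keyBytes);    // true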
I'll start by saying, maybe this is overkill.
During my login routine, I encrypt the user's login id prior to using FormsAuthentication.SetAuthCookie.
The problem is that if the encrypted string ends up containing escape characters, the string that gets saved is truncated because of them.
Should I just abandon trying to encrypt the user's login id?
Or, is there a way to get around this issue?
Here is a sample string that gets truncated:
<< *€ƒKõ‹¯Þ\0ý´Gv\"þEaÔs0n×\tD¦™s€7Œ>>
When you encrypt the user id, you should use Base64 encoding so that the encrypted data will only contain valid characters (alphanumeric, +, /, =).
You would probably find this helpful: Convert.ToBase64String(byte[])
Example:
string userId = "Hello";
byte[] encryptedData = GetEncryptedBytes(userId);
string encodedUserId = Convert.ToBase64String(encryptedData);
// encodedUserId is "SGVsbG8="
FormsAuthentication.SetAuthCookie(encodedUserId, false);
And decoding is the reverse:
string encodedUserId = "SGVsbG8=";
byte[] encryptedData = Convert.FromBase64String(encodedUserId);
string userId = GetDecryptedString(encryptedData);