I am converting a classic ASP application to C#, and would like to be able to decrypt in C# strings that were originally encrypted in classic ASP. The classic ASP code is here, and the C# code is here. The problem I am facing is that the signatures of the Encrypt and Decrypt methods differ between ASP and C#. Here is my ASP code for decrypting, which wraps the decrypt call.
Function AESDecrypt(sCypher)
    If sCypher <> "" Then
        Dim bytIn()
        Dim bytOut
        Dim bytPassword()
        Dim lCount
        Dim lLength
        Dim sTemp
        Dim sPassword

        sPassword = "My_Password"

        ' Convert the hex cypher string into a byte array
        lLength = Len(sCypher)
        ReDim bytIn(lLength/2-1)
        For lCount = 0 To lLength/2-1
            bytIn(lCount) = CByte("&H" & Mid(sCypher, lCount*2+1, 2))
        Next

        ' Convert the password string into a byte array
        lLength = Len(sPassword)
        ReDim bytPassword(lLength-1)
        For lCount = 1 To lLength
            bytPassword(lCount-1) = CByte(AscB(Mid(sPassword, lCount, 1)))
        Next

        bytOut = DecryptData(bytIn, bytPassword) ' this is the problem child

        ' Convert the decrypted byte array back into a string
        lLength = UBound(bytOut) + 1
        sTemp = ""
        For lCount = 0 To lLength - 1
            sTemp = sTemp & Chr(bytOut(lCount))
        Next

        AESDecrypt = sTemp
    End If
End Function
However, I am struggling to convert this function to C# because the C# equivalent of DecryptData takes more parameters:
public static byte[] DecryptData(byte[] message, byte[] password,
                                 byte[] initialisationVector, BlockSize blockSize,
                                 KeySize keySize, EncryptionMode cryptMode)
{...}
What values can I use for initialisationVector, blockSize, keySize, and cryptMode so as to decrypt the same way the classic ASP code does?
Using Phil Fresle's C# Rijndael implementation, you can use the following code to successfully decrypt a value that was encrypted with Phil's ASP/VBScript version.
You can read my answer about encrypting here: Password encryption/decryption between classic asp and ASP.NET
public string DecryptData(string encryptedMessage, string password)
{
    if (encryptedMessage.Length % 2 == 1)
        throw new Exception("The binary key cannot have an odd number of digits");

    // Convert the hex string back into a byte array
    byte[] byteArr = new byte[encryptedMessage.Length / 2];
    for (int index = 0; index < byteArr.Length; index++)
    {
        string byteValue = encryptedMessage.Substring(index * 2, 2);
        byteArr[index] = byte.Parse(byteValue, NumberStyles.HexNumber, CultureInfo.InvariantCulture);
    }

    byte[] result = Rijndael.DecryptData(
        byteArr,
        Encoding.ASCII.GetBytes(password),
        new byte[] { },                  // Initialization vector (unused in ECB mode)
        Rijndael.BlockSize.Block256,     // Typically 128 in most implementations
        Rijndael.KeySize.Key256,
        Rijndael.EncryptionMode.ModeECB  // not ModeCBC; see the update below
    );
    return ASCIIEncoding.ASCII.GetString(result);
}
Most default implementations use a key size of 128, 192, or 256 bits, and a block size of 128 bits is standard. Although some implementations allow block sizes other than 128 bits, changing the block size just adds one more variable to get wrong when trying to make data encrypted in one implementation decrypt properly in another.
UPDATE
Turns out I was wrong about one piece here; the EncryptionMode should be set to EncryptionMode.ModeECB, not EncryptionMode.ModeCBC. "ECB" is less secure (https://crypto.stackexchange.com/questions/225/should-i-use-ecb-or-cbc-encryption-mode-for-my-block-cipher) because it encrypts each block independently instead of chaining blocks the way CBC does, but that is how it was implemented in the VB version of the encryption.
Interestingly enough, using CBC on an ECB-encrypted value WILL work for the first block's worth of bytes: the first block is only XORed with the IV (presumably all zeros, given the empty IV array above), while every subsequent block is also XORed with the previous ciphertext block, so the remainder of the value is mangled. You can see this particularly clearly when encrypting a longish string in the VB version and decrypting it with the code I posted above with a mode of EncryptionMode.ModeCBC.
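You can reproduce this mixed-mode effect with the built-in .NET Aes class; a self-contained sketch (this uses AES's standard 128-bit block rather than the 256-bit Rijndael block above, so only the first 16 bytes survive):

using System;
using System.Security.Cryptography;
using System.Text;

class EcbVsCbcDemo
{
    static void Main()
    {
        byte[] key = new byte[16];
        RandomNumberGenerator.Fill(key);
        byte[] plain = Encoding.ASCII.GetBytes("A fairly long message spanning several 16-byte blocks!");

        // Encrypt in ECB mode
        using var aes = Aes.Create();
        aes.Key = key;
        aes.Mode = CipherMode.ECB;
        aes.Padding = PaddingMode.PKCS7;
        byte[] cipher = aes.CreateEncryptor().TransformFinalBlock(plain, 0, plain.Length);

        // Decrypt the same bytes as if they were CBC with a zero IV
        aes.Mode = CipherMode.CBC;
        aes.IV = new byte[16];
        aes.Padding = PaddingMode.None; // the padding check would fail on the mangled tail
        byte[] mixed = aes.CreateDecryptor().TransformFinalBlock(cipher, 0, cipher.Length);

        // Block 1 is XORed with the zero IV and comes out intact;
        // every later block is also XORed with the previous ciphertext block and is garbage.
        Console.WriteLine(Encoding.ASCII.GetString(mixed, 0, 16)); // "A fairly long me"
    }
}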
I need to decrypt a signature hash using RSA. I've got a hexadecimal string 288 characters long, which is a general public key from an institution. It represents hexadecimal bytes, so 144 bytes in total.
The first 8 bytes are the so-called CAR, which is used for identification. The next 128 bytes are the modulus N, and the next 8 bytes are the exponent E.
I've never worked with cryptography before, so go easy on me. I'm using C# and the Bouncy Castle library for the decryption algorithms. Now, if I understand correctly, a 1024-bit modulus and a 64-bit exponent are not unusual. I currently have this bit of code:
public byte[] rsa_decrypt(byte[] data)
{
    var N = PublicKey.ToCharArray().Slice(16, 256);
    var E = PublicKey.ToCharArray().Slice(272, 16);

    RsaEngine rsa = new RsaEngine();
    Org.BouncyCastle.Math.BigInteger modulus = new Org.BouncyCastle.Math.BigInteger(new string(N).Insert(0, "00"), 16);
    Console.WriteLine(modulus);
    Org.BouncyCastle.Math.BigInteger exponent = new Org.BouncyCastle.Math.BigInteger(new string(E).Insert(0, "00"), 16);
    Console.WriteLine(exponent);

    RsaKeyParameters x = new RsaKeyParameters(false, modulus, exponent);
    var eng = new Pkcs1Encoding(new RsaEngine());
    eng.Init(false, x);
    return eng.ProcessBlock(data, 0, data.Length);
}
The Slice<T>(this T[], offset, length) method is just a small helper I wrote to cut arrays into pieces; nothing special, and it works. The insertion of the "00" at the front of the string is so the value can't be interpreted as negative, I believe.
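For reference, a minimal version of such a helper might look like this (my sketch; the asker's exact code isn't shown):

using System;

public static class ArrayExtensions
{
    // Copies 'length' elements starting at 'offset' into a new array.
    public static T[] Slice<T>(this T[] source, int offset, int length)
    {
        var result = new T[length];
        Array.Copy(source, offset, result, 0, length);
        return result;
    }
}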
When I run this code I get the exception:
Unhandled exception. Org.BouncyCastle.Crypto.InvalidCipherTextException: block incorrect
   at Org.BouncyCastle.Crypto.Encodings.Pkcs1Encoding.DecodeBlock(Byte[] input, Int32 inOff, Int32 inLen)
   at Org.BouncyCastle.Crypto.Encodings.Pkcs1Encoding.ProcessBlock(Byte[] input, Int32 inOff, Int32 length)
Obviously I'm doing something wrong. Can anybody tell me what I'm doing wrong, why it's wrong, and most preferably what I should be doing instead? Again, I've never worked with crypto algorithms or this library before.
I just noticed that .NET Standard 2.1/.NET Core 3.0 finally added a class for AES-GCM encryption.
However, its API seems to be slightly different from the usual .NET crypto classes: Its Encrypt function asks for pre-allocated byte arrays for the cipher text and the tag, instead of providing them itself. Unfortunately there is no example in the docs showing proper usage of that class.
I know how to calculate the expected cipher text size for an AES encryption in theory, but I wonder whether it is really the intended approach to kind of "guess" a buffer size for the cipher text there. Usually crypto libraries provide functions that take care of those calculations.
Does someone have an example on how to properly encrypt a byte array using AesGcm?
I figured it out now.
I forgot that in GCM, the cipher text has the same length as the plain text; contrary to other encryption modes like CBC, no padding is required. The nonce and tag lengths are determined by the NonceByteSizes and TagByteSizes properties of AesGcm, respectively.
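As an aside, you can inspect the legal sizes at runtime; a small sketch (the byte values in the comments are what current .NET runtimes ship and are stated here as an assumption):

// Both properties return System.Security.Cryptography.KeySizes, with values in bytes
Console.WriteLine($"Nonce: {AesGcm.NonceByteSizes.MinSize}-{AesGcm.NonceByteSizes.MaxSize} bytes"); // 12
Console.WriteLine($"Tag:   {AesGcm.TagByteSizes.MinSize}-{AesGcm.TagByteSizes.MaxSize} bytes");     // 12-16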
Using this, encryption can be done in the following way:
public string Encrypt(string plain)
{
    // Get bytes of plaintext string
    byte[] plainBytes = Encoding.UTF8.GetBytes(plain);

    // Get parameter sizes
    int nonceSize = AesGcm.NonceByteSizes.MaxSize;
    int tagSize = AesGcm.TagByteSizes.MaxSize;
    int cipherSize = plainBytes.Length;

    // We write everything into one big array for easier encoding
    int encryptedDataLength = 4 + nonceSize + 4 + tagSize + cipherSize;
    Span<byte> encryptedData = encryptedDataLength < 1024
        ? stackalloc byte[encryptedDataLength]
        : new byte[encryptedDataLength].AsSpan();

    // Copy parameters
    BinaryPrimitives.WriteInt32LittleEndian(encryptedData.Slice(0, 4), nonceSize);
    BinaryPrimitives.WriteInt32LittleEndian(encryptedData.Slice(4 + nonceSize, 4), tagSize);
    var nonce = encryptedData.Slice(4, nonceSize);
    var tag = encryptedData.Slice(4 + nonceSize + 4, tagSize);
    var cipherBytes = encryptedData.Slice(4 + nonceSize + 4 + tagSize, cipherSize);

    // Generate secure nonce
    RandomNumberGenerator.Fill(nonce);

    // Encrypt
    using var aes = new AesGcm(_key);
    aes.Encrypt(nonce, plainBytes.AsSpan(), cipherBytes, tag);

    // Encode for transmission
    return Convert.ToBase64String(encryptedData);
}
Correspondingly, the decryption is done as follows:
public string Decrypt(string cipher)
{
    // Decode
    Span<byte> encryptedData = Convert.FromBase64String(cipher).AsSpan();

    // Extract parameter sizes
    int nonceSize = BinaryPrimitives.ReadInt32LittleEndian(encryptedData.Slice(0, 4));
    int tagSize = BinaryPrimitives.ReadInt32LittleEndian(encryptedData.Slice(4 + nonceSize, 4));
    int cipherSize = encryptedData.Length - 4 - nonceSize - 4 - tagSize;

    // Extract parameters
    var nonce = encryptedData.Slice(4, nonceSize);
    var tag = encryptedData.Slice(4 + nonceSize + 4, tagSize);
    var cipherBytes = encryptedData.Slice(4 + nonceSize + 4 + tagSize, cipherSize);

    // Decrypt
    Span<byte> plainBytes = cipherSize < 1024
        ? stackalloc byte[cipherSize]
        : new byte[cipherSize];
    using var aes = new AesGcm(_key);
    aes.Decrypt(nonce, cipherBytes, tag, plainBytes);

    // Convert plain bytes back into string
    return Encoding.UTF8.GetString(plainBytes);
}
See dotnetfiddle for the full implementation and an example.
Note that I wrote this for network transmission, so everything is encoded into one big Base64 string; alternatively, you can return nonce, tag and cipherBytes separately via out parameters.
The network setting is also the reason why I send the nonce and tag sizes: the class might be used by different applications with different runtime environments, which might support different parameter sizes.
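For completeness, a round trip with the two methods might look like this (a sketch: AesGcmCipher is a hypothetical wrapper class holding the _key field used above, and RandomNumberGenerator.GetBytes requires .NET 6 or later):

var crypto = new AesGcmCipher(RandomNumberGenerator.GetBytes(32)); // 256-bit key
string token = crypto.Encrypt("hello world"); // one Base64 string: sizes + nonce + tag + ciphertext
Console.WriteLine(crypto.Decrypt(token));     // prints "hello world"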
I'm tinkering with RSA signing of data.
I'm using a plaintext string, which I convert to a byte array. I then load a private key, sign the byte array, and extract the corresponding public key.
Next I use the same byte array to verify the signature.
However, I want to convert the signature to a string in between those steps; the idea is to append it later to the file that's being signed.
static void TestSigning(string privateKey)
{
    string data = "TEST_TEST-TEST+test+TEst";
    Console.WriteLine("==MESSAGE==");
    Console.WriteLine(data);

    byte[] dataByte = Encoding.Unicode.GetBytes(data);

    using (var rsa = new RSACryptoServiceProvider())
    {
        rsa.FromXmlString(privateKey);
        var publicKey = rsa.ToXmlString(false);

        byte[] signature = rsa.SignData(dataByte, CryptoConfig.MapNameToOID("SHA512"));

        // Round-trip the signature through a string
        string signatureString = Encoding.Unicode.GetString(signature);
        byte[] roundtripSignature = Encoding.Unicode.GetBytes(signatureString);

        Console.WriteLine("==TEST==");
        Console.WriteLine(signature.Length.ToString());
        Console.WriteLine(roundtripSignature.Length.ToString());

        using (var checkRSA = new RSACryptoServiceProvider())
        {
            checkRSA.FromXmlString(publicKey);
            bool verification = checkRSA.VerifyData(
                dataByte,
                CryptoConfig.MapNameToOID("SHA512"),
                roundtripSignature);
            Console.WriteLine("==Verification==");
            Console.WriteLine(verification.ToString());
            Console.ReadKey();
        }
    }
}
Now here's the fun part: if I use UTF8 encoding, I get byte arrays of different lengths (256 is the original size, 484 is the roundtrip). UTF7 returns different sizes too: 256 vs 679. Both ASCII and Unicode return matching sizes, 256 vs 256.
I've also tried using

var sb = new StringBuilder();
for (int i = 0; i < signature.Length; i++)
{
    sb.Append(signature[i].ToString("x2"));
}

to get the string, and then Encoding.UTF8.GetBytes() to convert it back. This time I get sizes of 256 vs 512; if I remove the format from ToString(), I get 256 vs 670. Signature verification always failed.
It works fine if I use signature instead of roundtripSignature.
My question: why, despite using the same encoding type, do I get different byte arrays and strings? Shouldn't this conversion be lossless?
Unicode isn't a good choice because, at minimum, \0, CR, LF, <delete>, <backspace> (and the rest of the control codes) can mess things up; more fundamentally, arbitrary binary data isn't guaranteed to decode to valid text in any text encoding, so a string round trip through one is lossy. (See an answer about this for Encrypt/Decrypt for more.)
As @JamesKPolk said, you need to use a suitable binary-to-text encoding. Base64 and hex/Base16 are the most common, but there are plenty of other viable choices.
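Applied to the code above, the fix is to swap the Encoding.Unicode round trip for Base64:

byte[] signature = rsa.SignData(dataByte, CryptoConfig.MapNameToOID("SHA512"));

// Base64 is a binary-to-text encoding, so this round trip is lossless
string signatureString = Convert.ToBase64String(signature);
byte[] roundtripSignature = Convert.FromBase64String(signatureString);

// roundtripSignature is now byte-for-byte identical to signature,
// so VerifyData succeeds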
I know very little about encryption, but my goal is essentially to decrypt strings. I have been given the AES(128) key.
However, I must retrieve the IV from the encrypted string; it is the first 16 bytes.
Here's the Salesforce doc for more information (in case what I explained was incorrect):
Encrypts the blob clearText using the specified algorithm and private key. Use this method when you want Salesforce to generate the initialization vector for you. It is stored as the first 128 bits (16 bytes) of the encrypted blob.
http://www.salesforce.com/us/developer/docs/apexcode/Content/apex_classes_restful_crypto.htm (encryptWithManagedIV)
For retrieving the IV I've tried something like this (I don't believe it's right, though):
public string retrieveIv()
{
    string iv = "";
    string input = "bwZ6nKpBEsuAKM8lDTYH1Yl69KkHN1i3XehALbfgUqY=";
    byte[] bytesToEncode = Encoding.UTF8.GetBytes(input);
    for (int i = 0; i <= 15; i++)
    {
        iv += bytesToEncode[i].ToString();
    }
    return iv;
}
(Just ignore the fact that the input is hardcoded and not parameterized; it's easier for testing purposes.)
Then I use the best answer from this question to decrypt the string.
The IV shouldn't be expressed as a string - it should be a byte array, as per the AesManaged.IV property.
Also, using Encoding.UTF8 is almost certainly wrong. I suspect you want:
public static byte[] RetrieveIv(string encryptedBase64)
{
    // We don't need to base64-decode everything... just 16 bytes' worth
    encryptedBase64 = encryptedBase64.Substring(0, 24);
    // This will be 18 bytes long (4 characters per 3 bytes)
    byte[] encryptedBinary = Convert.FromBase64String(encryptedBase64);
    byte[] iv = new byte[16];
    Array.Copy(encryptedBinary, 0, iv, 0, 16);
    return iv;
}
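To then decrypt the rest of the blob, you'd use the remaining bytes as the ciphertext with AES-128 in CBC mode; a sketch, assuming (per the linked Salesforce docs) CBC with PKCS#5 padding, which for 16-byte blocks is the same as the PKCS7 padding .NET exposes:

using System;
using System.Security.Cryptography;
using System.Text;

public static class SalesforceCrypto
{
    public static string DecryptWithManagedIv(string encryptedBase64, byte[] key)
    {
        byte[] all = Convert.FromBase64String(encryptedBase64);

        // First 16 bytes are the IV; the rest is the actual ciphertext
        byte[] iv = new byte[16];
        Array.Copy(all, 0, iv, 0, 16);

        using var aes = Aes.Create();
        aes.Key = key; // the AES(128) key you were given
        aes.IV = iv;
        aes.Mode = CipherMode.CBC;
        aes.Padding = PaddingMode.PKCS7;

        using var decryptor = aes.CreateDecryptor();
        byte[] plain = decryptor.TransformFinalBlock(all, 16, all.Length - 16);
        return Encoding.UTF8.GetString(plain);
    }
}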
I get the following error when I try to create an IV (initialization vector) for the TripleDES encryptor.
Please see the code example:
TripleDESCryptoServiceProvider tripDES = new TripleDESCryptoServiceProvider();
byte[] key = Encoding.ASCII.GetBytes("SomeKey132123ABC");
byte[] v4 = key;
byte[] connectionString = Encoding.ASCII.GetBytes("SomeConnectionStringValue");
byte[] encryptedConnectionString = Encoding.ASCII.GetBytes("");
// Read the key and convert it to byte stream
tripDES.Key = key;
tripDES.IV = v4;
This is the exception that I get from VS:
Specified initialization vector (IV) does not match the block size for this algorithm.
Where am I going wrong?
Thank you
MSDN explicitly states that:
...The size of the IV property must be the same as the BlockSize property.
For Triple DES it is 64 bits.
The size of the initialization vector must match the block size - 64 bits in the case of TripleDES. Your initialization vector is much longer than eight bytes.
Furthermore, you should really use a key derivation function like PBKDF2 to create strong keys and initialization vectors from password phrases.
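For example, a minimal sketch with Rfc2898DeriveBytes (PBKDF2); the salt handling and iteration count here are illustrative assumptions - store the salt alongside the ciphertext and pick an iteration count suited to your environment. The four-argument constructor needs .NET Framework 4.7.2 / .NET Core 2.0 or later.

byte[] salt = new byte[16];
RandomNumberGenerator.Fill(salt); // random salt, persisted next to the ciphertext

using var kdf = new Rfc2898DeriveBytes("some pass phrase", salt, 100_000, HashAlgorithmName.SHA256);
using var tripDES = TripleDES.Create();
tripDES.Key = kdf.GetBytes(24); // 24-byte (three-key) TripleDES key
tripDES.IV = kdf.GetBytes(8);   // 8 bytes matches the 64-bit block size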
Key should be 24 bytes and IV should be 8 bytes.
tripDES.Key = Encoding.ASCII.GetBytes("123456789012345678901234");
tripDES.IV = Encoding.ASCII.GetBytes("12345678");
The IV must be the same length (in bits) as tripDES.BlockSize. This will be 8 bytes (64 bits) for TripleDES.
I've upvoted every answer here (well, the ones that were here before mine!) as they're all correct.
However, there's a bigger mistake you're making (one which I also made very early on) - DO NOT USE A STRING TO SEED THE IV OR KEY!!!
A compile-time string literal is a Unicode string, and you will not get either a random or wide enough spread of byte values from one: even a random-looking string contains lots of repeating bytes, because printable characters occupy a narrow byte range. It's also very easy to pick a character that actually requires 2 bytes instead of 1 - try using 8 of the more exotic characters on the keyboard and you'll see what I mean: when converted to bytes you can end up with more than 8 bytes.
Okay, so you're using ASCII encoding here - but that doesn't solve the non-randomness problem.
Instead you should use RNGCryptoServiceProvider to initialise your IV and key, and if you need to capture a constant value for future use, you should still use that class - but capture the result as a hex string or Base64-encoded value (I prefer hex, though).
To achieve this simply, I've written a macro that I use in VS (bound to the keyboard shortcut CTRL+SHIFT+G, CTRL+SHIFT+H) which uses the .NET crypto RNG to produce a hex string:
Public Sub GenerateHexKey()
    Dim result As String = InputBox("How many bits?", "Key Generator", 128)
    Dim len As Int32 = 128

    If String.IsNullOrEmpty(result) Then Return
    If System.Int32.TryParse(result, len) = False Then
        Return
    End If

    Dim oldCursor As Cursor = Cursor.Current
    Cursor.Current = Cursors.WaitCursor

    Dim buff((len / 8) - 1) As Byte
    Dim rng As New System.Security.Cryptography.RNGCryptoServiceProvider()
    rng.GetBytes(buff)

    Dim sb As New StringBuilder(CType((len / 8) * 2, Integer))
    For Each b In buff
        sb.AppendFormat("{0:X2}", b)
    Next

    Dim selection As EnvDTE.TextSelection = DTE.ActiveDocument.Selection
    Dim editPoint As EnvDTE.EditPoint
    selection.Insert(sb.ToString())

    Cursor.Current = oldCursor
End Sub
Now all you need to do is to turn your hex string literal into a byte array - I do this with a helpful extension method:
public static byte[] FromHexString(this string str)
{
    // A null check would be a good idea here
    int numberChars = str.Length;
    byte[] bytes = new byte[numberChars / 2];
    for (int i = 0; i < numberChars; i += 2)
        bytes[i / 2] = Convert.ToByte(str.Substring(i, 2), 16);
    return bytes;
}
There are probably better ways of doing that bit - but it works for me.
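Usage then looks like this (the hex literal is a made-up example; in practice, paste in the value your generator produced):

// Hypothetical 128-bit key captured earlier from the generator macro
byte[] key = "8E377A3D8CF1672B4C2A9E05D6F1B324".FromHexString();
tripDES.Key = key; // or whichever SymmetricAlgorithm instance you're configuring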
I do it like this:
var derivedForIv = new Rfc2898DeriveBytes(passwordBytes, _saltBytes, 3);
_encryptionAlgorithm.IV = derivedForIv.GetBytes(_encryptionAlgorithm.LegalBlockSizes[0].MaxSize / 8);
The IV gets its bytes from the Rfc2898DeriveBytes 'smusher', using the block size reported by the algorithm itself via its LegalBlockSizes property.