I implemented the following method:
public static byte[] AESDecrypt(byte[] data, ICryptoTransform transform)
{
    using (MemoryStream stream = new MemoryStream(data))
    using (CryptoStream cstream = new CryptoStream(stream, transform, CryptoStreamMode.Read))
    using (MemoryStream output = new MemoryStream())
    {
        byte[] buffer = new byte[4000];
        int r;
        while ((r = cstream.Read(buffer, 0, buffer.Length)) > 0)
        {
            output.Write(buffer, 0, r);
        }
        stream.Close();
        return output.ToArray();
    }
}
I am using this method to decrypt a sequence of 16-byte blocks; the transform parameter is initialized once at the beginning:
AesCryptoServiceProvider provider = new AesCryptoServiceProvider();
provider.Mode = CipherMode.ECB;
provider.KeySize = 128;
provider.BlockSize = 128;
provider.Key = key;
provider.Padding = PaddingMode.PKCS7;
transform = provider.CreateDecryptor();
My problem is that suddenly the method starts to produce strange output: a 16-byte block is decrypted to 27 bytes(!), and sometimes 16 bytes are decrypted wrongly to 16 bytes. However, when I restart the application, the same data produces the correct result. Does the transform hold any state that makes this happen? What did I do wrong that makes a 16-byte block decrypt to 27 bytes?
Any help is appreciated.
Edit:
Can someone confirm it is the same bug:
Reuse ICryptoTransform objects
Edit 2:
Something to add to the accepted answer:
It seems ICryptoTransform is not thread-safe, so calling the above method from two threads simultaneously may cause trouble. I solved it by creating an ICryptoTransform object for each thread that uses the method.
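Roughly, the per-thread workaround looks like this (just a sketch; perThreadDecryptor and cipherData are illustrative names, and provider is the shared AesCryptoServiceProvider shown above):
// Each thread lazily gets its own transform, so no ICryptoTransform
// instance is ever shared between threads (requires System.Threading).
private static readonly ThreadLocal<ICryptoTransform> perThreadDecryptor =
    new ThreadLocal<ICryptoTransform>(() => provider.CreateDecryptor());

// Callers use their thread's private transform:
byte[] plain = AESDecrypt(cipherData, perThreadDecryptor.Value);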
You are closing stream when you meant to close cstream.
Since you don't close cstream before reading out the data, TransformFinalBlock is never called.
You'd be better off using Stream.CopyTo and giving your output stream a clearly longer lifetime than the CryptoStream:
public static byte[] AESDecrypt(byte[] data, ICryptoTransform transform)
{
    using (MemoryStream output = new MemoryStream())
    {
        using (MemoryStream stream = new MemoryStream(data))
        using (CryptoStream cstream = new CryptoStream(stream, transform, CryptoStreamMode.Read))
        {
            cstream.CopyTo(output);
        }
        return output.ToArray();
    }
}
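If you'd rather not keep a transform around at all, creating a fresh decryptor per call also sidesteps any leftover state. A small usage sketch, assuming provider is the AesCryptoServiceProvider from the question and cipherBytes stands in for your input:
// A brand-new transform carries no state from previous messages.
using (ICryptoTransform transform = provider.CreateDecryptor())
{
    byte[] plain = AESDecrypt(cipherBytes, transform);
}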
I'm writing an app for sending and receiving AES encrypted data. I'm using a CryptoStream which writes to/reads from a NetworkStream. Initially I tried using the built-in padding like this:
// sending
using (Aes aesAlg = Aes.Create())
{
    aesAlg.Padding = PaddingMode.PKCS7; // built-in padding
    // set other AES params
    using (MemoryStream ms = new MemoryStream())
    using (ICryptoTransform encryptor = aesAlg.CreateEncryptor(aesAlg.Key, aesAlg.IV))
    using (CryptoStream csEncrypt = new CryptoStream(networkStream, encryptor, CryptoStreamMode.Write, true))
    {
        BinaryFormatter bf = new BinaryFormatter();
        bf.Serialize(ms, data);
        byte[] bytes = ms.ToArray();
        await csEncrypt.WriteAsync(bytes, 0, bytes.Length);
        csEncrypt.FlushFinalBlock(); // this should add padding and send all remaining data
    }
}
// receiving
using (Aes aesAlg = Aes.Create())
{
    aesAlg.Padding = PaddingMode.PKCS7; // built-in padding
    // set other AES params
    using (ICryptoTransform decryptor = aesAlg.CreateDecryptor(aesAlg.Key, aesAlg.IV))
    using (CryptoStream csDecrypt = new CryptoStream(networkStream, decryptor, CryptoStreamMode.Read, true))
    using (MemoryStream ms = new MemoryStream())
    {
        int totalBytesRead = 0;
        while (totalBytesRead < messageLength) // read content until length
        {
            var toRead = Math.Min(buffer.Length, messageLength - totalBytesRead);
            var nowRead = await csDecrypt.ReadAsync(buffer, 0, toRead); // read from network there
            totalBytesRead += nowRead;
            await ms.WriteAsync(buffer, 0, nowRead);
        }
        ms.Position = 0;
        received = (Data)formatter.Deserialize(ms); // deserialise the object
    }
}
Note that the last argument of new CryptoStream() is set to true, so the base stream (networkStream) won't be closed when the CryptoStream gets disposed. One interesting thing is that if I don't set this to true, receiving starts working correctly. But because I need the networkStream to stay open, it has to be set to true.
With the above implementation the receiving side never receives all the data - it blocks on the last csDecrypt.ReadAsync(). Based on my understanding, csEncrypt.FlushFinalBlock() should send the last block with the added padding, but for some reason that doesn't happen.
Because this didn't work, I added the padding myself like this:
// sending
using (Aes aesAlg = Aes.Create())
{
    aesAlg.Padding = PaddingMode.None; // no built-in padding
    // set other AES params
    using (MemoryStream ms = new MemoryStream())
    using (ICryptoTransform encryptor = aesAlg.CreateEncryptor(aesAlg.Key, aesAlg.IV))
    using (CryptoStream csEncrypt = new CryptoStream(networkStream, encryptor, CryptoStreamMode.Write, true))
    {
        BinaryFormatter bf = new BinaryFormatter();
        bf.Serialize(ms, data);
        byte[] bytes = ms.ToArray();
        await csEncrypt.WriteAsync(bytes, 0, bytes.Length);
        // manually add padding (so the total number of bytes is divisible by 16 bytes/128 bits)
        if (bytes.Length % 16 != 0)
            await csEncrypt.WriteAsync(bytes, 0, 16 - (bytes.Length % 16)); // padding data
    }
}
// receiving
using (Aes aesAlg = Aes.Create())
{
    aesAlg.Padding = PaddingMode.None; // no built-in padding
    using (ICryptoTransform decryptor = aesAlg.CreateDecryptor(aesAlg.Key, aesAlg.IV))
    using (CryptoStream csDecrypt = new CryptoStream(networkStream, decryptor, CryptoStreamMode.Read, true))
    using (MemoryStream ms = new MemoryStream())
    {
        int totalBytesRead = 0;
        while (totalBytesRead < messageLength) // read content until length
        {
            var toRead = Math.Min(buffer.Length, messageLength - totalBytesRead);
            // if it's the last buffer to be received and the number of bytes is not divisible by 16, account for the padding
            if (messageLength - totalBytesRead <= 1024 && toRead % 16 != 0)
                toRead += 16 - (toRead % 16);
            var nowRead = await csDecrypt.ReadAsync(buffer, 0, toRead); // read from network there
            totalBytesRead += nowRead;
            await ms.WriteAsync(buffer, 0, nowRead);
        }
        ms.Position = 0;
        received = (Data)formatter.Deserialize(ms); // deserialise the object
    }
}
If I add the padding myself, everything works correctly and the receiving function doesn't block on the last buffer. What should I do to make the built-in padding work?
The problem is on the receiving side. If you don't close the stream, the decryption routine doesn't know that the last block has been received, so it just waits for the next block, which never comes. What you are doing now should work when implemented correctly, but you do have to perform the unpadding at the end - and your current implementation doesn't implement PKCS#7 unpadding.
There is another, possibly cleaner way: read from a stream that does close. Apparently you know the size of the plaintext or ciphertext, so you can use a stream that wraps the other stream but closes after reading all the bytes of the ciphertext - and, of course, doesn't close the underlying stream when that happens. It's a bit of work, but it should not be that hard, as you only have to implement the Read method (see the Stream class documentation for why).
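A minimal sketch of such a bounding wrapper, assuming you know the ciphertext length up front (the class name and details are illustrative, not a framework API):
// Reports end-of-stream after `length` bytes, without ever closing the
// inner stream; the CryptoStream then processes its final block and unpads.
class BoundedReadStream : Stream
{
    private readonly Stream inner;
    private long remaining;

    public BoundedReadStream(Stream inner, long length)
    {
        this.inner = inner;
        this.remaining = length;
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        if (remaining <= 0)
            return 0; // signals EOF to the CryptoStream
        int toRead = (int)Math.Min(count, remaining);
        int read = inner.Read(buffer, offset, toRead);
        remaining -= read;
        return read;
    }

    public override bool CanRead => true;
    public override bool CanSeek => false;
    public override bool CanWrite => false;
    public override long Length => throw new NotSupportedException();
    public override long Position { get => throw new NotSupportedException(); set => throw new NotSupportedException(); }
    public override void Flush() { }
    public override long Seek(long offset, SeekOrigin origin) => throw new NotSupportedException();
    public override void SetLength(long value) => throw new NotSupportedException();
    public override void Write(byte[] buffer, int offset, int count) => throw new NotSupportedException();
}

// Usage: the CryptoStream sees end-of-stream after messageLength bytes
// and removes the padding, while networkStream itself stays open.
using (var bounded = new BoundedReadStream(networkStream, messageLength))
using (var csDecrypt = new CryptoStream(bounded, decryptor, CryptoStreamMode.Read))
{
    csDecrypt.CopyTo(ms);
}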
I think the second option is the cleanest, but you could of course just read the necessary bytes and then use the decryptor directly to decrypt the bytes (buffer by buffer). Not the nicest compared to the bounded-stream trick, but probably the easiest to get your head around.
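That third option can be as small as one call once you have buffered the complete ciphertext; a hedged sketch, with cipherBytes standing in for your received data:
// TransformFinalBlock decrypts the remaining input and strips the
// PKCS#7 padding in one go (use TransformBlock first for earlier buffers).
byte[] plain = decryptor.TransformFinalBlock(cipherBytes, 0, cipherBytes.Length);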
So there you are: three options. Pick and choose and... well, implement, obviously.
I'm trying to encrypt and decrypt a file using AES. The problem I have is that when the file gets decrypted, it is broken and you can't open it. The original file has a length of 81,970 bytes and the decrypted file has a length of 81,984 bytes... so 14 bytes are added for some reason. The problem could be in the way the file gets encrypted, but I don't know what I'm doing wrong.
What am I missing here? Could it be the way I'm processing the password, the IV and the padding?
Thanks for your time!
This is the code I use to encrypt:
private AesManaged aesManaged;
private string filePathToEncrypt;

public Encrypt(AesManaged aesManaged, string filePathToEncrypt)
{
    this.aesManaged = aesManaged;
    this.filePathToEncrypt = filePathToEncrypt;
}

public void DoEncryption()
{
    byte[] cipherTextBytes;
    byte[] textBytes = File.ReadAllBytes(this.filePathToEncrypt);
    using (ICryptoTransform encryptor = aesManaged.CreateEncryptor(aesManaged.Key, aesManaged.IV))
    using (MemoryStream ms = new MemoryStream())
    using (CryptoStream cs = new CryptoStream(ms, encryptor, CryptoStreamMode.Write))
    {
        cs.Write(textBytes, 0, textBytes.Length);
        cs.FlushFinalBlock();
        cipherTextBytes = ms.ToArray();
    }
    File.WriteAllBytes("EncryptedFile.aes", cipherTextBytes);
}
This is the code I use to decrypt:
private AesManaged aesManaged;
private string filePathToDecrypt;

public Decrypt(AesManaged aesManaged, string filePathToDecrypt)
{
    this.aesManaged = aesManaged;
    this.filePathToDecrypt = filePathToDecrypt;
}

public void DoDecrypt()
{
    byte[] cypherBytes = File.ReadAllBytes(this.filePathToDecrypt);
    byte[] clearBytes = new byte[cypherBytes.Length];
    ICryptoTransform encryptor = aesManaged.CreateDecryptor(aesManaged.Key, aesManaged.IV);
    using (MemoryStream ms = new MemoryStream(cypherBytes))
    using (CryptoStream cs = new CryptoStream(ms, encryptor, CryptoStreamMode.Read))
    {
        cs.Read(clearBytes, 0, clearBytes.Length);
        clearBytes = ms.ToArray();
    }
    File.WriteAllBytes("DecryptedFile.gif", clearBytes);
}
And here is how I call the functions:
string filePathToEncrypt = "dilbert.gif";
string filePathToDecrypt = "EncryptedFile.aes";
string password = "Password";
string passwordSalt = "PasswordSalt";
Rfc2898DeriveBytes deriveBytes = new Rfc2898DeriveBytes(password, Encoding.ASCII.GetBytes(passwordSalt));
var aesManaged = new AesManaged
{
    Key = deriveBytes.GetBytes(128 / 8),
    IV = deriveBytes.GetBytes(16),
    Padding = PaddingMode.PKCS7
};
Console.WriteLine("Encrypting File...");
var encryptor = new Encrypt(aesManaged, filePathToEncrypt);
encryptor.DoEncryption();
Thread.Sleep(300);
Console.WriteLine("Decrypting File...");
var decryptor = new Decrypt(aesManaged, filePathToDecrypt);
decryptor.DoDecrypt();
Thread.Sleep(300);
Try with:
public void DoEncryption()
{
    byte[] cipherBytes;
    byte[] textBytes = File.ReadAllBytes(this.filePathToEncrypt);
    using (ICryptoTransform encryptor = aesManaged.CreateEncryptor(aesManaged.Key, aesManaged.IV))
    using (MemoryStream input = new MemoryStream(textBytes))
    using (MemoryStream output = new MemoryStream())
    using (CryptoStream cs = new CryptoStream(output, encryptor, CryptoStreamMode.Write))
    {
        input.CopyTo(cs);
        cs.FlushFinalBlock();
        cipherBytes = output.ToArray();
    }
    File.WriteAllBytes("EncryptedFile.aes", cipherBytes);
}
and
public void DoDecrypt()
{
    byte[] cypherBytes = File.ReadAllBytes(this.filePathToDecrypt);
    byte[] textBytes;
    using (ICryptoTransform decryptor = aesManaged.CreateDecryptor(aesManaged.Key, aesManaged.IV))
    using (MemoryStream input = new MemoryStream(cypherBytes))
    using (MemoryStream output = new MemoryStream())
    using (CryptoStream cs = new CryptoStream(input, decryptor, CryptoStreamMode.Read))
    {
        cs.CopyTo(output);
        textBytes = output.ToArray();
    }
    File.WriteAllBytes("DecryptedFile.gif", textBytes);
}
Note that the code could be modified to avoid the temporary byte[] arrays and read/write directly to the input/output streams.
In general you can't infer the length of the plaintext from the length of the ciphertext, so this line:
new byte[cypherBytes.Length]
was totally wrong.
And please, don't use Encoding.ASCII in 2016. It is so last century. Use Encoding.UTF8 to support non-English characters.
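For the code above that is a one-line change (sketch):
// UTF-8 round-trips non-ASCII characters losslessly; ASCII silently mangles them.
Rfc2898DeriveBytes deriveBytes = new Rfc2898DeriveBytes(password, Encoding.UTF8.GetBytes(passwordSalt));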
The answer may be very simple. I don't see where you choose a cipher mode, so by default it probably uses CBC, since an IV was initialized. The 81,970 bytes are then padded with 14 bytes to be divisible by the 16-byte block size. But the buffer you allocated was just 81,970 bytes, so the padding bytes aren't handled correctly, and when decryption runs, the unpadding doesn't work correctly.
I'm trying to read a file, encrypt it, and send it to a server over a socket, where it is written. And then the other way around: read it on the server, send it to the client, decrypt it, and write it again.
My problem using the C# Aes class is that the input size doesn't equal the output size.
For example, when I read 4096 bytes from the file, the output is 4112 bytes, 16 bytes more. OK, so 4112 bytes are sent and written on the server, but when I get the file back, I can only send a maximum of 4096 bytes over the socket, and then, of course, the decrypt function on the client throws an exception that the padding is invalid and cannot be removed. Sure, I could try to read fewer bytes on the client, but that doesn't work either.
I'm a very experienced C++ programmer, and I've done this with OpenSSL, where it worked like a charm: the input size was always the output size. I don't know what is wrong with my functions in C#.
This is the sending part:
byte[] SendData = new byte[4096];
iBytesRead = FileRead.Read (SendData, 0, 4096);
SendData = aes.encrypt (Encoding.Default.GetString (SendData, 0, iBytesRead), iBytesRead);
String a = aes.decrypt (SendData); // no problems here because the size is correct
Socket.sendB (SendData, SendData.Length);
And this is the part that receives from the server:
byte[] WriteData = new byte[4096], Temp;
if ((iBytesReceived = Socket.receiveB (ref WriteData)) == 0)
    break;
if (Encoding.ASCII.GetString (WriteData, 0, iBytesReceived) == "end")
    break;
for (uint i = 0; i < iBytesReceived; i++)
    Temp[i] = WriteData[i];
byte[] a = Encoding.Default.GetBytes (aes.decrypt (Temp));
FileWrite.Write (a, 0, Temp.Length);
Aes functions:
public byte[] encrypt(String _InStr, int _InStrLength)
{
    if (!bKeySet)
        return ErrorReturn;
    byte[] encrypted;
    using (Aes aes = Aes.Create ())
    {
        aes.Key = Key;
        aes.IV = IV;
        //aes.Padding = PaddingMode.PKCS7;
        //aes.BlockSize = 128;
        //aes.KeySize = 128;
        //aes.Mode = CipherMode.CFB;
        ICryptoTransform encryptor = aes.CreateEncryptor(aes.Key, aes.IV);
        // Create the streams used for encryption.
        using (MemoryStream ms = new MemoryStream())
        {
            using (CryptoStream cs = new CryptoStream(ms, encryptor, CryptoStreamMode.Write))
            {
                using (StreamWriter sw = new StreamWriter(cs))
                {
                    sw.Write(_InStr);
                }
            }
            ms.Close ();
            encrypted = ms.ToArray ();
        }
    }
    return encrypted;
}
public String decrypt(byte[] _InStr)
{
    if (!bKeySet)
        return "";
    String plaintext;
    using (Aes aes = Aes.Create ())
    {
        aes.Key = Key;
        aes.IV = IV;
        //aes.Padding = PaddingMode.PKCS7;
        //aes.BlockSize = 128;
        //aes.KeySize = 128;
        //aes.Mode = CipherMode.CBC;
        ICryptoTransform decryptor = aes.CreateDecryptor(aes.Key, aes.IV);
        // Create the streams used for decryption.
        using (MemoryStream msDecrypt = new MemoryStream(_InStr))
        {
            using (CryptoStream csDecrypt = new CryptoStream(msDecrypt, decryptor, CryptoStreamMode.Read))
            {
                using (StreamReader srDecrypt = new StreamReader(csDecrypt))
                {
                    plaintext = srDecrypt.ReadToEnd ();
                }
            }
        }
    }
    return plaintext;
}
As was said, if any padding is used, the output will be aligned to the block size. However, .NET doesn't want to work with incomplete blocks when PaddingMode.None is used. You should pad the data yourself before encryption (and decryption) and remove the added bytes afterwards.
One way to do this is to wrap the ICryptoTransform passed to a CryptoStream.
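If you go the manual route, here is a minimal sketch of PKCS#7-style helpers (assuming Padding = PaddingMode.None on both sides; the method names are illustrative):
// Pad up to the next multiple of blockSize; every pad byte equals the pad length.
static byte[] Pad(byte[] data, int blockSize)
{
    int padLen = blockSize - (data.Length % blockSize); // always 1..blockSize
    byte[] padded = new byte[data.Length + padLen];
    Buffer.BlockCopy(data, 0, padded, 0, data.Length);
    for (int i = data.Length; i < padded.Length; i++)
        padded[i] = (byte)padLen;
    return padded;
}

// After decrypting, the last byte tells you how many bytes to drop.
static byte[] Unpad(byte[] data)
{
    int padLen = data[data.Length - 1];
    byte[] unpadded = new byte[data.Length - padLen];
    Buffer.BlockCopy(data, 0, unpadded, 0, unpadded.Length);
    return unpadded;
}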
I need to decrypt large amounts of data quickly using the method below. Currently it takes about 0.3 ms to run with the ICryptoTransform provided. Can anyone think of a way to optimize it further? The method is called multiple times in succession with different dataToDecrypt values but with the same decryptor.
public byte[] DecryptUsingDecryptor(byte[] dataToDecrypt, ICryptoTransform decryptor)
{
    byte[] decryptedData = null;
    MemoryStream msDecrypt = new MemoryStream();
    CryptoStream csDecrypt = new CryptoStream(msDecrypt, decryptor, CryptoStreamMode.Write);
    csDecrypt.Write(dataToDecrypt, 0, dataToDecrypt.Length);
    csDecrypt.FlushFinalBlock();
    decryptedData = msDecrypt.ToArray();
    csDecrypt.Close();
    return decryptedData;
}
I don't really know whether you would notice any performance improvement, but if you are using the same decryptor, couldn't you just reuse the same msDecrypt and csDecrypt by making them private fields?
public class Decrypter
{
    private MemoryStream msDecrypt;
    private CryptoStream csDecrypt;

    public Decrypter(ICryptoTransform decryptor)
    {
        msDecrypt = new MemoryStream();
        csDecrypt = new CryptoStream(msDecrypt, decryptor, CryptoStreamMode.Write);
    }

    public byte[] DecryptUsingDecryptor(byte[] dataToDecrypt)
    {
        byte[] decryptedData = null;
        csDecrypt.Write(dataToDecrypt, 0, dataToDecrypt.Length);
        csDecrypt.FlushFinalBlock();
        decryptedData = msDecrypt.ToArray();
        csDecrypt.Close();
        return decryptedData;
    }
}
As I said, I don't know if it would make any difference but I think that, at least, you won't be recreating your MemoryStream and CryptoStream every time.
I'm trying to get simple encryption/decryption working with AesManaged, but I keep getting an exception when trying to close the decryption stream. The string here gets encrypted and decrypted correctly, and then I get the CryptographicException "Padding was invalid and cannot be removed" after Console.WriteLine prints the correct string.
Any ideas?
MemoryStream ms = new MemoryStream();
byte[] rawPlaintext = Encoding.Unicode.GetBytes("This is annoying!");
using (Aes aes = new AesManaged())
{
    aes.Padding = PaddingMode.PKCS7;
    aes.Key = new byte[128/8];
    aes.IV = new byte[128/8];
    using (CryptoStream cs = new CryptoStream(ms, aes.CreateEncryptor(), CryptoStreamMode.Write))
    {
        cs.Write(rawPlaintext, 0, rawPlaintext.Length);
        cs.FlushFinalBlock();
    }
    ms = new MemoryStream(ms.GetBuffer());
    using (CryptoStream cs = new CryptoStream(ms, aes.CreateDecryptor(), CryptoStreamMode.Read))
    {
        byte[] rawData = new byte[rawPlaintext.Length];
        int len = cs.Read(rawData, 0, rawPlaintext.Length);
        string s = Encoding.Unicode.GetString(rawData);
        Console.WriteLine(s);
    }
}
The trick is to use MemoryStream.ToArray().
I also changed your code so that it uses the CryptoStream to Write, in both encrypting and decrypting. And you don't need to call CryptoStream.FlushFinalBlock() explicitly, because you have it in a using() statement, and that flush will happen on Dispose(). The following works for me.
byte[] rawPlaintext = System.Text.Encoding.Unicode.GetBytes("This is all clear now!");
using (Aes aes = new AesManaged())
{
    aes.Padding = PaddingMode.PKCS7;
    aes.KeySize = 128;          // in bits
    aes.Key = new byte[128/8];  // 16 bytes for 128 bit encryption
    aes.IV = new byte[128/8];   // AES needs a 16-byte IV
    // Should set Key and IV here. Good approach: derive them from
    // a password via Cryptography.Rfc2898DeriveBytes
    byte[] cipherText = null;
    byte[] plainText = null;
    using (MemoryStream ms = new MemoryStream())
    {
        using (CryptoStream cs = new CryptoStream(ms, aes.CreateEncryptor(), CryptoStreamMode.Write))
        {
            cs.Write(rawPlaintext, 0, rawPlaintext.Length);
        }
        cipherText = ms.ToArray();
    }
    using (MemoryStream ms = new MemoryStream())
    {
        using (CryptoStream cs = new CryptoStream(ms, aes.CreateDecryptor(), CryptoStreamMode.Write))
        {
            cs.Write(cipherText, 0, cipherText.Length);
        }
        plainText = ms.ToArray();
    }
    string s = System.Text.Encoding.Unicode.GetString(plainText);
    Console.WriteLine(s);
}
Also, I guess you know you will want to explicitly set the Mode of the AesManaged instance, and use System.Security.Cryptography.Rfc2898DeriveBytes to derive the Key and IV from a password and salt.
see also:
- AesManaged
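A short sketch of that derivation (the salt literal and iteration count here are illustrative; in practice use a random salt stored alongside the ciphertext):
byte[] salt = Encoding.UTF8.GetBytes("a stored random salt");
using (var kdf = new Rfc2898DeriveBytes("password", salt, 10000))
{
    aes.Key = kdf.GetBytes(16); // 128-bit key
    aes.IV = kdf.GetBytes(16);  // 16-byte IV (better still: a fresh random IV per message)
}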
This exception can be caused by a mismatch of any one of a number of encryption parameters.
I used the Security.Cryptography.Debug interface to trace all parameters used in the encrypt/decrypt methods.
Finally I found out that my problem was that I set the KeySize property after setting the Key, causing the class to regenerate a random key and not use the key that I had initially set up.
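In other words (a minimal sketch of the fix; myKey is illustrative):
// Wrong order: setting KeySize discards the previously assigned key
// and regenerates a random one.
aes.Key = myKey;
aes.KeySize = 128; // myKey is silently lost here

// Right order: set the size first, or skip it entirely, since
// assigning a 16-byte key already implies a 128-bit key size.
aes.KeySize = 128;
aes.Key = myKey;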
For what it's worth, I'll document what I faced: I was trying to read the encryptor's memory stream before the CryptoStream was closed. I was naive, and I wasted a day debugging it.
public static byte[] Encrypt(byte[] buffer, byte[] sessionKey, out byte[] iv)
{
    byte[] encrypted;
    iv = null;
    using (AesCryptoServiceProvider aesAlg = new AesCryptoServiceProvider { Mode = CipherMode.CBC, Padding = PaddingMode.PKCS7 })
    {
        aesAlg.Key = sessionKey;
        iv = aesAlg.IV;
        ICryptoTransform encryptor = aesAlg.CreateEncryptor(sessionKey, iv);
        // Create the streams used for encryption.
        using (MemoryStream msEncrypt = new MemoryStream())
        {
            using (CryptoStream csEncrypt = new CryptoStream(msEncrypt, encryptor, CryptoStreamMode.Write))
            {
                csEncrypt.Write(buffer, 0, buffer.Length);
                //This was not closing the cryptostream and only worked if I called FlushFinalBlock()
                //encrypted = msEncrypt.ToArray();
            }
            encrypted = msEncrypt.ToArray();
            return encrypted;
        }
    }
}
Moving the memory stream read to after the crypto stream was closed solved the problem. As Cheeso mentioned, you don't need to call FlushFinalBlock() if you're using a using block.
byte[] rawData = new byte[rawPlaintext.Length];
You need to read the full length of the buffer, which probably includes the necessary padding (IIRC, it's been a few years).
Nobody has answered that MemoryStream.GetBuffer actually returns the allocated buffer, not the real data in it. In this case it returns a 256-byte buffer, while it contains only 32 bytes of encrypted data.
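A quick illustration of the difference (the 256 here is simply MemoryStream's minimum growth size for a default-constructed stream):
var ms = new MemoryStream();
ms.Write(new byte[32], 0, 32);
byte[] raw = ms.GetBuffer(); // 256 bytes: the whole internal buffer, zero-padded
byte[] data = ms.ToArray();  // 32 bytes: only what was actually written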
As others have mentioned, this error can occur if the key/IV is not correctly initialized for decryption. In my case I needed to copy the key and IV from a larger buffer. Here's what I did wrong:
Does not work ("Padding is invalid and cannot be removed"), because the Key and IV getters return copies, so Buffer.BlockCopy only fills throwaway arrays:
aes.Key = new byte[keySize];
Buffer.BlockCopy(someBuffer, keyOffset, aes.Key, 0, keySize);
aes.IV = new byte[ivSize];
Buffer.BlockCopy(someBuffer, ivOffset, aes.IV, 0, ivSize);
Works:
var key = new byte[keySize];
Buffer.BlockCopy(someBuffer, keyOffset, key, 0, keySize);
aes.Key = key;
var iv = new byte[ivSize];
Buffer.BlockCopy(someBuffer, ivOffset, iv, 0, ivSize);
aes.IV = iv;
The OP did not make this mistake, but this might be helpful for others seeing the same error.