I'm trying to encrypt and decrypt a file stream over a socket using RijndaelManaged, but I keep bumping into the exception
CryptographicException: Length of the data to decrypt is invalid.
at System.Security.Cryptography.RijndaelManagedTransform.TransformFinalBlock(Byte[] inputBuffer, Int32 inputOffset, Int32 inputCount)
at System.Security.Cryptography.CryptoStream.FlushFinalBlock()
at System.Security.Cryptography.CryptoStream.Dispose(Boolean disposing)
The exception is thrown at the end of the using statement in receiveFile, when the whole file has been transferred.
I tried searching the web but only found answers to problems that arise from using an Encoding when encrypting and decrypting a single string. I use a FileStream and don't specify any Encoding, so that shouldn't be the problem. These are my methods:
private void transferFile(FileInfo file, long position, long readBytes)
{
    // transfer on socket stream
    Stream stream = new FileStream(file.FullName, FileMode.Open);
    if (position > 0)
    {
        stream.Seek(position, SeekOrigin.Begin);
    }

    // if this should be encrypted, wrap the encryptor stream
    if (UseCipher)
    {
        stream = new CryptoStream(stream, streamEncryptor, CryptoStreamMode.Read);
    }

    using (stream)
    {
        int read;
        byte[] array = new byte[8096];
        while ((read = stream.Read(array, 0, array.Length)) > 0)
        {
            streamSocket.Send(array, 0, read, SocketFlags.None);
            position += read;
        }
    }
}
private void receiveFile(FileInfo transferFile)
{
    byte[] array = new byte[8096];

    // receive file
    Stream stream = new FileStream(transferFile.FullName, FileMode.Append);
    if (UseCipher)
    {
        stream = new CryptoStream(stream, streamDecryptor, CryptoStreamMode.Write);
    }

    using (stream)
    {
        long position = new FileInfo(transferFile.FullName).Length;
        while (position < transferFile.Length)
        {
            int maxRead = Math.Min(array.Length, (int)(transferFile.Length - position));
            int read = position < array.Length
                ? streamSocket.Receive(array, maxRead, SocketFlags.None)
                : streamSocket.Receive(array, SocketFlags.None);
            stream.Write(array, 0, read);
            position += read;
        }
    }
}
This is the method I use to set up the ciphers. byte[] init is a generated byte array.
private void setupStreamCipher(byte[] init)
{
    RijndaelManaged cipher = new RijndaelManaged();
    cipher.KeySize = cipher.BlockSize = 256; // bit size
    cipher.Mode = CipherMode.ECB;
    cipher.Padding = PaddingMode.ISO10126;

    byte[] keyBytes = new byte[32];
    byte[] ivBytes = new byte[32];
    Array.Copy(init, keyBytes, 32);
    Array.Copy(init, 32, ivBytes, 0, 32);

    streamEncryptor = cipher.CreateEncryptor(keyBytes, ivBytes);
    streamDecryptor = cipher.CreateDecryptor(keyBytes, ivBytes);
}
Anyone have an idea what I might be doing wrong?
It looks to me like you're not properly sending the final block. You need to at least FlushFinalBlock() the sending CryptoStream in order to ensure that the final block (which the receiving stream is looking for) is sent.
By the way, CipherMode.ECB is more than likely an epic fail in terms of security for what you're doing. At least use CipherMode.CBC (cipher-block chaining) which actually uses the IV and makes each block dependent on the previous one.
EDIT: Whoops, the enciphering stream is in read mode. In that case you need to make sure you read to EOF so that the CryptoStream can deal with the final block, rather than stopping after readBytes. It's probably easier to control if you run the enciphering stream in write mode.
One more note: You cannot assume that bytes in equals bytes out. Block ciphers have a fixed block size they process, and unless you are using a cipher mode that converts the block cipher to a stream cipher, there will be padding that makes the ciphertext longer than the plaintext.
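As an illustrative sketch only (the names streamSocket and streamEncryptor are taken from the question, transferFileEncrypted is a made-up name), running the enciphering stream in write mode around a NetworkStream lets FlushFinalBlock (or the CryptoStream's disposal) emit the padded final block:

// Sketch: write-mode enciphering over the socket; not the poster's exact code.
private void transferFileEncrypted(FileInfo file)
{
    using (var network = new NetworkStream(streamSocket)) // does not own the socket
    using (var crypto = new CryptoStream(network, streamEncryptor, CryptoStreamMode.Write))
    using (var source = new FileStream(file.FullName, FileMode.Open, FileAccess.Read))
    {
        byte[] buffer = new byte[8192];
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            crypto.Write(buffer, 0, read);
        }
        crypto.FlushFinalBlock(); // emits the padded final block the receiver is waiting for
    }
}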
After the comment made by Jeffrey Hantin, I changed some lines in receiveFile to
using (stream) {
    FileInfo finfo = new FileInfo(transferFile.FullName);
    long position = finfo.Length;
    while (position < transferFile.Length) {
        int maxRead = Math.Min(array.Length, (int)(transferFile.Length - position));
        int read = position < array.Length
            ? streamSocket.Receive(array, maxRead, SocketFlags.None)
            : streamSocket.Receive(array, SocketFlags.None);
        stream.Write(array, 0, read);
        position += read;
    }
}
->
using (stream) {
    int read = array.Length;
    while ((read = streamSocket.Receive(array, read, SocketFlags.None)) > 0) {
        stream.Write(array, 0, read);
        if ((read = streamSocket.Available) == 0) {
            break;
        }
    }
}
And voila, she works (thanks to the ever so kind padding that I didn't bother about earlier). I'm not sure what happens if Available returns 0 even though all the data hasn't been transferred yet, but I'll deal with that later if it comes up. Thanks for your help Jeffrey!
Regards.
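For reference, a receive loop that doesn't depend on Socket.Available could look roughly like this (sketch only; it assumes the sender shuts down its side of the connection after the final block, since Available only reports data that has already arrived and can legitimately be 0 mid-transfer):

// Sketch: read until the peer shuts down its half of the connection
// (Receive returns 0) instead of polling Socket.Available.
using (stream)
{
    byte[] buffer = new byte[8192];
    int read;
    while ((read = streamSocket.Receive(buffer, buffer.Length, SocketFlags.None)) > 0)
    {
        stream.Write(buffer, 0, read);
    }
}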
cipher.Mode = CipherMode.ECB;
Argh! Rolling your own security code is almost always a bad idea.
In my case I just removed the padding and it works.
I commented this line out: cipher.Padding = PaddingMode.ISO10126;
Here is the method I like to use. I believe there is nothing new in this code.
public static byte[] ReadFully(Stream stream, int initialLength)
{
    // If we've been passed an unhelpful initial length, just
    // use 1K.
    if (initialLength < 1)
    {
        initialLength = 1024;
    }

    byte[] buffer = new byte[initialLength];
    int read = 0;
    int chunk;
    while ((chunk = stream.Read(buffer, read, buffer.Length - read)) > 0)
    {
        read += chunk;

        // If we've reached the end of our buffer, check to see if there's
        // any more information
        if (read == buffer.Length)
        {
            int nextByte = stream.ReadByte();

            // End of stream? If so, we're done
            if (nextByte == -1)
            {
                return buffer;
            }

            // Nope. Resize the buffer, put in the byte we've just
            // read, and continue
            byte[] newBuffer = new byte[buffer.Length * 2];
            Array.Copy(buffer, newBuffer, buffer.Length);
            newBuffer[read] = (byte)nextByte;
            buffer = newBuffer;
            read++;
        }
    }

    // Buffer is now too big. Shrink it.
    byte[] ret = new byte[read];
    Array.Copy(buffer, ret, read);
    return ret;
}
My goal is to read data sent from TCP clients, e.g. box{"id":1,"aid":1}
It is a command for my application to interpret, written as JSON-like text.
This text is not necessarily the same size each time.
The next time it could be run{"id":1,"aid":1,"opt":1}.
The method is called by this line:
var serializedMessageBytes = ReadFully(_receiveMemoryStream, 1024);
(Screenshot: the received data is visible in _receiveMemoryStream in the Watch window.)
Although we can see the data in the stream, in the ReadFully method chunk always returns 0 and the method returns byte[0].
Any help is greatly appreciated.
Looking at your stream in the Watch window, the Position of the stream (19) is at the end of the data, hence there is nothing left to read. This is possibly because you have just written data to the stream and have not subsequently reset the position.
Add a stream.Position = 0; or stream.Seek(0, System.IO.SeekOrigin.Begin); statement at the start of the function if you are happy to always read from the start of the stream, or check the code that populates the stream. Note though that some stream implementations do not support seeking.
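Applied to the line from the question, that would look roughly like this (assuming _receiveMemoryStream is a seekable MemoryStream):

_receiveMemoryStream.Position = 0; // rewind: the write left Position at the end of the data
var serializedMessageBytes = ReadFully(_receiveMemoryStream, 1024);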
I'm trying to call GetResponseStream on a GET request and convert the result into a byte[]. The stream is non-seekable, so I can't access stream.Length, and response.ContentLength is unreliable.
private byte[] streamToBytes(Stream stream, int bufferSize = 4096)
{
    byte[] buffer = new byte[bufferSize];
    int read = 0;
    int pos = 0;
    List<byte> bytes = new List<byte>();

    while (true) // `bufferSize > read` does **not** mean stream end
    {
        while (bufferSize == (read = stream.Read(buffer, pos, bufferSize)))
        {
            pos += read;
            bytes.AddRange(buffer);
        }
        if (read > 0)
        {
            byte[] _buffer = new byte[read];
            Array.Copy(buffer, _buffer, read);
            bytes.AddRange(_buffer);
        }
        else break;
    }
    return bytes.ToArray();
}
An ArgumentOutOfRangeException gets thrown on the Read in the second iteration of the while loop. MSDN says
ArgumentOutOfRangeException offset or count is negative.
I know this can't be true because offset (pos) is 0 + read >= 0 and count (bufferSize) is 4096 so why am I getting exceptions thrown at me?
I'm trying to keep streamToBytes as generic as possible so I can use it in future async methods, too.
If how the request is made helps, here are the relevant bits
HttpWebRequest request = (HttpWebRequest)WebRequest.Create((new Uri("http://google.com")).ToString());
request.Method = "GET";
request.KeepAlive = true;
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Stream stream = response.GetResponseStream();
byte[] responseBytes = streamToBytes(stream);
As a simpler alternative:
using (var memStream = new MemoryStream())
{
    stream.CopyTo(memStream);
    return memStream.ToArray();
}
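Wrapped in the question's method signature, that could look roughly like this (sketch; Stream.CopyTo requires .NET 4 or later):

private byte[] streamToBytes(Stream stream, int bufferSize = 4096)
{
    using (var memStream = new MemoryStream())
    {
        // CopyTo performs the buffered read/write loop internally.
        stream.CopyTo(memStream, bufferSize);
        return memStream.ToArray();
    }
}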
Your code has two mistakes:
The offset on Read is an offset into the buffer, not the position in the stream (as #Ulugbek already noted). This means you don't need the pos variable anymore.
You cannot assume that bufferSize == read even before the stream reached its end. You need to check for > 0 instead.
So your reading loop becomes:
while ((read = stream.Read(buffer, 0, bufferSize)) > 0)
{
    bytes.AddRange(buffer.Take(read));
}
You can now drop the special handling for the last block, simplifying your code to:
private byte[] streamToBytes(Stream stream, int bufferSize = 4096)
{
    byte[] buffer = new byte[bufferSize];
    int read = 0;
    List<byte> bytes = new List<byte>();
    while ((read = stream.Read(buffer, 0, bufferSize)) > 0)
    {
        bytes.AddRange(buffer.Take(read));
    }
    return bytes.ToArray();
}
Using List<byte> instead of MemoryStream isn't a great idea either. AddRange needs to iterate over the bytes individually instead of using a low-level copy operation. Replacing the list with a memory stream, the code becomes:
private byte[] streamToBytes(Stream stream, int bufferSize = 4096)
{
    byte[] buffer = new byte[bufferSize];
    int read = 0;
    using (var bytes = new MemoryStream())
    {
        while ((read = stream.Read(buffer, 0, bufferSize)) > 0)
        {
            bytes.Write(buffer, 0, read);
        }
        return bytes.ToArray();
    }
}
You could even split it into two parts: one does the copying into the memory stream, the other handles creating the memory stream and turning it into a byte array:
private static void CopyStream(Stream source, Stream destination, int bufferSize = 4096)
{
    byte[] buffer = new byte[bufferSize];
    int read = 0;
    while ((read = source.Read(buffer, 0, bufferSize)) > 0)
    {
        destination.Write(buffer, 0, read);
    }
}

private static byte[] StreamToBytes(Stream stream, int bufferSize = 4096)
{
    using (var memStream = new MemoryStream())
    {
        CopyStream(stream, memStream, bufferSize);
        return memStream.ToArray();
    }
}
This is probably very similar to what Stream.CopyTo does internally.
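For example, used against the response stream from the question, it might look like this (sketch):

HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://google.com");
request.Method = "GET";

using (var response = (HttpWebResponse)request.GetResponse())
using (Stream stream = response.GetResponseStream())
{
    byte[] responseBytes = StreamToBytes(stream);
}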
Change
stream.Read(buffer, pos, bufferSize)
to
stream.Read(buffer, 0, bufferSize)
I am doing some data chunking and I'm seeing an interesting issue when sending binary data in my response. I can confirm that the length of the byte array is below my data limit of 4 megabytes, but when I receive the message, its total size is over 4 megabytes.
For the example below, I used the largest chunk size I could so I could illustrate the issue while still receiving a usable chunk.
The size of the binary data is 3,040,870 bytes on both the service side and the client (once the message is deserialized). However, I can also confirm that the byte array itself is under 4 megabytes (verified by copying the binary data out of the message and pasting it into a text file).
So, is WCF causing these issues and, if so, is there anything I can do to prevent it? If not, what might be causing this inflation on my side?
Thanks!
The usual way of sending byte[]s in SOAP messages is to base64-encode the data. This encoding takes 33% more space than binary encoding, which accounts for the size difference almost precisely (3,040,870 bytes × 4/3 ≈ 4,054,000 bytes).
You could adjust the max size or chunk size slightly so that the end result is within the right range, or use another encoding, e.g. MTOM, to eliminate this 33% overhead.
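For example, switching a basicHttpBinding to MTOM in code might look roughly like this (sketch; the equivalent config attribute is messageEncoding="Mtom"):

// Sketch: MTOM transmits the byte[] as a raw binary MIME part instead of base64 text.
var binding = new BasicHttpBinding
{
    MessageEncoding = WSMessageEncoding.Mtom,
    MaxReceivedMessageSize = 4 * 1024 * 1024 // the 4 MB limit from the question
};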
If you're stuck with SOAP, you can offset the encoding overhead Tim S. talked about by using the System.IO.Compression library in .NET: use the compress function first, before building and sending the SOAP message.
You'd compress with this:
public static byte[] Compress(byte[] data)
{
    MemoryStream ms = new MemoryStream();
    DeflateStream ds = new DeflateStream(ms, CompressionMode.Compress);
    ds.Write(data, 0, data.Length);
    ds.Flush();
    ds.Close();
    return ms.ToArray();
}
On the receiving end, you'd use this to decompress:
public static byte[] Decompress(byte[] data)
{
    const int BUFFER_SIZE = 256;
    byte[] tempArray = new byte[BUFFER_SIZE];
    List<byte[]> tempList = new List<byte[]>();
    int count = 0;
    int length = 0;

    MemoryStream ms = new MemoryStream(data);
    DeflateStream ds = new DeflateStream(ms, CompressionMode.Decompress);
    while ((count = ds.Read(tempArray, 0, BUFFER_SIZE)) > 0)
    {
        if (count == BUFFER_SIZE)
        {
            tempList.Add(tempArray);
            tempArray = new byte[BUFFER_SIZE];
        }
        else
        {
            byte[] temp = new byte[count];
            Array.Copy(tempArray, 0, temp, 0, count);
            tempList.Add(temp);
        }
        length += count;
    }

    byte[] retVal = new byte[length];
    count = 0;
    foreach (byte[] temp in tempList)
    {
        Array.Copy(temp, 0, retVal, count, temp.Length);
        count += temp.Length;
    }
    return retVal;
}
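On .NET 4 or later, a shorter equivalent decompressor is possible (sketch), letting CopyTo do the buffering:

public static byte[] Decompress(byte[] data)
{
    using (var input = new MemoryStream(data))
    using (var ds = new DeflateStream(input, CompressionMode.Decompress))
    using (var output = new MemoryStream())
    {
        ds.CopyTo(output); // buffered read loop handled internally
        return output.ToArray();
    }
}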
I'm trying to read a file on the server (in blocks of 5 KB), encrypt each block using AES and send it to the client. On the client, I decrypt the received block and append it to a file to get back the original file.
However, the decrypted block size received on the client differs from the plaintext block that was encrypted on the server.
e.g.
I have a 15.5 KB exe file, so I have 15.5 × 1024 / (5 × 1024) ≈ 4 blocks (rounded up) to encrypt and send to the client (the first 3 blocks are 5120 bytes each and the last block is 512 bytes long). On the client, however, the decrypted blocks are 5057, 4970, 5016 and 512 bytes, which adds up to a file size of 15.1 KB (less than what was actually sent by the server).
Here is my code snippet:
Server (sends the file to client):
FileStream fs = new FileStream("lcd.exe", FileMode.Open, FileAccess.Read);

//block size = 5KB
int blockSize = 5 * 1024;

//calculate number of blocks in data
long numberOfBlocks = fs.Length / blockSize;
if (fs.Length % blockSize != 0) numberOfBlocks++;
byte[] numberOfBlocksBytes = BitConverter.GetBytes(numberOfBlocks);

//send number of blocks to client
SendMessage(sw, numberOfBlocksBytes);

int count = 0, offset = 0, numberOfBytesToRead = 0;
Aes objAes = new Aes();
while (count < numberOfBlocks)
{
    byte[] buffer;
    numberOfBytesToRead = blockSize;
    if (fs.Length < offset + blockSize)
    {
        numberOfBytesToRead = (int)(fs.Length - offset);
    }
    buffer = new byte[numberOfBytesToRead];
    fs.Read(buffer, 0, numberOfBytesToRead);

    //encrypt before sending
    byte[] encryptedBuffer = objAes.Encrypt(buffer, Encoding.Default.GetBytes(sessionKey), initVector);
    SendMessage(sw, encryptedBuffer);
    offset += numberOfBytesToRead;
    count++;
}
fs.Close();
Client side code which receives the file:
byte[] numberOfBlocksBytes = ReadMessage(sr);
long numberOfBlocks = BitConverter.ToInt64(numberOfBlocksBytes, 0);

FileStream fs = new FileStream("lcd.exe", FileMode.Append, FileAccess.Write);

//block size = 5KB
int blockSize = 5 * 1024;
Aes objAes = new Aes();

int count = 0, offset = 0;
while (count < numberOfBlocks)
{
    byte[] encryptedBuffer = ReadMessage(sr);
    byte[] buffer = objAes.Decrypt(encryptedBuffer, sessionKey, initVector);
    fs.Write(buffer, 0, buffer.Length);
    offset += buffer.Length;
    count++;
}
fs.Close();
My AES code for encryption:
private const int StandardKeyLength = 16;

public byte[] Encrypt(byte[] plainText, byte[] key, byte[] initVector)
{
    if (key.Length != StandardKeyLength | initVector.Length != StandardKeyLength)
    {
        throw new ArgumentException("Key Length and Init Vector should be 16 bytes (128 bits) in size");
    }

    var bPlainBytes = plainText;
    var objRm = new RijndaelManaged();
    objRm.Key = key;
    objRm.IV = initVector;
    objRm.Padding = PaddingMode.PKCS7;
    objRm.BlockSize = 128;

    var ict = objRm.CreateEncryptor(objRm.Key, objRm.IV);
    var objMs = new MemoryStream();
    var objCs = new CryptoStream(objMs, ict, CryptoStreamMode.Write);
    objCs.Write(bPlainBytes, 0, bPlainBytes.Length);
    objCs.FlushFinalBlock();
    var bEncrypted = objMs.ToArray();
    return bEncrypted;
}
My AES code for decryption:
public byte[] Decrypt(byte[] cipherText, byte[] key, byte[] initVector)
{
    if (key.Length != StandardKeyLength | initVector.Length != StandardKeyLength)
    {
        throw new ArgumentException("Key Length and Init Vector should be 16 bytes (128 bits) in size");
    }

    var bCipherBytes = cipherText;
    var objRm = new RijndaelManaged();
    objRm.Key = key;
    objRm.IV = initVector;
    objRm.Padding = PaddingMode.PKCS7;
    objRm.BlockSize = 128;

    var ict = objRm.CreateDecryptor(objRm.Key, objRm.IV);
    var objMs = new MemoryStream(bCipherBytes);
    var objCs = new CryptoStream(objMs, ict, CryptoStreamMode.Read);
    var streamobj = new StreamReader(objCs);
    var strDecrypted = streamobj.ReadToEnd();
    return (Encoding.Default.GetBytes(strDecrypted));
}
These are the results I got while debugging the while loop that sends file blocks on the server:
Actual file size sent: 15.5 KB = 15872 bytes

Buffer size (plaintext)   Encrypted buffer size (sent)   Offset   Count
5120                      5136                            5120    0
5120                      5136                           10240    1
5120                      5136                           15360    2
512                       528                            15872    3
These are the results I got while debugging the while loop that receives file blocks on the client:
Actual file size received: 15.1 KB = 15555 bytes

Received buffer size   Decrypted buffer size   Offset   Count
5136                   5057                     5057    0
5136                   4970                    10027    1
5136                   5016                    15043    2
528                    512                     15555    3
It is evident that the sending and receiving code is working fine (the encrypted buffer size sent equals the received buffer size). However, the decrypted buffer size does not match the plaintext buffer size at all, except for the last block, which is 512 bytes long.
What could be wrong with the decryption that prevents me from receiving the complete file on the client side?
You're being tripped up because in your Decrypt statement you are treating your ciphertext as if it is a string. Specifically, these lines:
var streamobj = new StreamReader(objCs);
var strDecrypted = streamobj.ReadToEnd();
return (Encoding.Default.GetBytes(strDecrypted));
Instead you want to be calling Read on your CryptoStream to read a raw byte array into a buffer. You can then return that buffer without attempting to coerce it into a string (which is what is happening by using the stream reader).
You should use something more like:
public byte[] Decrypt(byte[] cipherText, byte[] key, byte[] initVector)
{
    if (key.Length != StandardKeyLength | initVector.Length != StandardKeyLength)
    {
        throw new ArgumentException("Key Length and Init Vector should be 16 bytes (128 bits) in size");
    }

    var bCipherBytes = cipherText;
    var objRm = new RijndaelManaged();
    objRm.Key = key;
    objRm.IV = initVector;
    objRm.Padding = PaddingMode.PKCS7;
    objRm.BlockSize = 128;

    var ict = objRm.CreateDecryptor(objRm.Key, objRm.IV);
    var objMs = new MemoryStream(bCipherBytes);
    var objCs = new CryptoStream(objMs, ict, CryptoStreamMode.Read);

    var buffer = new byte[cipherText.Length];
    int readBytes = objCs.Read(buffer, 0, cipherText.Length);

    var trimmedData = new byte[readBytes];
    Array.Copy(buffer, trimmedData, readBytes);
    return trimmedData;
}
I would also suggest you take a look at the encryption utilities I maintain on Snipt, specifically the symmetric Encrypt and Decrypt methods. Your code as it stands is missing a lot of using blocks and has a number of potential resource leaks.
var streamobj = new StreamReader(objCs);
That's pretty unlikely to work well. The StreamReader will assume that the decrypted data is UTF-8 encoded text. There is no hint whatsoever from the code that encrypts the data that this is actually the case; it takes a byte[].
Use a FileStream instead so that no conversion is made at all. That also helps you avoid the Encoding.Default.GetBytes() data randomizer.
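A rough sketch of that suggestion, using assumed names (keyBytes for the 16-byte session key, encryptedBuffer for one received block; using on RijndaelManaged needs .NET 4+): each received block is decrypted straight into the output FileStream through a write-mode CryptoStream.

// Sketch only: no StreamReader, no Encoding round-trip.
using (var fs = new FileStream("lcd.exe", FileMode.Append, FileAccess.Write))
using (var objRm = new RijndaelManaged { Key = keyBytes, IV = initVector, Padding = PaddingMode.PKCS7 })
using (var decryptor = objRm.CreateDecryptor())
using (var cs = new CryptoStream(fs, decryptor, CryptoStreamMode.Write))
{
    cs.Write(encryptedBuffer, 0, encryptedBuffer.Length);
    cs.FlushFinalBlock(); // each block was encrypted (and padded) separately
}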
Quick observation, which may just be my ignorance: the Encrypt() method uses the default encoding to get the session key bytes. On the receiving end, the Decrypt() method passes the sessionKey itself as the second parameter, i.e. without converting it to bytes?
I have a client-server application where the server transmits a 4-byte integer specifying how large the next transmission is going to be. When I read the 4-byte integer on the client side (specifying FILE_SIZE), the next time I read the stream I get FILE_SIZE + 4 bytes read.
Do I need to specify the offset to 4 when reading from this stream, or is there a way to automatically advance the NetworkStream so my offset can always be 0?
SERVER
NetworkStream theStream = theClient.GetStream();

//...
//Calculate file size with FileInfo and put into byte[] size
//...

theStream.Write(size, 0, size.Length);
theStream.Flush();
CLIENT
NetworkStream theStream = theClient.GetStream();

//read size
byte[] size = new byte[4];
int bytesRead = theStream.Read(size, 0, 4);
...

//read content
byte[] content = new byte[4096];
bytesRead = theStream.Read(content, 0, 4096);
Console.WriteLine(bytesRead); // <-- Prints filesize + 4
Right; found it; FileInfo.Length is a long; your call to:
binWrite.Write(fileInfo.Length);
writes 8 bytes, little-endian. You then read that back via:
filesize = binRead.ReadInt32();
which, read little-endian, will give you the same value (for values that fit in 32 bits, at least). You have four 0x00 bytes left unused in the stream, though (from the high bytes of the long) - hence the 4-byte mismatch.
Use one of:
binWrite.Write((int)fileInfo.Length);
filesize = binRead.ReadInt64();
NetworkStream certainly advances, but in both cases, your read is unreliable; a classic "read known amount of content" would be:
static void ReadAll(Stream source, byte[] buffer, int bytes) {
    if (bytes > buffer.Length) throw new ArgumentOutOfRangeException("bytes");
    int bytesRead, offset = 0;
    while (bytes > 0 && (bytesRead = source.Read(buffer, offset, bytes)) > 0) {
        offset += bytesRead;
        bytes -= bytesRead;
    }
    if (bytes != 0) throw new EndOfStreamException();
}
with:
ReadAll(theStream, size, 4);
...
ReadAll(theStream, content, contentLength);
note also that you need to be careful with endianness when parsing the length-prefix.
I suspect you simply aren't reading the complete data.
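On the endianness note above, one way to make it explicit is to put the prefix handling in small helpers (hypothetical sketch, not from the original code; needs System.Net for IPAddress):

// Sketch: 4-byte, big-endian (network order) length prefix.
static void WriteLengthPrefix(Stream stream, int length) {
    byte[] prefix = BitConverter.GetBytes(IPAddress.HostToNetworkOrder(length));
    stream.Write(prefix, 0, 4);
}

static int ReadLengthPrefix(Stream stream) {
    byte[] prefix = new byte[4];
    ReadAll(stream, prefix, 4); // ReadAll from above
    return IPAddress.NetworkToHostOrder(BitConverter.ToInt32(prefix, 0));
}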