I have a file which has been encrypted by des.exe.
A file can be encrypted and decrypted using the following commands:
des -E -k "foo" sample.txt sample.txt.enc
des -D -k "foo" sample.txt.enc sample.txt.dec
I have attempted to decrypt using the following:
public byte[] Decrypt(FileInfo file, string key)
{
byte[] keyAsBytes = LibDesPasswordConvertor.PasswordToKey(key);
byte[] initializationVector = keyAsBytes;
var cryptoProvider = new DESCryptoServiceProvider();
cryptoProvider.Mode = CipherMode.CBC;
cryptoProvider.Padding = PaddingMode.None;
using (FileStream fs = file.OpenRead())
using (var memStream = new MemoryStream())
using (var decryptor = cryptoProvider.CreateDecryptor(keyAsBytes, initializationVector))
using (var cryptoStream = new CryptoStream(memStream, decryptor, CryptoStreamMode.Write))
{
fs.CopyTo(cryptoStream);
fs.Flush();
cryptoStream.FlushFinalBlock();
return memStream.ToArray();
}
}
public static class LibDesPasswordConvertor
{
public static byte[] PasswordToKey(string password)
{
if (string.IsNullOrWhiteSpace(password))
{
throw new ArgumentException("password");
}
var key = new byte[8];
for (int i = 0; i < password.Length; i++)
{
var c = (int)password[i];
if ((i % 16) < 8)
{
key[i % 8] ^= (byte)(c << 1);
}
else
{
// reverse bits e.g. 11010010 -> 01001011
c = (((c << 4) & 0xf0) | ((c >> 4) & 0x0f));
c = (((c << 2) & 0xcc) | ((c >> 2) & 0x33));
c = (((c << 1) & 0xaa) | ((c >> 1) & 0x55));
key[7 - (i % 8)] ^= (byte)c;
}
}
AddOddParity(key);
var target = new byte[8];
var passwordBuffer = Encoding.ASCII.GetBytes(password).Concat(new byte[8]).Take(password.Length + (8 - (password.Length % 8)) % 8).ToArray();
using(var des = DES.Create())
using(var encryptor = des.CreateEncryptor(key, key))
{
for (int x = 0; x < passwordBuffer.Length / 8; ++x)
{
encryptor.TransformBlock(passwordBuffer, 8 * x, 8, target, 0);
}
}
AddOddParity(target);
return target;
}
private static void AddOddParity(byte[] buffer)
{
for (int i = 0; i < buffer.Length; ++i)
{
buffer[i] = _oddParityTable[buffer[i]];
}
}
private static byte[] _oddParityTable = {
1, 1, 2, 2, 4, 4, 7, 7, 8, 8, 11, 11, 13, 13, 14, 14,
16, 16, 19, 19, 21, 21, 22, 22, 25, 25, 26, 26, 28, 28, 31, 31,
32, 32, 35, 35, 37, 37, 38, 38, 41, 41, 42, 42, 44, 44, 47, 47,
49, 49, 50, 50, 52, 52, 55, 55, 56, 56, 59, 59, 61, 61, 62, 62,
64, 64, 67, 67, 69, 69, 70, 70, 73, 73, 74, 74, 76, 76, 79, 79,
81, 81, 82, 82, 84, 84, 87, 87, 88, 88, 91, 91, 93, 93, 94, 94,
97, 97, 98, 98,100,100,103,103,104,104,107,107,109,109,110,110,
112,112,115,115,117,117,118,118,121,121,122,122,124,124,127,127,
128,128,131,131,133,133,134,134,137,137,138,138,140,140,143,143,
145,145,146,146,148,148,151,151,152,152,155,155,157,157,158,158,
161,161,162,162,164,164,167,167,168,168,171,171,173,173,174,174,
176,176,179,179,181,181,182,182,185,185,186,186,188,188,191,191,
193,193,194,194,196,196,199,199,200,200,203,203,205,205,206,206,
208,208,211,211,213,213,214,214,217,217,218,218,220,220,223,223,
224,224,227,227,229,229,230,230,233,233,234,234,236,236,239,239,
241,241,242,242,244,244,247,247,248,248,251,251,253,253,254,254};
}
But when I execute:
const string KEY = "foo";
var utf8Bytes = Decrypt(new FileInfo(@"PATH-TO\sample.txt.enc"), KEY);
I get:
�1D���z+�a Sample.y���0F�01
Original text:
This is a Sample.
Encrypted:
ñGYjl¦ûg†¼64©‹Bø
é¯Kœ|
To my surprise, you've already derived the key correctly. That was the meat of the problem, so kudos for solving that part already. That the key is correct becomes clear when you see that part of the plaintext is present in the decryption output; it wouldn't be if the key were wrong.
Looking into the source and some docs from times past, the likely IV is all zeros rather than a reuse of the key bytes (both choices are very wrong, in cryptographic terms).
Furthermore, as always for SSLeay, the ECB and CBC modes use PKCS#7-compatible padding, rather than no padding.
Finally, FlushFinalBlock is called automatically when the stream is closed, e.g. by leaving the using block. So if you retrieve the array afterwards you should get the right values - after you unpad correctly, of course. If you also call FlushFinalBlock explicitly, it ends up being invoked twice, which makes a mess of things.
Simply removing the flush calls and retrieving the array after the CryptoStream is closed is the way to go.
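As an illustration only, here is a minimal sketch of the decryption with those changes applied (all-zero IV, PKCS#7 padding, no explicit flush calls), reusing the PasswordToKey helper from the question; it needs System.IO and System.Security.Cryptography:
public byte[] Decrypt(FileInfo file, string key)
{
    byte[] keyAsBytes = LibDesPasswordConvertor.PasswordToKey(key);
    byte[] initializationVector = new byte[8]; // all-zero IV instead of reusing the key bytes
    using (var cryptoProvider = new DESCryptoServiceProvider())
    {
        cryptoProvider.Mode = CipherMode.CBC;
        cryptoProvider.Padding = PaddingMode.PKCS7; // SSLeay-compatible padding, not PaddingMode.None
        using (var memStream = new MemoryStream())
        {
            using (FileStream fs = file.OpenRead())
            using (var decryptor = cryptoProvider.CreateDecryptor(keyAsBytes, initializationVector))
            using (var cryptoStream = new CryptoStream(memStream, decryptor, CryptoStreamMode.Write))
            {
                // No Flush or FlushFinalBlock here: disposing the CryptoStream flushes the final block.
                fs.CopyTo(cryptoStream);
            }
            // ToArray still works on the (now closed) MemoryStream.
            return memStream.ToArray();
        }
    }
}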
Both DES and the key derivation (des_string_to_key and des_string_to_2keys) that Young copied from MIT are completely insecure, and using an all-zero IV is wrong on top of that.
If you use this for transport then padding oracle attacks apply, and an attacker doesn't even need the key to recover the plaintext. The ciphertext is not integrity-protected.
If you use the above routines to keep anything confidential or secure, you're fooling yourself. This is 1980s technology, and I don't think real cryptographers would have considered it secure back then either.
Basically if your attacker is over 8 years old, you're in trouble.
I'm trying to encrypt a JPEG on the server and decrypt it in the browser, as outlined below, but I'm failing at step 3.
Encrypt the JPEG with C# on the server
Get the encrypted data, vector, and passphrase in the client browser
Decrypt in the client browser
I tried the things below, but nothing helped.
use RijndaelManaged instead of AesManaged on the server.
use 'CryptoJS.enc.Utf16.parse' instead of 'CryptoJS.enc.Utf8.parse' on the client.
Encryption on Server
public byte[] Encrypt(byte[] bytes, string password, string vector)
{
AesManaged aes = new AesManaged();
aes.KeySize = _keySize;
aes.BlockSize = _blockSize;
aes.Mode = CipherMode.CBC;
aes.IV = Encoding.UTF8.GetBytes(vector);
aes.Key = Encoding.UTF8.GetBytes(password);
aes.Padding = PaddingMode.PKCS7;
byte[] encrypted = aes.CreateEncryptor().TransformFinalBlock(bytes, 0, bytes.Length);
return encrypted;
}
Decryption on Client
// These values are the same as above
var encrypted = ... // byte[]
var vector = ... // string
var password = ... // string
var cipherParams = CryptoJS.lib.CipherParams.create({
iv: CryptoJS.enc.Utf8.parse(vector),
mode: CryptoJS.mode.CBC,
padding: CryptoJS.pad.Pkcs7
});
var cipherText = CryptoJS.lib.WordArray.create(encrypted);
var passwordWordArray = CryptoJS.enc.Utf8.parse(password);
var decrypted = CryptoJS.AES.decrypt(cipherText, passwordWordArray, cipherParams);
// decrypted.words is empty here
Please advise.
I'm using .NET Core MVC 2.1, CryptoJS 3.1.9-1, and Chrome 74.0.3729.169 on Windows 10.
THIS PROBLEM IS RESOLVED
As Topaco mentioned, there was a flaw in the JavaScript. The corrected code is below.
var encrypted = ... // byte[]
var vector = ... // string
var password = ... // string
var cipherParams = CryptoJS.lib.CipherParams.create({
iv: CryptoJS.enc.Utf8.parse(vector),
mode: CryptoJS.mode.CBC,
padding: CryptoJS.pad.Pkcs7
});
var cipherText = CryptoJS.lib.WordArray.create(encrypted);
var cipherTextParam = CryptoJS.lib.CipherParams.create({
ciphertext: cipherText
});
var passwordWordArray = CryptoJS.enc.Utf8.parse(password);
var decrypted = CryptoJS.AES.decrypt(cipherTextParam, passwordWordArray, cipherParams);
Thank you for your help.
There are two flaws in the JavaScript code; they can be corrected as follows:
In the JavaScript code, the line:
var cipherText = CryptoJS.lib.WordArray.create(encrypted);
must be replaced by:
var cipherText = byteArrayToWordArray(encrypted);
Here, the function byteArrayToWordArray is used:
function byteArrayToWordArray(ba) {
var wa = [], i;
// JavaScript masks shift counts with & 31, so (24 - 8 * i) wraps to 24, 16, 8, 0 within each 4-byte group.
for (i = 0; i < ba.length; i++)
wa[(i / 4) | 0] |= ba[i] << (24 - 8 * i);
return CryptoJS.lib.WordArray.create(wa, ba.length);
}
This function generates a word array from the byte array by packing 4 bytes into each 32-bit word. In the old code, a separate word with that byte's value was generated for every single byte, i.e. both arrays had the same number of elements, which is wrong.
Alternatively:
var cipherTextHex = bytesToHex(encrypted);
var cipherText = CryptoJS.enc.Hex.parse(cipherTextHex);
can also be used. Here, the function bytesToHex is used:
function bytesToHex(bytes) {
for (var hex = [], i = 0; i < bytes.length; i++) {
var current = bytes[i] < 0 ? bytes[i] + 256 : bytes[i];
hex.push((current >>> 4).toString(16));
hex.push((current & 0xF).toString(16));
}
return hex.join("");
}
The function generates a hex string from the byte array, from which a word array is derived using the Hex encoder.
In the JavaScript code, the line:
var decrypted = CryptoJS.AES.decrypt(cipherText, passwordWordArray, cipherParams);
must be replaced by:
var cipherParamsCipherText = CryptoJS.lib.CipherParams.create({
ciphertext: cipherText
});
var decrypted = CryptoJS.AES.decrypt(cipherParamsCipherText, passwordWordArray, cipherParams);
since the decrypt function expects a CipherParams object as the first argument rather than a WordArray.
Alternatively, a Base64-encoded string can also be passed:
var cipherTextB64Enc = CryptoJS.enc.Base64.stringify(cipherText);
var decrypted = CryptoJS.AES.decrypt(cipherTextB64Enc, passwordWordArray, cipherParams);
Test: For the following input to the C# code:
byte[] bytes = Encoding.UTF8.GetBytes("The quick brown fox jumps over the lazy dog");
string password = "0123456789012345"; // 16 byte -> AES-128
string vector = "5432109876543210"; // 16 byte
the resulting ciphertext is the following byte array:
170, 27, 161, 209, 42, 247, 234, 191, 38, 167, 22, 74, 34, 139, 115, 0, 75, 207, 119, 161, 97, 142, 179, 93, 41, 12, 177, 128, 52, 151, 75, 231, 76, 157, 14, 197, 59, 111, 63, 206, 136, 218, 189, 244, 116, 43, 25, 20
If the modified JavaScript code is tested with this data:
var encrypted = [170,27,161,209,42,247,234,191,38,167,22,74,34,139,115,0,75,207,119,161,97,142,179,93,41,12,177,128,52,151,75,231,76,157,14,197,59,111,63,206,136,218,189,244,116,43,25,20]; // byte[]
var vector = "5432109876543210" // string
var password = "0123456789012345"; // string
it is decrypted correctly.
AES-128 was used in the test. It can be switched to AES-256 simply by using a 32-byte key instead of a 16-byte key.
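For illustration only (the 32-character passphrase below is hypothetical, not part of the test above), assigning a 32-byte value to aes.Key on the C# side selects AES-256, and CryptoJS likewise infers AES-256 from a 32-byte passwordWordArray:
// Hypothetical 32-byte passphrase, shown only to illustrate the key-size switch.
string password = "01234567890123456789012345678901"; // 32 bytes -> AES-256
string vector = "5432109876543210";                   // the IV stays 16 bytes (one AES block)
byte[] encrypted = Encrypt(bytes, password, vector);   // Encrypt() from the question, assuming it accepts the longer key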
My C# 3DES encryption does not match the third-party API I'm using. Is there anything wrong with my code?
static void Main(string[] args)
{
String sharedSec = "654A7EA2C9914A0B972937F4EA45FED3";
byte[] byteArraySharedSec = Convert.FromBase64String(sharedSec);
TripleDESCryptoServiceProvider tdes = new TripleDESCryptoServiceProvider();
tdes.KeySize = 192;
tdes.Key = byteArraySharedSec;
tdes.Mode = CipherMode.ECB;
tdes.Padding = PaddingMode.PKCS7;
ICryptoTransform ict = tdes.CreateEncryptor();
ICryptoTransform dct = tdes.CreateDecryptor();
Console.WriteLine("SharedSec: {0}", sharedSec);
Console.WriteLine("byteArraySharedSec: {0}\n", ToReadableByteArray(byteArraySharedSec));
long unixTimestamp = 1299481200;
byte[] unixTimestampByte = BitConverter.GetBytes(unixTimestamp);
Console.WriteLine("Timestamp: {0}, length: {1} ", ToReadableByteArray(unixTimestampByte), unixTimestampByte.Length);
byte[] result = ict.TransformFinalBlock(unixTimestampByte, 0, unixTimestampByte.Length);
Console.WriteLine("After 3DES encrypting: {0}, length {1}\n\n ", ToReadableByteArray(result), result.Length);
}
static public string ToReadableByteArray(byte[] bytes)
{
return string.Join(",", bytes);
}
The output (you can see byteArraySharedSec is correct, but the encryption result is not):
SharedSec: 654A7EA2C9914A0B972937F4EA45FED3
byteArraySharedSec: 235,158,0,236,64,54,11,223,117,224,13,1,247,189,189,223,177,120,16,14,57,20,64,247
Timestamp: 112,130,116,77,0,0,0,0, length: 8
After 3DES encrypting:
213,60,183,244,171,116,202,205,233,17,226,8,70,9,111,43, length 16
The API doc gives this example:
The 3DES encryption uses:
192 bit key
ECB Mode
PKCS7 Padding
Example SHARED_SECRET: 654A7EA2C9914A0B972937F4EA45FED3
Convert SHARED_SECRET to byte array by Base64 decoding. This is the 3DES key: { 235, 158, 0, 236, 64, 54, 11, 223, 117, 224, 13, 1, 247, 189, 189, 223, 177, 120, 16, 14, 57, 20, 64, 247}
Example timestamp (7AM 7th March 2011 GMT): 1299481200
3DES encrypt (ECB mode, PKCS7 Padding) the timestamp using the 3DES key : 128 bit (16 byte) result for this example { 82, 191, 213, 179, 179, 73, 1, 218, 247, 68, 254, 199, 19, 159, 1, 138}
You're encrypting a different value than the one the example gave you.
Treating the timestamp as a string and using the UTF-8 encoding to get its byte representation should give you the same result:
...
byte[] unixTimestampByte = Encoding.UTF8.GetBytes(unixTimestamp.ToString());
...
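For what it's worth, here is a minimal end-to-end sketch of that change, reusing the shared secret, timestamp, and ToReadableByteArray helper from the question; whether the output matches the API doc's example bytes depends on this interpretation being the one the API uses:
string sharedSec = "654A7EA2C9914A0B972937F4EA45FED3";
byte[] byteArraySharedSec = Convert.FromBase64String(sharedSec); // 24-byte (192-bit) 3DES key
using (var tdes = new TripleDESCryptoServiceProvider())
{
    tdes.Key = byteArraySharedSec;
    tdes.Mode = CipherMode.ECB;
    tdes.Padding = PaddingMode.PKCS7;
    long unixTimestamp = 1299481200;
    // Encrypt the decimal string "1299481200", not its 8 raw little-endian bytes.
    byte[] unixTimestampByte = Encoding.UTF8.GetBytes(unixTimestamp.ToString());
    using (ICryptoTransform ict = tdes.CreateEncryptor())
    {
        byte[] result = ict.TransformFinalBlock(unixTimestampByte, 0, unixTimestampByte.Length);
        // 10 plaintext bytes are padded to 16, so the ciphertext is 16 bytes,
        // matching the length of the example in the API doc.
        Console.WriteLine(ToReadableByteArray(result));
    }
}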
Why is this program not working? I convert a byte array to a long, then convert the long back to a byte array. The resulting byte array is not the same as the original.
class Program
{
static void Main(string[] args)
{
byte[] myBytes = { 0, 0, 0, 32, 56, 99, 87, 34, 56, 56, 34, 33, 67
, 56, 66, 72, 1, 0, 0, 56, 0, 22};
long data = BitConverter.ToInt64(myBytes, 0);
byte[] byteData = BitConverter.GetBytes(data);
Console.WriteLine("byte array: " + BitConverter.ToString(myBytes));
Console.WriteLine("byte array: " + BitConverter.ToString(byteData));
}
}
Since l4V already gave the right explanation, I just want to add it as an answer, but I don't think my answer deserves any votes; all upvotes should go to l4V. Upvote his comment.
From BitConverter.ToInt64
The ToInt64 method converts the bytes from index startIndex to startIndex + 7 to an Int64 value.
So basically, this conversion takes only the first 8 bytes (0, 0, 0, 32, 56, 99, 87, 34) of your byte array; the other bytes are ignored in this situation.
The length of your byte array exceeds what a long can hold (8 bytes, 64 bits).
As an alternative solution, I'd suggest using BigInteger if your target framework is .NET 4.0 or higher.
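A minimal sketch of that alternative with the same input array; note that BigInteger reads the array as little-endian two's complement, so an array whose last byte is 0x80 or higher would need a trailing zero byte appended to round-trip unchanged (not the case here):
using System;
using System.Numerics; // reference System.Numerics (available from .NET 4.0)
class Program
{
    static void Main(string[] args)
    {
        byte[] myBytes = { 0, 0, 0, 32, 56, 99, 87, 34, 56, 56, 34, 33, 67
            , 56, 66, 72, 1, 0, 0, 56, 0, 22 };
        // BigInteger keeps all 22 bytes, not just the first 8.
        BigInteger data = new BigInteger(myBytes);
        byte[] byteData = data.ToByteArray();
        Console.WriteLine("byte array: " + BitConverter.ToString(myBytes));
        Console.WriteLine("byte array: " + BitConverter.ToString(byteData));
        // Both lines print the same bytes for this input, because the most
        // significant byte (22) is below 0x80, so no sign byte is trimmed or added.
    }
}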
I need to implement a MAC-CBC generation method in C#, given some information about the cryptographic algorithm. Here's what I have:
I should use DES.
The key is byte[] {11, 11, 11, 11, 11, 11, 11, 11}
The data (16 bytes) should be encrypted in 8-byte parts. The first 8 bytes are encrypted using an initialization vector = new byte[8] (8 bytes with value 0). (CBC?)
The last 8 bytes of the encrypted value should be converted to a hex string; this is the result I should send.
With this information, I have implemented the following method:
public static string Encrypt(byte[] data)
{
var IV = new byte[8];
var key = new byte[] { 11, 11, 11, 11, 11, 11, 11, 11 };
var result = new byte[16];
// Create DES and encrypt.
var des = DES.Create();
des.Key = key;
des.IV = IV;
des.Padding = PaddingMode.None;
des.Mode = CipherMode.CBC;
ICryptoTransform cryptoTransform = des.CreateEncryptor(key, IV);
cryptoTransform.TransformBlock(data, 0, 16, result, 0);
// Get the last eight bytes of the encrypted data.
var lastEightBytes = new byte[8];
Array.Copy(result, 8, lastEightBytes, 0, 8);
// Convert to hex.
var hexResult = string.Empty;
foreach (byte ascii in lastEightBytes)
{
int n = (int)ascii;
hexResult += n.ToString("X").PadLeft(2, '0');
}
return hexResult;
}
The sample raw data they have provided is: input = byte[] {0, 6, 4, 1, 6, 4, 1, 7, E, E, F, F, F, F, B, B} which should produce the output value A7CBFB3C730B059C. This means the last eight bytes of the encrypted data should be: byte[] {167, 203, 251, 60, 115, 11, 5, 156}.
But unfortunately, using the above method I get 32D91200D0007632, meaning my encrypted data is not correct (the last eight bytes of my method's encrypted value are byte[] {50, 207, 18, 0, 208, 0, 118, 50}).
Is there any way I can find out what I should do to get to A7CB...? Am I doing something wrong?
CBC-MAC requires a zero initialisation vector, so it's much better to specify the IV explicitly:
var IV = new byte[] { 0, 0, 0, 0, 0, 0, 0, 0 };
You say your key is byte[] { 11, 11, 11, 11, 11, 11, 11, 11 }, but are those bytes in hex or in base 10? You might want to try:
var key = new byte[] { 0x11, 0x11, 0x11, 0x11, 0x11, 0x11, 0x11, 0x11 };
and see if that works better.
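For reference, here is a minimal sketch combining both suggestions (all-zero IV, hex key bytes); the hex interpretation of the key is a guess, and the 16-byte input is assumed to already be a multiple of the 8-byte block size:
public static string Encrypt(byte[] data)
{
    var IV = new byte[] { 0, 0, 0, 0, 0, 0, 0, 0 };  // CBC-MAC: all-zero IV
    var key = new byte[] { 0x11, 0x11, 0x11, 0x11, 0x11, 0x11, 0x11, 0x11 }; // hex bytes (a guess)
    using (var des = DES.Create())
    {
        des.Mode = CipherMode.CBC;
        des.Padding = PaddingMode.None; // input length must already be a multiple of 8
        using (ICryptoTransform cryptoTransform = des.CreateEncryptor(key, IV))
        {
            byte[] result = cryptoTransform.TransformFinalBlock(data, 0, data.Length);
            // The MAC is the last block of the CBC ciphertext.
            var lastEightBytes = new byte[8];
            Array.Copy(result, result.Length - 8, lastEightBytes, 0, 8);
            return BitConverter.ToString(lastEightBytes).Replace("-", "");
        }
    }
}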
The Mono project has a generic MAC-CBC implementation that should work on any SymmetricAlgorithm - even if it's used, internally, only to implement MACTripleDES.
You can find the MIT.X11 licensed source code here. Use it as-is or compare it to your own code.