I'm trying to convert this function to PHP, but somehow it does not give the same results.
public static string EncodePassword(string pass, string salt) {
byte[] bytes = Encoding.Unicode.GetBytes(pass);
byte[] src = Convert.FromBase64String(salt);
byte[] dst = new byte[src.Length + bytes.Length];
byte[] inArray = null;
Buffer.BlockCopy(src, 0, dst, 0, src.Length);
Buffer.BlockCopy(bytes, 0, dst, src.Length, bytes.Length);
HashAlgorithm algorithm = HashAlgorithm.Create("SHA1");
inArray = algorithm.ComputeHash(dst);
return Convert.ToBase64String(inArray);
}
This is my take on it in PHP:
function CreatePasswordHash($password, $salt) {
$salted_password = base64_decode($salt).$password;
$result = hash('SHA1',$salted_password,true);
return base64_encode($result);
}
And of course it does not work. So what am I doing wrong here?
These are the values for testing:
$salt = 'Xh2pHwDv3VEUQCvz5qOm7w==';
$hashed_value = '0U/kYMz3yCXLsw/r9kocT5zf0cc=';
$password = 'Welcome1!';
if ($hashed_value === CreatePasswordHash($password,$salt)) {
echo "Good job!";
}
Edit: Working solution based on suggestions from Martyx and Slacks
function CreatePasswordHash($password, $salt) {
$upass = mb_convert_encoding($password,'UCS-2LE','auto');
$salted_password = base64_decode($salt).$upass;
$result = hash('SHA1',$salted_password,true);
return base64_encode($result);
}
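For reference, Encoding.Unicode in .NET is little-endian UTF-16, which is why the 'UCS-2LE' target above lines up for characters in the Basic Multilingual Plane. As a sanity check, here is a minimal C# sketch that reproduces the expected hash from the test values above (the same steps as EncodePassword, just inlined):
using System;
using System.Security.Cryptography;
using System.Text;
class HashCheck
{
    static void Main()
    {
        // Password as UTF-16LE bytes, salt as raw bytes from base64
        byte[] pass = Encoding.Unicode.GetBytes("Welcome1!");
        byte[] salt = Convert.FromBase64String("Xh2pHwDv3VEUQCvz5qOm7w==");
        byte[] salted = new byte[salt.Length + pass.Length];
        Buffer.BlockCopy(salt, 0, salted, 0, salt.Length);
        Buffer.BlockCopy(pass, 0, salted, salt.Length, pass.Length);
        using (SHA1 sha1 = SHA1.Create())
            Console.WriteLine(Convert.ToBase64String(sha1.ComputeHash(salted)));
        // Expected: 0U/kYMz3yCXLsw/r9kocT5zf0cc=
    }
}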
Once I used SHA1 in both PHP and C#, and the C# output letters were uppercase while in PHP they were lowercase.
I would recommend using the same encoding in C# and in PHP (UTF-8 and UTF-16 are both good choices).
Choices in C#:
Encoding.ASCII.GetBytes
Encoding.UTF8.GetBytes
...
Encoding in PHP:
mb_convert_encoding
You need to tell PHP to encode the password string using UTF-16 and to treat the salt as raw bytes.
I want to encrypt a password in both CryptoJS and C#. Unfortunately, my C# code fails to generate the proper value. This is my code:
internal static byte[] ComputeSha256(this byte[] value)
{
using (SHA256 sha256Hash = SHA256.Create())
return sha256Hash.ComputeHash(value);
}
internal static byte[] ComputeSha256(this string value) => ComputeSha256(Encoding.UTF8.GetBytes(value));
internal static byte[] ComputeMD5(this byte[] value)
{
using (MD5 md5 = MD5.Create())
return md5.ComputeHash(value);
}
internal static byte[] ComputeMD5(this string value) => ComputeMD5(Encoding.UTF8.GetBytes(value));
internal static byte[] CombineByteArray(byte[] first, byte[] second)
{
byte[] bytes = new byte[first.Length + second.Length];
Buffer.BlockCopy(first, 0, bytes, 0, first.Length);
Buffer.BlockCopy(second, 0, bytes, first.Length, second.Length);
return bytes;
}
internal static string EncryptPassword()
{
using (AesManaged aes = new AesManaged())
{
//CLIENT SIDE PASSWORD HASH
/*
var password = '12345';
var passwordMd5 = CryptoJS.MD5(password);
var passwordKey = CryptoJS.SHA256(CryptoJS.SHA256(passwordMd5 + '12345678') + '01234567890123456');
var encryptedPassword = CryptoJS.AES.encrypt(passwordMd5, passwordKey, { mode: CryptoJS.mode.ECB, padding: CryptoJS.pad.NoPadding });
encryptedPassword = CryptoJS.enc.Base64.parse(encryptedPassword.toString()).toString(CryptoJS.enc.Hex);
//encryptedPassword result is c3de82e9e8a28a4caded8c2ef0d49c80
*/
var y1 = Encoding.UTF8.GetBytes("12345678");
var y2 = Encoding.UTF8.GetBytes("01234567890123456");
var password = "12345";
var passwordMd5 = ComputeMD5(password);
var xkey = CombineByteArray(ComputeSha256(CombineByteArray(passwordMd5, y1)), y2);
var passwordKey = ComputeSha256(xkey);
aes.Key = passwordKey;
aes.Mode = CipherMode.ECB;
aes.Padding = PaddingMode.None;
ICryptoTransform crypt = aes.CreateEncryptor();
byte[] cipher = crypt.TransformFinalBlock(passwordMd5, 0, passwordMd5.Length);
var encryptedPassword = BitConverter.ToString(cipher).Replace("-", "").ToLower();
return encryptedPassword; //e969b60e87339625c32f805f17e6f993
}
}
The result of the C# code above is e969b60e87339625c32f805f17e6f993, but it should match the CryptoJS result, c3de82e9e8a28a4caded8c2ef0d49c80. What is wrong here?
In the CryptoJS code, hashes (in the form of WordArrays) and strings are concatenated in several places. In the process, each WordArray is implicitly encoded via toString() into a hex string with lowercase letters. This step is missing in the C# code.
In the C# code the concatenation is done with CombineByteArray(), where the hash is passed in the parameter first as a byte[]. Therefore this parameter must first be converted to a hex-encoded string with lowercase letters and then UTF-8 encoded, e.g.:
internal static byte[] CombineByteArray(byte[] first, byte[] second)
{
// Hex encode (with lowercase letters) and Utf8 encode
string hex = ByteArrayToString(first).ToLower();
first = Encoding.UTF8.GetBytes(hex);
byte[] bytes = new byte[first.Length + second.Length];
Buffer.BlockCopy(first, 0, bytes, 0, first.Length);
Buffer.BlockCopy(second, 0, bytes, first.Length, second.Length);
return bytes;
}
where ByteArrayToString() is from here.
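Since the linked ByteArrayToString() is not reproduced here, a minimal stand-in might look like this (my own sketch; any hex encoder works, and the ToLower() happens at the call site above):
internal static string ByteArrayToString(byte[] bytes)
{
    // BitConverter yields "AB-CD-..."; stripping the dashes leaves plain hex digits
    return BitConverter.ToString(bytes).Replace("-", "");
}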
With this change, the C# code gives the same result as the CryptoJS code.
I am not quite clear about the purpose of the CryptoJS code. Usually plaintext and key are independent, i.e. are not derived from the same password.
Perhaps this is supposed to implement a custom password-based key derivation function. If so, and unless a custom implementation is mandatory for compatibility reasons, it is more secure to use a proven algorithm such as Argon2 or PBKDF2. In particular, the lack of a salt/work factor is insecure.
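For illustration, a PBKDF2 derivation in C# might look like the sketch below. It assumes a framework with the HashAlgorithmName overload (.NET Framework 4.7.2+ or .NET Core 2.0+), and the iteration count and key size are placeholder choices, not values from the scheme above:
using System;
using System.Security.Cryptography;
static class KeyDerivation
{
    // Sketch: derive a 256-bit key from a password with PBKDF2 (HMAC-SHA256)
    internal static byte[] DeriveKey(string password, byte[] salt)
    {
        using (var kdf = new Rfc2898DeriveBytes(password, salt, 100000, HashAlgorithmName.SHA256))
            return kdf.GetBytes(32); // 32 bytes = AES-256 key size
    }
}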
I'm trying to match encryption schemes between C++ (using Crypto++) and C#, and can't seem to get the same results in both. Each works on its own, but not from one to the other. Any help would be great!
C++ code using Crypto++:
std::string key = "01286567891233460123456789123456";
std::string iv = "0123456789123456";
std::string encrypt(const std::string& str_in)
{
std::string str_out;
CryptoPP::AES::Encryption aesEncryption((byte*)key.c_str(), CryptoPP::AES::MAX_KEYLENGTH);
CryptoPP::CBC_Mode_ExternalCipher::Encryption cbcEncryption(aesEncryption, (byte*)iv.c_str());
StreamTransformationFilter stfEncryptor(cbcEncryption, new CryptoPP::StringSink(str_out));
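// note: str_in.length() + 1 on the next line also encrypts the string's trailing '\0', adding one byte before padding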
stfEncryptor.Put(reinterpret_cast<const unsigned char*>(str_in.c_str()), str_in.length() + 1);
stfEncryptor.MessageEnd();
return str_out;
}
std::string decrypt(const std::string& cipher_text)
{
std::string str_out;
CryptoPP::AES::Decryption aesDecryption((byte*)key.c_str(), CryptoPP::AES::MAX_KEYLENGTH);
CryptoPP::CBC_Mode_ExternalCipher::Decryption cbcDecryption(aesDecryption, (byte*)iv.c_str());
CryptoPP::StreamTransformationFilter stfDecryptor(cbcDecryption, new CryptoPP::StringSink(str_out));
stfDecryptor.Put(reinterpret_cast<const unsigned char*>(cipher_text.c_str()), cipher_text.size());
stfDecryptor.MessageEnd();
return str_out;
}
The code ran:
std::string str = encrypt("123456789012345");
str = decrypt(str);
The output is:
Encrypted: Ö&qcƒ“¹yLY»Lïv¹w“¼LLŠÀ¶ó¢,óð9·
Length: 32
Decrypted: 123456789012345
Length: 16
Now in C#, I have the following code:
public string Encrypt(string clearText)
{
byte[] clearBytes = Encoding.Default.GetBytes(clearText);
using (Aes encryptor = Aes.Create("AES"))
{
// encryptor.BlockSize = 128;
encryptor.Padding = PaddingMode.Zeros;
encryptor.KeySize = 128;
encryptor.Mode = CipherMode.CBC;
encryptor.Key = Encoding.Default.GetBytes("01234567891234560123456789123456");
encryptor.IV = Encoding.Default.GetBytes("0123456789123456");
using (MemoryStream ms = new MemoryStream())
{
using (CryptoStream cs = new CryptoStream(ms, encryptor.CreateEncryptor(), CryptoStreamMode.Write))
{
cs.Write(clearBytes, 0, clearBytes.Length);
cs.Close();
}
byte[] bt = ms.ToArray();
clearText = Encoding.Default.GetString(bt);// Convert.ToBase64String(bt);
}
}
return clearText; //Return the encrypted command
}
public string Decrypt(string cipherText)
{
byte[] clearBytes = Encoding.Default.GetBytes(cipherText);
using (Aes decryptor = Aes.Create("AES"))
{
// decryptor.BlockSize = 128;
decryptor.Padding = PaddingMode.Zeros;
decryptor.KeySize = 128;
decryptor.Mode = CipherMode.CBC;
decryptor.Key = Encoding.Default.GetBytes("01286567891233460123456789123456");
decryptor.IV = Encoding.Default.GetBytes("0123456789123456");
using (MemoryStream ms = new MemoryStream())
{
using (CryptoStream cs = new CryptoStream(ms, decryptor.CreateDecryptor(), CryptoStreamMode.Write))
{
cs.Write(clearBytes, 0, clearBytes.Length);
cs.Close();
}
byte[] bt = ms.ToArray();
cipherText = Encoding.Default.GetString(bt);// Convert.ToBase64String(bt);
}
}
return cipherText; //Return the decrypted text
}
The code ran:
string orig = "123456789012345";
string cipher = Encrypt(orig);
string dedata = Decrypt(cipher);
The results are:
Orig: 123456789012345
Encrypted: êßyoº0¦ëì›X˜Ü
Length: 16
Decrypted: 123456789012345
Length: 16
As you can see, the encrypted strings end up being different. So when I take an encrypted string from C++, it can't be decrypted in C#, as shown here:
byte[] encryptedText = ENCRYPTED TEXT FROM C++
text = System.Text.Encoding.Default.GetString(encryptedText);
text = Decrypt(text);
The C++ side returns 32 bytes for its encrypted string, which I believe is the padding being added. Not sure how to replicate this in the C# code, or vice versa, to match things up. Not sure if there is something else I'm missing here... Thanks for any help!
EDIT:
I've matched the keys, and now the strings match on both ends except for the padding difference. When I try to decrypt the string in C#, it tells me the input data is not the correct block size. Any help with this now?
EDIT AGAIN:
It seems both the C# and C++ encryption functions are now generating the same byte string, so that problem is resolved. The problem I now have is on the C# side: I receive the encrypted string and convert the bytes using text = System.Text.Encoding.Default.GetString(recvBuf); (recvBuf being the encrypted string from C++), and it's missing the last character of the string. It matches the C++ string minus the last char?? Not sure why this is happening.
For example, my C++ sends over this encrypted string: Ö&qcƒ“¹yLY»Lïv and my C# program will only receive this: Ö&qcƒ“¹yLY»Lï which in turn makes it fail at decrypting. The encrypted string is being sent over a TCP socket, if that makes any difference.
EDIT
Still missing bytes after changing encoding and decoding to base64 on both ends.
C++ 1iZxY4OTHrl5TFm7Gkzvdrl3k7xMTIrAtvOiLPPwObc=
C# 1iZxY4OTHrl5TFm7Gkzvdg==
C# RECEIVED 1iZxY4OTHrl5TFm
New C# code:
public string Encrypt(string clearText)
{
byte[] clearBytes = Encoding.Default.GetBytes(clearText);
using (Aes encryptor = Aes.Create("AES"))
{
// encryptor.BlockSize = 128;
encryptor.Padding = PaddingMode.Zeros;
encryptor.KeySize = 128;
encryptor.Mode = CipherMode.CBC;
encryptor.Key = Encoding.Default.GetBytes("01286567891233460123456789123456");
encryptor.IV = Encoding.Default.GetBytes("0123456789123456");
using (MemoryStream ms = new MemoryStream())
{
using (CryptoStream cs = new CryptoStream(ms, encryptor.CreateEncryptor(), CryptoStreamMode.Write))
{
cs.Write(clearBytes, 0, clearBytes.Length);
cs.Close();
}
byte[] bt = ms.ToArray();
clearText = Convert.ToBase64String(bt);
}
}
return clearText; //Return the encrypted command
}
And C++ Code:
std::string encrypt(const std::string& str_in)
{
std::string str_out;
std::string str_out2;
CryptoPP::AES::Encryption aesEncryption((byte*)key.c_str(), CryptoPP::AES::MAX_KEYLENGTH);
CryptoPP::CBC_Mode_ExternalCipher::Encryption cbcEncryption(aesEncryption, (byte*)iv.c_str());
StreamTransformationFilter stfEncryptor(cbcEncryption, new CryptoPP::StringSink(str_out));
stfEncryptor.Put(reinterpret_cast<const unsigned char*>(str_in.c_str()), str_in.length() + 1);
stfEncryptor.MessageEnd();
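// caution: strlen() stops at the first 0x00 byte of the binary ciphertext; str_out.size() would avoid truncating it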
str_out2 = base64_encode(reinterpret_cast<const unsigned char*>(str_out.c_str()), strlen(str_out.c_str()));
return str_out2;
}
EDIT
IT WORKS!!! It was simply an oversight on my part: my socket was checking the size of the data before it was encrypted and sending that size instead of the encrypted string's size. Fixed that, and it's all working perfectly. Thanks Brandon for the help!
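For anyone hitting the same problem, a length-prefixed send on the C# side might look like the sketch below. SendCiphertext is a hypothetical helper; the point is simply to measure the encrypted payload rather than the plaintext:
using System;
using System.IO;
using System.Text;
static class Framing
{
    // Sketch: prefix the base64 ciphertext with its own length before sending
    static void SendCiphertext(Stream stream, string base64Cipher)
    {
        byte[] payload = Encoding.ASCII.GetBytes(base64Cipher); // base64 is pure ASCII
        byte[] prefix = BitConverter.GetBytes(payload.Length);  // length of the *encrypted* data
        // (BitConverter is little-endian on most platforms; agree on byte order with the C++ side)
        stream.Write(prefix, 0, prefix.Length);
        stream.Write(payload, 0, payload.Length);
    }
}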
What I am suggesting is that you're running afoul of text encoding, both when converting the clear text to bytes for encrypting and when converting from bytes back to a display string for "transmission".
Try this:
C# code
public string Encrypt(string clearText, Encoding encoding)
{
// use supplied encoding to convert clear text to bytes
byte[] clearBytes = encoding.GetBytes(clearText);
byte[] bt = // insert your encrypt code here...
// bt bytes are *not* ascii or utf8 or anything else. If you just
// use an encoding to convert to text, you won't have good results
// lets use base64 encoding to get a nice display string representation
// of the bytes
return Convert.ToBase64String(bt);
}
public string Decrypt(string base64EncryptedString, Encoding encoding)
{
// decode the base64
byte[] bt = Convert.FromBase64String(base64EncryptedString);
byte[] decrypted = // insert your decrypt code here
// now we have the original bytes. Convert back to string using the same
// encoding we used when encrypting
return encoding.GetString(decrypted);
}
// Usage:
var clearText = "Hello World";
var asciiEncrypted = Encrypt(clearText, Encoding.ASCII);
var decrypted = Decrypt(asciiEncrypted, Encoding.ASCII); // MUST USE SAME ENCODING
var utf8Encrypted = Encrypt(clearText, Encoding.UTF8);
var utf8Decrypted = Decrypt(utf8Encrypted, Encoding.UTF8);
You need to make the same base64 changes in your C++ code. I'm less familiar with the encoding of your C++ strings, but I think that if you are using string literals hard-coded in the source, they will be UTF-8. This means your C++ and C# code should agree once you make the C# changes, make the base64 changes in your C++ code, and pass Encoding.UTF8 to your C# Encrypt/Decrypt methods.
So I have this piece of PHP code that I'm not allowed to modify for now, mainly because it's old and works properly.
Warning! Very bad code overall: the IV is not randomized nor stored with the output. I'm not asking this because I want to,
I'm asking because I need to. I'm also planning on refactoring once I get this working, and on completing my C# code with actually reliable ciphering code.
function encrypt($string)
{
$output = false;
$encrypt_method = "AES-256-CBC";
$param1 = 'ASasd564D564aAS64ads564dsfg54er8G74s54hjds346gf445gkG7';
$param2 = '654dsfg54er8ASG74sdfg54hjdas346gf34kjdDJF56hfs2345gkFG';
$ky = hash('sha256', $param1); // hash
$iv = substr(hash('sha256', $param2), 0, 16);
$output = openssl_encrypt($string, $encrypt_method, $ky, 0, $iv);
$output = base64_encode($output);
return $output;
}
I want to do the same in C# because I'm getting an entity with all its fields encrypted with that code.
I want to be able to encrypt that data so I can query my entity list without having to decrypt all the entities. And I want to decrypt some properties of the filtered entities so they can actually be useful.
Now, for that matter I created a CryptoHelper that will do this, except it doesn't.
I try to calculate the Key and IV in the constructor:
public readonly byte[] Key;
public readonly byte[] IV;
public CryptoHelper()
{
Key = GetByteArraySha256Hash("ASasd564D564aAS64ads564dsfg54er8G74s54hjds346gf445gkG7", false);
IV = GetByteArraySha256Hash("654dsfg54er8ASG74sdfg54hjdas346gf34kjdDJF56hfs2345gkFG", true);
}
private byte[] GetByteArraySha256Hash(string source, bool salt)
{
byte[] result;
try
{
using (SHA256 sha256Hash = SHA256.Create())
{
result = sha256Hash.ComputeHash(Encoding.UTF8.GetBytes(source));
}
}
catch (Exception)
{
throw;
}
if (salt)
{
return result.Take(16).ToArray();
}
return result;
}
And then I use Encrypt and Decrypt methods that work pretty well when I test them with a test string. The only problem is that the string has some padding at the end, but that's kind of a minor problem considering that any string encrypted with the PHP method results in gibberish.
private string Encrypt(string source)
{
try
{
string result = "";
using (var aes = new AesManaged { Key = Key, IV = IV, Mode = CipherMode.CBC, Padding = PaddingMode.Zeros })
{
byte[] sourceByteArray = Encoding.UTF8.GetBytes(source);
using (var encryptor = aes.CreateEncryptor(aes.Key, aes.IV))
{
byte[] encriptedSource = encryptor.TransformFinalBlock(sourceByteArray, 0, sourceByteArray.Length);
result = Convert.ToBase64String(encriptedSource);
result = Convert.ToBase64String(Encoding.UTF8.GetBytes(result));
}
}
return result;
}
catch (Exception ex)
{
throw;
}
}
private string Decrypt(string source)
{
try
{
string result = "";
//Double Base64 conversion, as it's done in the php code.
byte[] sourceByte = Convert.FromBase64String(source);
byte[] sourceFreeOfBase64 = Convert.FromBase64String(Encoding.UTF8.GetString(sourceByte));
byte[] resultByte;
int decryptedByteCount = 0;
using (var aes = new AesManaged { Key = Key, IV = IV, Mode = CipherMode.CBC, Padding = PaddingMode.Zeros })
{
using (ICryptoTransform AESDecrypt = aes.CreateDecryptor(aes.Key, aes.IV))
{
using (MemoryStream memoryStream = new MemoryStream(sourceFreeOfBase64))
{
using (CryptoStream cs = new CryptoStream(memoryStream, AESDecrypt, CryptoStreamMode.Read))
{
resultByte = new byte[sourceFreeOfBase64.Length];
decryptedByteCount = cs.Read(resultByte, 0, resultByte.Length);
}
}
}
//This returns the encoded string with a set of "\0" at the end.
result = Encoding.UTF8.GetString(resultByte);
result = result.Replace("\0", "");
}
return result;
}
catch (Exception ex)
{
throw;
}
}
I'm pretty sure that the main problem here lies in the PHP line $iv = substr(hash('sha256', $param2), 0, 16);. I checked the results of both hash functions in PHP and C# and they are exactly the same.
From what I've been reading, PHP treats strings as byte arrays (correct me if I'm wrong), so a 16-char string should be enough to get a 16-byte array and a 128-bit block. But in C#, when I take the 16-byte array and convert it to a string, I get a 32-char string, the same as if I had done $iv = substr(hash('sha256', $param2), 0, 32);.
So my question is: how do I get in C# the same byte array that the PHP line $iv = substr(hash('sha256', $param2), 0, 16); produces? Is this even possible?
The hash function will return the same number of bytes whatever the input, so I suspect it is a difference in how you convert the resulting byte[] back to a string in C# compared to the PHP implementation.
The PHP docs say that the hash function outputs the result in lowercase hexits. This is absolutely not the same as the UTF-8 encoding that you are returning.
There isn't a built in framework way to do this, but check out this SO question for several different methods.
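As one concrete approach (my own sketch, not taken from the linked question): hex-encode the hash with lowercase letters, take the first 16 characters, and then take the ASCII bytes of those characters, since PHP's substr() operates on the hex string rather than on the raw digest:
using System;
using System.Security.Cryptography;
using System.Text;
static class PhpCompat
{
    // Sketch: mimic PHP's substr(hash('sha256', $s), 0, 16) as a 16-byte IV
    internal static byte[] PhpStyleIv(string source)
    {
        byte[] hash;
        using (SHA256 sha256 = SHA256.Create())
            hash = sha256.ComputeHash(Encoding.UTF8.GetBytes(source));
        string hex = BitConverter.ToString(hash).Replace("-", "").ToLowerInvariant();
        return Encoding.ASCII.GetBytes(hex.Substring(0, 16)); // 16 hex chars -> 16 ASCII bytes
    }
}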
Also worth noting is that you do not specify the Padding value in your C# code. AES-CBC is a block cipher and will need to use some padding scheme, so you may well get a padding exception. I think it will need zero padding (docs):
aes.Padding = PaddingMode.Zeros
but I'm not 100% sure.
Well, I managed to solve this in a not so bad manner.
Following ste-fu's advice, I tried to get rid of every piece of encoding I could find.
But I still wasn't anywhere close to getting the Key and IV right. So I did some testing with PHP. I made a var_dump of the IV and got a neat 16-length array with the bytes shown as integers.
Be advised: the var_dump result array always starts at index [1].
$iv = substr(hash('sha256', $param2), 0, 16);
$byte_array = unpack('C*', $iv);
var_dump($byte_array);
That piqued my interest: if I had the hex string right, I should be able to convert each char in the string to its equivalent byte. Lo and behold, I made this function in C#:
private byte[] StringToByteArray(string hex)
{
IList<byte> resultList = new List<byte>();
foreach (char c in hex)
{
resultList.Add(Convert.ToByte(c));
}
return resultList.ToArray();
}
And this worked very well for the IV. Now I just had to do the same thing for the key. And so I did, just to find that I had a 64-length byte array. That's weird, but OK. More testing in PHP.
Since it makes sense that the PHP key behaves the same way as the IV, I didn't get how the openssl encryption functions allowed a 64-length key. So I tried to encrypt and decrypt the same data with a key made from the first 32 chars: $ky = substr(hash('sha256', $param1), 0, 32);
And it gave me the same result as with the full key. So my educated guess is that openssl just takes the bytes necessary for the encryption to work. In fact it will take almost anything, since I tested with substrings of length 1, 16, 20, 32, 33 and 50; if the string is longer than 32 chars, the function itself will cut it.
Anyway, I just had to take the first 32 chars of the key's hex string and use my new function to convert them into a byte array, and I had my key.
So, the main C# code right now looks like this:
public CryptoHelper(string keyFilePath, string ivFilePath)
{
//Reading bytes from txt file encoded in UTF8.
byte[] key = File.ReadAllBytes(keyFilePath);
byte[] iv = File.ReadAllBytes(ivFilePath);
IV = StringToByteArray(GetStringHexSha256Hash(iv).Substring(0, 16));
Key = StringToByteArray(GetStringHexSha256Hash(key).Substring(0, 32));
//Tests
var st = Encrypt("abcdefg");
var en = Decrypt(st);
}
//Convert each char into a byte
private byte[] StringToByteArray(string hex)
{
IList<byte> resultList = new List<byte>();
foreach (char c in hex)
{
resultList.Add(Convert.ToByte(c));
}
return resultList.ToArray();
}
private string GetStringHexSha256Hash(byte[] source)
{
string result = "";
try
{
using (SHA256 sha256Hash = SHA256.Create("SHA256"))
{
//Get rid of Encoding!
byte[] hashedBytes = sha256Hash.ComputeHash(source);
for (int i = 0; i < hashedBytes.Length; i++)
{
result = string.Format("{0}{1}",
result,
hashedBytes[i].ToString("x2"));
}
}
}
catch (Exception)
{
throw;
}
return result;
}
private string Encrypt(string source)
{
try
{
string result = "";
using (var aes = new AesManaged { Key = Key, IV = IV, Mode = CipherMode.CBC, Padding = PaddingMode.PKCS7 })
{
byte[] sourceByteArray = Encoding.UTF8.GetBytes(source);
using (var encryptor = aes.CreateEncryptor(aes.Key, aes.IV))
{
byte[] encriptedSource = encryptor.TransformFinalBlock(sourceByteArray, 0, sourceByteArray.Length);
result = Convert.ToBase64String(encriptedSource);
//Nothing to see here, move along.
result = Convert.ToBase64String(Encoding.UTF8.GetBytes(result));
}
}
return result;
}
catch (Exception ex)
{
throw;
}
}
private string Decrypt(string source)
{
try
{
string result = "";
byte[] sourceByte = Convert.FromBase64String(source);
byte[] sourceFreeOfBase64 = Convert.FromBase64String(Encoding.UTF8.GetString(sourceByte));
byte[] resultByte;
int decryptedByteCount = 0;
using (var aes = new AesManaged { Key = Key, IV = IV, Mode = CipherMode.CBC, Padding = PaddingMode.PKCS7 })
{
using (ICryptoTransform AESDecrypt = aes.CreateDecryptor(aes.Key, aes.IV))
{
using (MemoryStream memoryStream = new MemoryStream(sourceFreeOfBase64))
{
using (CryptoStream cs = new CryptoStream(memoryStream, AESDecrypt, CryptoStreamMode.Read))
{
resultByte = new byte[sourceFreeOfBase64.Length];
//Now that everything works as expected I actually get the number of bytes decrypted!
decryptedByteCount = cs.Read(resultByte, 0, resultByte.Length);
}
}
}
//Nothing to see here, move along.
result = Encoding.UTF8.GetString(resultByte);
//Use that byte count to get the actual data and discard the padding.
result = result.Substring(0, decryptedByteCount);
}
return result;
}
catch (Exception ex)
{
throw;
}
}
I still need to clean my class of all the testing leftovers, but this is all that's needed to make it work.
I hope this helps anybody with the same problem that I faced.
Cheers.
I have the following code in Java:
byte[] secretKey = secretAccessKey.getBytes("UTF-8");
SecretKeySpec signingKey = new SecretKeySpec(secretKey, "HmacSHA256");
Mac mac = Mac.getInstance("HmacSHA256");
mac.init(signingKey);
byte[] bytes = data.getBytes("UTF-8");
byte[] rawHmac = mac.doFinal(bytes);
String result = javax.xml.bind.DatatypeConverter.printBase64Binary(rawHmac);
and the following code in C#:
UTF8Encoding enc = new UTF8Encoding();
byte[] secretKey = enc.GetBytes(secretAccessKey);
HMACSHA256 hmac = new HMACSHA256(secretKey);
hmac.Initialize();
byte[] bytes = enc.GetBytes(data);
byte[] rawHmac = hmac.ComputeHash(bytes);
string result = Convert.ToBase64String(rawHmac);
The byte arrays "secretKey" and "bytes" are equivalent but the byte array "rawHmac" is different, and the string "result" is different. Can anyone see why?
Don't do this:
byte[] bytes = data.getBytes();
That will use the platform's default encoding to convert a string to a byte array. That can vary between platforms, whereas you want something repeatable. I would suggest UTF-8:
byte[] bytes = data.getBytes("UTF-8");
(Do the same for the key, of course.)
You should then use the same encoding in your C# code: not ASCII, unless you really don't need to handle non-ASCII characters.
byte[] bytes = Encoding.UTF8.GetBytes(data);
It's also not clear how you're comparing the results afterwards - don't forget that byte is signed in Java, but unsigned in C#. It's probably simplest to convert the hash to hex or base64 for comparison purposes.
EDIT: I strongly suspect the last part was the problem - comparing the results.
Here are two short but complete programs (using the iharder.net base64 converter in Java) which produce the same base64 output:
Java:
import java.util.*;
import javax.crypto.*;
import javax.crypto.spec.*;
public class Test {
public static void main (String[] args) throws Exception {
String secretAccessKey = "mykey";
String data = "my data";
byte[] secretKey = secretAccessKey.getBytes("UTF-8");
SecretKeySpec signingKey = new SecretKeySpec(secretKey, "HmacSHA256");
Mac mac = Mac.getInstance("HmacSHA256");
mac.init(signingKey);
byte[] bytes = data.getBytes("UTF-8");
byte[] rawHmac = mac.doFinal(bytes);
System.out.println(Base64.encodeBytes(rawHmac));
}
}
C#:
using System;
using System.Security.Cryptography;
using System.Text;
class Test
{
static void Main()
{
String secretAccessKey = "mykey";
String data = "my data";
byte[] secretKey = Encoding.UTF8.GetBytes(secretAccessKey);
HMACSHA256 hmac = new HMACSHA256(secretKey);
hmac.Initialize();
byte[] bytes = Encoding.UTF8.GetBytes(data);
byte[] rawHmac = hmac.ComputeHash(bytes);
Console.WriteLine(Convert.ToBase64String(rawHmac));
}
}
Output from both:
ivEyFpkagEoghGnTw/LmfhDOsiNbcnEON50mFGzW9/w=
This was a non-question; as demonstrated, the hashes are always the same.
The problem in my case was unrelated: Java uppercases percent-encoding in URLEncoder, but .NET doesn't.
Goes to show how important it is to test in isolation!
Hi, I want to decode a string which contains XML data in .NET, but that string was encoded in Java.
System.Text.UTF8Encoding encoder = new System.Text.UTF8Encoding();
System.Text.Decoder utf8Decode = encoder.GetDecoder();
byte[] todecode_byte = Convert.FromBase64String(data);
int charCount = utf8Decode.GetCharCount(todecode_byte, 0, todecode_byte.Length);
char[] decoded_char = new char[charCount];
utf8Decode.GetChars(todecode_byte, 0, todecode_byte.Length, decoded_char, 0);
result = new String(decoded_char);
return result;
I have written that code but it throws an error.
Thanks in advance.
Assuming it really is UTF-8 which is then base64-encoded, you should just be able to write:
byte[] binary = Convert.FromBase64String(data);
string text = Encoding.UTF8.GetString(binary);
However, it sounds like it wasn't base64-encoded to start with - if you've already got it as text, you should be able to use it without any extra work.