C# and C++ encrypt code output does not match - c#

I'm implementing encrypt/decrypt code in C++ and C#.
I referred to this post and its answers, but the C++ and C# encrypted output did not match.
Here is my code.
C++
// base64 encode part
// needs <string> and <cctype>; BYTE is the usual unsigned char typedef (e.g. from <windows.h>)
#include <string>
#include <cctype>

static const std::string base64_chars =
    "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
    "abcdefghijklmnopqrstuvwxyz"
    "0123456789+/";

static inline bool is_base64(BYTE c) {
    return (isalnum(c) || (c == '+') || (c == '/'));
}

std::string base64_encode(BYTE const* buf, unsigned int bufLen) {
    std::string ret;
    int i = 0;
    int j = 0;
    BYTE char_array_3[3];
    BYTE char_array_4[4];

    while (bufLen--) {
        char_array_3[i++] = *(buf++);
        if (i == 3) {
            char_array_4[0] = (char_array_3[0] & 0xfc) >> 2;
            char_array_4[1] = ((char_array_3[0] & 0x03) << 4) + ((char_array_3[1] & 0xf0) >> 4);
            char_array_4[2] = ((char_array_3[1] & 0x0f) << 2) + ((char_array_3[2] & 0xc0) >> 6);
            char_array_4[3] = char_array_3[2] & 0x3f;
            for (i = 0; i < 4; i++)
                ret += base64_chars[char_array_4[i]];
            i = 0;
        }
    }

    if (i) {
        for (j = i; j < 3; j++)
            char_array_3[j] = '\0';
        char_array_4[0] = (char_array_3[0] & 0xfc) >> 2;
        char_array_4[1] = ((char_array_3[0] & 0x03) << 4) + ((char_array_3[1] & 0xf0) >> 4);
        char_array_4[2] = ((char_array_3[1] & 0x0f) << 2) + ((char_array_3[2] & 0xc0) >> 6);
        char_array_4[3] = char_array_3[2] & 0x3f;
        for (j = 0; j < i + 1; j++)
            ret += base64_chars[char_array_4[j]];
        while (i++ < 3)
            ret += '=';
    }
    return ret;
}
//start encrypt
// Crypto++ headers (paths assume the usual cryptopp/ install prefix)
#include <cryptopp/aes.h>
#include <cryptopp/modes.h>
#include <cryptopp/filters.h>
using CryptoPP::StreamTransformationFilter;

std::string key = "01286567891233460123456789a12345";
std::string iv = "0123456789123456";

std::string encrypt(const std::string& str_in)
{
    std::string str_out;
    std::string str_out2;
    byte* keybyte = (byte*)key.c_str();
    CryptoPP::AES::Encryption aesEncryption((byte*)key.c_str(), CryptoPP::AES::MAX_KEYLENGTH);
    CryptoPP::CBC_Mode_ExternalCipher::Encryption cbcEncryption(aesEncryption, (byte*)iv.c_str());
    StreamTransformationFilter stfEncryptor(cbcEncryption, new CryptoPP::StringSink(str_out));
    stfEncryptor.Put(reinterpret_cast<const unsigned char*>(str_in.c_str()), str_in.length() + 1);
    stfEncryptor.MessageEnd();
    str_out2 = base64_encode(reinterpret_cast<const unsigned char*>(str_out.c_str()), strlen(str_out.c_str()));
    return str_out2;
}
std::string decrypt(const std::string& cipher_text)
{
    std::string str_out;
    //need to insert code of decrypt base64
    CryptoPP::AES::Decryption aesDecryption((byte*)key.c_str(), CryptoPP::AES::MAX_KEYLENGTH);
    CryptoPP::CBC_Mode_ExternalCipher::Decryption cbcDecryption(aesDecryption, (byte*)iv.c_str());
    CryptoPP::StreamTransformationFilter stfDecryptor(cbcDecryption, new CryptoPP::StringSink(str_out));
    stfDecryptor.Put(reinterpret_cast<const unsigned char*>(cipher_text.c_str()), cipher_text.size());
    stfDecryptor.MessageEnd();
    return str_out;
}
C#
public string Encrypt(string testCode)
{
    string clearText = testCode;
    byte[] clearBytes = Encoding.Default.GetBytes(clearText);
    using (Aes encryptor = Aes.Create("AES"))
    {
        //encryptor.BlockSize = 128;
        encryptor.Padding = PaddingMode.Zeros;
        encryptor.KeySize = 128;
        encryptor.Mode = CipherMode.CBC;
        encryptor.Key = Encoding.Default.GetBytes("01286567891233460123456789a12345");
        encryptor.IV = Encoding.Default.GetBytes("0123456789123456");
        using (MemoryStream ms = new MemoryStream())
        {
            using (CryptoStream cs = new CryptoStream(ms, encryptor.CreateEncryptor(), CryptoStreamMode.Write))
            {
                cs.Write(clearBytes, 0, clearBytes.Length);
                cs.Close();
            }
            byte[] bt = ms.ToArray();
            clearText = Convert.ToBase64String(bt); //clearText = Encoding.Default.GetString(bt);
        }
    }
    return clearText; //Return the encrypted command
}
public string Decrypt(string cipherText)
{
    byte[] clearBytes = Convert.FromBase64String(cipherText);
    using (Aes decryptor = Aes.Create("AES"))
    {
        // decryptor.BlockSize = 128;
        decryptor.Padding = PaddingMode.Zeros;
        decryptor.KeySize = 128;
        decryptor.Mode = CipherMode.CBC;
        decryptor.Key = Encoding.Default.GetBytes("01286567891233460123456789a12345");
        decryptor.IV = Encoding.Default.GetBytes("0123456789123456");
        using (MemoryStream ms = new MemoryStream())
        {
            using (CryptoStream cs = new CryptoStream(ms, decryptor.CreateDecryptor(), CryptoStreamMode.Write))
            {
                cs.Write(clearBytes, 0, clearBytes.Length);
                cs.Close();
            }
            byte[] bt = ms.ToArray();
            cipherText = Encoding.Default.GetString(bt);
        }
    }
    return cipherText; //Return the decrypted text
}
C++ result
encrypted code: ks8zzu20w6zURkuZMgbx8g==
decrypted code: test
C# result
encrypted code: nsWRYBylyjVaJ5Yckk+SRw==
decrypted code: test
I tested with the word test in both C++ and C#, but the encrypted output did not match.
I also tested after removing the base64 encoding, but the raw encrypted bytes did not match either.
Could anyone please share your knowledge?
EDIT1
I tried copying the encrypted string produced by the C++ code and pasting it into the C# decrypt code, like below:
string testcode = "ks8zzu20w6zURkuZMgbx8g==";
Decrypt(testcode);
//result - test↗↗↗↗↗
As you can see, the results look very similar, but something is off: the ↗ symbol is appended after the word test.
I could not find out why the result looks like this. Is there something I missed?
Solved
std::string encrypt(const std::string& str_in)
{
    std::string str_out;
    std::string str_out2;
    byte* keybyte = (byte*)key.c_str();
    CryptoPP::AES::Encryption aesEncryption((byte*)key.c_str(), CryptoPP::AES::MAX_KEYLENGTH);
    CryptoPP::CBC_Mode_ExternalCipher::Encryption cbcEncryption(aesEncryption, (byte*)iv.c_str());
    StreamTransformationFilter stfEncryptor(cbcEncryption, new CryptoPP::StringSink(str_out));
    // 'str_in.length() + 1' in the line below makes the encryption code different from the C# code.
    /*stfEncryptor.Put(reinterpret_cast<const unsigned char*>(str_in.c_str()), str_in.length() + 1);*/
    stfEncryptor.Put(reinterpret_cast<const unsigned char*>(str_in.c_str()), str_in.length());
    stfEncryptor.MessageEnd();
    str_out2 = cryptobase64_encode(str_out);
    return str_out2;
}
As I noted in the comment, str_in.length() + 1 made the encrypted output differ from the C# code.
I also changed the C# padding option from Zeros to PKCS7 to match the C++ encrypted output.
But I don't know why I need to set this option; I think I need to study this.
Anyway, it works well. Special thanks to @jdweng.
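(For future readers, a likely explanation: Crypto++'s StreamTransformationFilter applies PKCS #7 padding by default, so once the stray +1 byte was removed, the C# side also had to use PKCS7 for both ends to produce the same blocks. Below is a minimal sketch of the matching C# configuration; it reuses the key/IV literals from the question and assumes Encoding.ASCII, which is safe here since both strings are plain ASCII.)

using System;
using System.Security.Cryptography;
using System.Text;

class Pkcs7Match
{
    static void Main()
    {
        using (Aes aes = Aes.Create())
        {
            aes.Mode = CipherMode.CBC;
            aes.Padding = PaddingMode.PKCS7; // matches Crypto++'s StreamTransformationFilter default
            aes.Key = Encoding.ASCII.GetBytes("01286567891233460123456789a12345"); // 32 bytes: AES-256, as in the question
            aes.IV = Encoding.ASCII.GetBytes("0123456789123456");

            byte[] plain = Encoding.ASCII.GetBytes("test"); // note: no trailing '\0' byte
            byte[] cipher = aes.CreateEncryptor().TransformFinalBlock(plain, 0, plain.Length);

            // Should equal the base64 output of the fixed C++ encrypt() for the same input.
            Console.WriteLine(Convert.ToBase64String(cipher));
        }
    }
}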

Related

OpenSSL HMACSHA256 produces a different result compared to .NET

I am using C# and C++ (with OpenSSL) to compute an HMAC-SHA256 hash with a key, and the two produce different results. What am I doing wrong?
C# code:
public static string CreateSignature(string signingString, string sharedKey)
{
    var key = Encoding.ASCII.GetBytes(sharedKey);
    var hmac = new HMACSHA256(key);
    var data = Encoding.ASCII.GetBytes(signingString);
    var hash = hmac.ComputeHash(data);
    return Convert.ToBase64String(hash);
}
C++ code:
std::string SignatureProvider::getSignature(std::string stringToSign, std::string key)
{
    const char* pKey = key.c_str();
    const char* pData = stringToSign.c_str();
    unsigned char* result = nullptr;
    unsigned int len = 32;
    result = (unsigned char*)malloc(sizeof(char) * len);
    HMAC_CTX ctx;
    HMAC_CTX_init(&ctx);
    HMAC_Init_ex(&ctx, pKey, strlen(pKey), EVP_sha256(), NULL);
    HMAC_Update(&ctx, (unsigned char*)&pData, strlen(pData));
    HMAC_Final(&ctx, result, &len);
    HMAC_CTX_cleanup(&ctx);
    return base64_encode(result, len);
}
std::string base64_encode(unsigned char const* bytes_to_encode, unsigned int in_len)
{
    std::string ret;
    int i = 0;
    int j = 0;
    unsigned char char_array_3[3];
    unsigned char char_array_4[4];

    while (in_len--) {
        char_array_3[i++] = *(bytes_to_encode++);
        if (i == 3) {
            char_array_4[0] = (char_array_3[0] & 0xfc) >> 2;
            char_array_4[1] = ((char_array_3[0] & 0x03) << 4) + ((char_array_3[1] & 0xf0) >> 4);
            char_array_4[2] = ((char_array_3[1] & 0x0f) << 2) + ((char_array_3[2] & 0xc0) >> 6);
            char_array_4[3] = char_array_3[2] & 0x3f;
            for (i = 0; i < 4; i++)
                ret += base64_chars[char_array_4[i]];
            i = 0;
        }
    }

    if (i)
    {
        for (j = i; j < 3; j++)
            char_array_3[j] = '\0';
        char_array_4[0] = (char_array_3[0] & 0xfc) >> 2;
        char_array_4[1] = ((char_array_3[0] & 0x03) << 4) + ((char_array_3[1] & 0xf0) >> 4);
        char_array_4[2] = ((char_array_3[1] & 0x0f) << 2) + ((char_array_3[2] & 0xc0) >> 6);
        for (j = 0; j < i + 1; j++)
            ret += base64_chars[char_array_4[j]];
        while (i++ < 3)
            ret += '=';
    }
    return ret;
}
I just included the base64 conversion for completeness, but the results already differ before that step.
Why don't you use the HMAC() function itself? I have tried with this code, and both the C++ and C# code produce the same HMAC. (Note, incidentally, that the C++ code above passes &pData, the address of the pointer, to HMAC_Update instead of pData, so it hashes the wrong bytes.)
std::string getSignature(std::string stringToSign, std::string key)
{
    const char* pKey = key.c_str();
    const char* pData = stringToSign.c_str();
    unsigned char* result = nullptr;
    unsigned int len = 32;
    result = (unsigned char*)malloc(sizeof(char) * len);
    int nkeyLen = strlen(pKey);
    int dataLen = strlen(pData);
    result = HMAC(EVP_sha256(), pKey, nkeyLen, (unsigned char*)pData, dataLen, NULL, NULL);
    return base64_encode(result, len);
}

Encode decoded string with Base64String

I am learning how to encode and decode strings. This is a method to decode cipher text to plain text that I found around the web.
public static string Decode(string chiperText)
{
    byte[] numArray = Convert.FromBase64String(chiperText);
    byte[] numArray1 = new byte[(int)numArray.Length - 1];
    byte num = (byte)(numArray[0] ^ 188);
    for (int i = 1; i < (int)numArray.Length; i++)
    {
        numArray1[i - 1] = (byte)(numArray[i] ^ 188 ^ num);
    }
    return Encoding.ASCII.GetString(numArray1);
}
My problem is that I have no idea how to encode back to the original state. I tried this method and it doesn't work.
public static string Encode(string plainText)
{
    byte[] bytes = Encoding.ASCII.GetBytes(plainText);
    byte[] results = new byte[(int)bytes.Length - 1];
    byte num = (byte)(bytes[0] ^ 188);
    for (int i = 1; i < bytes.Length; i++)
    {
        results[i - 1] = (byte)(bytes[i] ^ 188 ^ num);
    }
    return Convert.ToBase64String(results);
}
Although I agree entirely with SLaks' comment that the above does not constitute any kind of crypto that you should use, the following procedure will produce the "encrypted" data that you are looking to decrypt:
public static string Encode(string plainText)
{
    byte[] numArray = System.Text.Encoding.Default.GetBytes(plainText);
    byte[] numArray1 = new byte[(int)numArray.Length + 1];
    // Generate a random first byte to use as the seed
    (new Random()).NextBytes(numArray1);
    byte num = (byte)(numArray1[0] ^ 188);
    for (int i = 0; i < (int)numArray.Length; i++)
    {
        numArray1[i + 1] = (byte)(num ^ 188 ^ numArray[i]);
    }
    return Convert.ToBase64String(numArray1);
}
Please do not, for a single second, consider using this as a method for 'encrypting' sensitive data.
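For what it's worth, here is a quick round-trip check of the Encode/Decode pair above (a sketch; it assumes ASCII-only input, since Decode reads the result with Encoding.ASCII, and it relies on the two methods defined earlier in this answer and question):

using System;

class XorRoundTrip
{
    static void Main()
    {
        // Encode and Decode are the methods shown above.
        string original = "hello";
        string encoded = Encode(original);   // base64 text; varies run to run because the seed byte is random
        string decoded = Decode(encoded);
        Console.WriteLine(decoded == original); // True
    }
}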

An efficient way to Base64 encode a byte array?

I have a byte[] and I'm looking for the most efficient way to base64 encode it.
The problem is that the built in .Net method Convert.FromBase64CharArray requires a char[] as an input, and converting my byte[] to a char[] just to convert it again to a base64 encoded array seems pretty stupid.
Is there any more direct way to do it?
EDIT: I'll explain what I want to achieve better - I have a byte[] and I need to return a new base64-encoded byte[].
Byte[] -> String: use System.Convert.ToBase64String:
Convert.ToBase64String(byte[] data)
String -> Byte[]: use System.Convert.FromBase64String:
Convert.FromBase64String(string data)
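A tiny illustration of the pair (the commented values are what these calls produce):

byte[] data = { 0x01, 0x02, 0xFF };
string s = Convert.ToBase64String(data);   // "AQL/"
byte[] back = Convert.FromBase64String(s); // { 0x01, 0x02, 0xFF } again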
Base64 is a way to represent bytes in a textual form (as a string). So there is no such thing as a Base64 encoded byte[]. You'd have a base64 encoded string, which you could decode back to a byte[].
However, if you want to end up with a byte array, you could take the base64 encoded string and convert it to a byte array, like:
string base64String = Convert.ToBase64String(bytes);
byte[] stringBytes = Encoding.ASCII.GetBytes(base64String);
This, however, makes no sense because the best way to represent a byte[] as a byte[], is the byte[] itself :)
Here is code to base64 encode directly to a byte array (tested to perform within ±10% of the .NET implementation while allocating half the memory):
static public void testBase64EncodeToBuffer()
{
    for (int i = 1; i < 200; ++i)
    {
        // prep test data
        byte[] testData = new byte[i];
        for (int j = 0; j < i; ++j)
            testData[j] = (byte)(j ^ i);
        // test
        testBase64(testData);
    }
}

static void testBase64(byte[] data) // requires using System.Linq; for SequenceEqual
{
    if (!appendBase64(data, 0, data.Length, false).SequenceEqual(System.Text.Encoding.ASCII.GetBytes(Convert.ToBase64String(data))))
        throw new Exception("Base 64 encoding failed");
}

// Encoding table (not shown in the original post): the standard base64 alphabet as ASCII bytes.
static readonly byte[] base64EncodingTable =
    System.Text.Encoding.ASCII.GetBytes("ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/");

static public byte[] appendBase64(byte[] data,
                                  int offset,
                                  int size,
                                  bool addLineBreaks = false)
{
    byte[] buffer;
    int bufferPos = 0;
    int requiredSize = (4 * ((size + 2) / 3));
    // two line-break bytes after every 19 complete triplets (76 output characters)
    if (addLineBreaks) requiredSize += 2 * ((size / 3) / 19);
    buffer = new byte[requiredSize];
    UInt32 octet_a;
    UInt32 octet_b;
    UInt32 octet_c;
    UInt32 triple;
    int lineCount = 0;
    int sizeMod = size - (size % 3);
    // adding all data triplets
    for (; offset < sizeMod;)
    {
        octet_a = data[offset++];
        octet_b = data[offset++];
        octet_c = data[offset++];
        triple = (octet_a << 0x10) + (octet_b << 0x08) + octet_c;
        buffer[bufferPos++] = base64EncodingTable[(triple >> 3 * 6) & 0x3F];
        buffer[bufferPos++] = base64EncodingTable[(triple >> 2 * 6) & 0x3F];
        buffer[bufferPos++] = base64EncodingTable[(triple >> 1 * 6) & 0x3F];
        buffer[bufferPos++] = base64EncodingTable[(triple >> 0 * 6) & 0x3F];
        if (addLineBreaks)
        {
            if (++lineCount == 19)
            {
                buffer[bufferPos++] = 13; // CR
                buffer[bufferPos++] = 10; // LF
                lineCount = 0;
            }
        }
    }
    // last bytes
    if (sizeMod < size)
    {
        octet_a = offset < size ? data[offset++] : (UInt32)0;
        octet_b = offset < size ? data[offset++] : (UInt32)0;
        octet_c = (UInt32)0; // last character is definitely padded
        triple = (octet_a << 0x10) + (octet_b << 0x08) + octet_c;
        buffer[bufferPos++] = base64EncodingTable[(triple >> 3 * 6) & 0x3F];
        buffer[bufferPos++] = base64EncodingTable[(triple >> 2 * 6) & 0x3F];
        buffer[bufferPos++] = base64EncodingTable[(triple >> 1 * 6) & 0x3F];
        buffer[bufferPos++] = base64EncodingTable[(triple >> 0 * 6) & 0x3F];
        // add padding '='
        sizeMod = size % 3;
        // last character is definitely padded
        buffer[bufferPos - 1] = (byte)'=';
        if (sizeMod == 1) buffer[bufferPos - 2] = (byte)'=';
    }
    return buffer;
}
byte[] base64EncodedStringBytes = Encoding.ASCII.GetBytes(Convert.ToBase64String(binaryData));
Based on your edit and comments, would this be what you're after?
byte[] newByteArray = UTF8Encoding.UTF8.GetBytes(Convert.ToBase64String(currentByteArray));
You could use the String Convert.ToBase64String(byte[]) to encode the byte array into a base64 string, then Byte[] Convert.FromBase64String(string) to convert the resulting string back into a byte array.
To turn your image from a byte array into a base64 string:
Model property:
public byte[] NomineePhoto { get; set; }
public string NomineePhoneInBase64Str
{
get {
if (NomineePhoto == null)
return "";
return $"data:image/png;base64,{Convert.ToBase64String(NomineePhoto)}";
}
}
In the view:
<img style="height:50px;width:50px" src="@item.NomineePhoneInBase64Str" />
import base64
encoded = base64.b64encode(b'[3.\x01#\xbcr\xa9/$\xc3\xe1 "')
print(encoded)
data = base64.b64decode(encoded)
print(data)
This approach works reliably irrespective of the characters involved; for reference, the test bytes above include a space and a double quote.
public void ProcessRequest(HttpContext context)
{
    string constring = ConfigurationManager.ConnectionStrings["SQL_Connection_String"].ConnectionString;
    SqlConnection conn = new SqlConnection(constring);
    conn.Open();
    SqlCommand cmd = new SqlCommand("select image1 from TestGo where TestId=1", conn);
    SqlDataReader dr = cmd.ExecuteReader();
    dr.Read();
    MemoryStream str = new MemoryStream();
    context.Response.Clear();
    Byte[] bytes = (Byte[])dr[0];
    string d = System.Text.Encoding.Default.GetString(bytes);
    byte[] bytes2 = Convert.FromBase64String(d);
    //context.Response.Write(d);
    Image img = Image.FromStream(new MemoryStream(bytes2));
    img.Save(context.Response.OutputStream, ImageFormat.Png);
    context.Response.Flush();
    str.WriteTo(context.Response.OutputStream);
    str.Dispose();
    str.Close();
    conn.Close();
    context.Response.End();
}

Matching Java Triple DES result to C# one

I have the following C# implementation of triple DES
byte[] bKey = HexToBytes("C67DDB0CE47D27FAF6F32ECA5C99E8AF");
byte[] bMsg = HexToBytes("ff00");
byte[] iv = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 };
DESCryptoServiceProvider des = new DESCryptoServiceProvider();
des.Padding = PaddingMode.Zeros;
des.Mode = CipherMode.CBC;
byte[] bK1 = new byte[8];
for (int i = 0; i < 8; i++) bK1[i] = bKey[i];
byte[] bK2 = new byte[8];
for (int i = 0; i < 8; i++) bK2[i] = bKey[i + 8];
ICryptoTransform ict1 = des.CreateEncryptor(bK1, iv);
byte[] bFt = ict1.TransformFinalBlock(bMsg, 0, bMsg.Length);
byte[] bLCb = new byte[8];
for (int i = 0; i < 8; i++) bLCb[i] = bFt[i + bFt.Length - 8];
des.Mode = CipherMode.ECB;
ICryptoTransform ict1_5 = des.CreateDecryptor(bK2, iv);
bLCb = ict1_5.TransformFinalBlock(bLCb, 0, bLCb.Length);
ICryptoTransform ict2 = des.CreateEncryptor(bK1, iv);
byte[] bMac = ict2.TransformFinalBlock(bLCb, 0, bLCb.Length);
ToHex(bMac); // outputs: 4BC0479D7889CF8E
I need to produce the same result in Java/Groovy, but I'm stuck.
The code I have for now is as follows:
byte[] bKey = Hex.decode("C67DDB0CE47D27FAF6F32ECA5C99E8AF")
byte[] bMsg = Hex.decode("ff00")
byte[] keyBytes = Arrays.copyOf(sKey.bytes, 24)
int j = 0, k = 16
while (j < 8) {
    keyBytes[k++] = keyBytes[j++]
}
SecretKey key3 = new SecretKeySpec(keyBytes, "DESede")
IvParameterSpec iv3 = new IvParameterSpec(new byte[8])
Cipher cipher3 = Cipher.getInstance("DESede/CBC/PKCS5Padding")
cipher3.init(Cipher.ENCRYPT_MODE, key3, iv3)
byte[] bMac = cipher3.doFinal(bMsg)
println new String(Hex.encode(bMac))
This one outputs: ef2c57c3fa18d0a5
Hex.decode() here is from Bouncy Castle.
I have also tried to reproduce the C# code in Java by using DES/CBC twice and ECB in the final round, which gave me yet another result: 48f63c809c38e1eb
It'd be great if someone could give me a hint about what I may be doing wrong.
Update:
Thanks everyone for your help! Here is the final code, which works as needed without much tweaking:
Security.addProvider(new BouncyCastleProvider())
byte[] bKey = Hex.decode("C67DDB0CE47D27FAF6F32ECA5C99E8AF")
byte[] bMsg = Hex.decode("ff00")
byte[] keyBytes = Arrays.copyOf(sKey.bytes, 24)
int j = 0, k = 16
while (j < 8) {
    keyBytes[k++] = keyBytes[j++]
}
SecretKey key3 = new SecretKeySpec(keyBytes, "DESede")
IvParameterSpec iv3 = new IvParameterSpec(new byte[8])
Cipher cipher3 = Cipher.getInstance("DESede/CBC/ZeroBytePadding")
cipher3.init(Cipher.ENCRYPT_MODE, key3, iv3)
byte[] bMac = cipher3.doFinal(bMsg)
println new String(Hex.encode(bMac))
You're using some non-standard padding and block chaining. You won't be able to use DESede. Try DES instead:
import javax.crypto.*
import javax.crypto.spec.*
def key1 = new SecretKeySpec("C67DDB0CE47D27FA".decodeHex(), "DES")
def key2 = new SecretKeySpec("F6F32ECA5C99E8AF".decodeHex(), "DES")
def plaintext = ("ff00" + "000000000000").decodeHex() // manually zero pad
def c1 = Cipher.getInstance("DES/CBC/NoPadding")
c1.init(Cipher.ENCRYPT_MODE, key1, new IvParameterSpec(new byte[8]))
def cipherText1 = c1.doFinal(plaintext)
def c2 = Cipher.getInstance("DES/CBC/NoPadding")
c2.init(Cipher.DECRYPT_MODE, key2, new IvParameterSpec(new byte[8]))
def cipherText2 = c2.doFinal(cipherText1)
def c3 = Cipher.getInstance("DES/ECB/NoPadding")
c3.init(Cipher.ENCRYPT_MODE, key1)
def cipherText3 = c3.doFinal(cipherText2)
assert cipherText3.encodeHex().toString() == "4bc0479d7889cf8e"
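A side note on why the DESede variant in the update can line up with the three-step C# code: for a single zero-padded block, encrypt-with-K1, decrypt-with-K2, encrypt-with-K1 collapses into one two-key triple-DES (EDE) operation on that block. The following C# cross-check is my own illustration of that equivalence, not part of the original answer (Convert.FromHexString/ToHexString require .NET 5+):

using System;
using System.Security.Cryptography;

class RetailMacCheck
{
    static void Main()
    {
        // 16-byte key: .NET expands it to K1, K2, K1 (two-key triple DES).
        byte[] key = Convert.FromHexString("C67DDB0CE47D27FAF6F32ECA5C99E8AF");
        // "ff00" zero-padded to one full 8-byte DES block.
        byte[] block = { 0xFF, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 };

        using (TripleDES tdes = TripleDES.Create())
        {
            tdes.Key = key;
            tdes.Mode = CipherMode.ECB;      // one block with a zero IV: CBC would give the same bytes
            tdes.Padding = PaddingMode.None; // the block is already padded
            byte[] mac = tdes.CreateEncryptor().TransformFinalBlock(block, 0, block.Length);
            Console.WriteLine(Convert.ToHexString(mac)); // expect 4BC0479D7889CF8E, as reported in the question
        }
    }
}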

"Padding is invalid and cannot be removed" when decrypting with Rijndael

I'm trying to decrypt a string encrypted with the Rijndael algorithm. The scheme pads the key and the IV on the right with "#" if they are less than 16 characters long. The string to decrypt, and the key, are received from a web service in an XML SOAP message. The IV is the MAC address of my machine (which the server uses as the IV to encrypt the string). When I try to decrypt the received string, my program crashes at this instruction:
while ((num5 = stream3.ReadByte()) != -1)
and gives me the error "Padding is invalid and cannot be removed".
I searched for this error on MSDN, and it says it happens when the IV used to encrypt is different from the IV used to decrypt; but, I repeat, the IV is the MAC address and it is the same every time.
This is the source code of the Encrypt and Decrypt functions:
public static string Decrypt(string strInputString, string strKeyString, string myIV)
{
    if ((strInputString == null) || (strInputString.Length == 0))
    {
        return strInputString;
    }
    try
    {
        int num5;
        int keySize = 0x100;
        int blockSize = 0x100;
        int length = keySize / 0x10;
        if (strKeyString.Length > length)
        {
            strKeyString = strKeyString.Substring(0, length);
        }
        if (strKeyString.Length < length)
        {
            strKeyString = strKeyString.PadRight(length, '#');
        }
        Encoding.Unicode.GetBytes(strKeyString);
        if (myIV.Length > length)
        {
            myIV = myIV.Substring(0, length);
        }
        if (myIV.Length < length)
        {
            myIV = myIV.PadRight(length, '#');
        }
        Encoding.Unicode.GetBytes(myIV);
        byte[] bytes = Encoding.Unicode.GetBytes(strKeyString);
        byte[] rgbIV = Encoding.Unicode.GetBytes(myIV);
        RijndaelManaged managed = new RijndaelManaged
        {
            BlockSize = blockSize,
            KeySize = keySize
        };
        MemoryStream stream = new MemoryStream();
        for (int i = 0; i < strInputString.Length; i += 2)
        {
            stream.WriteByte(byte.Parse(strInputString.Substring(i, 2), NumberStyles.AllowHexSpecifier));
        }
        stream.Position = 0L;
        MemoryStream stream2 = new MemoryStream();
        CryptoStream stream3 = new CryptoStream(stream, managed.CreateDecryptor(bytes, rgbIV), CryptoStreamMode.Read);
        while ((num5 = stream3.ReadByte()) != -1)
        {
            stream2.WriteByte((byte)num5);
        }
        stream3.Close();
        stream2.Close();
        stream.Close();
        byte[] buffer3 = stream2.ToArray();
        return Encoding.Unicode.GetString(buffer3);
    }
    catch (Exception exception)
    {
        Log.Error(exception.Message);
        throw; // rethrow: otherwise not all code paths return a value
    }
}
public static string Encrypt(string strInputString, string strKeyString, string myIV)
{
    if ((strInputString == null) || (strInputString.Length == 0))
    {
        return strInputString;
    }
    try
    {
        int num4;
        int keySize = 0x100;
        int blockSize = 0x100;
        int length = keySize / 0x10;
        if (strKeyString.Length > length)
        {
            strKeyString = strKeyString.Substring(0, length);
        }
        if (strKeyString.Length < length)
        {
            strKeyString = strKeyString.PadRight(length, '#');
        }
        Encoding.Unicode.GetBytes(strKeyString);
        if (myIV.Length > length)
        {
            myIV = myIV.Substring(0, length);
        }
        if (myIV.Length < length)
        {
            myIV = myIV.PadRight(length, '#');
        }
        Encoding.Unicode.GetBytes(myIV);
        byte[] bytes = Encoding.Unicode.GetBytes(strKeyString);
        byte[] rgbIV = Encoding.Unicode.GetBytes(myIV);
        string str = "";
        RijndaelManaged managed = new RijndaelManaged
        {
            BlockSize = blockSize,
            KeySize = keySize
        };
        MemoryStream stream = new MemoryStream(Encoding.Unicode.GetBytes(strInputString));
        MemoryStream stream2 = new MemoryStream();
        CryptoStream stream3 = new CryptoStream(stream2, managed.CreateEncryptor(bytes, rgbIV), CryptoStreamMode.Write);
        while ((num4 = stream.ReadByte()) != -1)
        {
            stream3.WriteByte((byte)num4);
        }
        stream3.Close();
        stream2.Close();
        stream.Close();
        foreach (byte num5 in stream2.ToArray())
        {
            str = str + num5.ToString("X2");
        }
        return str;
    }
    catch (Exception exception)
    {
        Log.Error(exception.Message);
        throw; // rethrow: otherwise not all code paths return a value
    }
}
Works fine for me with the test code below - are you sure you're passing in the encrypted string for decryption?
static void Main(string[] args)
{
    string strInputString = "test";
    string strKeyString = "test123";
    string myIV = GetMacAddress();
    string encryptedString = Encrypt(strInputString, strKeyString, myIV);
    string decryptedString = Decrypt(encryptedString, strKeyString, myIV);
}
public static string Decrypt(string strInputString, string strKeyString, string myIV)
{
    if ((strInputString == null) || (strInputString.Length == 0))
    {
        return strInputString;
    }
    int num5;
    int keySize = 0x100;
    int blockSize = 0x100;
    int length = keySize / 0x10;
    if (strKeyString.Length > length)
    {
        strKeyString = strKeyString.Substring(0, length);
    }
    if (strKeyString.Length < length)
    {
        strKeyString = strKeyString.PadRight(length, '#');
    }
    Encoding.Unicode.GetBytes(strKeyString);
    if (myIV.Length > length)
    {
        myIV = myIV.Substring(0, length);
    }
    if (myIV.Length < length)
    {
        myIV = myIV.PadRight(length, '#');
    }
    Encoding.Unicode.GetBytes(myIV);
    byte[] bytes = Encoding.Unicode.GetBytes(strKeyString);
    byte[] rgbIV = Encoding.Unicode.GetBytes(myIV);
    RijndaelManaged managed = new RijndaelManaged
    {
        BlockSize = blockSize,
        KeySize = keySize
    };
    MemoryStream stream = new MemoryStream();
    for (int i = 0; i < strInputString.Length; i += 2)
    {
        stream.WriteByte(byte.Parse(strInputString.Substring(i, 2), NumberStyles.AllowHexSpecifier));
    }
    stream.Position = 0L;
    MemoryStream stream2 = new MemoryStream();
    CryptoStream stream3 = new CryptoStream(stream, managed.CreateDecryptor(bytes, rgbIV), CryptoStreamMode.Read);
    while ((num5 = stream3.ReadByte()) != -1)
    {
        stream2.WriteByte((byte)num5);
    }
    stream3.Close();
    stream2.Close();
    stream.Close();
    byte[] buffer3 = stream2.ToArray();
    return Encoding.Unicode.GetString(buffer3);
}
public static string Encrypt(string strInputString, string strKeyString, string myIV)
{
    if ((strInputString == null) || (strInputString.Length == 0))
    {
        return strInputString;
    }
    int num4;
    int keySize = 0x100;
    int blockSize = 0x100;
    int length = keySize / 0x10;
    if (strKeyString.Length > length)
    {
        strKeyString = strKeyString.Substring(0, length);
    }
    if (strKeyString.Length < length)
    {
        strKeyString = strKeyString.PadRight(length, '#');
    }
    Encoding.Unicode.GetBytes(strKeyString);
    if (myIV.Length > length)
    {
        myIV = myIV.Substring(0, length);
    }
    if (myIV.Length < length)
    {
        myIV = myIV.PadRight(length, '#');
    }
    Encoding.Unicode.GetBytes(myIV);
    byte[] bytes = Encoding.Unicode.GetBytes(strKeyString);
    byte[] rgbIV = Encoding.Unicode.GetBytes(myIV);
    string str = "";
    RijndaelManaged managed = new RijndaelManaged
    {
        BlockSize = blockSize,
        KeySize = keySize
    };
    MemoryStream stream = new MemoryStream(Encoding.Unicode.GetBytes(strInputString));
    MemoryStream stream2 = new MemoryStream();
    CryptoStream stream3 = new CryptoStream(stream2, managed.CreateEncryptor(bytes, rgbIV), CryptoStreamMode.Write);
    while ((num4 = stream.ReadByte()) != -1)
    {
        stream3.WriteByte((byte)num4);
    }
    stream3.Close();
    stream2.Close();
    stream.Close();
    foreach (byte num5 in stream2.ToArray())
    {
        str = str + num5.ToString("X2");
    }
    return str;
}
private static string GetMacAddress()
{
    string macAddresses = "";
    foreach (NetworkInterface nic in NetworkInterface.GetAllNetworkInterfaces())
    {
        if (nic.OperationalStatus == OperationalStatus.Up)
        {
            macAddresses += nic.GetPhysicalAddress().ToString();
            break;
        }
    }
    return macAddresses;
}
The padding is probably PKCS #5: instead of '#', pad with a byte value equal to the number of bytes to pad.
So if you have 5 bytes to pad, the padding bytes will be 05 05 05 05 05; if you need to pad 2 bytes, the padding will be 02 02.
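A small sketch of that rule (my own illustration of PKCS #5/#7-style padding, not from the original answer):

using System;

class PaddingSketch
{
    // Pads input up to the next multiple of blockSize with N copies of the byte N.
    static byte[] Pkcs7Pad(byte[] input, int blockSize)
    {
        int pad = blockSize - (input.Length % blockSize); // always 1..blockSize, never zero
        byte[] output = new byte[input.Length + pad];
        Buffer.BlockCopy(input, 0, output, 0, input.Length);
        for (int i = input.Length; i < output.Length; i++)
            output[i] = (byte)pad;
        return output;
    }

    static void Main()
    {
        // 11 input bytes with a 16-byte block: 5 padding bytes, each 0x05.
        byte[] padded = Pkcs7Pad(new byte[11], 16);
        Console.WriteLine(padded[15]); // 5
    }
}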
I don't think the IV is the issue; it's the password itself. I suspect that the password used by the server to encrypt is not the password being used by the client to decrypt.
The only way I could reproduce the crash the OP was reporting was by passing an incorrect password to the Decrypt() method. Passing in an IV that was just slightly incorrect didn't throw an exception. For example, I encrypted with the IV as a MAC address in caps using colons, and then decrypted with the IV as the same MAC address in lower case using dashes: the first few bytes were scrambled, but by about byte 16 everything matched the plain text original.
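A small sketch illustrating that observation (my own example, not from the original answer): with CBC, a wrong IV but correct key only scrambles the first plaintext block, and the padding on the final block is still removed cleanly.

using System;
using System.Security.Cryptography;
using System.Text;

class CbcWrongIvDemo
{
    static void Main()
    {
        byte[] key = Encoding.ASCII.GetBytes("0123456789abcdef0123456789abcdef"); // 32 bytes: AES-256
        byte[] plain = Encoding.ASCII.GetBytes("block one ......block two ......"); // two 16-byte blocks

        using (Aes aes = Aes.Create())
        {
            aes.Mode = CipherMode.CBC;
            aes.Padding = PaddingMode.PKCS7;
            aes.Key = key;

            aes.IV = Encoding.ASCII.GetBytes("AAAAAAAAAAAAAAAA");
            byte[] cipher = aes.CreateEncryptor().TransformFinalBlock(plain, 0, plain.Length);

            aes.IV = Encoding.ASCII.GetBytes("BBBBBBBBBBBBBBBB"); // wrong IV, right key
            byte[] decrypted = aes.CreateDecryptor().TransformFinalBlock(cipher, 0, cipher.Length);

            // No padding exception; only the first block is garbled.
            Console.WriteLine(Encoding.ASCII.GetString(decrypted, 16, 16)); // "block two ......"
        }
    }
}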
Are you using the same character encoding as the original string at the time of encryption?
I had a similar issue; the difference, in the end, was how I was passing the data (string) to be encrypted. If I copied and pasted it into a textbox, the encryption was different than if I hardcoded the string into the program. So, in short, the encoding of the original data makes a big difference: while the characters may look the same, they could be represented quite differently underneath (one byte per character, two bytes, etc.).
Find out how the original string was encoded prior to the encryption algorithm (and check the IV parameter encoding as well).
