Encrypt and decrypt with MachineKey in C#

I'm trying to encrypt and decrypt an Id with MachineKey.
Here is my code that calls the encrypt and decrypt functions:
var encryptedId = Encryption.Protect(profileId.ToString(), UserId);
var decryptedId = Encryption.UnProtect(encryptedId, UserId);
Here are the functions:
public static string Protect(string text, string purpose)
{
    if (string.IsNullOrEmpty(text))
    {
        return string.Empty;
    }
    byte[] stream = Encoding.Unicode.GetBytes(text);
    byte[] encodedValues = MachineKey.Protect(stream, purpose);
    return HttpServerUtility.UrlTokenEncode(encodedValues);
}
public static string UnProtect(string text, string purpose)
{
    if (string.IsNullOrEmpty(text))
    {
        return string.Empty;
    }
    byte[] stream = HttpServerUtility.UrlTokenDecode(text);
    byte[] decodedValues = MachineKey.Unprotect(stream, purpose);
    return Encoding.UTF8.GetString(decodedValues);
}
The input to the Protect method is 15, which results in the encryptedId variable holding the following string: 6wOttbtJoVBV7PxhVWXGz4AQVYcuyHvTyJhAoPu4Okd2aKhhCGbKlK_T4q3GgirotfOZYZXke0pMdgwSmC5vxg2
To decrypt it, I pass this string as a parameter to the UnProtect method.
The result of the decryption should be 15, but is instead: 1\05\0
I can't understand why. I have tried using only integers in this function, but I still have the same problem: the output of the decryption differs from the input.

You have an encoding mismatch: you encode a buffer containing the UTF-16 (Encoding.Unicode) representation of the string, which uses 2 bytes per character and therefore interleaves the \0 bytes you see, but you decode it as UTF-8 (Encoding.UTF8). You need to use the same encoding in both methods.
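For example, a minimal fix is to make UnProtect decode with the same encoding Protect used to encode:
public static string UnProtect(string text, string purpose)
{
    if (string.IsNullOrEmpty(text))
    {
        return string.Empty;
    }
    byte[] stream = HttpServerUtility.UrlTokenDecode(text);
    byte[] decodedValues = MachineKey.Unprotect(stream, purpose);
    // Decode with Encoding.Unicode to match Encoding.Unicode.GetBytes in Protect.
    return Encoding.Unicode.GetString(decodedValues);
}
Alternatively, switch Protect to Encoding.UTF8.GetBytes; either way, the two methods must agree.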

Related

What can cause Base64 decoding to throw a FormatException

I am using C# and .NET to encode and decode base64 strings. The following are snippets of my code:
Base64 encoding:
using (var stream = new MemoryStream())
{
    …
    return Convert.ToBase64String(stream.ToArray());
}
Base64 decoding:
byte[] bytes = Convert.FromBase64String(messageBody);
My code fails 99% of the time, with only a 1% chance of succeeding. The stack trace is as follows:
5xx Error Returned: System.FormatException: The input is not a valid Base-64 string as it contains a non-base 64 character, more than two padding characters, or an illegal character among the padding characters. at System.Convert.FromBase64_ComputeResultLength(Char* inputPtr, Int32 inputLength) at System.Convert.FromBase64CharPtr(Char* inputPtr, Int32 inputLength) at System.Convert.FromBase64String(String s)
Does anyone know what can cause base64 decoding to fail? My encoding and decoding methods are symmetric, and I am really confused about what the root cause of this issue could be.
Thanks for all your replies.
It turned out there were still some old messages in JSON format that had previously failed to be delivered and kept being retried in our system. Meanwhile, a new code change on our receiving side was deployed, and the receiving side started expecting messages in protobuf format, which resulted in deserialization failures when the old JSON-format messages arrived.
In order to debug an issue like this, I usually write some tests or create a console app to watch the variables as they change from function to function.
One possible scenario for base64 decoding to fail is when the decoder input is URL-encoded. This is common when you pass an encrypted string in a URL, for example: it gets URL-encoded automatically, and afterwards it sometimes can and sometimes can't be decoded, depending on which characters the Base64 output happens to contain.
Here's a simple console app to demonstrate this.
class Program
{
    static void Main(string[] args)
    {
        TestEncodeDecode("test");
        TestEncodeDecode("testa");
        TestEncodeDecode("testaa");
        Console.ReadLine();
    }

    private static void TestEncodeDecode(string input)
    {
        string encoded = Encode(input);
        Console.WriteLine($"Encoded: {encoded}");
        string urlEncoded = WebUtility.UrlEncode(encoded);
        Console.WriteLine($"UrlEncoded: {urlEncoded}");
        string decodedString = Decode(urlEncoded);
        Console.WriteLine($"Decoded: {decodedString}");
        Console.WriteLine();
    }

    private static string Decode(string urlEncoded)
    {
        try
        {
            byte[] decoded = Convert.FromBase64String(urlEncoded);
            return Encoding.ASCII.GetString(decoded);
        }
        catch (Exception)
        {
            return "Decoding failed";
        }
    }

    private static string Encode(string input)
    {
        byte[] bytes = Encoding.ASCII.GetBytes(input);
        using (var stream = new MemoryStream())
        {
            stream.Write(bytes, 0, bytes.Length);
            return Convert.ToBase64String(stream.ToArray());
        }
    }
}
You'll see that the first two arguments ("test" and "testa") fail to decode, because their Base64 forms end in '=' padding, which UrlEncode turns into %3D; the third ("testaa") encodes without padding and will succeed.
In order to "fix" this, change the Decode method as follows:
private static string Decode(string urlEncoded)
{
    try
    {
        string regularEncodedString = WebUtility.UrlDecode(urlEncoded);
        byte[] decoded = Convert.FromBase64String(regularEncodedString);
        return Encoding.ASCII.GetString(decoded);
    }
    catch (Exception)
    {
        return "Decoding failed";
    }
}
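Another option, if the value has to travel in a URL, is to use a URL-safe Base64 variant instead of URL-encoding the standard one. A minimal sketch (the helper names here are hypothetical, not from the original answer):
private static string ToUrlSafeBase64(byte[] bytes)
{
    // Replace the two URL-hostile characters and drop the padding.
    return Convert.ToBase64String(bytes)
        .Replace('+', '-')
        .Replace('/', '_')
        .TrimEnd('=');
}

private static byte[] FromUrlSafeBase64(string s)
{
    string padded = s.Replace('-', '+').Replace('_', '/');
    // Restore the padding that TrimEnd removed.
    padded = padded.PadRight(padded.Length + (4 - padded.Length % 4) % 4, '=');
    return Convert.FromBase64String(padded);
}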

AES128-ECB under UWP

I need help retrieving an AES128-ECB encrypted string in a Universal Windows application.
I have a password string that is used as the key material. Using its 32-character MD5 hash value, I would like to encrypt text with AES128-ECB.
Now I am using this for creating MD5Hash:
public string GetMD5Hash(String strMsg)
{
    string strAlgName = HashAlgorithmNames.Md5;
    IBuffer buffUtf8Msg = CryptographicBuffer.ConvertStringToBinary(strMsg, BinaryStringEncoding.Utf8);
    HashAlgorithmProvider objAlgProv = HashAlgorithmProvider.OpenAlgorithm(strAlgName);
    string strAlgNameUsed = objAlgProv.AlgorithmName;
    IBuffer buffHash = objAlgProv.HashData(buffUtf8Msg);
    if (buffHash.Length != objAlgProv.HashLength)
    {
        throw new Exception("There was an error creating the hash");
    }
    string hex = CryptographicBuffer.EncodeToHexString(buffHash);
    return hex;
}
And this code for encryption:
public string Encrypt(string input, string pass)
{
    SymmetricKeyAlgorithmProvider provider = SymmetricKeyAlgorithmProvider.OpenAlgorithm(SymmetricAlgorithmNames.AesEcbPkcs7);
    CryptographicKey key;
    string encrypted = "";
    byte[] keyhash = Encoding.ASCII.GetBytes(GetMD5Hash(pass));
    key = provider.CreateSymmetricKey(CryptographicBuffer.CreateFromByteArray(keyhash));
    IBuffer data = CryptographicBuffer.CreateFromByteArray(Encoding.Unicode.GetBytes(input));
    encrypted = CryptographicBuffer.EncodeToBase64String(CryptographicEngine.Encrypt(key, data, null));
    return encrypted;
}
The reason I am using SymmetricAlgorithmNames.AesEcbPkcs7 is that when I use SymmetricAlgorithmNames.AesEcb the output string is empty, and I don't understand why.
My question is: does my code create an AES128-ECB encryption? I'm not really sure it does, because the software that is waiting for the encrypted data does not recognize it and therefore cannot decrypt it.
Yes, your code creates an AES encryption with ECB cipher mode and PKCS7 padding. If I understand your problem correctly, you said this works with AesEcbPkcs7 but fails with AesEcb, and your decryption software can't handle it.
The difference between AesEcbPkcs7 and AesEcb is that AesEcbPkcs7 uses the PKCS#7 block padding mode, which automatically pads the message to an appropriate length, so you don't need to pad the plaintext to a multiple of the algorithm's block size yourself. With plain AesEcb the input must already be block-aligned, otherwise you get an exception: "The supplied user buffer is not valid for the requested operation." So if you control both sides, I recommend sticking with AesEcbPkcs7.
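To illustrate what the PKCS#7 mode does for you automatically, here is a minimal sketch of the padding rule (an illustration, not code from the original answer): each padding byte stores the number of bytes added.
static byte[] Pkcs7Pad(byte[] data, int blockSize = 16)
{
    // padLen is always 1..blockSize, so a block-aligned input gains a full padding block.
    int padLen = blockSize - (data.Length % blockSize);
    byte[] padded = new byte[data.Length + padLen];
    Buffer.BlockCopy(data, 0, padded, 0, data.Length);
    for (int i = data.Length; i < padded.Length; i++)
    {
        padded[i] = (byte)padLen; // each pad byte holds the pad length
    }
    return padded;
}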
So one possibility I can guess at: your decryption software may implement AesEcbPkcs7 but not the decryption of plain AesEcb. I tested decryption based on your code; this code can decrypt AesEcb correctly:
public string Decrypt(string input, string pass)
{
    var keyHash = Encoding.ASCII.GetBytes(GetMD5Hash(pass));
    // Create a buffer that contains the encoded message to be decrypted.
    IBuffer toDecryptBuffer = CryptographicBuffer.DecodeFromBase64String(input);
    // Open a symmetric algorithm provider for the specified algorithm.
    SymmetricKeyAlgorithmProvider aes = SymmetricKeyAlgorithmProvider.OpenAlgorithm(SymmetricAlgorithmNames.AesEcb);
    // Create a symmetric key.
    var symetricKey = aes.CreateSymmetricKey(keyHash.AsBuffer());
    var buffDecrypted = CryptographicEngine.Decrypt(symetricKey, toDecryptBuffer, null);
    string strDecrypted = CryptographicBuffer.ConvertBinaryToString(BinaryStringEncoding.Utf8, buffDecrypted);
    return strDecrypted;
}
Another possibility is that you caught the exception ("The supplied user buffer is not valid for the requested operation") when using AesEcb and handled it inside your call to Encrypt(string input, string pass), so the encryption actually failed.

Converting content in a text file from a Base64 string

I am practicing encryption and decryption.
After I have encrypted some data, I convert the bytes to a Base64 string and store it in a text file.
After some time I want to decrypt it again, but for that to work I have to convert the file content from a Base64 string back to bytes.
I tried this:
string path = @"C:\encrypt.txt";
string myfile = File.ReadAllText(path);
byte[] convertion = Convert.FromBase64String(myfile);
That gives me an error because the text is actually not a Base64 string.
Is there any way to do the conversion?
You can use the following functions for saving and reading Base64 text:
public static void WriteAllBase64Text(string path, string text)
{
    File.WriteAllText(path, Convert.ToBase64String(Encoding.UTF8.GetBytes(text)));
}

public static string ReadAllBase64Text(string path)
{
    var base64 = File.ReadAllText(path);
    var decoded = System.Convert.FromBase64String(base64);
    return System.Text.Encoding.UTF8.GetString(decoded);
}
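A quick usage sketch (the file path here is just an example):
WriteAllBase64Text(@"C:\encrypt.txt", "secret text");
string roundTripped = ReadAllBase64Text(@"C:\encrypt.txt");
// roundTripped is "secret text" again; the file on disk contains only Base64.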

Data encrypted in C# is 1 byte too long to be decrypted in Java

I have a server written in Java which converts its RSA key to the XML format used by .NET before sending it to the client:
public String getPublicKeyXML() {
    try {
        KeyFactory factory = KeyFactory.getInstance("RSA");
        RSAPublicKeySpec publicKey = factory.getKeySpec(this.keyPair.getPublic(), RSAPublicKeySpec.class);
        byte[] modulus = publicKey.getModulus().toByteArray();
        byte[] exponent = publicKey.getPublicExponent().toByteArray();
        String modulusStr = Base64.encodeBytes(modulus);
        String exponentStr = Base64.encodeBytes(exponent);
        String format =
            "<RSAKeyValue>" +
            "<Modulus>%s</Modulus>" +
            "<Exponent>%s</Exponent>" +
            "</RSAKeyValue>";
        return String.format(format, modulusStr, exponentStr);
    } catch (Exception e) {
        this.server.logException(e);
        return "";
    }
}
The client, written in C#, then loads the key and uses it to encrypt a 256 bit AES key:
public static byte[] encrypt(string xmlKey, byte[] bytes)
{
    RSACryptoServiceProvider rsa = new RSACryptoServiceProvider();
    rsa.FromXmlString(xmlKey);
    byte[] cipherBytes = rsa.Encrypt(bytes, false);
    rsa.Clear();
    return cipherBytes;
}
The server is then supposed to decrypt the AES key using its private RSA key:
public byte[] decrypt(byte[] data) {
    try {
        PrivateKey privateKey = this.keyPair.getPrivate();
        Cipher cipher = Cipher.getInstance("RSA/ECB/PKCS1Padding");
        cipher.init(Cipher.DECRYPT_MODE, privateKey);
        byte[] cipherData = cipher.doFinal(data);
        return cipherData;
    } catch (Exception e) {
        this.server.logException(e);
        return new byte[0];
    }
}
However, the server fails with an error stating "Data must not be longer than 384 bytes." Looking at the data to be decrypted, I noticed that it's 385 bytes. I tried increasing the RSA key length, and now the server tells me the data must be no longer than 512 bytes, while the encrypted data from the client is 513 bytes. Why is the encrypted data always one byte longer than expected?
EDIT:
Here's a sample XML-formatted key as is transmitted from the server to the client:
<RSAKeyValue><Modulus>ANsMd2dCF6RsD5v5qjlHEjHm0VWD99gSYHP+pvyU8OgNL9xM5+o+yMAxWISOwMii9vJk1IzYGf18Fj2sMb5BsInlG2boZHb6KHh7v8ObPa4MuwB/U63i8AVU3N/JTugaPH0TKvo1WNUooXEHT23nOk+vh1QipzgKQYGl68qU35vKmpNAa79l1spXA66LckTWal4art9T08Rxgn9cMWujlF+wh9EQKQoxxgj4gCoXWRDTFYjRo/Mp5xDPwNjloTs/vFCPLvY7oI+lVrHhrPyz1R473ZuEhZm+rSeGBcY9I8vhg0AIixN7KYBLhrIecmqoNZHi6LohjD2F9zhdLaTU0IIU8eeKpbEZ5eB1kYngMONBq3A/IoG0Qa/7EcSAMMspBEObffK9kCNzvnbFg5wLuy8EHNaK3nmnuTppgCwCyNqZyHeAbZaUBjNguLhHtqkHFiPJ063Xesj9UbSsCmlBliGTDXWfeJANnjGP6D3R+uLXVy9SZe+cY92JW3eZA2k//w==</Modulus><Exponent>AQAB</Exponent></RSAKeyValue>
I have verified that the data sent is the same as the data being received.
Knocking off the last byte results in a BadPaddingException. I also tried knocking off the first byte, with the same result.
I found the problem. BigInteger's toByteArray() returns a two's-complement representation, so when the most significant bit of the modulus is set it includes a leading zero sign byte. I just removed the leading zeros from the array and it now works like a charm!
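For illustration, the trimming looks like this (shown in C# to match the rest of this page; the asker applied the equivalent logic on the Java side before Base64-encoding the modulus):
static byte[] TrimLeadingZero(byte[] bigEndian)
{
    // toByteArray() prepends a 0x00 sign byte when the top bit of the magnitude is set.
    if (bigEndian.Length > 1 && bigEndian[0] == 0)
    {
        byte[] trimmed = new byte[bigEndian.Length - 1];
        Buffer.BlockCopy(bigEndian, 1, trimmed, 0, trimmed.Length);
        return trimmed;
    }
    return bigEndian;
}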
This will not fix the problem (I tested it, to no avail), but I wanted to call your attention to the fact that RSACryptoServiceProvider implements the IDisposable interface and should therefore be properly disposed of when you are done with it. Your C# encrypt method can be written a bit better (and more concisely!) like this:
public static byte[] encrypt(string xmlKey, byte[] bytes)
{
    using (var rsa = new RSACryptoServiceProvider())
    {
        rsa.FromXmlString(xmlKey);
        return rsa.Encrypt(bytes, false);
    }
}

How do I convert an array on the client into a byte[] on the server?

I have a class that encrypts and decrypts a string. It uses Rijndael. Anyway, I am trying to store an encrypted value on the client, and that works all fine and dandy. My problem is that when I want to decrypt the value, I need the Key and IV (init vector) that were used to encrypt the string. They are byte[] values. I currently output them to the client through a JavaScript variable, and I need them back when I call a service, as this app talks to a mainframe and needs the user's credentials. They are output to the client in the form of
var vector = [143,147,31,70,195,72,228,190,152,222,65,240,152,183,0,161];
I am able to pass that value to my service as a string. My question is: inside my service, how can I convert this string back into a byte array? I have tried
Encoding.ASCII.GetBytes("[143,147,31,70,195,72,228,190,152,222,65,240,152,183,0,161]");
but that is not what I want; it just builds a new byte[] from the characters of the string. I need a byte array whose values are the ones inside the string. Is this possible? Please provide a quick sample if you know how to do this.
Thanks,
~ck in San Diego
I would encode the bytes in base64, using System.Convert.ToBase64String and FromBase64String.
Edit: this program demonstrates this in more detail
class A
{
    public static void Main()
    {
        byte[] a = new byte[] { 143, 147, 31, 70, 195, 72, 228, 190, 152, 222, 65, 240, 152, 183, 0, 161 };
        string s = System.Convert.ToBase64String(a);
        System.Console.WriteLine(s);
        byte[] b = System.Convert.FromBase64String(s);
        System.Console.Write("[");
        foreach (var n in b)
            System.Console.Write(n + ",");
        System.Console.WriteLine("]");
    }
}
I think these two methods in C# might help you. I have used them with my crypto routine to get a byte[] for my IV and key. Could you output the string returned by the GetString method below to JavaScript, and then use that value when calling the service?
private static byte[] GetArray(string input)
{
    List<byte> bytes = new List<byte>();
    for (int i = 0; i < input.Length; i += 2)
    {
        // Each byte is encoded as two hex characters.
        string hex = input.Substring(i, 2);
        bytes.Add(Convert.ToByte(Convert.ToUInt32(hex, 16)));
    }
    return bytes.ToArray();
}

private static string GetString(byte[] input)
{
    StringBuilder buffer = new StringBuilder(input.Length);
    foreach (byte b in input)
    {
        buffer.Append(b.ToString("x2")); // lowercase two-digit hex
    }
    return buffer.ToString();
}
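For completeness, a minimal sketch that parses the bracketed decimal list from the question directly (assuming the value arrives exactly in the "[143,147,...]" form shown above; requires using System.Linq):
private static byte[] ParseByteList(string input)
{
    // Strip the brackets, split on commas, and parse each decimal value as a byte.
    return input.Trim('[', ']')
                .Split(',')
                .Select(s => byte.Parse(s.Trim()))
                .ToArray();
}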
