I'm receiving a CryptographicException "Bad Hash.\r\n" from the code below when I call CreateSignature. Any ideas as to what might be causing this?
RSAPKCS1SignatureFormatter RSAFormatter =
new RSAPKCS1SignatureFormatter(new RSACryptoServiceProvider());
RSAFormatter.SetHashAlgorithm("SHA256");
byte[] signedHash = RSAFormatter.CreateSignature(myHash);
Your code snippet does not show how you get myHash, but my guess is that it is not a 32-byte array. From MSDN:
The hash size for the SHA256 algorithm
is 256 bits.
Try defining your myHash like this (just an ugly sample):
// 256 bit hash size
byte[] myHash = { 59,4,248,102,77,97,142,201,
210,12,224,93,25,41,100,197,
210,12,224,93,25,41,100,197,
213,134,130,135, 213,134,130,135};
When I ran your code with a hash of any other size, I got the exact same error. Running with the array defined above (256 bits, or 32 bytes), it worked.
byte[] bPlainText = Encoding.UTF8.GetBytes(PlainText);
SHA256 sha256 = new SHA256Managed();
bPlainText = sha256.ComputeHash(bPlainText); // 32 bytes for SHA-256
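Putting the two pieces together, the full flow might look roughly like this (a sketch; the plaintext string and the ephemeral key from the default RSACryptoServiceProvider constructor are my own placeholders):

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

class SignDemo
{
    static void Main()
    {
        string plainText = "hello world"; // placeholder input

        // Compute a 32-byte SHA-256 hash of the input.
        byte[] myHash;
        using (SHA256 sha256 = SHA256.Create())
        {
            myHash = sha256.ComputeHash(Encoding.UTF8.GetBytes(plainText));
        }

        // Sign the hash; CreateSignature expects exactly 32 bytes for "SHA256".
        var formatter = new RSAPKCS1SignatureFormatter(new RSACryptoServiceProvider());
        formatter.SetHashAlgorithm("SHA256");
        byte[] signedHash = formatter.CreateSignature(myHash);

        // Signature length equals the RSA key size in bytes.
        Console.WriteLine(signedHash.Length);
    }
}
```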
I need to decrypt a signature hash using RSA. I've got a hexadecimal string 288 characters long, which is a general public key from an institution. It represents hexadecimal bytes, so 144 bytes total.
The first 8 bytes are the so-called CAR, which is used for identification. The next 128 bytes are the modulus N, and the final 8 bytes are the exponent E.
I've never worked with cryptography before, so go easy on me. I'm using C# and the Bouncy Castle library for the decryption algorithms. Now, if I understand correctly, a 1024-bit modulus and a 64-bit exponent are not unusual. I currently have this bit of code:
public byte[] rsa_decrypt(byte[] data)
{
    var N = PublicKey.ToCharArray().Slice(16, 256);
    var E = PublicKey.ToCharArray().Slice(272, 16);
    RsaEngine rsa = new RsaEngine();
    Org.BouncyCastle.Math.BigInteger modulus = new Org.BouncyCastle.Math.BigInteger(new string(N).Insert(0, "00"), 16);
    Console.WriteLine(modulus);
    Org.BouncyCastle.Math.BigInteger exponent = new Org.BouncyCastle.Math.BigInteger(new string(E).Insert(0, "00"), 16);
    Console.WriteLine(exponent);
    RsaKeyParameters x = new RsaKeyParameters(false, modulus, exponent);
    var eng = new Pkcs1Encoding(new RsaEngine());
    eng.Init(false, x);
    return eng.ProcessBlock(data, 0, data.Length);
}
The Slice<T>(this T[], offset, length) method is just a small thing I wrote to cut arrays in pieces, nothing special, and it works. The "00" is inserted at the front of the string so that it can't be interpreted as a negative (signed) value, I believe.
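For reference, a Slice helper along those lines might look like this (a hypothetical reconstruction; the original implementation isn't shown):

```csharp
using System;

public static class ArrayExtensions
{
    // Copies `length` elements starting at `offset` into a new array.
    public static T[] Slice<T>(this T[] source, int offset, int length)
    {
        var result = new T[length];
        Array.Copy(source, offset, result, 0, length);
        return result;
    }
}
```

For example, `"0011AABB".ToCharArray().Slice(4, 4)` would yield the characters `AABB`.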
When I run this code I get the exception
Unhandled exception. Org.BouncyCastle.Crypto.InvalidCipherTextException: block incorrect
   at Org.BouncyCastle.Crypto.Encodings.Pkcs1Encoding.DecodeBlock(Byte[] input, Int32 inOff, Int32 inLen)
   at Org.BouncyCastle.Crypto.Encodings.Pkcs1Encoding.ProcessBlock(Byte[] input, Int32 inOff, Int32 length)
Obviously I'm doing something wrong. Can anybody tell me what I'm doing wrong, why it's wrong, and preferably what I should be doing instead? Again, I've never worked with crypto algorithms or this library before.
I have found an example that uses AES to encrypt text. The code is this:
public static string Encrypt(string PlainText, string Password,
string Salt = "Kosher", string HashAlgorithm = "SHA1",
int PasswordIterations = 2, string InitialVector = "OFRna73m*aze01xY",
int KeySize = 256)
{
if (string.IsNullOrEmpty(PlainText))
return "";
byte[] InitialVectorBytes = Encoding.ASCII.GetBytes(InitialVector);
byte[] SaltValueBytes = Encoding.ASCII.GetBytes(Salt);
byte[] PlainTextBytes = Encoding.UTF8.GetBytes(PlainText);
PasswordDeriveBytes DerivedPassword = new PasswordDeriveBytes(Password, SaltValueBytes, HashAlgorithm, PasswordIterations);
byte[] KeyBytes = DerivedPassword.GetBytes(KeySize / 8);
RijndaelManaged SymmetricKey = new RijndaelManaged();
SymmetricKey.Mode = CipherMode.CBC;
byte[] CipherTextBytes = null;
using (ICryptoTransform Encryptor = SymmetricKey.CreateEncryptor(KeyBytes, InitialVectorBytes))
{
using (MemoryStream MemStream = new MemoryStream())
{
using (CryptoStream CryptoStream = new CryptoStream(MemStream, Encryptor, CryptoStreamMode.Write))
{
CryptoStream.Write(PlainTextBytes, 0, PlainTextBytes.Length);
CryptoStream.FlushFinalBlock();
CipherTextBytes = MemStream.ToArray();
MemStream.Close();
CryptoStream.Close();
}
}
}
SymmetricKey.Clear();
return Convert.ToBase64String(CipherTextBytes);
}
My question is: how is the key for the AES algorithm generated? These 2 lines:
PasswordDeriveBytes DerivedPassword = new PasswordDeriveBytes(Password, SaltValueBytes, HashAlgorithm, PasswordIterations);
byte[] KeyBytes = DerivedPassword.GetBytes(KeySize / 8);
First, it creates a derived key of 256 bytes, and later creates a key by getting pseudo-random bytes from this derived key. It has to be divided by 8 because the AES algorithm needs 128, 192 or 256 bits, not bytes. In this case, since the derived key is 256 bytes, the key for AES will be 256 bits.
But why does it do that? Wouldn't it be better to create the derived key with the needed length, not 256 bytes but 256 bits (256 bytes / 8)? Then there would be no need to create a new key by taking 1/8 of the bytes of the derived key.
Also, the getBytes() method, in its description, says it returns pseudo-random key bytes. So doesn't that mean the AES key would be different each time? How can the AES key be generated again for decryption if it is pseudo-random key bytes?
Thanks.
First, it creates a derived key of 256 bytes
Where? I don't see any 256-byte key being created.
and later creates a key by getting pseudo-random bytes from this derived key. It has to be divided by 8 because the AES algorithm needs 128, 192 or 256 bits, not bytes
Yes, the function input KeySize (which would be keySize under normal C# naming conventions) is in bits, but GetBytes wants an input in bytes. x / 8 is one of the three right answers for that conversion ((x + 7) / 8 is another, and (x & 7) == 0 ? x / 8 : throw new ArgumentException(nameof(x)) is the third).
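Those three conversions could be spelled out like this (an illustrative sketch; the method names are my own):

```csharp
using System;

static class BitsToBytes
{
    // Truncating: a trailing partial byte is silently dropped.
    public static int Truncating(int bits) => bits / 8;

    // Rounding up: a trailing partial byte counts as a whole byte.
    public static int RoundingUp(int bits) => (bits + 7) / 8;

    // Strict: rejects sizes that are not a whole number of bytes.
    public static int Strict(int bits) =>
        (bits & 7) == 0
            ? bits / 8
            : throw new ArgumentException("bit count is not a multiple of 8", nameof(bits));
}
```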
But why does it do that? Wouldn't it be better to create the derived key with the needed length, not 256 bytes but 256 bits (256 bytes / 8)? Then there would be no need to create a new key by taking 1/8 of the bytes of the derived key.
It would be good to do that. But since it is already doing that, there's no "better" to be had.
Also, the getBytes() method, in its description, says it returns pseudo-random key bytes. So doesn't that mean the AES key would be different each time? How can the AES key be generated again for decryption if it is pseudo-random key bytes?
I have to make a pedantic point: There is no getBytes method. C# is a case-sensitive language, and the method name is GetBytes.
pseudorandom: noting or pertaining to random numbers generated by a definite computational process to satisfy a statistical test.
PasswordDeriveBytes is an implementation of PBKDF1 (except it continues beyond the limits of PBKDF1), which is a deterministic algorithm. Given the same inputs (password, seed, iteration count, pseudo-random function (hash algorithm)) the same output is produced. Change any of the inputs slightly, and the output is significantly different.
Rfc2898DeriveBytes (an implementation of PBKDF2) is also a deterministic, but chaotic, algorithm.
So you produce the same answer again in either of them (but not across them) by giving all the same inputs.
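To see the determinism concretely, deriving twice with the same inputs yields identical bytes (a sketch using Rfc2898DeriveBytes; the password, salt, and iteration counts here are made up):

```csharp
using System;
using System.Linq;
using System.Security.Cryptography;

class KdfDeterminism
{
    static void Main()
    {
        byte[] salt = { 1, 2, 3, 4, 5, 6, 7, 8 };

        // Same password, salt, and iteration count in...
        byte[] key1 = new Rfc2898DeriveBytes("password", salt, 1000).GetBytes(32);
        byte[] key2 = new Rfc2898DeriveBytes("password", salt, 1000).GetBytes(32);

        // ...same 32 key bytes out, every time, on every machine.
        Console.WriteLine(key1.SequenceEqual(key2)); // True

        // Change any input slightly and the output is completely different.
        byte[] key3 = new Rfc2898DeriveBytes("password", salt, 1001).GetBytes(32);
        Console.WriteLine(key1.SequenceEqual(key3)); // False
    }
}
```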
When using password-based encryption (PKCS#5) the flow is:

1. Pick a PRF
2. Pick an iteration count
3. Generate a random salt
4. Write down these choices
5. Apply these three things, plus the password, to generate a key
6. Encrypt the data
7. Write down the encrypted data next to those choices

When decrypting one:

1. Read the PRF
2. Read the iteration count
3. Read the salt
4. Apply these three things, plus the password, to generate a key
5. Read the encrypted data
6. Decrypt it
7. Party on
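The "write down / read back" steps above could be sketched like this, assuming a made-up container format of [16-byte salt][4-byte iteration count][ciphertext] (a real format would also record the PRF choice and an IV):

```csharp
using System;
using System.IO;

static class KdfEnvelope
{
    public static byte[] Pack(byte[] salt, int iterations, byte[] ciphertext)
    {
        using (var ms = new MemoryStream())
        using (var w = new BinaryWriter(ms))
        {
            w.Write(salt);        // 16 bytes, generated randomly at encrypt time
            w.Write(iterations);  // 4 bytes, little-endian
            w.Write(ciphertext);  // the rest of the blob
            return ms.ToArray();
        }
    }

    public static (byte[] Salt, int Iterations, byte[] Ciphertext) Unpack(byte[] blob)
    {
        using (var ms = new MemoryStream(blob))
        using (var r = new BinaryReader(ms))
        {
            byte[] salt = r.ReadBytes(16);
            int iterations = r.ReadInt32();
            byte[] ciphertext = r.ReadBytes(blob.Length - 20);
            return (salt, iterations, ciphertext);
        }
    }
}
```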
While this code is doing that part right, the IV and salt should not be ASCII (or UTF-8) strings; they should be "just bytes" (byte[]). If they need to be transported as strings, they should be base64-encoded, or use some other "arbitrary" binary-to-text encoding.
Here is a function for computing SHA-256:
static string GetHash(string input)
{
    byte[] bytes = Encoding.UTF8.GetBytes(input);  //1
    SHA256 sha256 = SHA256Managed.Create();
    byte[] hashBytes = sha256.ComputeHash(bytes);  //2
    var output = BitConverter.ToString(hashBytes); //3
    return output;
}
It gets the UTF-8 bytes from a C# string, then computes the hash and returns it as a string.
I'm confused about BitConverter: does its ToString(byte[]) method depend on machine architecture (little/big endian)? My goal is to guarantee a specific endianness (big-endian) for the output string.
How can I do it?
I think it can be like:
byte[] bytes = Encoding.UTF8.GetBytes(input); //1
if (BitConverter.IsLittleEndian)
{
    Array.Reverse(bytes);
}
//..
But I don't know how UTF8.GetBytes works (the UTF8.GetBytes docs don't say anything about endianness). Does it depend on endianness too? If so, I suppose the right approach is to reverse the array after step 1, isn't it?
I think it doesn't matter here, because UTF-8 is byte-oriented (the encoded bytes come out in a defined order on every machine), and BitConverter.ToString(byte[]) simply formats each byte as hex in array order, so neither step depends on endianness. As stated here:
Isn’t on big endian machines UTF-8's byte order different than on little endian machines? So why then doesn’t UTF-8 require a BOM?
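A quick way to convince yourself about the hex formatting (the byte values here are arbitrary):

```csharp
using System;

class HexDemo
{
    static void Main()
    {
        // BitConverter.ToString walks the array in index order and hex-formats
        // each byte individually, so the output never depends on machine endianness.
        byte[] bytes = { 0xDE, 0xAD, 0xBE, 0xEF };
        Console.WriteLine(BitConverter.ToString(bytes)); // DE-AD-BE-EF on any machine
    }
}
```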
I have this bit of T-SQL code:
set @UrlHash = convert(bigint, hashbytes('MD5', @Url))
I wonder if I can write a function in C# which returns the exact same hash as the line above, without going to SQL.
Is it possible?
The requirement is that the C# code MUST create the exact same hash.
The select
SELECT CONVERT(BIGINT, HASHBYTES('MD5', 'http://stackoverflow.com'))
will yield the following result:
-3354682182756996262
If you now try to create a MD5 hash in C#
MD5 md5 = MD5.Create();
byte[] textToHash = Encoding.UTF8.GetBytes("http://stackoverflow.com");
byte[] result = md5.ComputeHash(textToHash);
long numeric = BitConverter.ToInt64(result, 0);
numeric will be 8957512937738269783.
So what's the issue (besides the fact that a MD5 hash is 128-bit and BIGINT/long is just 64-bit)?
It's an endian issue (the bytes are in the wrong order). Let's fix it using the BitConverter class and reverse the bytes as needed:
MD5 md5 = MD5.Create();
byte[] textToHash = Encoding.UTF8.GetBytes("http://stackoverflow.com");
byte[] result = md5.ComputeHash(textToHash);
if (BitConverter.IsLittleEndian)
Array.Reverse(result);
long numeric = BitConverter.ToInt64(result, 0);
numeric is now -3354682182756996262 as you want.
You should use the MD5 class; here is the example from http://blogs.msdn.com/b/csharpfaq/archive/2006/10/09/how-do-i-calculate-a-md5-hash-from-a-string_3f00_.aspx, with the output as an Int64:

public long CalculateMD5Hash(string input)
{
    // step 1, calculate MD5 hash from input
    MD5 md5 = System.Security.Cryptography.MD5.Create();
    byte[] inputBytes = System.Text.Encoding.ASCII.GetBytes(input);
    byte[] hash = md5.ComputeHash(inputBytes);
    return BitConverter.ToInt64(hash, 0);
}
Isn't an MD5 hash standard? Can't you use a standard MD5 C# implementation? What about using the code in here?
I know very little about encryption, but my goal is essentially to decrypt strings. I have been given the AES (128-bit) key.
However, I must retrieve the IV from the encrypted string; it is the first 16 bytes.
Here's the Salesforce doc for more information (in case what I explained was incorrect):
Encrypts the blob clearText using the specified algorithm and private
key. Use this method when you want Salesforce to generate the
initialization vector for you. It is stored as the first 128 bits (16
bytes) of the encrypted blob
http://www.salesforce.com/us/developer/docs/apexcode/Content/apex_classes_restful_crypto.htm (encryptWithManagedIV)
For Retrieving the IV I've tried something like this (I don't believe it's right though):
public string retrieveIv()
{
    string iv = "";
    string input = "bwZ6nKpBEsuAKM8lDTYH1Yl69KkHN1i3XehALbfgUqY=";
    byte[] bytesToEncode = Encoding.UTF8.GetBytes(input);
    for (int i = 0; i <= 15; i++)
    {
        iv += bytesToEncode[i].ToString();
    }
    return iv;
}
(Just ignore the fact that the input is hardcoded rather than parameterized; it's easier for testing purposes.)
Then I use the accepted answer from this question to decrypt the string.
The IV shouldn't be expressed as a string - it should be as a byte array, as per the AesManaged.IV property.
Also, using Encoding.UTF8 is almost certainly wrong. I suspect you want:
public static byte[] RetrieveIv(string encryptedBase64)
{
// We don't need to base64-decode everything... just 16 bytes-worth
encryptedBase64 = encryptedBase64.Substring(0, 24);
// This will be 18 bytes long (4 characters per 3 bytes)
byte[] encryptedBinary = Convert.FromBase64String(encryptedBase64);
byte[] iv = new byte[16];
Array.Copy(encryptedBinary, 0, iv, 0, 16);
return iv;
}
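Putting it together, the bytes after the 16-byte IV are the actual ciphertext, so the rest of the decryption might look roughly like this (a sketch under my own assumptions: CBC mode and PKCS7 padding, which is what Salesforce's managed-IV encryption is documented to use):

```csharp
using System;
using System.Security.Cryptography;

static class SalesforceDecrypt
{
    public static byte[] Decrypt(string encryptedBase64, byte[] key)
    {
        byte[] all = Convert.FromBase64String(encryptedBase64);

        // First 16 bytes: the IV Salesforce generated.
        byte[] iv = new byte[16];
        Array.Copy(all, 0, iv, 0, 16);

        // Remaining bytes: the ciphertext itself.
        byte[] cipherText = new byte[all.Length - 16];
        Array.Copy(all, 16, cipherText, 0, cipherText.Length);

        using (var aes = new AesManaged { Key = key, IV = iv, Mode = CipherMode.CBC, Padding = PaddingMode.PKCS7 })
        using (var decryptor = aes.CreateDecryptor())
        {
            return decryptor.TransformFinalBlock(cipherText, 0, cipherText.Length);
        }
    }
}
```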