What does the "verify" function in RSAKey (AS3Crypto) do? - c#

I really don't understand what the verify function does in this class:
http://code.google.com/p/as3crypto/source/browse/trunk/as3crypto/src/com/hurlant/crypto/rsa/RSAKey.as
In particular, what does it put into the 'dst' variable? I have a "verify key" which uses this method on encrypted data, and I get the public key in the dst variable...
Here is a small diagram to make it clearer: http://i.imgur.com/R8DqT.png
Thanks
PS: I have to do the same in .NET, so if you know of an equivalent, please let me know.

The function verify (included for reference below):
public function verify(src:ByteArray, dst:ByteArray,
                       length:uint, pad:Function = null):void {
    _decrypt(doPublic, src, dst, length, pad, 0x01);
}
Looking at the link you supplied, the function is used to verify RSA signed data - the result is copied to the dst ByteArray.
Breakdown:
doPublic = function parameter, a wrapper around BigInteger.modPowInt()
src = byte array with signed data
dst = byte array which will hold the verification result
length = length of the data in src byte array
pad = function parameter, a wrapper for pkcs1pad (_encrypt) and pkcs1unpad (_decrypt)
0x01 = padType - an integer value specifying if a fixed value (0xff) is used in padding (0x01) or a pseudo-random one (0x02) - (only actually used in pkcs1pad which is called from _encrypt)
In the RSA scheme, signed data is verified by decrypting the signature using the public key.
Update: Unless you have very specific needs which are not covered, I see no reason to port the ActionScript 3 you posted. Use the C# RSACryptoServiceProvider which is included in the framework. Take special note of the section Interoperation with the Microsoft Cryptographic API (CAPI) in the MSDN description.
Addressing your comments about needing the content of the dst byte array as in the AS3Crypto implementation: you could create a wrapper that decrypts the signed data with the public key. Have a look at the RSACryptoServiceProvider.ImportParameters() method, which you use to import the public key information. Since you haven't provided details on how the public key is retrieved, I can't be more specific. This implementation example should help with parsing key files into the appropriate RSAParameters to feed to ImportParameters.
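A minimal sketch of the RSACryptoServiceProvider route (a throwaway key pair stands in for your real key, which you would load via ImportParameters; the verifier side only needs the public half):

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

byte[] data = Encoding.UTF8.GetBytes("message to sign");

// Throwaway key pair for illustration; in practice import your real
// public key with rsa.ImportParameters(yourRsaParameters).
using var signingRsa = new RSACryptoServiceProvider(2048);
byte[] signature = signingRsa.SignData(data, "SHA256");

// The verifier only needs the public half of the key.
using var verifyingRsa = new RSACryptoServiceProvider();
verifyingRsa.ImportParameters(signingRsa.ExportParameters(false));
bool ok = verifyingRsa.VerifyData(data, "SHA256", signature);
Console.WriteLine(ok);  // True
```

This is the built-in equivalent of the AS3Crypto verify step: VerifyData hashes the plaintext, "decrypts" the signature with the public key, and compares the two.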

Related

Signing SHAKE256 hash with RSA signature with further validation

I need to hash input data with SHAKE256 (I found the SeriesOne.CORE package for this) and then generate a digital signature for it based on the RSA algorithm.
The problem is that the default RSACryptoServiceProvider does not support any of the SHA-3 hashing functions. Methods like RSACryptoServiceProvider.SignHash(), RSACryptoServiceProvider.VerifyData() and RSACryptoServiceProvider.VerifyHash() require a hash algorithm specification. Is there any workaround, or am I missing something?
.NET currently does not natively support SHA-3 (and thus not SHAKE256, which belongs to the SHA-3 family, either), see e.g. here.
Therefore, the native C# methods for signing/verifying do not work either. Even SignHash() throws a runtime exception when the already-hashed data is passed, the digest is specified via its OID, and RSASSA-PKCS1-v1_5 is used, i.e. when no explicit hashing would be required at all: The specified OID (2.16.840.1.101.3.4.2.12) does not represent a known hash algorithm.
Ultimately, therefore, not only the hashing but the entire signing process must go through a third-party provider. One possibility is BouncyCastle (as suggested in Maarten Bodewes' comment). For .NET 6, BouncyCastle.NetCore can be used.
For RSASSA-PKCS1-v1_5 the RsaDigestSigner class is to be used with the following in mind:
The internal mapping of the RsaDigestSigner class does not take the SHAKE digests into account, so the RsaDigestSigner(IDigest digest) ctor will not work and the RsaDigestSigner(IDigest digest, DerObjectIdentifier digestOid) ctor must be used, i.e. the OID (2.16.840.1.101.3.4.2.12) must be explicitly specified. This can be done either with new DerObjectIdentifier("2.16.840.1.101.3.4.2.12") or with NistObjectIdentifiers.IdShake256. The DER encoding of the DigestInfo value is thus 3031300D060960864801650304020C05000420 (which differs from the SHA-256 value only in the 15th byte, 0x0c instead of 0x01, see RFC 8017).
The SHAKE digests have a variable output length. In the implementation below, a fixed output length of 32 bytes is applied for SHAKE256.
A possible implementation is:
using Org.BouncyCastle.Asn1.Nist;
using Org.BouncyCastle.Crypto.Digests;
using Org.BouncyCastle.Crypto.Signers;
...
RsaDigestSigner signer = new RsaDigestSigner(new ShakeDigest(256), NistObjectIdentifiers.IdShake256);
signer.Init(true, privateKeyParameter);
signer.BlockUpdate(dataToSign, 0, dataToSign.Length);
byte[] signature = signer.GenerateSignature();
...
For RSASSA-PSS the PssSigner class is to be used. The following implementation applies SHAKE256 for PSS and MGF1 digest, and as salt length 32 bytes (but there are also constructors to set the parameters explicitly):
using Org.BouncyCastle.Crypto.Engines;
using Org.BouncyCastle.Crypto.Digests;
using Org.BouncyCastle.Crypto.Signers;
...
PssSigner pssSigner = new PssSigner(new RsaEngine(), new ShakeDigest(256));
//PssSigner pssSigner = new PssSigner(new RsaEngine(), new ShakeDigest(256), new ShakeDigest(256), 32); // works also
pssSigner.Init(true, privateKeyParameter);
pssSigner.BlockUpdate(dataToSign, 0, dataToSign.Length);
byte[] pssSignature = pssSigner.GenerateSignature();
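For completeness, a sketch of the matching verification side (same RsaDigestSigner construction, but Init with false and the public key; a throwaway key pair is generated in place here so the example is self-contained):

```csharp
using System;
using System.Text;
using Org.BouncyCastle.Asn1.Nist;
using Org.BouncyCastle.Crypto;
using Org.BouncyCastle.Crypto.Digests;
using Org.BouncyCastle.Crypto.Generators;
using Org.BouncyCastle.Crypto.Parameters;
using Org.BouncyCastle.Crypto.Signers;
using Org.BouncyCastle.Math;
using Org.BouncyCastle.Security;

// Throwaway 2048-bit key pair for illustration only.
var gen = new RsaKeyPairGenerator();
gen.Init(new RsaKeyGenerationParameters(
    BigInteger.ValueOf(65537), new SecureRandom(), 2048, 100));
AsymmetricCipherKeyPair pair = gen.GenerateKeyPair();

byte[] dataToSign = Encoding.UTF8.GetBytes("data to sign");

// Sign with the private key (as in the snippets above).
var signer = new RsaDigestSigner(new ShakeDigest(256), NistObjectIdentifiers.IdShake256);
signer.Init(true, pair.Private);
signer.BlockUpdate(dataToSign, 0, dataToSign.Length);
byte[] signature = signer.GenerateSignature();

// Verify with the public key: Init(false, ...) switches to verify mode.
var verifier = new RsaDigestSigner(new ShakeDigest(256), NistObjectIdentifiers.IdShake256);
verifier.Init(false, pair.Public);
verifier.BlockUpdate(dataToSign, 0, dataToSign.Length);
bool valid = verifier.VerifySignature(signature);
Console.WriteLine(valid);  // True
```

The PssSigner verification works the same way: construct it with the same digest parameters, Init(false, publicKey), then VerifySignature().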

What does CreateSignature(HashAlgorithm) achieve?

I am working with some old code and I don't understand what has been done when creating a signed hash. The authors used this implementation:
AsymmetricSignatureFormatter.CreateSignature(HashAlgorithm)
All the examples I can find, and all the Microsoft documentation, use the other implementation:
AsymmetricSignatureFormatter.CreateSignature(byte[] hashedDataValue)
I understand that the second approach signs a hash of some user data. At the receiver, we can re-hash the received plaintext and compare it with the signed version to confirm it hasn't been changed.
But what is the first approach trying to do? No signed data seems to be sent, only a signed version of 'the algorithm' - but what actually gets signed? Does it sign the HashAlgorithm.Hash byte array? But if so, there is no plaintext byte array at the receiver to re-hash and check the sent hash against.
I suspect I have some fundamental misunderstanding of the purpose of this implementation.
It turns out that to use this form:
AsymmetricSignatureFormatter.CreateSignature(HashAlgorithm)
you need to have done some pre-work with the HashAlgorithm object. The HashAlgorithm object internally stores the hash of the last thing it 'hashed'.
Hence, if alg is the HashAlgorithm and userData is a byte array then
alg.ComputeHash(userData)
will store the hash within the alg object. Now we can sign the hash of userData using this form of the method:
AsymmetricSignatureFormatter.CreateSignature(HashAlgorithm)
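A small sketch of that flow (hypothetical data; the byte[] overload is used for the actual signing call to make the cached-hash behaviour visible - the HashAlgorithm overload signs the same sha1.Hash value internally):

```csharp
using System;
using System.Linq;
using System.Security.Cryptography;
using System.Text;

byte[] userData = Encoding.UTF8.GetBytes("some user data");

using var rsa = new RSACryptoServiceProvider(2048);
using var sha1 = SHA1.Create();

// ComputeHash returns the digest AND caches it in the Hash property --
// this cached value is what CreateSignature(HashAlgorithm) picks up.
byte[] digest = sha1.ComputeHash(userData);
bool cached = digest.SequenceEqual(sha1.Hash);

var formatter = new RSAPKCS1SignatureFormatter(rsa);
formatter.SetHashAlgorithm("SHA1");
byte[] signature = formatter.CreateSignature(sha1.Hash);

// Receiver side: re-hash the plaintext and check it against the signature.
var deformatter = new RSAPKCS1SignatureDeformatter(rsa);
deformatter.SetHashAlgorithm("SHA1");
bool ok = deformatter.VerifySignature(sha1.ComputeHash(userData), signature);
Console.WriteLine($"{cached} {ok}");  // True True
```

So the receiver's job is unchanged: there is always a plaintext to re-hash; the HashAlgorithm overload is just a convenience on the sender's side.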

What's the standard code to generate HMAC-SHA256 with a key using C#

I'd like to know if there is a standard code to generate a SHA256 hash using a key. I've come across several types of code, however, they don't generate the same output.
Code found at JokeCamp
private string CreateToken(string message, string secret)
{
    secret = secret ?? "";
    var encoding = new System.Text.ASCIIEncoding();
    byte[] keyByte = encoding.GetBytes(secret);
    byte[] messageBytes = encoding.GetBytes(message);
    using (var hmacsha256 = new HMACSHA256(keyByte))
    {
        byte[] hashmessage = hmacsha256.ComputeHash(messageBytes);
        return Convert.ToBase64String(hashmessage);
    }
}
Here's another one that I found
private static string ComputeHash(string apiKey, string message)
{
    var key = Encoding.UTF8.GetBytes(apiKey);
    string hashString;
    using (var hmac = new HMACSHA256(key))
    {
        var hash = hmac.ComputeHash(Encoding.UTF8.GetBytes(message));
        hashString = Convert.ToBase64String(hash);
    }
    return hashString;
}
The code generated by both of these are different to what is generated by http://www.freeformatter.com/hmac-generator.html#ad-output
I'll be using SHA256 for one of our external APIs, where consumers would hash the data and send it to us. So we just want to make sure we use a standard approach so that they send us the correct hash. Also, I would like to know if there are any well-known NuGet packages for this. I've also tried to find a solution with Bouncy Castle, but I couldn't find one that uses a key to hash.
The difference is because of the character encodings (ASCII vs UTF-8 in your examples). Note that the hashing algorithm takes an array of bytes, and you do the conversion from a string to that byte-array beforehand.
Your question "what's the standard code" probably doesn't have an answer. I'd say that if you expect the input to contain content from just the ASCII character space, go for that; if not, go for UTF-8. Either way - communicate it to your users.
If you want to look at it from a usability perspective and make it optimal for your users - go for both. Hash the content both ways and check against the user's incoming hash. But it all depends on your evaluation of clock cycles vs security vs usability (you can have two).
They are almost equivalent.
The difference is how the encoding for the string is established.
In the first portion of code it assumes ASCII, whereas in the second portion it assumes UTF-8. It is possible that the string used another encoding which is none of those.
But regardless of that, the idea is to understand what is the goal of this operation. The truly relevant things in this context are:
Given equal input, output should be the same
There should be no way to retrieve the plaintext only by knowing the output (within a reasonable amount of time)
After hashing, you no longer require the original input in plaintext.
A secure cryptographic hashing function (meaning not older functions like MD5) achieves that.
Then, if your data store where hashes are stored is compromised, the attacker would only have a hash, which cannot be used to retrieve the original plaintext. This is why hashing is used rather than encryption; encryption is a reversible operation (through decryption).
Then, within the system, if you've made the decision to use one encoding, you need to keep that decision consistent across all components in your system so they can interoperate.
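The encoding point is easy to demonstrate: for pure-ASCII input, ASCII and UTF-8 produce the same bytes and therefore the same MAC (matching the widely published HMAC-SHA256 test vector for key "key"), while any non-ASCII character makes them diverge. A small sketch:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

static string HmacHex(Encoding enc, string key, string message)
{
    using var hmac = new HMACSHA256(enc.GetBytes(key));
    return BitConverter.ToString(hmac.ComputeHash(enc.GetBytes(message)))
                       .Replace("-", "").ToLowerInvariant();
}

// Pure ASCII: both encodings yield the same bytes, hence the same MAC.
string asciiMac = HmacHex(Encoding.ASCII, "key", "The quick brown fox jumps over the lazy dog");
string utf8Mac  = HmacHex(Encoding.UTF8,  "key", "The quick brown fox jumps over the lazy dog");
Console.WriteLine(asciiMac == utf8Mac);  // True
// Known vector: f7bc83f430538424b13298e6aa6fb143ef4d59a14946175997479dbc2d1a3cd8

// Non-ASCII: Encoding.ASCII mangles 'é' into '?', so the MACs differ.
Console.WriteLine(HmacHex(Encoding.ASCII, "key", "café") ==
                  HmacHex(Encoding.UTF8,  "key", "café"));  // False
```

Comparing your output against a known vector like this is also a quick way to check that you and your API consumers agree on the encoding.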

Translate C# code to Ruby Code

I am trying to connect my system to a bank's payment system. The problem is, their documentation was mostly incorrect, when it wasn't a complete disaster.
In the documentation of the 3-D Secure system, the bank asks me to fill out an HTML form and submit it to their system. The form should include some data AND a SHA1 hash of that data alongside it. I tried many times, but the bank's system returned a "Hash not correct" error every time.
After some inspection of their example C# code, I found the function they use to produce the hash. The problem is that the function does some other things to the data rather than just hashing it. And the bigger problem is that I cannot find out what this piece of code is doing to the string being hashed.
public static string CreateHash(string notHashedStr)
{
    SHA1 sha1 = new SHA1CryptoServiceProvider();
    byte[] notHashedBytes = System.Text.Encoding.ASCII.GetBytes(notHashedStr);
    byte[] hashedByte = sha1.ComputeHash(notHashedBytes);
    string hashedStr = System.Convert.ToBase64String(hashedByte);
    return hashedStr;
}
I have nearly no experience with the .NET framework, and I am on a Mac, so I cannot test the code easily, and MSDN is definitely not for me (I am a Ruby developer most of the time, and I know enough C). If anyone can explain what these functions do to the string to be hashed, I'll be very glad.
It's very simple.
Get the ASCII-encoded bytes from notHashedStr.
Create a SHA1 hash from those bytes.
Convert that hash into a Base64-encoded string.
Return that Base64-SHA1-ASCII string.
I never did any Ruby, but it must look a bit like this:
require 'digest/sha1'
return_value = Digest::SHA1.base64digest(not_hashed_str)
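As a sanity check when porting, run a known vector through both sides: SHA-1 of "abc" is a9993e364706816aba3e25717850c26c9cd0d89d, which Base64-encodes to qZk+NkcGgWq6PiVxeFDCbJzQ2J0=. The C# side, as a self-contained sketch:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Same pipeline as the bank's CreateHash: ASCII bytes -> SHA-1 -> Base64.
static string CreateHash(string notHashedStr)
{
    using var sha1 = SHA1.Create();
    byte[] hashedBytes = sha1.ComputeHash(Encoding.ASCII.GetBytes(notHashedStr));
    return Convert.ToBase64String(hashedBytes);
}

Console.WriteLine(CreateHash("abc"));  // qZk+NkcGgWq6PiVxeFDCbJzQ2J0=
```

Digest::SHA1.base64digest('abc') in Ruby should print the same string; if the bank still rejects your hash after that matches, the mismatch is in how the input string is assembled, not in the hashing itself.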

How to import DSA signature in ASN.1 format using BouncyCastle (C#)

OpenSSL, as well as most other DSA implementations, outputs signatures in ASN.1 format. Thus, the 40-byte signature (two 20-byte integers) becomes 46 bytes due to the ASN.1 structure headers. (See this forum post for details.)
My question is, how does one handle this format in C#? (or elsewhere, for that matter)
I spent a while trying to deal with it using the .NET System.Security.Cryptography classes, but gave up on that (really frustrating, because it clearly has internal code to parse ASN.1 since it can read DER format, but there's no way for you to use it - but I digress...)
Then, I started working with the BouncyCastle C# library. I can get it into an Asn1Object, and if I expand it while debugging I see that it contains a DerSequence with the two integers, but how do I pull them out (preferably into BigIntegers so I can feed them to DSA.VerifySignature?)
Code sample:
Byte[] msgText = ReadFile("test_msg.txt");
Byte[] msgSigRaw = ReadFile("test_sig_1.bin"); // reads binary ASN.1 sig using FileStream
Asn1Object sigASN = Asn1Object.FromByteArray(msgSigRaw); // parses into Asn1Object
...
X509Certificate implCert = ReadCertificate("pubcert_dsa.cer"); // cert in DER format
DsaSigner DSA = new DsaSigner();
DSA.Init(false, implCert.GetPublicKey());
...
BigInteger sigIntR, sigIntS;
... //TODO: how to get signature from sigASN into sigIntR, sigIntS?
Boolean validSig = DSA.VerifySignature(msgText, sigIntR, sigIntS); // my goal
Take a look at this CodeProject article: http://www.codeproject.com/KB/security/CryptoInteropSign.aspx
It contains code to convert the DSA signature into the P1363 format expected in C#.
Some example code of how to verify a DSA signature in BouncyCastle C#:
ISigner sig = SignerUtilities.GetSigner("SHA1withDSA");
sig.Init(false, implCert.GetPublicKey());
sig.BlockUpdate(msgText, 0, msgText.Length);
bool valid = sig.VerifySignature(msgSigRaw);
Note that this signer will deal with the ASN.1 and the calculation of the message digest (I've assumed SHA-1 was used here) for you.
If you still really want to know how the conversions of the {r,s} values to/from ASN.1 happen, then have a look in the source for DsaDigestSigner. Internally it does the appropriate ASN.1 encoding/decoding and then uses DsaSigner class for the low-level sig operation.
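If you'd rather extract {r, s} yourself, the SEQUENCE-of-two-INTEGERs layout is simple enough to unpack by hand. A minimal sketch (handles short-form lengths only, which covers standard DSA signatures; with BouncyCastle, Asn1Sequence.GetInstance plus DerInteger gives you the same two values):

```csharp
using System;
using System.Numerics;

// Parse a DER SEQUENCE of two INTEGERs -- the DSA {r, s} signature layout.
// Minimal parser: assumes short-form lengths (< 128 bytes), true for DSA.
static (BigInteger r, BigInteger s) ParseDsaSignature(byte[] der)
{
    if (der[0] != 0x30) throw new FormatException("expected SEQUENCE");
    int pos = 2;  // skip SEQUENCE tag + length byte

    BigInteger ReadInteger()
    {
        if (der[pos++] != 0x02) throw new FormatException("expected INTEGER");
        int len = der[pos++];
        byte[] big = new byte[len];
        Array.Copy(der, pos, big, 0, len);
        pos += len;
        // DER integers are big-endian; System.Numerics.BigInteger(byte[])
        // expects little-endian, so reverse before constructing.
        Array.Reverse(big);
        return new BigInteger(big);
    }

    return (ReadInteger(), ReadInteger());
}

// Tiny worked example: SEQUENCE { INTEGER 1, INTEGER 255 }.
// 255 carries a leading 0x00 pad byte to keep the INTEGER positive.
byte[] sample = { 0x30, 0x07, 0x02, 0x01, 0x01, 0x02, 0x02, 0x00, 0xFF };
var (r, s) = ParseDsaSignature(sample);
Console.WriteLine($"{r} {s}");  // 1 255
```

The two BigIntegers can then be fed to DsaSigner.VerifySignature (after converting to BouncyCastle's own BigInteger type), though as noted above, letting SignerUtilities handle the whole thing is simpler.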
