I am creating an open source project which uses the YouTube PubSubHubbub API to get notifications on video uploads. I want to verify that the request really comes from YouTube and not from a third party by checking the HMAC-SHA1 signature as described in the spec.
Later on I will generate a secret every time I run my program; for now, to debug the problem, I use "test" as my secret string.
I use the following method to check whether the provided signature is valid:
public static bool Check(string body, string signature)
{
    using (var hmac = new HMACSHA1(Encoding.UTF8.GetBytes(Secret)))
    {
        var hashBytes = hmac.ComputeHash(Encoding.UTF8.GetBytes(body));
        var hash = Encoding.UTF8.GetString(hashBytes);
        Console.WriteLine("Computed Hash " + hash);
        return signature.Equals(hash);
    }
}
Where body is the request body and signature is the value provided in the X-Hub-Signature request header.
According to https://pubsubhubbub.github.io/PubSubHubbub/pubsubhubbub-core-0.4.html#rfc.section.7
If the subscriber supplied a value for hub.secret in their subscription request, the hub MUST generate an HMAC signature of the payload and include that signature in the request headers of the content distribution request. The X-Hub-Signature header's value MUST be in the form sha1=signature where signature is a 40-byte, hexadecimal representation of a SHA1 signature [RFC3174]. The signature MUST be computed using the HMAC algorithm [RFC2104] with the request body as the data and the hub.secret as the key.
I supply my Secret as hub.secret in my subscription request.
So if I understand it correctly, the hub SHOULD use that secret to generate an HMAC-SHA1 of the payload, i.e. the body.
I want to regenerate the HMAC and should get the same value, right?
It does not work. Also, the computed hash value logged by Console.WriteLine is something completely different, not even printable characters, so I guess it might be a problem with the encoding, but I can't figure it out. Thanks for all the help!
The documentation says "where signature is a 40-byte, hexadecimal representation", so instead of decoding hashBytes as a UTF-8 string you should convert it to a hexadecimal string.
public static bool Check(string body, string signature)
{
    using (var hmac = new HMACSHA1(Encoding.UTF8.GetBytes(Secret)))
    {
        var hashBytes = hmac.ComputeHash(Encoding.UTF8.GetBytes(body));
        var hash = Convert.ToHexString(hashBytes).ToLowerInvariant();
        Console.WriteLine("Computed Hash " + hash);
        return signature.Equals(hash);
    }
}
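One more thing to watch for: the spec quoted above says the X-Hub-Signature header value has the form sha1=signature, so the "sha1=" prefix has to be stripped before comparing. A minimal sketch of that, reusing the Secret field from above and adding a constant-time comparison (the method name and prefix handling here are just an illustration, not part of the original answer):
public static bool CheckHeader(string body, string headerValue)
{
    // headerValue is assumed to be the raw X-Hub-Signature value, e.g. "sha1=0123...".
    const string prefix = "sha1=";
    if (headerValue == null || !headerValue.StartsWith(prefix, StringComparison.OrdinalIgnoreCase))
        return false;

    using (var hmac = new HMACSHA1(Encoding.UTF8.GetBytes(Secret)))
    {
        var expected = hmac.ComputeHash(Encoding.UTF8.GetBytes(body));
        // Convert.FromHexString throws FormatException on malformed hex
        // (needs .NET 5+, same as Convert.ToHexString above).
        var provided = Convert.FromHexString(headerValue.Substring(prefix.Length));
        // Compare in constant time to avoid leaking timing information.
        return System.Security.Cryptography.CryptographicOperations.FixedTimeEquals(expected, provided);
    }
}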
Related
I need to hash a string according to the SHA-1 algorithm with the user's secret key, something like: hash = hash(string, secret_key).
I use this code:
byte[] data = sha1.ComputeHash(Encoding.UTF8.GetBytes(input));
But I cannot find out how to use the secret key with this algorithm.
The hash is the result of applying a hash function to data. It maps arbitrary data to a fixed-size value. This does not require any kind of secret.
The cryptographic process which uses hashes and secret keys is signing data. You may want to create a hash of your data and either sign it using public/private key cryptography (with the private key) or compute a MAC (message authentication code) from it using a shared secret.
A hash only tells you whether the data was modified or not.
A MAC uses a secret of some kind, so it also protects against deliberate manipulation. Generally, MACs only work when the sender and receiver know each other.
You could do something like:
public string SignData(string message, string secret)
{
    var encoding = new System.Text.UTF8Encoding();
    var keyBytes = encoding.GetBytes(secret);
    var messageBytes = encoding.GetBytes(message);
    using (var hmacsha1 = new HMACSHA1(keyBytes))
    {
        var hashMessage = hmacsha1.ComputeHash(messageBytes);
        return Convert.ToBase64String(hashMessage);
    }
}
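A quick usage sketch (the message and key here are placeholders, not from the question):
// Both sides must use the same secret to end up with the same MAC.
string mac = SignData("my message", "secret_key");
Console.WriteLine(mac); // a 28-character Base64 string (20-byte HMAC-SHA1)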
Is there any example in C# showing how to pre-sign all objects using a starts-with policy with an AWS v4 signature, so customers can download objects from their respective folder structure instead of signing each document separately?
The documentation says:
https://s3.amazonaws.com/examplebucket/test.txt
?X-Amz-Algorithm=AWS4-HMAC-SHA256
&X-Amz-Credential=<your-access-key-id>/20130721/us-east-1/s3/aws4_request
&X-Amz-Date=20130721T201207Z
&X-Amz-Expires=86400
&X-Amz-SignedHeaders=host
&X-Amz-Signature=<signature-value>
But my signature is not working for GET (download) requests, while it works correctly for uploads.
void Main()
{
    string bucket = "bucket-name-here";
    string s3Key = "s3-key-here";
    string s3Secret = "secret-here";
    string s3Region = "us-east-1";
    string Date = DateTime.UtcNow.ToString("yyyyMMdd");
    string xAmzDate = DateTime.UtcNow.ToString("yyyyMMdd") + "T000000Z";
    string expiration = DateTime.UtcNow.AddDays(1).ToString("yyyy-MM-ddTHH:mm:ssK");
    string policyString = $@"{{""expiration"":""{expiration}"",""conditions"":[{{""bucket"":""{bucket}""}},{{""acl"":""private""}},[""starts-with"",""$key"",""Client_1""],[""starts-with"",""$Content-Type"",""""],[""starts-with"",""$filename"",""""],{{""x-amz-date"":""{xAmzDate}""}},{{""x-amz-credential"":""{s3Key}/{Date}/us-east-1/s3/aws4_request""}},{{""x-amz-algorithm"":""AWS4-HMAC-SHA256""}}]}}";
    var policyStringBytes = Encoding.UTF8.GetBytes(policyString);
    var policy = Convert.ToBase64String(policyStringBytes);
    //policy.Dump();
    byte[] signingKey = GetSigningKey(s3Secret, Date, s3Region, "s3");
    byte[] signature = HmacSHA256(policy, signingKey);
    var sign = ToHexString(signature);
    sign.Dump();
}
static byte[] HmacSHA256(String data, byte[] key)
{
    String algorithm = "HmacSHA256";
    KeyedHashAlgorithm kha = KeyedHashAlgorithm.Create(algorithm);
    kha.Key = key;
    return kha.ComputeHash(Encoding.UTF8.GetBytes(data));
}
private byte[] GetSigningKey(String key, String dateStamp, String regionName, String serviceName)
{
    byte[] kSecret = Encoding.UTF8.GetBytes(("AWS4" + key).ToCharArray());
    byte[] kDate = HmacSHA256(dateStamp, kSecret);
    byte[] kRegion = HmacSHA256(regionName, kDate);
    byte[] kService = HmacSHA256(serviceName, kRegion);
    byte[] kSigning = HmacSHA256("aws4_request", kService);
    return kSigning;
}
public static string ToHexString(byte[] data)
{
    StringBuilder sb = new StringBuilder();
    for (int i = 0; i < data.Length; i++)
    {
        sb.Append(data[i].ToString("x2", CultureInfo.InvariantCulture));
    }
    return sb.ToString();
}
More about the problem: We have thousands of documents for hundreds of clients on S3, in their respective folder structures as shown below. Right now, every time a client wants to download an object, it gets signed by our API to create the downloadable link, so each document is signed separately.
Client 1
Client_1/Document1.xyz
Client_1/Document2.xyz
Client 2
Client_2/Document1.xyz
Client_2/Document2.xyz
The signing algorithm for S3 HTML form POST uploads allows you to sign a policy document with constraints like ["starts-with","$key",...] but pre-signed URLs for S3 don't support this. With a pre-signed URL, you sign not a policy document but a "canonical request," which is a canonicalized representation of the browser's exact request. So there is no support for wildcards or prefixes.
There are two alternatives that come to mind.
CloudFront signed URLs and signed cookies do support a policy document when you use a "custom policy" (not a "canned policy," which is more like what S3 supports), and a custom policy allows a * in the URL, similar to ["starts-with","$key",...] but using the URL the browser will be requesting. You only have to do the signing once, and the code running in the browser can reuse that policy and signature. On the back side of CloudFront, a CloudFront Origin Access Identity is used to sign the requests as they are actually sent to the bucket, after the request has been authenticated on the front side of CloudFront using the CloudFront signed URL or signed cookies. (With signed cookies, the browser just makes the request and automatically submits the cookies, so that works the same way but with no browser manipulation of the URLs.)
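For illustration, a custom policy covering everything under one client's prefix would look roughly like this; the distribution domain and expiry epoch time below are placeholders, and this follows the general CloudFront custom-policy shape rather than anything taken from the question:
// Rough sketch of a CloudFront custom policy for a whole prefix.
// The domain d111111abcdef8.cloudfront.net and the epoch expiry are placeholders.
string customPolicy = @"{
  ""Statement"": [{
    ""Resource"": ""https://d111111abcdef8.cloudfront.net/Client_1/*"",
    ""Condition"": { ""DateLessThan"": { ""AWS:EpochTime"": 1767225600 } }
  }]
}";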
Alternatively, the AssumeRole action in Security Token Service could be called by your server to generate a set of temporary credentials for the client to use to sign its own individual URLs.
When calling AssumeRole, you can also pass an optional session policy document. If you do, then the generated temporary credentials can only perform actions allowed by both the role policy ("allow read from the bucket") and the session policy ("allow read from the bucket for keys beginning with a specific prefix"). So the role credentials obtained would only allow the user to access their own objects, as sketched below.
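A rough sketch of that approach with the AWS SDK for .NET; the role ARN, session name, and the session policy contents are placeholders/assumptions rather than values from the question:
using System.Threading.Tasks;
using Amazon.SecurityToken;
using Amazon.SecurityToken.Model;

// Sketch: return temporary credentials scoped to one client's prefix.
static async Task<Credentials> GetClientCredentialsAsync()
{
    var sts = new AmazonSecurityTokenServiceClient();
    var response = await sts.AssumeRoleAsync(new AssumeRoleRequest
    {
        RoleArn = "arn:aws:iam::123456789012:role/client-download-role", // placeholder
        RoleSessionName = "Client_1",
        DurationSeconds = 3600,
        // Session policy: the effective permissions are the intersection of this
        // policy and the role's own policy.
        Policy = @"{
          ""Version"": ""2012-10-17"",
          ""Statement"": [{
            ""Effect"": ""Allow"",
            ""Action"": ""s3:GetObject"",
            ""Resource"": ""arn:aws:s3:::bucket-name-here/Client_1/*""
          }]
        }"
    });
    return response.Credentials; // AccessKeyId, SecretAccessKey, SessionToken, Expiration
}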
I am building a login workflow using Google sign-in for the user. Once the user is authenticated, I call GetAuthResponse to get the idToken.
https://developers.google.com/identity/sign-in/web/backend-auth
Now I need to verify the token's signature against Google's certificate. I am using JwtSecurityToken (C#) for this.
For verification I am referencing http://blogs.msdn.com/b/alejacma/archive/2008/06/25/how-to-sign-and-verify-the-signature-with-net-and-a-certificate-c.aspx
The issue is that I always get false from VerifyHash. As VerifyHash just returns false without giving any reason, I am not able to find a way to verify whether the idToken is valid or not. My code is given below:
String strID = ""; // idToken received from Google AuthResponse
JwtSecurityToken token = new JwtSecurityToken(strID);
byte[] text = GetHash(token.RawData);
SHA256Cng sha1 = new SHA256Cng();
UnicodeEncoding encoding = new UnicodeEncoding();
byte[] data = encoding.GetBytes(text);
byte[] hash = sha1.ComputeHash(data);
byte[] signature = Encoding.Unicode.GetBytes(token.RawSignature);
// Modulus and exponent value from https://www.googleapis.com/oauth2/v2/certs - second set of keys
String modulus = "uHzGq7cMlx21nydbz9VsW1PItetb9mqvnpLp_8E3Knyk-mjv9DlaPhKGHYlJfHYGzKa2190C5vfsLLb1MIeGfdAv7ftpFsanIWawl8Zo0g-l0m7T2yG_7XerqcVK91lFifeJtgxKI86cPdZkgRy6DaYxMuAwAlhvpi3_UhPvsIwi7M6mxE8nUNpUWodh_YjJNu3wOxKDwbBZuRV2itjY6Z7RjFgJt1CsKF-QjqSVvWjAl0LaCaeMS_8yae0ln5YNeS8rAb6xkmcOuYeyhYsiBzwLRvgpXzEVLjLr631Z99oUHTpP9vWJDpGhfkrClkbmdtZ-ZCwX-eFW6ndd54BJEQ==";
String exponent = "AQAB";
modulus = modulus.Replace('-', '+').Replace('_', '/'); // Else it gives Base64 error
StringBuilder sb = new StringBuilder();
sb.Append("<RSAKeyValue>");
sb.Append("<Modulus>");
sb.Append(modulus);
sb.Append("</Modulus>");
sb.Append("<Exponent>");
sb.Append(exponent);
sb.Append("</Exponent>");
sb.Append("</RSAKeyValue>");
RSACryptoServiceProvider RSAVerifier = new RSACryptoServiceProvider();
RSAVerifier.FromXmlString(sb.ToString());
// Verify the signature with the hash
return RSAVerifier.VerifyHash(hash, CryptoConfig.MapNameToOID("SHA256"), signature);
You might want to try the approach used in the Google+ Token Verification project; this fork includes a few minor updates that are still in review.
An alternative approach is to just verify the tokens using Google's token verification endpoints:
curl https://www.googleapis.com/oauth2/v2/tokeninfo?id_token=eyJhbGciOiJSUzI1NiIsImtpZCI6IjkyNGE0NjA2NDgxM2I5YTA5ZmFjZGJiNzYwZGI5OTMwMWU0ZjBkZjAifQ.eyJpc3MiOiJhY2NvdW50cy5nb29nbGUuY29tIiwic3ViIjoiMTEwNTcwOTc3MjI2ODMwNTc3MjMwIiwiYXpwIjoiMzY0MzgxNDQxMzEwLXRuOGw2ZnY2OWdnOGY3a3VjanJhYTFyZWpmaXRxbGpuLmFwcHMuZ29vZ2xldXNlcmNvbnRlbnQuY29tIiwiYXRfaGFzaCI6IlAzLU1HZTdocWZhUkZ5Si1qcWRidHciLCJhdWQiOiIzNjQzODE0NDEzMTAtdG44bDZmdjY5Z2c4ZjdrdWNqcmFhMXJlamZpdHFsam4uYXBwcy5nb29nbGV1c2VyY29udGVudC5jb20iLCJjX2hhc2giOiJjd3hsdXBUSkc4N2FnbU1pb0tSYUV3IiwiaWF0IjoxNDM0NDcyODc2LCJleHAiOjE0MzQ0NzY0NzZ9.Gz_WljZOV9NphDdClakLstutEKk65PNpEof7mxM2j-AOfVwh-SS0L5uxIaknFOk4-nDGmip42vrPYgNvbQWKZY63XuCs94YQgVVmTNCTJnao1IavtrhYvpDqGuGKdEB3Wemg5sS81pEthdvHwyxfwLPYukIhT8-u4ESfbFacsRtR77QRIOk-iLJAVYWTROJ05Gpa-EkTunEBVmZyYetbMfSoYkbwFKxYOlHLY-ENz_XfHTGhYhb-GyGrrw0r4FyHb81IWJ6Jf-7w6y3RiUJik7kYRkvnFouXUFSm8GBwxsioi9AAkavUWUk27s15Kcv-_hkPXzVrW5SvR1zoTI_IMw
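A minimal C# sketch of the same check using HttpClient (the method name is just for illustration; per the backend-auth docs linked above you should also confirm the aud claim in the response matches your client ID):
using System;
using System.Net.Http;
using System.Threading.Tasks;

// Sketch: let Google's tokeninfo endpoint validate the ID token.
static async Task<string> GetTokenInfoAsync(string idToken)
{
    using (var http = new HttpClient())
    {
        // Returns the decoded claims as JSON for a valid token;
        // GetStringAsync throws for an invalid token (non-2xx response).
        return await http.GetStringAsync(
            "https://www.googleapis.com/oauth2/v2/tokeninfo?id_token=" + Uri.EscapeDataString(idToken));
    }
}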
I am trying to validate a JWT token. The code is within an OWIN OAuth handler; however, I have taken the various pieces out into a small console application, and it appears to be a problem with how the JwtHeader method SigningKeyIdentifier creates an X509ThumbprintKeyIdentifierClause object.
My JWT has a header value of x5t = [base64urlencodedvalue], and I have confirmed that when this string is decoded it is indeed the thumbprint for my certificate. However, in the SigningKeyIdentifier class the following code seems to create an incorrect clause, i.e. the hash of the clause doesn't match the certificate.
identifier.Add(new X509ThumbprintKeyIdentifierClause(Base64UrlEncoder.DecodeBytes(this.GetStandardClaim("x5t"))));
Below is a snippet of the console app that tries to demonstrate the issue:
// http://kjur.github.io/jsjws/tool_b64udec.html
const string X5T = "NmJmOGUxMzZlYjM2ZDRhNTZlYTA1YzdhZTRiOWE0NWI2M2JmOTc1ZA"; // value inside the JWT (x5t)
const string thumbPrint = "6bf8e136eb36d4a56ea05c7ae4b9a45b63bf975d"; // correct thumbprint of certificate
string thumbPrintBase64 = Base64UrlEncoder.Encode(thumbPrint); // <--- value in x5t of JWT
// finds correct certificate
var cert1 = X509CertificateHelper.FindByThumbprint(StoreName.My, StoreLocation.LocalMachine, thumbPrint).First();
var certHash = cert1.GetCertHash();
string hexa = BitConverter.ToString(certHash).Replace("-", string.Empty);
Console.WriteLine(hexa.ToLowerInvariant());
// TokenValidationParameters.IssuerSigningKey
var clause1 = new X509ThumbprintKeyIdentifierClause(cert1);
string hex1 = BitConverter.ToString(clause1.GetX509Thumbprint()).Replace("-", string.Empty);
Console.WriteLine(clause1.ToString());
Console.WriteLine(hex1.ToLowerInvariant());
// this is how JwtHeader.SigningKeyIdentifier method creates SecurityKeyIdentifier
var hash = Base64UrlEncoder.DecodeBytes(thumbPrintBase64);
var clause2 = new X509ThumbprintKeyIdentifierClause(hash); // <----- broken
string hexb = BitConverter.ToString(hash).Replace("-", string.Empty);
Console.WriteLine(hexb.ToLowerInvariant());
Console.WriteLine(clause2.ToString());
string hex2 = BitConverter.ToString(clause2.GetX509Thumbprint()).Replace("-", string.Empty);
Console.WriteLine(hex2.ToLowerInvariant());
// clause1 and clause2 should be the same, but they aren't!?
The problem seems to be that the various constructors for X509ThumbprintKeyIdentifierClause end up with different hash values which, when compared later, don't match.
In my OWIN project, one piece creates an X509ThumbprintKeyIdentifierClause from a certificate (TokenValidationParameters.IssuerSigningKey), e.g.
IssuerSigningKey = new X509SecurityKey(X509CertificateHelper.FindByThumbprint(StoreName.My, StoreLocation.LocalMachine, thumbPrint).First()),
and the IssuerSigningKeyResolver method is called to match the JWT with the issuer certificate using the thumbprint from the x5t field:
identifier.Add(new X509ThumbprintKeyIdentifierClause(Base64UrlEncoder.DecodeBytes(this.GetStandardClaim("x5t"))));
but they don't match.
What am I missing? Something feels wrong with the encoding/decoding of the thumbprint.
We had a great discussion about this on GitHub; I should have posted here earlier.
https://github.com/AzureAD/azure-activedirectory-identitymodel-extensions-for-dotnet/issues/100
The input that is encoded to create the x5t value is not the 'thumbprint' string (which is itself a hex encoding of those octets), but the raw octets from the cert hash.
The JwtHeader creates the clause as:
ski.Add(new X509ThumbprintKeyIdentifierClause(Base64UrlEncoder.DecodeBytes(GetStandardClaim(JwtHeaderParameterNames.X5t))));
Not:
var hash = Base64UrlEncoder.DecodeBytes(thumbPrintBase64);
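To put the same thing in code, here is a small sketch assuming cert1 is the certificate found via FindByThumbprint in the console app above: the x5t header is the Base64Url encoding of cert1.GetCertHash(), so decoding x5t yields the raw cert-hash octets rather than the hex thumbprint string.
// Sketch: cert1 is assumed to be the certificate loaded in the console app above.
byte[] certHashOctets = cert1.GetCertHash();               // raw SHA-1 octets of the certificate
string x5t = Base64UrlEncoder.Encode(certHashOctets);      // this is what belongs in the JWT x5t header

var fromHeader = new X509ThumbprintKeyIdentifierClause(Base64UrlEncoder.DecodeBytes(x5t));
var fromCert = new X509ThumbprintKeyIdentifierClause(cert1);
Console.WriteLine(fromHeader.Matches(fromCert));           // True: both clauses wrap the same octets

// Base64Url-encoding the lower-case hex *string* instead (thumbPrintBase64 in the
// console app) produces a different byte sequence, which is why clause2 never matches clause1.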
My client has a Java web service and I am trying to consume it using WCF.
It uses a UsernameToken with a nonce and a created date. This is the link I am using:
http://weblog.west-wind.com/posts/2012/Nov/24/WCF-WSSecurity-and-WSE-Nonce-Authentication
The following code generates the nonce; this is simply appended into the message as a string.
string phrase = Guid.NewGuid().ToString();
var nonce = GetSHA1String(phrase);
protected string GetSHA1String(string phrase)
{
    SHA1CryptoServiceProvider sha1Hasher = new SHA1CryptoServiceProvider();
    byte[] hashedDataBytes = sha1Hasher.ComputeHash(Encoding.UTF8.GetBytes(phrase));
    return Convert.ToBase64String(hashedDataBytes);
}
Error: "The nonce, which is a randomly generated value, has expired." Is the code generating a nonce or a checksum?
SoapUI does not give me this error; it is successful. This is obviously because interoperability is always an issue.
SOAP Error: "The nonce, which is a randomly generated value, has expired." occurred while running action:
Thank you