Extract r and s from an ECDSA signature (SignedXml, C#)

I'm using the SignedXml class in .NET 5 to generate a signature using ECDSA,
and I need the values of r and s, but I cannot extract them. Another problem is
the signature length: my signature is always 64 bytes, but a DER-encoded ECDSA signature is around 71 bytes,
and I do not know why this length changes. Please help me extract r and s.

When converting an ECDSA signature from ASN.1/DER to P1363 (r|s) format, the following must be taken into account:
- In ASN.1/DER format, r and s are contained as signed, big-endian arrays; in P1363 format, as unsigned, big-endian arrays.
- In ASN.1/DER format, r and s are included as minimal-sized arrays; in P1363, both are padded to their maximum size (the length of the order of the generator point) with leading 0x00 values. Example: for NIST P-256, the maximum size of r and s is 32 bytes each.
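Note that the 64-byte signature mentioned in the question is already in P1363 format: on .NET, ECDsa (and therefore SignedXml) emits the r||s encoding rather than DER, which would explain the length. In that case r and s can be read directly as the two fixed-size halves. A minimal sketch of the split (Python here, purely to illustrate the byte layout):

```python
def split_p1363(sig: bytes):
    # P1363 is r || s: two equal-sized, unsigned, big-endian halves
    if len(sig) % 2 != 0:
        raise ValueError("P1363 signature must have even length")
    half = len(sig) // 2
    r = int.from_bytes(sig[:half], "big")
    s = int.from_bytes(sig[half:], "big")
    return r, s
```

For a 64-byte P-256 signature, `half` is 32, matching the maximum size of r and s noted above.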
Possible implementation with .NET 5+ using the AsnReader class:
using System.Formats.Asn1;
...
public static byte[] DERtoP1363(byte[] derSignature, int maxSize)
{
    AsnReader sequence = new AsnReader(derSignature, AsnEncodingRules.DER).ReadSequence();
    byte[] rBytes = sequence.ReadInteger().ToByteArray(true, true); // convert to unsigned, big-endian
    byte[] sBytes = sequence.ReadInteger().ToByteArray(true, true); // convert to unsigned, big-endian
    byte[] rsBytes = new byte[2 * maxSize];
    Buffer.BlockCopy(rBytes, 0, rsBytes, maxSize - rBytes.Length, rBytes.Length);     // left-pad r to maxSize
    Buffer.BlockCopy(sBytes, 0, rsBytes, 2 * maxSize - sBytes.Length, sBytes.Length); // left-pad s to maxSize
    return rsBytes;
}
AsnReader is available since .NET 5.
For completeness: in other .NET versions BouncyCastle can be applied (using classes from the Org.BouncyCastle.Asn1 namespace). For this, the first three lines in DERtoP1363() must be replaced by:
Asn1Sequence sequence = Asn1Sequence.GetInstance(derSignature);
byte[] rBytes = DerInteger.GetInstance(sequence[0]).PositiveValue.ToByteArrayUnsigned();
byte[] sBytes = DerInteger.GetInstance(sequence[1]).PositiveValue.ToByteArrayUnsigned();
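For reference, the DER walk that AsnReader (or BouncyCastle) performs is simple enough to sketch by hand. The following is an illustrative version in Python, assuming a well-formed two-integer SEQUENCE with short-form lengths (true for P-256 signatures, which are at most about 72 bytes):

```python
def der_to_p1363(der_sig: bytes, max_size: int) -> bytes:
    # Layout: 0x30 <len> ( 0x02 <len_r> <r> ) ( 0x02 <len_s> <s> )
    if der_sig[0] != 0x30:
        raise ValueError("not a DER SEQUENCE")
    idx = 2  # skip SEQUENCE tag + short-form length byte
    out = b""
    for _ in range(2):
        if der_sig[idx] != 0x02:
            raise ValueError("expected a DER INTEGER")
        length = der_sig[idx + 1]
        value = der_sig[idx + 2 : idx + 2 + length]
        value = value.lstrip(b"\x00")          # drop the DER sign byte -> unsigned
        out += value.rjust(max_size, b"\x00")  # left-pad to the fixed P1363 size
        idx += 2 + length
    return out
```

The `lstrip`/`rjust` pair mirrors the two format differences listed above: signed-to-unsigned, and minimal-size-to-fixed-size.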

Related

C# Generating PublicKey/IPublicKey object from EC Public key bytes?

When porting a snippet of code from Java to C#, I have come across a specific function for which I am struggling to find a solution. Basically, when decoding, an array of bytes from an EC public key needs to be converted to a PublicKey object, and nothing I have found on the internet seems to help.
I am developing this on Xamarin.Android using Java.Security libraries and BouncyCastle on Mono 6.12.0.
This is the code I am using in Java:
static PublicKey getPublicKeyFromBytes(byte[] pubKey) throws NoSuchAlgorithmException, InvalidKeySpecException {
    ECNamedCurveParameterSpec spec = ECNamedCurveTable.getParameterSpec("secp256r1");
    KeyFactory kf = KeyFactory.getInstance("EC", new BouncyCastleProvider());
    ECNamedCurveSpec params = new ECNamedCurveSpec("secp256r1", spec.getCurve(), spec.getG(), spec.getN());
    ECPoint point = ECPointUtil.decodePoint(params.getCurve(), pubKey);
    ECPublicKeySpec pubKeySpec = new ECPublicKeySpec(point, params);
    return (ECPublicKey) kf.generatePublic(pubKeySpec);
}
This was the best solution I could come up with which didn't throw any errors in VS. Sadly, it throws an exception and tells me that the spec is wrong:
X9ECParameters curve = CustomNamedCurves.GetByName("secp256r1");
ECDomainParameters domain = new ECDomainParameters(curve.Curve, curve.G, curve.N, curve.H);
ECPoint point = curve.Curve.DecodePoint(pubKey);
ECPublicKeyParameters pubKeySpec = new ECPublicKeyParameters(point, domain);
// Get the encoded representation of the public key
byte[] encodedKey = pubKeySpec.Q.GetEncoded();
// Create a KeyFactory object for EC keys
KeyFactory keyFactory = KeyFactory.GetInstance("EC");
// Generate a PublicKey object from the encoded key data
var pbKey = keyFactory.GeneratePublic(new X509EncodedKeySpec(encodedKey));
I have previously created a PrivateKey in a similar way where I generate a PrivateKey and then export the key in PKCS#8 format, then generating the object from this format. However I couldn't get this to work from an already set array of bytes.
Importing a raw public EC key (e.g. for secp256r1) is possible with pure Xamarin classes; BouncyCastle is not needed for this. The returned key can be used directly when generating the KeyAgreement:
using Java.Security.Spec;
using Java.Security;
using Java.Math;
using Java.Lang;
...
private IPublicKey GetPublicKeyFromBytes(byte[] rawXY) // assuming a valid raw key
{
    int size = rawXY.Length / 2;
    ECPoint q = new ECPoint(new BigInteger(1, rawXY[0..size]), new BigInteger(1, rawXY[size..]));
    AlgorithmParameters algParams = AlgorithmParameters.GetInstance("EC");
    algParams.Init(new ECGenParameterSpec("secp256r1"));
    ECParameterSpec ecParamSpec = (ECParameterSpec)algParams.GetParameterSpec(Class.FromType(typeof(ECParameterSpec)));
    KeyFactory keyFactory = KeyFactory.GetInstance("EC");
    return keyFactory.GeneratePublic(new ECPublicKeySpec(q, ecParamSpec));
}
In the above example rawXY is the concatenation of the x and y coordinates of the public key. For secp256r1, both coordinates are 32 bytes each, so the total raw key is 64 bytes.
However, the Java reference code does not import raw keys, but an uncompressed or compressed EC key. The uncompressed key corresponds to the concatenation of x and y coordinate (i.e. the raw key) plus an additional leading 0x04 byte, the compressed key consists of the x coordinate plus a leading 0x02 (for even y) or 0x03 (for odd y) byte.
For secp256r1 the uncompressed key is 65 bytes, the compressed key 33 bytes. A compressed key can be converted to an uncompressed key using BouncyCastle. An uncompressed key is converted to a raw key by removing the leading 0x04 byte.
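The uncompressed-to-raw step itself is a plain byte operation; only decompression requires curve arithmetic. An illustrative sketch (Python; the coordinate size assumes secp256r1):

```python
def raw_from_sec1(point: bytes, coord_size: int = 32) -> bytes:
    # Uncompressed SEC1 point: 0x04 || X || Y -> drop the prefix to get the raw key
    if point[:1] == b"\x04" and len(point) == 1 + 2 * coord_size:
        return point[1:]
    # Compressed point: 0x02/0x03 || X -> y must first be recovered on the curve
    if point[:1] in (b"\x02", b"\x03") and len(point) == 1 + coord_size:
        raise ValueError("compressed point: decompress first (e.g. with BouncyCastle)")
    raise ValueError("not a valid SEC1 point encoding for this coordinate size")
```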
To apply the above import in the case of an uncompressed or compressed key, it is necessary to convert it to a raw key, which can be done with BouncyCastle, e.g. as follows:
using Org.BouncyCastle.Asn1.X9;
using Org.BouncyCastle.Crypto.EC;
...
private byte[] ConvertToRaw(byte[] data) // assuming a valid uncompressed (leading 0x04) or compressed (leading 0x02 or 0x03) key
{
    if (data[0] != 4)
    {
        X9ECParameters curve = CustomNamedCurves.GetByName("secp256r1");
        Org.BouncyCastle.Math.EC.ECPoint point = curve.Curve.DecodePoint(data).Normalize();
        data = point.GetEncoded(false);
    }
    return data[1..];
}
Test: Import of a compressed key:
using Java.Util;
using Hex = Org.BouncyCastle.Utilities.Encoders.Hex;
...
byte[] compressed = Hex.Decode("023291D3F8734A33BCE3871D236431F2CD09646CB574C64D07FD3168EA07D3DB78");
pubKey = GetPublicKeyFromBytes(ConvertToRaw(compressed));
Console.WriteLine(Base64.GetEncoder().EncodeToString(pubKey.GetEncoded())); // MFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEMpHT+HNKM7zjhx0jZDHyzQlkbLV0xk0H/TFo6gfT23ish58blPNhYrFI51Q/czvkAwCtLZz/6s1n/M8aA9L1Vg==
As can be easily verified with an ASN.1 parser (e.g. https://lapo.it/asn1js/), the exported X.509/SPKI key MFkw... contains the raw key, i.e. the compressed key was imported correctly.

C# Converting base64 string to 16-bit words stored in little-endian byte order

I am trying to upload a Base64 encoding of a signature, but it needs to be a Base64 encoding of an array of 16-bit words stored in little-endian byte order. Can anyone help me convert the Base64 to a 16-bit array in little-endian byte order and then convert it back to Base64?
To do this you can create arrays of the correct type (byte[] and short[]) and use Buffer.BlockCopy() to copy the bytes between them, thus converting the data.
This does not account for little-endian/big-endian differences, but since you state that this only needs to run on little-endian systems, we don't need to worry about it.
Here's a sample console app that demonstrates how to do the conversion. It does the following:
Create an array of shorts 0..99 inclusive.
Convert array of shorts to array of bytes (preserving endianness).
Convert array of bytes to base 64 string.
Convert base 64 string back into array of bytes.
Convert array of bytes back into array of shorts (preserving endianness).
Compare converted array of shorts with original array to prove correctness.
Here's the code:
using System;
using System.Linq;

namespace ConsoleApp1
{
    class Program
    {
        static void Main(string[] args)
        {
            // Create demo array of shorts 0..99 inclusive.
            short[] sourceShorts = Enumerable.Range(0, 100).Select(i => (short)i).ToArray();

            // Convert array of shorts to array of bytes. (Will be little-endian on Intel.)
            int byteCount = sizeof(short) * sourceShorts.Length;
            byte[] dataAsByteArray = new byte[byteCount];
            Buffer.BlockCopy(sourceShorts, 0, dataAsByteArray, 0, byteCount);

            // Convert array of bytes to base 64 string.
            var asBase64 = Convert.ToBase64String(dataAsByteArray);
            Console.WriteLine(asBase64);

            // Convert base 64 string back to array of bytes.
            byte[] fromBase64 = Convert.FromBase64String(asBase64);

            // Convert array of bytes back to array of shorts.
            if (fromBase64.Length % sizeof(short) != 0)
                throw new InvalidOperationException("Byte array size must be a multiple of sizeof(short) to be convertible to shorts");
            short[] destShorts = new short[fromBase64.Length / sizeof(short)];
            Buffer.BlockCopy(fromBase64, 0, destShorts, 0, fromBase64.Length);

            // Prove that the round-tripped shorts match the source shorts.
            if (destShorts.SequenceEqual(sourceShorts))
                Console.WriteLine("Converted and unconverted successfully");
            else
                Console.WriteLine("Error: conversion was unsuccessful");
        }
    }
}
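The same round trip can be sketched compactly in Python. Note one difference from the C# above: struct's "<" format guarantees little-endian output on any host, whereas Buffer.BlockCopy copies in host byte order:

```python
import base64
import struct

def shorts_to_b64(values):
    # "<" forces little-endian regardless of host; "h" = 16-bit signed word
    raw = struct.pack("<%dh" % len(values), *values)
    return base64.b64encode(raw).decode("ascii")

def b64_to_shorts(text):
    raw = base64.b64decode(text)
    if len(raw) % 2 != 0:
        raise ValueError("byte length must be a multiple of 2")
    return list(struct.unpack("<%dh" % (len(raw) // 2), raw))
```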

Roundtrip Unicode conversion returns different Byte[] array

I'm tinkering with RSA signing of data.
I'm using a plaintext string, which I convert to a byte array. I then generate a private certificate, sign the byte array, and then generate a public key.
Next, I use the same byte array to verify the signature.
But I want to convert the signature, in between steps, to a string; the idea is to append it later to the file that's being signed.
static void TestSigning(string privateKey)
{
    string data = "TEST_TEST-TEST+test+TEst";
    Console.WriteLine("==MESSAGE==");
    Console.WriteLine(data);
    byte[] dataByte = Encoding.Unicode.GetBytes(data);
    using (var rsa = new RSACryptoServiceProvider())
    {
        rsa.FromXmlString(privateKey);
        var publicKey = rsa.ToXmlString(false);
        byte[] signature = rsa.SignData(dataByte, CryptoConfig.MapNameToOID("SHA512"));
        string signatureString = Encoding.Unicode.GetString(signature);
        byte[] roundtripSignature = Encoding.Unicode.GetBytes(signatureString);
        Console.WriteLine("==TEST==");
        Console.WriteLine(signature.Length.ToString());
        Console.WriteLine(roundtripSignature.Length.ToString());
        using (var checkRSA = new RSACryptoServiceProvider())
        {
            checkRSA.FromXmlString(publicKey);
            bool verification = checkRSA.VerifyData(
                dataByte,
                CryptoConfig.MapNameToOID("SHA512"),
                roundtripSignature);
            Console.WriteLine("==Verification==");
            Console.WriteLine(verification.ToString());
            Console.ReadKey();
        }
    }
}
Now here's the fun part.
If I use UTF-8 encoding I get byte arrays of different lengths:
256 is the original size,
484 is the roundtrip.
UTF-7 returns different sizes too:
256 vs 679.
Both ASCII and Unicode return matching sizes: 256 vs 256.
I've tried using
var sb = new StringBuilder();
for (int i = 0; i < signature.Length; i++)
{
    sb.Append(signature[i].ToString("x2"));
}
to get the string. I'm then using the Encoding.UTF8.GetBytes() method.
This time I get sizes of:
256 vs 512
If I remove the format from ToString() I get:
256 vs 670
Signature verification always fails.
It works fine if I use signature instead of roundtripSignature.
My question: why, despite using the same encoding type, do I get different byte arrays and strings? Shouldn't this conversion be lossless?
Encoding.Unicode isn't a good choice because, at minimum, \0, CR, LF, delete, backspace (and the rest of the control codes) can mess things up. (See this related answer on encrypt/decrypt for more.)
As @JamesKPolk said, you need to use a suitable binary-to-text encoding. Base64 and hex/Base16 are the most common, but there are plenty of other viable choices.
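To illustrate the point (Python for brevity): a binary-to-text encoding such as Base64 round-trips arbitrary bytes losslessly, which no text decoding of raw signature bytes can guarantee:

```python
import base64

def signature_to_text(sig: bytes) -> str:
    # Base64 maps arbitrary bytes onto printable ASCII, so the round trip is lossless
    return base64.b64encode(sig).decode("ascii")

def text_to_signature(text: str) -> bytes:
    return base64.b64decode(text)
```

Hex (`sig.hex()` / `bytes.fromhex()`) works equally well; both avoid the invalid-code-unit replacements that cause the changing lengths observed in the question.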

Shift a 128-bit signed BigInteger to always be positive

I'm converting a Guid to a BigInteger so I can Base62-encode it. This works well; however, I can get negative numbers in the BigInteger. How do I shift the BigInteger so the number is positive? I'll also need to be able to shift it back so I can convert back to a Guid.
// A GUID is a 128-bit value; the BigInteger ctor reads the bytes as signed, little-endian
Guid original = new Guid("{35db5c21-2d98-4456-88a0-af263ed87bc2}");
BigInteger b = new BigInteger(original.ToByteArray());
// shift so it's a positive number?
Note: for a URL-safe variant of Base64, consider using the modified Base64 character set (http://en.wikipedia.org/wiki/Base64#URL_applications) instead of a custom Base62.
I believe you can append a 0 byte to the array first (this ensures the highest byte does not have its top bit set) and then convert to BigInteger, if you really need a positive BigInteger.
do you mean base64 encode?
Convert.ToBase64String(Guid.NewGuid().ToByteArray());
If you sometimes get negative numbers, it means that your GUID value is large enough to fill all 128 bits of the BigInteger or else the BigInteger byte[] ctor is interpreting the data as such. To make sure your bytes are actually positive, check that you are getting <= 16 bytes (128 bits) and that the most-significant bit of the last byte (because it's little endian) is zero. If you have <16 bytes, you can simply append a zero byte to your array (again, append because it is little endian) to make sure the BigInteger ctor treats it as a positive number.
I think this article can give you the solution.
In summary: append one more byte, set to 0, if the most significant bit of the last byte is 1:
Guid original = Guid.NewGuid();
byte[] bytes = original.ToByteArray();
if ((bytes[bytes.Length - 1] & 0x80) > 0)
{
    byte[] temp = new byte[bytes.Length];
    Array.Copy(bytes, temp, bytes.Length);
    bytes = new byte[temp.Length + 1];
    Array.Copy(temp, bytes, temp.Length);
}
BigInteger guidPositive = new BigInteger(bytes);
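The underlying rule is easy to demonstrate at the byte level; a sketch (Python, whose signed little-endian reading matches the BigInteger(byte[]) constructor's interpretation):

```python
def to_positive_le(b: bytes) -> bytes:
    # Little-endian two's complement: the sign bit is the top bit of the LAST byte.
    # Appending 0x00 forces a positive reading; stripping a trailing 0x00 undoes it,
    # recovering the original 16 GUID bytes.
    if b and b[-1] & 0x80:
        return b + b"\x00"
    return b
```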

C# Can't generate initialization vector IV

I get the following error when I try to create an IV (initialization vector) for a TripleDES encryptor.
Please see the code example:
TripleDESCryptoServiceProvider tripDES = new TripleDESCryptoServiceProvider();
byte[] key = Encoding.ASCII.GetBytes("SomeKey132123ABC");
byte[] v4 = key;
byte[] connectionString = Encoding.ASCII.GetBytes("SomeConnectionStringValue");
byte[] encryptedConnectionString = Encoding.ASCII.GetBytes("");
// Read the key and convert it to byte stream
tripDES.Key = key;
tripDES.IV = v4;
This is the exception that I get from the VS.
Specified initialization vector (IV) does not match the block size for this algorithm.
Where am I going wrong?
Thank you
MSDN explicitly states that:
...The size of the IV property must be the same as the BlockSize property.
For Triple DES it is 64 bits.
The size of the initialization vector must match the block size - 64 bit in case of TripleDES. Your initialization vector is much longer than eight bytes.
Further you should really use a key derivation function like PBKDF2 to create strong keys and initialization vectors from password phrases.
Key should be 24 bytes and IV should be 8 bytes.
tripDES.Key = Encoding.ASCII.GetBytes("123456789012345678901234");
tripDES.IV = Encoding.ASCII.GetBytes("12345678");
The IV must be the same length (in bits) as tripDES.BlockSize. This will be 8 bytes (64 bits) for TripleDES.
I've upvoted every answer (well the ones that are here before mine!) here as they're all correct.
However there's a bigger mistake you're making (one which I also made v.early on) - DO NOT USE A STRING TO SEED THE IV OR KEY!!!
A compile-time string literal is a Unicode string and, besides the fact that you will not get either a random or a wide-enough spread of byte values (because even a random string contains lots of repeating bytes, due to the narrow byte range of printable characters), it's very easy to get a character which actually requires 2 bytes instead of 1 - try using 8 of the more exotic characters on the keyboard and you'll see what I mean - when converted to bytes you can end up with more than 8 bytes.
Okay - so you're using ASCII Encoding - but that doesn't solve the non-random problem.
Instead you should use RNGCryptoServiceProvider to initialise your IV and Key and, if you need to capture a constant value for this for future use, then you should still use that class - but capture the result as a hex string or Base-64 encoded value (I prefer hex, though).
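As a sketch of that recommendation (Python's secrets module standing in for RNGCryptoServiceProvider; the sizes are TripleDES's 24-byte key and 8-byte IV):

```python
import secrets

# Generate key and IV from a cryptographically secure RNG, never from a string
key = secrets.token_bytes(24)  # TripleDES: 24-byte key
iv = secrets.token_bytes(8)    # TripleDES: 8-byte IV (64-bit block size)

# If a constant value is needed for future use, capture it as hex
key_hex = key.hex()
restored = bytes.fromhex(key_hex)
```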
To achieve this simply, I've written a macro that I use in VS (bound to the keyboard shortcut CTRL+SHIFT+G, CTRL+SHIFT+H) which uses the .Net PRNG to produce a hex string:
Public Sub GenerateHexKey()
    Dim result As String = InputBox("How many bits?", "Key Generator", 128)
    Dim len As Int32 = 128
    If String.IsNullOrEmpty(result) Then Return
    If System.Int32.TryParse(result, len) = False Then
        Return
    End If
    Dim oldCursor As Cursor = Cursor.Current
    Cursor.Current = Cursors.WaitCursor
    Dim buff((len / 8) - 1) As Byte
    Dim rng As New System.Security.Cryptography.RNGCryptoServiceProvider()
    rng.GetBytes(buff)
    Dim sb As New StringBuilder(CType((len / 8) * 2, Integer))
    For Each b In buff
        sb.AppendFormat("{0:X2}", b)
    Next
    Dim selection As EnvDTE.TextSelection = DTE.ActiveDocument.Selection
    Dim editPoint As EnvDTE.EditPoint
    selection.Insert(sb.ToString())
    Cursor.Current = oldCursor
End Sub
Now all you need to do is to turn your hex string literal into a byte array - I do this with a helpful extension method:
public static byte[] FromHexString(this string str)
{
    // a null check would be a good idea here
    int NumberChars = str.Length;
    byte[] bytes = new byte[NumberChars / 2];
    for (int i = 0; i < NumberChars; i += 2)
        bytes[i / 2] = Convert.ToByte(str.Substring(i, 2), 16);
    return bytes;
}
There are probably better ways of doing that bit - but it works for me.
I do it like this:
var derivedForIv = new Rfc2898DeriveBytes(passwordBytes, _saltBytes, 3);
_encryptionAlgorithm.IV = derivedForIv.GetBytes(_encryptionAlgorithm.LegalBlockSizes[0].MaxSize / 8);
The IV gets bytes from the derive bytes 'smusher' using the block size as described by the algorithm itself via the LegalBlockSizes property.
