Let's say that I have an ECDSA key:
$ openssl ecparam -name prime256v1 -genkey | tee private.pem
-----BEGIN EC PARAMETERS-----
BggqhkjOPQMBBw==
-----END EC PARAMETERS-----
-----BEGIN EC PRIVATE KEY-----
cGxlYXNlIGRvbid0IHVzZSBrZXkgbWF0ZXJpYWwgdGhhdCB5b3UgZmlu
ZCBvbiBhIHJhbmRvbSBibG9nIHBvc3QuIHdoeSBvbiBlYXJ0aCB3b3Vs
ZCB5b3UgZG8gdGhhdD8K
-----END EC PRIVATE KEY-----
With its corresponding public key:
$ openssl ec -in private.pem -pubout | tee public.pem
read EC key
writing EC key
-----BEGIN PUBLIC KEY-----
MFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAE5EVyYKeGTjA60HpH0sW1claw2LXw
zY0sN9YhtcMEzMXuA+9IgdkHntKJO0aiBaVueI65srq6hBK+5QVEjFwr2Q==
-----END PUBLIC KEY-----
And I want to sign some data with it at the command-line:
$ printf '%s' "totes legit" | openssl dgst -sign private.pem -sha256 | base64
MEQCIEcOOZWYJUZ6y2zJ61GpRwGw0nXOxpCepn2JDa5WCy42AiA100CUhgycKP5LUVUO9clU1omYu0JvX132pK6cT0Vrgw==
Then I can easily verify it in .NET:
$ cat Program.cs
using System.Security.Cryptography;
using var alg = ECDsa.Create();
alg.ImportFromPem(File.ReadAllText("public.pem"));
var text = "totes legit"u8;
var signature = Convert.FromBase64String("MEQCIEcOOZWYJUZ6y2zJ61GpRwGw0nXOxpCepn2JDa5WCy42AiA100CUhgycKP5LUVUO9clU1omYu0JvX132pK6cT0Vrgw==");
var verified = alg.VerifyData(text, signature, HashAlgorithmName.SHA256);
Console.WriteLine($"Verified: {verified}");
This should work, right?
$ dotnet run
Verified: False
uhhh, what??
After spending literally hours earlier this week debugging this, it turns out both programs are functioning as intended. Let's have a look at that signature again:
$ echo "MEQCIEcOOZWYJUZ6y2zJ61GpRwGw0nXOxpCepn2JDa5WCy42AiA100CUhgycKP5LUVUO9clU1omYu0JvX132pK6cT0Vrgw==" | base64 -d | openssl asn1parse -inform der
0:d=0 hl=2 l= 68 cons: SEQUENCE
2:d=1 hl=2 l= 32 prim: INTEGER :470E39959825467ACB6CC9EB51A94701B0D275CEC6909EA67D890DAE560B2E36
36:d=1 hl=2 l= 32 prim: INTEGER :35D34094860C9C28FE4B51550EF5C954D68998BB426F5F5DF6A4AE9C4F456B83
OpenSSL's signature is an ASN.1 blob (yay!), which ends up being variable-length thanks to ASN.1's INTEGER encoding rules: integers are signed two's complement, so a positive value whose high bit is set gets an extra leading zero byte. Repeated runs share the same prefix because of the ASN.1 header, but the overall length varies.
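To see that leading-zero rule in action, here's a minimal sketch (using System.Formats.Asn1, which ships with .NET 5+) encoding two one-byte values as DER INTEGERs; the second needs an extra pad byte, which is exactly why r and s make the signature variable-length:

```csharp
using System.Formats.Asn1;
using System.Numerics;

// A DER INTEGER is big-endian two's complement, so a positive value whose
// top byte has its high bit set needs an extra 0x00 byte to stay positive.
static byte[] EncodeInteger(BigInteger value)
{
    var writer = new AsnWriter(AsnEncodingRules.DER);
    writer.WriteInteger(value);
    return writer.Encode();
}

// 0x7F fits in one content byte: 02 01 7F
Console.WriteLine(Convert.ToHexString(EncodeInteger(0x7F)));
// 0x80 needs a leading zero:     02 02 00 80
Console.WriteLine(Convert.ToHexString(EncodeInteger(0x80)));
```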
If we create a signature in .NET:
$ cat Program.cs
using System.Security.Cryptography;
using var alg = ECDsa.Create();
alg.ImportFromPem(File.ReadAllText("private.pem"));
var text = "totes legit"u8;
var signature = alg.SignData(text, HashAlgorithmName.SHA256);
Console.WriteLine(Convert.ToBase64String(signature));
$ dotnet run
S7/UdK12MLvUQfw4I+7Avudf5dHNx8XbAOtPISQEP28vKGezDH/8NCn2mb5nL3GYWI87Hv9SuUA0/+dw1cevYg==
This isn't ASN.1, and repeated runs show that it is fixed-length.
It turns out that these are two different formats for DSA-based algorithms. If you target .NET 5 or higher exclusively, you can switch between them with overloads of SignData and VerifyData that take a DSASignatureFormat, which highlights the difference.
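Here's a self-contained sketch with a throwaway key (not the pair above, whose private half I obviously don't have handy) showing those overloads in action:

```csharp
using System.Security.Cryptography;

// Throwaway P-256 key: .NET 5+ can produce and consume OpenSSL's
// format directly via DSASignatureFormat.Rfc3279DerSequence.
using var alg = ECDsa.Create(ECCurve.NamedCurves.nistP256);
var data = "totes legit"u8.ToArray();

var derSignature = alg.SignData(data, HashAlgorithmName.SHA256,
    DSASignatureFormat.Rfc3279DerSequence);

// The DER-aware overload accepts it...
var derOk = alg.VerifyData(data, derSignature, HashAlgorithmName.SHA256,
    DSASignatureFormat.Rfc3279DerSequence);

// ...while the default (IEEE P1363) overload rejects the very same bytes.
var p1363Ok = alg.VerifyData(data, derSignature, HashAlgorithmName.SHA256);

Console.WriteLine($"DER overload: {derOk}, default overload: {p1363Ok}");
```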
.NET defaults to - and .NET Framework/.NET Standard only support - IEEE P1363, which says "take the two data components and squish them together as fixed-length values."
OpenSSL only supports RFC 3279, which describes an ASN.1 format that takes the two data components and puts them in individual fields inside of a SEQUENCE.
You can translate between them by adding or removing the ASN.1 framing bytes, but there are no simple built-in APIs to do this. 🙁
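As a sketch of that translation, a hypothetical helper (my name, not a framework API) that unwraps the DER SEQUENCE into the fixed-length r||s concatenation might look like this, using System.Formats.Asn1 (built into .NET 5+, or available as a NuGet package for older targets):

```csharp
using System.Formats.Asn1;
using System.Security.Cryptography;

// Example round-trip: sign in OpenSSL's DER format, translate,
// then verify with the default (IEEE P1363) overload.
using var alg = ECDsa.Create(ECCurve.NamedCurves.nistP256);
var data = "totes legit"u8.ToArray();
var der = alg.SignData(data, HashAlgorithmName.SHA256,
    DSASignatureFormat.Rfc3279DerSequence);
var verified = alg.VerifyData(data, DerToP1363(der, 32), HashAlgorithmName.SHA256);
Console.WriteLine($"Verified after translation: {verified}");

// Hypothetical helper: unwrap an RFC 3279 SEQUENCE(r, s) into the
// fixed-length r||s concatenation that IEEE P1363 verifiers expect.
// fieldSizeBytes is 32 for P-256.
static byte[] DerToP1363(byte[] derSignature, int fieldSizeBytes)
{
    var sequence = new AsnReader(derSignature, AsnEncodingRules.DER).ReadSequence();
    var r = sequence.ReadIntegerBytes().ToArray();
    var s = sequence.ReadIntegerBytes().ToArray();

    var result = new byte[fieldSizeBytes * 2];
    CopyFixed(r, result, 0, fieldSizeBytes);
    CopyFixed(s, result, fieldSizeBytes, fieldSizeBytes);
    return result;

    // Strip any DER leading-zero pad byte, then right-align into the field.
    static void CopyFixed(byte[] value, byte[] dest, int offset, int size)
    {
        int skip = value.Length > size ? value.Length - size : 0; // drop 0x00 pad
        int length = value.Length - skip;
        value.AsSpan(skip).CopyTo(dest.AsSpan(offset + size - length));
    }
}
```

Going the other direction is the mirror image: write the two halves back out as INTEGERs inside a SEQUENCE with AsnWriter.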