Help please - openssl and ruby

Hi,
I’ve run into a problem using openssl and ruby, which I can’t solve.
I have a public key, a text string, and a signature, and I want to check
that the signature was created from the text string using the
appropriate private key.

On the command line I can do
openssl dgst -sha512 -verify /tmp/public.pem -signature /tmp/signatureValue.bin /tmp/signatureText.bin

and I get ‘Verified OK’

I want to do the same in ruby. I can create the hash ok, but when I run

pkey = OpenSSL::PKey::RSA.new(
  Base64.decode64(File.read('/tmp/public.pem')))

p pkey.verify(OpenSSL::Digest::SHA512.new, signatureValue, signatureText)

I get ‘false’

Examining the key, I notice that before it is loaded into the OpenSSL
object the key reads
-----BEGIN PUBLIC KEY-----
MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEA7p…

but when I print the key out after creating the OpenSSL object I get
-----BEGIN RSA PUBLIC KEY-----
MIIBCgKCAQEA7p…

i.e. a chunk (MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8) has gone missing from the
front of the key. I guess this might be the source of my problem, but
try as I might I can’t find anything about what is going on here. I’m
no great SSL expert, and I’d love to get this done before Christmas, so
if anyone can help it would really be appreciated.

thanks
chris

On 12/22/06, Chris C. [email protected] wrote:

and I get ‘Verified OK’
I get ‘false’
MIIBCgKCAQEA7p…

i.e. a chunk (MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8) has gone missing from the
front of the key. I guess this might be the source of my problem, but

The public key is encoded in ASN.1 DER, i.e. in Tag-Length-Value form.
Base64-decode the key before and after loading it and run it through an
ASN.1 dump tool (asn1dump, openssl asn1parse, or similar) to see the
key’s content.
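
For example, the same inspection can be done from Ruby with OpenSSL::ASN1
(a minimal sketch, reusing the /tmp/public.pem path from your first post;
openssl asn1parse -in /tmp/public.pem gives an equivalent command-line dump):

require 'openssl'
require 'base64'

# Strip the PEM armour, decode the base64 body to raw DER, and let
# OpenSSL walk the ASN.1 structure. For a "BEGIN PUBLIC KEY" file this
# prints the AlgorithmIdentifier sequence (rsaEncryption OID plus NULL)
# and the bit string that wraps the bare RSA key.
pem = File.read('/tmp/public.pem')
der = Base64.decode64(pem.gsub(/-----(BEGIN|END) [^-]+-----/, ''))

OpenSSL::ASN1.decode(der).value.each do |node|
  p node
end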

The content is defined in the PKCS#1 standard, which has also been
published as RFC 3447. You’ll find it somewhere around rsa.com or
rsalabs.com (they provide nicely formatted PDFs).

The content of an openssl-generated public key is (at least mine is)

Sequence (taglength= 1 length= 290) (UNIV/CONST)
  Sequence (taglength= 1 length= 13) (UNIV/CONST)
    Objectidentifier (taglength= 1 length= 9) (UNIV/PRIM)
    ::= 1 2 840 113549 1 1 1
    Null (taglength= 1 length= 0) (UNIV/PRIM)
  Bitstring (taglength= 1 length= 271) (UNIV/PRIM)

I suppose the ObjectIdentifier and its enclosing sequence get dropped,
leaving just the actual n and e.

Summary: that shouldn’t be a problem, as long as the n and e are
preserved.
That obviously doesn’t answer your primary question… I can’t say.
Maybe it’s the format of the data? If you could post the exact data and
the exact (fully working) code, I may have a look at it.
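
A quick way to check this is to print the n and e that Ruby ends up with
and compare them with the output of openssl rsa -pubin -in
/tmp/public.pem -text -noout; whatever the PEM header says, verification
only depends on these two numbers. A minimal sketch, using the path from
your first post:

require 'openssl'

# Print the modulus and public exponent of the loaded key; these are the
# only parts of the public key that verify() uses, so the differing PEM
# headers are harmless as long as these values match the command-line
# output.
key = OpenSSL::PKey::RSA.new(File.read('/tmp/public.pem'))
puts "n = #{key.n.to_s(16)}"
puts "e = #{key.e}"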

Jan S. wrote:

Summary: that shouldn’t be a problem, as long as the n and e are
preserved.
That obviously doesn’t answer your primary question… I can’t say.
Maybe it’s the format of the data? If you could post the exact data and
the exact (fully working) code, I may have a look at it.

Thanks Jan - you were right that the difference between the keys was not
causing the problem. For future readers, the way to do what I was
trying to achieve is simple and obvious:

require 'openssl'
require 'base64'

# The signature arrives base64-encoded, so it has to be decoded before
# being handed to verify(); the signed text is passed through as-is.
pkey = OpenSSL::PKey::RSA.new(Base64.decode64(userPublicKey))
if pkey.verify(OpenSSL::Digest::SHA512.new,
               Base64.decode64(signatureValue), signatureText)
  puts " OK: Signature OK"
else
  puts " ERROR: Signature failed"
end

I think when I first tried this, I didn’t decode the signature, then saw
the difference in the keys and this threw me off track …
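
For completeness, here is the direct equivalent of the openssl dgst
command at the top of the thread, reading the raw binary signature and
text straight from the files, so no base64 decoding is involved (a
minimal sketch, using the same /tmp paths):

require 'openssl'

# Verify using the same files as the command-line example; the signature
# file already holds raw binary, so it is passed to verify() unchanged.
pub = OpenSSL::PKey::RSA.new(File.read('/tmp/public.pem'))
sig = File.binread('/tmp/signatureValue.bin')
txt = File.binread('/tmp/signatureText.bin')

ok = pub.verify(OpenSSL::Digest::SHA512.new, sig, txt)
puts(ok ? 'Verified OK' : 'Verification failed')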

C