Having some issues getting this to work; I don't know much about
encryption and I think I'm just making a silly mistake somewhere. Would
appreciate a pointer or two in the right direction.
Encrypting “a” in C# using 256-bit AES in CBC mode yields:
Whilst in Ruby I get: uNL1Ogvi+mQyWGniJRY88g==
Ruby code: http://pastie.org/601527
c# code: http://pastie.org/601526
Observe that the first ciphertext is 32 bytes (256 bits) long, while the
second is 16 bytes (128 bits).
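You can confirm the length of the Ruby ciphertext yourself by decoding the Base64 string (a minimal sketch using Ruby's standard library):

```ruby
require 'base64'

# Decode the Base64 ciphertext the Ruby code produced
ciphertext = Base64.decode64('uNL1Ogvi+mQyWGniJRY88g==')
puts ciphertext.bytesize  # => 16, i.e. exactly one 128-bit AES block
```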
Have a look at http://en.wikipedia.org/wiki/Aes256:
“AES has a fixed block size of 128 bits and a key size of 128, 192, or
256 bits, whereas Rijndael can be specified with block and key sizes in
any multiple of 32 bits, with a minimum of 128 bits and a maximum of 256
bits.”
Your C# code appears to be using Rijndael, presumably with a 256-bit
block size, whereas OpenSSL is using AES.
Anyway, I can get the same results from both Ruby and the openssl
command line tool, if I modify your Ruby program as follows:
cipher_enc.iv = "\x00" * 16
cipher_enc.key = "\x00" * 32
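Put together, the modified Ruby side looks roughly like this (a sketch; I'm assuming the rest of your program just calls update and final on the cipher, and the all-zero key/IV are purely for comparison with the command line, not for real use):

```ruby
require 'openssl'

cipher_enc = OpenSSL::Cipher.new('aes-256-cbc')
cipher_enc.encrypt                 # set direction before key/IV
cipher_enc.iv  = "\x00" * 16       # 128-bit IV, all zero bytes
cipher_enc.key = "\x00" * 32       # 256-bit key, all zero bytes

ciphertext = cipher_enc.update('a') + cipher_enc.final
puts ciphertext.unpack('H*').first
# => b4a5e98af6810a83bdf52b14ae822c37, matching the openssl command line
```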
$ echo -n “a” | openssl aes-256-cbc -K 0 -iv 0 | hexdump -C
00000000 b4 a5 e9 8a f6 81 0a 83 bd f5 2b 14 ae 82 2c 37
This demonstrates that you are doing the AES encryption properly. But
AES with a 256-bit key and a 128-bit block size is not the same thing as
Rijndael with a 256-bit block size.
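If you want to sanity-check the whole thing end to end, a quick round trip in Ruby (same all-zero key and IV as above, for illustration only; never hardcode a key or reuse a fixed IV in real code):

```ruby
require 'openssl'

key = "\x00" * 32  # 256-bit key (zeros, test only)
iv  = "\x00" * 16  # 128-bit IV (zeros, test only)

enc = OpenSSL::Cipher.new('aes-256-cbc')
enc.encrypt
enc.key = key
enc.iv  = iv
ciphertext = enc.update('a') + enc.final

dec = OpenSSL::Cipher.new('aes-256-cbc')
dec.decrypt
dec.key = key
dec.iv  = iv
plaintext = dec.update(ciphertext) + dec.final
puts plaintext  # => a
```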
You may find the AES test vectors on the Wikipedia page helpful too.