Question

This is a continuation of my previous question (RSA Decryption).

Edit: The answer below was clear and concise. I've added some sample code below, which helped me verify what I think is happening. I will test today and post back. The client application is creating a 256-bit key, but when it calls gcry_cipher_setkey it passes the key length returned for the algorithm. So I'm guessing that setkey only ever uses the first 128 bits of the 256-bit key. Easy enough to test.

I am attempting to decrypt a file that was encrypted using libgcrypt with AES-128 / CBC / no padding. Two different applications do the encryption, neither of which I have any control over. One uses a 128-bit AES key while the other uses a 256-bit key. All the internal calls are otherwise the same.

Edit: Here is the pseudo encryption code. Edit 2: Fixed the call order and added comments for future readers:

#define AES_KEY_SIZE 32

gcry_cipher_hd_t AesHandle;
gcry_error_t Error;
size_t BlockLength, KeyLength;
char AESKey[AES_KEY_SIZE+1];

GenerateAESKey(AESKey);

Error = gcry_cipher_open(&AesHandle, GCRY_CIPHER_AES128, GCRY_CIPHER_MODE_CBC, 0);
// this is always 128 bits (16 bytes) for AES
BlockLength = gcry_cipher_get_algo_blklen(GCRY_CIPHER_AES128);
// for GCRY_CIPHER_AES128 this returns 128 bits (16 bytes)
KeyLength = gcry_cipher_get_algo_keylen(GCRY_CIPHER_AES128);
// no IV is supplied, so libgcrypt uses an all-zero IV
Error = gcry_cipher_setiv(AesHandle, NULL, 0);
// AESKey is a 32-byte (256-bit) char array, but KeyLength is 16 bytes (128 bits),
// so gcry_cipher_setkey only reads the first 'KeyLength' bytes of 'AESKey',
// which in this case is the first 16 bytes
Error = gcry_cipher_setkey(AesHandle, AESKey, KeyLength);
Error = gcry_cipher_encrypt(AesHandle, Encrypted, BlockLength, ToEncrypt, BlockLength);


// Fills AESKey with AES_KEY_SIZE printable ASCII characters (33..125)
// and NUL-terminates it. Note that rand() is not a cryptographically
// secure source of key material.
void GenerateAESKey( char * AESKey ) {
    int i;

    srand(time(NULL));
    for ( i = 0; i < AES_KEY_SIZE; i++ ) {
        AESKey[i] = (rand() % 93)+33;
    }
    AESKey[AES_KEY_SIZE] = '\0';
}
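
Here is the kind of standalone round trip I plan to use to confirm the truncation (a hypothetical sketch, not the client code: the key literal is made up, error checking is omitted, and it assumes libgcrypt is installed and linked with -lgcrypt):

#include <stdio.h>
#include <string.h>
#include <gcrypt.h>

int main(void) {
    // 32-byte buffer, but only the first 16 bytes should ever be used
    char Key[33] = "0123456789abcdef0123456789abcdef";
    char Plain[16] = "16 byte message!";
    char Enc[16], Dec[16];
    gcry_cipher_hd_t Handle;

    gcry_check_version(NULL);

    // encrypt the way the client application does: AES-128 context,
    // all-zero IV, and only the first 16 bytes of the 32-byte key
    gcry_cipher_open(&Handle, GCRY_CIPHER_AES128, GCRY_CIPHER_MODE_CBC, 0);
    gcry_cipher_setiv(Handle, NULL, 0);
    gcry_cipher_setkey(Handle, Key, 16);
    gcry_cipher_encrypt(Handle, Enc, sizeof(Enc), Plain, sizeof(Plain));
    gcry_cipher_close(Handle);

    // decrypt with the truncated 16-byte key
    gcry_cipher_open(&Handle, GCRY_CIPHER_AES128, GCRY_CIPHER_MODE_CBC, 0);
    gcry_cipher_setiv(Handle, NULL, 0);
    gcry_cipher_setkey(Handle, Key, 16);
    gcry_cipher_decrypt(Handle, Dec, sizeof(Dec), Enc, sizeof(Enc));
    gcry_cipher_close(Handle);

    printf("%s\n", memcmp(Plain, Dec, sizeof(Plain)) == 0 ? "match" : "mismatch");
    return 0;
}

If the truncation theory is right, decrypting with the first 16 bytes of the key should print "match".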

So in my C# code I do this before I start decrypting:

var aesKey = DecryptAesKey(s);
// libgcrypt only ever used the first 16 bytes of the 32-byte key,
// so keep just those before handing the key to the decryptor
if (aesKey.Length == 32)
{
   var tempKey = new byte[16];
   Buffer.BlockCopy(aesKey, 0, tempKey, 0, 16);
   aesKey = tempKey;
}

I am using C# and BouncyCastle to decrypt the files. I can successfully decrypt the files that were encrypted with the 128-bit key, but I fail when the key is 256-bit. "Fail" means the output is garbled.

I have verified that the AES Key is getting decrypted correctly for both sources.

My question is: what is libgcrypt doing differently when the key is 256 bits? Or is this even the right path to find out why my decryption is failing? Thank you for any info or direction you can point me in.

Brian

Solution

I assume you have:

  1. two different keys (one 128-bit, one 256-bit)
  2. two different sources of ciphertext (1: AES-128, 2: AES-256, both CBC / no padding)
  3. decryption of the AES-128 ciphertext succeeding
  4. decryption of the AES-256 ciphertext failing (everything garbled, nothing of the plaintext recovered)

The main difference is the key length, 128 or 256 bits. The key length determines the number of transformation rounds AES applies to the input: 10 rounds for 128-bit keys, 14 for 256-bit keys. You will probably never need these internal implementation details.

The important part is that the block size is always 128 bits, for both AES-128 and AES-256 in CBC mode.

Both AES-128-CBC and AES-256-CBC also use a 128-bit IV.
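
You can check this straight from libgcrypt (a quick sketch; assumes libgcrypt is installed, compile with -lgcrypt):

#include <stdio.h>
#include <gcrypt.h>

int main(void) {
    // the key length differs between the two algorithms, the block length does not
    printf("AES-128: keylen=%u blklen=%u\n",
           (unsigned) gcry_cipher_get_algo_keylen(GCRY_CIPHER_AES128),
           (unsigned) gcry_cipher_get_algo_blklen(GCRY_CIPHER_AES128));
    printf("AES-256: keylen=%u blklen=%u\n",
           (unsigned) gcry_cipher_get_algo_keylen(GCRY_CIPHER_AES256),
           (unsigned) gcry_cipher_get_algo_blklen(GCRY_CIPHER_AES256));
    return 0;
}

This should print keylen=16 blklen=16 for AES-128 and keylen=32 blklen=16 for AES-256: same 16-byte block, different key sizes.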

So my wild guess (without seeing your AES-256 code) is that there is a bug somewhere in the block or IV size in your AES-256 decryption code.

If you just set the key for AES-128 with this function, check the docs:

 gcry_error_t gcry_cipher_setkey (gcry_cipher_hd_t h, const void *k, size_t l)

The length l (in bytes) of the key k must match the required length of the algorithm set for this context or be in the allowed range for algorithms with variable key size. The function checks this and returns an error if there is a problem. A caller should always check for an error.
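
To see what that error looks like, here is a small sketch (assuming the usual libgcrypt convention that a zero gcry_error_t means success and that gcry_strerror turns an error into a readable message):

#include <stdio.h>
#include <gcrypt.h>

int main(void) {
    gcry_cipher_hd_t Handle;
    gcry_error_t Error;
    char Key[32] = {0};  // dummy key material

    gcry_cipher_open(&Handle, GCRY_CIPHER_AES128, GCRY_CIPHER_MODE_CBC, 0);

    // 32 bytes is the wrong key length for an AES-128 context
    Error = gcry_cipher_setkey(Handle, Key, 32);
    printf("setkey with 32 bytes: %s\n", Error ? gcry_strerror(Error) : "ok");

    // 16 bytes is the required length
    Error = gcry_cipher_setkey(Handle, Key, 16);
    printf("setkey with 16 bytes: %s\n", Error ? gcry_strerror(Error) : "ok");

    gcry_cipher_close(Handle);
    return 0;
}

Note that the client code in your question never hits this error: it passes KeyLength (16 bytes) rather than the size of the 32-byte buffer, so setkey sees a valid AES-128 key and silently uses only the first half of the generated key.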

And you normally don't want to use CBC with no padding (unless your data length is always a multiple of 16 bytes), but padding problems would garble only the last 16-byte block of the plaintext, not all of it.

Licensed under: CC-BY-SA with attribution