Question

I'm in the process of porting our Java application to OS X (10.8). One of our unit tests fails when doing encryption (it works on Windows). Both machines are running Java 7 Update 21, but the Windows version uses the 32-bit JDK and the Mac version uses the 64-bit JDK.

When running it on the Mac, I get the following exception while trying to decrypt the encrypted data:

Caused by: javax.crypto.BadPaddingException: Given final block not properly padded
    at com.sun.crypto.provider.CipherCore.doFinal(CipherCore.java:811)
    at com.sun.crypto.provider.CipherCore.doFinal(CipherCore.java:676)
    at com.sun.crypto.provider.AESCipher.engineDoFinal(AESCipher.java:313)
    at javax.crypto.Cipher.doFinal(Cipher.java:2087)
    at com.degoo.backend.security.Crypto.processCipher(Crypto.java:56)
    ... 25 more

Here's the encryption class:

import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.security.SecureRandom;

public final class Crypto {

    private final static String CIPHER_ALGORITHM = "AES";
    private final static String CIPHER_TRANSFORMATION = "AES/CBC/PKCS5Padding";

    public final static int CRYPTO_KEY_SIZE = 16;    

    public static byte[] encryptByteArray(byte[] blockToEncrypt, int maxLengthToEncrypt, byte[] encryptionKey, byte[] ivBytes) {
        return processCipher(blockToEncrypt, maxLengthToEncrypt, Cipher.ENCRYPT_MODE, ivBytes, encryptionKey);
    }

    public static byte[] decryptByteArray(byte[] encryptedData, byte[] encryptionKey, byte[] ivBytes) {
        return processCipher(encryptedData, encryptedData.length, Cipher.DECRYPT_MODE, ivBytes, encryptionKey);
    }

    private static byte[] processCipher(byte[] blockToEncrypt, int maxLength, int cryptionMode, byte[] ivBytes, byte[] encryptionKey) {
        try {
            IvParameterSpec iv = new IvParameterSpec(ivBytes);
            final Cipher cipher = initCipher(cryptionMode, iv, encryptionKey);
            return cipher.doFinal(blockToEncrypt, 0, maxLength);
        } catch (Exception e) {
            throw new RuntimeException("Failure", e);
        }
    }

    private static Cipher initCipher(int cryptionMode, IvParameterSpec iv, byte[] encryptionKey) {
        KeyGenerator keyGen;
        try {
            keyGen = KeyGenerator.getInstance(CIPHER_ALGORITHM);

            final SecureRandom randomSeed = new SecureRandom();
            randomSeed.setSeed(encryptionKey);
            keyGen.init(CRYPTO_KEY_SIZE * 8, randomSeed);

            // Generate the secret key specs.
            final SecretKey secretKey = keyGen.generateKey();

            final SecretKeySpec secretKeySpec = new SecretKeySpec(secretKey.getEncoded(), CIPHER_ALGORITHM);

            // Instantiate the cipher
            final Cipher cipher = Cipher.getInstance(CIPHER_TRANSFORMATION);

            cipher.init(cryptionMode, secretKeySpec, iv);
            return cipher;

        } catch (Exception e) {
            throw new RuntimeException("Failure", e);
        }
    }
}

The test code looks like this:

public void testEncryption() throws Exception {
        int dataLength = TestUtil.nextInt(applicationParameters.getDataBlockMinSize());
        byte[] dataToEncrypt = new byte[dataLength];
        TestUtil.nextBytes(dataToEncrypt);

        int keyLength = 16;
        byte[] key = new byte[keyLength];
        TestUtil.nextBytes(key);

        byte[] ivBytes = new byte[16];
        TestUtil.nextBytes(ivBytes);

        long startTime = System.nanoTime();
        byte[] encryptedBlock = Crypto.encryptByteArray(dataToEncrypt, dataToEncrypt.length, key, ivBytes);
        long endTime = System.nanoTime();
        System.out.println("Encryption-speed: " + getMBPerSecond(dataLength, startTime, endTime));

        startTime = System.nanoTime();
        byte[] decryptedData = Crypto.decryptByteArray(encryptedBlock, key, ivBytes);
        endTime = System.nanoTime();
        System.out.println("Decryption-speed: " + getMBPerSecond(dataLength, startTime, endTime));

        if (encryptedBlock.length == decryptedData.length) {
            boolean isEqual = true;
            //Test that the encrypted data is not equal to the decrypted data.
            for (int i = 0; i < encryptedBlock.length; i++) {
                if (encryptedBlock[i] != decryptedData[i]) {
                    isEqual = false;
                    break;
                }
            }
            if (isEqual) {
                throw new RuntimeException("Encrypted data is equal to decrypted data!");
            }
        }

        Assert.assertArrayEquals(dataToEncrypt, decryptedData);
    }

Solution

I think I've found it. For some reason the code above derives an encryption key by seeding a SecureRandom instance with the existing encryption key and using it to generate a new key via KeyGenerator (don't ask me why; it was written a long time ago). The encoded bytes of that generated key are then fed to the SecretKeySpec constructor. If I skip all of that and feed the SecretKeySpec constructor the encryption key we already have, the unit test passes. The code that does the encryption now looks like this:

final SecretKeySpec secretKeySpec = new SecretKeySpec(encryptionKey, CIPHER_ALGORITHM);

// Instantiate the cipher
final Cipher cipher = Cipher.getInstance(CIPHER_TRANSFORMATION);

cipher.init(cryptionMode, secretKeySpec, iv);
return cipher;
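
For context, the whole initCipher method then reduces to something like this (a sketch based on the class above; the 16-byte key passed in is used directly as the AES key):

private static Cipher initCipher(int cryptionMode, IvParameterSpec iv, byte[] encryptionKey) {
    try {
        // Use the supplied key directly as the AES key instead of deriving
        // one through a seeded SecureRandom and KeyGenerator.
        final SecretKeySpec secretKeySpec = new SecretKeySpec(encryptionKey, CIPHER_ALGORITHM);

        // Instantiate the cipher
        final Cipher cipher = Cipher.getInstance(CIPHER_TRANSFORMATION);
        cipher.init(cryptionMode, secretKeySpec, iv);
        return cipher;
    } catch (Exception e) {
        throw new RuntimeException("Failure", e);
    }
}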

The odd thing is that it has worked on Windows. It looks like the SecureRandom implementations behave differently on OS X and on Windows: calling setSeed on OS X appends to the existing seed, whereas on Windows it replaces it.

Update: found some more details on the implementation differences of SecureRandom: http://www.cigital.com/justice-league-blog/2009/08/14/proper-use-of-javas-securerandom/
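
To make the difference concrete, here is a small stand-alone sketch (not part of the original code) that seeds two SecureRandom instances with identical bytes and checks whether their output matches. With the Sun SHA1PRNG, seeding before the first nextBytes call makes the output reproducible; with the NativePRNG that OS X and Linux typically use as the default, setSeed only supplements the internal state, so the output differs between runs:

import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Arrays;

public class SecureRandomSeedDemo {

    public static void main(String[] args) throws Exception {
        byte[] seed = "fixed-seed".getBytes(StandardCharsets.UTF_8);

        // SHA1PRNG (the Windows default): seeding before the first nextBytes
        // call fully determines the output, so both instances agree.
        byte[] a = generate(SecureRandom.getInstance("SHA1PRNG"), seed);
        byte[] b = generate(SecureRandom.getInstance("SHA1PRNG"), seed);
        System.out.println("SHA1PRNG reproducible: " + Arrays.equals(a, b));

        // The platform default (NativePRNG on OS X/Linux) treats setSeed as
        // supplemental entropy, so the output is not reproducible.
        byte[] c = generate(new SecureRandom(), seed);
        byte[] d = generate(new SecureRandom(), seed);
        System.out.println("Default PRNG reproducible: " + Arrays.equals(c, d));
    }

    private static byte[] generate(SecureRandom random, byte[] seed) {
        random.setSeed(seed);
        byte[] out = new byte[16];
        random.nextBytes(out);
        return out;
    }
}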

Licensed under: CC-BY-SA with attribution