CipherInputStream hangs while reading data

Steven :

I'm trying to encrypt/decrypt some files, which I will be reading/writing using FileIn/OutputStreams piped through CipherIn/OutputStreams. Fairly simple in concept, and I've gotten this to work using raw byte arrays and Cipher.doFinal. So I know my encryption parameters (bit size, iv size, etc.) are correct. (Or at least functional?)

I'm able to write data through a CipherOutputStream just fine. However, when I try to read that data back through a CipherInputStream, it hangs indefinitely.

The only related question I've found remains unanswered, and may be fundamentally different from my problem: in my case, all the data is always available on disk, whereas that question involves reading from Sockets.

I've tried a number of solutions, the most obvious one being changing the buffer size (data = new byte[4096];). I've tried a number of values, including the size of the plaintext and the size of the encrypted data. None of these values work. The only solution I've found is avoiding using a CipherInputStream altogether, and instead relying on Cipher.doFinal and Cipher.update.

Am I missing something? It would be very nice to be able to use a CipherInputStream, rather than having to reinvent the wheel by using Cipher.update.

SSCCE:

import java.io.*;
import java.security.*;
import java.util.Random;
import javax.crypto.*;
import javax.crypto.spec.*;

private static final String AES_ALG = "AES_256/GCM/NoPadding";
private static final int GCM_TAG_SIZE = 128;

private static void doEncryptionTest() throws NoSuchAlgorithmException, NoSuchPaddingException, InvalidKeyException,
        InvalidAlgorithmParameterException, FileNotFoundException, IOException
{
    File f = new File("encrypted_random_data.dat");
    // 12-byte long iv
    byte[] iv = new byte[] {0x27, 0x51, 0x34, 0x14, -0x65, 0x4d, -0x67, 0x35, -0x63, 0x11, -0x02, -0x05};
    // 256-bit long key
    byte[] keyBytes = new byte[] {0x55, -0x7f, -0x17, -0x29, -0x68, 0x25, 0x29, 0x5f, -0x27, -0x2d, -0x4d, 0x1b,
            0x25, 0x74, 0x57, 0x35, -0x23, -0x1b, 0x12, 0x7c, 0x1, -0xf, -0x60, -0x42, 0x1c, 0x61, 0x3e, -0x5,
            -0x13, 0x31, -0x48, -0x6e};
    SecretKey key = new SecretKeySpec(keyBytes, "AES");

    OutputStream os = encryptStream(key, iv, f);

    System.out.println("generating random data...");
    // 24MB of random data
    byte[] data = new byte[25165824];
    new Random().nextBytes(data);

    System.out.println("encrypting and writing data...");
    os.write(data);

    os.close();

    InputStream is = decryptStream(key, iv, f);

    System.out.println("reading and decrypting data...");
    // read the data in 4096 byte packets
    int n;
    data = new byte[4096];
    while ((n = is.read(data)) > 0)
    {
        System.out.println("read " + n + " bytes.");
    }

    is.close();
}

private static OutputStream encryptStream(SecretKey key, byte[] iv, File f) throws NoSuchAlgorithmException,
        NoSuchPaddingException, InvalidKeyException, InvalidAlgorithmParameterException, FileNotFoundException
{
    GCMParameterSpec spec = new GCMParameterSpec(GCM_TAG_SIZE, iv);
    Cipher enc = Cipher.getInstance(AES_ALG);
    enc.init(Cipher.ENCRYPT_MODE, key, spec);

    OutputStream os = new CipherOutputStream(new FileOutputStream(f), enc);
    return os;
}

private static InputStream decryptStream(SecretKey key, byte[] iv, File f) throws NoSuchAlgorithmException,
        NoSuchPaddingException, InvalidKeyException, InvalidAlgorithmParameterException, FileNotFoundException
{
    GCMParameterSpec spec = new GCMParameterSpec(GCM_TAG_SIZE, iv);
    Cipher dec = Cipher.getInstance(AES_ALG);
    dec.init(Cipher.DECRYPT_MODE, key, spec);

    InputStream is = new CipherInputStream(new FileInputStream(f), dec);
    return is;
}
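For the record, the doFinal-based fallback I mention above looks roughly like this (a self-contained sketch; the class name, the zero-filled demo key and IV, and the temp-file round trip are just for illustration):

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class GcmDoFinalWorkaround {
    private static final String AES_ALG = "AES_256/GCM/NoPadding";
    private static final int GCM_TAG_SIZE = 128;

    // Encrypts the plaintext to a file in a single doFinal call.
    static void encryptFile(SecretKey key, byte[] iv, Path f, byte[] plain) throws Exception {
        Cipher enc = Cipher.getInstance(AES_ALG);
        enc.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(GCM_TAG_SIZE, iv));
        Files.write(f, enc.doFinal(plain));
    }

    // Reads the whole ciphertext back and decrypts it in a single doFinal call.
    // Only viable while the file fits comfortably in memory.
    static byte[] decryptFile(SecretKey key, byte[] iv, Path f) throws Exception {
        Cipher dec = Cipher.getInstance(AES_ALG);
        dec.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(GCM_TAG_SIZE, iv));
        return dec.doFinal(Files.readAllBytes(f));
    }

    // Round-trips 1 MB through a temp file; returns false on any failure.
    public static boolean roundTrip() {
        try {
            SecretKey key = new SecretKeySpec(new byte[32], "AES"); // all-zero key, demo only
            byte[] iv = new byte[12];                               // all-zero IV, demo only
            byte[] plain = new byte[1 << 20];
            Path f = Files.createTempFile("gcm", ".dat");
            encryptFile(key, iv, f, plain);
            byte[] out = decryptFile(key, iv, f);
            Files.delete(f);
            return Arrays.equals(plain, out);
        } catch (Exception e) {
            return false;
        }
    }
}
```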
frececroka :

It doesn't hang, it's just very slow. The CipherInputStream has a fixed input buffer of size 512, meaning it invokes the Cipher#update(byte[], int, int) method with at most 512 bytes at a time. Decrypting manually with bigger buffer sizes makes it a lot faster.

The reason is that calling update 50 000 times with 512 bytes takes a lot longer than calling it, say, 400 times with 65 kilobytes. I'm not sure why exactly, but there seems to be a constant overhead that you have to pay for every call to update, regardless of the amount of data you pass it.
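The manual approach can be sketched like this (a self-contained example, not your exact code; the 64 KB buffer size, class name, and the zero-filled demo key/IV in the self-test are arbitrary choices):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Arrays;
import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class ManualCipherCopy {

    // Streams 'in' through an already-initialized Cipher into 'out',
    // feeding update() 64 KB at a time instead of CipherInputStream's 512 bytes.
    static void copy(Cipher cipher, InputStream in, OutputStream out) throws Exception {
        byte[] buf = new byte[1 << 16];
        int n;
        while ((n = in.read(buf)) > 0) {
            byte[] chunk = cipher.update(buf, 0, n);
            if (chunk != null && chunk.length > 0) {
                out.write(chunk);
            }
        }
        // For GCM decryption, most of the plaintext shows up here, because the
        // provider buffers the ciphertext internally until the tag can be checked.
        out.write(cipher.doFinal());
    }

    // Round-trips 1 MB through AES/GCM using copy() in both directions;
    // returns false on any failure.
    public static boolean selfTest() {
        try {
            SecretKey key = new SecretKeySpec(new byte[32], "AES");          // demo key
            GCMParameterSpec spec = new GCMParameterSpec(128, new byte[12]); // demo IV
            byte[] plain = new byte[1 << 20];

            Cipher enc = Cipher.getInstance("AES/GCM/NoPadding");
            enc.init(Cipher.ENCRYPT_MODE, key, spec);
            ByteArrayOutputStream ct = new ByteArrayOutputStream();
            copy(enc, new ByteArrayInputStream(plain), ct);

            Cipher dec = Cipher.getInstance("AES/GCM/NoPadding");
            dec.init(Cipher.DECRYPT_MODE, key, spec);
            ByteArrayOutputStream pt = new ByteArrayOutputStream();
            copy(dec, new ByteArrayInputStream(ct.toByteArray()), pt);

            return Arrays.equals(plain, pt.toByteArray());
        } catch (Exception e) {
            return false;
        }
    }
}
```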

Additionally, be aware that you cannot use AES GCM to decrypt large files in a streaming fashion. By design, Sun's implementation of the cipher buffers the entire ciphertext in memory before decrypting it, so that the authentication tag can be verified before any plaintext is released. You'd have to split the plaintext into small enough chunks and encrypt each chunk individually.
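A minimal sketch of that chunking idea follows. The chunk size, the length-prefixed framing, and the IV scheme (a 4-byte prefix plus an 8-byte chunk counter, so every chunk gets a unique IV) are all assumptions of this sketch, not a standard format. Note also that a file truncated at a chunk boundary would decrypt without error; a real format would need an explicit final-chunk marker.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.EOFException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.ByteBuffer;
import java.util.Arrays;
import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class ChunkedGcm {
    static final int CHUNK = 1 << 20; // plaintext bytes per chunk (arbitrary choice)

    // Unique 12-byte IV per chunk: 4 fixed prefix bytes plus an 8-byte chunk counter.
    static GCMParameterSpec spec(byte[] ivPrefix4, long chunkNo) {
        byte[] iv = ByteBuffer.allocate(12).put(ivPrefix4).putLong(chunkNo).array();
        return new GCMParameterSpec(128, iv);
    }

    // Encrypts each chunk with its own doFinal call and writes it length-prefixed.
    static void encrypt(SecretKey key, byte[] ivPrefix4, InputStream in, OutputStream out)
            throws Exception {
        DataOutputStream dout = new DataOutputStream(out);
        byte[] buf = new byte[CHUNK];
        long chunkNo = 0;
        int n;
        while ((n = in.read(buf)) > 0) {
            Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
            c.init(Cipher.ENCRYPT_MODE, key, spec(ivPrefix4, chunkNo++));
            byte[] ct = c.doFinal(buf, 0, n);
            dout.writeInt(ct.length);
            dout.write(ct);
        }
        dout.flush();
    }

    // Reads each length-prefixed chunk and decrypts it; each chunk's tag is
    // verified in its own doFinal call, so only one chunk is buffered at a time.
    static void decrypt(SecretKey key, byte[] ivPrefix4, InputStream in, OutputStream out)
            throws Exception {
        DataInputStream din = new DataInputStream(in);
        long chunkNo = 0;
        while (true) {
            int len;
            try { len = din.readInt(); } catch (EOFException eof) { break; }
            byte[] ct = new byte[len];
            din.readFully(ct);
            Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
            c.init(Cipher.DECRYPT_MODE, key, spec(ivPrefix4, chunkNo++));
            out.write(c.doFinal(ct));
        }
    }

    // Round-trips a few chunks (plus an odd-sized tail); returns false on any failure.
    public static boolean selfTest() {
        try {
            SecretKey key = new SecretKeySpec(new byte[32], "AES"); // demo key
            byte[] prefix = {1, 2, 3, 4};                           // demo IV prefix
            byte[] plain = new byte[3 * CHUNK + 12345];
            ByteArrayOutputStream ct = new ByteArrayOutputStream();
            encrypt(key, prefix, new ByteArrayInputStream(plain), ct);
            ByteArrayOutputStream pt = new ByteArrayOutputStream();
            decrypt(key, prefix, new ByteArrayInputStream(ct.toByteArray()), pt);
            return Arrays.equals(plain, pt.toByteArray());
        } catch (Exception e) {
            return false;
        }
    }
}
```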

See also https://crypto.stackexchange.com/questions/20333/encryption-of-big-files-in-java-with-aes-gcm and "How come putting the GCM authentication tag at the end of a cipher stream require internal buffering during decryption?".
