Information Security Stack Exchange is a question and answer site for information security professionals.

In TrueCrypt I noticed the option to encrypt a volume with multiple encryption algorithms, e.g. AES-Twofish-Serpent. Would it be useful to encrypt something with the same algorithm multiple times, for example AES-AES-AES? I would guess that if a flaw or backdoor in the algorithm were discovered this defense would be useless, but would it make brute-force attacks harder?

Not exactly the same question, but this question is specifically answered here: security.stackexchange.com/a/35846/12 –  Xander 6 hours ago

5 Answers

Yes, it makes a difference. It makes your system more risky.

In answering this question, I'm going to assume that you're implementing an AES-AES cascade using a sensible mode of operation (e.g. CBC or GCM) and with independent keys.

The benefit you seem to be proposing is that you could prevent brute-force attacks by using multiple layers. The problem is that, if an adversary has any chance of breaking a 128-bit key, then having to break three of them makes almost zero difference: you're talking about the difference between 2^128 and 2^129.58 operations. Considering that the limits of computation put the cost of cracking a single 128-bit key somewhere in the region of 1/100th of all the power ever produced by man, that little bit extra barely matters.
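A quick back-of-the-envelope check of that figure: breaking three independent 128-bit keys sequentially costs 3 * 2^128 trials, which is less than two extra bits of work.

```python
import math

single = 2 ** 128
triple = 3 * single  # break three independent 128-bit keys, one after another

extra_bits = math.log2(triple) - math.log2(single)
print(f"3 * 2^128 = 2^{math.log2(triple):.2f}")  # 2^129.58
print(f"extra bits of work: {extra_bits:.2f}")   # 1.58
```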

The only benefit that an AES-AES cascade would bring is that many classes of attack against block ciphers are made more difficult. For example, chosen plaintext attacks rely upon getting a system to encrypt attacker-selected plaintexts, and usually involve comparing the resulting ciphertexts. However, with AES-AES cascades, you can't possibly select the input to the second cipher.

There's a problem, though. Remember when I said I'd assume you'd made sane decisions? That's where things start to fall apart. By increasing the complexity of your system, you increase the number of security decisions you have to make, and with them the potential for security failures and bugs. Do you perform two sequential AES block transforms for each block in CBC mode, or do you encrypt everything with AES-CBC and then encrypt it all again? Have you used two separate IVs? Are you selecting the IVs independently? Are you selecting the keys independently? How are you applying and checking authenticity? How are you going to safely exchange, store, or derive the keys and IVs in your protocol? Are all of your implementations strong against side-channel attacks such as DPA and timing attacks? That's a lot of questions to get right, and a lot of potential areas for failure.
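To make those questions concrete, here is a minimal sketch of a two-layer AES-CBC cascade using the pyca/cryptography library. All names are illustrative; the plaintext is kept block-aligned to sidestep padding, and authentication is deliberately omitted. Note how many independent choices (two keys, two IVs, layer ordering) even this toy version forces on you.

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def cbc_encrypt(key: bytes, iv: bytes, data: bytes) -> bytes:
    """One AES-CBC pass. `data` must be a multiple of 16 bytes here;
    a real implementation would also handle padding and authentication."""
    enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    return enc.update(data) + enc.finalize()

def cbc_decrypt(key: bytes, iv: bytes, data: bytes) -> bytes:
    dec = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
    return dec.update(data) + dec.finalize()

# Independent keys and IVs for each layer -- reusing either would
# silently weaken the cascade.
k1, k2 = os.urandom(16), os.urandom(16)
iv1, iv2 = os.urandom(16), os.urandom(16)

plaintext = b"exactly thirty-two bytes long!!!"  # 32 bytes, no padding needed
layer1 = cbc_encrypt(k1, iv1, plaintext)
layer2 = cbc_encrypt(k2, iv2, layer1)

# Decryption unwraps the layers in reverse order.
assert cbc_decrypt(k1, iv1, cbc_decrypt(k2, iv2, layer2)) == plaintext
```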

Finally, I'd like to remind you of the purpose of cascades: to ensure that a weakness in one cipher doesn't result in a loss of confidentiality of data. By correctly implementing a cascade of, say, AES-Serpent, you ensure that both AES and Serpent need to be broken before the data is compromised.

Your figure of 2^129.58 (= 2^128 + 2^128 + 2^128) assumes you can break each key independently, as if there were a checksum/MAC at each level (which would be a very dumb way of constructing multiple encryption). But if you do EncryptAES(K1, EncryptAES(K2, EncryptAES(K3, P))), that's equivalent to using a 384-bit key: unless you simultaneously guess all 384 bits of the three keys, there's no way to verify a guess. DecryptAES(K, C) will not indicate whether you guessed the key correctly or incorrectly. –  dr jimbob 22 hours ago
Granted, there generally will be a meet-in-the-middle attack that really only gives you 256 bits of security using O(2^128) space. And this isn't something brand new -- it's exactly what was used to strengthen DES (56-bit key): triple DES admits a 2^112-time attack requiring 2^56 space. –  dr jimbob 22 hours ago
I disagree 3ROT13 is obviously superior to ROT13! –  Aron 16 hours ago
@Celeritas, anything involving ROT13 is a joke. –  Corey Ogburn 4 hours ago
@Celeritas ROT13(ROT13(x)) = x, so ROT13^3 = ROT13. He's making a joke. –  Cameron Martin 3 hours ago

Short answer:

It doesn't make brute force any harder, so don't bother. It's also unnecessary if you're using full-size, randomized keys.

Details:

In terms of brute force, using the same encryption algorithm and key multiple times to encrypt data will not change the outcome of the attack. That is to say, if an attack would succeed against data encrypted just once with an algorithm/key, the same attack will succeed against data encrypted multiple times with the same algorithm/key. This is primarily because the component that succumbs to brute force is the key, not the algorithm.

There are two fundamental parts of cryptography that can be vulnerable to attack: the algorithm and the key, and each is vulnerable to different risks. Algorithms can be vulnerable to backdoors and mathematical flaws, neither of which is directly put at risk by a brute-force attack, and neither of which has been discovered in AES using a 128-bit key, even though the algorithm has been fully open to the world. (Note: there are known weaknesses in AES when using 192- and 256-bit keys which, in some settings, ultimately make them weaker than if a 128-bit key had been used; don't worry about this though, since 128 bits is mathematically strong enough -- more about this below.) With not much to attack on the algorithm side, let's look at the other half.

Unlike algorithms, encryption keys are very susceptible to brute-force attacks, but that susceptibility really only matters when the encryption key is generated from non-random data (e.g. a user entering an encryption "password"). In cases like these, the strength of the encryption algorithm cannot be maximized, and brute forcing becomes feasible. For full-size 128-bit keys built from random data, brute-force attacks don't work. Strictly speaking they do work, but the possible key space is too large to search in our lifetimes, even using distributed processing and accounting for Moore's law. A brute-force attack against a 128-bit key is looking for 1 key out of a space of roughly 340 undecillion keys (3.4 × 10^38). Mathematically speaking, it isn't possible to brute-force a key from that space in our lifetimes, or even in the lifetimes of any great-grandchildren you'll ever have (there is some possibility that quantum computing will change this in the future, but that is not certain).
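Those key-space numbers are easy to verify directly:

```python
# The 128-bit key space: ~340 undecillion possible keys.
keyspace = 2 ** 128
print(keyspace)            # 340282366920938463463374607431768211456
print(f"{keyspace:.1e}")   # 3.4e+38
```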

Still, if the "password" used to generate the key is weak (e.g. "P@ssW0rd1"), the key space is not maxed out, and brute force becomes possible there. Moral of the story: randomize encryption keys and make full use of the key space. This eliminates everything but algorithm flaws and backdoors, neither of which I'm concerned about for AES, since the full algorithm has been open to the world for more than a decade. Just keep the key safe and you're safe too.

"using the same encryption algorithm and key multiple times to encrypt data will not change the outcome of the attack" Well, obviously; it will slow down each test a bit, but it won't buy you any significant additional security. Could you perhaps elaborate on what reason you have for stating that the same key is used multiple times? –  Michael Kjörling 10 hours ago

Using the same algorithm multiple times does not necessarily give you as much extra security as you might expect, because it allows "meet-in-the-middle" attacks. Let me quote a part of the referenced link:

"When trying to improve the security of a block cipher, a tempting idea is to simply use several independent keys to encrypt the data several times using a sequence of functions (encryptions). Then one might think that this doubles or even n-tuples the security of the multiple-encryption scheme, depending on the number of encryptions the data must go through.

The Meet-in-the-Middle attack attempts to find a value using both of the range (ciphertext) and domain (plaintext) of the composition of several functions (or block ciphers) such that the forward mapping through the first functions is the same as the backward mapping (inverse image) through the last functions, quite literally meeting in the middle of the composed function."
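The quoted attack can be demonstrated end to end on a deliberately tiny cipher. The 8-bit-key "cipher" below is invented purely for illustration; the point is that recovering both keys of the double encryption costs roughly 2 * 2^8 trials plus a 2^8-entry table, rather than the 2^16 trials naive double encryption suggests.

```python
MASK = 0xFFFFFFFF

def subkey(k):
    # arbitrary invented key expansion (odd multiplier, so injective mod 2^32)
    return (k * 2654435761) & MASK

def E(k, p):
    a = subkey(k)
    return ((p ^ a) + a) & MASK

def D(k, c):
    a = subkey(k)
    return ((c - a) & MASK) ^ a

k1, k2 = 42, 200                 # the secret 8-bit keys
p1, p2 = 0xDEADBEEF, 0x12345678  # two known plaintext blocks
c1, c2 = E(k2, E(k1, p1)), E(k2, E(k1, p2))

# Forward pass: tabulate every possible "middle" value (2^8 encryptions).
middle = {}
for ka in range(256):
    middle.setdefault(E(ka, p1), []).append(ka)

# Backward pass: decrypt c1 under every guess for the outer key and
# look for a value that "meets in the middle" with the forward table.
recovered = None
for kb in range(256):
    for ka in middle.get(D(kb, c1), []):
        # confirm the candidate pair against a second known block
        if E(kb, E(ka, p2)) == c2:
            recovered = (ka, kb)

print(recovered)  # a key pair consistent with both known blocks
```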

The old-fashioned TripleDES algorithm did it the following way:

ciphertext = E_K3(D_K2(E_K1(plaintext))), plaintext = D_K1(E_K2(D_K3(ciphertext)))

where K1, K2, K3 are independent keys, E is the encryption and D the decryption function.
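A sketch of that EDE (encrypt-decrypt-encrypt) structure, using single-block AES-ECB via the pyca/cryptography library as a stand-in for the obsolete DES primitive; the helper names are illustrative:

```python
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def enc(k, block):
    # single-block encryption (AES standing in for DES)
    return Cipher(algorithms.AES(k), modes.ECB()).encryptor().update(block)

def dec(k, block):
    return Cipher(algorithms.AES(k), modes.ECB()).decryptor().update(block)

def ede_encrypt(k1, k2, k3, block):
    return enc(k3, dec(k2, enc(k1, block)))

def ede_decrypt(k1, k2, k3, block):
    return dec(k1, enc(k2, dec(k3, block)))

k1, k2, k3 = b"A" * 16, b"B" * 16, b"C" * 16
block = b"0123456789abcdef"

assert ede_decrypt(k1, k2, k3, ede_encrypt(k1, k2, k3, block)) == block

# With all keys equal, EDE collapses to a single encryption -- the
# middle decryption cancels the first encryption. This is how Triple
# DES hardware stayed compatible with legacy single-DES peers.
assert ede_encrypt(k1, k1, k1, block) == enc(k1, block)
```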

But you are probably looking for cascades to strengthen the encryption algorithm. TrueCrypt provides this by implementing the following cascades (more details here):

  • AES-Twofish
  • AES-Twofish-Serpent
  • Serpent-AES
  • Serpent-Twofish-AES
  • Twofish-Serpent

The idea behind is that if there are any weaknesses revealed in the future, your data is still being protected by a second (or third) independent encryption algorithm.

Another technique is protecting the random key in the volume header by using a salt and by increasing the number of key-derivation iterations (e.g. to well over 1000). The salt protects against rainbow-table attacks by increasing the number of combinations a brute-force attack has to try, while a higher iteration count protects you by "slowing down" the derivation: you have to loop through all iterations in order to decrypt the header, so an attacker gets fewer tries per second and the brute force runs longer.
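A minimal sketch of that salt-plus-iterations idea using Python's standard-library PBKDF2; the password and parameter values are illustrative, not TrueCrypt's exact ones:

```python
import hashlib, os

password = b"correct horse battery staple"
salt = os.urandom(16)   # stored alongside the header; random, not secret
iterations = 200_000    # modern iteration counts are far above 1000

# Each brute-force guess must repeat all iterations, slowing attackers.
key = hashlib.pbkdf2_hmac("sha256", password, salt, iterations, dklen=32)

# Same password + salt + iterations always derives the same key;
# a different salt yields a completely different key.
assert key == hashlib.pbkdf2_hmac("sha256", password, salt, iterations, dklen=32)
assert key != hashlib.pbkdf2_hmac("sha256", password, os.urandom(16), iterations, dklen=32)
```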

I guess that's what I don't get: how is doing "iterations" different than what I described, for example applying AES twice? –  Celeritas 6 hours ago

Using the same algorithm twice does not provide significant additional security, because you would be vulnerable to a meet-in-the-middle attack. Using the same algorithm three times does give you additional security, roughly equivalent to doubling the key size. You can use the same key for the first and last of the three encryptions; using only two keys that way is probably just as secure as using three different keys.

This approach with 2 or 3 keys has been used with the DES algorithm and is known as Triple DES.

A simpler and much faster approach is to apply XOR with a constant before and after applying the cipher. This approach has been used with DES and is known as DES-X.

A generalization could alternate between XORing with a key and enciphering with a key, starting and ending with XOR. By setting the keys appropriately, this generalized approach reduces to Triple DES or DES-X. But this generalized approach has not had much analysis.
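A sketch of the DES-X-style whitening described above, again with single-block AES-ECB (via the pyca/cryptography library) standing in for DES and illustrative key names: XOR a pre-whitening key into the block before the cipher, and a post-whitening key after it.

```python
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def desx_style_encrypt(k, k1, k2, block):
    # C = k2 XOR E_k(P XOR k1)
    core = Cipher(algorithms.AES(k), modes.ECB()).encryptor().update(xor(block, k1))
    return xor(core, k2)

def desx_style_decrypt(k, k1, k2, block):
    # P = k1 XOR D_k(C XOR k2)
    core = Cipher(algorithms.AES(k), modes.ECB()).decryptor().update(xor(block, k2))
    return xor(core, k1)

k, k1, k2 = b"K" * 16, b"1" * 16, b"2" * 16
block = b"0123456789abcdef"

assert desx_style_decrypt(k, k1, k2, desx_style_encrypt(k, k1, k2, block)) == block
```

The two XORs cost almost nothing per block, which is why DES-X was an attractive cheap hardening of DES against brute force.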

In the case of AES, applying any of the above approaches would be equivalent to using AES with a different key schedule. This is not generally true of every cipher; it is specific to the structure of AES.

The implication of this is that any of the above approaches would really only help if the key size of AES were too small, or if there were some weakness in the standardized key schedule of AES. In other words, applying AES-128 three times with different keys should provide about as much security as AES-256, assuming no weakness in the AES key schedule.

However, there are known weaknesses in the AES key schedule, and in particular AES-256 has been shown to be much weaker than you would expect from a 256-bit key. Thus applying AES-128 three times with different keys is, to the best of my knowledge, more secure than AES-256.


I agree that the benefits of multiple encryption are small, that there are several potential pitfalls in doing it correctly (without falling back to the security of single encryption), and that there is a CPU cost to the extra encryption. However, I do not think it is a security anti-pattern like inventing your own cipher.

Yes, whenever you add complexity to a system you may inadvertently open yourself up to side-channel attacks. Granted, the application code to do this need not be unwieldy or complex.

In my opinion, using E_AES(K1, E_AES(K2, E_AES(K3, P))) as your block cipher doesn't open you up to more attacks than using E_AES(K1, P) as your block cipher.

On the surface, you have just exchanged a block cipher with a 128-bit key for one with a 384-bit key, so one could claim that you are much more secure with the new scheme.

Granted there are caveats:

  1. You should be careful that it is not possible to break your keys sequentially. If it's possible to verify that you broke the first key, then instead of a 384-bit key you only have about 129.6 bits of security. You should treat E(K1, E(K2, E(K3, P_block))) as your block cipher and apply your block cipher mode of operation (e.g., CBC with padding) on top of this new cipher. A mistake would be to use E_CBC(K1, E_CBC(K2, E_CBC(K3, P_message))), where E_CBC is AES-CBC that adds padding as necessary according to the PKCS#7 padding scheme and operates on your entire message. Note that the PKCS#7 padding scheme adds a full block (128 bits = 16 bytes for AES) of padding when the plaintext is an exact multiple of the block length. Thus for the outer two CBC-encryption steps, you will append known 128-bit padding blocks to the end of the message before encrypting. Hence an attacker can solve for K1 and K2 by trying to encrypt the padding block with all 2^128 keys until they find the two keys that match these last two blocks.

  2. If an attacker knows or can guess a block of plaintext that corresponds to a block of ciphertext, then there's a meet-in-the-middle attack that takes O(2^128) space and O(2^256) time. The meet-in-the-middle attack is why triple DES is three rounds of encryption rather than just double DES. It very often is possible for an attacker to reasonably guess one block, so it's more accurate to say that in practice this triple-AES scheme would only provide 256-bit security.

  3. It's not really true to say 256-bit encryption (time to brute force of 2^256) is more secure than 128-bit encryption (brute-force time of 2^128); both are well outside the limits of attackers. 2^128 = 340 282 366 920 938 463 463 374 607 431 768 211 456 (about 340 billion billion billion billion). That's a billion people on Earth each operating a million computers, with each computer trying a billion keys a second, for a million years, before having a 10% chance of brute-forcing the key. Thus brute-forcing is very much out of the question at 2^128, so discussion of such attacks is irrelevant for practical purposes unless there are major advances in our computational abilities. Other attacks must be used instead: keyloggers, stealing keys from RAM, the $5 wrench, or flaws in the block cipher. So even though 2^256 is way bigger than 2^128 (2^128 times bigger, in fact), it's largely irrelevant. It's sort of like comparing your ability to travel to Alpha Centauri (4.3 light-years away) versus your ability to travel to the Andromeda galaxy (2.5 million light-years away). (For reference, the Moon is only 0.000 000 04 light-years away.) Yes, the Andromeda galaxy is much further away, but it's not clear that travelling there is actually harder in practice, as both are currently impossible. Sure, it seems that if humanity ever travels to other stars we'll probably reach Alpha Centauri first, but it's not known whether we ever will, and doing so would require some incredible breakthrough that could possibly make travelling to other galaxies quite easy as well, bringing both tasks within reach.
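The PKCS#7 behaviour exploited in point 1 above can be checked directly with the pyca/cryptography padding helpers: a plaintext that is already an exact multiple of the block size gains a full, entirely predictable 16-byte padding block.

```python
from cryptography.hazmat.primitives import padding

# PKCS#7 with a 128-bit (16-byte) block size, as used with AES-CBC.
padder = padding.PKCS7(128).padder()
padded = padder.update(b"A" * 16) + padder.finalize()

print(len(padded))   # 32 -- a whole extra block was appended
print(padded[16:])   # b'\x10' * 16 -- every pad byte is the pad length
```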

