In TrueCrypt I noticed the option to encrypt a volume with multiple encryption algorithms, e.g. AES-Twofish-Serpent. Would it be useful to encrypt something with the same algorithm multiple times, for example AES-AES-AES? I would guess this defense would be useless if a flaw or backdoor in the algorithm were discovered, but would it make brute-force attacks harder?
---
Yes, it makes a difference: it makes your system more risky.

In answering this question, I'm going to assume that you're implementing an AES-AES cascade using a sensible mode of operation (e.g. CBC or GCM) and with independent keys. The benefit you seem to be proposing is that multiple layers would prevent brute-force attacks. The problem is that, if an adversary has any chance of breaking a 128-bit key, then having to break three of them makes almost zero difference. You're talking about the difference between 2^128 and 2^129.58 operations. Considering that the limits of computation put the cost of cracking a 128-bit key somewhere in the region of 1/100th of all the power ever produced by man, that little bit extra barely matters.

The only benefit an AES-AES cascade would bring is that many classes of attack against block ciphers become more difficult. For example, chosen-plaintext attacks rely upon getting a system to encrypt attacker-selected plaintexts, and usually involve comparing the resulting ciphertexts. With an AES-AES cascade, however, you can't select the input to the second cipher.

There's a problem, though. Remember when I said I'd assume you'd made sane decisions? That's where things start to fall apart. By increasing the complexity of your system, you increase the number of security decisions you have to make, and with it the potential for security failures and bugs. Do you perform two sequential AES block transforms for each block in CBC, or do you encrypt everything with AES-CBC and then encrypt it all again? Have you used two separate IVs? Are you selecting IVs independently? Are you selecting keys independently? How are you applying and checking authenticity? How are you going to safely exchange, store, or derive the keys and IVs in your protocol? Are all of your implementations strong against side-channel attacks such as DPA and timing attacks?
That's a lot of questions to get right, and a lot of potential areas for failure.

Finally, I'd like to remind you of the purpose of cascades: to ensure that a weakness in one cipher doesn't result in a loss of confidentiality of data. By correctly implementing a cascade of, say, AES-Serpent, you ensure that both AES and Serpent must be broken before the data is compromised.
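The work-factor figures in this answer are easy to check. A quick sketch in Python, under the answer's model that each of the three independent 128-bit keys can be attacked in sequence (so the total is 3 × 2^128 guesses, not 2^384):

```python
# Breaking three independent 128-bit layers, attacked in sequence,
# costs at most 3 * 2^128 guesses -- only ~2^129.58, not 2^384.
import math

single = 2 ** 128        # worst-case guesses for one 128-bit key
triple = 3 * single      # three independent keys, one after another

print(math.log2(triple))  # ~129.58
```

The extra ~1.58 bits of work is why the answer calls the brute-force benefit negligible.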
---
Short answer: it doesn't make brute force any harder, so don't bother. It's also unnecessary if you use full-size, randomly generated keys.

Details: in terms of brute force, encrypting data multiple times with the same algorithm and key will not change the outcome of the attack. That is to say, if an attack was going to succeed against data encrypted just once by an algorithm/key, the same attack will succeed against data encrypted multiple times with that same algorithm/key. This is because of which cryptographic component actually succumbs to the brute force (hint: it's not the algorithm).

There are two fundamental parts of cryptography that can be attacked, the algorithm and the key, and each is vulnerable to different risks. Algorithms can be vulnerable to backdoors and mathematical flaws, neither of which is directly at risk from a brute-force attack, and neither of which has been discovered in AES with a 128-bit key, even though the algorithm is fully open to the world. (Note: there is a flaw in the AES key schedule when using 192- and 256-bit keys which ultimately makes them weaker than if a 128-bit key had been used. Don't worry about this, though, since 128 bits is mathematically strong enough; more on this below.)

With not much to attack on the algorithm side, let's look at the other half. Unlike algorithms, encryption keys are very susceptible to brute-force attacks, but that susceptibility only really exists when the key is generated from non-random data (i.e. a user entering an encryption "password"). In cases like these, the strength of the encryption algorithm cannot be maximized, and brute-forcing becomes possible. For full-size 128-bit keys built from random data, brute-force attacks don't work in practice: the key space is too large to search in our lifetimes, even with distributed processing and accounting for Moore's law.
Brute-force attacks against 128-bit keys are looking for 1 key out of a possible 340 undecillion (3.4 × 10^38). Mathematically speaking, it isn't possible to brute-force a key from that space in our lifetimes, or those of any great-grandchildren you'll ever have (there is some possibility that quantum computing will change this in the future, but that is not certain). Still, if the "password" used to generate the key is weak (e.g. "P@ssW0rd1"), the key space is not maxed out, and brute force becomes possible there.

Moral of the story: randomize encryption keys and make full use of the key space. This eliminates everything but algorithm flaws and backdoors, neither of which I'm concerned about for AES, since the full algorithm has been open to public scrutiny for more than a decade. Just keep the key safe and you're safe too.
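The key-space arithmetic above checks out. A back-of-the-envelope calculation, with a deliberately generous (hypothetical) guessing rate:

```python
# A 128-bit key space holds 2^128 ~ 3.4 * 10^38 keys.  Assume an
# absurdly powerful attacker: a billion machines, each testing a
# trillion keys per second (10^21 guesses/second total).
keyspace = 2 ** 128
rate = 10 ** 12 * 10 ** 9            # guesses/second, hypothetical
seconds_per_year = 31_557_600

# On average you find the key after searching half the space.
years = keyspace / 2 / rate / seconds_per_year

print(f"{keyspace:.3e} keys")        # ~3.403e+38
print(f"{years:.1e} years")          # billions of years on average
```

Even under those assumptions the expected search time runs to billions of years, which is the "not in our lifetimes" claim in concrete numbers.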
---
Using the same algorithm multiple times does not necessarily give you as much extra security as you might expect, because it allows "meet in the middle" attacks. Let me quote a part from the referenced link:
The old-fashioned Triple DES algorithm did it the following way:

C = E_K3(D_K2(E_K1(P)))

where K1, K2, K3 are independent keys, E is the encryption and D the decryption function. But you are probably looking at cascades to strengthen the encryption algorithm. TrueCrypt provides this by implementing several such cascades, AES-Twofish-Serpent among them:
The idea behind this is that if any weaknesses in one algorithm are revealed in the future, your data is still protected by a second (or third) independent encryption algorithm. Another technique is protecting the random key in the header by using a salt and by increasing the number of key-derivation iterations to more than 1000. The salt protects against rainbow-table attacks by increasing the number of combinations an attacker has to try in a brute-force attack, while increasing the iteration count "slows down" the derivation: you have to loop through all the iterations in order to decrypt the header, so the attacker gets fewer tries per second and the brute force takes longer.
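The salt-plus-iterations idea can be sketched with Python's built-in PBKDF2 (the password, salt size, and iteration count here are illustrative, not TrueCrypt's actual parameters):

```python
import os
import hashlib

# A random salt defeats precomputed (rainbow) tables, and a high
# iteration count slows down every guess an attacker makes.
salt = os.urandom(64)
iterations = 200_000   # the answer says ">1000"; modern advice is far higher

header_key = hashlib.pbkdf2_hmac("sha512", b"volume password",
                                 salt, iterations)

# Same password + same salt + same iteration count -> same key,
# which is what lets the legitimate owner decrypt the header.
again = hashlib.pbkdf2_hmac("sha512", b"volume password",
                            salt, iterations)
assert header_key == again
```

A different salt would yield an unrelated key from the same password, so no precomputed table can be reused across volumes.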
---
Using the same algorithm twice does not provide significant additional security: you would be vulnerable to a meet-in-the-middle attack. Using the same algorithm three times does give you additional security, roughly equivalent to doubling the key size.

You can use the same key for the first and last of the three encryptions; using only two keys that way is probably about as secure as using three different keys. This approach with 2 or 3 keys has been used with the DES algorithm and is known as Triple DES. A simpler and much faster approach is to apply XOR with a constant before and after applying the cipher; this has also been used with DES and is known as DES-X. A generalization could alternate between XORing with a key and enciphering with a key, starting and ending with XOR. By setting keys appropriately, this generalized approach reduces to Triple DES or DES-X, but it has not had much analysis.

In the case of AES, applying any of the above approaches would be equivalent to using AES with a different key schedule. This is not true of every cipher; it is specific to the structure of AES. The implication is that any of these approaches would really only help if the key size of AES were too small, or if there were some weakness in the standardized key schedule of AES. In other words, applying AES-128 three times with different keys should provide about as much security as AES-256, if there were no weakness in the AES key schedule. However, there are known weaknesses in the AES key schedule, and in particular AES-256 has been shown to be much weaker than you would expect from a 256-bit key. Thus applying AES-128 three times with different keys is, to the best of my knowledge, more secure than AES-256.
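The meet-in-the-middle attack mentioned above can be demonstrated on a made-up 8-bit "cipher" (my own toy construction, emphatically not a real cipher): double encryption with two 8-bit keys falls in roughly 2 × 2^8 table operations instead of the naive 2^16 of trying every key pair.

```python
# Toy invertible "cipher" on one byte; 183 = 7^-1 mod 256.
def enc(k: int, p: int) -> int:
    return (((p ^ k) * 7) + k) % 256

def dec(k: int, c: int) -> int:
    return (((c - k) * 183) % 256) ^ k

# Two known plaintext/ciphertext pairs under unknown keys k1, k2.
k1, k2 = 173, 94
pairs = [(p, enc(k2, enc(k1, p))) for p in (17, 200)]

# Phase 1: encrypt the first plaintext under every possible inner key,
# storing the "middle" values (2^8 work, 2^8 memory).
p0, c0 = pairs[0]
middle = {}
for a in range(256):
    middle.setdefault(enc(a, p0), []).append(a)

# Phase 2: decrypt the first ciphertext under every possible outer key
# and meet in the middle; filter candidates with the second pair.
found = []
for b in range(256):
    for a in middle.get(dec(b, c0), []):
        if all(enc(b, enc(a, p)) == c for p, c in pairs):
            found.append((a, b))

assert (k1, k2) in found
```

The same idea is what reduces two-key double DES to little more than single-DES strength, and why Triple DES needs three stages to earn its extra security.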
---
I agree that the benefits of multiple encryption are small, that there are several potential pitfalls to get right (without falling back to the security of single encryption), and that there is a CPU cost to the extra encryption. However, I do not think it is a security anti-pattern like inventing your own cipher. Yes, whenever you add complexity to a system you may inadvertently open yourself up to side-channel attacks. Granted, the application code to do this need not be unwieldy or complex. In my opinion, using EAES(K1, EAES(K2, EAES(K3, P))) as your block cipher doesn't open you up to more attacks than having used EAES(K1, P) as your block cipher. On the surface, you have just replaced a block cipher with a 128-bit key by one with a 384-bit key, so on the face of it one could claim that you are much more secure with the new scheme. Granted, there are caveats.
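The cascade-as-one-block-cipher idea can be sketched in a few lines. Python's standard library has no AES, so a toy 8-bit permutation (my own stand-in, not a real cipher) plays the role of E_AES; the point is the composition logic, keyed by three independent keys:

```python
# Toy stand-in for a block transform; 183 = 7^-1 mod 256.
def enc(k: int, p: int) -> int:
    return (((p ^ k) * 7) + k) % 256

def dec(k: int, c: int) -> int:
    return (((c - k) * 183) % 256) ^ k

def cascade_enc(keys, p):
    # E(K1, E(K2, E(K3, p))): the innermost key is applied first.
    for k in reversed(keys):
        p = enc(k, p)
    return p

def cascade_dec(keys, c):
    # Undo the layers in the opposite order.
    for k in keys:
        c = dec(k, c)
    return c

keys = (11, 202, 57)   # three independent keys = one larger composite key
for p in range(256):
    assert cascade_dec(keys, cascade_enc(keys, p)) == p
```

From the caller's point of view the cascade is just another block cipher whose key is the concatenation of the three layer keys, which is the "128-bit key to 384-bit key" framing above.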