What encryption settings to use - TrueCrypt and FreeOTFE

Discuss anything related to portable freeware here.
User avatar
webfork
Posts: 9642
Joined: Wed Apr 11, 2007 8:06 pm
Location: US, Texas
Contact:

What encryption settings to use - TrueCrypt and FreeOTFE

#1 Post by webfork » Sat Jul 31, 2010 8:25 pm

Of the two "transparent" encryption programs, there are a lot of settings to choose from. Although the defaults are very secure for almost any application, here are some recommendations:

FreeOTFE
  • Fastest: Looking online, the fastest settings for FreeOTFE based on this article appear to be AES 128 in CTR mode with an MD5 hash.
  • Most secure: Probably the most secure setting is also one of the slower options: Serpent 256 in XTS mode with SHA-512.

TrueCrypt
  • Fastest: AES (standard settings)
  • Most secure: Serpent-Twofish-AES, though this is very likely overkill. AES and Serpent are based on similar math, so if one is compromised at a core level, the other likely will be as well. Twofish-Serpent should be more than enough.
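As a rough sense of why the triple cascade is overkill: TrueCrypt's cascade modes use an independent key for each cipher, so an attacker forced to brute-force the whole cascade faces the product of the keyspaces. A back-of-the-envelope sketch in Python (variable names are illustrative):

```python
import math

# Each cipher in a TrueCrypt cascade uses its own independent 256-bit key.
single_keyspace = 2 ** 256

# AES-Twofish-Serpent: three independent keys, so the joint keyspace
# is the product of the three individual keyspaces.
cascade_keyspace = single_keyspace ** 3

print(math.log2(single_keyspace))   # 256.0 bits
print(math.log2(cascade_keyspace))  # 768.0 bits
```

Since a single 256-bit keyspace is already far beyond any conceivable brute-force search, the extra 512 bits buy essentially nothing in practice, which is the sense in which the cascade is overkill.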

-.-
Posts: 325
Joined: Mon Oct 06, 2008 4:32 pm

Re: What encryption settings to use

#2 Post by -.- » Sun Aug 01, 2010 8:26 am

I go with fastest...

I don't expect to be hiding top-secret documents, so security isn't too big of an issue. I just don't want people who find my USB stick, if I forget it somewhere, to see what's inside. Of the people I'm normally around, I doubt anyone would be able to get into it, and even if they did, it's just family photos, term papers, and such (I'd prefer strangers not see them, but it won't kill anyone if they do).

So with what's on the stick and how often I use it, I go with fastest: even if I'd prefer it to be secure, I hate waiting even more, especially when all I'm protecting is a few school files and pictures.

lyx
Posts: 84
Joined: Mon Feb 15, 2010 1:23 am

Re: What encryption settings to use

#3 Post by lyx » Sun Aug 01, 2010 10:27 pm

For most personal (and even corporate) scenarios, the password, not the encryption scheme, is the weakest link, because the password almost always contains fewer bits of information than the generated encryption key. Unless you're up against an adversary with LOTS of time and resources, it will almost always be the password that is attacked, not the crypto. And remember that should a cipher be broken, these apps let you quickly copy your data into a container that uses a different one.

So unless you're using a keyphrase/keyfile AND are up against a massive adversary, any of the provided encryption schemes should be sufficient.
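The password-versus-key gap is easy to quantify. A sketch in Python (the password length and character-set size are illustrative assumptions):

```python
import math

# A randomly chosen 10-character password drawn from the 95 printable
# ASCII characters carries log2(95^10) bits of entropy...
password_bits = 10 * math.log2(95)

# ...while the container's actual encryption key is 256 random bits.
key_bits = 256

print(round(password_bits, 1))  # ≈ 65.7 bits
print(key_bits)                 # 256 bits
```

At roughly 6.6 bits per character, it would take about 39 truly random printable characters to match a 256-bit key, which is why the password, not the key, is what gets attacked.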

If you ARE already using a keyphrase/keyfile and want additional cryptographic strength beyond that, then for low-priority data I'd go with fastest (AES, whose decryption code, at least in TrueCrypt, is strongly optimized). For data that is very private and doesn't require much bandwidth, I'd go with Serpent, simply because it seems to be a very robust and "careful" encryption scheme: its structure seems to offer little potential for optimization or "shortcuts" (the number of rounds involved alone should already slow down any brute-force attack). Serpent's mentality is pretty much the opposite of AES (which mathematically has a very "neat" structure).

Without going into the mathematical details, imagine it like this: AES (Rijndael) is a very simple cipher whose protection and speed rely on novelty and ingenuity; should a fault in this simple "ingenious" structure be found, it would collapse quickly. Serpent, on the other hand, is a more conventional cipher whose protection (and slowness) relies more on adding lots of effort (it uses twice as many rounds (32) as was deemed sufficient), making it a cipher that, should a crack be found, would be cracked slowly over time, one round at a time, instead of in some kind of sudden "landslide".
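The round-count point can be put in numbers. The round counts below are the published parameters of each cipher; the "work scales linearly with rounds" model is a deliberate simplification of the post's intuition:

```python
# Serpent always runs 32 rounds; AES with a 256-bit key runs 14.
# Under a crude "work per block scales with rounds" model, each trial
# decryption in a brute-force search costs proportionally more.
serpent_rounds = 32
aes256_rounds = 14

slowdown = serpent_rounds / aes256_rounds
print(f"{slowdown:.2f}x")  # ≈ 2.29x more work per guessed key
```

Real per-round costs differ between the two ciphers (Serpent's rounds are individually cheaper), so this is only the rough shape of the "more rounds slow down brute force" argument, not a benchmark.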

P.S.: DO make backups, no matter what!

P.S.2: In defense of Rijndael (AES), all attacks known today that strongly weaken it require some kind of rootkit to be present on the machine during encryption, which is a pointless attack: if the machine is already infected, everything is lost anyway (why crack anything? just read the data while it is decrypted, that is, while you work with it). In general, as soon as your machine (or even just your room) is compromised, no crypto can help you anymore, which is why surveillance is the second most frequent attack on encrypted data, right after password attacks. The crypto itself is rarely attacked in practice.


Re: What encryption settings to use

#4 Post by webfork » Mon Aug 02, 2010 6:55 am

> the password - not the encryption-scheme - is the weakest link

Very true.

> The crypto itself in practice is rarely attacked.

That's also been my understanding, but not knowing what types of computers or what sorts of attacks will appear in the future, it's probably a decision between what you want to keep secret for the next 5-10 years and what you want to keep secret for the next 50-100 years.


Re: What encryption settings to use

#5 Post by lyx » Mon Aug 02, 2010 4:11 pm

Well, I guess the long-term thing is partly a matter of faith in progress. Personally, knowing the physical background (including things not covered by the standard model), I'm very skeptical of any breakthrough in computing in the next 10-20 years, mainly because the entire idea of what "programming" and "a computer" are has almost reached physical limits. Getting more computing power either requires simply adding more of the same (more "cores"), or a root-level overhaul, which is not just a matter of making a little invention and then continuing business as usual, but rather requires a restart from scratch in terms of worldview, mentality, what a computer is, and what "programming" is. If history is any example, humans are very reluctant to do something like that.

But I guess that is more of a philosophical topic rather than a "portable freeware" topic :)


Re: What encryption settings to use

#6 Post by webfork » Tue Aug 03, 2010 9:17 am

> lyx wrote: But i guess that is more of a philosophical topic, rather than a "portable freeware" topic :)

For the question of when computers will hit a ceiling, and for speculation on what they will look like in 10 to 20 years, absolutely; that is a very difficult one to tackle.


Re: What encryption settings to use

#7 Post by lyx » Tue Aug 03, 2010 8:46 pm

The symptoms, however, are quite obvious:

- Our current idea of programming is hardwired to the von Neumann architecture, which doesn't just imply a schism between "processor" and "processed" (thus creating the memory bottleneck), but also single-threading. It's that age-old idea of a singular subject/god that commands slaves/objects around, like mathematics. This is also why our programs must either be perfect or fail totally. Current processors may have multiple cores next to each other, and current programming languages may allow multithreading, but this is all just patchwork: the whole mentality and approach is based on single-threading, not just at the software level but also at the hardware level. The CPU has to babysit EVERYTHING. This isn't something that can be redesigned "on top"; backwards compatibility would need to be axed from the bottom up (slow emulation of the current scheme may be possible, but who wants to buy a computer that runs current apps slower yet costs more?).

- Related to the above: memory is addressed centrally in 1D, but data structures become less and less one-dimensional, and addressing systems get layered on top of each other and become more and more distributed. Thus, the number of pointers needed to emulate multidimensional data structures and virtual memory explodes. Application routines now consist primarily of pointers, and the CPU is mostly busy jumping around memory locations for no reason except to emulate nD data and the virtual memory that OSes and CPUs are supposed to provide but simply do NOT provide (current vmem is good for nothing besides swapping to disk).

- CPUs are small electronic circuit systems. Heck, ALL our electronic systems are based on circuits. This means that signals travel along clearly defined "channels", like water flowing through pipes. But what if you make the pipes smaller and smaller, until you arrive at a point where the thickness of the walls is thinner than paper? Right, the water/signals leak out of the pipes. This isn't about the future: they are already thin enough to leak right now on the machine on which you're reading this post. It's just that the leakage is still low enough to notice when it happens and then just retry. When you continue to make them even thinner, the leakage eventually becomes greater than the non-leakage, not to mention that the temperature becomes so high that the pipes may no longer be able to tolerate the heat. This isn't a problem that can be fixed with some magical new invention; it's unfixable for circuit systems.
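The memory-addressing point above, that hardware provides flat 1D memory and everything multidimensional is emulated on top of it, can be sketched concretely (the names are illustrative):

```python
# Hardware memory is one flat 1D array; a "2D array" is just an address
# calculation layered on top of it.
rows, cols = 3, 4
flat = list(range(rows * cols))  # the 1D memory the machine really has

def at(r, c):
    # Row-major emulation: one multiply and one add, no pointer chasing.
    return flat[r * cols + c]

# The alternative: a pointer per row, so every access costs an extra
# indirection (the "explosion of pointers" described above).
nested = [[at(r, c) for c in range(cols)] for r in range(rows)]

print(at(2, 3), nested[2][3])  # both yield 11
```

The index-arithmetic version keeps data contiguous and cache-friendly; the pointer-per-row version is what most higher-level structures degenerate into, which is the overhead the post is complaining about.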

With the above I don't mean to imply that it's generally impossible to increase practical performance. For starters, with the current system you will see CPUs that are multiple times faster than current ones; there is some room for improvement left, but the brick wall is already on the horizon. At that point, humans can just make computers larger rather than smaller (i.e., by adding more cores); the power usage and heat will become ridiculous (you know, even more than now), but it is doable if one doesn't care much about efficiency and sanity. After that buffer is also depleted, one may still get multiple times more speed in practice simply by writing applications more efficiently than the current bloat of a hundred middlewares stuck together and on top of each other, all of course programmed with 20 abstraction layers. Another alternative would be faster FPGAs that can reconfigure at runtime to implement code routines in hardware (though at that point we're already leaving the von Neumann concept, and "programming" changes).

Beyond this, the circuit-based von Neumann machine would be at the end of the road, conceptually (manageability) and physically. Doing something else would be so different that, for practical purposes, it wouldn't even be a computer as we know it anymore. Who wants to redo the whole IT world from scratch? Not gonna happen soon :)


Re: What encryption settings to use

#8 Post by webfork » Thu Dec 30, 2010 11:42 am

Another old thread update:

Although AES is already a very fast algorithm, TrueCrypt now supports hardware acceleration, which will likely improve speed as well as reduce power consumption.
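On Linux you can check whether the CPU advertises the AES instructions that this hardware acceleration relies on. A best-effort sketch (the helper name is made up, and the /proc/cpuinfo path is Linux-specific):

```python
def has_aes_ni():
    """Return True if /proc/cpuinfo lists the 'aes' CPU flag (Linux only)."""
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    # The kernel exposes AES-NI support as the 'aes' flag.
                    return "aes" in line.split()
    except OSError:
        pass  # non-Linux or unreadable: report no support detected
    return False

print(has_aes_ni())
```

If the flag is absent, TrueCrypt falls back to its software AES implementation, so containers remain usable either way, just slower.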
