Cryptography: A Belt We Can't Seem To Put On Right
by Justin Troutman
It certainly has been a while; in fact, the last time found me in Colorado, toughing out the ridiculously beyond-cold weather of the Rockies, but managing to have a great time, and laugh about it, nonetheless. Now, my better half and I are in her hometown of Uberlandia, Minas Gerais, Brazil. I'll stop myself from turning this into a piece on all the goodness one can experience in a Brazilian minute, but let me just say this: You owe it to your taste buds to indulge in the ambrosial bliss that açaí is. Be liberal and have two bowls. It's Dessert 2.0. I hope they serve it in the sweet by-and-by.
Okay, so on to the topic at hand - Windows Vista's BitLocker, which encrypts all data on the system volume. At a glance, this might not seem like the logical place to discuss matters Vista-centric, but what follows is applicable to any instance where cryptography is required. As much as it is about BitLocker, it's about much more. First, before continuing, you might want to take a look at the original draft of an article I wrote, entitled, "On Shifting 'Windows' and 'Security' from Less Antonymous to More Synonymous." An adaptation, "BitLocker and the Complexities of Trust," appears in the October 2007 issue of Microsoft TechNet Magazine.
Phil Zimmermann was kind enough to provide some commentary for the article, which I assume you've read by this point. Perhaps the most resounding proverb is, "Design as if making a mistake will cost someone's life." This brings me to my questions for you guys and gals. What do you expect out of cryptography? Are there any general goals you think it should achieve? What is "good cryptography" to you? How do you feel about open-source versus closed-source when it comes to cryptographic implementations? Are there any examples, that stand out to you, of where cryptography is being done the right way?
What I've come to find, over and over again, is a perpetual state of failure within cryptographic implementation; that is, when cryptography fails in practice, it's almost never because of the cryptography itself, but, rather, its implementation. I attribute much of this to the fact that most developers responsible for implementing cryptography aren't, well, cryptographers. I don't expect them to be, either. However, many of these developers haven't the knowledge to properly define the right threat model, let alone identify which cryptographic primitives they need in order to address that threat model. Mistakes ensue. To many of you, I'm sure this isn't news.
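To make that concrete, here's one classic example of the kind of subtle implementation mistake I mean, sketched in Python using only the standard library: the cryptography (HMAC-SHA256) is sound, but verifying the tag with ordinary equality leaks timing information, because the comparison short-circuits at the first differing byte. The scenario and names here are illustrative, not drawn from BitLocker or any particular product.

```python
import hashlib
import hmac
import secrets

key = secrets.token_bytes(32)
message = b"attack at dawn"

# The sender computes an authentication tag over the message.
tag = hmac.new(key, message, hashlib.sha256).digest()

def verify_naive(key, message, tag):
    """Subtle mistake: '==' compares byte-by-byte and stops at the
    first mismatch, so an attacker who can time many verification
    attempts can recover the tag incrementally."""
    expected = hmac.new(key, message, hashlib.sha256).digest()
    return tag == expected  # timing side channel

def verify(key, message, tag):
    """Correct: compare in constant time, independent of where
    the tags differ."""
    expected = hmac.new(key, message, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

assert verify(key, message, tag)
assert not verify(key, message, b"\x00" * 32)
```

Both functions use the same primitive and return the same answers; the difference is invisible unless you know to look for it, which is exactly why these mistakes slip past developers who aren't cryptographers.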
I'm proactively working on ways to educate developers so they can avoid the subtle mistakes that leave huge marks. There's a long road ahead, but the dividends are grand. After all, cryptography is usually the strongest link in any security system, so why all the lax implementations? Shouldn't we expect the same rigor in implementation that goes into design?
Cryptography has arguably the best track record of any aspect of security; it's time we did a better job of reflecting that in practice. Something's wrong if cryptography can't come to fruition in practice. Developers are in dire need of something that cryptographers have, so we need to bridge what is still an uncomfortably large gap between the two.
With our systems like loose slacks, we have a tight belt, yet we can't seem to put it on right.
I'm all ears. Well, eyes. I'm all eyes.
|You might want to look at http://www.cl.cam.ac.uk/~rja14/econsec.html|
That link has already sealed its place in my ever-growing bible of cryptographic compilations, but thanks for sharing it. It's worth everyone's time to check out the resources on Ross Anderson's page, as well as the research that comes from WEIS (Workshop on the Economics of Information Security). I find this to be one of the most intriguing aspects of security, and one that deserves attention.