Thanks to the recent NSA leaks, people are more worried than ever that their software might have backdoors. If you don’t believe that the software vendor can resist a backdoor request, the onus is on you to look for a backdoor. What you want is software transparency.
Transparency of this type is a much-touted advantage of open source software, so it’s natural to expect that the rise of backdoor fears will boost the popularity of open source code. Many open source projects are fully transparent: not only is the source code public, but the project also makes public the issue tracker that is used to manage known defects and the internal email discussions of the development team. All of these are useful in deterring backdoor attempts.
This kind of transparency often goes together with permissive licenses that allow users to redistribute the code and to modify it and distribute the modified version. That’s the norm in popular open source projects. But it’s possible in principle for a project to be transparent—making code, issue tracking, and design discussions public—while distributing the resulting code under a license that bans modification or redistribution. Such a product would be transparent but would not be free/open source.
Of course, having everything public does not ensure that there are no holes. The Debian project, which is transparent, shipped a serious security hole in the pseudorandom number generator of its OpenSSL package for nearly two years (CVE-2008-0166): a well-intentioned patch removed almost all of the generator's entropy sources, leaving cryptographic keys predictable from the process ID alone. Transparency makes holes detectable, but it doesn't guarantee that they will be detected.
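To see why that hole mattered, here is a simplified model, not Debian's actual code: a hypothetical Python key generator whose only entropy is the process ID, which is roughly the situation the Debian patch created. Because Linux PIDs default to at most 32,768, an attacker can simply enumerate every possible key.

```python
# Simplified model of CVE-2008-0166: if a key generator's only entropy is
# the process ID, the entire keyspace can be brute-forced in moments.
# Illustrative sketch only, not the actual OpenSSL code.
import hashlib
import random

PID_MAX = 32768  # classic Linux default for pid_max

def weak_keygen(pid: int) -> bytes:
    """Hypothetical key generator seeded by the PID alone."""
    rng = random.Random(pid)  # the PID is the only entropy
    return hashlib.sha256(rng.randbytes(32)).digest()  # randbytes: Python 3.9+

victim_key = weak_keygen(pid=12345)  # a victim generates a key somewhere

# The attacker enumerates all possible PIDs to recover the key.
recovered = next(p for p in range(PID_MAX) if weak_keygen(p) == victim_key)
print(f"key recovered: generated under PID {recovered}")
```

This loop finishes in well under a second on commodity hardware; after the Debian bug was disclosed, precomputed lists of all the weak keys circulated publicly.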
There’s a well-known saying in the open source world, which Eric Raymond dubbed Linus’s Law: “given enough eyeballs, all bugs are shallow.” The idea is that the key to finding and fixing bugs effectively is to have many people looking at the code—even a bug that is hard for most people to detect will be obvious to a few.
But transparency does not guarantee that holes will be found, because there might not be enough eyeballs on the code. For open source projects, finding backdoors, or security vulnerabilities in general, is a public good, in the economists’ sense that effort spent on it benefits everyone, including those who don’t contribute any effort themselves. Economics tells us that public goods tend to be under-provided: each user hopes somebody else will do the auditing, so too few people actually do it. So it’s not obvious in advance that any particular open source project will attract the eyeballs it needs to rule out backdoors.
Even if there are enough eyes to rule out backdoors in the source code, you’re still not in the clear. Your system doesn’t run source code directly; the source must first be translated into machine code. How can you be sure that the machine code running on your machine is really equivalent to the source code that was vetted? This is a notoriously difficult problem, and the subject of Ken Thompson’s famous Turing Award lecture, Reflections on Trusting Trust, in which he showed how a compromised compiler can insert a backdoor into every program it compiles, including fresh copies of itself, while the backdoor appears in no source code anywhere.
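One partial check is to rebuild the program from the vetted source yourself and compare the result, byte for byte, with the binary you were shipped. Below is a minimal sketch in Python; the file names main.c and vendor-binary are hypothetical, and the check assumes a deterministic build, which real toolchains often frustrate by embedding timestamps and paths.

```python
# Sketch: verify that a shipped binary matches a local rebuild of the
# vetted source. Assumes a deterministic build and hypothetical file names.
import hashlib
import subprocess
from pathlib import Path

def sha256(path: Path) -> str:
    """Hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Rebuild the vetted source with the system C compiler.
subprocess.run(["cc", "-o", "rebuilt", "main.c"], check=True)

if sha256(Path("rebuilt")) == sha256(Path("vendor-binary")):
    print("shipped binary matches a local rebuild of the vetted source")
else:
    print("mismatch: the shipped binary is not what the vetted source produces")
```

Even when the hashes match, you have only pushed the trust question down a level, onto the compiler, linker, and operating system that performed the rebuild; that is precisely the gap Thompson’s attack exploits.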
There is no simple solution to this object code vs. source code problem. Transparency is never easy. But in today’s world it is more important than ever.