"Trusted" Computing

One of the most disturbing news items of recent times was watching Chinese President Hu Jintao meet with Bill Gates at the Gates mansion, even before he met with President George Bush. A smiling Hu Jintao mentioned how he used Mr. Gates's operating system every day, and Bill promised to help him out with technical support.

But Bill and the Redmond crowd might be helping out with more than just technical support, and not to help the Chinese government either. Humor me a moment while I take a little diversion back to 1982.

The largest non-nuclear explosion ever recorded by satellite took place in Russia in 1982, and was an explosion in the Siberian gas pipeline. The shocking and little-known truth about this catastrophe was that it was directly caused by the CIA, which deliberately gave the Soviet Union modified software (in binary-only format, of course) that was engineered to destroy the pipeline. Lest you think this is a paranoid fantasy (it sounds like one, I know), this was documented in the book "At the Abyss: An Insider's History of the Cold War", by Thomas C. Reed, a former Air Force secretary who served in the National Security Council, reported on by the Washington Post in 2004, and even mentioned in a 1996 article in the CIA journal “Studies in Intelligence”.

This wasn't just an attack on the Soviet Union; it indirectly affected Western European gas prices (the destroyed pipeline was designed to feed gas to Europe), as a side effect of the US attempt to disrupt Soviet hard-currency earnings. I wonder if the Chinese are students of this particular part of history? Judging by their eagerness to embrace Windows in their infrastructure, and the recent promises by Chinese PC makers to include “genuine Windows” on PCs shipped in China, it would seem not. This single incident of modified binary-only software causing massive economic damage should be required reading for the decision makers of any nation that might have interests conflicting with those of the USA. That's everyone else in the world, just in case you were wondering. Even that normally docile American pet, the UK, has balked at accepting binary-only control software for the new Joint Strike Fighter aircraft and threatened to cancel the order if they don't get access to the source code. Maybe the UK isn't so docile after all, as the UK military certainly seems to understand the need to control the software in at least some critical parts of its own infrastructure.

So who can you trust in computing, and why? I'd love to say that Open Source companies can be trusted because they give you the source code, and proprietary companies cannot because the source code isn't available, but it's not as simple as that. Microsoft loudly proclaims it already makes the Windows source code available to China and any other countries that complain about the possibility of such binary-only threats. Do you imagine that any US Linux distributor would say no to the US government if it were requested (politely, of course) to add a back-door to the binary Linux images shipped as part of its products? Who amongst us actually uses the source code so helpfully given to us on the extra CDs to compile our own version? With Windows, of course, there are already so many back-doors, known and unknown, that the US government might not have even bothered to ask Microsoft; they may have just found their own, ready to exploit at will. What about Intel or AMD and the microcode on the processor itself?

Even with access to the Windows source code, it is still not safe to trust it unless you compile that code yourself and install only the binary versions you create on your own machines. Having source code that claims to match a product proves nothing about the binary version of the product you're using unless you've created it yourself. How many versions of Windows installed on Chinese government computers were actually compiled by the Chinese themselves? None, I'd guess. The same of course is true for the UK. What this means is that governments around the world who accept binary packaged software from US software companies are at the mercy of the US intelligence services, who may or may not have decided to add just that little “extra” into the code. If you think I'm being paranoid about this, just ask the Russians...

Just for completeness, for the truly paranoid, even compiling the source code yourself actually isn't enough to ensure you're getting “trusted” computing. In his landmark 1984 paper “Reflections on Trusting Trust”, Ken Thompson, one of the original UNIX authors, tells a tale of how he hacked the C compiler on the UNIX system, the software used to create new binary code from source code, to add a completely undetectable back-door into UNIX. Once set up, there was no trace of the back-door he added in any of the publicly available UNIX source code; it was cleverly hidden in the binaries and was designed to propagate itself into any new binaries created on the system. This was a theoretical attack, not something he actually did, but something he could have done. At least I hope so, but then I'm inclined to trust him. The only way to get trusted code is to design the processor yourself (yes, there can be back-doors in processor microcode as well as in ordinary binary code), write your own compiler, and audit yourself all the Open Source code you use in your command and control systems. Anything else is trusting the untrustworthy.

I'll leave you with some words from Ken Thompson's paper that are just as true today as in 1984.

“The moral is obvious. You can't trust code that you did not totally create yourself. (Especially code from companies that employ people like me.) No amount of source-level verification or scrutiny will protect you from using untrusted code. In demonstrating the possibility of this kind of attack, I picked on the C compiler. I could have picked on any program-handling program such as an assembler, a loader, or even hardware microcode. As the level of program gets lower, these bugs will be harder and harder to detect. A well installed microcode bug will be almost impossible to detect.”

Jeremy Allison


Very interesting article

You have a very interesting article here. I would like to translate it to Spanish and post it on my blog at bitnegro.blogspot.com, with your permission, of course.

Thanks in advance.


Link to Thompson's article

http://www.acm.org/classics/sep95/ is a 1995 reprint of the article. It's not long, only three pages, and easy to read. I read it when it first came out in 1984, and never forgot it. It's scary.

Thompson's back door ...

... was fiendishly simple. According to ESR's jargon file (at http://www.catb.org/~esr/jargon/html/B/back-door.html), it worked like this:

"Historically, back doors have often lurked in systems longer than anyone expected or planned, and a few have become widely known. Ken Thompson's 1983 Turing Award lecture to the ACM admitted the existence of a back door in early Unix versions that may have qualified as the most fiendishly clever security hack of all time. In this scheme, the C compiler contained code that would recognize when the login command was being recompiled and insert some code recognizing a password chosen by Thompson, giving him entry to the system whether or not an account had been created for him.

"Normally such a back door could be removed by removing it from the source code for the compiler and recompiling the compiler. But to recompile the compiler, you have to use the compiler — so Thompson also arranged that the compiler would recognize when it was compiling a version of itself, and insert into the recompiled compiler the code to insert into the recompiled login the code to allow Thompson entry — and, of course, the code to recognize itself and do the whole thing again the next time around! And having done this once, he was then able to recompile the compiler from the original sources; the hack perpetuated itself invisibly, leaving the back door in place and active but with no trace in the sources."

Good article.


Error regarding 'shared' source

The way the article is written, it implies that these countries get some kind of meaningful access to the source. MS may show some data to China and other countries which ask to see the source, but I have yet to find any proof, aside from wishful thinking, which correlates what they are shown with what they are actually running. It might be important to emphasize that.

Having "look-but-don't-touch" access is of course fine for developers, at least the ones still on Windows, who want to see why the official APIs don't quite work and kludge their programs into working.

However, for security purposes it's charlatanism on the part of M$ and severe negligence on the part of the governments.


While the dangers talked about in this article are real in some circumstances, the article fails to address several points that work to mitigate them somewhat.

Does it matter whether every copy of software is compiled locally or not? Not really. Given the same compiler and makefiles used by the distributor of the software, you should be able to compile a single copy and checksum/fingerprint the binaries. It would be all but impossible to compile altered binaries that fingerprint the same as an unmodified binary, given the many different ways to fingerprint a file.
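The fingerprinting step described here can be sketched in Python (the helper name and paths are invented). Note it quietly assumes the build is bit-for-bit reproducible, which real toolchains often achieve only with some effort (timestamps and build paths embedded in binaries are the usual culprits).

```python
# Sketch: fingerprint a file under several independent hash algorithms.
# Forging a collision in one hash might be conceivable; forging the same
# file to match several different digests at once is what makes
# tampering impractical.
import hashlib

def fingerprint(path, algos=("sha256", "sha512")):
    """Return {algorithm: hex digest} for the file at `path`."""
    digests = {name: hashlib.new(name) for name in algos}
    with open(path, "rb") as f:
        # Hash in chunks so large binaries don't need to fit in memory.
        for chunk in iter(lambda: f.read(65536), b""):
            for d in digests.values():
                d.update(chunk)
    return {name: d.hexdigest() for name, d in digests.items()}

# Usage (hypothetical paths): compare a vendor-shipped binary against
# the one you compiled yourself from the published source.
# if fingerprint("/vendor/login") != fingerprint("/mybuild/login"):
#     print("binaries differ - investigate before trusting either")
```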

It would also be extremely difficult for a Linux company to get away with distributing altered binaries for any length of time. While you and you and you may not use the source code, there certainly are people out there who DO use the source code that comes with the distro.

Oh and personally, I run Gentoo, so I DO compile almost all of my binaries.

Siberian explosion

I would take the supposed sabotage of the gas pipeline story with a large grain of salt. There isn't a shred of evidence for it, aside from the reminiscences of a couple of retired spies decades after the supposed event, embroidered to spice up their autobiographies.

These comments from a Slashdot post on the subject cover many points:

"The fact still remains that there was simply no component available in those days that was complex enough for it to be practical to hide a trojan in"

Not to mention, if the Russians had suspected it was sabotage, they could easily have retaliated by blowing up any number of things in the US.

Windows FUD

"With Windows of course there are already so many back-doors known and unknown"

The article would be better if it abstained from anti-Windows FUD. Don't destroy your credibility with assertions you have zero evidence for.

Research First, then Comment

Google for some facts first, then comment.

Seriously though, even if you refuse to believe that there were government-mandated back doors in previous versions of Windows despite the cryptographic export restrictions in effect at that time, Windows has an atrocious track record regarding security. If a regular person can download a copy of Metasploit and have good odds of compromising a Windows machine, then I'm pretty sure a multi-billion-dollar-a-year government agency can do a fair bit better. So even if you assume that the NSA, or whoever, relies on Windows bugs rather than specific back doors (which I don't find likely), there is still almost always a way in for someone with the motivation and resources, given Microsoft's utter disdain for security.

crt0 is corruptable

Brings back old college memories (circa '87) of a friend who hacked crt0 and gained almost indefinite root access to the school's undergraduate VAX (it survived from-tape upgrades shipped by Digital itself). To M. Buta, wherever you are: your genius was vastly underappreciated (even if you were just re-implementing ideas presented in Thompson's paper... it's the sort of thing that second-year undergraduates are rarely able to accomplish, especially under the watchful eye of RR <-- got to be careful about these things, you never know when the statute of limitations runs out).

Given the general promiscuity of the Windows kernels (including Vista), "Trusted Computing" will be nothing more than one long continuous joke perpetrated at the expense of consumers who actually believe the marketing hype. Xbox Linux and OS X on non-Apple hardware are evidence enough.

There is no "trusted computing"

I don't buy the story of the Russian sabotage. And I state: You can't even trust code you wrote yourself. There is no such thing as total bug-freeness and consequently there is no such thing as total trust.
Think of the OS as a house. It's not the house that matters to a burglar, but the things within. A house always has holes, no matter whether you built it yourself or somebody else did. To protect your things, a house alone is not enough. Get a safe. Similarly, to protect your sensitive information, an OS is not enough. Get encryption software, firewalls, etc.
I don't think that anybody with a common-sense attitude (i.e. somebody who is slightly paranoid) could ever hold the opinion that "trusted computing" exists or is even desirable. At least not more than "trusted housing".


Correction about "trust"

Hi there,

You made one mistake, a very common one:

"there is no such thing as total trust"

Of course there is not! If there were, it would probably be closer to "security" rather than the subjective and purposefully vague concept of "trust". It works like in real life (it is actually linked to real-life trust), where one does not look for "complete trust" but rather "trust enough for the task" (and then takes into account other aspects like risk, cost and gain).

In a way, it is an attempt to make explicit the "blurred" frontier between the good guy and the bad guy by adding a bit of nuance.

Just 2 comments - Siberian explosion & Chinese processors

I have just two comments:

1) When you consider things from this point of view, the Chinese-made processors make a lot more sense.

2) The Siberian explosion happened because the bug was placed inside of software that the Soviets stole from Western Europe. They couldn't retaliate for the explosion because they would have to publicly admit their corporate espionage throughout the West.

microcode back doors aren't that scary

I have to disagree with Thompson's closing remark about the possibility of a microcode version of his trojan being the scariest of all.

You can foil his trojan, using the clean source for the compiler and the bugged binary of the compiler, by writing a source-code obfuscator powerful enough to defeat the am-I-compiling-the-compiler test in the bugged compiler binary. You then run the obfuscated compiler source through the bugged compiler binary to produce a new compiler binary without back-doors.

In essence:

obfuscate < compiler_source_code | bugged_compiler_binary > new_compiler_binary_without_backdoor

A low level computing environment (the microcode execution engine of a CPU) will have no special advantage in seeing through an obfuscator.
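This countermeasure can be mocked up in a few lines of Python (all names invented, and the trigger reduced to a literal pattern match; a real trigger could be smarter, which is exactly why the obfuscator must be "powerful enough" to defeat it):

```python
# Toy sketch: a bugged compiler whose self-recognition test is a
# pattern match, defeated by a trivial semantics-preserving rename.

TRIGGER = "def compile("

def bugged_compiler_binary(source):
    if TRIGGER in source:                  # am I compiling the compiler?
        return source + "\n# [back door re-inserted]"
    return source                          # otherwise compile faithfully

def obfuscate(source):
    # Rename the entry point; behaviour unchanged, pattern defeated.
    return source.replace("compile", "c0mpile")

compiler_source = "def compile(source):\n    return source\n"

# Unobfuscated source trips the trigger and comes out bugged:
print("[back door" in bugged_compiler_binary(compiler_source))             # True
# Obfuscated source slips past the trigger and comes out clean:
print("[back door" in bugged_compiler_binary(obfuscate(compiler_source)))  # False
```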

Chris Marshall

I guess that's why it's called "Trusted" Computing

Hi there,

Good blog article. Though the idea is right, the details are probably wrong. Trusted Computing goes beyond traditional computing by adding a TPM next to the processor; it is a totally passive chip, but one that cannot be broken (i.e. not cheaply...).

Features like attestation and sealing limit what can be done, but they all rely on the same quite subjective notion of "trust". So to answer your article, all this may be right, but then it's not called "Secure Computing" (although some people use the word "trust" as an excuse for security ideas).

It's a bit like in real life: you can see it from a paranoid, a "normal" (in the sense of a form of consensus?) or a naive point of view. But in the end, it's all about making the life of hackers (who are now 99.9% professionals earning their wages from exploiting others) much more difficult. And as for the CIA example, there is also some collateral damage...

"Trusted" Computing

What an article!

There is a flaw here: the Russians had learned better, from Germany's defeated encryption systems in World War II, than to trust any binary or source code. I am very sure the Siberian gas pipeline explosion in 1982 had nothing to do with the USA, let alone Microsoft. The Russians are notorious for lax safety procedures, but that is changing now, after the hard lesson of the Chernobyl nuclear reactor accident that occurred on 26 April 1986.

Heh! Jeremy, I hope you are not saying the Chernobyl disaster too was USA & Microsoft sabotage against the Russians. Do you know how many life-supporting devices run on MS code? I can smell unfairness in your article.

If you can’t trust open source like UNIX and Linux, you can’t trust proprietary binaries like Mac OS X or MS Windows, and you can’t develop your own, then you have two options:
(1) Don’t use anything
(2) Just blindly trust one out of the two
Otherwise you are on your own Jeremy!

Engr. Kassim Adewale
