Trusted Computing

I’ve finally gotten around to reading up on Trusted Computing (a process that, ironically enough, was interrupted by my being rootkitted a couple of weeks ago). I’d heard some pretty unsettling things about trusted computing, but now that I’ve done some digging… well, it’s still pretty disturbing.

Trusted Computing (TC) is one of several names for a set of changes to server, PC, PDA and mobile-phone operating systems, software and hardware that will make these computers “more trustworthy.” Microsoft has one version, known as Palladium or the Next Generation Secure Computing Base (NGSCB), and an alliance of Intel, Microsoft, IBM, HP and AMD known as the Trusted Computing Group has a slightly different one called variously trusted computing, trustworthy computing, or “safer computing.” Some parts of Trusted Computing are already in Windows XP, Windows Server 2003, and in the hardware for the IBM ThinkPad, and many more will be in Microsoft’s new Longhorn version of Windows, scheduled for 2006.

The EFF has a nice introduction to trusted computing systems, written by Seth Schoen, and Ross Anderson has a more detailed and critical analysis. A brief summary of the summaries: a trusted computer includes tamper-resistant hardware that can cryptographically verify the identity and integrity of the programs you run, attest to that identity for online “policy servers,” encrypt keyboard and screen communications, and keep an unauthorized program from reading another program’s memory or saved data. At the center of this is the so-called “Fritz” chip, named after Senator Fritz Hollings of South Carolina, who tried to make digital rights management a mandatory part of all consumer electronics. (He failed and is retiring in 2004, but I’ve no doubt there will be attempts to pass similar laws in the future.)
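To make the measurement part concrete, here’s a toy Python sketch of the hash-chain (“extend”) operation that TC hardware uses to record which software has booted. The stage names and the choice of SHA-1 are my own illustration of the idea, not a real TPM interface: each stage hashes the next into a register before handing over control, so the final value pins down the entire software stack that ran.

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """New register value = SHA-1(old value || SHA-1(measured code))."""
    return hashlib.sha1(pcr + hashlib.sha1(measurement).digest()).digest()

pcr = bytes(20)  # boot starts from a known all-zero register
for stage in [b"bootloader image", b"OS kernel image", b"media player binary"]:
    pcr = extend(pcr, stage)  # each stage is measured before it runs

# Change any byte of any stage and this value changes, so a verifier who
# knows the expected hashes can tell exactly which software stack booted.
print(pcr.hex())
```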

When most people think about computer security they think about virus detectors, firewalls and encrypted network traffic — the computer analogs to burglar alarms, padlocks and opaque envelopes. The Fritz chip is a different kind of security, more like the “political officer” that the Soviet Union would put on every submarine to make sure the captain stayed loyal. The whole purpose of the Fritz chip is to make sure that you, the computer user, can’t do anything that goes against the policies set by the people who wrote your software and/or provide you with web services.

There are many people who would like such a feature. Content providers such as Disney could verify that your version of Windows Media Player hasn’t had digital rights management disabled before sending you a decryption key for a movie. Your employer could prevent email from being printed or read on non-company machines, and could automatically delete it from your inbox after six months. Governments could prevent leaks by doing the same with sensitive documents. Microsoft and AOL could prevent third-party instant-message software from working with the MSN or AIM networks, or lock in customers by making it difficult to switch to other products without losing access to years’ worth of saved documents. Game designers could keep you from cheating in networked games. Distributed-computing and mobile-agent programs could be sure their code isn’t being subverted or leaked when running on third-party systems. Software designers could verify that a program is registered and only running on a single computer (as Windows XP does already), and could even prevent all legitimate trusted computers from reading files encrypted by pirated software. Trusted computing is all about their trust, and the person they don’t trust is you.
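To see what that verification looks like from the provider’s side, here’s a hypothetical sketch of a policy server deciding whether to release a movie’s decryption key. Everything here is made up for illustration, and an HMAC with a shared secret stands in for the chip’s real asymmetric signing key, but the logic is the point: no approved measurement, no key.

```python
import hashlib
import hmac

# Stand-in for the Fritz chip's signing key. In real TC hardware this is
# an asymmetric key pair burned into the chip, not a shared secret.
FRITZ_KEY = b"key locked inside the tamper-resistant chip"

# Measurements (the boot-time hash chain sketched earlier) the provider approves.
APPROVED_PCRS = {
    hashlib.sha1(b"player with DRM intact").hexdigest(): "approved media player",
}

def sign_quote(pcr_hex: str) -> bytes:
    """What the chip does: sign the current measurement so software can't forge it."""
    return hmac.new(FRITZ_KEY, pcr_hex.encode(), hashlib.sha1).digest()

def release_content_key(pcr_hex: str, signature: bytes, content_key: bytes):
    """What the provider's policy server does before handing over a key."""
    if not hmac.compare_digest(sign_quote(pcr_hex), signature):
        return None  # quote wasn't produced by the hardware
    if pcr_hex not in APPROVED_PCRS:
        return None  # e.g. a player with DRM disabled hashes to a different value
    return content_key

pcr = hashlib.sha1(b"player with DRM intact").hexdigest()
print(release_content_key(pcr, sign_quote(pcr), b"movie key"))  # key released
other = "f" * 40
print(release_content_key(other, sign_quote(other), b"movie key"))  # None
```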

End users do get a little bit of “trust” out of trusted computing, but not as much as you might think. TC won’t stop hackers from gaining access to a system, but it could be used to detect rootkits that have been installed. TC also won’t prevent viruses, worms or Trojans, but it can prevent them from accessing data or keys owned by other applications. That means a program you download from the Internet won’t be able to email itself to everyone in your (encrypted) address book. However, TC won’t stop worms that exploit security holes in MS Outlook’s scripting language from accessing your address book, because Outlook already has that permission. In spite of what the Trusted Computing Group’s backgrounder and Microsoft’s Palladium overview imply, TC won’t help with identity theft or computer thieves physically accessing your data any more than current public key cryptography and encrypted file systems do.
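The mechanism behind that per-application protection is usually called “sealed storage”: data is encrypted under a key derived inside the chip from the measured identity of the program that owns it, so only that exact program can get it back. Here’s a toy sketch, again my own illustration rather than the real NGSCB or TPM interface, with a toy keystream cipher standing in for whatever the hardware would actually use.

```python
import hashlib

CHIP_SECRET = b"secret that never leaves the Fritz chip"

def derive_key(program_image: bytes) -> bytes:
    # The key is bound to a hash of the program's code: patch the program
    # and it derives a completely different key.
    return hashlib.sha256(CHIP_SECRET + hashlib.sha256(program_image).digest()).digest()

def seal(program_image: bytes, data: bytes) -> bytes:
    key = derive_key(program_image)
    # Toy keystream cipher for illustration; real hardware would use a
    # proper authenticated cipher.
    stream = b"".join(hashlib.sha256(key + i.to_bytes(8, "big")).digest()
                      for i in range(len(data) // 32 + 1))
    return bytes(a ^ b for a, b in zip(data, stream))

unseal = seal  # XORing with the same keystream decrypts

address_book = b"alice@example.com, bob@example.com"
sealed = seal(b"mail client code", address_book)
print(unseal(b"mail client code", sealed))      # sealing program gets its data back
print(unseal(b"downloaded worm code", sealed))  # any other program gets garbage
```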

As long as you agree with the goals of the people who write your software and provide your web services, TC isn’t a bad deal. After all, most people don’t want others cheating at online games, and most can see the value of company email-deletion policies. The same can be said of the political officer on a Soviet submarine: he was great as long as you believed in what the Communist Party stood for. And unlike Soviet submarine commanders, you won’t get shot for refusing to use TC on your computer. Your programs will still run as always; you just won’t be able to read encrypted email from your customers, watch downloaded movies, or purchase items through your TC-enabled cellphone. Some have claimed that this is how it should be: the market will try out all sorts of agreements, and those acceptable to both consumers and service providers will survive. That sounds nice in theory, but it doesn’t work when the market is dominated by a few players (e.g. Microsoft for software, wireless providers for mobile services, and the content cartel for music and movies) or when there are network externalities that make it easy to lock in a customer base (e.g. email, the web, web services and electronic commerce). What choice will you have in word processors if the only way you can read memos from your boss is with MS Word? What choice will you have in stereo systems when the five big record companies announce that new recordings will only be released in a secure-media format?

Of course, even monopolies respond to strong enough consumer pushback, but as Ross Anderson points out, there are subtle tricks software and service providers can pull to lock in unwary consumers. For example, a law firm might discover that migrating years of encrypted documents from Microsoft to OpenOffice requires sign-off on the change from every client that has ever sent an encrypted email attachment. That’s a nasty barrel to be over, and the firm would probably grudgingly pay Microsoft large continuing license fees to avoid the pain. These kinds of barriers to change can be subtle, and you can bet they won’t be part of the original sales pitch from Microsoft. But then, what do you expect when you invite a political officer into your computer?

References