Trusted Computing
Trusted computing (TC) refers to a family of specifications, originally from the Trusted Computing Platform Alliance (TCPA), with the stated goal of making computers more secure through the use of dedicated hardware. Critics, including academics, security experts, and members of the free and open source software community, contend that the overall effect (and perhaps intent) of trusted computing is to impose unreasonable restrictions on how people can use their computers.
Synopsis
The basic system concepts in trusted computing are:
- Unique machine/CPU is identified using certificates;
- Encryption is performed in the hardware;
- Data can be signed with the machine's identification;
- Data can be encrypted with the machine's secret key.
The nature of trust
Trust means something different to security experts than the meaning laypersons often assign. For example, the United States Department of Defense defines a trusted system as one whose failure can break your security policy; i.e., "a system that you are forced to trust because you have no choice." Cryptographer Bruce Schneier observes, "A 'trusted' computer does not mean a computer that is trustworthy." By these definitions, a video card is trusted by its users to correctly display images. Trust in security parlance is always a kind of compromise or weakness—sometimes inevitable, but never desirable as such.
The main controversy in trusted computing concerns this meaning of trust. Critics characterize a trusted system as one you are forced to trust rather than one that is particularly trustworthy. In contrast, Microsoft, in adopting the term trustworthy computing, presumably intends to focus consumers' attention on the allegedly trustworthy aspects of trusted computing systems.
Critics of trusted computing are further concerned that they are not able to look inside trusted computing hardware to see whether it is properly implemented or whether there are backdoors. The trusted computing specifications are open and available for anyone to review, but implementations generally are not. Many are also concerned that cryptographic designs and algorithms will become obsolete over time, which may result in the forced obsolescence of TC-enabled computers. For example, recent versions of the trusted computing specifications added, and now require, the AES encryption algorithm.
While proponents claim trusted computing increases security, critics counter that not only will security not be helped, but trusted computing will facilitate mandatory digital rights management (DRM), harm privacy, and impose other restrictions on users. Trusting networked computers to controlling authorities rather than to individuals may create digital imprimaturs. Contrast trusted computing with secure computing, in which anonymity, not disclosure, is the main concern. Advocates of secure computing argue that the additional security can be achieved without relinquishing control over the computer from users to superusers.
Proponents of trusted computing argue that privacy complaints are baseless since consumers will retain a choice between systems, based on their individual needs. Moreover, trusted computing advocates claim that some needs require changes to the current systems at the hardware level to enable a computer to act as a trusted client.
Related terms
The TCG project is known by a number of names. Trusted computing was the original one, and is still used by the Trusted Computing Group (TCG) and IBM. The hardware device they developed is called the TPM, the Trusted Platform Module. Microsoft calls it trustworthy computing. Intel has started calling it safer computing. Prior to May 2004, the TCG was known as the TCPA. Richard Stallman of the FSF has adopted the name Treacherous computing.
Background
A variety of initiatives fall under the heading of trusted computing: Microsoft is working on a project called NGSCB. An industry consortium including Microsoft, Intel, IBM, HP, and AMD formed the Trusted Computing Platform Alliance (TCPA), now the Trusted Computing Group (TCG), which designed the Trusted Platform Module (TPM). Intel is working on a form called LaGrande Technology (LT), while AMD's is called Secure Execution Mode (SEM), also known as Presidio. Essentially, these are proposals for four new features provided by new hardware, which require new software (including new operating systems and applications) to take advantage of them. Each feature serves a different purpose, although they can be used together. The features are:
- Secure I/O
- Memory curtaining
- Sealed storage
- Remote attestation
Secure I/O
Secure input and output (I/O) protects the path between the user and the software. Checksums verify that the software handling the I/O has not been tampered with, so malicious software injecting itself into this path could be identified.
This would not defend against a hardware-based attack, such as a key-capture device placed physically between the user's keyboard and the computer.
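The measurement idea behind secure I/O can be sketched in a few lines. This is a purely illustrative simplification (the component names and "trusted" binary contents are invented for the example); real trusted computing designs perform the measurement in hardware or firmware rather than in ordinary software:

```python
import hashlib

# Hypothetical table of known-good checksums for I/O components.
# In a real design these measurements would be recorded by hardware.
TRUSTED_HASHES = {
    "keyboard_driver": hashlib.sha256(b"driver v1.0 binary").hexdigest(),
}

def measure(name: str, binary: bytes) -> bool:
    """Return True if the binary matches its recorded trusted checksum."""
    return hashlib.sha256(binary).hexdigest() == TRUSTED_HASHES.get(name)

# An untampered driver passes the check; one with injected code fails.
assert measure("keyboard_driver", b"driver v1.0 binary")
assert not measure("keyboard_driver", b"driver v1.0 binary + keylogger")
```

Note that, as the section says, this detects software tampering only: a hardware keylogger never touches the measured binary and passes unnoticed.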
Memory curtaining
Memory curtaining has the hardware keep programs from reading or writing each other's memory (the space where the programs store information they're currently working on). Even the operating system doesn't have access to curtained memory, so the information would be secure from an intruder who took control of the OS.
Something very similar can be achieved with new software, but doing it in hardware can be more elegant and reliable. It can also make certain methods of debugging impossible.
Sealed storage
Sealed storage protects private information by allowing it to be encrypted using a key derived from the software and hardware being used. This means the data can be read only by the same combination of software and hardware. For example, users who keep a private diary on their computer do not want other programs or other computers to be able to read it. Currently, a virus can search for the diary, read it, and send it to someone else; the Sircam virus did something similar to this. Even if the diary were protected by a password, the virus might run a dictionary attack. Alternatively, the virus might modify the user's diary software so that it leaks the text once the user unlocks his diary. Using sealed storage, the diary is securely encrypted so that only the unmodified diary program on his computer can read it.
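The key-derivation idea can be sketched as follows. Everything here is an assumption made for illustration: the machine secret, the software measurement, and the toy XOR cipher all stand in for real mechanisms (a TPM-held key, platform configuration registers, and AES respectively):

```python
import hashlib
import hmac

# Hypothetical per-machine secret; in real hardware this never leaves the chip.
MACHINE_SECRET = b"secret burned into this machine's hardware"

def seal_key(software_measurement: bytes) -> bytes:
    """Derive a sealing key from the machine secret plus a hash of the
    software permitted to read the data."""
    return hmac.new(MACHINE_SECRET, software_measurement, hashlib.sha256).digest()

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy symmetric cipher for illustration only; real designs use AES.
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

diary_sw = hashlib.sha256(b"diary program v1").digest()
sealed = xor_cipher(seal_key(diary_sw), b"dear diary ...")

# The same software on the same machine recovers the plaintext:
assert xor_cipher(seal_key(diary_sw), sealed) == b"dear diary ..."

# A modified program derives a different key and reads only garbage:
virus_sw = hashlib.sha256(b"diary program v1 + virus").digest()
assert xor_cipher(seal_key(virus_sw), sealed) != b"dear diary ..."
```

Because the key depends on both inputs, changing either the machine or the program changes the key, which is exactly the property the drawbacks section below turns against the user.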
Remote attestation
Remote attestation allows changes to the user's computer to be detected by him and others. That way, he can avoid having private information sent to or important commands sent from a compromised or insecure computer. It works by having the hardware generate a certificate stating what software is currently running. The user can present this certificate to a remote party to show that their computer hasn't been tampered with.
Remote attestation is usually combined with public-key encryption so that the information sent can only be read by the programs that presented and requested the attestation, and not by an eavesdropper.
To take the diary example again, the user's diary software could send the diary to other machines, but only if they could attest that they were running a secure copy of the diary software. Combined with the other technologies, this provides a more secured path for the diary: secure I/O protects it as it's entered on the keyboard and displayed on the screen, memory curtaining protects it as it's being worked on, sealed storage protects it when it's saved to the hard drive, and remote attestation protects it from unauthorized software even when it is used on other computers.
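The attestation protocol above can be sketched in miniature. As a simplifying assumption, this sketch uses a shared-key MAC where a real TPM would use a public-key signature over its measurement, and the key and software names are invented for the example:

```python
import hashlib
import hmac

# Hypothetical attestation key; a real TPM signs with a private key instead.
ATTESTATION_KEY = b"key shared by this machine's hardware and the verifier"

def attest(running_software: bytes):
    """'Hardware' reports a measurement of the running software,
    authenticated so the remote party knows the hardware produced it."""
    digest = hashlib.sha256(running_software).digest()
    tag = hmac.new(ATTESTATION_KEY, digest, hashlib.sha256).digest()
    return digest, tag

def verify(expected_software: bytes, digest: bytes, tag: bytes) -> bool:
    """Remote party checks both the authenticity of the report and that
    the reported software is the one it expects."""
    expected_tag = hmac.new(ATTESTATION_KEY, digest, hashlib.sha256).digest()
    return (hmac.compare_digest(tag, expected_tag)
            and digest == hashlib.sha256(expected_software).digest())

quote = attest(b"diary program v1")
assert verify(b"diary program v1", *quote)   # genuine software accepted
assert not verify(b"modified diary", *quote) # changed software rejected
```

The remote party learns exactly what software produced the report, which is what enables both the diary scenario here and the lock-in scenarios described under Drawbacks.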
Drawbacks
Opponents of trusted computing argue that the security features that protect computers from viruses and attackers also restrict the actions of their owners. This makes new anti-competitive techniques possible, potentially hurting people who buy trusted computers.
Cambridge cryptographer Ross Anderson has concerns that "TC can support remote censorship. In general, digital objects created using TC systems remain under the control of their creators, rather than under the control of the person who owns the machine on which they happen to be stored (as at present). So someone who writes a paper that a court decides is defamatory can be compelled to censor it—and the software company that wrote the word processor could be ordered to do the deletion if she refuses. Given such possibilities, we can expect TC to be used to suppress ... writings that criticise political leaders." He goes on to state that:
- " . . . software suppliers can make it much harder for you to switch to their competitors' products. At a simple level, Word could encrypt all your documents using keys that only Microsoft products have access to; this would mean that you could only read them using Microsoft products, not with any competing word processor.
- "The . . . most important, benefit for Microsoft is that TC will dramatically increase the costs of switching away from Microsoft products (such as Office) to rival products (such as OpenOffice). For example, a law firm that wants to change from Office to OpenOffice right now merely has to install the software, train the staff and convert their existing files. In five years' time, once they have received TC-protected documents from perhaps a thousand different clients, they would have to get permission (in the form of signed digital certificates) from each of these clients in order to migrate their files to a new platform. The law firm won't in practice want to do this, so they will be much more tightly locked in, which will enable Microsoft to hike its prices."
Anderson summarizes the case by saying "The fundamental issue is that whoever controls the TC infrastructure will acquire a huge amount of power. Having this single point of control is like making everyone use the same bank, or the same accountant, or the same lawyer. There are many ways in which this power could be abused."
Users can't change software
In the diary example, sealed storage protects the diary from malicious programs like viruses, but it doesn't distinguish between those and useful programs, like ones that might be used to convert the diary to a new format, or provide new methods for searching within the diary. A user who wanted to switch to a competing diary program might find it would be impossible for that new program to read the old diary, as the information would be "locked in" to the old program. It could also make it impossible for the user to read or modify his diary except as specifically permitted by the diary software. If he were using diary software with no edit or delete option then it could be impossible to change or delete previous entries.
Remote attestation could cause other problems. Currently web sites can be visited using a number of web browsers, though certain websites may be formatted (intentionally or not) such that some browsers cannot decipher their code. Some browsers have found a way to get around that problem by emulating other browsers. For example, when Microsoft's MSN website briefly refused to serve pages to non-Microsoft browsers, users could access those sites by instructing their browsers to emulate a Microsoft browser. Remote attestation could make this kind of emulation irrelevant, as sites like MSN could demand a certificate stating the user was actually running an Internet Explorer browser.
Users don't control information they receive
One of the early motivations behind trusted computing was a desire to support stricter Digital Rights Management (DRM): technology to prevent users from sharing and using copyrighted or private files without permission. Microsoft has announced a DRM technology that it says will make use of trusted computing.
Trusted computing can be used for DRM. An example could be downloading a music file from a band: the band could come up with rules for how their music can be used. For example, they might only want the user to play the file three times a day without paying more money. Also, they could use remote attestation to only send their music to a music player that enforces their rules: sealed storage would prevent the user from opening the file with another player that did not enforce the restrictions. Memory curtaining would prevent the user from making an unrestricted copy of the file while it's playing, and secure output would prevent capturing what is sent to the sound system.
Once digital recordings are converted to analog signals, the (perhaps degraded) signal could be recorded by conventional means, such as by connecting an audio recorder, instead of speakers, to the sound card, or by recording the produced sound with a microphone.
Without remote attestation, this problem would not exist. The user could simply download the song with a player that did not enforce the band's restrictions, or one that lets him convert the song to an "unrestricted" format such as MP3.
Users don't control their data
If a user upgrades her computer, sealed storage could prevent her from moving her music files to the new computer. It could also be used to enforce spyware-like behavior, with music files only given to users whose machines attest to telling the artist or record company every time the song is played. In a similar vein, a news magazine could require that, to download its news articles, a user's machine attest to using a specific reader. The mandated reader software could then be programmed not to allow viewing of original news stories to which changes had been made on the magazine's website. Such "newest version" enforcement would allow the magazine to "rewrite history" by changing or deleting articles. Even if a user saved the original article on his computer, the software might refuse to display it once a change had been announced.
Loss of Internet Anonymity
Because a TC-equipped computer is able to uniquely attest to its own identity, it will be possible for vendors and others who possess the ability to use the attestation feature to zero in on the identity of the user of that computer with a high degree of certainty.
Such a capability is contingent on the reasonable chance that the user at some time provides user-identifying information, whether voluntarily or indirectly. One common way that information can be obtained and linked is when a user registers a computer just after purchase. Another common way is when a user provides identifying information to the website of an affiliate of the vendor.
As new identification technologies such as biometrics and RFID become widespread, it is expected that computer users will be identified with still greater certainty, and that ever increasing amounts of information will be available about them. While proponents of TC believe that online purchases and credit transactions could potentially be more secure as a result of the remote attestation capability, this may cause the computer user to lose expectations of anonymity when using the Internet.
Critics point out that this could have a chilling effect on political free speech, the ability of journalists to use anonymous sources, whistleblowing, political blogging and other areas where the public has traditionally enjoyed protection from retaliation through anonymity.
Proposed owner override for TC
All these problems arise because trusted computing protects programs against everything, even the owner. A simple solution is to let the owner of the computer override these protections. This is called Owner Override, and it currently exists only as a suggested fix, not as part of the specifications.
When you activate Owner Override, the computer will use the secure I/O path to make sure that you are physically present and actually the owner. Then it will bypass the protections. So, with remote attestation, you can force the computer to generate false attestations — certificates that say you're running Internet Explorer, when you're really running Opera. Instead of saying when your software has been changed, remote attestation will say when the software has been changed without your permission.
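The effect of Owner Override on attestation can be sketched with a hypothetical extension of the measurement idea (the software names and the override mechanism are invented for the example; no such interface exists in the specifications):

```python
import hashlib

def attest(running_software: bytes, owner_override=None) -> bytes:
    """Report a measurement of the running software. If Owner Override is
    active (the owner's physical presence has been confirmed via secure I/O),
    the machine reports whatever measurement the owner chooses instead."""
    reported = owner_override if owner_override is not None else running_software
    return hashlib.sha256(reported).digest()

# Normally the true software is reported:
assert attest(b"Opera") == hashlib.sha256(b"Opera").digest()

# With override, the owner can claim to be running Internet Explorer,
# and a remote party cannot tell the difference:
assert attest(b"Opera", owner_override=b"Internet Explorer") == \
       hashlib.sha256(b"Internet Explorer").digest()
```

This is exactly why, as the next paragraph notes, Owner Override defeats remote attestation: the report no longer proves anything to anyone except the owner.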
While it might seem that the idea of Owner Override would be met with praise, some Trusted Computing Group members have instead called it the biggest potential downfall of the TC movement. Owner Override defeats remote attestation, the entire idea of being able to trust other people's computers. Owner Override continues to offer all of the security and enforcement benefits to an owner on his own machine, but loses any ability to ensure that another owner cannot waive rules or restrictions on his own computer. Once you send data to someone else's computer, whether it is your diary, a DRM music file, or a joint project, that person controls what security, if any, their computer will enforce on their copy of that data.
External links
- Trusted Computing Group (TCG) — Trusted computing standards body, previously known as the TCPA.
- 'Trusted Computing' Frequently Asked Questions — Anti-TC FAQ by Cambridge University security director and professor Ross Anderson.
- TrouSerS — The open-source TCG Software Stack, with a good FAQ
- TCPA Misinformation Rebuttal and Linux drivers — from the IBM Watson Research Global Security Analysis Lab
- Experimenting with TCPA/TCG Hardware, Or: How I Learned to Stop Worrying and Love The Bear — Technical Report TR2003-476, CS, Dartmouth College, December 2003; includes the "Enforcer" Linux Security Module
- Next-Generation Secure Computing Base (NGSCB) — Microsoft's trusted computing architecture
- Palladium and the TCPA — from Bruce Schneier's Crypto-Gram newsletter.
- Against-TCPA
- Interesting Uses of Trusted Computing
- Can you trust your computer? — essay by the FSF
- Technically Speaking blog's "Microsoft Meeting" article — Explains "sealed storage" in more depth than this article, yet without going into all the mathematics
- Trust Computing: Promise and Risk, a paper by EFF (Electronic Frontier Foundation) staff technologist Seth Schoen.
- Microsoft's Machiavellian manoeuvring (ZDNet UK) by Bruce Schneier
- LAFKON — A short film opposing trusted computing