Thursday, 27 September 2007

Trusted Computing

From Wikipedia, the free encyclopedia

Logo of the Trusted Computing Group, an initiative to implement Trusted Computing

Trusted Computing (also abbreviated TC) is a technology developed and promoted by the Trusted Computing Group. The term is taken from the field of trusted systems and has a specialized meaning. Trusted Computing means that the computer will consistently behave in specific ways, and those behaviors will be enforced by hardware and software.

Trusted Computing proponents such as International Data Corporation,[1] the Enterprise Strategy Group[2] and Endpoint Technologies Associates[3] claim that it will make computers safer, less prone to viruses and malware, and thus more reliable from an end-user perspective. In addition, they also claim that Trusted Computing will allow computers and servers to offer improved computer security over that which is currently available.

Chip manufacturers Intel and AMD, system manufacturers such as Dell, and software vendors such as Microsoft plan to include Trusted Computing in coming generations of products.[4][5] The U.S. Army requires that every new small PC it purchases come with a Trusted Platform Module (TPM).[6][7] As of July 3, 2007, so does virtually the entire Department of Defense.[8] According to the International Data Corporation, by 2010 essentially all portable PCs and the vast majority of desktops will include a TPM chip.[9]

Trusted Computing has proved controversial. Opponents believe that trust in the underlying companies is not deserved and that the technology puts too much power and control into the hands of those who design systems and software. Many see Trusted Computing as an anti-competitive practice. It is also seen as a possible enabler for future versions of mandatory access control, copy protection, and Digital Rights Management, all criticized for enabling undue censorship.

The nature of trust

Security experts define a trusted system as one which is required to be trusted for the security of a larger system to hold. For example, the United States Department of Defense's definition of a trusted system is one which could break security policy if it misbehaved; i.e., "a system that you have chosen to trust, possibly out of necessity."[10] Cryptographer Bruce Schneier observes "A 'trusted' computer does not mean a computer that is trustworthy."[10] By these definitions, users must trust a hard drive controller to genuinely save to the drive, in every case, the data it is intended to be saving, and must trust a secure website to be secure, because they cannot verify this for themselves. Trust, in security parlance, is always a kind of compromise or weakness: sometimes inevitable, but never desirable as such.

This definition of trust is what makes Trusted Computing controversial; the Trusted Computing Group describes technical trust as "an entity can be trusted if it always behaves in the expected manner for the intended purpose". Critics characterize a trusted system as one you are forced to trust rather than one which is particularly trustworthy.

Key concepts

Trusted Computing encompasses five key technology concepts, all of which are required for a fully trusted system.

  1. Endorsement Key
  2. Secure Input and Output
  3. Memory curtaining / Protected execution
  4. Sealed storage
  5. Remote attestation

Endorsement Key

"The endorsement key is a 2,048-bit RSA public and private key pair, which is created randomly on the chip at manufacture time and cannot be changed. The private key never leaves the chip, while the public key is used for attestation and for encryption of sensitive data sent to the chip, as occurs during the TPM_TakeOwnership command."- David Safford[11]

This key is used to allow the execution of secure transactions: every TPM is required to sign a random number, using a particular protocol created by the Trusted Computing Group (the Direct Anonymous Attestation protocol), in order to ensure its compliance with the TCG standard and to prove its identity; this makes it impossible for a software TPM emulator to start a secure transaction with a trusted entity. The TPM should be designed to make extraction of this key by hardware analysis difficult, but tamper resistance is not a strong requirement.
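
The challenge-and-response idea behind the endorsement key can be illustrated with ordinary RSA primitives. The sketch below is illustrative only: a real TPM performs the signing inside the chip and uses the Direct Anonymous Attestation protocol rather than a plain signature, and all names here are invented.

    # Illustrative sketch only: a real TPM signs inside the chip and uses
    # the DAA protocol; this uses plain RSA from the "cryptography" package.
    import os
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    # Created once at "manufacture time"; the private half never leaves the chip.
    endorsement_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = endorsement_key.public_key()

    # A verifier sends a random challenge ...
    nonce = os.urandom(32)

    # ... and the "chip" answers by signing it with the private key.
    signature = endorsement_key.sign(nonce, padding.PKCS1v15(), hashes.SHA256())

    # The verifier checks the answer against the known public key;
    # verify() raises InvalidSignature if the signer lacked the private key.
    public_key.verify(signature, nonce, padding.PKCS1v15(), hashes.SHA256())
    print("challenge answered by the genuine key holder")

A software emulator without the private key cannot produce a signature that verifies, which is what prevents it from starting a secure transaction with a trusted entity.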

Secure I/O

Secure input and output (I/O) refers to a protected path between the computer user and the software with which they believe they are interacting. On current computer systems there are many ways for malicious software to intercept data as it travels between a user and a software process, for example keyboard loggers and screen-scrapers. Secure I/O provides a hardware- and software-protected, verified channel, using checksums to confirm that the software used to do the I/O has not been tampered with; malicious software injecting itself into this path could thus be identified. Secure I/O is traditionally known as a trusted path.
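
As a rough illustration of the checksum idea, the sketch below hashes an I/O component on disk and compares it with a known-good digest; the file path and digest value are hypothetical placeholders.

    # Minimal sketch of checksum-based verification of an I/O component.
    # The path and the known-good digest are hypothetical placeholders.
    import hashlib

    KNOWN_GOOD_SHA256 = "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"

    def is_untampered(path: str) -> bool:
        """Hash the component on disk and compare with the expected digest."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest() == KNOWN_GOOD_SHA256

    if not is_untampered("/drivers/keyboard_io.bin"):
        raise RuntimeError("I/O driver failed integrity check; no trusted path")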

Memory curtaining

Memory curtaining extends current memory protection techniques to provide full isolation of sensitive areas of memory, for example locations containing cryptographic keys. Even the operating system does not have full access to curtained memory, so the information would be secure from an intruder who took control of the OS.
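
Curtaining is enforced by the hardware and cannot be reproduced in ordinary application code; the toy model below only illustrates the access rule, namely that even a caller with operating-system privileges is refused. All names are invented.

    # Toy model of the curtaining access rule; real curtaining is enforced
    # by the CPU/chipset, not by application code. All names are invented.
    class CurtainedRegion:
        def __init__(self, owner: str, secret: bytes):
            self._owner = owner
            self._secret = secret

        def read(self, caller: str) -> bytes:
            # Only the owning process may read; even the "os" is refused.
            if caller != self._owner:
                raise PermissionError(f"{caller!r} may not read curtained memory")
            return self._secret

    region = CurtainedRegion(owner="media_player", secret=b"session key bytes")
    region.read("media_player")  # allowed
    # region.read("os")          # would raise PermissionError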

Sealed storage

Sealed storage protects private information by binding it to platform configuration information, including the software and hardware being used; the data can then be read only by the same combination of software and hardware. For example, users who keep a song on their computer that has not been licensed for listening will not be able to play it. Currently, a user can locate the song, listen to it, send it to someone else, play it in the software of their choice, or back it up (in some cases using circumvention software such as Hymn to decrypt it). Alternatively, the user may use software to modify the operating system's DRM routines to make them leak the song data once, say, a temporary license has been acquired. Using sealed storage, the song is securely encrypted so that only the unmodified, untampered music player on his or her computer can play it.
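
The binding idea can be sketched by deriving the encryption key from measurements of the platform state, so that a changed configuration yields a different, useless key. The measurement values below are hypothetical stand-ins for TPM Platform Configuration Register contents.

    # Sketch of sealing: derive the key from platform measurements so the
    # data decrypts only under the identical configuration. Measurements
    # are hypothetical stand-ins for TPM PCR values.
    import base64
    import hashlib
    from cryptography.fernet import Fernet, InvalidToken

    def sealing_key(measurements: list) -> bytes:
        digest = hashlib.sha256(b"".join(measurements)).digest()
        return base64.urlsafe_b64encode(digest)  # Fernet wants a b64 32-byte key

    platform = [b"bios-v1.02", b"bootloader-3.1", b"player-2.0"]
    token = Fernet(sealing_key(platform)).encrypt(b"licensed song bytes")

    # Identical configuration: decryption succeeds.
    assert Fernet(sealing_key(platform)).decrypt(token) == b"licensed song bytes"

    # Modified software: a different key is derived and decryption fails.
    tampered = [b"bios-v1.02", b"bootloader-3.1", b"player-2.0-modified"]
    try:
        Fernet(sealing_key(tampered)).decrypt(token)
    except InvalidToken:
        print("decryption refused under a changed configuration")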

Remote attestation

Remote attestation allows changes to the user's computer to be detected by authorized parties. That way, software companies can avoid users tampering with their software to circumvent technological protection measures. It works by having the hardware generate a certificate stating what software is currently running. The computer can then present this certificate to a remote party to show that its software hasn't been tampered with.

Remote attestation is usually combined with public-key encryption so that the information sent can only be read by the programs that presented and requested the attestation, and not by an eavesdropper, such as the computer owner.
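
The mechanism can be sketched as a signed "quote" over a digest of the running software plus a verifier-chosen nonce, the nonce preventing replay of an old attestation. A real quote is produced inside the TPM; the names below are illustrative.

    # Sketch of a remote-attestation "quote": a signature over the software
    # measurement plus the verifier's nonce (to prevent replay). A real
    # quote is produced inside the TPM; all names are illustrative.
    import hashlib
    import os
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    attestation_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    def quote(software: list, nonce: bytes) -> bytes:
        measurement = hashlib.sha256(b"".join(software)).digest()
        return attestation_key.sign(measurement + nonce,
                                    padding.PKCS1v15(), hashes.SHA256())

    # Verifier: pick a fresh nonce, recompute the expected measurement,
    # and check the signature; verify() raises InvalidSignature on mismatch.
    nonce = os.urandom(16)
    running = [b"os-kernel-5.4", b"music-player-2.0"]
    sig = quote(running, nonce)
    expected = hashlib.sha256(b"".join(running)).digest()
    attestation_key.public_key().verify(sig, expected + nonce,
                                        padding.PKCS1v15(), hashes.SHA256())
    print("platform attested to running the expected software")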

To take the song example again, the user's music player software could send the song to other machines, but only if they could attest that they were running a secure copy of the music player software. Combined with the other technologies, this provides a more secure path for the music: secure I/O prevents the user from recording it as it is heard on the speakers, memory curtaining prevents it from being dumped to regular disk files as it is being worked on, sealed storage curtails unauthorized access to it when saved to the hard drive, and remote attestation protects it from unauthorized software even when it is used on other computers.

Applications for Trusted Computing

Protecting hard-drive data after theft

Windows Vista Ultimate and Enterprise make use of a Trusted Platform Module to facilitate BitLocker Drive Encryption.[12] The Trusted Platform Module is used to securely bootstrap and access decryption keys for volume-level hard drive encryption. This is done via the Trusted Platform Module's Platform Configuration Registers. As the computer starts up, a series of validations occurs on the BIOS, the master boot record, the boot sector and so on, until the decryption keys can be retrieved from the Trusted Platform Module and used to decrypt the hard drive as needed. This use of the TPM mitigates some attacks on the data of a stolen or lost laptop, such as simply plugging the hard drive into a different system, booting a different operating system, or attempting to modify the boot code.
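
The chain of validations follows the TPM's "extend" operation: each boot stage is hashed into a Platform Configuration Register as PCR = H(PCR || H(stage)), so a change anywhere in the chain produces a different final value and the key is withheld. A sketch of these semantics, using SHA-1 as in TPM 1.2, with hypothetical stage contents:

    # Sketch of measured boot: each stage is "extended" into a PCR as
    # PCR = H(PCR || H(stage)), TPM 1.2 style (SHA-1). A real TPM does
    # this in hardware; the stage contents here are hypothetical.
    import hashlib

    def extend(pcr: bytes, stage: bytes) -> bytes:
        return hashlib.sha1(pcr + hashlib.sha1(stage).digest()).digest()

    pcr = b"\x00" * 20  # PCRs start zeroed at power-on
    for stage in (b"BIOS image", b"master boot record", b"boot sector"):
        pcr = extend(pcr, stage)

    # BitLocker-style policy: release the volume decryption key only if
    # the final PCR value matches the one recorded when the drive was sealed.
    expected = pcr  # recorded at sealing time
    print("release key" if pcr == expected else "withhold key")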

The Enforcer is a Linux Security Module designed to improve the integrity of a computer running Linux by detecting tampering with the file system. It can interact with "trusted" hardware to provide higher levels of assurance for software and sensitive data. The Enforcer can also work with the TPM to store the secret for an encrypted loopback file system and unmount that file system when a tampered file is detected; the secret will not be available to mount the loopback file system again until the machine has been rebooted with untampered files. This allows sensitive data to be protected from an attacker.
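
The Enforcer's integrity check can be sketched as comparing current file hashes against a manifest of known-good values and revoking access on any mismatch; the paths and digest values below are hypothetical.

    # Sketch of Enforcer-style integrity checking: compare files against a
    # manifest of known-good hashes. Paths and digests are hypothetical.
    import hashlib

    MANIFEST = {
        "/bin/login": "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
        "/etc/passwd": "fcde2b2edba56bf408601fb721fe9b5c338d10ee429ea04fae5511b68fbf8fb9",
    }

    def sha256_of(path: str) -> str:
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def filesystem_untampered() -> bool:
        return all(sha256_of(p) == h for p, h in MANIFEST.items())

    if not filesystem_untampered():
        # In the real module the TPM-held secret becomes unavailable and the
        # encrypted loopback file system is unmounted until a clean reboot.
        print("tampering detected: unmounting protected file system")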

Possible applications for Trusted Computing

Digital Rights Management

Trusted Computing would allow companies to create an almost unbreakable DRM system. An example is downloading a music file. Remote attestation could be used so that the music file would refuse to play except on a specific music player that enforces the record company's rules. Sealed storage would prevent the user from opening the file with another player or another computer. The music would be played in curtained memory, which would prevent the user from making an unrestricted copy of the file while it's playing, and secure I/O would prevent capturing what is being sent to the sound system.

Preventing cheating in online games

Trusted computing could be used to combat cheating in online games. Some players modify their game copy in order to gain unfair advantages in the game; remote attestation, secure I/O and memory curtaining could be used to verify that all players connected to a server were running an unmodified copy of the software.

Protection from identity theft

Trusted Computing could be used to prevent identity theft. Take, for example, online banking. Remote attestation could be used when the user connects to the bank's server, with the page served only if the server could produce the correct certificates. The user could then send his encrypted account number and PIN with some assurance that the information is private to him and the bank.

Protection from viruses and spyware

Digital signing of software will allow users to identify applications that have been modified by third parties to add spyware. For example, a website might offer a modified version of a popular instant messenger containing spyware as a drive-by download. The operating system could notice the lack of a valid signature for such versions and inform the user that the program has been modified. Of course, this leaves open the question of who determines whether a signature is valid.
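
Signature checking of a downloaded program can be sketched as below. The publisher key is generated on the spot for illustration; a real system would ship trusted publisher certificates rather than raw keys, and all names are hypothetical.

    # Sketch of verifying a publisher's signature on a program before it
    # runs. The key and program bytes are hypothetical; real systems use
    # certificate chains rather than a raw public key.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    publisher_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    program = b"instant messenger binary bytes"
    signature = publisher_key.sign(program, padding.PKCS1v15(), hashes.SHA256())

    # Operating-system side: warn if the bytes no longer match the signature.
    downloaded = program + b" + bundled spyware"   # modified by a third party
    try:
        publisher_key.public_key().verify(signature, downloaded,
                                          padding.PKCS1v15(), hashes.SHA256())
        print("signature valid: program unmodified")
    except InvalidSignature:
        print("warning: this program has been modified since it was signed")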

Trusted Computing might allow increased protection from viruses, although Microsoft has denied that this functionality will be present in its NGSCB architecture. A possible improvement in virus protection would be to allow antivirus vendors to write software that could not be corrupted by virus attacks. However, as with most advanced uses of Trusted Computing technology, preventing software corruption necessitates a trusted operating system, such as Trusted Gentoo. In practice, any operating system which aims to be backwards compatible with existing software will not be able to protect against viruses in this way.

Biometric ATM in South Korea

Protection of biometric authentication data

Biometric devices used for authentication could use trusted computing technologies (memory curtaining, secure I/O) to assure the user that no spyware installed on his/her PC is able to steal sensitive biometric data. The theft of this data could be extremely harmful to the user because while a user can change a password if he or she knows that the password is no longer secure, a user cannot change the data generated by a biometric device.

Verification of remote computation for grid computing

Trusted Computing could be used to guarantee that participants in a grid computing system are returning the results of the computations they claim to be performing rather than forging them. This would allow large-scale simulations (say, a climate simulation) to be run without expensive redundant computations to guarantee that malicious hosts are not undermining the results to reach the conclusions they want.[13]

Criticism of Trusted Computing

Trusted Computing opponents such as the Electronic Frontier Foundation and the Free Software Foundation believe that trust in the underlying companies is not deserved and that the technology puts too much power and control into the hands of those who design systems and software. They also believe that it may cause consumers to lose anonymity in their online interactions, as well as mandate technologies its opponents deem unnecessary. Finally, Trusted Computing is seen by them as a possible enabler for future versions of mandatory access control, copy protection, and Digital Rights Management, all criticized for enabling undue censorship.

Some security experts[14][10] have spoken out against Trusted Computing, believing it will provide computer manufacturers and software authors with increased control to impose restrictions on what users are able to do with their computers. There are also concerns that Trusted Computing would have an anti-competitive effect on the IT market.

There is concern amongst critics that it will not always be possible to examine the hardware components on which Trusted Computing relies, namely the Trusted Platform Module, the hardware system in which the core "root" of trust in the platform has to lie. If not implemented correctly, it presents a security risk to overall platform integrity and protected data. The specifications, as published by the Trusted Computing Group, are open and available for anyone to review; however, the final implementations by commercial vendors will not necessarily be subjected to the same review process. In addition, cryptography often moves quickly, and hardware implementations of algorithms might become inadvertently obsolete.

While the promise of Trusted Computing is to increase security, critics counter that not only will security not be improved, but Trusted Computing will facilitate mandatory Digital Rights Management, invade privacy, and impose other restrictions on users. Entrusting networked computers to controlling authorities rather than to individuals may create digital imprimaturs. Contrast Trusted Computing with secure computing, in which anonymity, not disclosure, is the main concern. Advocates of secure computing argue that the additional security can be achieved without relinquishing control of computers from users to superusers.

The Cambridge cryptographer Ross Anderson has great concerns that "TC can support remote censorship [...] In general, digital objects created using TC systems remain under the control of their creators, rather than under the control of the person who owns the machine on which they happen to be stored (as at present) [...] So someone who writes a paper that a court decides is defamatory can be compelled to censor it — and the software company that wrote the word processor could be ordered to do the deletion if she refuses. Given such possibilities, we can expect TC to be used to suppress everything from pornography to writings that criticise political leaders."[15] He goes on to state that:

"[...] software suppliers can make it much harder for you to switch to their competitors' products. At a simple level, Word could encrypt all your documents using keys that only Microsoft products have access to; this would mean that you could only read them using Microsoft products, not with any competing word processor."[15]
"The [...] most important benefit for Microsoft is that TC will dramatically increase the costs of switching away from Microsoft products (such as Office) to rival products (such as OpenOffice). For example, a law firm that wants to change from Office to OpenOffice right now merely has to install the software, train the staff and convert their existing files. In five years' time, once they have received TC-protected documents from perhaps a thousand different clients, they would have to get permission (in the form of signed digital certificates) from each of these clients in order to migrate their files to a new platform. The law firm won't in practice want to do this, so they will be much more tightly locked in, which will enable Microsoft to hike its prices."[15]

Anderson summarizes the case by saying "The fundamental issue is that whoever controls the TC infrastructure will acquire a huge amount of power. Having this single point of control is like making everyone use the same bank, or the same accountant, or the same lawyer. There are many ways in which this power could be abused."[15]

One of the biggest criticisms of the Trusted Computing Group comes from the Free Software Foundation, the foundation responsible for developing the GNU operating system. The Free Software Foundation frequently refers to Trusted Computing as "Treacherous Computing" in its articles.[16] Its website provides an animated short that explains the negative aspects of Trusted Computing.[17]

Users unable to modify software

Consider, for example, a diary program whose entries are protected with sealed storage. Sealed storage protects the diary from malicious programs like viruses, but it does not distinguish between those and useful programs, such as ones that might be used to convert the diary to a new format or provide new methods for searching within it. A user who wanted to switch to a competing diary program might find it impossible for the new program to read the old diary, as the information would be "locked in" to the old program. Sealed storage could also make it impossible for the user to read or modify the diary except as specifically permitted by the diary software; if he or she were using diary software with no edit or delete option, it could be impossible to change or delete previous entries.

Remote attestation could cause other problems. Currently, web sites can be visited using a number of web browsers, though certain websites may be formatted (intentionally or not) such that some browsers cannot decipher their code; some browsers have got around that problem by emulating other browsers. With remote attestation, a site could check which browser is actually running and refuse to serve pages to anything but the one it specifies, so such emulation would no longer work.

Users have no control over information received

One of the early motivations behind Trusted Computing was a desire by media and software corporations for stricter Digital Rights Management technology to prevent users from freely sharing and using potentially copyrighted or private files without explicit permission. Microsoft has announced a DRM technology, PVP-OPM, that it says will make use of hardware encryption.

Trusted Computing can be used for DRM. An example could be downloading a music file from a band: the band's record company could come up with rules for how the band's music can be used. For example, they might want the user to play the file only three times a day without paying additional money. Also, they could use remote attestation to only send their music to a music player that enforces their rules: sealed storage would prevent the user from opening the file with another player that did not enforce the restrictions. Memory curtaining would prevent the user from making an unrestricted copy of the file while it's playing, and secure output would prevent capturing what is sent to the sound system.

Once digital recordings are converted to analog signals, the (possibly degraded) signals could be recorded by conventional means, such as by connecting an audio recorder to the card instead of speakers, or by recording the speaker sounds with a microphone. Even trusted computing cannot prevent the analog hole.

Without remote attestation, this problem would not exist. The user could simply download the song with a player that did not enforce the DRM restrictions, or one that lets him convert the song to a normal unrestricted format such as MP3 or Vorbis.

Users have no control over data

One commonly stated criticism of Trusted Computing is that sealed storage could prevent users from moving sealed files to a new computer. This limitation might exist through poor software design or through deliberate limitations placed by publishers of works. The migration section of the TPM specification requires that it be impossible to move certain kinds of files except to a computer with an identical make and model of security chip. If an old model of chip is no longer produced, it becomes impossible to move the data to a new machine at all; the data is forced to die along with the old computer.

Users unable to override

Some opponents of Trusted Computing advocate "owner override": using the secure I/O path to make sure the owner is physically present, and then allowing him or her to bypass restrictions. Such an override would allow remote attestation to a user's specification, e.g., to create certificates that say Internet Explorer is running even if a different browser is used. Instead of preventing software change, remote attestation would indicate when the software has been changed without the owner's permission.

Trusted Computing Group members have refused to implement owner override.[18] Proponents of Trusted Computing believe that owner override defeats the trust in other computers, since remote attestation could then be forged by the owner. Owner override offers the security and enforcement benefits to the machine's owner, but does not allow him to trust other computers, because their owners could waive rules or restrictions on their own machines. Under this scenario, once data is sent to someone else's computer, whether it be a diary, a DRM music file, or a joint project, that other person controls what security, if any, their computer will enforce on their copy of that data. This has the potential to undermine the applications of Trusted Computing that enforce Digital Rights Management, control cheating in online games, and attest to remote computations for grid computing.

According to the Electronic Frontier Foundation, one of the fundamental premises behind trusted computing is that the owner and his software cannot be trusted.[19] It is assumed that the user will — through negligence or willful intent — take actions that may result in compromising his own system. For example, an IT administrator could not ensure that notebook computers are running a specified operating system. An alternative approach would be to require a cryptographic key provided with the computer in order to engage the override. This would require that the owner authorize the override rather than merely someone who has physical access to the computer and may satisfy the IT administrator, but it would reduce the usefulness of Trusted Computing in Digital Rights Management or other applications that require Remote Attestation against the will of the computer owner.

Loss of anonymity

Because a Trusted Computing-equipped computer is able to uniquely attest to its own identity, it will be possible for vendors and others who possess the ability to use the attestation feature to zero in on the identity of the user of TC-enabled software with a high degree of certainty.

Such a capability is contingent on the reasonable chance that the user at some time provides user-identifying information, whether voluntarily or indirectly. One common way that information can be obtained and linked is when a user registers a computer just after purchase. Another common way is when a user provides identifying information to the website of an affiliate of the vendor.

While proponents of TC point out that online purchases and credit transactions could potentially be more secure as a result of the remote attestation capability, this may cause the computer user to lose expectations of anonymity when using the Internet.

Critics point out that this could have a chilling effect on political free speech, the ability of journalists to use anonymous sources, whistle blowing, political blogging and other areas where the public needs protection from retaliation through anonymity.

In response to privacy concerns, researchers developed direct anonymous attestation which allows a client to perform attestation while limiting the amount of identifying information that is provided to the verifier.

Practicality

It has also been argued that many of the assumptions which underlie TC are impractical in everyday use.

Any hardware component, including the TC hardware itself, has the potential to fail or to be upgraded and replaced. A user might rightfully conclude that the mere possibility of being irrevocably cut off from access to his or her own information, or to years' worth of expensive work products, with no opportunity for recovery, is unacceptable. It has also been argued that legal restrictions on the use and dissemination of information, or mandates that it be reliably stored for periods extending many years into the future, may preclude the practical application of TC technology in many of the ways now contemplated. Basing ownership or usage restrictions on the verifiable identity of a particular piece of computing hardware may also be perceived by the user as problematic if the equipment in question malfunctions.

Interoperability issues

Trusted Computing requires all software and hardware vendors to follow the technical specifications released by the Trusted Computing Group in order to allow interoperability between different trusted software stacks. However, even now there are interoperability problems between the TrouSerS trusted software stack (released as open source software by IBM) and Hewlett-Packard's stack.[20] Another problem is that the technical specifications are still changing, so it is unclear which is the standard implementation of the trusted stack.

Hardware and software support

  • Since 2004, most major manufacturers have shipped systems (usually laptops) that have included Trusted Platform Modules, with associated BIOS support.[21] In accordance with the TCG specifications, the user must enable the Trusted Platform Module before it can be used.
  • The Linux kernel has included trusted computing support since version 2.6.13, and there are several projects to implement trusted computing for Linux. In January 2005, members of Gentoo Linux's "crypto herd" announced their intention of providing support for TC - in particular support for the Trusted Platform Module.[22] There is also a TCG-compliant software stack for Linux named TrouSerS, released under an open source license.
  • Some limited form of trusted computing can be implemented on current versions of Microsoft Windows with third-party software.
  • The Intel Classmate PC (a competitor to the One Laptop Per Child) includes a Trusted Platform Module.[23]
