Entry | Trusted Computing |
Definition |
Trusted Computing (TC) is a technology developed and promoted by the Trusted Computing Group.[1] The term is taken from the field of trusted systems and has a specialized meaning. With Trusted Computing, the computer will consistently behave in expected ways, and those behaviors will be enforced by computer hardware and software.[1] Enforcing this behavior is achieved by loading the hardware with a unique encryption key that is inaccessible to the rest of the system.

TC is controversial because the hardware is not only secured for its owner, but also secured against its owner. Such controversy has led opponents of Trusted Computing, such as free software activist Richard Stallman, to refer to it instead as "treacherous computing",[2] even to the point where some scholarly articles have begun to place scare quotes around "trusted computing".[3][4]

Trusted Computing proponents such as International Data Corporation,[5] the Enterprise Strategy Group[6] and Endpoint Technologies Associates[7] claim the technology will make computers safer, less prone to viruses and malware, and thus more reliable from an end-user perspective. They also claim that Trusted Computing will allow computers and servers to offer improved computer security over that which is currently available. Opponents often claim this technology will be used primarily to enforce digital rights management policies and not to increase computer security.[2][10]

Chip manufacturers Intel and AMD, hardware manufacturers such as HP and Dell, and operating system providers such as Microsoft include Trusted Computing in their products when it is enabled.[8][9] The U.S. Army requires that every new PC it purchases come with a Trusted Platform Module (TPM).[10][11] As of July 3, 2007, so does virtually the entire United States Department of Defense.[12]

==Key concepts==
Trusted Computing encompasses six key technology concepts, all of which are required for a fully trusted system, that is, a system compliant with the TCG specifications:
===Endorsement key===
The endorsement key is a 2048-bit RSA public and private key pair that is created randomly on the chip at manufacture time and cannot be changed. The private key never leaves the chip, while the public key is used for attestation and for encryption of sensitive data sent to the chip, as occurs during the TPM_TakeOwnership command.[13]

This key is used to allow the execution of secure transactions: every Trusted Platform Module (TPM) is required to be able to sign a random number (in order to allow the owner to show that he has a genuine trusted computer), using a particular protocol created by the Trusted Computing Group (the direct anonymous attestation protocol) in order to ensure its compliance with the TCG standard and to prove its identity; this makes it impossible for a software TPM emulator with an untrusted endorsement key (for example, a self-generated one) to start a secure transaction with a trusted entity. The TPM should be designed to make the extraction of this key by hardware analysis hard, but tamper resistance is not a strong requirement.

===Memory curtaining===
Memory curtaining extends common memory protection techniques to provide full isolation of sensitive areas of memory (for example, locations containing cryptographic keys). Even the operating system does not have full access to curtained memory. The exact implementation details are vendor specific.

===Sealed storage===
Sealed storage protects private information by binding it to platform configuration information, including the software and hardware being used. This means the data can be released only to a particular combination of software and hardware. Sealed storage can be used to enforce DRM. For example, users who keep a song on their computer that has not been licensed to be listened to will not be able to play it. Currently, a user can locate the song, listen to it, send it to someone else, play it in the software of their choice, or back it up (and in some cases, use circumvention software to decrypt it). Alternatively, the user may use software to modify the operating system's DRM routines to have it leak the song data once, say, a temporary license has been acquired. Using sealed storage, the song is securely encrypted using a key bound to the trusted platform module, so that only the unmodified and untampered music player on that user's computer can play it. In this DRM architecture, this might also prevent people from listening to the song after buying a new computer, or upgrading parts of their current one, except after explicit permission of the vendor of the song.

===Remote attestation===
Remote attestation allows changes to the user's computer to be detected by authorized parties. For example, software companies can identify unauthorized changes to software, including users tampering with their software to circumvent technological protection measures. It works by having the hardware generate a certificate stating what software is currently running. The computer can then present this certificate to a remote party to show that unaltered software is currently executing. Remote attestation is usually combined with public-key encryption so that the information sent can only be read by the programs that presented and requested the attestation, and not by an eavesdropper.
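The exchange can be sketched in a few lines of illustrative Python. This is a simplified model, not the actual TPM quote format or command set; the measurement values, function names, and the use of the third-party cryptography package are assumptions made for the example. The platform signs a digest of its measurement registers together with a verifier-supplied nonce, and the verifier checks both the signature and the expected measurements.

<syntaxhighlight lang="python">
# Simplified sketch of a remote-attestation exchange (not the real TPM quote format).
# Requires the third-party "cryptography" package.
import hashlib
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Stand-in for the attestation key held inside the TPM; on a real platform the
# private half never leaves the chip.
attestation_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

def quote(pcr_values, nonce):
    """Platform side: sign a digest of the measurement registers plus the verifier's nonce."""
    digest = hashlib.sha256(b"".join(pcr_values) + nonce).digest()
    return digest, attestation_key.sign(digest, padding.PKCS1v15(), hashes.SHA256())

def verify(public_key, expected_pcrs, nonce, digest, signature):
    """Verifier side: accept only if the signature is valid and the measurements match."""
    public_key.verify(signature, digest, padding.PKCS1v15(), hashes.SHA256())  # raises on failure
    return digest == hashlib.sha256(b"".join(expected_pcrs) + nonce).digest()

pcrs = [hashlib.sha256(b"bootloader v1").digest(), hashlib.sha256(b"media player v2.3").digest()]
nonce = os.urandom(16)                       # fresh nonce prevents replay of an old quote
digest, sig = quote(pcrs, nonce)
print(verify(attestation_key.public_key(), pcrs, nonce, digest, sig))  # True only for unaltered software
</syntaxhighlight>

In a real deployment the attestation key would itself be certified, so the verifier knows it belongs to a genuine TPM; that certification step is what the trusted third party discussion below describes.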
To take the song example again, the user's music player software could send the song to other machines, but only if they could attest that they were running a secure copy of the music player software. Combined with the other technologies, this provides a more secured path for the music: secure I/O prevents the user from recording it as it is transmitted to the audio subsystem, memory curtaining prevents it from being dumped to regular disk files as it is being worked on, sealed storage curtails unauthorized access to it when saved to the hard drive, and remote attestation protects it from unauthorized software even when it is used on other computers. Sending remote attestation data to a trusted third party, however, has been discouraged in favour of Direct Anonymous Attestation.

Proof of space (PoSpace) has been proposed for malware detection, by determining whether the L1 cache of a processor is empty (i.e., has enough space to evaluate the PoSpace routine without cache misses) or contains a routine that resisted being evicted.[14][15]
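As a concrete illustration of the sealing step in the song example above, the following minimal sketch binds a content key to a set of platform measurements. The measurement values, function names, and use of the cryptography package's AES-GCM primitive are assumptions made for the example; this is not the real TPM sealing interface.

<syntaxhighlight lang="python">
# Minimal sketch of sealed storage: data encrypted under a key derived from the
# platform's measurement registers can only be unsealed while the platform
# reports the same measurements. (Illustrative only; not the TPM command set.)
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def sealing_key(pcr_values):
    """Derive a 256-bit key from the concatenated measurement registers."""
    return hashlib.sha256(b"".join(pcr_values)).digest()

def seal(secret, pcr_values):
    nonce = os.urandom(12)
    return nonce + AESGCM(sealing_key(pcr_values)).encrypt(nonce, secret, None)

def unseal(blob, pcr_values):
    """Raises an exception if the current measurements differ from those at sealing time."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(sealing_key(pcr_values)).decrypt(nonce, ciphertext, None)

trusted_state = [hashlib.sha256(b"bootloader v1").digest(),
                 hashlib.sha256(b"unmodified media player").digest()]
blob = seal(b"per-song content key", trusted_state)
assert unseal(blob, trusted_state) == b"per-song content key"      # same platform state: succeeds
# unseal(blob, [hashlib.sha256(b"patched media player").digest()]) # different state: fails
</syntaxhighlight>

Upgrading hardware or modifying the player changes the measurements, which is why, as noted above, resealing the content typically requires the vendor's cooperation.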
===Trusted third party===
One of the main obstacles that had to be overcome by the developers of the TCG technology was how to maintain anonymity while still providing a "trusted platform". The main objective of obtaining "trusted mode" is that the other party (Bob), with whom a computer (Alice) may be communicating, can trust that Alice is running untampered hardware and software. This will assure Bob that Alice will not be able to use malicious software to compromise sensitive information on the computer. Unfortunately, in order to do this, Alice has to inform Bob that she is using registered and "safe" software and hardware, thereby potentially uniquely identifying herself to Bob.

This might not be a problem where one wishes to be identified by the other party, e.g., during banking transactions over the Internet. But in many other types of communicating activities people enjoy the anonymity that the computer provides. The TCG acknowledges this, and has allegedly developed a process of attaining such anonymity while at the same time assuring the other party that it is communicating with a "trusted" party. This was done by developing a "trusted third party". This entity works as an intermediary between a user and their own computer and between a user and other users. The focus here is on the latter process, referred to as remote attestation.

When a user requires an AIK (Attestation Identity Key), the user wants the key to be certified by a CA (Certification Authority). Through a TPM (Trusted Platform Module), the user sends three credentials: a public key credential, a platform credential, and a conformance credential. This set of certificates and cryptographic keys is referred to in short as the "EK". The EK can be split into two main parts, the private part "EKpr" and the public part "EKpub". The EKpr never leaves the TPM. Disclosure of the EKpub is, however, necessary (in version 1.1). The EKpub uniquely identifies the endorser of the platform, the model, what kind of software is currently being used on the platform, details of the TPM, and that the platform (PC) complies with the TCG specifications.

If this information is communicated directly to another party as part of the process of getting trusted status, it would at the same time be impossible to obtain an anonymous identity. Therefore, this information is sent to the privacy certification authority (a trusted third party). When the privacy CA receives the EKpub sent by the TPM, it verifies the information. If the information can be verified, it creates a certified secondary key pair, the AIK, and sends this credential back to the requestor. This is intended to provide the user with anonymity. When the user has this certified AIK, he or she can use it to communicate with other trusted platforms.

In version 1.2, the TCG has developed a new method of obtaining a certified AIK. This process is called DAA (Direct Anonymous Attestation). This method does not require the user to disclose the EKpub to the TTP. The unique new feature of DAA is that it can convince the remote entity that a particular TPM (Trusted Platform Module) is a valid TPM without disclosing the EKpub or any other unique identifier. Before the TPM can send a certification request for an AIK to the remote entity, the TPM has to generate a set of DAA credentials. This can only be done by interacting with an issuer. The DAA credentials are created from a TPM-unique secret that remains within the TPM. The TPM secret is similar but not analogous to the EK. When the TPM has obtained a set of DAA credentials, it can send these to the Verifier. When the Verifier receives the DAA credentials from the TTP, it verifies them and sends a certified AIK back to the user. The user is then able to communicate with other trusted parties using the certified AIK. The Verifier may or may not be a trusted third party (TTP). The Verifier can determine whether the DAA credentials are valid, but the DAA credentials do not contain any unique information that discloses the TPM platform.

An example would be where a user wants trusted status and sends a request to the Issuer. The Issuer could be the manufacturer of the user's platform, e.g. Compaq. Compaq would check whether the TPM it has produced is a valid one, and if so, issue DAA credentials. In the next step, the DAA credentials are sent by the user to the Verifier. As mentioned, this might be a standard TTP, but could also be a different entity. If the Verifier accepts the DAA credentials supplied, it will produce a certified AIK. The certified AIK will then be used by the user to communicate with other trusted platforms. In summary, the new version introduces a separate entity that assists in the anonymous attestation process. By introducing the Issuer, which supplies the DAA credentials, one is able to sufficiently protect the user's anonymity towards the Verifier/TTP. The Issuer will most commonly be the platform manufacturer. Without such credentials, it will probably be difficult for a private customer, small business, or organization to convince others that they have a genuine trusted platform.
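The version 1.1 privacy-CA flow described above can be summarized in a short hypothetical sketch. The manufacturer list, message fields, and "certificate" format here are invented for illustration; the real protocol involves additional credentials and encrypts the response to the endorsement key.

<syntaxhighlight lang="python">
# Hypothetical sketch of the v1.1 privacy-CA (AIK certification) flow described above.
# Requires the third-party "cryptography" package.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

def pub_bytes(key):
    return key.public_key().public_bytes(Encoding.DER, PublicFormat.SubjectPublicKeyInfo)

# Endorsement keys of TPMs the privacy CA recognizes as genuine (a stand-in for the
# manufacturer credentials it would really check).
ek = rsa.generate_private_key(public_exponent=65537, key_size=2048)
known_genuine_eks = {pub_bytes(ek)}

privacy_ca_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

def certify_aik(ek_pub_der, aik_pub_der):
    """Privacy CA: check the EKpub belongs to a genuine TPM, then certify only the AIK.
    The returned 'certificate' names the AIK but not the EK, giving the platform a
    pseudonymous identity."""
    if ek_pub_der not in known_genuine_eks:
        raise ValueError("unknown endorsement key")
    return privacy_ca_key.sign(aik_pub_der, padding.PKCS1v15(), hashes.SHA256())

# Platform side: generate an AIK and ask the privacy CA to certify it.
aik = rsa.generate_private_key(public_exponent=65537, key_size=2048)
aik_cert = certify_aik(pub_bytes(ek), pub_bytes(aik))

# A remote verifier trusts quotes signed by the AIK after checking the CA's signature,
# without ever seeing the EKpub itself.
privacy_ca_key.public_key().verify(aik_cert, pub_bytes(aik), padding.PKCS1v15(), hashes.SHA256())
</syntaxhighlight>

Direct Anonymous Attestation (version 1.2) removes the need to reveal the EKpub to such an intermediary at all, as described above.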
==Known applications==
The Microsoft products Windows Vista, Windows 7, Windows 8 and Windows RT make use of a Trusted Platform Module to facilitate BitLocker Drive Encryption.[16]

==Possible applications==

===Digital rights management===
Trusted Computing would allow companies to create a digital rights management (DRM) system which would be very hard to circumvent, though not impossible. An example is downloading a music file. Sealed storage could be used to prevent the user from opening the file with an unauthorized player or computer. Remote attestation could be used to authorize play only by music players that enforce the record company's rules. The music would be played from curtained memory, which would prevent the user from making an unrestricted copy of the file while it is playing, and secure I/O would prevent capturing what is being sent to the sound system. Circumventing such a system would require either manipulating the computer's hardware, capturing the analogue (and thus degraded) signal using a recording device or a microphone, or breaking the security of the system.

New business models for the use of software (services) over the Internet may be boosted by the technology. By strengthening the DRM system, one could base a business model on renting programs for specific time periods or on "pay as you go" models. For instance, one could download a music file which could only be played a certain number of times before it becomes unusable, or the music file could be used only within a certain time period.

===Preventing cheating in online games===
Trusted Computing could be used to combat cheating in online games. Some players modify their game copy in order to gain unfair advantages in the game; remote attestation, secure I/O and memory curtaining could be used to determine that all players connected to a server were running an unmodified copy of the software.[17]

===Verification of remote computation for grid computing===
Trusted Computing could be used to guarantee that participants in a grid computing system are returning the results of the computations they claim to be, instead of forging them. This would allow large-scale simulations to be run (say, a climate simulation) without expensive redundant computations to guarantee that malicious hosts are not undermining the results to achieve the conclusion they want.[18]
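One way a grid scheduler might act on this is sketched below, assuming an attestation check like the quote/verify pair shown earlier. The job structure, the expected "golden" measurement, and the policy of falling back to redundant recomputation are all illustrative assumptions, not part of any TCG specification.

<syntaxhighlight lang="python">
# Illustrative policy sketch: accept a grid worker's result directly when it arrives
# with a valid attestation over the expected software measurement; otherwise fall
# back to redundant execution. The TPM quote verification itself is abstracted away
# (see the earlier quote/verify sketch) and represented by the quote_ok flag.
import hashlib
from dataclasses import dataclass
from typing import Callable

EXPECTED_MEASUREMENT = hashlib.sha256(b"climate-sim worker v1.4").digest()  # assumed golden value

@dataclass
class WorkerResult:
    job_id: str
    value: float
    measurement: bytes      # software measurement reported in the worker's quote
    quote_ok: bool          # whether the quote's signature verified

def accept(result: WorkerResult) -> bool:
    """Trust the result only if it comes from an attested, unmodified worker."""
    return result.quote_ok and result.measurement == EXPECTED_MEASUREMENT

def schedule(results: list[WorkerResult], rerun: Callable[[str], float]) -> dict[str, float]:
    accepted = {}
    for r in results:
        # Unattested or modified workers trigger an expensive redundant recomputation.
        accepted[r.job_id] = r.value if accept(r) else rerun(r.job_id)
    return accepted

# Example: one attested worker, one whose reported software does not match.
results = [WorkerResult("cell-17", 3.14, EXPECTED_MEASUREMENT, True),
           WorkerResult("cell-18", 2.72, hashlib.sha256(b"patched worker").digest(), True)]
print(schedule(results, rerun=lambda job_id: 0.0))  # cell-18 is recomputed (stubbed here as 0.0)
</syntaxhighlight>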
==Criticism==
Trusted Computing opponents such as the Electronic Frontier Foundation and the Free Software Foundation claim that trust in the underlying companies is not deserved and that the technology puts too much power and control into the hands of those who design systems and software. They also believe that it may cause consumers to lose anonymity in their online interactions, as well as mandating technologies Trusted Computing opponents say are unnecessary. They suggest Trusted Computing as a possible enabler for future versions of mandatory access control, copy protection, and DRM.

Some security experts[19][20] have spoken out against Trusted Computing, believing it will provide computer manufacturers and software authors with increased control to impose restrictions on what users are able to do with their computers. There are concerns that Trusted Computing would have an anti-competitive effect on the IT market.[10]

There is concern amongst critics that it will not always be possible to examine the hardware components on which Trusted Computing relies, the Trusted Platform Module, which is the ultimate hardware system where the core 'root' of trust in the platform has to reside.[10] If not implemented correctly, it presents a security risk to overall platform integrity and protected data. The specifications, as published by the Trusted Computing Group, are open and are available for anyone to review. However, the final implementations by commercial vendors will not necessarily be subjected to the same review process. In addition, the world of cryptography can often move quickly, and hardware implementations of algorithms might create an inadvertent obsolescence. Trusting networked computers to controlling authorities rather than to individuals may create digital imprimaturs.

Cryptographer Ross Anderson, of the University of Cambridge, has great concerns that:[21]

TC can support remote censorship [...] In general, digital objects created using TC systems remain under the control of their creators, rather than under the control of the person who owns the machine on which they happen to be stored [...] So someone who writes a paper that a court decides is defamatory can be compelled to censor it — and the software company that wrote the word processor could be ordered to do the deletion if she refuses. Given such possibilities, we can expect TC to be used to suppress everything from pornography to writings that criticize political leaders.

He goes on to state that:

[...] software suppliers can make it much harder for you to switch to their competitors' products. At a simple level, Word could encrypt all your documents using keys that only Microsoft products have access to; this would mean that you could only read them using Microsoft products, not with any competing word processor. [...] The [...] most important benefit for Microsoft is that TC will dramatically increase the costs of switching away from Microsoft products (such as Office) to rival products (such as OpenOffice). For example, a law firm that wants to change from Office to OpenOffice right now merely has to install the software, train the staff and convert their existing files. In five years' time, once they have received TC-protected documents from perhaps a thousand different clients, they would have to get permission (in the form of signed digital certificates) from each of these clients in order to migrate their files to a new platform. The law firm won't in practice want to do this, so they will be much more tightly locked in, which will enable Microsoft to hike its prices.

Anderson summarizes the case by saying:

The fundamental issue is that whoever controls the TC infrastructure will acquire a huge amount of power. Having this single point of control is like making everyone use the same bank, or the same accountant, or the same lawyer. There are many ways in which this power could be abused.

===Digital rights management===
One of the early motivations behind trusted computing was a desire by media and software corporations for stricter DRM technology to prevent users from freely sharing and using potentially copyrighted or private files without explicit permission. An example could be downloading a music file from a band: the band's record company could come up with rules for how the band's music can be used. For example, they might want the user to play the file only three times a day without paying additional money. Also, they could use remote attestation to send their music only to a music player that enforces their rules: sealed storage would prevent the user from opening the file with another player that did not enforce the restrictions. Memory curtaining would prevent the user from making an unrestricted copy of the file while it is playing, and secure output would prevent capturing what is sent to the sound system.

===Users unable to modify software===
A user who wanted to switch to a competing program might find that it would be impossible for that new program to read old data, as the information would be "locked in" to the old program.
It could also make it impossible for the user to read or modify their data except as specifically permitted by the software.

Remote attestation could cause other problems. Currently, web sites can be visited using a number of web browsers, though certain websites may be formatted such that some browsers cannot decipher their code. Some browsers have found a way to get around that problem by emulating other browsers. With remote attestation, a website could check the internet browser being used and refuse to display on any browser other than the specified one (like Internet Explorer), so even emulating the browser would not work.

===Users unable to exercise legal rights===
The law in many countries allows users certain rights over data whose copyright they do not own (including text, images, and other media), often under headings such as fair use or public interest. Depending on jurisdiction, these may cover issues such as whistleblowing, production of evidence in court, quoting or other small-scale usage, backups of owned media, and making a copy of owned material for personal use on other owned devices or systems. The steps implicit in trusted computing have the practical effect of preventing users from exercising these legal rights.[2]

===Users vulnerable to vendor withdrawal of service===
A service that requires external validation or permission, such as a music file or game that requires connection with the vendor to confirm permission to play or use, is vulnerable to that service being withdrawn or no longer updated. A number of incidents have already occurred where users, having purchased music or video media, have found their ability to watch or listen to it suddenly stop due to vendor policy or cessation of service,[22][23][24] or server inaccessibility,[25] at times with no compensation.[26] Alternatively, in some cases the vendor refuses to provide services in the future, which leaves purchased material usable only on the present (and increasingly obsolete) hardware for as long as it lasts, but not on any hardware that may be purchased in the future.[22]

===Users unable to override===
Some opponents of Trusted Computing advocate "owner override": allowing an owner who is confirmed to be physically present to allow the computer to bypass restrictions and use the secure I/O path. Such an override would allow remote attestation to a user's specification, e.g., to create certificates that say Internet Explorer is running even if a different browser is used. Instead of preventing software change, remote attestation would indicate when the software has been changed without the owner's permission.

Trusted Computing Group members have refused to implement owner override.[27] Proponents of trusted computing believe that owner override defeats the trust in other computers, since remote attestation could then be forged by the owner. Owner override offers the security and enforcement benefits to a machine owner, but does not allow the owner to trust other computers, because their owners could waive rules or restrictions on their own computers. Under this scenario, once data is sent to someone else's computer, whether it be a diary, a DRM music file, or a joint project, that other person controls what security, if any, their computer will enforce on their copy of that data. This has the potential to undermine the applications of trusted computing to enforce DRM, control cheating in online games and attest to remote computations for grid computing.
===Loss of anonymity===
Because a Trusted Computing equipped computer is able to uniquely attest to its own identity, it will be possible for vendors and others who possess the ability to use the attestation feature to zero in on the identity of the user of TC-enabled software with a high degree of certainty. Such a capability is contingent on the reasonable chance that the user at some time provides user-identifying information, whether voluntarily, indirectly, or simply through inference of many seemingly benign pieces of data (e.g., search records, as shown through simple study of the AOL search records leak[28]). One common way that information can be obtained and linked is when a user registers a computer just after purchase. Another common way is when a user provides identifying information to the website of an affiliate of the vendor.

While proponents of TC point out that online purchases and credit transactions could potentially be more secure as a result of the remote attestation capability, this may cause the computer user to lose expectations of anonymity when using the Internet. Critics point out that this could have a chilling effect on political free speech, the ability of journalists to use anonymous sources, whistleblowing, political blogging and other areas where the public needs protection from retaliation through anonymity.

The TPM specification offers features and suggested implementations that are meant to address the anonymity requirement. By using a third-party Privacy Certification Authority (PCA), the information that identifies the computer could be held by a trusted third party. Additionally, the use of direct anonymous attestation (DAA), introduced in TPM v1.2, allows a client to perform attestation while not revealing any personally identifiable or machine information.

The kind of data that must be supplied to the TTP in order to get trusted status is at present not entirely clear, but the TCG itself admits that "attestation is an important TPM function with significant privacy implications".[29] It is, however, clear that both static and dynamic information about the user's computer may be supplied (via the EKpub) to the TTP under v1.1b;[30] it is not clear what data will be supplied to the "verifier" under v1.2. The static information will uniquely identify the endorser of the platform, the model, details of the TPM, and that the platform (PC) complies with the TCG specifications. The dynamic information is described as software running on the computer.[31] If a program like Windows is registered in the user's name, this in turn will uniquely identify the user. Another dimension of privacy-infringing capability might also be introduced with this new technology: how often you use your programs might be among the information provided to the TTP.

In an exceptional, though practical, situation, where a user purchases a pornographic movie on the Internet, the purchaser nowadays must accept the fact that he has to provide credit card details to the provider, thereby possibly risking being identified. With the new technology, a purchaser might also risk someone finding out that he (or she) has watched this pornographic movie 1000 times. This adds a new dimension to the possible privacy infringement. The extent of the data that will be supplied to the TTP/Verifiers is at present not exactly known; only when the technology is implemented and used will we be able to assess the exact nature and volume of the data that is transmitted.
===TCG specification interoperability problems===
Trusted Computing requires that all software and hardware vendors follow the technical specifications released by the Trusted Computing Group in order to allow interoperability between different trusted software stacks. However, since at least mid-2006, there have been interoperability problems between the TrouSerS trusted software stack (released as open source software by IBM) and Hewlett-Packard's stack.[32] Another problem is that the technical specifications are still changing, so it is unclear which is the standard implementation of the trusted stack.

===Shutting out of competing products===
People have voiced concerns that trusted computing could be used to keep or discourage users from running software created by companies outside of a small industry group. Microsoft has received a great deal of bad press surrounding its Palladium software architecture, evoking comments such as "Few pieces of vaporware have evoked a higher level of fear and uncertainty than Microsoft's Palladium", "Palladium is a plot to take over cyberspace", and "Palladium will keep us from running any software not personally approved by Bill Gates".[33] The concerns about trusted computing being used to shut out competition exist within a broader consumer concern about vendors using bundling of products to obscure prices and to engage in anti-competitive practices.[3] Trusted Computing is seen as harmful or problematic to independent and open source software developers.[34]

===Trust===
In widely used public-key cryptography, creation of keys can be done on the local computer and the creator has complete control over who has access to them, and consequently over their own security policies.[35] In some proposed encryption-decryption chips, a private/public key pair is permanently embedded into the hardware when it is manufactured,[36] and hardware manufacturers would have the opportunity to record the key without leaving evidence of doing so. With this key it would be possible to have access to data encrypted with it, and to authenticate as it.[37] It is trivial for a manufacturer to give a copy of this key to the government or the software manufacturers, as the platform must go through steps so that it works with authenticated software.

Therefore, to trust anything that is authenticated by or encrypted by a TPM or a Trusted computer, an end user has to trust the company that made the chip, the company that designed the chip, the companies allowed to make software for the chip, and the ability and interest of those companies not to compromise the whole process.[38] A security breach breaking that chain of trust happened to the SIM card manufacturer Gemalto, which in 2010 was infiltrated by US and British spies, resulting in compromised security of cellphone calls.[39]

It is also critical that one be able to trust that the hardware manufacturers and software developers properly implement trusted computing standards. Incorrect implementation could be hidden from users, and thus could undermine the integrity of the whole system without users being aware of the flaw.[40]

==Hardware and software support==
==References==
1. Chris Mitchell, Trusted Computing, IET, 2005, ISBN 978-0-86341-525-8. https://books.google.com/books?id=9iriBw2AuToC
2. Stallman, Richard. "Can You Trust Your Computer?", gnu.org. Retrieved 12 August 2013. https://www.gnu.org/philosophy/can-you-trust.html
3. Ross Anderson, "Cryptography and Competition Policy - Issues with 'Trusted Computing'", in Economics of Information Security, Advances in Information Security, Vol. 12, April 11, 2006.
4. F. Stajano, "Security for whom? The shifting security assumptions of pervasive computing", Lecture Notes in Computer Science, vol. 2609, pp. 16-27, 2003.
5. Rau, Shane (February 2006). "The Trusted Computing Platform Emerges as Industry's First Comprehensive Approach to IT Security", IDC Executive Brief, International Data Corporation. Retrieved 2007-02-07. https://www.trustedcomputinggroup.org/news/Industry_Data/IDC_448_Web.pdf
6. Oltsik, Jon (January 2006). "Trusted Enterprise Security: How the Trusted Computing Group (TCG) Will Advance Enterprise Security", white paper, Enterprise Strategy Group. Retrieved 2007-02-07. https://www.trustedcomputinggroup.org/news/Industry_Data/ESG_White_Paper.pdf
7. Kay, Roger L. (2006). "How to Implement Trusted Computing: A Guide to Tighter Enterprise Security", Endpoint Technologies Associates. Retrieved 2007-02-07. https://www.trustedcomputinggroup.org/news/Industry_Data/Implementing_Trusted_Computing_RK.pdf
8. "Enhancing IT Security with Trusted Computing Group standards", Dell Power Solutions, November 2006, p. 14. Retrieved 2006-02-07. http://www.dell.com/downloads/global/power/ps4q06-20070160-tcg.pdf ("TPMs [Trusted Platform Modules] from various semiconductor vendors are included on enterprise desktop and notebook systems from Dell and other vendors.")
9. "Trusted Platform Module Services in Windows Vista", Windows Hardware Development Central, Microsoft, 2005-04-25. Retrieved 2007-02-07. http://www.microsoft.com/whdc/system/platform/pcdesign/TPM_secure.mspx (archived 2007-05-15 at https://web.archive.org/web/20070515072944/http://www.microsoft.com/whdc/system/platform/pcdesign/TPM_secure.mspx)
10. Lemos, Robert (2006-07-28). "U.S. Army requires trusted computing", Security Focus. Retrieved 2007-02-07. http://www.securityfocus.com/brief/265
11. "Army CIO/G-6 500-day plan", U.S. Army, October 2006. Retrieved 2007-02-07. http://www.army.mil/ciog6/news/500Day2006Update.pdf ("Strategic goal n. 3, 'deliver a joint netcentric information that enables warfighter decision superiority'")
12. "Encryption of unclassified data". Archived 2007-09-27 at https://web.archive.org/web/20070927060332/http://iase.disa.mil/policy-guidance/dod-dar-tpm-decree07-03-07.pdf
13. Safford, David (2006-10-27). "Take Control of TCPA", Linux Journal. Retrieved 2007-02-07. http://www.linuxjournal.com/article/6633
14. Jakobsson, Markus; Stewart, Guy (2013). "Mobile Malware: Why the Traditional AV Paradigm is Doomed, and How to Use Physics to Detect Undesirable Routines", BlackHat.
15. Markus Jakobsson, "Secure Remote Attestation", Cryptology ePrint Archive. Retrieved January 8, 2018. https://eprint.iacr.org/2018/031.pdf
16. Ferguson, Niels (August 2006). "AES-CBC + Elephant: A Disk Encryption Algorithm for Windows Vista", Microsoft TechNet. Retrieved 2007-02-07. http://download.microsoft.com/download/0/2/3/0238acaf-d3bf-4a6d-b3d6-0a0be4bbb36e/BitLockerCipher200608.pdf
17. Bin Xiao (2007). Autonomic and Trusted Computing: 4th International Conference, ATC 2007, Hong Kong, China, July 11-13, 2007, Proceedings, Springer Science & Business Media, p. 124, ISBN 978-3-540-73546-5. https://books.google.com/books?id=cUhpq98Zb8AC&pg=PA124
18. Mao, Wenbo; Jin, Hai; Martin, Andrew (2005-06-07). "Innovations for Grid Security From Trusted Computing". Retrieved 2007-02-07. http://www.hpl.hp.com/personal/Wenbo_Mao/research/tcgridsec.pdf (archived 2006-08-22 at https://web.archive.org/web/20060822043633/http://www.hpl.hp.com/personal/Wenbo_Mao/research/tcgridsec.pdf)
19. Marson, Ingrid (2006-01-27). "Trusted Computing comes under attack", ZDNet. Retrieved 2007-02-07. http://news.zdnet.co.uk/internet/security/0,39020375,39249368,00.htm
20. Schneier, Bruce (2002-08-15). "Palladium and the TCPA", Crypto-Gram Newsletter. Retrieved 2007-02-07. http://www.schneier.com/crypto-gram-0208.html#1
21. Anderson, Ross (August 2003). "'Trusted Computing' Frequently Asked Questions: TC / TCG / LaGrande / NGSCB / Longhorn / Palladium / TCPA, Version 1.1". Retrieved 2007-02-07. http://www.cl.cam.ac.uk/~rja14/tcpa-faq.html
22. Cheng, Jacqui (2008-04-22). "DRM sucks redux: Microsoft to nuke MSN Music DRM keys", Ars Technica. Retrieved 2014-05-31. https://arstechnica.com/information-technology/2008/04/drm-sucks-redux-microsoft-to-nuke-msn-music-drm-keys/
23. "Yahoo! DRM servers going away", Fudzilla.com, 2008-07-29. Retrieved 2014-05-31. http://www.fudzilla.com/home/item/3495-yahoo-drm-servers-going-away
24. Fisher, Ken (2007-08-13). "Google selleth then taketh away, proving the need for DRM circumvention", Ars Technica. Retrieved 2014-05-31. https://arstechnica.com/tech-policy/2007/08/google-selleth-then-taketh-away-proving-the-need-for-drm-circumvention/
25. Fister, Mister. "Ubisoft Offers Free Goodies as Compensation", Shacknews.com. Retrieved 2014-05-31. http://www.shacknews.com/article/62995/ubisoft-offers-free-goodies-as
26. Bangeman, Eric (2007-11-07). "Major League Baseball's DRM change strikes out with fans", Ars Technica. Retrieved 2014-05-31. https://arstechnica.com/uncategorized/2007/11/major-league-baseballs-drm-change-strikes-out-with-fans/
27. Schoen, Seth (2003-12-01). "Give TCPA an Owner Override", Linux Journal. Retrieved 2007-02-07. http://www.linuxjournal.com/article/7055
28. Barbaro, Michael (2006-08-09). "A Face Is Exposed for AOL Searcher No. 4417749", The New York Times. Retrieved 2013-05-10. https://www.nytimes.com/2006/08/09/technology/09aol.html
29. TPM version 1.2 specification changes, 16.04.04.
30. TPM v1.2 specification changes, 2004.
31. TPM v1.2 specification changes, 2004.
32. "1.7 - I've taken ownership of my TPM under another OS...", TrouSerS FAQ. Retrieved 2007-02-07. http://trousers.sourceforge.net/faq.html#1.7
33. E. W. Felten, "Understanding trusted computing: will its benefits outweigh its drawbacks?", IEEE Security & Privacy, Vol. 1, No. 3, pp. 60-62, 2003.
34. R. Oppliger, R. Rytz, "Does trusted computing remedy computer security problems?", IEEE Security & Privacy, Vol. 3, No. 2, pp. 16-19, 2005.
35. "IEEE P1363: Standard Specifications For Public-Key Cryptography". Retrieved March 9, 2009. Archived December 1, 2014 at https://web.archive.org/web/20141201024245/http://grouper.ieee.org/groups/1363/
36. Tal Garfinkel, Ben Pfaff, Jim Chow, Mendel Rosenblum, Dan Boneh, "Terra: a virtual machine-based platform for trusted computing", ACM SIGOPS Operating Systems Review, Vol. 37, No. 5, pp. 193-206, 2003.
37. These are the functions of the private key in the RSA algorithm.
38. Sullivan, Nick. "Deploying TLS 1.3: the great, the good and the bad (33c3)", media.ccc.de (via YouTube). Retrieved 30 July 2018. https://www.youtube.com/watch?time_continue=1533&v=0opakLwtPWk
39. "The Great SIM Heist: How Spies Stole the Keys to the Encryption Castle", firstlook.org, 2015-02-19. Retrieved 2015-02-27. https://firstlook.org/theintercept/2015/02/19/great-sim-heist
40. Seth Schoen, "Trusted Computing: Promise and Risk", COSPA Knowledge Base: Comparison, selection, & suitability of OSS, April 11, 2006. Archived 2009-03-19 at https://web.archive.org/web/20090319043100/http://pascal.case.unibz.it/handle/2038/871
41. Tony McFadden (March 26, 2006). "TPM Matrix". Archived April 26, 2007 at https://web.archive.org/web/20070426034219/http://www.tonymcfadden.net/tpmvendors_arc.html
42. "Trusted Gentoo", Gentoo Weekly Newsletter, January 31, 2005. Retrieved 2006-05-05. https://lwn.net/Articles/121386/
43. Intel (December 6, 2006). "Product Brief: Classmate PC". Retrieved 2007-01-13. http://download.intel.com/intel/worldahead/pdf/classmatepc_productbrief.pdf?iid=worldahead+ac_cmpc_pdf
44. "Embedded Security Subsystem", thinkwiki.org, 2010-04-12. Retrieved 2014-06-17. http://www.thinkwiki.org/wiki/Embedded_Security_Subsystem
45. "Dell Security Software FAQ". Retrieved 2007-05-24. http://www1.us.dell.com/content/learnmore/learnmore.aspx?c=us&l=en&s=gen&~id=desktop_security&~line=desktops&~mode=popup&~series=optix&~tab=topic
46. "T6: TrustZone Based Trusted Kernel". Retrieved 2015-01-12. http://www.trustkernel.org
47. "Samsung Newsroom". Retrieved 2018-03-07. https://news.samsung.com/global/editorial-protecting-your-mobile-with-samsung-knox