There has been a lot of talk lately about the implications of Trusted Computing for the industry. The idea has sent shockwaves through the computing world. What, you haven’t noticed? I’m not surprised.
Trusted Computing is essentially a platform with checks and balances to make sure everything runs exactly as intended, free from interference by threats from any source, inside or out. On the surface this seems like a great idea. From the very moment your computing device is switched on, everything is controlled by a special part of the processor and a special chip (the Trusted Platform Module, or TPM). These in turn are managed by a “hypervisor”, a program whose job is to keep everything running the way it was designed to.
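To make that chain of checks a little more concrete, here is a minimal sketch of the measured-boot idea behind TPM-style designs: each stage of the boot process is hashed and folded (“extended”) into a measurement register before it runs, so the final value depends on every component in order. The stage names and the choice of SHA-256 below are illustrative assumptions on my part, not the actual specification.

```python
import hashlib

def extend(register: bytes, component: bytes) -> bytes:
    """TPM-style 'extend': fold the hash of the next boot component into
    the running measurement, so the result depends on every prior stage."""
    return hashlib.sha256(register + hashlib.sha256(component).digest()).digest()

# Hypothetical boot stages, measured in the order they would run.
boot_stages = [b"firmware image", b"bootloader", b"os kernel", b"hypervisor"]

register = bytes(32)  # measurement registers start out zeroed
for stage in boot_stages:
    register = extend(register, stage)

# A verifier that knows the expected value for a trusted configuration can
# compare against it; tampering with any stage changes the final value.
print(register.hex())
```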
It is the ‘inside threat’ part of this equation that has people worried: the functions enabled by this technology let operating system vendors, rather than you, decide what counts as a threat and what should be allowed to happen on your own machine.
Trusted Computing is essentially a way to enforce Digital Rights Management on the general-purpose computing platform once and for all. You see, the big media companies really do want you to have an HD-DVD or Blu-ray player in your computer and to enjoy the full-resolution output these formats can offer. The catch is that they don’t want clever hackers to figure out how to get a perfect, bit-for-bit copy of that extremely high-quality content. The only way the MPAA will allow your computer to play this content at its full quality is to have the checks described in Trusted Computing in place.
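To illustrate how that gatekeeping might look in practice, here is a hedged sketch of the decision a protected player could make: full-resolution output only when the platform’s reported boot measurement matches a known-good value. The function names and the reference value are hypothetical, purely for illustration.

```python
def platform_is_trusted(reported: bytes, expected: bytes) -> bool:
    """Stand-in for an attestation check: accept only a platform whose
    boot measurement matches a known-good reference value."""
    return reported == expected

def choose_output_quality(reported: bytes, expected: bytes) -> str:
    # Full quality only on a platform that attests successfully;
    # otherwise fall back to degraded output (or refuse to play at all).
    return "full resolution" if platform_is_trusted(reported, expected) else "downgraded output"

known_good = bytes.fromhex("ab" * 32)                 # hypothetical reference measurement
print(choose_output_quality(known_good, known_good))  # -> full resolution
print(choose_output_quality(bytes(32), known_good))   # -> downgraded output
```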
So the big deal is that consumers are worried this is all just a move to lock us out of our computers.
How will recent events such as the introduction of Intel’s vPro platform pan out? Only time will tell, but I’ll definitely be watching.
What does this mean for the Open Source community?
Can users make their own “hypervisor” to use on the computers they purchase?
It seems at this point there are only questions, and the answers will have to wait.