Hacker News

>Suppose you want to be assured of the software running on your machine. You go into the firmware, point it at your boot loader and say "only this one". It makes a hash of the boot loader and refuses to use any other one until you change the setting, which requires your firmware password. Your boot loader then only loads the operating systems you've configured, and so on.

What if you need to update the bootloader?

>The only thing the signature gets you is remote attestation, which is the evil to be prevented. Simple hashing would get you everything else.

TPMs can do remote attestation without signatures just fine, by measuring the hash of the bootloader. It'd be clumsy, but doable, just like your idea of using hashes for verification.
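To make the "measuring the hash" part concrete: a TPM accumulates measurements in Platform Configuration Registers (PCRs) via the extend operation, where the new PCR value is the hash of the old value concatenated with the measurement. A verifier who knows the expected component hashes can recompute the chain and compare. A minimal sketch in Python, using made-up byte strings in place of real firmware images:

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """TPM-style PCR extend: new PCR = SHA-256(old PCR || measurement)."""
    return hashlib.sha256(pcr + measurement).digest()

# Hypothetical boot-chain components (illustrative data, not real images).
firmware = b"firmware image"
bootloader = b"bootloader image"
kernel = b"kernel image"

pcr = bytes(32)  # PCRs start zeroed at reset
for component in (firmware, bootloader, kernel):
    pcr = extend(pcr, hashlib.sha256(component).digest())

# A verifier that knows the expected component hashes recomputes
# the same chain and compares it against the reported PCR value.
expected = bytes(32)
for digest in (hashlib.sha256(firmware).digest(),
               hashlib.sha256(bootloader).digest(),
               hashlib.sha256(kernel).digest()):
    expected = extend(expected, digest)

assert pcr == expected
```

Note that the chaining makes the final value order-sensitive, so it pins the whole boot sequence, not just the set of components. What turns this into *remote* attestation is the TPM signing the PCR value with a key the remote party trusts, which is the signature-and-factory-key question being argued here.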



> What if you need to update the bootloader?

Then you boot the system from the existing trusted bootloader, which makes the booted system trusted to supply a new hash to the firmware.

> TPMs can do remote attestation without signatures just fine, by measuring the hash of the bootloader.

If there are no private keys in the TPM from the factory, then there is nothing a third party can force you to sign the hash with, which is the intent.


How does the system know whether the new bootloader is legitimate or not?

All TPMs have private keys from the factory. They're entirely unrelated to the secure boot keys.


> How does the system know whether the new bootloader is legitimate or not?

However it wants to. Maybe the existing bootloader (chosen by the owner rather than the vendor) or the OS it loads has its own signature verification system for update packages, like apt-get. Maybe the OS downloads it from a trusted URL via HTTPS and relies on web PKI. Maybe it uses Kerberos authentication to get it from the organization's own update servers. Maybe it just boots an OS that allows the operator to apply any update they want from a USB stick, but only after authenticating with the OS.

None of that is the firmware's problem; all it has to do is disallow modifications to itself unless the owner has entered the firmware password or the system was booted from the owner-designated trusted bootloader.

> All TPMs have private keys from the factory. They're entirely unrelated to the secure boot keys.

The point isn't which device holds the keys; it's that the machine shouldn't contain any from the factory. Nothing good can come of it.


The situation you're protecting against is one where someone who compromises the OS can make that compromise persistent by replacing the bootloader. That means you can't place any trust in any component after the bootloader, since an attacker could just fake whatever mechanism you're enforcing.

> The point isn't which device has the keys, it's that it shouldn't contain any from the factory. Nothing good can come of it.

TPMs have private keys, and are not involved in enforcing secure boot. The firmware validating the signatures only has public keys.


> The situation you're protecting against is one where someone who compromises the OS can make that compromise persistent by replacing the bootloader. That means you can't place any trust in any component after the bootloader, since an attacker could just fake whatever mechanism you're enforcing.

Isn't that kind of pointless?

Suppose the attacker gets root on your OS, i.e. the access they would need in order to supply the firmware with a new hash. That OS install is now compromised, because they can change whatever else they want in the filesystem. If you boot the same OS again, even using the trusted bootloader, it's still compromised.

If you don't realize it's compromised, you're now using a compromised system regardless of the bootloader. If you do realize it's compromised, then you do a clean reinstall of the OS and designate your bootloader as the trusted one again, instead of whatever the compromised OS installed.

What does the bootloader really get them that root didn't already?

> The firmware validating the signatures only has public keys.

Having the keys installed from the factory still seems like the thing causing the problem:

If the firmware only trusts e.g. Microsoft's public key, Microsoft now gets to decide whether to sign something you might want to use. If they don't, secure boot prevents it from running, which is a problem for you if you want it to run.

That in turn puts them under pressure to sign all kinds of things, because people want their firmware updaters etc. to work, and then you get compromised by some code they signed that wasn't even relevant to you.

Whereas what you want is some way of designating what can run on your machine, regardless of what someone else would like to run on theirs. But then that's a machine-specific determination rather than something somebody should be deciding globally for everyone.



