
The problem here is that the signature doesn't do anything for you.

Suppose you want to be assured of the software running on your machine. You go into the firmware, point it at your boot loader and say "only this one". It makes a hash of the boot loader and refuses to use any other one until you change the setting, which requires your firmware password. Your boot loader then only loads the operating systems you've configured, and so on.
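A minimal sketch of that hash-pinning check, assuming the firmware stores a single owner-chosen SHA-256 digest (the function and setting names here are made up for illustration):

    import hashlib

    def boot_allowed(bootloader_image: bytes, pinned_hash: str) -> bool:
        # Accept the boot loader only if its digest matches the owner-pinned value.
        return hashlib.sha256(bootloader_image).hexdigest() == pinned_hash

The owner pins the digest once (behind the firmware password), and the firmware refuses any other boot loader until the setting is changed.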

That doesn't require any certificates and you get 100% of the benefits. The firmware needs to verify the boot loader, the boot loader the OS, and so on. The OS doesn't need to verify the firmware, and in fact it can't: if the firmware or boot loader were compromised, the OS code doing the validating would be just as compromised.

The only thing the signature gets you is remote attestation, which is the evil to be prevented. Simple hashing would get you everything else.

And then you also don't get this "garbage code is nonetheless trusted" problem because there is no global root of trust and you never told your firmware to trust this random firmware update utility for somebody else's hardware.



> The problem here is that the signature doesn't do anything for you.

For your own personal machine, sure. But say you're a sysadmin in a company that has thousands of units. Suddenly, a CA infrastructure is much more appealing than having to deal with component hashes.


How is it any different? You install the hash of the boot loader when you issue the machine, then use the trusted system to update the hash if necessary.

Also, the concern is that the system comes from the factory with private keys the owner doesn't have access to, allowing the device to defect by informing on them to a third party. Keys installed by the owner rather than the manufacturer are fine, and then such keys also wouldn't be trusting random third party code either.


> How is it any different? You install the hash of the boot loader when you issue the machine, then use the trusted system to update the hash if necessary.

With your private CA you can skip the "update the hash" part, removing a crucial step that someone might forget in a hurry, or that might simply go wrong because of some bug or power outage... and brick thousands of machines as a result.


The "update hash" part is the counterpart to the "sign the binary" part, so if you forget to do it you're going to have problems either way. Also, this is the sort of thing that large organizations would have automated tooling to do anyway.


If a device is running code you control, how does it defect?


If you can't make it do something you don't want it to do, someone else can't pressure you to do it.


>Suppose you want to be assured of the software running on your machine. You go into the firmware, point it at your boot loader and say "only this one". It makes a hash of the boot loader and refuses to use any other one until you change the setting, which requires your firmware password. Your boot loader then only loads the operating systems you've configured, and so on.

What if you need to update the bootloader?

>The only thing the signature gets you is remote attestation, which is the evil to be prevented. Simple hashing would get you everything else.

TPMs can do remote attestation without signatures just fine, by measuring the hash of the bootloader. It'd be clumsy, but doable, just like your idea of using hashes for verification.
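For reference, the measurement side is just repeated hashing. Each boot component gets folded into a PCR with an extend operation, so the final PCR value commits to exactly what was measured (placeholder byte strings stand in for the real images):

    import hashlib

    def pcr_extend(pcr: bytes, component: bytes) -> bytes:
        # PCR_new = SHA-256(PCR_old || SHA-256(component)), as the TPM does on extend.
        return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

    pcr = bytes(32)  # PCRs start at all zeroes on power-up
    for blob in (b"firmware", b"bootloader", b"kernel"):  # placeholder images
        pcr = pcr_extend(pcr, blob)
    # A verifier that knows the expected component hashes can recompute this value.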


> What if you need to update the bootloader?

Then you boot the system from the existing bootloader, causing the booted system to be trusted to supply a new hash.

> TPMs can do remote attestation without signatures just fine, by measuring the hash of the bootloader.

If there are no private keys in the TPM from the factory, then there is no key a third party can force you to sign the hash with, which is exactly the point.


How does the system know whether the new bootloader is legitimate or not?

All TPMs have private keys from the factory. They're entirely unrelated to the secure boot keys.


> How does the system know whether the new bootloader is legitimate or not?

However it wants to. Maybe the existing bootloader (chosen by the owner rather than the vendor) or the OS it loads has its own signature verification system for update packages, like apt-get. Maybe the OS downloads it from a trusted URL via HTTPS and relies on web PKI. Maybe it uses Kerberos authentication to get it from the organization's own update servers. Maybe it just boots an OS that allows the operator to apply any update they want from a USB stick, but only after authenticating with the OS.
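As one possible shape for that owner-chosen check, the booted OS could verify a detached signature over the new boot loader against an owner-installed key before handing its hash to the firmware. A sketch using Ed25519 and the third-party cryptography library (names are illustrative, and this is just one of the options above):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

    def update_is_legitimate(new_bootloader: bytes, signature: bytes,
                             owner_pubkey: bytes) -> bool:
        # Accept the update only if it verifies against the owner-installed key,
        # which belongs to the owner or their org, not the hardware vendor.
        key = Ed25519PublicKey.from_public_bytes(owner_pubkey)
        try:
            key.verify(signature, new_bootloader)
            return True
        except InvalidSignature:
            return False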

None of that is the firmware's problem, all it has to do is disallow modifications to itself unless the owner has entered the firmware password or the system is booted from the owner-designated trusted bootloader.

> All TPMs have private keys from the factory. They're entirely unrelated to the secure boot keys.

The point isn't which device has the keys, it's that it shouldn't contain any from the factory. Nothing good can come of it.


The situation you're protecting against is one where someone who compromises the OS can make that compromise persistent by replacing the bootloader. That means you can't place any trust in any component after the bootloader, since an attacker could just fake whatever mechanism you're enforcing.

> The point isn't which device has the keys, it's that it shouldn't contain any from the factory. Nothing good can come of it.

TPMs have private keys, and are not involved in enforcing secure boot. The firmware validating the signatures only has public keys.


> The situation you're protecting against is one where someone who compromises the OS can make that compromise persistent by replacing the bootloader. That means you can't place any trust in any component after the bootloader, since an attacker could just fake whatever mechanism you're enforcing.

Isn't that kind of pointless?

Suppose the attacker gets root on your OS, i.e. what they would need to supply the firmware with a new hash. That OS install is now compromised, because they can now change whatever else they want in the filesystem. If you boot the same OS again, even using the trusted bootloader, it's still compromised.

If you don't realize that it's compromised, you're now using a compromised system regardless of the bootloader. If you do realize it's compromised then you do a clean reinstall of the OS and designate your bootloader as the trusted one again instead of whatever the compromised OS installed.

What does the bootloader really get them that root didn't already?

> The firmware validating the signatures only has public keys.

Having the keys installed from the factory still seems like the thing causing the problem:

If it only trusts e.g. Microsoft's public key, they now get to decide if they want to sign something you might want to use. If they don't, secure boot prevents it from working, which causes problems for you if you want it to work.

Which then puts them under pressure to sign all kinds of things because people want their firmware updaters etc. to work, and then you get compromised by some code they signed which wasn't even relevant to you.

Whereas what you want is some way of designating what can run on your machine, regardless of what someone else would like to run on theirs. But then that's a machine-specific determination rather than something somebody should be deciding globally for everyone.



