This reminded me of Matt Blaze's work on physical lock security back in 2003. He found a method of deriving the "master key" for a building (one key that opens all locks) from a single example: https://www.mattblaze.org/masterkey.html
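For anyone curious about the mechanics: the attack is an adaptive "rights amplification" trick. Because each pin stack in a master-keyed lock shears at two depths (the change key's and the master's), you can probe one position at a time with test keys that match your own key everywhere else. Here's a toy simulation of the idea (invented parameters: 5 pins, 10 depths; a sketch, not Blaze's exact procedure):

    # Toy model of the "rights amplification" attack on a
    # master-keyed pin tumbler lock. Parameters are invented:
    # 5 pin positions, 10 possible cut depths.
    import random

    PINS, DEPTHS = 5, 10

    def make_lock(change, master):
        # each pin stack shears at the change depth OR the master depth
        return lambda key: all(k in (c, m)
                               for k, c, m in zip(key, change, master))

    def derive_master(lock_opens, change_key):
        # probe one position at a time, holding the rest at the change
        # key's cuts; any other depth that opens must be the master's
        master = []
        for i, c in enumerate(change_key):
            found = c  # the master may share the change cut here
            for depth in range(DEPTHS):
                probe = change_key[:i] + [depth] + change_key[i+1:]
                if depth != c and lock_opens(probe):
                    found = depth
                    break
            master.append(found)
        return master

    change = [random.randrange(DEPTHS) for _ in range(PINS)]
    master = [random.randrange(DEPTHS) for _ in range(PINS)]
    lock = make_lock(change, master)
    assert derive_master(lock, change) == master

The point is the cost: at most pins × (depths − 1) test keys (45 here) instead of the 10^5 a brute-force search of the whole keyspace would need.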
When he published about this he was bombarded with messages from locksmiths complaining that they all knew about this and kept it secret for a reason! https://www.mattblaze.org/papers/kiss.html
It was a fascinating clash between computer security principles - disclose vulnerabilities - and physical locksmith culture, which was all about trade secrets.
In 'Three Days of the Condor', Robert Redford's character locates the hotel room of a professional hitman (who is after him) by going to a locksmith and asking him, "Which hotel and room does this key belong to?" The locksmith asks, "Are you in the trade?" and he responds, "No, but I read a lot."
It's a serious hacker film, actually. Redford is the ultimate hacker in that film: social engineering, picking locks, scrambling Ma Bell's circuits, and taking out the bad guys in the CIA.
I have to disagree here regarding the film's merit. There are a few quite interesting (and unique) films in that genre from the 70s that are little known to today's audiences. Most came out around Watergate ~'74 (so were topical in those days) but have since been kind of memory holed.
The Kremlin Letter, 1970 (https://www.imdb.com/title/tt0065950/) - I recall someone saying this film really shows the ugly underbelly of intelligence services. It's an interesting film but very dark and somewhat disturbing. This one predates Watergate - it's a Cold War spy flick that makes Smiley's People look warm and cuddly...
The Conversation, 1974 (https://www.imdb.com/title/tt0071360/) - Gene Hackman's character resurfaces a couple decades later in Enemy of the State (1998).
"memory holed" implies a deliberate coordinated attempt by the mass media and/or powerful actors to suppress some information they don't want knocking around the ether
Perhaps the most important difference is that software — even after being purchased and used — remains relatively easy to patch, unlike a physical lock.
Tbf that's a new-ish principle. 2003 was the Windows XP era and the early days of Metasploit; Microsoft and all the other vendors were still figuring out this internet thing, while most computers were riddled with unpatched vulnerabilities. "Zero day" wasn't a meaningful label back then, because many exploits stayed usable for years.
> But Windows Update was definitely already a thing back then, so I don’t think this “Microsoft was still figuring out this Internet thing” holds.
They had update mechanisms, sure. But it was very much up to you to run them. When XP came out most people used dial-up (at least in the UK); after 2002, ADSL started to become ubiquitous and computers were on the internet for longer periods.
They had to start baking security into every aspect of the OS. It was one of the reasons Vista came out several years later than planned. They had to pull people from Vista development and move them onto Windows XP SP2.
One of the reasons Vista was such a reviled OS is that UAC broke lots of software which had run fine under XP, 2000 and 98.
> Software was updated all the time, and it’s much more difficult to do that with locks.
It wasn't unusual to run unpatched software that came from a disc for years. You had to manually download patches and run them yourself. A software update or new version could take 30 minutes or more to download on 56k dial-up. If you didn't need to download a patch, you probably didn't.
It was a thing, but it was also common to have it disabled or simply not working. XP was famous for its hackability, and web frameworks were also far from today's auto-updating setups. It's hard to describe to people who weren't involved how crazy ITsec was back then; it felt like the wild west compared to today. Seemingly every other DB had a critical unpatched vulnerability. Thankfully Shodan did not exist yet, so the barrier to entry was high for people without a particular skillset (which was also much harder to acquire back then). But MSF pushed security awareness pretty hard once people realized how easy exploitation could be if you just collected a bunch of scripts for common exploits into a simple framework that anyone could learn.
Totally true. Also consider that although software can theoretically be patched, sometimes patches just don't exist... the amount of unmaintained yet still useful software is huge.
Long ago I used to maintain a door lock system. I was responsible for designing a new system to encode the room keys and it became obvious as I worked with the internals that it had a vulnerability that would allow anyone to open any lock from this vendor with the right tool.
When I quietly mentioned this, the response was that everyone knows this but we don’t talk about it.
When Mahmoud al-Mabhouh was assassinated with no signs of forced entry into his hotel room, let's just say it wasn't surprising. And no, I don't think these security flaws are some conspiracy or by design - it's simply the difficulty of updating firmware on 10-year-old boards with a 20-year-old design and millions of them out in the field. And they cost around $750 apiece to replace, and that was back in 2010.
There's a reason for the different cultures and information asymmetry: in most countries you need a criminal background check to be a locksmith. But not to operate a keyboard.
Right. I’m in a structured trade right now and learning about _~hF.8f@,8zKub&&@(4’v but we’re not supposed to talk about it. At least they let us use the company computers for personal net stuff.
That only applies to access to modern electronic tools and digital codebooks requiring accounts. Nothing prevented people in the past from buying the physical books used.
When my first house was under construction in 2001, I created my own key from the builder's key. When I finally moved in, my key would then disable the builder's key. Curious how this worked, I disassembled my lock and found that there were (iirc) 8 keys which would open all my neighbors' houses, even after they had moved in and disabled the builder's key. Of course I rekeyed my house to close that vulnerability, but in theory it remains in all my neighbors' locks. I also didn't disclose the vulnerability.
I don't know about builder's keys from that era, but modern builder's keys aren't vulnerable to this problem. The builder's key uses a deeper cut on one or more pins, but the owner's key can and should be pretty unique. The only real requirement is that there be at least one cut to a high enough number to allow for a builder's key. There's a good video here: https://www.youtube.com/watch?v=GUCW4OnE6Mc
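If it helps to see the geometry, here's a toy model of a single construction-keyed chamber along the lines described above: a spacer ball sits between the bottom pin and the driver, the builder's cut is deeper by exactly the ball's thickness, and turning the owner's key lets the ball fall into a relief cavity, killing the builder's key for good. All dimensions are invented for illustration:

    # Toy model of one construction-keyed pin stack. Invented
    # dimensions; real chamber geometry varies by manufacturer.
    SHEAR = 10   # height of the shear line above the plug
    PIN = 6      # bottom pin length
    BALL = 2     # construction ball thickness

    class ConstructionChamber:
        def __init__(self):
            self.ball_present = True

        def try_key(self, lift):
            """lift = how high this key's cut raises the bottom pin."""
            if self.ball_present and lift + PIN + BALL == SHEAR:
                return True                  # builder's (deeper) cut
            if lift + PIN == SHEAR:
                # shear falls below the ball: rotating the plug lets
                # the ball drop into a relief cavity, removing it
                self.ball_present = False
                return True                  # owner's cut
            return False

    chamber = ConstructionChamber()
    owner, builder = SHEAR - PIN, SHEAR - PIN - BALL
    print(chamber.try_key(builder))  # True: builder's key works on site
    print(chamber.try_key(owner))    # True: owner moves in, ball drops
    print(chamber.try_key(builder))  # False: builder's key is now dead

Note the constraint mentioned above: the owner's cut at that position has to be deep enough (SHEAR - PIN in this model) to leave room for the even deeper builder's cut.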
"Although a few people have confused my reporting of the vulnerability with causing the vulnerability itself, I can take comfort in a story that Richard Feynman famously told about his days on the Manhattan project. Some simple vulnerabilities (and user interface problems) made it easy to open most of the safes in use at Los Alamos. He eventually demonstrated the problem to the Army officials in charge. Horrified, they promised to do something about it. The response? A memo ordering the staff to keep Feynman away from their safes."