Hacker News

This is a very imperfect analogy, but CPUs are typically designed using a high-level language like SystemVerilog (equivalent to something like C++ in the software world). To actually get the CPU manufactured, you have to "compile" it into a binary format that specifies the physical layout of the chip while respecting a set of design rules for whatever process you're targeting. These rules are usually considered highly confidential since they can reveal manufacturing details that the foundry wants to keep secret. This secrecy is part of the reason that open source never really took off for silicon the way it did in the software world - imagine if every time you wanted to compile your code for a new machine you had to pay a bunch of money and sign an NDA.

The hope here is that open sourcing the design rules will help build an ecosystem of open source silicon. While it's true that 180nm is too big for anything high performance, it's perfectly fine for hobbyists who don't have the money required for more advanced nodes anyway. On the foundry side, they don't need to be as secretive about old nodes and it might increase demand for underutilized manufacturing capacity.



I dream of a world where public libraries have CPU printers.

180nm isn't crazy awful though, is it? That's about 20 years back, so you won't be doing ML, but it's enough to have industry applications and low power general purpose computing.


180 nm is the process the first-generation Intel Pentium 4 and the second-generation AMD Athlon used, around 2000.

You would not want to make a CPU in 180 nm now, or any purely digital circuit, since those are better served by an FPGA unless you need clock frequencies over 1 GHz. But if you want to make a mixed digital-analog circuit with a significant analog part, 180 nm can be fine.

Any analog circuit part made in 180 nm will not be much larger than in an up-to-date process, because the dimensions of analog components are determined by functional requirements, such as noise or maximum current, not by lithography limits.
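To illustrate the point with a standard example (my sketch, not from the parent comment): the thermal kT/C noise of a sampling capacitor sets a minimum capacitor size for a given noise target, and that size follows from physics alone, independent of the process node. The numbers below are purely illustrative:

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0         # temperature, K

def min_cap_for_noise(v_rms_target):
    """Minimum sampling capacitance for a target kT/C noise voltage.

    v_rms = sqrt(kT/C)  =>  C = kT / v_rms^2
    The answer depends only on temperature and the noise spec,
    not on the lithography node.
    """
    return k * T / v_rms_target ** 2

c = min_cap_for_noise(100e-6)  # 100 uV RMS noise target
print(f"required capacitance: {c * 1e12:.2f} pF")  # ~0.41 pF
```

The same capacitor takes up roughly the same area whether it is drawn in 180 nm or in a leading-edge process, which is why analog blocks shrink so little across nodes.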


once a process is no longer bleeding edge, what's the benefit of keeping manufacturing details secret?

at this stage, presumably no one would risk the capital and time required to copy the foundry, while publishing the details increases the likelihood of attracting more clients.

thanks in advance for the clarification on why the industry operates the way it does.


> once a process is no longer bleeding edge, what's the benefit of keeping manufacturing details secret?

Even if it's not bleeding edge, there still might be special features or secrets they want to protect. For really old nodes, I think it boils down to two reasons: 1) Clients with money don't care about open design rules, so there was basically no demand for it. 2) Foundries generally have a pretty strong culture of secrecy, so without someone asking externally they're not going to open source anything.


thanks for the clarification.

assuming no one will invest the time and capital to build a similar foundry, is it fair to say that such secrecy has minimal benefits? or can an existing foundry learn these secrets and improve their own products pretty easily?


There's a lot of risk for new nodes and less for old ones. Exactly how much, I'm not sure.

I guess let me put it this way - I think it's entirely rational for them to be cautious. There's currently not a lot of upside to being more open, so it's usually not worth it even if the risk is small. Programs like this are great because they hit both sides of the equation: increasing demand for open PDKs while lowering the perceived risk by building a track record of successful releases.



