
A short look at any compiled code on godbolt will very quickly inform you that pretty much all instructions at the assembly level are, in fact, NOT sequences of AND/OR/XOR operations.


All instructions are implemented with logic gates. In fact, all instructions today are likely implemented as NAND gates.

Have you ever seen a Wallace tree multiplier? It's a good demonstration of how XOR and AND gates can implement multiplication.
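For the curious, here's the bit-level idea in a minimal Python sketch. This isn't a real Wallace tree (which reorders the adders to cut depth); it just shows that AND gates produce the partial products and XOR/AND/OR full adders sum them:

```python
# Gate-level unsigned multiply: AND gates make partial products,
# full adders (XOR/AND/OR) reduce them row by row.

def full_adder(a, b, cin):
    # sum = a XOR b XOR cin; carry = majority(a, b, cin)
    s = a ^ b ^ cin
    cout = (a & b) | (a & cin) | (b & cin)
    return s, cout

def gate_multiply(x, y, width=4):
    bits = lambda v: [(v >> i) & 1 for i in range(width)]
    xb, yb = bits(x), bits(y)
    acc = [0] * (2 * width)               # product accumulator, one bit per cell
    for i in range(width):
        carry = 0
        for j in range(width):
            pp = xb[j] & yb[i]            # AND gate: partial product bit
            acc[i + j], carry = full_adder(acc[i + j], pp, carry)
        acc[i + width] = carry            # row's carry-out lands in a still-clear bit
    return sum(b << i for i, b in enumerate(acc))
```

Exhaustively checking all 4-bit inputs confirms it matches ordinary multiplication.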

Now, if multiply + XOR gets the new function you want, it's likely better than whatever the original compiler output.


That was not the claim. The claim was that assembler was made up out of sequences of OR/AND/XOR, and that claim is demonstrably false.


"Made up of" isn't helpful. All assembly languages are equivalent to a sequence of bit operations. (…if you ignore side effects and memory operations.)

In fact there are several different single operations you can build them all out of: https://en.wikipedia.org/wiki/One-instruction_set_computer#I...
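For example, subleq ("subtract and branch if less than or equal to zero") is sufficient on its own. A toy interpreter, assuming the usual conventions (three-cell instructions, negative branch target halts):

```python
# Minimal subleq machine: the only operation is mem[b] -= mem[a],
# branching to c when the result is <= 0. Program and data share memory.

def run_subleq(mem, pc=0, max_steps=10_000):
    mem = list(mem)
    while 0 <= pc and max_steps:
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]
        pc = c if mem[b] <= 0 else pc + 3
        max_steps -= 1
    return mem

# Addition built from subleq alone: computes y += x, then halts.
# Cells 0..8 are program, 9 = x, 10 = y, 11 = scratch zero Z.
prog = [
     9, 11,  3,   # Z -= x   (Z becomes -x; branch target equals fall-through)
    11, 10,  6,   # y -= Z   (y becomes y + x; target also equals fall-through)
    11, 11, -1,   # Z -= Z = 0, always <= 0: branch to -1, i.e. halt
     2,  3,  0,   # x = 2, y = 3, Z = 0
]
result = run_subleq(prog)
```

After the run, `result[10]` holds x + y.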

So you take your assembly instructions, write a sufficiently good model of the mapping between assembly instructions and bit operations, write a cost model (byte size of the assembly works as a cheap one), and then search for assembly instructions that perform the equivalent operation while minimizing the cost.

Like here: https://theory.stanford.edu/~aiken/publications/papers/asplo...
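A toy version of that search loop, with a hypothetical four-instruction ISA and exhaustive 8-bit testing standing in for the solver-backed equivalence check real superoptimizers use:

```python
# Toy superoptimizer: enumerate short sequences over a made-up ISA,
# keep ones equivalent to the target function (checked over all 8-bit
# inputs), and prefer the cheapest by instruction count.

from itertools import product

MASK = 0xFF
OPS = {                                  # hypothetical single-operand ISA
    "inc": lambda x: (x + 1) & MASK,
    "dbl": lambda x: (x * 2) & MASK,
    "neg": lambda x: (-x) & MASK,
    "not": lambda x: (~x) & MASK,
}

def search(target, max_len=3):
    # Cost model: instruction count, so shorter sequences win by search order.
    for length in range(1, max_len + 1):
        for seq in product(OPS, repeat=length):
            def run(x, seq=seq):
                for op in seq:
                    x = OPS[op](x)
                return x
            if all(run(x) == target(x) for x in range(256)):
                return seq
    return None

# Find a sequence equivalent to x -> x - 1, even though the ISA has no
# "dec": two's complement gives x - 1 = ~(-x), so neg followed by not works.
seq = search(lambda x: (x - 1) & MASK)
```

Real superoptimizers replace the exhaustive test with an SMT query, which is what lets them handle full-width registers.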


You're missing the forest for the trees, and frankly lacking the background to discuss this. Check out this paper and similar ones if you want to learn about this area: https://arxiv.org/pdf/1711.04422


What do you think implements assembly language?

Answer: Verilog or VHDL. And these all synthesize down to AND/OR/XOR gates, which are eventually converted into NAND gates only.
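The NAND-only claim, sketched concretely with the standard constructions on single bits:

```python
# Every basic gate built from NAND alone (inputs and outputs are 0/1).

def nand(a, b): return 1 - (a & b)

def not_(a):    return nand(a, a)            # NOT via tied inputs
def and_(a, b): return not_(nand(a, b))      # AND = NOT(NAND)
def or_(a, b):  return nand(not_(a), not_(b))  # De Morgan
def xor_(a, b): return and_(or_(a, b), nand(a, b))  # "one but not both"
```

Tracing the four input combinations through each function reproduces the usual truth tables.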

Every assembly language statement is either data movement, or logic, or some combination of the two.

-------

We are talking about SAT solvers and superoptimizers. Are you at all familiar with this domain? Or have you even done a basic search on what the subject matter is?


Computers are not programmed on the level of logic gates. If you want to do that, design an FPGA.


Superoptimizers take this stuff into consideration.

Which is what the parent post was talking about: the rare superoptimizer.


If you want something optimised… "equivalent" isn't going to do it.


You're missing the point. All instructions can be simplified to short integer operations, then all integer operations are just networks of gates, then all gates can be replaced with AND/OR/NOT, or even just NAND. That's why you can SAT solve program equivalence. See SMT2 programs using BV theory for example.
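What the solver actually proves, scaled down to a width you can brute-force. An SMT solver with the BV theory does the same check symbolically for 32- or 64-bit words; here, exhausting the 8-bit input space stands in for it:

```python
# Program equivalence for straight-line bitvector code: two functions
# are equivalent iff no input distinguishes them (i.e. "f(x) != g(x)"
# is unsatisfiable). For 8 bits we can simply try every input.

def equivalent(f, g, width=8):
    return all(f(x) == g(x) for x in range(1 << width))

mask = 0xFF
# A classic strength reduction a superoptimizer might propose:
assert equivalent(lambda x: (x * 8) & mask, lambda x: (x << 3) & mask)
# ...and a near-miss it must reject (x|x is just x, not x+x):
assert not equivalent(lambda x: (x + x) & mask, lambda x: x | x)
```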

Also of course all instructions are MOV anyway. https://github.com/xoreaxeaxeax/movfuscator


I seem to remember that program equivalence is an NP-hard problem. I very much doubt that you can solve it by reducing it to logic gates.


NP-hard is just a complexity class, not a minimum difficulty for every instance. SAT solving is a kind of guided, optimistic brute-forcing. If the instance is small enough, you can still solve it very quickly, the same way you can solve travelling salesman for a small number of cities. We solve small cases of NP-hard problems all over the place.
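To make the small-instance point concrete: exact travelling salesman by brute force is O(n!) in general, yet instant for a handful of cities (the distance matrix below is made up for illustration):

```python
# Exact TSP by enumerating every tour that starts and ends at city 0.

from itertools import permutations

def tsp_bruteforce(dist):
    n = len(dist)
    best = None
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        cost = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if best is None or cost < best[0]:
            best = (cost, tour)
    return best

dist = [                   # symmetric distances between 4 cities
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
cost, tour = tsp_bruteforce(dist)
```

Four cities means only 3! = 6 tours to check; the exponential blowup only bites as n grows.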


(Not to confirm it's an NP-hard problem; general program equivalence may not even be decidable. But in practice yes, you can check it that way, and SMT2 theories provide the right ops.)


That's why superoptimizers work on short sequences of code and take a long time doing so.



