It's not the right framework on its own, but it's the closest we have right now for talking about things like "high-level equivalence" rigorously. For example, one way to characterize logic is that it's invariant under double negation, so any compression algorithm that does "semantically lossless compression" can exploit that property. Deep learning then becomes an algorithm for finding features in the data that remain invariant under irrelevant transformations, and representations useful for semantic similarity could then be formulated as a very big and complex composite group, potentially a conditional one. There's probably a new theory of hierarchical similarity and irreducible collisions/errors required, but as of now, group theory, approximate equivariance, and category theory are, to my knowledge, the bleeding edge of thinking about this rigorously while still having SOTA benchmarks and real-world applicability.
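To make the double-negation example concrete, here's a toy sketch (my own illustration, not from the comment): a "semantically lossless" compressor can map logically equivalent formulas to a single canonical form before compressing. The formula representation and the `canonicalize` helper are hypothetical; the only equivalence exploited here is not(not(p)) = p.

```python
def canonicalize(expr):
    """Recursively strip double negations from a nested-tuple formula.

    Formulas: a variable name (str), ('not', e), ('and', e1, e2), ('or', e1, e2).
    Two formulas that differ only by double negation canonicalize identically,
    so a downstream compressor can store one representative for both.
    """
    if isinstance(expr, str):
        return expr
    op, *args = expr
    args = [canonicalize(a) for a in args]
    if op == 'not' and isinstance(args[0], tuple) and args[0][0] == 'not':
        # not(not(x)) -> x  (subterm is already canonical)
        return args[0][1]
    return (op, *args)

# Syntactically different but logically equivalent formulas collapse:
a = ('not', ('not', 'p'))
b = 'p'
assert canonicalize(a) == canonicalize(b)
```

A group-theoretic reading: double negation generates a transformation under which meaning is invariant, and canonicalization picks one element per orbit; the comment's "composite group" idea generalizes this to many such transformations at once.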

Happy to discuss more via email as well
