
Unpopular opinion: this isn't about LLMs, but about how web development has devolved from the declarative serving of lightweight media files to the imperative generation of bloated, brittle SPAs that we never get free from babysitting.

Where we could have once wrapped our mostly static websites in Varnish or a scalable P2P cache like Coral CDN, now we must fiddle and twiddle with robots.txt and appeal to the goodwill of megacorps who never cared about being good netizens before, even when they weren't profiting from scraping to such a degree.

This is yet another chance for me to scream into the void that we're still doing this all wrong. Our sites should work more like htmx: fully functional when static, with dynamic embellishment layered on when available. Business logic should happen deterministically in one place, on the backend or "serverless" with some kind of distributed consensus protocol like Raft/Paxos or a CRDT, then propagate to the frontend over a RESTful protocol, similar to how Firebase or Ruby Hotwire/Laravel Livewire work. The way most of us do form validation wrong, twice, in two places with two languages, is almost hilariously tragic in how predictably it happens.
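To make the "validate once" point concrete, here's a minimal sketch (hypothetical names and schema, not any particular framework): a single declarative rule set lives on the backend, drives server-side validation, and is serialized as-is to the client, instead of being re-implemented in a second language.

```python
import json
import re

# Single source of truth for the form's rules, declared once on the backend.
SIGNUP_RULES = {
    "email": {"required": True, "pattern": r"^[^@\s]+@[^@\s]+\.[^@\s]+$"},
    "age":   {"required": True, "min": 13, "max": 130},
}

def validate(form: dict, rules: dict) -> dict:
    """Apply the declarative rules server-side; returns field -> error message."""
    errors = {}
    for field, rule in rules.items():
        value = form.get(field)
        if rule.get("required") and value in (None, ""):
            errors[field] = "required"
            continue
        if "pattern" in rule and not re.match(rule["pattern"], str(value)):
            errors[field] = "invalid format"
        if "min" in rule and isinstance(value, (int, float)) and value < rule["min"]:
            errors[field] = "too small"
        if "max" in rule and isinstance(value, (int, float)) and value > rule["max"]:
            errors[field] = "too large"
    return errors

def rules_for_client(rules: dict) -> str:
    """Ship the same rules to the frontend as JSON rather than duplicating them."""
    return json.dumps(rules)
```

The client then interprets the shipped JSON for instant feedback, while the backend call to validate() remains the authoritative check.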

But the real tragedy is that the wealthiest and most powerful companies that could have fixed web development decades ago don't care about you. Amazon, Google and Microsoft would rather double down on byzantine cloud infrastructure than devote even a fraction of their profit to pure research into actually fixing all of this.

Meanwhile the rest of us sit and spin, sacrificing the hours and days and years of our lives building out other people's ideas to make rent. Many of us know exactly how to fix things, but with infinite backlogs and never truly exiting burnout, we're too tired at the end of the day to contribute to FOSS projects and get real work done. Our valiant quest to win the internet lottery has become a death march through a seemingly inescapable tragedy of the commons.

Instead of fixing the web at a foundational level from first principles, we'll do the wrong thing like we always do and lock everything down behind login walls and endless are-you-human/2FA challenges. Then the LLMs will evolve past us and wrap our cryptic languages and frameworks in human language to a level where even pair programming won't be enough for us to decipher the code or maintain it ourselves.

If I were the developer tasked with hardening a website against LLMs, the first thing I would do is separate the static and dynamic content. I'd fix most of the responses to respect standard HTTP cache headers. Then I'd put that behind the first Cloudflare competitor I could find that promises to never show a human challenge screen. Then I'd wrap every backend API endpoint in Russian doll caching middleware. Then I'd shard the database by user id as a last resort, avoiding that at all costs by caching queries and/or using modern techniques like materialized views to put the burden of scaling on the database, scaling vertically or gradually migrating the heaviest queries to a document or column-oriented store. Better yet, move to a store that's already solved all of these problems, like CouchDB/PouchDB.
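The Russian doll caching step above can be sketched like this (an illustrative toy with a dict standing in for Redis/Memcached, not any framework's actual middleware): each fragment's cache key includes its own version stamp, and the outer key is derived from the inner keys, so touching one record busts only the fragments that contain it while unchanged children are still served from cache.

```python
import hashlib

cache: dict = {}  # stand-in for a real cache store like Redis or Memcached

def key_for(record: dict) -> str:
    """Per-record key: changes whenever the record's version is bumped."""
    return f"{record['type']}/{record['id']}-v{record['version']}"

def render_record(record: dict) -> str:
    """Render one fragment, reusing the cached copy if its key still matches."""
    inner_key = key_for(record)
    if inner_key not in cache:
        cache[inner_key] = f"<li>{record['name']}</li>"  # the "expensive" render
    return cache[inner_key]

def render_list(records: list) -> str:
    """Outer fragment: its key is a digest of all inner keys, so any child
    bump invalidates the list while untouched children stay cached."""
    outer_key = "list-" + hashlib.sha1(
        "|".join(key_for(r) for r in records).encode()
    ).hexdigest()
    if outer_key not in cache:
        cache[outer_key] = "<ul>" + "".join(render_record(r) for r in records) + "</ul>"
    return cache[outer_key]
```

After editing one record (bumping its version), re-rendering the list rebuilds only that record's fragment; the siblings come straight out of the cache.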

Then I'd build a time machine to convince everyone to do things right the first time instead of building a tech industry upon unforced errors. Oh wait, former me already tried sounding the alarm and nobody cared anyway. How can I even care anymore, when honestly I don't see any way to get out of this mess on any practical timescale? I guess the irony is that only LLMs can save us now.




