Hacker News

Sometimes all the frameworks that re-invent MVC (but in a detached fashion) and tout "server-side rendering" just feel like we've come full circle: we're serving a static view cache that was generated from models and data by a controller.

The big difference here (as with other JavaScript-based renderers) is of course the "use one tool/language for everything" aspect, which to me is a bit of a 'meh' benefit.

At this point, while all the renderers get better and simpler, complexity and the code we write haven't really gone away; they've just shifted to new places. Perhaps at some point we get back to the HTML + WebComponents stage (kinda like MDX and React mixing), and classic Apache Server-Side Includes become the new hot thing again.



I think edge computing and CDNs might be what makes this all more interesting. It is not just a question of where you render (server vs. client) but also of where, geographically (Dallas vs. Rio). If you can get your app to re-render small sections, and have the framework figure out which sections are static and can be chucked on a CDN (and which are not, and need an edge server or a central server), you can get a lot of optimization that way.

That said, for most side projects and commercial projects the complexity may not be worth it. It probably makes the most difference to very high-traffic sites, where money is lost for each millisecond of delay in serving the page (or the ads). In that sense it may be premature optimization.

I like using Next.js, but I often wonder if it is "too much" and adds complexity I wouldn't have otherwise. An old-fashioned Rails app, with caching / a CDN in front of it as a separate concern, is probably more than enough performance for most people.


I think any view cache works fine with edge caching. Worst case, the first request after each cache invalidation has to traverse the CDN network to your compute.
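That "traverse once per invalidation" behaviour is just standard HTTP caching: the origin marks responses as cacheable at the edge, and the CDN only comes back to your compute once the cached copy expires or is purged. A minimal sketch (header values are illustrative, not prescriptive):

```shell
# Emit a Cache-Control header for a pre-rendered view served through a CDN.
# s-maxage governs how long the shared (edge) cache may serve the copy;
# stale-while-revalidate lets the edge keep serving the old copy while it
# refetches in the background.
cache_headers() {
  # $1: seconds the CDN may serve the cached copy without revalidating
  printf 'Cache-Control: public, s-maxage=%s, stale-while-revalidate=60\n' "$1"
}

cache_headers 3600
# Cache-Control: public, s-maxage=3600, stale-while-revalidate=60
```

With stale-while-revalidate, even the first request after an invalidation rarely blocks a visitor; the edge serves stale content while it fetches the fresh render.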

That said, none of the static site renderers do anything special that is actually required to run at the edge. Considering the huge amount of storage and memory required to hold a node_modules tree and re-render on demand, even a classic MVC stack could easily do the same.

What would make a somewhat larger difference might be a combination of factors; something like centralised push and distributed compute via WASM. That way any logic could be compiled and packaged as-is, without needing anything else. Somewhat similar to using Cloudflare Workers with something like Go or Zig.

Then again, like you described with the Rails example, this is mostly a solved problem. Even a full server-side view with just classic HTML 4 and CSS output can do this, at very low latencies and with very wide reach. After the first hit, the difference between this and statically pre-rendered pages is almost irrelevant.

The biggest wins would be for uncached, always-computed content that has to be executed on demand, and for changes to the attack surface. But that gives us a different class of problem, since we could also decide to simply offload all the hard work to someone else and pay Squarespace and the likes ;-)


> I think edge computing and CDNs might be what is making this all more interesting

You know what the old-fashioned way of doing this is? Buy multiple VPSes around the world and deploy the website files using a script.
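That script can be a handful of lines. A sketch, with made-up hostnames, user, and paths; it prints the rsync commands rather than running them, so you can review and pipe to sh when ready:

```shell
#!/bin/sh
# Old-fashioned multi-region deploy: mirror a built static site to a
# handful of VPSes with rsync. Hostnames and paths are hypothetical.
set -eu

SITE_DIR="./public"
HOSTS="vps-us.example.com vps-eu.example.com vps-br.example.com"

deploy_cmds() {
  for host in $HOSTS; do
    # --delete keeps each remote tree an exact mirror of the local build
    printf 'rsync -az --delete %s/ deploy@%s:/var/www/site/\n' "$SITE_DIR" "$host"
  done
}

deploy_cmds   # print the commands; pipe to sh to actually deploy
```

Point a geo-DNS record (or round-robin A records) at the VPSes and you have a poor man's CDN.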


Since this is a fairly separate concern, and a fair bit of maintenance, it is nice to outsource to the cloud.

Let the experts deal with adding locations, location detection, DNS, redundancy and so on, for a very small price, relatively speaking.

I am not saying you need Astro: you can build the files yourself, then use a service like Netlify to do the CDN-ing for you.



