
Yeah, I used Rails on an aging project for about 6 months, and there was so much magic behavior that we couldn't effectively trace through the code, so debugging even the simplest issue took days. The happy path mostly ran fine, but when something went wrong we couldn't answer even the simplest questions about the code or make estimates, because we couldn't isolate the source of truth in its convention-dominated codebase.
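To make "magic behavior" concrete, here's a minimal sketch in plain Ruby (no Rails, names are made up) of the metaprogramming style Rails leans on: `method_missing` synthesizes finders at call time, so a method like `find_by_name` works but exists nowhere you can grep or set a breakpoint on.

```ruby
# Hypothetical example: dynamic finders synthesized via method_missing.
# Calling Record.find_by_name("alice") works, but no such method is
# defined anywhere in the source -- the kind of thing that makes
# tracing and debugging painful.
class Record
  ROWS = [
    { name: "alice", age: 30 },
    { name: "bob",   age: 25 },
  ]

  def self.method_missing(name, *args)
    if name.to_s.start_with?("find_by_")
      key = name.to_s.sub("find_by_", "").to_sym
      ROWS.find { |row| row[key] == args.first }
    else
      super
    end
  end

  def self.respond_to_missing?(name, include_private = false)
    name.to_s.start_with?("find_by_") || super
  end
end

puts Record.find_by_name("alice")  # resolves at runtime; grep finds no definition
```

The call site looks like an ordinary method, which is exactly why the behavior is hard to trace back to a definition.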

I come from a C++ background and mostly use PHP/Laravel today, and even though it does things less "efficiently" than the syntactic sugar in Ruby or low-level optimizations in .NET, I find that its lack of magic makes for much higher productivity in the long run. IMHO it feels like Ruby solves the easiest problems with sugar and then glosses over the hardest problems like they don't exist. So I just can't tell what problems it actually solves.

Generally, I think that cleverness was popular in the 2010s but has fallen out of fashion. A better pattern IMHO works more like Cordova or scripting in video games, where a high-performance engine (or set of native plugins) written in a language like Swift or Rust is driven by a scripting language like JavaScript or Lua. Or better yet, driven declaratively by HTML or no-code media files that encode complex behavior like animations.
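A toy sketch of that declarative pattern (illustrative only; the step names and engine are made up): complex behavior lives in plain data, and a small engine interprets it, so there's no hidden control flow to trace.

```ruby
# Hypothetical data-driven "engine": behavior is encoded as declarative
# step descriptions (the kind of thing an animation file might contain),
# and a small interpreter applies them to explicit state.
ANIMATION = [
  { op: :move,  dx: 5, dy: 0 },
  { op: :move,  dx: 0, dy: 3 },
  { op: :scale, factor: 2 },
]

def run(steps)
  state = { x: 0, y: 0, scale: 1 }
  steps.each do |step|
    case step[:op]
    when :move  then state[:x] += step[:dx]; state[:y] += step[:dy]
    when :scale then state[:scale] *= step[:factor]
    end
  end
  state
end

p run(ANIMATION)  # final state: x 5, y 3, scale 2
```

Everything the program will do is visible in `ANIMATION`, and the interpreter is the single source of truth for what each op means.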

Of course, all of this is going away with AI, and I anticipate atrociously written codebases that humans can't manage anymore. We might need pair programming just to take a crack at fixing something the AI can't. I'm always wrong about this stuff, though, so hopefully I'm wrong about this too.
