
About time! Every bit of compute performance increase I've had personally has been eaten up by software bloat. Admittedly, as software grows to use 10x as much compute it does get better, by maybe 10%, but the explosive increase in compute performance up to, say, the mid-2010s mostly just served to encourage bloat and obsolete older devices.

My main machine is from about 2012 - a garage-sale hand-me-down, ultra-high-end gaming machine of its time (24GB in 2012!). It does draw a fair bit of power running 24/7, but it's comfortably adequate for everything, and that's with two simultaneous users. Such hand-me-down laptops as have come my way (garage sales or free) are also adequate.

So if the continuing evolution is, instead, in compute per joule rather than absolute compute, I'm all for it. Graphics card power connectors melting at 570W... not for me.



