Hacker News

Standards have certainly changed over the years. This takes me straight back to 2003 when SimCity 4 came out, turned out to be an absolute resource hog, and I'd have been overjoyed with 20fps.

As the late Henry Petroski said: "The most amazing achievement of the computer software industry is its continuing cancellation of the steady and staggering gains made by the computer hardware industry."



I've heard this version: "what Andy giveth, Bill taketh away"

(Intel vs Microsoft CEOs)

https://en.m.wikipedia.org/wiki/Andy_and_Bill%27s_law

Henry Petroski probably said it first though


The 2013 version of SimCity was a disaster for all sorts of reasons, not least of which was that they took game devs and just mysteriously expected them to know how to build and run online services, run databases, etc. A friend of mine was working for another EA subsidiary and ended up being parachuted in to try to help save the day. One of his first contributions made a phenomenal difference: he enabled connection pooling to the database servers. They'd done the entirely understandable, naive thing of just going with the client defaults.
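The fix described above can be sketched roughly like this. A minimal client-side pool, assuming nothing about EA's actual stack (the database, driver, and class names here are all hypothetical illustrations): instead of opening a fresh connection per request, which is what naive client defaults often do, a fixed set of connections is created once up front and reused.

```python
# Minimal sketch of client-side connection pooling. The actual SimCity
# stack and database are not described in the thread; sqlite3 stands in
# as a placeholder driver. The point: connections are created once and
# reused, instead of being opened and torn down per request.
import queue
import sqlite3


class ConnectionPool:
    def __init__(self, db_path, size=4):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            # All connections are created up front, not per request.
            self._pool.put(sqlite3.connect(db_path, check_same_thread=False))

    def acquire(self):
        return self._pool.get()   # blocks if every connection is in use

    def release(self, conn):
        self._pool.put(conn)      # hand the connection back for reuse


pool = ConnectionPool(":memory:", size=2)
conn = pool.acquire()
conn.execute("CREATE TABLE IF NOT EXISTS cities (name TEXT)")
pool.release(conn)
```

Real drivers and ORMs expose this as a configuration knob (pool size, timeouts, etc.); the expensive work being avoided is the per-request connection and authentication handshake against the database server.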


Same CEO who’s responsible for the Unity Fiasco


I missed that, but I guess I shouldn't be surprised.


Seeing what folks in the demoscene can do nowadays with such limited hardware makes modern software feel all the more puzzling. Yes, demoscene stuff isn't concerned with ease of development, security, or integration. But it does leave you yearning: think of the possibilities of modern hardware if it were treated with care.


This is the precise reason I prefer embedded development. The challenge of fitting my entire application into a handful of kilobytes with just a KB or two of RAM is a lot of fun. I get to build a program that runs really fast on a very slow system.

It's a point of personal pride for me to really understand what the machine is and what it's doing. Programming this way is working with the machine rather than trying to beat it into submission like you do with high level languages.

It seems a lot of programmers just see the CPU as a black box, if they even think about it at all. I don't expect more than a couple percent of programmers would truly grok the modern x86 architecture, but if you stop to consider how the CPU actually executes your code, you might make better decisions.

In the same vein, very high-level languages are a big part of the problem. They're so far abstracted from the hardware that you can't reason about how your code will actually behave on any real machine. And beneath that sits an invisible iceberg of layer upon layer of abstraction, indirection, and unknowable, unreadable code, so there's no reasonable way to be sure that your line of code does what you think and nothing else.
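A small illustration of that iceberg (a generic example, not anything from the thread): two loops that read almost identically can differ wildly in what the machine actually does. In Python, `list.insert(0, x)` shifts every existing element, so building a list front-first is quadratic, while `append` is amortized constant time, and nothing in the syntax hints at the difference.

```python
# Two loops that look equally innocent; the first hides O(n^2) work.
def build_front(n):
    out = []
    for i in range(n):
        out.insert(0, i)   # shifts all existing elements on every call
    return out


def build_back(n):
    out = []
    for i in range(n):
        out.append(i)      # amortized O(1) per element
    return out


# Same contents, reversed order -- but very different costs at scale.
assert build_front(1000) == list(reversed(build_back(1000)))
```

Neither line looks expensive at the call site; only knowledge of the underlying data structure (a contiguous array) tells you which one the hardware will hate.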

Modern software practices are bad and we should all throw away our computers and go back to the 8086. Just throw away the entire field of programming and start again.


I love embedded as a hobby, but God is it a silly statement to imply we should go back to low-level asm/C bs for everything; we would get so little done. Oh, but it would run fast, at least.

The problem isn't high-level dev, it's companies skimping on the optimisation process.


That's sort of the opposite of treating the hardware with care. It's all done with no allowances for varying hardware at all. This is like pining for a 70s text editor while refusing to admit the world has moved beyond 7-bit ASCII, and that stuff like Unicode support isn't "optional".


People can use whatever tools they want, but all my code, blog posts and personal notes would work with 7-bit ASCII and a 70s text editor.

The editors I use support unicode and use UTF-8, but if they didn't I'd hardly notice.


After some editing, aye. But you use emojis on your website and you can't even type your first name with 7-bit ASCII. Ö isn't ASCII.
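The claim is easy to verify: 7-bit ASCII covers code points 0–127, and "Ö" (U+00D6) sits outside that range, while UTF-8 encodes it in two bytes. A quick check:

```python
# "Ö" is U+00D6, which is outside ASCII's 0-127 range.
assert ord("Ö") == 0xD6

try:
    "Ö".encode("ascii")
    ascii_ok = True
except UnicodeEncodeError:
    ascii_ok = False

assert not ascii_ok                        # ASCII cannot represent it
assert "Ö".encode("utf-8") == b"\xc3\x96"  # UTF-8 needs two bytes
```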


Ah yeah fuck everyone that doesn't use the Latin ascii character set amirite!!!


Great leap of logic.

1. Write a piece of content in ASCII.

2. ???

3. Fuck everyone that doesn't use the Latin ASCII character set.

Not sure what you put in number two in your mind, but I'd be interested to see it.


It'd all blow up the moment you tried to use someone else's code though


Treated with care, and with 1000x the development time, budget, etc.

Things are slow because we prefer ease of development for these extraordinarily large and complex projects we call video games.

I think the smart thing really is to work it all out at a high level and then target the slow, considered and expensive fixes to where they're really needed.

I'm not excusing obviously lazy development, but I do think we need to remember how crazy the scope of newer games can be. Imagine showing MSFS2020 to someone from 10-15 years ago: much of the Earth scanned in and available for you to fly over. Of course there are perf hiccups.



