
So this isn't just me. I thought I was getting more impatient, but performance does seem to be degrading as it's no longer considered to be a feature. There is absolutely ZERO reason why a modern operating system or some rudimentary site with 10K users should not be blazing fast.

I have to constantly close my browser on a 12-core Intel Mac mini just because four open sites get its fan spinning like it's a Hoover Dam turbine.



> There is absolutely ZERO reason why a modern operating system or some rudimentary site with 10K users should not be blazing fast.

This just isn't true: optimizing for speed is always a trade-off, and when an application is fast enough, it can be wasteful to focus on speed rather than other factors.


I think we all know this, but have different definitions of "fast enough".


Almost no website or application in 2024 is fast enough. HN itself is the only website I can think of which is "fast enough".


As a general rule, the people who make this decision don't agree, which is why we're surrounded by slow websites.


I agree. But it doesn't help when engineers are always whispering in their ear about the supposed evils of premature optimization and that things are fast enough anyway.


When I see vintage computers (Amiga, Atari) boot straight into a REPL in an instant, I have to admit I feel confused.

Modern uses have forced a lot of layers on us for genericity and security.


There was nothing instant about the Amiga. It only has a bootloader in ROM. Booting to Workbench takes a good 30-60 seconds from floppy, and it's not much faster with an HDD (Kickstart has some weird hardcoded wait loops). Loading new drawers (folders) is still slow (drawing one icon at a time) even on the fastest accelerators (Vampire) and an SSD.


People who lived through that era consistently report that PCs had to reach about 300 MHz before they became as responsive as a 25 MHz Amiga. Part of this is the design of the OS: Amiga gave user-input interrupts the absolute highest priority.


It is not only that. I remember reading an article where someone did actual measurements. Some of the delay is attributable to how keyboards themselves work: the amount of key travel needed before an electrical keypress signal is generated, and so on, plus the whole processing chain that follows. That is where these older computers are much, much faster, because USB, or worse, Bluetooth, only allows for so much latency. There are also substantial signal-processing steps before the keypress even arrives as an interrupt at the CPU, let alone is turned into an actual OS input event, and then it still has to pass through various layers of application software. And that is just the input side; the whole thing then has to produce screen updates, which is another level of technocraziness.


The Amiga keyboard uses pretty much the same architecture as PC PS/2: a microcontroller in the keyboard talking over serial (~10 Kbit/s), pushing keys as soon as they are pressed, and another microcontroller on the motherboard generating an interrupt per key. USB is polled at 125 Hz. While yes, PS/2 and the Amiga keyboard will have lower latency, does ~7 ms make that big of a difference?
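
Back-of-the-envelope, using the figures above (the 10-bit framing and the specific numbers are assumptions; exact values vary by device):

    # Rough latency budget: serial keyboard link vs. USB HID polling.
    SERIAL_BITRATE = 10_000          # bits/s, Amiga/PS2-style keyboard link
    BITS_PER_SCANCODE = 10           # ~8 data bits + framing (assumed)
    serial_ms = BITS_PER_SCANCODE / SERIAL_BITRATE * 1000   # ~1 ms

    USB_POLL_HZ = 125                # common HID polling interval
    usb_worst_ms = 1000 / USB_POLL_HZ     # 8 ms worst case
    usb_avg_ms = usb_worst_ms / 2         # ~4 ms average wait for next poll

    print(f"serial: ~{serial_ms:.0f} ms, USB: ~{usb_avg_ms:.0f}-{usb_worst_ms:.0f} ms")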


Musicians can feel latencies of 1ms. It makes a difference.


> Amiga gave user-input interrupts the absolute highest priority

That's a pretty bad idea in general. Handling priorities properly is absolutely essential for the stability of a system. I'm not familiar with the Amiga, but systems weren't known for their stability back then; whole-OS crashes were much more common.


A typical operating system should not have stability issues just because of badly prioritized interrupts. You may have severe performance issues, sure, but if you're crashing because someone didn't handle an interrupt in time, someone designed something wrong. Maybe it's not the software's fault and some hardware engineer made bad choices, but this is generally not true today, since no mass-market OS actually guarantees interrupt latency.


Athlon + Windows 2000 was, for me, the first time a PC did not feel outright sluggish relative to the Amiga 500.


Cold booting my 7 MHz A600 to a fully loaded, functional and responsive desktop takes 17 seconds from hard drive (I just measured). Pretty decent, I'd say.

There's plenty about Amiga that is and/or feels instant. My workhorse is a 14 MHz A1200 and I use it at least once a week, so I get plenty of opportunity to compare. For its intended use cases, most things feel very snappy. Then there are of course areas where it doesn't stand a chance compared to a modern PC, even if the workload is "Amiga sized". Decompression and picture downsampling, for example.


The Amiga booted into the Workbench GUI. It didn't have a native "repl" as such, although you could open a window for a Rexx script interpreter. And if you booted from a floppy drive, that wasn't fast at all.


When people are used to virtually everything waiting on multiple round-trips over cellular connections, there's no point in optimizing local performance. An extra hundred milliseconds of lag in the UI gets attributed to the slowness of the connection.

Plus web and mobile developers have put a ton of work into animations and things like that to make slowness feel natural, which lowers expectations even further. Nobody expects a device to respond quickly. You expect to have to wait a bit for round-trips or animations or both.


> An extra hundred milliseconds of lag in the UI gets attributed to the slowness of the connection.

This is looking at the problem wrong.

I notice your shittily optimised application not just because it’s slow, but because it’s draining the battery.

This is a problem, and developers need to wake up to it. Crappy performance uses more energy because the machine is doing unnecessary work to an end that could be achieved more efficiently, often much more efficiently.

So, please, invest time in performance.


Alternate take: in a world where our entire society is trying to reduce energy usage, to minimize emissions, and so on, it strikes me as insanity that we do not demand every ounce of performance out of the hardware that we have, and equivalently, aim to minimize unnecessary demands on the hardware at the same time.

Burning up a laptop CPU and torching racks of servers and routers in some data center just because the web is full of shitty ui frameworks should be intolerable to consumers and providers. Efficiency really needs to be a goal of all system designers.


What UI framework do you like to use?


On the other hand, in a world of slightly non-responsive software, something that does respond instantly has a subconscious psychological impact on us and builds affinity.


Indeed.

I cover this in "Your Database Skills Are Not 'Good to Have'": https://renegadeotter.com/2023/11/12/your-database-skills-ar...

Specifically when I cite THIS: https://designingforperformance.com/performance-is-ux/


> Imagine if SELECT field1, field2 FROM my_table was faster than SELECT field2, field1 FROM my_table.

Possibly it might not have been smart enough to reorder against an index on (field1, field2), or it had some weird internal constraint on tuple ordering, or the queries were simply different enough to sometimes go down different query plans, or maybe there was something around the actual physical ordering on disk?

But yeah, Postgres and SQLite are modern marvels that we take for granted. MyISAM was not a good time, or at least people tended to violate the correctness/visibility rules it promised (IIRC) or something like that. The fact that you can just open up a stable, well-tested SQL instance that runs against disk, or drop Postgres into most transactional use cases, and generally not have to worry about unduly fighting the DB itself, is underappreciated.
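
A quick way to convince yourself that modern engines don't care about SELECT column order is to compare query plans. A minimal sketch with Python's built-in sqlite3 (the table and index are made up for illustration):

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE my_table (field1 INTEGER, field2 INTEGER)")
    con.execute("CREATE INDEX idx ON my_table (field1, field2)")

    # Both column orders should yield the same plan (a covering index scan),
    # so neither query is "faster" than the other.
    for q in ("SELECT field1, field2 FROM my_table",
              "SELECT field2, field1 FROM my_table"):
        plan = con.execute("EXPLAIN QUERY PLAN " + q).fetchall()
        print(q, "->", plan)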


That's probably a useful perspective for someone who makes software... But developers and most other folks interact with and reason about computers and software pretty differently. Generally, people have practical problems that computers can solve, and solving them with as few steps, things to learn, or other hassles as possible is what they really care about. Anything more than that is a nice bonus, but pretty unnecessary. Unless something is really sluggish, they just don't care much. Part of it is that lots of things on computers justifiably require a bit of a wait, and it takes a nontrivial bit of understanding to know whether a given wait is justified. Hell, most junior devs don't even have that intuition tuned accurately.

IMO Product Managers should be steering resources based on what end users want rather than what we feel they should want. Now if we could just get more of them to listen to actual users rather than to marketing people who are more interested in growing their list of feature bullet points than in making software that's useful to anybody at all.


It's just that single-core performance has sort of plateaued (as expected). We just can't make it 2x faster every couple of years as we used to, and it's not really perceptible otherwise, especially since most software, artificial benchmarks aside, won't max out the computing capacity of CPUs; it's mostly waiting on memory and IO.

What we can increase is parallelism, but most problems simply can’t be parallelized for a good enough percentage of the workload, and Amdahl’s law can’t be circumvented.
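
For reference, the bound is speedup = 1 / ((1 - p) + p/n) for parallel fraction p on n cores; a quick illustration in Python (the 90% figure is just an example):

    # Amdahl's law: speedup on n cores when fraction p of the work
    # parallelizes. The serial (1 - p) part caps the total speedup
    # at 1 / (1 - p), no matter how many cores you add.
    def amdahl_speedup(p: float, n: int) -> float:
        return 1 / ((1 - p) + p / n)

    p = 0.90  # example: 90% of the workload is parallelizable
    for n in (2, 8, 64, 1024):
        print(f"{n:5d} cores -> {amdahl_speedup(p, n):5.2f}x")
    # With p = 0.90 the ceiling is 10x, even with infinite cores.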

Nonetheless, many other things can be scaled, like resolution (think how much more data 4k@120fps is than hd@60fps) and file sizes, so the workload does increase and not everything cancels out, hence the perceived slowdown in certain cases.


> There is absolutely ZERO reason why a modern operating system or some rudimentary site with 10K users should not be blazing fast.

I find dismissive statements like this to be extremely useless to the conversation. I agree that a modern operating system could be blazing fast. But if that’s not the case, then there obviously is a reason.

If you know the reason why software in general has gotten slower (compatibility? bad devs? more features?), and have already dismissed it as an invalid or insufficient reason, then share what you think that reason is.

If you, like me, don’t know what that reason is, then let’s approach it with a problem-solving attitude and try to find out.


Virtually everything in a modern development stack contributes to the problem. Buses are optimized for bandwidth at the cost of latency. CPUs are broadly optimized for throughput at the cost of latency. OS design is optimized for throughput and isolation, again at the cost of latency. Language runtimes are optimized for feature-richness, again at the cost of performance. Languages are optimized for quality of life features that carry hidden performance costs, like virtual functions and exception-based control flow. Developers compound this with dozens of layers of indirection and abstraction until finally, eventually there's a message on screen for the user.

There isn't a singular cause to brainstorm. It's everything in the entire stack optimizing just a bit in other directions that produces the end result of disappointing, sluggish computers despite their actual capabilities. This discussion isn't going to tread any new ground, either. All of these things are known, because there are people who can't afford the costs of modern computing, like the HFT firms that are all on FPGAs now and real-time embedded systems. It's not hard to do; it's just tedious and expensive, the way development used to be for everyone.
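
As a toy illustration of how layers of indirection stack up (purely illustrative Python; real stacks pay an analogous toll at every level of the list above):

    import timeit

    def base(x):
        return x + 1

    # Each wrapper does nothing but forward the call, like a thin
    # abstraction layer that adds no functionality, only overhead.
    def layer1(x): return base(x)
    def layer2(x): return layer1(x)
    def layer3(x): return layer2(x)

    for fn in (base, layer3):
        t = timeit.timeit(lambda: fn(1), number=1_000_000)
        print(f"{fn.__name__}: {t:.3f}s for 1M calls")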


But then you go play a game on Windows, and everything works instantly. When performance is critical to a good experience, it can be done, and done pretty well.

Most desktop software just doesn't give a shit about performance.


Games have always been one sector where squeezing out every ounce of performance is the goal. And, unlike web devs, game developers test their products on inferior hardware, not just the latest and greatest video cards.


I'm fairly sure you would be equally upset with your Word document taking multiple seconds to load were bandwidth lower. Or if you couldn't have a 4K screen at 120 Hz, which is severalfold more data than HD at 60 Hz.


Capitalism's constraints won't even allow adequate bug fixing. There sure as hell isn't time or money to optimize things.



