I get my power off-grid through solar generation and battery storage, so I monitor usage pretty closely.
In my experience, newer computers have slightly higher idle power consumption, but much less total consumption for a given compute task. On top of that, new computers are more likely to have dedicated hardware to accelerate the latest codecs.
If you're using an old computer for ray tracing or neural networks or video transcoding, it's probably using enough power that it's worth upgrading.
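A quick back-of-envelope calculation shows why: what matters for a fixed task is energy (power × time), not peak draw. The wattages and durations below are hypothetical illustration numbers, not measurements.

```python
# Compare energy used for one fixed compute task (e.g. a video transcode)
# on an old machine vs. a new one. All numbers are hypothetical.

def task_energy_wh(power_watts: float, hours: float) -> float:
    """Energy consumed for a task: watts x hours = watt-hours."""
    return power_watts * hours

# Old machine: lower peak draw, but the task takes far longer.
old_wh = task_energy_wh(power_watts=95, hours=4.0)   # 380 Wh
# New machine: higher peak draw, but a hardware codec finishes quickly.
new_wh = task_energy_wh(power_watts=140, hours=0.5)  # 70 Wh

print(f"old: {old_wh:.0f} Wh, new: {new_wh:.0f} Wh")
```

Even though the new machine draws more watts while working, it spends so much less time on the task that its total energy use is a fraction of the old machine's.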
If you're browsing the Web, watching YouTube videos, or running a file server, power consumption is probably similar on old and new computers; regardless of age, though, it's much higher on desktops than on laptops.
Look at power supplies on vintage computers, and you'll see that they're much, much smaller than on modern computers.