eadonmachine's comments | Hacker News

> What are the big milestones to expect in your life?

> Your next milestone is 08 Aug 2030 when you’ll be the 5th billionth person to be alive in the world.

I can't work out what they mean by this.


If you order everyone alive from youngest to oldest on that date, you'll be older than 5 billion people.
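
A toy sketch of that reading, in Python. Everything here is made up for illustration; the actual site presumably works from population projections rather than a list of individual birthdates.

    from datetime import date

    # Toy illustration of the "older than N people" reading.
    # These birthdates are invented; the real site presumably uses
    # population projections, not a list of individuals.
    population = [
        date(1945, 3, 1),
        date(1972, 6, 15),
        date(1990, 11, 2),   # <- "you"
        date(2005, 7, 23),
        date(2024, 1, 9),
    ]
    you = date(1990, 11, 2)

    # Order everyone alive from youngest (latest birthdate) to oldest.
    youngest_first = sorted(population, reverse=True)

    # Count how many people were born after you, i.e. are younger than you.
    younger = sum(1 for b in youngest_first if b > you)
    print(f"You are older than {younger} of the {len(population)} people alive here.")
    # The site's claim is that on 08 Aug 2030 that count reaches 5 billion.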


Oh! Thank you. They phrased that really badly. "You'll be the 5 billionth oldest person" would've been a lot clearer.

Or 5 billionth youngest, whatever lol


Inevitably they get bored or the fad dies out and nothing is gained, but then people still excitedly go along with the next one every time, as if they have no memory of the previous 1000 iterations.

I suppose it's probably more fun than miserably judging everyone from the outside...


Many of the people involved in the new fad don't have a memory of the previous 1000 iterations because they are different people. Many new people are joining the internet every day. Also be aware of the obvious-in-hindsight bias: anyone can point out fads after they've come and gone. It's not as easy to confidently identify the fads that are going on right now.


> The vast majority of people who work in the tech industry are perpetually 5 years out of university.

> So the vast majority of the time we get the same tools, the same frameworks, and the same ideas poorly reimplemented over and over.

I heard that at a talk by a salty tech OG. My experience doesn’t disagree with his observation, but I do have slightly more hope than he had.


Why not a third option of just enjoying life from any perspective you have, including sometimes being on the outside and sometimes on the inside?


>Microsoft launches a deepfake training tool


3200Hz memory??


Wait. You're right. I did a double-take then doubted myself.

I can't find anything historical that used 3.2kHz memory access speeds, but I'm pretty sure there was something out there... from the 1960s.


That speed is slower than magnetic-core memory; it could be delay-line memory, which would be even earlier.
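
Rough numbers behind that comparison (the per-technology timings are ballpark figures from memory, not from the thread):

    # Back-of-the-envelope comparison; the technology timings below are
    # approximate, not measurements cited anywhere in this thread.
    access_rate_hz = 3200
    cycle_time_us = 1e6 / access_rate_hz   # one access every ~312.5 microseconds

    core_cycle_us = (1, 6)       # 1960s magnetic-core memory: roughly 1-6 us per cycle
    delay_line_us = (100, 500)   # mercury/acoustic delay lines: hundreds of us average access

    print(f"3200 Hz -> one access every {cycle_time_us:.1f} us")
    print(f"core memory:       ~{core_cycle_us[0]}-{core_cycle_us[1]} us per cycle")
    print(f"delay-line memory: ~{delay_line_us[0]}-{delay_line_us[1]} us average access")

Which is why ~312 us per access lands in delay-line territory rather than core.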


I'm not sure what your question is. The memory is shared with the integrated GPU, so faster is better even if it costs a bit of battery life.


Memory speed is usually in MHz, not Hz. They're just making fun of the mistake.

