> The definition of AGI is diffuse enough to make it an argued point
This hits the nail on the head. Two or three years ago, when the current round of AGI hype started, everyone came up with their own definition of what it meant. Sam Altman et al made it clear that it meant people not needing to work anymore, and spun that in as positive a way as they could.
Now we're all realising that everyone has a different definition, and the Sam Altmans of the world are nitpicking over exactly what they mean so that they can claim success without actually delivering what everyone expected. No one genuinely believes that AGI means beating humans on some specific maths olympiad, but that's what we'll likely get. At least this round.
LLMs will become normalised, and everyone will see them for the 2x-3x improvement they are (once all externalities are accounted for), rather than the 10x-100x we were promised, just like every round of disruption beforehand. Then we'll wait another 10-20 years for the next big AI leap.