The concern here is mainly about practicality. The original mainframes did not command startup valuations counted in fractions of the US economy, though they did qualify for billions in investment.
This is a great milestone, but OpenAI will not be successful charging 10x the cost of a human to perform a task.
Hmm, the link is saying the price of an LLM that scores 42 or above on MMLU has dropped 100x in 2 years, equating GPT-3.5 and Llama 3.2 3B. In my opinion GPT-3.5 was significantly better than Llama 3B, and certainly much better than the also-equated Llama 2 7B. MMLU isn't a great marker of overall model capabilities.
Obviously the drop in cost for capability in the last 2 years is big, but I'd wager it's closer to 10x than 100x.
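For what it's worth, the 100x figure is just a ratio of per-token prices, so the answer depends entirely on which two models you treat as capability-equivalent. A minimal sketch of that arithmetic, with all prices being illustrative assumptions rather than quotes from the link:

    # Rough sketch of the price-per-capability arithmetic.
    # All prices are illustrative assumptions, in USD per million tokens.
    old_price = 20.0    # assumed price of the older model (e.g. a GPT-3.5-era API)
    new_price = 0.20    # assumed price of a small hosted open model today

    print(f"Price drop: {old_price / new_price:.0f}x")   # 100x under these assumptions

    # The disagreement is really about the denominator: if the cheap model
    # isn't actually equivalent in capability, the fair comparison is against
    # a pricier model, which shrinks the ratio toward ~10x.
    comparable_price = 2.0   # assumed price of a genuinely comparable model
    print(f"Adjusted drop: {old_price / comparable_price:.0f}x")

So both the 100x and the 10x readings can be "correct" depending on whether you accept the MMLU-based equivalence.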
Or 10x the skill and speed of a human in some specific class of recurring tasks. We don't need full superhuman AGI for AI to become economically viable.
Companies routinely pay short-term contractors a lot more than their permanent staff.
If you can just unleash AI on any of your problems, without having to commit to anything long term, it might still be useful, even if it cost more than the equivalent human labour.
(Though I suspect AI labour will generally trend to be cheaper than humans over time for anything AIs can do at all.)