The issue is that, when presented with a situation that requires writing legibly, spelling well, or reading a map, WITHOUT their AI assistants, they will fall apart.
The AI becomes their brain, such that they cannot function without it.
I'd never want to work with someone who is this reliant on technology.
Maybe 40 years ago there were programmers who would not work with anyone who used IDEs or automated memory management. When presented with a programming task WITHOUT their IDE or garbage collector, those people will fall apart.
Look, I agree with you. I'm just trying to articulate to someone why they should learn X if they believe an LLM could help them, and "an LLM won't always be around" isn't a good argument because, let's be honest, it likely will be. This is the same thing as "you won't walk around all day with a calculator in your pocket so you need to learn math"
> This is the same thing as "you won't walk around all day with a calculator in your pocket so you need to learn math"
People who can't do simple addition and multiplication without a calculator (12 * 30 or 23 + 49) are absolutely at a disadvantage in many circumstances in real life, and I don't see how you could think this isn't true. You can't work as a cashier without this skill. You can't play board games. You can't calculate tips or figure out how much you're about to spend at the grocery store. You could pull out your phone and use a calculator in all these situations, but people don't.
A lot of developers of my generation (30+) learned to program in a plain code editor and to compile their projects on the command line. Remove the IDE and we can still code.
On the other hand, my Master 2 students, most of whom learned scripting only the previous year, can't even split a project into multiple files after having it explained to them multiple times. Some have more knowledge and ability than others, but a significant fraction just copy-paste LLM output to solve whatever is asked of them instead of trying to do it themselves or asking questions.
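(To make "split a project into multiple files" concrete, here is the kind of minimal structure I mean; a toy sketch in Python, not anything from their actual coursework:

    # geometry.py -- the logic lives in its own module
    def area(width, height):
        return width * height

    # main.py -- the entry point imports what it needs
    from geometry import area

    if __name__ == "__main__":
        print(area(3, 4))  # prints 12

Nothing sophisticated, and still apparently a wall for some of them.)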
I think the risk isn't just that LLMs won't exist, but that they will fail at certain tasks that need to get done. Someone who is highly dependent on prompt engineering and doesn't understand any of the underlying concepts is going to have a bad time with problems they can't prompt their way out of.
This is something I see with other tools. Some people get highly dependent on things like advanced IDE features and don't care to learn how they actually work. That works fine most of the time but if they hit a subtle edge case they are dead in the water until someone else bails them out. In a complicated domain there are always edge cases out there waiting to throw a wrench in things.
Knowledge itself is the least concern here. Human society is extremely good at transmitting information. More difficult to transmit are things like critical thinking and problem-solving ability. Developing meta-cognitive processes like the latter is the real utility of education.
Indeed. More people need to grow their own vegetables. AI may undermine our capacity for high-level abstract thought, but industrial agriculture already represents an existential threat, should it be interrupted for any reason.
My point is that the skill set society requires is ever-changing. Skills like handwriting, spelling, and reading a map are fading in importance.
I could see a future where pioneering might be useful again.
Do you work with people who can multiply 12.3% * 144,005.23 rapidly without a calculator?
> The issue is that, when presented with a situation that requires writing legibly, spelling well, or reading a map, WITHOUT their AI assistants, they will fall apart.
The parent poster is positing that in 90% of cases they WILL have their AI assistant, because it's in their pocket, just like a calculator. It's not insane to think that, and it's a fair point to ponder.
When in human history has a reasonably educated person been able to do that calculation rapidly without a calculator (or tool to aid them)? I think it's reasonable to draw a distinction between "basic arithmetic" and "calculations of arbitrary difficulty". I can do the first and not the second, and I think that's still been useful for me.
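(Though even that example reduces to basic arithmetic if you drop the "rapidly" requirement; a rough decomposition, figures approximate:

    12.3% of 144,005.23
      = 10% of it   (~14,400.52)
      + 2% of it    (~ 2,880.10)
      + 0.3% of it  (~   432.02)
      ~ 17,712.64

The "rapidly" is what separates basic arithmetic from a party trick.)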
I do agree that it's a fair point to ponder. It does seem like people draw fairly arbitrary lines in the sand around what skills are "essential" or not. Though I can't even entertain the notion that I shouldn't be concerned about my child's ability to spell.
Seems to me that these gains in technology have always come at a cost, and so far the cost has been worth it for the most part. I don't think it's obviously true that LLMs will be (or won't be) "worth it" in the same way. And anyways the tech is not nearly mature enough yet for me to be comfortable relying on it long term.