> On two occasions I have been asked, 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.
Yeah, we're basically repeating the "search engine query" problem, just in a slightly different form. Using a search engine well has always been a skill you had to learn; people who never learned it got poor results, and often took those results at face value anyway. Then Google started serving up "answers", so if your query is shit, the "answer" most likely is too.
Point is, I don't think this phenomenon is new. It's just far less subtle today with LLMs, at least to people who have expertise in the subject.