If somebody asks a question on StackOverflow, it is unlikely that a human who does not know the answer will take time out of their day to completely fabricate a plausible-sounding answer.


People are confidently incorrect all the time. It is very likely that people will make up plausible-sounding answers on StackOverflow.

You and I have both taken time out of our days to write plausible-sounding answers that are essentially opposing hallucinations.


Sites like stackoverflow are inherently peer-reviewed, though; they've got a crowdsourced voting system and comments that accumulate over time. People test the ideas in question.

This whole "people are just as incorrect as LLMs" is a poor argument, because it compares the single human and the single LLM response in a vacuum. When you put enough humans together on the internet you usually get a more meaningful result.


At least, that used to be true.


Have you ever heard of the Dunning-Kruger effect?

There's a reason StackOverflow has upvotes, accepted solutions, and a third-party edit system: people will spend time writing their "hallucinations" very confidently.



