
> No clue how they let it pass

It’s very common to see AI evangelists taking LLM output at face value, particularly when it’s about something they are not experts in. I thought we’d start seeing less of this as people get burned by it, but it seems we’re actually seeing more of it as LLMs get better at sounding correct. Their ability to sound correct continues to increase faster than their ability to be correct.



> Their ability to sound correct continues to increase faster than their ability to be correct

Sounds like a core skill for management. Promote this man (LLM).


This is just like the early days of Google search results: "It's on the Internet, so it must be true."



