
Do we know how human understanding works? It could be just statistical mapping, as you have framed it. You can't say LLMs don't understand when you don't have a measurable definition of understanding.

Also, humans hallucinate/confabulate all the time. LLMs even forget in the same way humans do: strong recall at the start and end of a text, but weaker in the middle.



