
It seems that computing correlations between words captures something from the underlying reality. GPT also uses embeddings to contextualize words.

https://news.ycombinator.com/item?id=35318448
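
As a rough illustration of "correlations between words": embedding models represent each word as a vector, and relatedness is typically measured as the cosine of the angle between vectors. A minimal sketch with made-up 3-d vectors (real embeddings are learned from co-occurrence statistics and have hundreds of dimensions):

    import numpy as np

    # Toy "embeddings"; the values here are invented for illustration.
    cat = np.array([0.9, 0.1, 0.3])
    dog = np.array([0.8, 0.2, 0.4])
    car = np.array([0.1, 0.9, 0.2])

    def cosine(a, b):
        # Cosine similarity: 1.0 means the vectors point the same way.
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    print(cosine(cat, dog))  # ~0.98: related words end up close
    print(cosine(cat, car))  # ~0.27: unrelated words end up far apart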



So going by your comment about the embeddings, it still doesn't have any understanding of the concepts behind the words it's grouping. The only thing it actually knows about these individual words is which part of speech they are and which group they belong to.


What I'm saying is that knowing how words are related to each other is probably the same thing as "understanding" the underlying concepts. For example, knowing that "queen" relates to "king" as "woman" relates to "man" captures at least something from reality.
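
A minimal sketch of that analogy with pretrained vectors, assuming gensim and its downloadable GloVe set "glove-wiki-gigaword-50" are available (the exact neighbors and scores will vary with the embedding set):

    import gensim.downloader as api

    # Load small pretrained GloVe embeddings (downloads on first use).
    model = api.load("glove-wiki-gigaword-50")

    # Vector arithmetic: king - man + woman should land near "queen".
    print(model.most_similar(positive=["king", "woman"], negative=["man"], topn=3))
    # Expected: "queen" at or near the top of the returned list.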



