
By that definition, what you realize is that it's the same as what I said, since it can easily be reduced down to anything any human can do, and your definition says AGI can go figure out how to do it. You extrapolate this onto future tasks and voilà.

As I mention in another post, this is why I don't make any distinction between AGI and superintelligence; I believe they are the same thing. A thought experiment: what would it mean for a human to be superintelligent? Presumably it would mean learning things with the least possible amount of exposure (not omniscience, necessarily).


