Hacker News
Emergent Intelligence (chetansurpur.com)
10 points by chetan51 on Aug 8, 2013 | 6 comments


While it is true that seemingly complex phenomena can emerge from interactions among a few simple rules, and that the human brain is a complex structure built from repeating basic units (neurons), it by no means follows that the intelligence housed in our brains is an emergent phenomenon.

Our intelligence could just as well be a complicated mesh of a huge number of 'rules' that evolved over a long period of time in response to the many difficult situations that threatened the survival of our ancestors. These rules are what we call intuition, innate knowledge, natural tendencies, and reflex responses.

Having said that, we have not ruled out the possibility of intelligence (albeit of a form different from ours) arising as an emergent phenomenon from simple rules. But if we were to replicate human intelligence on silicon, I'm more inclined to believe that we'll have to 'manually' encode a huge number of situation-specific rules, e.g. 'when you see a wriggly thing, run away.'


Actually, there's a growing amount of evidence that there's a single, general-purpose algorithm in the human brain that gives rise to intelligence. For one, there's the fact that every part of the neocortex looks and behaves much the same. There's also the fact that the brain is very plastic in what it learns – the auditory cortex can learn to "see" if the signals from the eyes are rerouted away from the visual cortex and into the auditory cortex. It's very unlikely that our brain is hard-wired to recognize faces, for instance; rather, it learns to do so using this generic learning algorithm.

I urge you to watch Andrew Ng's talk that I linked to in the post, and read On Intelligence (http://www.amazon.com/On-Intelligence-Jeff-Hawkins/dp/080507...) by Jeff Hawkins, a book that totally changed the way I look at intelligent behavior.


Yep, I've seen his talk. It's quite fascinating. However, what you're describing is a learning algorithm, which does not necessarily equate to intelligence. OpenCyc would be the best example that illustrates my point.

Edit: on second thought, you probably meant that given such a general-purpose learning algorithm and a suitable environment, the algorithm would in time learn enough to produce intelligence of some kind (of what kind, I'm not sure) that's capable of thinking. In that case I agree with you, and I'll have to revise my opinion, but I'm still not sure it qualifies as an emergent phenomenon arising from simple rules. An analogy would be Google's search algorithm running on huge amounts of data. Would you call the search results an emergent phenomenon arising from simple rules?


The most concrete version of my point is that I don't think the most powerful AI we'll create will have, for instance, a human-coded algorithm for detecting faces. Instead, it'll have the ability to read electrical signals from a camera and understand the changing patterns in them, including the presence of faces. This ability to understand changing patterns would arise from rules "simpler" than rules designed specifically to recognize faces.

So yes, a general-purpose learning algorithm, using the correct paradigm, would learn to think in a way as powerful as we do. And it would do so in ways its programmers could never predict.
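To make that concrete with a toy sketch (this is hypothetical online k-means clustering, not Ng's or Hawkins's actual algorithm): a single generic learning rule can discover structure in whatever raw vectors it's fed, with nothing in the code that is specific to faces, vision, or any other domain.

```python
# Hypothetical sketch of a "generic" learner: online k-means.
# The only rule is "nudge the nearest prototype toward the input";
# nothing in it is specific to any sensory modality.

def online_kmeans(stream, k=2, lr=0.1):
    """Learn k prototype vectors from a stream of raw feature vectors."""
    prototypes = []
    for x in stream:
        if len(prototypes) < k:
            prototypes.append(list(x))  # seed prototypes from first k inputs
            continue
        # find the nearest prototype (squared Euclidean distance)
        best = min(range(k),
                   key=lambda j: sum((p - a) ** 2
                                     for p, a in zip(prototypes[j], x)))
        # the single learning rule: move it a little toward the input
        prototypes[best] = [p + lr * (a - p)
                            for p, a in zip(prototypes[best], x)]
    return prototypes

# Two "sensory" clusters; the learner discovers both without being told
# anything about what the vectors mean.
data = [(0.0, 0.0), (1.0, 1.0)] + [(0.2, 0.0), (0.8, 1.0)] * 50
print(online_kmeans(data))
```

Feed it pixels and it groups visual patterns; feed it audio frames and it groups sounds. The "knowledge" it ends up with is in the learned prototypes, not in any hand-coded, situation-specific rule.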

In the same vein, I would say that Google's search results are an emergent phenomenon, albeit not quite as interesting as general-purpose intelligence. This is because it's intractable to predict what Google will return for certain queries, even if we know all of its rules. Keep in mind that there are degrees of emergence; it's not black and white. (On the other hand, I don't think Google's algorithm is as "simple" as it originally was, but that's a discussion for another time.)


What we call intelligence seems to be heritable to an extent, and we all share instincts and even Jungian archetypes. All of this suggests that we each arrive with a "mother matrix" of experience-patterned structure (not just repeating basic units). For another example, consider the genius an infant needs to acquire and decode an entire modern language.

I surmise that that's the "nature" part, and what emerges is the "nurture" part. But these are all just shots in the dark. Passing our eventual understanding on to faster machines with better retention seems suicidal.


When I first read John Holland's book on Emergence, it opened my eyes to an exciting new field.

http://www.amazon.com/dp/0738201421

I'm not sure we can solve AI in an 8-bit cellular automaton, but the general concept of intelligence emerging rather than being designed certainly rings true with human evolution.
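For anyone who hasn't played with cellular automata, here's a minimal sketch (not from Holland's book, just a standard example) of an elementary cellular automaton, where a single 8-bit rule table applied uniformly produces surprisingly intricate global patterns. Rule 110 is the famous one, since it's known to be Turing-complete.

```python
# Toy demo: an elementary cellular automaton (rule 110).
# One simple local rule, applied everywhere, yields complex
# global patterns -- the "simple rules" side of emergence.

RULE = 110  # the update rule, encoded as an 8-bit lookup table

def step(cells):
    """Advance a row of 0/1 cells by one generation (edges fixed at 0)."""
    out = []
    for i in range(len(cells)):
        left = cells[i - 1] if i > 0 else 0
        right = cells[i + 1] if i < len(cells) - 1 else 0
        pattern = (left << 2) | (cells[i] << 1) | right  # 3-cell neighborhood
        out.append((RULE >> pattern) & 1)  # look up the new state
    return out

# Run a few generations starting from a single live cell.
row = [0] * 31
row[15] = 1
for _ in range(12):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Whether anything we'd call intelligence can emerge from something this simple is exactly the open question, but it's a nice illustration that the complexity of the output is not bounded by the complexity of the rule.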



