
As the person said, they still had qualia even without the ability to form a narrative, and another person said it was hard to remember, because narratives act as a mnemonic. Having experienced ego death myself, I can confirm that I still have a memory of the experience, but that it is more a set of disjointed impressions (it is hard to maintain memory of the chronology, which a narrative might be able to cement). But I was still conscious and experiencing things; that is precisely the weird part.

You may have a point in some way (not specifically narrative alone, though): It is true that certain anesthetics don't even render us unconscious, but just make us forget what we were perceiving while having our CNS motor functions paralyzed... and maybe deep sleep is similar--while paralyzed maybe we forget our senses moment by moment. But ego death from psychedelics is not the same thing: it doesn't completely erase your ability to recall impressions and images.

Either way, none of this explains how qualia can arise, other than that remembering things gives us a sense of chronology/persistence of experience. So I doubt a machine that can come up with a narrative of what it is sensing would magically have qualia (and we would have no way of knowing, even if it claimed it did--all we could know is that WE are consciously observing an apparently conscious thing).

I sometimes think that persistence of experience is an illusion maintained by memory. That every time we lose consciousness then regain it, "it" is simply a new instance that is a clone of the old instance, with the illusion that it is the same person due to memories and being in the same body. Like in "The Prestige": the man drowns each night and experiences death, but the clone has the illusion that it has been alive up until the point it was created. In this view I see consciousness and qualia as a temporary instance/shape of neurons activating in unison that is suddenly conscious, but can lose residency, and when it re-forms has the illusion that "it" is the same thing and maintains an identity.

Considering that it takes a deep network of many artificial neurons to simulate a single biological neuron, and how slowly learning progresses now, I think we are a very, very long way from having such an instance of consciousness appear in an active neural network. Such a network would need to be so analog that it would be effectively the same thing as actual grey matter, and you may as well just have a baby. At least with a baby you would have a true sense of the responsibility you have in creating such an instance (or persistent series of instances that creates an identity).



> I sometimes think that persistence of experience is an illusion maintained by memory. That every time we lose consciousness then regain it, "it" is simply a new instance that is a clone of the old instance with the illusion it is the same person due to memories and being in the same body.

That makes me think of a saying I've heard: memory isn't read-only, it's always read-write, and you write back the modified version of what you read. We're constantly changing our past to be more like what we wished it was, and you can see an example of that process in another of my replies in this thread.





