> I think maybe you are misrepresenting the thesis of the book
I certainly could be! It has been a while. But are you saying your understanding was that sentience was achieved in the scramblers' evolution, and then later discarded in favour of "more teraflops", if you'll excuse my analogy? That wasn't my understanding - but I might well be wrong.
Even if so, the idea remains unconvincing - in fact, more so. After a species achieves a certain dominance in its environment - something you would very much expect from a civilisation capable of building spacecraft - evolution simply doesn't apply any more, not in any Darwinian sense. Even with humans, evolution has basically stopped, or if it continues, it does so under circumstances entirely under our control. I find it impossible to imagine any naturally occurring scenario in which any such sentient species is forced by evolutionary pressure to optimise away consciousness in favour of other mental tasks. They would instead, as you mention, simply augment their powers with technology - or deliberate genetic intervention.
It's still an interesting premise and definitely got me thinking. I also highly recommend the book. It's actually quite exciting, all this philosophy does not take up that much space in the text ;)
The problem with the Chinese room is that someone had to build it... and someone had to invent Chinese.
>I find it impossible to imagine any naturally occurring scenario in which any such sentient species is forced by evolutionary pressure to optimise away consciousness in favour of other mental tasks.
If a person were born tomorrow, on Earth, with a mutation that made them slightly less sentient but more intelligent, they would have no issue reproducing.
Or you could conduct this thought experiment: if GPT-4 were somehow installed in a robot with the prompt "reproduce", do you think it would not replace humans after some time?
Here is an interesting passage from the book I feel is relevant:
"So sentience has gotta be good for something, then. Because it's expensive, and if it sucks up energy without doing anything useful then evolution's gonna weed it out just like that."
"Maybe it did." He paused long enough to chew food or suck smoke. "Chimpanzees are smarter than Orangutans, did you know that? Higher encephalisation quotient. Yet they can't always recognize themselves in a mirror. Orangs can."
"So what's your point? Smarter animal, less self-awareness? Chimpanzees are becoming nonsentient?"
"Or they were, before we stopped everything in its tracks."
"So why didn't that happen to us?"
"What makes you think it didn't?"
It was such an obviously stupid question that Sascha didn't have an answer for it. I could imagine her gaping in the silence.
"You're not thinking this through," Cunningham said. "We're not talking about some kind of zombie lurching around with its arms stretched out, spouting mathematical theorems. A smart automaton would blend in. It would observe those around it, mimic their behavior, act just like everyone else. All the while completely unaware of what it was doing. Unaware even of its own existence."
"Why would it bother? What would motivate it?"
"As long as you pull your hand away from an open flame, who cares whether you do it because it hurts or because some feedback algorithm says withdraw if heat flux exceeds critical T? Natural selection doesn't care about motives. If impersonating something increases fitness, then nature will select good impersonators over bad ones. Keep it up long enough and no conscious being would be able to pick your zombie out of a crowd." Another silence; I could hear him chewing through it. "It'll even be able to participate in a conversation like this one. It could write letters home, impersonate real human feelings, without having the slightest awareness of its own existence."
"I dunno, Rob. It just seems—"
"Oh, it might not be perfect. It might be a bit redundant, or resort to the occasional expository infodump. But even real people do that, don't they?"
"And eventually, there aren't any real people left. Just robots pretending to give a shit."
"Perhaps. Depends on the population dynamics, among other things. But I'd guess that at least one thing an automaton lacks is empathy; if you can't feel, you can't really relate to something that does, even if you act as though you do. Which makes it interesting to note how many sociopaths show up in the world's upper echelons, hmm? How ruthlessness and bottom-line self-interest are so lauded up in the stratosphere, while anyone showing those traits at ground level gets carted off into detention with the Realists. Almost as if society itself is being reshaped from the inside out."
"Oh, come on. Society was always pretty— wait, you're saying the world's corporate elite are nonsentient?"
"God, no. Not nearly. Maybe they're just starting down that road. Like chimpanzees."
"Yeah, but sociopaths don't blend in well."
"Maybe the ones that get diagnosed don't, but by definition they're the bottom of the class. The others are too smart to get caught, and real automatons would do even better. Besides, when you get powerful enough, you don't need to act like other people. Other people start acting like you."
While I'm not totally convinced by the premise of the "1%ers" being zombies, I think it's, again, a bit more than an interesting thought experiment.
"Things do what they do, because if they didn't, they wouldn't be what they are."
If machines can be sentient, then fire could justifiably be called sentient as well. Considering that our metabolic processes largely amount to exothermic energy transfer through oxidation, distributed via what amounts to liquid rust, I suppose the circle from man to machine, to simpler machine, to thermodynamics, and back to man could be completed through this pathway.