The idea of a technological singularity is sometimes derided as the Rapture of the Nerds, a phrase invented by SF writer Ken MacLeod [update: this isn't true, see his comment] and popularized by SF writers Charlie Stross and Cory Doctorow. I can take a joke, even a boring old joke that implies I’m a robot cultist, but it irks me when jokes become a substitute for thinking. There’s always someone in discussions on the topic who uses the comparison to fringe Christian beliefs about the End Days as if it’s some sort of argument, a reason why all the developments postulated by those who do take the singularity seriously will fail to materialize.
Although the parallel — like any appeal to authority, positive or negative — might work as a heuristic, a hint to look harder for more direct and technical criticisms, it of course fails as such a criticism itself. When computing atom trajectories in a supercomputer, or in nanotechnological devices, Mother Nature doesn’t check the consequences against a List of Ridiculous Beliefs, rejecting any outcomes too similar to those expected by the uncool and stupid.
Now, it could be that if there’s a close similarity between the singularity and the rapture, this points at some sort of psychological flaw shared by believers in both, a seductive but irrational attractor of the human mind that sucks people in, with those raised religiously dressing it up in terms of God, and us technologically-oriented atheists imagining a human-made God-substitute. But that image of a shared psychological flaw is itself so seductive that it has distorted people’s view of what the singularity is about into a kind of geek-bible-wielding strawman — singularitarian ideas are assumed to parallel fundamentalist Christian ideas even where they don’t, just because the comparison is apparently so much fun. “Oh, look at those silly nerds, aping the awful fundies without even knowing it!” In this post, I will list some (but not all) ways in which the singularity and rapture resemble each other less than some people think.
First, though, it’s worth listing some ways in which the singularity and the rapture do resemble each other. Both deal with something beyond human rules emerging into the world, something so powerful that, if it wanted to, it could make human effort from that point on irrelevant. Some predictions about the singularity have included ideas that this power would suddenly help us “transcend” human life and our human bodies, with uploading, in the critic’s mind, paralleling God snatching true believers up to Heaven. And with such an event looming on the horizon, it’s only to be expected that both groups would take the possibility very seriously, in some cases even making it a central concern in their lives.
Now, some differences:
- Rationalism: Whatever you want to call it — critical thinking, debiasing, epistemic hygiene — I don’t know of any movement that emphasizes this nearly to the extent that the singularity movement does. For example, a few of the posters at Overcoming Bias (a group blog that I highly recommend) are involved somehow in the singularity movement, and several others buy into some sort of transhumanist worldview. I think this convergence is a good sign. Here’s an article by Eliezer Yudkowsky (pronounced “Frankensteen”) that talks about how to avoid biased views of the future. Whatever you might call the rapture believers, paragons of rationality they’re not.
- Naturalism: Though this should be obvious, all developments associated with a technological singularity would take place within the ordinary physical processes that scientific folks have come to call home. Kooks like Terence McKenna, who connect the singularity to Mayan prophecies, are laughed at by serious singularitarians. Transhuman intelligence, through its ability to explore unexpected corners in solution space, may seem magical, but we all realize no actual magic is involved. I’m not one who would disqualify non-naturalistic claims outright; it’s just that, in my opinion, the evidence so far strongly favors a naturalist world. Still, it’s a difference worth noting.
- Uncertainty: Contrary to what you might think, most singularity activists don’t think a singularity is in any way an unavoidable consequence of technological progress; the collapse of human civilization, unfortunately, could prevent it quite effectively. Nor are they anywhere close to absolute certainty that the singularity is going to happen, or happen soon, the way the rapture people have faith in their rapture. It’s possible, after all, that we’ve underestimated the difficulties in making progress in AI, brain scanning, and the like. Experience from past attempts at futurism, as well as psychological research, tells us that it’s easy to be overconfident. But one thing they do agree on is that the singularity is worth influencing, and that some kinds are worth striving toward. In terms of expected value, even a 10% chance of such a world-changing event should cause many of us to refocus our efforts. A high-profile exception on this point may be Ray Kurzweil, whose overly precise predictions, based on a methodology as unreliable as curve extrapolation, should earn him at least a little mockery (though there is also a lot to like in his writings).
- Human-caused: Rapture believers wait for an external God to save them, independent of human effort. Programming a transhuman AI, on the other hand, is something we humans can do. The singularitarian worldview is sometimes claimed to encourage an attitude of passive waiting. I think that’s an unfair accusation; it actually encourages going out and solving problems, just through a different approach.
- Nature contingent on human action: Again, contrary to what you might think, singularity activists don’t blindly expect a singularity to be positive. Intelligence sometimes helps humans understand how to bring out the best in themselves, but this will not generalize to AI. Unless thorough precautions are taken from the beginning, a superintelligent AI is likely to be indifferent toward us — not benevolent or cruel, just uncaring, except to the extent that our continued existence might get in the way of whatever it’s trying to achieve. That means it’s crucial that the first successful AI project work under such precautions, and, equivalently, that a project working under such precautions be the first to succeed.
- No in-group perks: In the Singularity-as-Rapture-for-Nerds analogies that I’ve seen, it’s claimed that the Nerds expect only themselves to benefit, with the rest Left Behind in their misery, in the same way that only Christian true believers are supposed to benefit from the rapture. This seems like a clear example of fitting reality to your analogy when it should be the other way around. I haven’t seen any predictions by singularity advocates that restrict the effects to some elite group of techno-savvy westerners, nor have I seen anyone advocate that this should happen. The singularity is supposed to benefit humanity, not some particular group, and singularity activists understand this. If we succeeded at building an AI that cared enough to leave us alive in the first place, the AI would almost certainly be enough of a humanitarian to help all of us, and with maybe thousands of years of progress in a short time, it would have the resources to make this easy. Scenarios where only the rich and 31337 classes benefit from technological progress seem like a possible danger (though, if the cost of new technologies falls quickly enough, only a temporary one), but these are always pre-singularity scenarios.
- No religious trappings like rituals, worship, or holy writings. I’d expand on this further if this post were about comparisons to religion in general, but it’s specifically about rapturism.
- No revenge: One of the dynamics fueling religious rapture beliefs is the expectation that unbelievers will be deliciously proved wrong when it happens, after which horrible things will happen to them. As far as I know, no one in the singularity movement deals in anything like these revenge fantasies. This is a good thing.
- No anthropomorphism: The Christian God is in a way just a big authoritarian alpha monkey, but a superintelligent AI is not expected to think or behave anything like a human. Perhaps it would manifest more like a new set of laws of nature than like a human leader. It might not even be conscious. It would certainly not be a source of arbitrary moral authority.
- The difference that actually matters, of course, is that a belief in the rapture is not justified by the evidence, while a qualified belief in the singularity, defined as disruptive changes caused by a recursively self-improving superhuman AI, is. I have found a truly marvelous proof of this, but alas, it falls beyond the scope of this post.
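The expected-value argument in the Uncertainty point above can be made concrete with a toy calculation. The payoff numbers below are purely illustrative assumptions of mine, not figures anyone in the debate has actually proposed:

```python
# Toy expected-value comparison. The payoff figures are made-up
# illustrative units, not real estimates of anything.

def expected_value(outcomes):
    """Sum of probability * value over (probability, value) pairs."""
    return sum(p * v for p, v in outcomes)

# A safe, mundane effort that reliably yields 1 unit of value:
mundane = expected_value([(1.0, 1)])

# Work toward a positive singularity: assume a 10% chance of a
# world-changing payoff (1000 units) and a 90% chance of nothing.
singularity_work = expected_value([(0.10, 1000), (0.90, 0)])

print(mundane, singularity_work)
```

Even with the outcome heavily discounted for uncertainty, the low-probability, high-stakes option dominates in expectation, which is all the 10% remark is claiming: under these assumptions a rational planner reallocates effort toward the long shot without believing it is certain, or even likely.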
It’s also interesting to think about what would happen if we applied “Rapture of the Nerds” reasoning more widely. Can we ignore nuclear warfare because it’s the Armageddon of the Nerds? Can we ignore climate change because it’s the Tribulation of the Nerds? Can we ignore modern medicine because it’s the Jesus healing miracle of the Nerds? It’s been very common throughout history for technology to give us capabilities that were once dreamt of only in wishful religious ideologies: consider flight or artificial limbs. Why couldn’t it happen for increased intelligence and all the many things that would flow from it?
It would be tragic if, by thinking of some subjects as inherently religious, we let the religious impose their terms on our understanding of the world.