Rapture of the Nerds, Not

The idea of a technological singularity is sometimes derided as the Rapture of the Nerds, a phrase invented by SF writer Ken MacLeod [update: this isn't true, see his comment] and popularized by SF writers Charlie Stross and Cory Doctorow. I can take a joke, even a boring old joke that implies I’m a robot cultist, but it irks me when jokes become a substitute for thinking. There’s always someone in discussions on the topic who uses the comparison to fringe Christian beliefs about the End Days as if it’s some sort of argument, a reason why all the developments postulated by those who do take the singularity seriously will fail to materialize.

Although the parallel — like any appeal to authority, positive or negative — might work as a heuristic, a hint to look harder for more direct and technical criticisms, it of course fails as such a criticism itself. When computing atom trajectories in a supercomputer, or in nanotechnological devices, Mother Nature doesn’t check the consequences against a List of Ridiculous Beliefs, rejecting any outcomes too similar to those expected by the uncool and stupid.

Now, it could be that if there’s a close similarity between the singularity and the rapture, this points at some sort of psychological flaw shared by believers in both, a seductive but irrational attractor of the human mind that sucks people in, with those raised religiously dressing it up in terms of God, and us technologically-oriented atheists imagining a human-made God-substitute. But that image of a shared psychological flaw is itself so seductive that it has distorted people’s view of what the singularity is about into a kind of geek-bible-wielding strawman — singularitarian ideas are assumed to parallel fundamentalist Christian ideas even where they don’t, just because the comparison is apparently so much fun. “Oh, look at those silly nerds, aping the awful fundies without even knowing it!” In this post, I will list some (but not all) ways in which the singularity and rapture resemble each other less than some people think.

First, though, it’s worth listing some ways in which the singularity and the rapture do resemble each other. Both deal with something beyond human rules emerging into the world, something so powerful that, if it wanted to, it could make human effort from that point on irrelevant. Some predictions about the singularity have included ideas that this power would suddenly help us “transcend” human life and our human bodies, with uploading, in the critic’s mind, paralleling God snatching true believers up to Heaven. And with such an event looming on the horizon, it’s only to be expected that both groups would take the possibility very seriously, in some cases even making it a central concern in their lives.

Now, some differences:

  • Rationalism: Whatever you want to call it — critical thinking, debiasing, epistemic hygiene — I don’t know of any movement that emphasizes this nearly to the extent that the singularity movement does. For example, a few of the posters at Overcoming Bias (a group blog that I highly recommend) are involved somehow in the singularity movement, and several others buy into some sort of transhumanist worldview. I think this convergence is a good sign. Here’s an article by Eliezer Yudkowsky (pronounced “Frankensteen”) that talks about how to avoid biased views of the future. Whatever you might call the rapture believers, paragons of rationality they’re not.
  • Naturalism: Though this should be obvious, all developments associated with a technological singularity would take place within the ordinary physical processes that scientific folks have come to call home. Kooks like Terence McKenna, who connect the singularity to Mayan prophecies, are laughed at by serious singularitarians. Transhuman intelligence, through its ability to explore unexpected corners in solution space, may seem magical, but we all realize no actual magic is involved. I’m not one who would disqualify non-naturalistic claims outright; it’s just that, in my opinion, the evidence so far strongly favors a naturalist world. Still, it’s a difference worth noting.
  • Uncertainty: Contrary to what you might think, most singularity activists don’t think a singularity is in any way an unavoidable consequence of technological progress; the collapse of human civilization, unfortunately, would prevent it quite effectively. Nor are they anywhere close to absolute certainty that the singularity is going to happen, or happen soon, the way the rapture people have faith in their rapture. It’s possible, after all, that we’ve underestimated the difficulties in making progress in AI, brain scanning, and the like. Experience from past attempts at futurism, as well as psychological research, tells us that it’s easy to be overconfident. But one thing they do agree on is that the singularity is worth influencing, and that some kinds are worth striving toward. In terms of expected value, even a 10% chance of such a world-changing event should cause many of us to refocus our efforts (see the sketch just after this list). A high-profile exception on this point may be Ray Kurzweil, whose overly precise predictions, based on a methodology as unreliable as curve extrapolation, should earn him at least a little mockery (though there is also a lot to like in his writings).
  • Human-caused: Rapture believers wait for an external God to save them, independent of human effort. Programming a transhuman AI, on the other hand, is something we humans can do. The singularitarian worldview is sometimes claimed to encourage an attitude of passive waiting. I think that’s an unfair accusation; it actually encourages going out and solving problems, just through a different approach.
  • Nature contingent on human action: Again, contrary to what you might think, singularity activists don’t blindly expect a singularity to be positive. Intelligence sometimes helps humans understand how to bring out the best in themselves, but this will not generalize to AI. Unless thorough precautions are taken from the beginning, a superintelligent AI is likely to be indifferent toward us: not benevolent or cruel, just uncaring, except to the extent that our continued existence might get in the way of whatever it’s trying to achieve. That means it’s crucial for the first successful AI project to work under such precautions, which is also to say that a project working under such precautions should be the first to succeed.
  • No in-group perks: In the Singularity-as-Rapture-for-Nerds analogies that I’ve seen, it’s claimed that the Nerds expect only themselves to benefit, with the rest Left Behind in their misery, in the same way that only Christian true believers are supposed to benefit from the rapture. This seems like a clear example of fitting reality to your analogy when it should be the other way around. I haven’t seen any predictions by singularity advocates that restrict the effects to some elite group of techno-savvy westerners, nor have I seen anyone advocate that this should happen. The singularity is supposed to benefit humanity, not some particular group, and singularity activists understand this. If we succeeded at building an AI that cared enough to leave us alive in the first place, the AI would almost certainly be enough of a humanitarian to help all of us, and with maybe thousands of years of progress in a short time, it would have the resources to make this easy. Scenarios where only the rich and 31337 classes benefit from technological progress seem like a possible danger (though, if the cost of new technologies falls quickly enough, only a temporary one), but these are always pre-singularity scenarios.
  • No religious trappings like rituals, worship, or holy writings. I’d expand on this further if this post were about comparisons to religion in general, but it’s specifically about rapturism.
  • No revenge: One of the dynamics fueling religious rapture beliefs is the expectation of unbelievers being deliciously proved wrong when it happens, after which horrible things will happen to them. As far as I know, no one in the singularity movement deals in anything like these revenge fantasies. This is a good thing.
  • No anthropomorphism: The Christian God is in a way just a big authoritarian alpha monkey, but a superintelligent AI is not expected to think or behave anything like a human. Perhaps it would manifest more like a new set of laws of nature than like a human leader. It might not even be conscious. It would certainly not be a source of arbitrary moral authority.
  • The difference that actually matters, of course, is that a belief in the rapture is not justified by the evidence, and a qualified belief in the singularity, defined as disruptive changes caused by a recursively-improving superhuman AI, is. I have found a truly marvelous proof of this, but alas, it falls beyond the scope of this post.
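
To make the expected-value remark in the “Uncertainty” item above concrete, here is a minimal back-of-the-envelope sketch. The 10% probability comes straight from that item; the “impact” numbers are purely illustrative assumptions of mine, not anyone’s actual estimates.

```python
# Back-of-the-envelope expected-value comparison.
# The 10% probability is the figure used in the "Uncertainty" item above;
# the impact scores are made-up illustrative units, not real estimates.

def expected_value(probability: float, impact: float) -> float:
    """Expected value of an outcome: probability times impact."""
    return probability * impact

# A hypothetical mundane project: near-certain, modest payoff.
mundane = expected_value(probability=0.9, impact=1.0)

# A hypothetical singularity-scale outcome: only a 10% chance,
# but an enormous payoff if it happens (illustrative number).
world_changing = expected_value(probability=0.1, impact=1000.0)

print(f"mundane project:        EV = {mundane:.1f}")          # 0.9
print(f"world-changing outcome: EV = {world_changing:.1f}")   # 100.0
```

Even at long odds, the second expected value dominates the first by two orders of magnitude, which is all the “refocus our efforts” point requires.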

It’s also interesting to think about what would happen if we applied “Rapture of the Nerds” reasoning more widely. Can we ignore nuclear warfare because it’s the Armageddon of the Nerds? Can we ignore climate change because it’s the Tribulation of the Nerds? Can we ignore modern medicine because it’s the Jesus healing miracle of the Nerds? It’s been very common throughout history for technology to give us capabilities that were once dreamt of only in wishful religious ideologies: consider flight or artificial limbs. Why couldn’t it happen for increased intelligence and all the many things that would flow from it?

It would be tragic if, by thinking of some subjects as inherently religious, we let the religious impose their terms on our understanding of the world.

46 thoughts on “Rapture of the Nerds, Not”

  1. Very interesting article, and an interesting blog; I only recently came across it. I have been sceptical for a while of the term ‘Singularization’ for the processes involved in transhumanism and AI projections, since the technologies involved have (at least this far) tended to decentralise social behaviour, eroding traditional power formations at the same time. The term may be doing the agents of said change a bit of a disservice; identification often is a hideous disservice at the best of times. Those who decry singularitarians for being monotheistic are perhaps not doing the research. But, on the other hand, western techno-capitalist states do have a tendency to attempt to subsume all cultural systems into their power structures, with monotheistic power structures hosting decentralising technologies backed by universalising, transcendent, salvation-based philosophies. Maybe the real problem is with certain capitalist power formations. Perhaps this behaviour will auto-collapse through its own stupidity, perhaps it is necessary… or maybe it is simply a phase of multiple organisational systems that are mutually tied?

    One singularity, or many? That is perhaps a pertinent question that many singularitarians should ask.

  2. Speaking as a 40-something American male, I find the present to be really disappointing, and I expect the future to be the same except with faster computers and cool gadgets. I haven’t lost all hope of a Utopian future, but from what I have gathered of humanity it won’t happen any time soon.

  3. Steven:

    A high-profile exception on this point may be Ray Kurzweil, whose overly precise predictions based on such an unreliable methodology as curve-extrapolating should earn him at least a little mockery…

    Maybe you could provide details for this criticism of Kurzweil?

    Thanks in advance for any reply!

  4. FrF, I should really read Kurzweil’s book, but from his online stuff it always seems to me he’s too eager to say things like “X will be developed in such and such a year” when a lot of things could happen. Certainly transhuman AI would make all his curves obsolete (too “pessimistic”). A lot of people would also argue they’re too “optimistic”. I’m not sure whether these people are right.

    Mark, that essay is both inspiring and embarrassing. It looks to me like FM just made his dates up. I think >H futurism nowadays is more careful and wiser. Kurzweil at least has his curves. Also, as Michael points out, all the criticisms about rates of change could be true, and it wouldn’t affect the possibility of recursively self-improving AI except by making it happen later.

  5. mea culpa… i hereby admit to most of the above strawman singularitanarianism==second-coming charges.
    sorry about that; i’m a bit of a tool sometimes.
    why does it have to be that trolling is so much easier to do than real investigation and insight?… in my defence I tend to produce such vomitus-verbiage at cognition-constricting times of day – e.g. it’s almost 4am my time as I speak.
    heh I need a snooze. keep up the good work, hopefully i’ll be able to de-bias and deconstruct myself to a point of stable rationality at some future juncture.
    sorry to comment on a public board in such a first person manner. dunno about you guys but i’m the sort that needs frequent public humiliation as an agent for change :-)

  6. Pingback: Accelerating Future » Existential Risks: Serious Business.

  7. I have seen some people who entertain “revenge” ideas, saying that in the case of a Singularity they will personally hunt down and torture/kill anyone who was aware of the Singularity/transhumanist scenario but didn’t help in any way.

  8. I suppose that you could say that the simple awareness and coming to terms with transhumanist technologies, like uploading, would give many people a definite head start in using such technologies to improve their lives.

    On the subject of revenge, I do think that outright obstructionists should be punished somehow. For example, George W. Bush and his embryonic stem cell research ban. It may be inaccurate to say people suffering from fatal illnesses today are doomed to death specifically because of Bush’s actions. However, any delay will result in SOME people in the future dying needlessly, even if it’s only by one day.

    The way I see it, the singularity and other transhumanist technologies turn things that in the past were merely treated as “acts of God” into moral issues. In the past, no one could have been blamed for deaths like those from Hurricane Katrina. Today, with all our know-how and early-warning technologies, all those deaths cease to be acts of God and become acts of Man. Whether it is the president’s fault, or the governor’s, or the mayor’s, or that of the people of New Orleans for electing such incompetent leaders, people are to blame. In the future, people will be put into the position of being blamed for even more.

  9. The sad way of the world is that it requires a lot of work to create that better future we all dream of, and if we don’t work — really give it our all — we’re just going to keep waking up 20 years later to find we’re not living in the future we want to be in. At a slow and steady pace, it will be our grandchildren and great-grandchildren who live in the world we want to live in.

    I’m sure I’ll /like/ my grandchildren and all, when I have them, but I’d really prefer to live in the future myself, and not just imagine it for my grandchildren.

    So for my own selfish reasons, I think that everyone who can — which is really most of us when you get right down to it — should really go out there and /try/, try to build robots and moon bases and systems more intelligent than any systems which have previously existed. In today’s world, you can probably get funding for something even as absurd as a moon base, and since most startups fail anyway, wouldn’t you much rather try and fail to build a moon base, than to try and fail to build another web browser or webmail service … ? And if you succeed –

  10. I know there are some people who entertain revenge fantasies related to the Singularity, but I can’t see that as reflecting on its likelihood to happen. The Singularitarians may be basing their beliefs on rationality and sound analysis of the facts, but they’re still human, and humans are frequently vindictive.

    If Bob believes the sun will rise in the west next morning, and Joe believes it will rise in the east like always, does it have any bearing on truth value of their beliefs that each one intends to laugh and gloat when the other is proven wrong?

  11. First, great blog. Second, Improbus’ comment gave me some hope. Perhaps the best protection against obliteration by a singularity is to require that any attempts at creating self-improving intelligences smarter than humanity be done on Microsoft platforms?

  12. Pingback: Black Belt Bayesian » Authority

  13. Good essay, but I think you should add a section to it explaining how not all transhumanists are blinkered optimists. That seems to be another common misconception that fits in with the whole “rapture of the nerds” thing. For instance, here’s something just said to me in a discussion, referring to this post:

    The link pdf gave didn’t seem to touch on the assurance of singularitists that the singularity would be a wonderful thing. It strikes me that it could have the same wonderfulness level as the mass media or high finance, both of which are very sophisticated and futurist, but as far as I can tell, horrible.

  14. Two points I’ve had to make over and over: the expression ‘The Rapture of the nerds’ was not coined by me but by an Extropian writing a satirical piece in an early-90s issue of Extropy; and in my novel The Cassini Division, where a character derides ‘the Rapture for nerds’, the singularity actually happens.

  15. I never read Extropy much, but I was on the extropians list starting in 1993, and I remember Timothy C. May talking about the “Techno-Rapture”. The exact wording “Rapture for Nerds” I associate with the Cassini Division, though as Ken points out, the Nerds made it happen.

  16. Pingback: Accelerating Future » Analogies So Funny, They Halt Critical Thinking

  17. Pingback: Accelerating Future » A Non-Half-Assed Response to “One Half a Manifesto”

  18. I navigated here from a peak oil discussion thread started by someone who is concerned about peak oil but who is put off by the tendency of peak oil zealots to reflexively reject any notion of a technological singularity without taking the time to thoroughly examine the arguments of Singularitarians. Another poster in that thread provided the link to “Rapture of the Nerds, Not.”

    I had just made a post that contained the following when I found the link:

    “I’ve heard people bring up the similarity between the “Geek Rapture” and the “Millenarian Christian Rapture” for over a decade now, but I’ve yet to see anyone flesh it out into a rigorous argument for rejecting the possibility of a technological singularity.”

    Then I came here and I read, “Kooks like Terence McKenna, who connect the singularity to Mayan prophecies, are laughed at by serious singularitarians.”

    Sigh. Bias physician, heal thyself.

    If you know of any “serious singularitarians” who demonstrate even a passing clue with regard to Terence McKenna’s thoughts and predictions concerning the accelerating evolution of machine intelligence, a url would be appreciated. The only person with “serious singularitarian” cred that I’ve encountered who has made a good-faith effort to take in McKenna’s viewpoint is Ben Goertzel, and Ben admires McKenna.

    http://c-realmpodcast.podomatic.com/entry/2007-04-11T10_39_40-07_00

  19. Pingback: tartley.com » Blog Archive » Trans-Speculative Ramblings

  20. Pingback: Accelerating Future » Special Report on the Singularity by IEEE Spectrum

  21. Pingback: Dad2059’s Blog of Science-Fiction/Science Fact and Tinfoil

  22. Pingback: Accelerating Future » Response to Glenn Zorpette, Editor of IEEE Spectrum

  23. Pingback: David J. Williams » Blog Archive » Nazi plans to bomb North America. . .not

  24. Pingback: Accelerating Future » Stross’ Singularity-Clueless, “New Scientist”, Yawn-tastic 21st Century Future

  25. Pingback: Accelerating Future » Charles Stross Adventures Continued

  26. Pingback: Accelerating Future » Technological Singularity/Superintelligence/Friendly AI Concerns

  27. Pingback: Accelerating Future » Superintelligence Is Likely to Happen, Whether or Not You Were Disillusioned by AI in Your College Days.

  28. Pingback: Black Belt Bayesian » Rapture versus MechaRapture

  29. Pingback: Accelerating Future » Making Sense of the Singularity

  30. Pingback: Intelligence Explosion

  31. Pingback: Accelerating Future » Response to Jamais Cascio on “The Singularity and Society”

  32. Pingback: Thinkologist: The Dudley Lynch Blog on Brain Change » Blog Archive » So Far, the Singularity Volunteer Fire Dept. Has Been Sounding Ten Alarms While Rushing Around Trying to Find Smoke

  33. Pingback: Nerd Nihilism | Thrivenotes

  34. Pingback: Accelerating Future » Raiders News Networks Highlights My Comments to Tom Horn
