So, a negative review of the Summit has finally been posted, by Maxwell Barbakow and Jacob Albert at the Yale Daily News, the student paper.
Reading the beginning of the article, it seems as if Max and Jacob were prodded into going by an associate or something: they show that they have no clue about the topic, and were negatively predisposed to it from the starting line. This is demonstrated by the quote:
Though they seemed incomprehensible at the time, we came to a better understanding of the attendees' motives for schlepping from various parts of the country to New York, once we got a better grasp of the tenets behind the Singularity.
There's nothing wrong with that... but then, why are you going to a semi-advanced conference on a topic you dislike? Why should your review be taken seriously if you openly admit that you got yourself in over your head by going in the first place? Isn't it clear that a negative predisposition from the start is going to influence you as you look to confirm your initial beliefs?
Then comes the compulsory line mentioning pony-tails.
Behind us, there were computer hackers -- some pony-tailed, most overweight, almost all clad in leather jackets -- mingling with tech hippies sporting braided goatees and yoga pants.
Why the heck are people so obsessed with pony tails? Another writer who reviewed the Summit also mentioned them. Why is it that our corporate mainstream culture sees pony tails or long hair on men as some sort of horrible affront? Are there really conferences out there where there isn't a pony tail in sight? If you look at images of the attendees on Flickr, you can see that 1) the number of people with pony tails is a tiny minority, maybe 5%, if that, and 2) the number of overweight people is also a minority.
The Singularity connotes a moment in time; to be precise, some moment in 2029, when the first "super-intelligent machine will arise, capable of improving on its own source code without human input."
No. We have programs that can improve their code without human input now -- they're called compilers. And certainly there is no known precise date for the Singularity, that is rubbish. Go back to the Wikipedia page for the Singularity for a basic introduction.
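To illustrate the point about compilers: constant folding is one of the simplest optimizations a compiler applies to code with no human in the loop. A toy sketch (assuming Python 3.9+ for `ast.unparse`; the function name is mine, not from any particular compiler):

```python
import ast

def constant_fold(source: str) -> str:
    """Rewrite a snippet of Python, folding constant arithmetic
    automatically -- no human input involved."""
    tree = ast.parse(source)

    class Folder(ast.NodeTransformer):
        def visit_BinOp(self, node):
            self.generic_visit(node)  # fold children first
            if (isinstance(node.left, ast.Constant)
                    and isinstance(node.right, ast.Constant)
                    and isinstance(node.op, (ast.Add, ast.Mult))):
                op = {ast.Add: lambda a, b: a + b,
                      ast.Mult: lambda a, b: a * b}[type(node.op)]
                return ast.copy_location(
                    ast.Constant(op(node.left.value, node.right.value)), node)
            return node

    return ast.unparse(Folder().visit(tree))

print(constant_fold("x = 2 * 3 + 4"))  # prints the "improved" code: x = 10
```

Trivial, of course, and nothing like recursive self-improvement; which is exactly the point. "Improving code without human input" by itself is mundane, and no precise date falls out of it.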
The purpose of the summit was twofold: to celebrate and to expound. To celebrate, because the ever-accelerating rate of technological change cannot help but lead to a future where man is machine, inhabiting a world without famine, war or disease.
Wrong, wrong, wrong. The purpose of the Summit was to investigate the issue with an open mind, and all the talks (which went right over your heads, as you openly admit later) did exactly that, as will be proven when they go online. (Don't ask me when. It will be a few weeks, not a few months like last time.)
If anything, maybe this is a reaction to Kurzweil's personal vision, which he didn't really even present too strongly in his talk, and which is not necessarily the vision of the 820 or so other individuals who attended. Not a single speaker said the Singularity was inevitable. The organizer of the conference, namely the organization I work for, the Singularity Institute, includes many people who consider accelerating technological change an incidental aspect of the Singularity and view the event mainly as the potential rise of greater-than-human intelligence, as it was originally defined by Vernor Vinge. In fact, the Singularity Institute fears a future where incorrectly programmed artificial intelligences produce havoc and possibly even extinction for humanity. This is hardly a blind celebration of an inevitable future Rapture. If you had gotten up in time for the first talk on Saturday, you would have seen that the conference began with a look at the risks of the Singularity, not the mystical inevitable Rapturous benefits that only idiots believe in.
The delusion that the Singularity is the "Rapture for Nerds" is perpetuated by the expectation that it is, and by confirmation bias operating to confirm that expectation, because that provides a convenient rhetorical framing to "understand" the issue without all the hard work of actually understanding it. It is an attractive cognitive shortcut that makes the believer seem both smart and suave without actually being either. As Eliezer Yudkowsky puts it in his "Three Major Schools" page, the three schools of thought on the Singularity are 1) accelerating change, 2) event horizon, and 3) intelligence explosion, with only the most ignorant believing that the Singularity is about 4) Apocalyptism. Yudkowsky comments:
I find it very annoying, therefore, when these three schools of thought are mashed up into Singularity paste. Clear thinking requires making distinctions.
But what is still more annoying is when someone reads a blog post about a newspaper article about the Singularity, comes away with none of the three interesting theses, and spontaneously reinvents the dreaded fourth meaning of the Singularity:
* Apocalyptism: Hey, man, have you heard? There's this bunch of, like, crazy nerds out there, who think that some kind of unspecified huge nerd thing is going to happen. What a bunch of wackos! It's geek religion, man.
These nerdy nerds have their nerd Rapture because they are computer freaks who never get laid! Bwa ha ha ha! We are so cool and not at all like that. Like, totally.
A lot of the people who "criticize" the Singularity have not even read the papers that present the reasoning behind the most often-heard arguments. They have read a blog post here, a newspaper article there, and just make up the rest based on whatever science fiction they saw on television last week. Real critiques of the Singularity, like Katja Grace's recent post, which can actually serve as a starting point for real discussion, are disturbingly difficult to find, because would-be geniuses latch on to the most fun-sounding framing of the idea, one that denigrates the thinking of an entire community, so they don't have to waste the time understanding it. When you transform all your opponents into straw men, victory is easy.
Back to the article:
And to expound, because its supposed inevitability aside, it is still unclear -- to us two, to the pony-tailed geniuses in the room and no doubt to you -- how the Singularity will actually come about.
The person who first introduced the concept of the Singularity, Vernor Vinge, didn't call it inevitable. None of the speakers implied it was inevitable, except perhaps Kurzweil. Many papers and analyses of possible paths to the Singularity have been published, including Vinge's famous first paper on the topic, but since you guys haven't read any of them, all this stuff must seem mighty mysterious to you. And since it's mysterious to you, why not say that everyone else around you is clueless as well? That helps you look a lot better.
Kurzweil's latest prediction, and the one he hopes to be remembered by, is the Singularity. He is obsessed with analyzing the patterns of technological improvement. According to him, a self-conscious computer program is the necessary product of Moore's law, which says that technical innovation accelerates at exponential rates, currently doubling every 10 years and only getting faster.
Yes, absolutely obsessed. Anyone who studies something and publishes books on it that make him famous must be obsessed with that something. Wildly obsessed.
The beauty of the Singularity is that it's grounded in a rigorous understanding of the organizational principles of the universe, despite the ludicrous claims and messianic elements that might surround it.
How would you know? You had only been thinking about the Singularity seriously for a couple of weeks before you wrote this article.
And the finisher:
Sure, except for Kurzweil, we didn't know anyone there and couldn't follow many of the talks. Too intelligent for themselves, too intelligent for others, too intelligent for the planet, the Singularity Summit attendees can only hope for a future humanity that is bound, it, too, by the cold intelligence of the machine mind. Live long and prosper, Mother Earth.
Such anti-intellectualism in a Yale University publication is pretty shameful. When the talks are put online, everyone will see that the vast majority of them are perfectly easy to follow. How did these kids get into Yale? By mocking people more intelligent than them? By spacing out and surfing the Internet on their phones while professors are lecturing?
Intelligence is why we are sitting comfortably in heated buildings, with ample food and water available through industrial agriculture, interconnected on a global digital network of information sharing. As Roko recently pointed out, most arguments against the power or plausibility of greater intelligence could have easily been invoked a million years ago to argue that the hominids of the day were sufficiently intelligent, and that Homo sapiens, a more intelligent species, would contribute nothing to the universe or would otherwise just be more of the same. In a world of such error and stupidity, no one can credibly dismiss the power of greater intelligence directed towards benevolent ends.