Simplified Humanism, Positive Futurism & How to Prevent the Universe From Being Turned Into Paper Clips

I recently interviewed Eliezer Yudkowsky for the reboot of h+ magazine, which is scaling down from a magazine into a community blog of sorts.

The interview is a good primer on what the Singularity Institute is about and the basic rationales behind some of our research choices, like focusing on decision theory. It is especially worth reading for those not yet familiar with the Institute's research, and it can also be used to promote the Singularity Summit, so please share the link!

Here are the questions I asked Eliezer:

1. Hi Eliezer. What do you do at the Singularity Institute?
2. What are you going to talk about this time at Singularity Summit?
3. Some people consider “rationality” to be an uptight and boring intellectual quality to have, indicative of a lack of spontaneity, for instance. Does your definition of “rationality” match the common definition, or is it something else? Why should we bother to be rational?
4. In your recent work over the last few years, you’ve chosen to …

Read More

Singularity Hub Posts About the Summit 2010

Singularity Hub, one of the best tech-news websites on the Internet (along with Next Big Future and KurzweilAI news), has posted a reminder about the upcoming Singularity Summit in San Francisco, along with a promise to provide excellent coverage.

Register before July 1st, before the price goes up another $100! We also have a special block of discounted rooms at the Hyatt available — $130/night instead of the usual $200.

Sorry, but yes, the Summit costs $485 now, and the price will rise to $585 and then $685. We fly all the speakers out and cover all their expenses; there are twenty speakers, so do the math (a rough sketch follows below). Profits from the Summit go to the Singularity Institute for our year-round operations and Visiting Fellows program, which provides us with a community of writers, speakers, and researchers to continue our Singularity effort until it is successful.
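For the curious, here is a back-of-the-envelope version of that math in Python. Only the speaker count (twenty) and the $130/night Hyatt rate come from this post; every other figure is an illustrative assumption, not actual Summit accounting:

```python
# Back-of-the-envelope Summit speaker costs. Only the speaker count (20) and
# the $130/night discounted Hyatt rate come from the post; the rest are guesses.
SPEAKERS = 20
AIRFARE = 800           # assumed round-trip airfare per speaker
HOTEL = 130 * 3         # assumed three nights at the discounted Hyatt rate
MISC = 300              # assumed meals, ground transport, etc.

speaker_expenses = SPEAKERS * (AIRFARE + HOTEL + MISC)
print(f"Estimated speaker expenses: ${speaker_expenses:,}")          # $29,800

# At the current $485 ticket price, covering speaker expenses alone takes:
print(f"Tickets just to cover speakers: {speaker_expenses / 485:.0f}")  # about 61
```

Even under these rough assumptions, a large slice of ticket revenue goes straight to getting the speakers on stage before a dollar reaches year-round operations.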

If you want to organize a cheaper annual event related to the Singularity, feel free to do so. We hold a workshop after the event for academics, so we get …

Read More

Reducing Long-Term Catastrophic Artificial Intelligence Risk

Check out this new essay from the Singularity Institute: “Reducing long-term catastrophic AI risk”. Here’s the intro:

In 1965, the eminent statistician I. J. Good proposed that artificial intelligence beyond some threshold level would snowball, creating a cascade of self-improvements: AIs would be smart enough to make themselves smarter, and, having made themselves smarter, would spot still further opportunities for improvement, leaving human abilities far behind. Good called this process an “intelligence explosion,” while later authors have used the terms “technological singularity” or simply “the Singularity”.

The Singularity Institute aims to reduce the risk of a catastrophe, should such an event eventually occur. Our activities include research, education, and conferences. In this document, we provide a whirlwind introduction to the case for taking AI risks seriously, and suggest some strategies to reduce those risks.
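As an illustrative aside of my own (not part of the essay), Good's snowball dynamic can be captured in a toy model where each increment of capability speeds up the next round of self-improvement. The feedback constant and starting level below are arbitrary assumptions, chosen only to show the qualitative shape:

```python
# Toy model of I. J. Good's "intelligence explosion": capability feeds back
# into the rate of further self-improvement. All numbers are arbitrary.
def trajectory(feedback: float, steps: int = 10, start: float = 1.0) -> list:
    levels = [start]
    for _ in range(steps):
        # Each gain in capability enables a proportionally larger next gain.
        levels.append(levels[-1] + feedback * levels[-1])
    return levels

print(trajectory(feedback=0.5))  # compounds: 1.0, 1.5, 2.25, 3.38, ... (snowball)
print(trajectory(feedback=0.0))  # flat at 1.0: no self-improvement, no explosion
```

The point of the sketch is just that any positive feedback term produces compounding growth rather than a plateau, which is the core of Good's argument.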

Pay attention and do something now, or be eliminated by human-indifferent AGI later. Why is human-indifferent AGI plausible or even likely within the next few decades? Because 1) what we consider “normal” or “common sense” morality is actually extremely complex, …

Read More

Last Chance to Contribute to 2010 Singularity Research Challenge!

Cross-posted from SIAI blog:

Thanks to generous contributions by our donors, we are only $11,840 away from fulfilling our $100,000 goal for the 2010 Singularity Research Challenge. For every dollar you contribute to SIAI, another dollar is contributed by our matching donors, who have pledged to match all contributions made before February 28th up to $100,000. That means that this Sunday is your final chance to donate for maximum impact.

Funds from the challenge campaign will be used to support all SIAI activities: our core staff, the Singularity Summit, the Visiting Fellows program, and more. Donors can earmark their funds for specific grant proposals, many of which are targeted towards academic paper-writing, or just contribute to our general fund. The grants system makes it easier to bring new researchers into the fold on a part-time basis, widening the pool of thinkers producing quality work on Artificial Intelligence risks and other topics relevant to SIAI’s interests. It also provides transparency so our donor community can directly evaluate …

Read More

Singularity Institute Featured in January Issue of GQ

If you haven’t picked up this month’s GQ magazine, do it soon. There is a feature on the Singularity Summit and the Singularity Institute. (I also hear there is a piece by Carl Zimmer on the Singularity in Playboy, but I haven’t picked that one up yet.) Seeing community names like Rick Schwall (an SIAI donor and supporter) in a national magazine sure is a trip. According to the National Magazine Awards, GQ’s circulation is somewhere between 500,000 and 1,000,000 and has risen in recent years.

Here is the Singularity portion (I removed the magazine cover due to copyright concerns and complaints from the comments section):

Really freaky, mmhmmm! Freaky like our ancestral past or Pandora freaky, I hope.

H/t to Gus K. for pointing out the article earlier this month.

Read More

Bob Mottram’s Objections to the 2010 Singularity Research Challenge, and a Response

Bob Mottram isn’t impressed by the Singularity Institute’s grant proposals for our $100,000 Singularity Research Challenge:

It’s kind of sad how SIAI seems to have become obsessed with “AI risks” and human extinction. Perhaps they always were from the beginning, but it’s just my perception of them that was at fault. There’s certainly a place for some group, existing independently from academia, who actively promote AI related R&D in a direction which has positive value to society and addresses problems which are highly relevant. This applies especially to the work which is less glamorous, more ambitious stuff which requires an expenditure of effort on a longer time scale than a typical PhD thesis or DARPA/X-prize contest. The list of grant proposals for the Singularity Research Challenge seems incredibly disappointing, and focused on spurious notions of risk which, in my opinion, would have no beneficial impact on AI even if it were to be funded in its entirety.

To clarify what is happening: what Bob Mottram considers “spurious notions of risk”, we consider “deadly serious notions …

Read More

Me on the Radio — KUSP in Santa Cruz

On Sunday, January 3rd, I did an interview on KUSP (Central Coast Public Radio) in Santa Cruz, California, a National Public Radio affiliate. I talked to Rick Kleffel for an hour about the Singularity, the Singularity Institute, what we do, anthropomorphism, Friendly AI, and the like. It was for his “Talk of the Bay” radio program. Here is the audio archive.

Read More

Support for 2010 Singularity Research Challenge

In the week and a half since the Singularity Research Challenge launched, we’ve received some nice support, including a post by Razib Khan at the Gene Expression blog and an explicit donation recommendation by Alan Dawrst, author of Utilitarian-Essays.com and a well-regarded figure in the online utilitarian community. Here is a post by Alan on Felicifia, the utilitarian community forum, that goes into why research charities like SIAI offer a very high return on philanthropic investment.

Don’t let it just be Alan and Razib: you too can make a blog post about the Singularity Research Challenge, right at this very moment!

Read More

10 Years of “Singularitarian Principles”: Analysis

Today is January 1st, 2010, the 10th anniversary of the online publication of “The Singularitarian Principles” by Eliezer Yudkowsky. This document is a handy set of common-sense advice for anyone who considers the possible creation of superintelligence a big deal in utilitarian terms (or otherwise). The work is divided into four “definitional principles”, which form the central definition of the term “Singularitarian” (as it was defined at the time), and “descriptive principles”, which “aren’t strictly necessary to the definition, but which form de facto parts of the Singularitarian meme.”

The definitional principles are:

1. Singularity
2. Activism
3. Ultratechnology
4. Globalism

The descriptive principles are:

1. Apotheosis
2. Solidarity
3. Intelligence
4. Independence
5. Nonsuppression

The “Singularity” principle refers to believing in “some fundamental change in the rules” in the future. Looking back on this from the vantage point of 2010, I think the term “Singularity” as defined here (“defined many different ways”) is far too vague …

Read More

2010 Singularity Research Challenge: Donate Now!

As I mentioned in my last post, the Singularity Institute (SIAI) has launched the 2010 Singularity Research Challenge to raise funds for Singularity research. Our organization is worth giving money to because the Singularity is a matter of life and death for our entire species, and we may have only a few decades left to deal with it. Our group is the most dedicated to maximizing the probability of a positive outcome, and has the intelligence and skill to produce detailed ideas and attract major media attention. We accomplish a great deal with the money we have. Nearly everyone at the Singularity Institute, including myself, takes a salary significantly lower than our market value given our education and experience, because we personally care about this issue a whole lot.

We have a network of several dozen young academics, mostly aged 20-30, who are devoted to performing research and writing papers on the topic of the Singularity, given the proper support and infrastructure. (For a snapshot of 2009's Visiting Fellows, along with names and bios, see this …

Read More