Accelerating Future: Transhumanism, AI, nanotech, the Singularity, and extinction risk.


Global Catastrophic Risk Research Page

From Seth Baum:

Global catastrophic risks (GCR) are risks of events that could significantly harm or even destroy human civilization at the global scale. GCR is related to the concept of existential risk, which is risk of events that would cause humanity to no longer exist. (Note that Nick Bostrom, who coined the term existential risk, defines it in a slightly different way.) Prominent GCRs include climate change, nuclear warfare, pandemics, and artificial general intelligence. Due to the breadth of the GCRs themselves and the issues that GCRs raise, the study of GCR is quite interdisciplinary.

According to a range of ethical views, including my views, reducing GCR should be our top priority as individuals and as a society. In short, if a global catastrophe occurs, then not much else matters, since so much of what we might care about (such as human wellbeing, the wellbeing of non-human animals, or the flourishing of ecosystems) would be largely or entirely wiped out by the catastrophe. The details about prioritizing GCR are a bit more complicated than this (and are part of ongoing research), but GCR does nonetheless remain a (or the) top priority from a range of views.

Seth Baum is one of the few academics working on existential risks. Last December I attended the Society for Risk Analysis annual meeting at his invitation and gave a talk on molecular nanotechnology risk. I also summarized what the Singularity Institute does.

Seth Baum has some attention and interest from a few leading figures in the risk analysis community, but he needs to get more momentum to have a larger impact. If you are an academic you should consider partnering with him. The UK has a nicely established existential risk research group in the form of the Future of Humanity Institute, but the US lacks one. We have SIAI and the Lifeboat Foundation, but SIAI is focused on AGI, and the Lifeboat Foundation doesn't have any research staff.

Filed under: risks 4 Comments

Michio Kaku on 2013 Solar Maximum: “It Would Paralyze the Planet Earth”

Maybe it's nothing at all! Maybe. Still, I have enough room in my thoughts to consider this, even if the probability is low. I don't think anyone has the expertise to say for sure one way or the other.

A real analysis would involve probability distributions over solar energy flux and expensive tests on electronic equipment.
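The probabilistic half of that analysis can at least be sketched, even without the expensive equipment tests. Here is a toy Monte Carlo: I assume a lognormal distribution over storm intensity and invent a damage threshold; every parameter below is a placeholder for illustration, not real heliophysics data.

```python
import math
import random

def simulate_storm_risk(n_trials=100_000, threshold=8.0, seed=42):
    """Estimate P(storm intensity exceeds a damage threshold) by sampling
    from an assumed lognormal intensity distribution. The parameters
    (mu, sigma, threshold) are illustrative placeholders, not fitted data."""
    rng = random.Random(seed)
    mu, sigma = 1.0, 0.8  # assumed lognormal parameters
    exceedances = sum(
        1 for _ in range(n_trials)
        if math.exp(rng.gauss(mu, sigma)) > threshold
    )
    return exceedances / n_trials

print(f"P(damaging storm per cycle) ~ {simulate_storm_risk():.4f}")
```

A real version would replace the made-up distribution with one fitted to historical flux measurements, and the threshold with tested equipment tolerances.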

This is a good test case for our reasoning on global risk probabilities -- are we quick to make unqualified judgments, or are we willing to spend the time to find the facts?

A commenter pointed out that scientists actually predict that this solar maximum will be the least intense since 1928, but this prediction offers little reassurance, because below-average solar maxima can still be extremely intense:

"If our prediction is correct, Solar Cycle 24 will have a peak sunspot number of 90, the lowest of any cycle since 1928 when Solar Cycle 16 peaked at 78," says panel chairman Doug Biesecker of the NOAA Space Weather Prediction Center.

It is tempting to describe such a cycle as "weak" or "mild," but that could give the wrong impression.

"Even a below-average cycle is capable of producing severe space weather," points out Biesecker. "The great geomagnetic storm of 1859, for instance, occurred during a solar cycle of about the same size we’re predicting for 2013."

Does this mean that every solar maximum (roughly every 11 years) poses a significant danger? If so, then that lowers my estimated probability of disaster significantly. The problem is that I've already switched my opinion back and forth based on the evidence, and I have no way of knowing whether this will continue.
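The back-and-forth can be grounded a little with a crude frequentist bound: if every solar maximum independently carried the same probability of a Carrington-scale disaster, the fact that none has occurred in the roughly 14 cycles since 1859 caps how large that probability can plausibly be. A minimal sketch, where the 5% likelihood cutoff and the cycle count are my illustrative assumptions, not a real hazard model:

```python
# If each solar maximum independently carried probability p of a
# Carrington-scale disaster, then 14 event-free cycles since 1859
# have likelihood (1 - p)**14. Find the largest p (in 0.1% steps)
# that keeps this likelihood at or above an arbitrary 5% cutoff.
CYCLES = 14
CUTOFF = 0.05

p = 0.0
while (1 - (p + 0.001)) ** CYCLES >= CUTOFF:
    p += 0.001

print(f"Rough upper bound on per-cycle probability: {p:.1%}")  # ~19%
```

This is only an upper bound from one data point of survival; it says nothing about how likely the event actually is, only that per-cycle probabilities much above ~20% would make our uneventful record surprising.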

Filed under: risks, videos 46 Comments

UK Government Chief Scientist: Solar Storms Could Lead to a “Global Katrina”, Costing Over $2 Trillion

From The Guardian:

The threat of solar storms that could wreak havoc on the world's electronic systems must be taken more seriously, the UK government's chief scientist has warned. A severe solar storm could damage satellites and power grids around the world, he said, leading to a "global Katrina" costing the world's economies as much as $2tn (£1.2tn).

"This issue of space weather has got to be taken seriously," said John Beddington, the UK government's chief scientific adviser, speaking at the annual meeting of the American Association for the Advancement of Science (AAAS) in Washington DC. "We've had a relatively quiet [period] in space weather and we can expect that quiet period to end. Over the same time, over that period, the potential vulnerability of our systems has increased dramatically. Whether it's the smart grid in our electricity systems or the ubiquitous use of GPS in just about everything these days."

Our electrical grid is completely vulnerable. None of the major transformers are contained in Faraday cages ready to be sealed off in the case of a coronal mass ejection. If these transformers short out, we are screwed. It could take years to replace them all, and by that time half the population could be dead of starvation, conflict, and disease. Without electricity, how can you pump the gas to get the trucks and mechanics to replace the transformers?

Security is the foundation of everything else. No electrical grid means no water or gas, no water or gas means no food, no food means people need to find food however they can, which means no security. No security means that repairing any given piece of machinery automatically becomes 10-100X harder. A limited security infrastructure could be bootstrapped from the military, but it's more likely that soldiers will defect and join their families, which will need to be protected on the local level.

On Valentine's Day there was an X-class solar flare, the most powerful in four years. The sun is ramping up to its next solar maximum, due to hit in 2013.

During the Carrington event in 1859, the solar storm was so powerful that intense auroras lit up the night sky over the Rocky Mountains, and gold miners awoke prematurely because they thought it was morning.

Filed under: risks 7 Comments

Geomagnetic Storm in Progress

In other, potentially civilization-saving news, NSF-affiliated scientists are rolling out the first system that may help predict coronal mass ejections and other solar storms one-to-four days in advance. Would power companies be foresightful enough to shut down the grid for a few days in the instance of a truly major solar storm? We can only hope so.

For research on the connection between solar storms and earthquakes, see here. The two have actually been linked, though I doubt earthquakes are what we should be worried about.

Filed under: risks 45 Comments

Happiness Set Point and Existential Risk

Talking to Phil, Stephen, and PJ on FastForward Radio last night, I made a point that I make often in person but I don't think I've ever said on my blog.

The point is a reaction to accusations of doomsaying. People say, "you're so negative, contemplating catastrophic scenarios and apocalypse!" My response is that rather than indicating pessimism or depression, this is actually evidence that I am a happy person. Because I have a high happiness set point, I can consider negative scenarios without suffering depression or even momentary sadness. I am immune to the reactive flinching that most people experience when they consider nuclear war or robots destroying all humans. Well, not entirely immune, but certainly more immune than most, and acclimation is part of it.

Because of my high happiness set point, there are greater volumes of idea space that I can comfortably navigate. Try it. Can you consider nuclear war in an entirely objective way, thinking about scientific facts and evidence, rather than fixating on the emotional human impact? For me and some of my friends, nuclear war can come up in casual conversation, without gloominess, simply because it's interesting to work through the probabilities involved. We can be sad and humanistic/emotional about it too, but we have the option to be analytical as well. Others don't have that choice. Having more choices is good in this situation.

People with an average or low happiness set point are unfortunately handicapped. They can't think about negative possibilities without feeling sad. Thus, that portion of the memetic state space is blocked off to them. Poor schmucks.

Ironically, their inability to rationally confront existential risks increases the probability that we will all experience a disaster. Unfortunate, because their actions will cause others to suffer.

A corollary of this effect is that when existential risks are brought up at all, it tends to be in a humorous context, because most people are too fragile to consider it in a non-humorous context.

Filed under: philosophy, risks 12 Comments

Yellowstone Caldera has Risen Three Inches Per Year for Last Three Years

I saw this at the Daily Mail, which everyone should know is a very unreliable source, but it's still a little concerning:

They said that the super-volcano underneath the Wyoming park has been rising at a record rate since 2004 - its floor has gone up three inches per year for the last three years alone, the fastest rate since records began in 1923.

But hampered by a lack of data they have stopped short of an all-out warning and they are unable to put a date on when the next disaster might take place.

When the eruption finally happens it will dwarf the effect of Iceland's Eyjafjallajökull volcano, which erupted in April last year, causing travel chaos around the world.

The University of Utah's Bob Smith, an expert in Yellowstone's volcanism told National Geographic: "It's an extraordinary uplift, because it covers such a large area and the rates are so high."

"At the beginning we were concerned it could be leading up to an eruption."

The prior probability of a catastrophic eruption is about 0.00014% per year. Alexsei remarked in the comments that the probability is actually higher, because it rises the more time elapses since the last eruption, and the eruptions have been fairly periodic for the last two million years. I want to do a more precise calculation, but suppose the elapsed time increased the probability by a factor of ten: that would be a 0.0014% chance per year, or about a 0.14% chance this century.
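The arithmetic in that estimate checks out; here it is as a quick sketch (the factor-of-ten adjustment is this post's hypothetical, not an established figure):

```python
# Back-of-the-envelope Yellowstone arithmetic from the post.
base_annual = 0.0000014              # 0.00014% prior probability per year
adjusted_annual = base_annual * 10   # hypothetical 10x for time elapsed

# Chance of at least one eruption in a century:
per_century = 1 - (1 - adjusted_annual) ** 100

print(f"Adjusted annual: {adjusted_annual:.6%}")  # ~0.0014%
print(f"Per century:     {per_century:.4%}")      # ~0.14%
```

Note that at probabilities this small, the compounded century figure is almost exactly 100 times the annual figure, so the simple multiplication in the text is a fine approximation.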

Filed under: risks 30 Comments

Global Catastrophic Risks in the Spotlight at Society for Risk Analysis Conference

Today I attended the global catastrophic risk sessions at the Society for Risk Analysis annual meeting in Salt Lake City, and was very pleased by the attendance at these two sessions. Two former Presidents of the society attended, and one, Jonathan Weiner, gave a compelling talk that reminded me very much of Eliezer Yudkowsky's "Cognitive biases potentially affecting judgment of global risks". Jonathan called for more attention to global catastrophic risks, including global financial crises, and pointed out specific biases that prevent people from giving due attention to these risks. The whole experience gave me the strong impression that the risk analysis mainstream is very much interested in global catastrophic risks. Congratulations to Seth Baum for spearheading this effort.

Robin Hanson gave a fascinating talk on refuge entry futures. Basically, the idea is that you could potentially judge the probability of catastrophic risks better than the status quo by seeing how many people would be willing to buy tickets to enter secure refuges in case of a disaster or some triggering event.
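As a toy illustration of how such ticket prices could encode probabilities (this is my own simplification for exposition, not Hanson's actual model): a risk-neutral buyer would pay roughly the probability of the triggering event times the value of the refuge slot in that event, so the market price lets you back out an implied probability.

```python
def implied_event_probability(conditional_price, payoff_value):
    """Toy model: if a refuge ticket valid only in event E trades at
    `conditional_price`, and a slot is worth `payoff_value` to the
    marginal buyer should E occur, a risk-neutral price implies
    P(E) ~= price / value. An illustrative simplification, not
    Hanson's actual mechanism design."""
    return conditional_price / payoff_value

# Hypothetical numbers: a $50 ticket usable only if a city is nuked,
# worth $10,000 to the holder in that event:
p = implied_event_probability(50, 10_000)
print(f"Implied probability: {p:.3%}")  # prints "Implied probability: 0.500%"
```

Real prices would also reflect risk aversion, discounting, and doubts about the refuge itself, so extracting clean probabilities is harder than this sketch suggests.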

My talk, which I gave yesterday at a session on nanotechnology risk assessment and perception (everyone besides me was focused primarily on nanoparticles), was titled "Public Scholarship and Global Catastrophic Risk". Nothing new to readers of this blog; the points are all relatively straightforward:

1) showed the catastrophic risks table from Bostrom (2008)
2) gave a few examples
3) global catastrophic risks (GCRs) outclass all other risks in terms of importance
4) books to read: Global Catastrophic Risks, Military Nanotechnology, The Singularity is Near
5) pointed out Bill Joy's influential 2000 article, "Why the future doesn't need us"
6) said that we focus on GRAIN: genetics, robotics, AI, nanotechnology
7) groups working on GCRs: Singularity Institute, Future of Humanity Institute, Presidential Commission for the Study of Bioethical Issues (synthetic biology risk), SENS Foundation (aging is considered a GCR according to Bostrom), Center for Responsible Nanotechnology (CRN), Lifeboat Foundation
8) quick summary of CRN work; pointed out that more than half of laypeople associate nanotech with Drexlerian nanotech (Ann Bostrom gave evidence of this, from a mall intercept study, in a talk that came before mine)
9) tried to make it clear to the audience that most risk analysts in nanotechnology today have failed to focus on the important risks; if it weren't for CRN, there wouldn't even have been a scientific rebuttal of grey goo
10) most risk analysts probably aren't even clear on why grey goo is implausible; they just dismiss it out of hand, without good reasons or understanding
11) public scholarship: bringing academic work to the public
12) summarized Singularity Institute activities to raise awareness of GCRs: Visiting Fellows Program, Singularity Summit, workshops, blogs, papers, and contributions to edited volumes
13) showed pics of Visiting Fellows Program and Singularity Summit 2010
14) showed San Jose Mercury article on Thiel's Audacious Optimism dinner to illustrate the enthusiasm of some philanthropists for this area
15) summarized our media exposure since I became media director: a lot, including GQ, New York Times, Popular Mechanics, Popular Science, Playboy (Carl Zimmer), etc.
16) interdisciplinary effort: biology, decision theory, computer science, risk analysis, physics, philosophy, nanotechnology
17) suggested some websites to visit, and the like
18) wrapped it up.

The meeting was productive enough that I'll likely attend next year. Thanks to everyone I met for their stimulating conversations.

Filed under: events, risks No Comments

2013 Solar Maximum Resources

2008 report from US National Academies of Sciences' Space Studies Board:

Severe Space Weather Events -- Understanding Societal and Economic Impacts: A Workshop Report

NASA Science News, June 4, 2010, "As the Sun Awakens, NASA Keeps a Wary Eye on Space Weather"

Richard Fisher, head of NASA's Heliophysics Division, explains what it's all about:

"The sun is waking up from a deep slumber, and in the next few years we expect to see much higher levels of solar activity. At the same time, our technological society has developed an unprecedented sensitivity to solar storms. The intersection of these two issues is what we're getting together to discuss."

The National Academy of Sciences framed the problem two years ago in a landmark report entitled "Severe Space Weather Events—Societal and Economic Impacts." It noted how people of the 21st century rely on high-tech systems for the basics of daily life. Smart power grids, GPS navigation, air travel, financial services and emergency radio communications can all be knocked out by intense solar activity. A century-class solar storm, the Academy warned, could cause twenty times more economic damage than Hurricane Katrina.

(Hurricane Katrina caused $81 billion in damage, so 20 times that would be $1.6 trillion in damage.)

Media articles

It is difficult to come to any conclusion because the experts disagree.

Discovery: Is a Devastating Solar Flare Coming to a City Near You? -- pessimistic analysis from an astrophysicist:

In the case of space weather, wouldn't it be great if, as a civilization, we could look at the sun and get advanced notice of a solar eruption? All we'd need is a few hours lead-time and we could reduce the output of national power grids (to avoid overload) and switch our satellites into "safe mode." Once the storm has passed, we'd continue our lives as normal. Disaster averted.

Unfortunately, it often takes a disaster to teach us to prepare better in the future. I just hope the next solar maximum doesn't teach us a lesson we can't recover from.

A lot of drama and ink spilled in this post, but ultimately this astrophysicist sounds just as uncertain and confused as your average dude.

Counterpoint to the above: Expert rubbishes solar storm claims:

One report quotes an Australian astronomer as saying "the storm is likely to come sooner rather than later".

But Dr Phil Wilkinson, the assistant director of the Bureau of Meteorology's Ionospheric Prediction Service, says claims that this coming solar maximum will be the most violent in 100 years are not factual.

"All this talk about gloom and doom has selling power, but I'm certain it's overstated," he said.

"[It's] going far beyond what's realistic and could be worrying or concerning for people who don't really understand the underlying science behind it all.

"The real message should be that the coming solar maximum period could be equally as hazardous as any other solar maximum."

Unfortunately, there's not a whole lot of explanation as to why this expert believes there isn't a major risk, or what exactly he means by "regional".

Space Weather Enterprise Forum has been meeting for four years to discuss the solar maximum risk.

The Register: NASA: Civilization will end in 2013 (possibly)

Searching Google Scholar for "solar maximum" and "power grid" reveals 13 results. Either I'm using the wrong search terms, or only a small minority of scientists are qualified to write on this topic and barely any do; either way, it's disappointing given the scale of the risk. Here's one article from Science:

Are We Ready for the Next Solar Maximum? No Way, Say Scientists
Richard A. Kerr

If the once-in-500-years "solar superstorm" that crippled telegraph systems for a day or two across the United States and Europe in 1859 but otherwise was mainly remembered for its dramatic light show were to happen today, the charged-particle radiation and electromagnetic fury would fry satellites, degrade GPS navigation, disrupt radio communications, and trigger continent-wide blackouts lasting weeks or longer. Even a storm of the century would wreak havoc. That's why space physicists are so anxious to forecast space weather storms accurately. If predicting a hurricane a few days ahead can help people prepare for a terrestrial storm's onslaught, they reason, predicting solar storms should help operators of susceptible systems prepare for an electromagnetic storm. And space weather forecasters' next challenge is coming up soon. The next peak in the 11-year sunspot cycle of solar activity looms in 2012 or 2013. A space weather symposium last month asked, "Are we ready for Solar Max?" The unanimous answer from participants was "No."

So far, the balance of scientific opinion seems to be on the side of very serious concern.

My response to the above abstract is mostly -- in this context, who cares about fried satellites, degraded GPS navigation, and disrupted radio communications in comparison to week-long blackouts? Translated into risks affecting personal health, in my mind that line would read something like this: "the risk could cause a broken toenail, difficulty hearing, an itchy back, and cancer".

Filed under: risks, science 40 Comments

Contrasting Views on the Stability of the US Power Grid

"Why it's hard to crash the electric grid" from Eurekalert:

Last March, the U.S. Congress heard testimony about a scientific study in the journal Safety Science. A military analyst worried that the paper presented a model of how an attack on a small, unimportant part of the U.S. power grid might, like dominoes, bring the whole grid down.

Members of Congress were, of course, concerned. Then, a similar paper came out in the journal Nature the next month that presented a model of how a cascade of failing interconnected networks led to a blackout that covered Italy in 2003.

These two papers are part of a growing reliance on a particular kind of mathematical model -- a so-called topological model -- for understanding complex systems, including the power grid.

And this has University of Vermont power-system expert Paul Hines concerned.

"Some modelers have gotten so fascinated with these abstract networks that they've ignored the physics of how things actually work -- like electricity infrastructure," Hines says, "and this can lead you grossly astray."

For example, the Safety Science paper came to the "highly counter-intuitive conclusion," Hines says, that the smallest, lowest-flow parts of the electrical system -- say a minor substation in a neighborhood -- were likely to be the most effective spots for a targeted attack to bring down the U.S. grid.

"That's a bunch of hooey," says Seth Blumsack, Hines's colleague at Penn State.

Related news: U.S. electricity blackouts skyrocketing from CNN, "US power grid easy prey for hackers" from Homeland Security Newswire, reporting on a story in MIT's Technology Review.

Filed under: risks No Comments

Society for Risk Analysis Annual Meeting Presentation

This is just a reminder that I will be presenting at the Society for Risk Analysis annual meeting in Salt Lake City on December 5-8. The meeting is open to anyone interested in risk analysis. Registration is $500. Robin Hanson and Seth Baum will be there as well. My presentation will be part of the "Assessment, Communication and Perception of Nanotechnology" track. The full session list is here. Seth will be chairing the "Methodologies for Global Catastrophic Risk Assessment" track, where Robin will be giving his talk.

Here's my abstract:

T3-F.4 14:30 Public Scholarship For Global Catastrophic Risks. Anissimov M*; Singularity Institute

Abstract: Global catastrophic risks (GCRs) are risks that threaten civilization on a global scale, including nuclear war, ecological collapse, pandemics, and poorly understood risks from emerging technologies such as nanotechnology and artificial intelligence. Public perception of GCRs is important because these risks and responses to them are often driven by public activities or by the public policies of democracies. However, much of the public perception is based on science fiction books and films, which unfortunately often lack scientific accuracy. This presentation describes an effort to improve public perceptions of GCR through public scholarship. Public scholarship is the process of bringing academic and other scholarship into the public sphere, often to inform democratic processes. The effort described here works on all GCRs and focuses on emerging technologies such as biotechnology and nanotechnology. The effort involves innovating use of blogs, social networking sites, and other new media platforms. This effort has already resulted in, among other things, a visible online community of thousands following the science around GCRs, and plans to further move discussion of scholarly GCR literature into the mainstream media. It is believed that public scholarship efforts like these can play important roles in societal responses to GCRs.

Here's Professor Hanson's abstract:

W3-A.3 14:10 Catastrophic Risk Forecasts From Refuge Entry Futures. Hanson RD*; George Mason University

Abstract: Speculative markets have demonstrated powerful abilities to forecast future events, which has inspired a new field of prediction markets to explore such possibilities. Can such power be harnessed to forecast global catastrophic risk? One problem is that such mechanisms offered weaker incentives to forecast distant future events, yet we want forecasts about distant future catastrophes. But this is a generic problem with all ways to forecast the distant future; it is not specific to this mechanism. Bets also have a problem forecasting the end of the world, as no one is left afterward to collect on bets. So to let speculators advise us about world's end, we might have them trade an asset available now that remains valuable as close as possible to an end. Imagine a refuge with a good chance of surviving a wide range of disasters. It might be hidden deep in a mine, stocked with years of food and power, and continuously populated with thirty experts and thirty amateurs. Locked down against pandemics, it is opened every month for supplies and new residents. A refuge ticket gives you the right to use an amateur refuge slot for a given time period. To exercise a ticket, you show up at its entrance at the assigned time. Refuge tickets could be auctioned years in advance, broken into conditional parts, and traded in subsidized markets. For example, one might buy a refuge ticket valid on a certain date only in the event that USA and Russia had just broken off diplomatic relations, or in the event a city somewhere is nuked. The price of such resort tickets would rise with the chance of such events. By trading such tickets conditional on a policy that might mitigate a crisis, such as a treaty, prices could reflect conditional chances of such events.

Filed under: events, risks 7 Comments

Doubt Thrown on Uncle Fester’s Botulism Recipe

In the comments, Martin said:

I wonder how accurate it is. Uncle Fester became underground famous in the 90s when he published books on meth and acid manufacture, but other clandestine chemists criticized his syntheses for being inaccurate.

From this small snippet, it sounds like he wants you to go out and find the right Clostridium species and strains in soil and culture them yourself, which sounds as impractical as his suggestion in the acid book to grow acres of ergot-infested rye. :)

Any more comments on why this is impractical? It sounds much simpler than growing acres of ergot-infested rye. He describes how he would isolate spores, first by heating the culture (this kills anything that is not a spore), then encouraging growth in an anoxic environment (kills anything that is not anaerobic). This leaves only anaerobic bacteria derived from spores.

The book does claim that botulinum germs are "fussy about what they like to grow in, its pH, and its temperature" and that "This need to exclude air from the environment where the germs are growing is the most difficult engineering challenge to the aspiring cultivator of Clostridia botulinum", so he's not saying that it's a cakewalk.

Of course, many of these underground books (Anarchist Cookbook...) are rife with misinformation. Anyone serious about producing botulism toxin would need actual biochemical knowledge and multiple corroborating sources. Still, there's a lot of information in this particular book that would at least provide a compelling starting point.

It's worth noting that Uncle Fester probably never synthesized all the compounds described in his book, which includes over half a dozen different types of nerve gas. He repeatedly points out that synthesizing these chemicals is a risk to the life of the person performing the synthesis. In some parts of the book, he names sources, like literature released by the military, but the vast majority of his book lacks citations.

Filed under: biology, risks 40 Comments

Instructions for Mass Manufacture of Botulinum Toxin Freely Available Online

Properly delivered from a plane, a few grams of botulinum toxin could kill hundreds of thousands, if not more, in a major city.

Silent Death by "Uncle Fester" has the full process instructions, including details on optimal delivery.

The LD-50 of botulinum injected into chimpanzees is 50 nanograms.

Combine it with effective microbots, and you have a situation where anyone can kill anyone without accountability.

One of the reasons I want a Friendly AI "god" (really more like a machine) to watch over me is that the dangers will simply multiply beyond human capability to manage.

Here's a bit of an excerpt from my version of Silent Death:

Botulin is the second most powerful poison known, taking the runner up position to a poison made by an exotic strain of South Pacific coral bacteria. The fatal dose of pure botulin is in the neighborhood of 1 microgram, so there are 1 million fatal doses in a gram of pure botulin.

The bacteria that makes botulin, Clostridia botulinum, is found all over the world. A randomly chosen soil sample is likely to contain quite a few spores of this bacteria. Spores are like seeds for bacteria, and can withstand very harsh treatment. This property will come in very handy in any attempt to grow botulism germs, because other germs can be wiped out by heating in hot water, leaving the spores to germinate and take over once they cool down. Much more on this later.

Another very important property of botulism germs is that they can't survive exposure to air. The oxygen in it kills them, but does not kill their spores. Whatever toxin the germs made before their demise also survives. This need to exclude air from the environment where the germs are growing is the most difficult engineering challenge to the aspiring cultivator of Clostridia botulinum.

Finally, all botulism germs are not created equal. There are subgroups within the species that make toxins that vary immensely in their potency. They are called types: A, B, C, D, E, F and 84. Type A is by far the most deadly, followed by type B and 84. The other ones we won't even bother to discuss. Also within a single type, there are individual differences in how much toxin a given strain will produce. Breeding and gene manipulation have a lot to do with this, and our government (and the Russkies as well) have put a lot of effort into picking out strains that make an inordinate amount of toxin. The champion as of about 30 years ago was the Hall strain, but I'm sure that they've come up with something better since then. The Hall strain of type A was able to make 300 human fatal doses of botulin per ml of broth it grew in.

Here we will explore the two major levels of use for botulin as an attack weapon: the individual or small group assassination, and the large scale assault with the poison in a manner similar to nerve gas.

Very informative! As a Russian, I love the "Russkies" anachronism.

99.9% of the population will dismiss the above as not a big deal, due to wishful thinking. It's all just words on the page, until people start dying.

Filed under: risks 77 Comments