WSJ: Gains in Bioscience Cause Terror Fears

From The Wall Street Journal:

Rapid advances in bioscience are raising alarms among terrorism experts that amateur scientists will soon be able to gin up deadly pathogens for nefarious uses.

Fears of bioterror have been on the rise since the Sept. 11, 2001, attacks, stoking tens of billions of dollars of government spending on defenses, and the White House and Congress continue to push for new measures.

But the fear of a mass-casualty terrorist attack using bioweapons has always been tempered by a single fact: Of the scores of plots uncovered during the past decade, none have featured biological weapons. Indeed, many experts doubt terrorists even have the technical capability to acquire and weaponize deadly bugs.

The new fear, though, is that scientific advances that enable amateur scientists to carry out once-exotic experiments, such as DNA cloning, could be put to criminal use. Many well-known figures are sounding the alarm over the revolution in biological science, which amounts to a proliferation of know-how—if not the actual pathogens.

Another bit later in the article:

All the government attention comes …

Read More

Geomagnetic Solar Storms and EMP

I wish to qualify my statement in the previous post where I wrote, "I currently think that EMP attack is the second greatest risk we face, right behind a genetically engineered superplague."

What I should really say is that I think that any electromagnetic event that wreaks havoc on electronics is the second greatest risk, and that includes geomagnetic storms as well as EMP. I don’t want the particularly vivid risk of EMP attack to distract attention from the fundamental point that the most critical nodes in our power grids simply need to be better protected.

EMP attack is controversial. The experts are divided. Scientists do agree, however, that a solar maximum is on the way for 2013, and it could rival the Carrington Event of 1859 in its intensity.

The Space Review has an article that argues that EMP attack is unlikely while geomagnetic storms are the real threat.

Read More

Welcome to 1850: The Risk of EMP Attack

I am concerned about the PR aspects of EMP risk communication over the last couple of years. Awareness of the EMP risk has spread much faster among the right than among any other portion of the political spectrum, and this is already making it unfashionable among the educated left.

Given the year (2010), I currently think that EMP attack is the second greatest risk we face, right behind a genetically engineered superplague. A small EMP-optimized nuke launched from a container ship in the Gulf of Mexico could take out the power grid of the entire continental United States. The same thing could be done anywhere else, such as Europe or Japan.

The facts are available from the Commission to Assess the Threat to the United States from Electromagnetic Pulse (EMP) Attack. No one cares except the Fox News crowd. It wasn’t like this only a few years ago: EMP attack was primarily a topic limited to analysts and sci-fi TV show writers. Obama seems concerned about nukes in general (which presumably includes the EMP risk that emanates from them), …

Read More

Patrick Lin in London Times: “The Reality of Robocops”

Patrick Lin is spreading the valuable message of roboethics:

They have everything the modern policeman could need – apart from a code of ethics. Without that, a Pentagon adviser fears, the world could be entering an era where automatons pose a serious threat to humanity.

The robots need to be hack-proof to prevent perpetrators from turning them into criminals, and a code of ethical conduct must be agreed while the technology is nascent.

The article mentions that there are currently over 7 million robots in operation, about half of them cleaning floors.

Read More

Reducing Long-Term Catastrophic Artificial Intelligence Risk

Check out this new essay from the Singularity Institute: “Reducing long-term catastrophic AI risk”. Here’s the intro:

In 1965, the eminent statistician I. J. Good proposed that artificial intelligence beyond some threshold level would snowball, creating a cascade of self-improvements: AIs would be smart enough to make themselves smarter, and, having made themselves smarter, would spot still further opportunities for improvement, leaving human abilities far behind. Good called this process an “intelligence explosion,” while later authors have used the terms “technological singularity” or simply “the Singularity”.

The Singularity Institute aims to reduce the risk of a catastrophe, should such an event eventually occur. Our activities include research, education, and conferences. In this document, we provide a whirlwind introduction to the case for taking AI risks seriously, and suggest some strategies to reduce those risks.
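
To make Good's feedback loop concrete, here is a toy back-of-the-envelope model, a minimal sketch in Python; the growth parameters are invented purely for illustration and are not predictions:

    # Toy model of Good's "intelligence explosion": each cycle the AI redesigns
    # itself, and the size of the improvement it can find scales with its
    # current capability. All numbers are illustrative assumptions.

    def intelligence_explosion(initial=1.0, gain_per_unit=0.1, cycles=20):
        """Capability after each self-improvement cycle, where
        capability_next = capability * (1 + gain_per_unit * capability),
        i.e. smarter systems find proportionally larger improvements."""
        capability = initial
        history = [capability]
        for _ in range(cycles):
            capability *= 1 + gain_per_unit * capability
            history.append(capability)
        return history

    if __name__ == "__main__":
        for cycle, c in enumerate(intelligence_explosion()):
            print(f"cycle {cycle:2d}: capability = {c:12.4g} x starting level")

The point is not the particular numbers but the shape of the curve: growth crawls along for many cycles and then runs away abruptly, which is why the excerpt talks about a threshold level.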

Pay attention and do something now, or be eliminated by human-indifferent AGI later. Why is human-indifferent AGI plausible or even likely within the next few decades? Because 1) what we consider “normal” or “common sense” morality is actually extremely complex, …

Read More

Hungry Cannibals and Soft Apocalypses

Robin Hanson recently posted about The Road and cannibals, which is great, because I think about this stuff all the time, and it’s good not to be alone.

The Road is a movie/book about a man and his son traveling south to reach the coast of the Gulf of Mexico in a post-apocalyptic world where the Sun is blocked out by huge dust clouds, and there are no plants or other life left except for a few refugees and murderous cannibals. I thought the book was OK because it gave a sneak preview of what daily life could be like when or if the United States gets hit by a massive EMP attack. (The human conflict and desperate lack of food part, not the blocking out the Sun part.)

Prof. Hanson remarks that some reviewers called the movie "realistic", when it absolutely is not. The story takes place more than seven years after the apocalypse, but there are a couple of occasions where the characters stumble on stored food supplies, which doesn't make sense to Hanson. Second, he points …

Read More

Dangers of Molecular Nanotechnology, Again

Over at IEET, Jamais Cascio and Mike Treder essentially argue that the future will be slow/boring, or rather, will seem slow and boring because people will get used to advances as quickly as they occur. I heartily disagree. There are at least three probable events that could make the future seem traumatic, broken, out-of-control, and not slow by anyone's standards: 1) a Third World War or an atmospheric EMP detonation event, 2) an MNT revolution with accompanying arms races, and 3) superintelligence. In response to Jamais' post, I commented:

I disagree. I don’t think that Jamais understands how abrupt an MNT revolution could be once the first nanofactory is built, or how abrupt a hard takeoff could be once a human-equivalent artificial intelligence is created.

Read Nanosystems, then “Design of a Primitive Nanofactory”, and look where nanotechnology is today.

For AI, you can do simple math that shows once an AI can earn enough money to pay for its own upkeep and then some, it would quickly gain …
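
The back-of-the-envelope version of that math is just compound growth: an AI that earns more than its hosting costs and spends the surplus on additional hardware grows geometrically. A minimal sketch in Python, with every dollar figure and the target chosen only for illustration:

    # Compound growth of an AI that reinvests its surplus income in more
    # hardware. Every figure below is an assumption chosen for illustration.

    def months_to_scale(instances=1.0, income_per_instance=3000.0,
                        upkeep_per_instance=1000.0, hardware_cost_per_instance=4000.0,
                        target_instances=1_000_000):
        """Months until the AI can run `target_instances` copies of itself,
        assuming each month's surplus is spent entirely on new hardware."""
        months = 0
        while instances < target_instances:
            surplus = instances * (income_per_instance - upkeep_per_instance)
            instances += surplus / hardware_cost_per_instance  # buy more copies
            months += 1
        return months

    if __name__ == "__main__":
        # With these numbers the instance count grows 1.5x per month and
        # passes a million copies in roughly three years.
        print(months_to_scale(), "months to reach a million instances")

Change the assumptions however you like; as long as income exceeds upkeep and the surplus is reinvested, the growth stays exponential.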

Read More

Dispelling Stupid Myths About Nuclear War

In response to discussion in the comments section on my recent post on nuclear war, Dave said:

Really, I mean, honestly, no one is surviving a nuclear war.

This is absolute nonsense. To quote the very first paragraph of Nuclear War Survival Skills, a civil defense manual based on in-depth research at the Oak Ridge National Laboratory:

An all-out nuclear war between Russia and the United States would be the worst catastrophe in history, a tragedy so huge it is difficult to comprehend. Even so, it would be far from the end of human life on earth. The dangers from nuclear weapons have been distorted and exaggerated, for varied reasons. These exaggerations have become demoralizing myths, believed by millions of Americans.

Here’s another good quote:

Only a very small fraction of Hiroshima and Nagasaki citizens who survived radiation doses, some of which were nearly fatal, have suffered serious delayed effects. The reader should realize that to do essential work after a massive nuclear attack, many survivors must be willing to receive much larger radiation doses than …

Read More

Interviews with Academics in Robot Ethics

Over at the Moral Machines blog, Colin Allen lists three recent interviews by Gerhard Dabringer on the topic of robot ethics. One of the interviews is with Jurgen Altmann, whom I admire greatly for his academic work on preventive arms control. His Military Nanotechnology is my favorite book on molecular nanotechnology policy, and I hope that its recommendations will be adopted. A small preview is online, but you'll have to shell out $128 if you want a hard copy. Anyway, here are the interviews:

George Bekey: Professor Emeritus of Computer Science, Electrical Engineering and Biomedical Engineering at the University of Southern California and Adjunct Professor of Biomedical Engineering and Special Consultant to the Dean of the College of Engineering at the California Polytechnic State University. He is well known for his book Autonomous Robots (2005) and is co-author of the study "Autonomous Military Robotics: Risk, Ethics and Design" (2008).

Jurgen Altmann: University of Dortmund, a founding member of the International Committee for Robot Arms Control. Since 2003 he is …

Read More

Risk From Engineered Microorganisms, Strategies for Evolutionary Dominance

From yesterday's list of links, I particularly want to call attention to the rotifer link. This press release is interesting because it shows how animals can survive even when they are exact genetic copies of one another. Instead of outcompeting parasites through mutation, they run away from them by going into cryptobiosis. I predict that a form of asexual multicellular synthetic life will be created by 2030 that can actively defend against parasites, say with silica spines, so that running away isn't even necessary. These organisms will just sit around and reproduce. The primary method to get rid of them at first will be desiccation, but this will eventually prove useless as they disperse too widely to target.

What many humans don’t realize is that we are surrounded by quintillions of organisms with very little genetic diversity that dominate us in terms of biomass and persistence. They are the status quo — we are the aberration. These are organisms that have survived every mass extinction. Culprits include the tardigrades (which can survive outer space),

Read More