Here are some posts that a lot of folks liked, that spurred long discussions, or that I think are noteworthy.

Confirmed: Key Activities by “Anonymous” Masterminded by Small Groups of Decision-Makers
TIME Cover on Ray Kurzweil, Singularity Summit, Singularity Institute
Immortality Institute Mentioned in Newsweek
Converging Technologies Report Gives 2085 as Median Date for Human-Equivalent AI
Some Singularity, Superintelligence, and Friendly AI-Related Links
I’m Quoted on Friendly AI in the United Church Observer
Singularity Institute Covered by NPR’s All Things Considered
Josh Tenenbaum Video Again: Bayesian Models of Human Inductive Learning
IBM Cat Brain Nonsense in the Zeitgeist
Transhumanism is Still Winning
Comprehensive Nanorobotic Control of Human Morbidity and Aging
My Upcoming Talk in Texas: Anthropomorphism and Moral Realism in Advanced Artificial Intelligence
Yellowstone Caldera has Risen Three Inches Per Year for Last Three Years
Phil Bowermaster on the Singularity
Yes, The Singularity is the Biggest Threat to Humanity
What Would it Cost to Develop a Nanofactory?
Michael Nielsen: What Should a Reasonable Person Believe About the Singularity?
Introducing brainSCANr

New Singularity Institute Publications in 2010
Singularity Summit 2009 Featured in Carl Zimmer Article in Scientific American
Kurzweil Reveals an In-Depth Analysis of His Predictions for 2009 in Letter to IEEE Spectrum
The Whiskerwheel — Flexible Locomotion
WikiLeaks “Cyberwar” Nonsense from “Security Experts” Who Don’t Understand 4chan or Anonymous
Bill Gates Mentions the Risk of Superintelligence in the Wall Street Journal
Josh Tenenbaum: Bayesian Models of Human Inductive Learning
I Get Quoted in IEEE Spectrum Blog by the Former Editor-in-Chief of Scientific American
Chalmers to Discuss Singularity at Berkeley Tomorrow
“Liberal Eugenics” — An Awkward Term
500 Pictures of Singularity Summit 2010 Available
Superior Retinal Prosthesis Developed for Mice
Zyvex Labs: “Atomic Precision Fabrication Using Patterned Si Atomic Layer Epitaxy”
The More We Talk, the Less We Might Agree: Study Shows Discussion Can Hurt Consensus-Building on Science/Technology
More Debate on Superintelligent AI Goals
2013 Solar Maximum Resources
Alcor Receives $7 Million
Hank Hyena’s “Me, Me!” Transhumanism
John D. Furber’s Comprehensive Aging Graph
My Four-Layered Model of Human Nature
Doubt Thrown on Uncle Fester’s Botulism Recipe
Instructions for Mass Manufacture of Botulinum Toxin Freely Available Online
Michael Anissimov: “Don’t Fear the Singularity, but Be Careful: Friendly AI Design” at Foresight 2010 Conference
Concerning the Extraterrestrials and Their Absence
Welcome to 1850: the Risk of EMP Attack
Geomagnetic Solar Storms and EMP
Michael Anissimov Interviews Eliezer Yudkowsky on Friendly AI
Why Intelligent People Fail
Why Arguments Against Mind Uploading Don’t Work — Constant Neural Molecular Turnover
Amusing Ourselves to Death
Michael Anissimov Essays at the Lifeboat Foundation
The World the Singularity Creates Could Destroy All Value
Charles Lindbergh — Early Transhumanist
Survey: Hiding Risks Can Hurt Public Support for Nanotechnology
More Cat Brain Nonsense from IBM
A Christian Perspective on the Singularity Movement
Dangers of Molecular Nanotechnology, Again
A Proximity-Based Programmable DNA Nanoscale Assembly Line
What Does a Buckyball Undergoing Unimolecular Dissociation by Use of Extremely High Levels of Vibrational Excitation Look Like?
Weapon Energy Over Time
Humanity+ UK 2010: “Unprecedented Gathering in London”
Eliezer Yudkowsky and Massimo Pigliucci Debate the Singularity
Yes, We Can Do Better Than This…
New Paper: Optimal Tooltip Trajectories in a Hydrogen Abstraction Tool Recharge Reaction Sequence for Positionally Controlled Diamond Mechanosynthesis
Anti-Transhumanist Wesley J. Smith Argues that Yogi Claims of Living Without Food/Water are Evidence Against Scientific Materialism and in Favor of Human Exceptionalism
Transhumanism Has Already Won
Nine Reasons I’m Interested in Survivalism
Atmospheric CO2 Levels Over Geologic Time
My Talk at Foresight 2010: “Don’t Fear the Singularity, but Be Careful: Friendly AI Design”
The Power of Self-Replication
Valid Transhumanist Criticism
Diamond Trees (Tropostats): A Molecular Manufacturing Based System for Compositional Atmospheric Homeostasis
Michael Anissimov Interviews Juergen Schmidhuber
Ray Kurzweil Response to “Ray Kurzweil’s Failed 2009 Predictions”
7 of 108 of Ray Kurzweil’s 1996-1997 Predictions for 2009 Which Seem Incorrect

“Benefits of a Successful Singularity” Reaches the Front Page of Digg
Why I Care About Malcolm Gladwell’s Igon Values
Singularity 101 at
Why Structure Matters
The Singularity Meme Coalesces in NYC
Michael Anissimov Interviews Aubrey de Grey on the Singularity

2009 Chemistry Nobel Prize Awarded for Productive Nanosystems Research
This is Your Brain on Cryonics
Answering Popular Science’s 10 Questions on the Singularity
Nanodiamond is Now Buckymesh
Nanofactories Will Be Powerful and Cheap if They Work at All
The Chilly Frontier: Antarctica, Greenland, and Elsewhere
Charles Stross’ Singularity-Clueless, “New Scientist”, Yawn-tastic 21st Century Future
Nuclear Bomb Powered Cannons for Getting to Space
Submissions to Futurist Magazine Wide Open
First Live-Action Movie of Individual Carbon Atoms
Power Density Graph for Nanotechnology Products
Sorry, We’re All Dumb
Transhumanism: It’s Small
What is Meant by “Superintelligence”?
The Debate Between Advocates of Soft and Hard Nanotech
Why Singularity Advocacy Needn’t be Techno-Utopian
What are the Benefits of Mind Uploading?
Invasion of the Worm Robots
Some Books I Read in 2008

10 Futuristic Materials
Five Ways You Can Help Transhumanism Now
Five Futuristic Forms of Air Travel
Feasibility Arguments for Molecular Nanotechnology
A Challenger Appears!
Interview with Future Blogger
Transhumanist Blogs
Negative Utilitarianism
Seven Influential Transhumanists
Is Star Trek a Fascist Society?
Brain-Computer Interfaces for Manipulating Dreams
Boston Dynamics Big Dog
Top 10 Excuses for Dying
Cognitive Enhancement Strategies
Vatican Takes Official Anti-Transhumanist Stance
Response to Amor Mundi on Transhumanism
Nuclear Terrorism
High Cost of Force Protection
Annalee Newitz’s Vitriolic Anti-Transhumanism
Taking Global Risk and WWIII Seriously
Human Arrogance
Look to Inner, not Outer Space
The Religion of Science
Temperature Engineering
Stephen Omohundro’s “Basic AI Drives”
The Danger of Powerful Computing

Scratching my Head about Global Risks
Death Race
AGI from AI
Full-Body Haptic Suits
Yellowstone Caldera Rising
The Transhumanist Vision
Me at Singularity Summit 2007
Singularity Debate
Full Transcript of Minsky Interview at Transvision 2007
The Word “Singularity” Has Lost All Meaning
Cascio: the Nanofactory Ecosystem
Paris Hilton Signing Up for Cryonics
The Word “Transhumanist”
Al Gore on Molecular Nanotechnology
Bacterial Apocalypse?
What is Intelligence?
Transhumanism as Questioning Our Nature
A Thousand Chinese Einsteins Every Year
Superintelligence: the Very Idea
Transhumanists as a Coherent Group
Conservative Technological Projections for 2020
Classifying Human Extinction Risk
Transformative Technologies
Technological Transcension — You’d Better Believe It
Should Transhumanists Talk in Detail About the Futures They Want?
Criticisms of “Superlative Technology Discourse”
Forbes Looks at Transhumanism
Center for Responsible Nanotechnology Conference 2007
Liveblogging Singularity Summit 2007
You Can Only Focus on So Many Things
Humanity’s Potential is Greater than We Can Comprehend
Looking Human Extinction in the Face
The Other Side of the Immortality Coin
Is it Possible to get Non-Immortalists to Care About Existential Risks?
10 Reasons to Live as Long as Possible