Accelerating Future: Transhumanism, AI, nanotech, the Singularity, and extinction risk.


80 Missing Computers at Nuke Lab: Watchdog

From Physorg:

Eighty computers have been lost, stolen or gone "missing" at a major US nuclear weapons lab, the nonprofit watchdog group Project On Government Oversight (POGO) has said.

The group posted online a copy of what they say is an internal letter outlining what appear to be worrisome losses at Los Alamos National Laboratory in the state of New Mexico.

The letter says that 13 lab computers were lost or stolen during the past year, three of the machines taken from an employee's home in January. Another 67 computers are deemed "missing."

"The magnitude of exposure and risk to the laboratory is at best unclear as little data on these losses has been collected or pursued," the letter dated February 3 maintains.

The letter, addressed to Department of Energy security officials, contends that "cyber security issues were not engaged in a timely manner" because the computer losses were treated as a "property management issue."

What became of the missing computers and the "security ramifications of each of the 80 systems" was to be detailed in a written report to lab officials by February 6, according to the letter.

AFP telephone calls to the lab on Friday in search of comment were not returned.

Los Alamos was created as a secret facility during World War II and was the site for the Manhattan Project that gave birth to the first nuclear bombs.

It is a major center for research related to national security, outer space, renewable energy, medicine, nanotechnology, and supercomputing.

World leaders have started to get serious about nuclear risk in recent years, but current risks from synthetic biology and all-but-certain near-future (2015+) risks from nanotechnology and AI are pretty much ignored. When the new bio or nuclear 9/11 happens, I'll be able to look back and say that I was constantly sounding the alarm and proposing countermeasures. Will you?

Recently, in The Global Spiral, an online magazine that barely anyone reads, transhumanists responded to recent criticism of our philosophy. This was a good issue and I liked a lot of the articles. Immediately relevant, however, is Mark Walker's article, "Ship of Fools: Why Transhumanism is the Best Bet to Prevent the Extinction of Civilization". Walker is a member of the Scientific Advisory Board of the Lifeboat Foundation, the only organization on the planet devoted to advancing a comprehensive set of safeguards against extinction risks. I am the Lifeboat Foundation's Fundraising Director for the United States.

Filed under: nuclear, risks
Comments (13) Trackbacks (2)
  1. The idea that secrecy will prevent technology from being duplicated is laughable, in this day and age. At best, stolen data from nuclear labs just accelerates progress by nuclear engineers.

    Anti-proliferation NGOs shifted gears some time ago, from secrecy and control to voluntary relinquishment by responsible actors and policing efforts to control irresponsible ones, but many still seem to have the idea that only engineers from G8 national militaries can read upper-division physics textbooks or twenty-year-old magazine articles.

  2. The problem is that although the article may soundly reason that neither a status quo program nor more regulation is likely to work, transhumanism, which is now just so much handwaving, gets chosen by default, without any viable method of connecting the policy dots. This is fine for Mark, since his question is really whether there is any ethical justification that can be made for transhumanism on security grounds, but it doesn’t work as policy. We could just as easily say “therefore we should appeal to our alien overlords for help,” or “let us pray to Jesus for help,” since alien overlords or Jesus don’t exist and have as little prospect of existing when we would need them as brilliant and helpful superintelligences do.

  3. Tom D:

    “since alien overlords or Jesus don’t exist and have as little prospect of existing when we would need them as brilliant and helpful superintelligences do.”

    Your rhetoric is writing cheques that your evidence can’t cash.

    Or can it? Maybe you have an utterly convincing argument whose conclusion is that benevolent superintelligence is impossible?

  4. I would suggest that the burden of proof rests on those who would claim that benevolent superintelligences are the key to our existential crises. If bioweapons are a threat, they are a threat right now. If nuclear proliferation is a threat, it is a threat right now. In what way are imaginary superintelligences a solution right now, or even in ten years, or twenty? Even positing some kind of germline engineering to create enhanced humans–something that is neither feasible nor ethically permissible at this moment–you’re still looking at a twenty-year time window until such experiments become mature. What happens in the meantime?

  5. I would suggest that the burden of proof rests on whoever makes a statement!

    I’m not disagreeing with you, you could be right. But if you don’t tell me what your evidence for believing something is, how am I supposed to benefit?

  6. Superintelligence is necessarily imaginary before it happens… otherwise it would have already happened.

    I know it sounds tautological, but the same statement applies to any discontinuous or semi-discontinuous technology. A light bulb filament that can produce light without burning up was imaginary before Edison found one.

    It’s also possible to just agree to disagree on this issue… sometimes the debate gets nowhere. There are reasonable people on both sides. But, it seems to be in our interest to keep raising the issue.

    Singularitarians — "Raising the Issue" Since 2000™

  7. “It’s also possible to just agree to disagree on this issue… sometimes the debate gets nowhere.”

    – But I’d be much happier if people like Tom D presented their evidence for making statements like

    “since alien overlords or Jesus don’t exist and have as little prospect of existing when we would need them as brilliant and helpful superintelligences do.”

    I suspect that I know more about the subject than Tom, and that he doesn’t know anything I don’t already know, and that his statement is primarily an emotional reaction to the weirdness of our ideas. But there’s always the possibility that he has read something I haven’t, or has considered a good argument that I haven’t.

  8. Tom, there do exist arguments for superintelligence (see – I apologize for not giving more detailed links right now).

    I don’t believe anyone is saying that FAI is a short-term solution, and few expect it within 10-20 years. The longer term is the issue.

    It is possible to reason about and plan for things that do not yet exist.

    (Also, lesser point: if there were no arguments for superintelligence, it still wouldn’t be on a level with prayer or aliens – both are disconfirmed by observation, and the first is inconsistent with the way the world appears to fundamentally work.)

    Roko, Michael: To be fair, the burden of justification is on us. (And has been met, as far as I’m concerned.)

  9. Knowledge is not the problem. Knowledge plus fissionable material is the problem. Is it too much to ask the government agencies of the world to keep track of fissionable materials? Apparently so.

  10. "Roko, Michael: To be fair, the burden of justification is on us. (And has been met, as far as I'm concerned.)"

    – Nick, what do you think is the most convincing argument in favor of the eventual development of superintelligence, or our extinction?

    I would say that Nick Bostrom's "The Future of Humanity" and the accompanying FHI brain emulation roadmap are probably the most convincing.

  11. The VA sells veteran and military records around for grant money. Thus, US Officer Records are sold for money.
    Last year the US Supreme Court refused to hear my case because: I am a citizen of the USA and therefore have no access to US courts. My status as an Ambassador to a Foreign Nation for a Foreign Nation was not enough for them to hear the case. Nor was the fact that a lower court sent this case to them!

    Can we say – SLUSH FUND?

    So – 60 computers go missing.

    Now – Los Alamos employees give away secrets?


    Wake up you wimps in the FBI

    Wake up you self centered CIA folks

    Hello Blackwater – Hello?

    Time to stop the leaks or there may not be enough of this planet to even stand on.

    Two weeks ago the US 7th fleet gave Vietnam 16 Nukes.

    How about unifying your efforts and plug the leak?



    Honorable Grace
    Dr William B. Mount
    Knight of Malta
    Cpt (Ret) USA

  12. Hi, neat post. There's a problem with your web site in Internet Explorer; you might want to test this. IE is nonetheless still the market leader, and a good portion of other people will miss your fantastic writing due to this problem.

  13. It's amazing designed for me to have a web
    site, which is beneficial in favor of my know-how.
    thanks admin
