Thanks to Phil Bowermaster of The Speculist for filming this.
Technology can be used to slice through certain social and humanitarian problems like a hot knife through butter. Read about oral rehydration salts. This cheap solution eliminates the life-threatening dehydrating effects of diarrhea when water alone isn’t enough. You can talk about corruption in African governments all day long, but when humanitarian agencies actually deliver this physical substance to people suffering from diarrhea, it is life-saving.
Oral rehydration salts are being deployed now. What about the near future? Everyone in Africa needs a self-replication-capable fabber based on something like the RepRap project. This could be done in five years if humanitarian organizations like the Bill and Melinda Gates Foundation put a tiny portion of their budgets towards realizing it. But I’m not seeing it. Self-replicating factories are the path to dirt-cheap products for everyone, in developing and developed countries alike. Leaders lack the vision to push towards solutions that automagnify even after we stop writing the checks.
In the longer-term future, we need to think about reprogramming …
Lots of transhumanism coverage in the news lately.
Immediately on my mind is the New Scientist piece, “Death special: the plan for eternal life” and its accompanying video.
Reading the article made me feel a little squirmy. I don’t think the author had negative intentions at all: she just reported what she saw in the limited space she had. But the fact that articles like this get written suggests that transhumanism is doing something wrong.
The report was included as part of a special series on the topic of death (hence the label “Death special”). Of course, this is because transhumanists are encouraging an engineering approach to combating human aging. But I often worry that this side of transhumanism is somewhat overemphasized. Again, this is no fault of the author’s; I’m just looking at transhumanism in the mirror and commenting on what I see.
Although life extension is a big part of the transhumanist vision, there are other very important technologies, futuristic as well as contemporary, that I think deserve similar if not greater timeshare. Technologies mentioned …
I often enjoy the news items from PhysOrg, a top-notch science news site. Here are some from just today:
Scientists found a clam that lived for 400 years. A non-sentient clam gets to live 400 years and we humans typically drop dead at around 80? Doesn’t seem very fair.
A Japanese institute is taking robotics to the next level, creating a system that learns through gestures rather than just executing pre-programmed routines.
UC San Diego scientists found that the T4 virus contains a molecular motor with twice the power density of an automobile engine. My thought is, “that’s it?” Scaling laws should enable molecular motors with thousands or even millions of times the power density of an auto engine.
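That scaling intuition can be made concrete with a toy back-of-the-envelope estimate. The assumption here (mine, not a measured result) is that delivered power scales with cross-sectional area (L²) while mass scales with volume (L³), so power density goes as 1/L; the characteristic sizes below are illustrative round numbers.

```python
# Toy scaling estimate: if power scales as L^2 (cross-sectional area)
# and mass as L^3 (volume), then power density scales as 1/L.
# The characteristic lengths below are illustrative assumptions.

engine_scale_m = 0.1    # automobile engine, ~10 cm characteristic size
motor_scale_m = 50e-9   # molecular motor, ~50 nm characteristic size

# Under 1/L scaling, the power-density ratio is the inverse ratio of sizes.
ratio = engine_scale_m / motor_scale_m
print(f"naive scaling predicts ~{ratio:.0e}x the power density")
```

Even this crude estimate lands in the millions-of-times range, which is why a mere factor of two for the T4 motor looks so modest.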
Lots of people are keen to modify their appearance surgically: 48% of women and 23% of men surveyed were interested. As the procedures fall in cost and improve in elegance and utility, more people will sign up for cosmetic surgery. Of course, I take this to mean that many people will embrace …
For the last couple weeks, I have been debating Berkeley professor of rhetoric Dale Carrico more or less non-stop. This morning, responding to his criticisms of the Singularity Institute (SIAI), I wrote a summary of reasons to support the organization:
SIAI works towards Friendly seed AGI (through whatever means works, something other than mathematical-deductive methods if necessary) because the people in the organization see it as a high moral priority. This is humanity’s first experience of stepping beyond the Gaussian curve of the ordinary human intelligence distribution. If it is not facilitated by AGI, it will be by enhancing humans: whether through psychopharmacology, neuroengineering, brain-computer interfaces, gene therapy, etc.
The question is not “if” intelligence enhancement technologies will be available, but “when”. When they are, it will become possible to “construct intelligence” actively rather than be limited to human generational cycles, birth-rates, and education. Now there’s nothing at all wrong with these conventional human patterns, but we have to note that the introduction of enhancement technology is bound to throw the existing order out of whack.
Accelerating Future friend Jeriaska has recently transcribed Eliezer Yudkowsky’s talk from the Singularity Summit 2007. Yudkowsky is a highly respected figure in the transhumanist community whose intense dissection and analysis of futurist issues is second to none. Thank you Jeriaska for transcribing this!
Judging by the way some people casually use the word “Singularity” in the comments section of this blog, I think you could really use this. Read this talk and you’ll see the “Singularity” for what it really is — three entirely separate but terribly conflated ideas. Cory Doctorow, for one, is guilty of conflating these ideas to the point where they were nothing more than a tepid paste by the time he was through with them.
Lots of other fantastic transcripts are up at the People Database Blog, and many more will be going up in coming weeks. Subscribe for the latest info, mofo!
Full name: Convention on the Prohibition of the Development, Production, Stockpiling and Use of Chemical Weapons and on their Destruction.
Short name: Chemical Weapons Convention (CWC)
Open for signature: January 13, 1993
Entered into force: April 29, 1997
Member states: 182
Map of member states:
(States in light blue are full participants but still have stockpiles in various stages of disposal.)
Notable non-signatories include Angola, North Korea, Egypt, Iraq, Lebanon, Somalia, and Syria.
Article I. General Obligations
Article II. Definitions and Criteria
Article III. Declarations
Article IV. Chemical Weapons
Article V. Chemical Weapons Production Facilities
Article VI. Activities Not Prohibited under this Convention
Article VII. National Implementation Measures
Article VIII. The Organization
Article IX. Consultations, Cooperation and Fact-Finding
Article X. Assistance and Protection against Chemical Weapons
Article XI. Economic and Technological Development
Article XII. Measures to Redress a Situation and to Ensure Compliance
Article XIII. Relation to Other International Agreements
Article XIV. Settlement of Disputes
Article XV. Amendments
Article XVI. Duration and Withdrawal
Article XVII. Status of the …
Richard Jones, a professor of physics and the Senior Strategic Advisor for Nanotechnology for the UK’s Engineering and Physical Sciences Research Council — a role informally known as “UK Nano-Champion” — has adopted Berkeley rhetoric professor Dale Carrico’s criticism of so-called superlative technology discourse. In a recent blog post, he responded to an article following on the heels of a TV series hosted by Michio Kaku, titled “We will have the power of gods”. Here is an extract with choice pieces, selected by Carrico:
“Superlative technology discourse… starts with an emerging technology with interesting and potentially important consequences, like nanotechnology, or artificial intelligence, or the medical advances that are making (slow) progress combating the diseases of aging. The discussion leaps ahead of the issues that such technologies might give rise to at the present and in the near future, and goes straight on to a discussion of the most radical projections of these technologies. The fact that the plausibility of these radical projections may be highly contested is by-passed by a …
Full name: Convention on the Prohibition of the Development, Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on their Destruction
Short name: Biological Weapons Convention (BWC)
Open for signature: April 10, 1972
Entered into force: March 26, 1975
Member states: 158
Map of member states:
Article I. Never under any circumstances to acquire or retain biological weapons.
Article II. To destroy or divert to peaceful purposes biological weapons and associated resources prior to joining.
Article III. Not to transfer, or in any way assist, encourage or induce anyone else to acquire or retain biological weapons.
Article IV. To take any national measures necessary to implement the provisions of the BWC domestically.
Article V. To consult bilaterally and multilaterally to solve any problems with the implementation of the BWC.
Article VI. To request the UN Security Council to investigate alleged breaches of the BWC and to comply with its subsequent decisions.
Article VII. To assist States which have been exposed to a danger as a result …
Some elitist-sounding comments by Marvin Minsky in a New Scientist article recently have caused some concern. Danielle Egan, the author of the article, posted the full transcript of her interview with Minsky in an effort to defend her choice of words and demonstrate that she wasn’t trying to misrepresent him. Some transhumanists disagreed with her. Here is Egan’s email in defense of herself, along with the full transcript, with the quoted parts bolded:
Danielle Egan here.
I’m here to stand behind the published quotes by Minsky. The transcript of our interview is copied below. You can judge for yourselves whether I took Minsky’s comments out of context or not.
(And also whether publishing any of his additional info would have softened or hardened Minsky’s published quotes.)
While I would have loved to write 5,000 words about the conference and included my interviews with all of the people I met there (and some I didn’t even get to meet), I had to deal with a 1,200 word assignment that was eventually reduced to just under 900 words by …
Should transhumanists be emotionally invested in particular technologies, such as molecular manufacturing, which could radically accelerate the transhumanist project?
My answer: for fun, sure. When serious, no. Transhumanists should always keep a part of themselves serving as a detached futurist. Even though accurate prediction of the future is absolutely impossible, it makes sense to estimate probability distributions; we all have them, and the difference is whether we explicitly acknowledge their presence and at least try to make them logically self-consistent.
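One way to take that advice literally is to write the estimates down and check them for basic coherence. A minimal sketch, where the outcome labels and numbers are purely illustrative:

```python
# Minimal self-consistency check on explicit probability estimates.
# The outcomes and numbers here are illustrative placeholders.

estimates = {
    "human-surpassing AI this century": 0.85,
    "no human-surpassing AI this century": 0.15,
}

# Each probability must lie in [0, 1], and mutually exclusive,
# exhaustive outcomes must sum to 1.
assert all(0.0 <= p <= 1.0 for p in estimates.values())
assert abs(sum(estimates.values()) - 1.0) < 1e-9
print("estimates are self-consistent")
```

This catches only the crudest incoherence, but making the numbers explicit at all is the point.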
For instance, right now I believe there is an over 85% chance that, barring catastrophic disaster, AI surpassing the human level will be developed by the end of this century.
Do I have an emotional investment in this prediction? No. What do I desire? Radical life extension, intelligence enhancement, permanent elimination of disease and war, and even the abolition of suffering in all sentient life. But this does not require artificial intelligence. I believe AI could accelerate these goals, but they are not dependent on it, so there is little personal emotional investment.
Most transhumanist goals are achievable …