I recently found an article on existential risks in the Guardian from about a year ago. Of course, the Martin Rees book is immediately cited as justification for running the piece. In the article, 10 risks are summarized and scientists are asked to give a paragraph or two of commentary. They are, in no particular order:
1: Climate change
2: Telomere erosion
3: Viral pandemic
4: Terrorism
5: Nuclear war
6: Meteorite impact
7: Robots taking over
8: Cosmic ray blast from exploding star
9: Supervolcano eruption
10: Earth swallowed by a black hole
The likelihood of many of these risks was exaggerated in the piece. Climate change could be troublesome, but it operates over long timescales and couldn't possibly kill us all. Terrorism and nuclear war wouldn't kill everyone either. Telomere erosion is just silly. Large meteorite impacts, supervolcano explosions, and cosmic ray blasts all happen only once every few tens of millions of years, so I think we're okay for now.
A viral pandemic is a serious risk, though not as great as some others. Getting killed by recursively self-improving robots/AI is probably the greatest risk to our future, and the most poorly understood (thanks to Hollywood and our innate tendency to anthropomorphize and mechanomorphize). The Earth being swallowed by a black hole or stable strangelet generated in a particle accelerator is one of the wild cards. The Brookhaven study rates the probability as negligible, but other studies still encourage caution. Anthropic calculations by Max Tegmark and Nick Bostrom give an upper bound of once per 10^9 years for the occurrence of such disasters. And extremely high-energy cosmic rays slam into the Moon regularly without creating stable strangelets, which is the standard empirical argument that accelerator collisions are safe.
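Both frequency claims translate into tiny numbers on a human timescale. Here is a minimal sketch of the conversion, assuming the events follow a Poisson process; the 30-million-year recurrence interval is my round stand-in for "once every few tens of millions of years," and the 10^9-year interval is the Tegmark-Bostrom upper bound:

```python
import math

def prob_within(t_years, mean_interval_years):
    """Probability of at least one event in the next t_years,
    assuming a Poisson process with the given mean recurrence
    interval (so the rate is 1 / mean_interval_years per year)."""
    rate = 1.0 / mean_interval_years
    return 1.0 - math.exp(-rate * t_years)

# ~30 Myr: assumed stand-in for impacts/supervolcanoes/cosmic ray blasts.
print(prob_within(100, 30e6))  # ~ 3.3e-06 per century

# 1e9 yr: Tegmark-Bostrom anthropic upper bound for accelerator disasters.
print(prob_within(100, 1e9))   # <= 1.0e-07 per century
```

For events this rare the exponential barely matters: the probability is essentially t / T, so a once-per-30-million-years event has roughly a one-in-300,000 chance of happening in any given century.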
Risks they missed: deliberate or accidental misuse of nanotechnology, badly programmed superintelligence, genetically engineered pathogens, repressive totalitarian global regime, takeover by a transcending upload, or something unforeseen. See the classic Bostrom paper on the issue here. For organizations working on comprehensive solutions to address global risk, see the Lifeboat Foundation and the Singularity Institute.