http://peakoil.com/generalideas/twelve- ... -could-end
That Oxford University group I referenced above seems to have put out a paper, although I could not find it at their site a couple of weeks ago.
Our front page has a link to a Financial Times article, which is itself a PITA to chase down, so I'll just leave it as a self-referential link to the PO article. Thanks to whoever put that up.
This is an interesting article; not very useful, but interesting.
First, one needs to understand the question they are attempting to answer.
They are looking for "infinite impacts," which seems to mean things that might end humanity: extinction. They are not assessing things that would merely reduce our population, but things that would take it to zero. A somewhat different question from "what are our five biggest problems?", though still related.
Also, they assign probabilities to nine of the 12. Artificial Intelligence (0 to 10%) stands well above the rest.
Three have no numerical assessment: Ecological Collapse, Global System Collapse, and Bad Governance have too many unknowns to estimate. In my mind all three are driven by overpopulation. It is pretty easy to see the link to ecological collapse and system collapse. For Bad Governance I would look to Asimov's infamous "bathroom metaphor" for the explanation.
http://en.m.wikipedia.org/wiki/Isaac_Asimov
Thus I think this is a cop-out. We damn well know where population is going, and that will drive ecological collapse and/or system collapse. PC Wimps!
And I think their concern with Artificial Intelligence is misguided.
...............................................
Unknown consequences
A catch-all category to cover the unknown unknowns — an amalgamation of all the risks that we have not thought of or that seem ridiculously unlikely in isolation (such as sending signals to extraterrestrial civilisations that attract deadly alien attention). Together they represent a significant apocalyptic threat.
Probability: 0.1%
Asteroid impact
An asteroid at least 5km across — big enough to end civilisation, if not wipe out human life — hits Earth about once every 20 million years. But programs to map hazardous objects are making progress and, given enough warning, a concerted effort by the world’s space powers might succeed in deflecting an incoming asteroid on to a non-collision path.
Probability: 0.00013%
Artificial intelligence
AI is the most discussed apocalyptic threat at the moment. But no one knows whether there is a real risk of extreme machine intelligence taking over the world and sweeping humans out of their way. The study team therefore gives a very wide probability estimate.
Probability: 0-10%
Supervolcano
An eruption ejecting thousands of cubic kilometres of material into the atmosphere — far larger than anything experienced in human history — could lead to a “volcanic winter”, with effects similar to an asteroid impact or nuclear war. Such events are known from the geological record to have caused mass extinctions. And with today’s technology, there is not much we could do to prevent its effects.
Probability: 0.00003%
Ecological collapse
A full collapse of the global ecosystem, so that the planet could no longer sustain a population of billions, is one of the most complex risks in the study. Because many unknown sequences would be involved, the team does not even guess at a probability.
Probability: n/a
Bad global governance
This category covers mismanagement of global affairs so serious that it is the primary cause of civilisation collapse (rather than a secondary response to other disasters). One example would be the emergence of an utterly incompetent and corrupt global dictatorship. The probability is impossible to estimate.
Probability: n/a
Global system collapse
This means economic and/or societal collapse, involving civil unrest and a breakdown of law and order that makes the continuation of civilised life impossible anywhere on Earth. There are too many unknowns to give a probability estimate.
Probability: n/a
Extreme climate change
Conventional modelling of climate change induced by human activity (adding carbon dioxide to the atmosphere) has focused on the most likely outcome: global warming by up to 4C. But there is a risk that feedback loops, such as the release of methane from Arctic permafrost, could produce an increase of 6C or more. Mass deaths through starvation and social unrest could then lead to a collapse of civilisation.
Probability: 0.01%
Nuclear war
A nuclear war between the US and Russia was the chief apocalyptic fear of the late 20th century. That threat may have receded but, with the proliferation of nuclear weapons, there is still a risk of a conflict serious enough to cause a “nuclear winter” as a pall of smoke in the stratosphere shuts out sunlight for months. That could put an end to civilised life regardless of the bombs’ material impact.
Probability: 0.005%
Global pandemic
An apocalyptic disease would combine incurability (like Ebola), lethality (like rabies), extreme infectiousness (like the common cold) and a long incubation period (like HIV/Aids). If such a virus spread around the world before people were aware of the danger, the international health system would have to move with unprecedented speed and resources to save mankind.
Probability: 0.0001%
Synthetic biology
Genetic engineering of new super-organisms could be enormously beneficial for humanity. But it might go horribly wrong, with the emergence and release, accidentally or through an act of war, of an engineered pathogen targeting humans or a crucial part of the global ecosystem. The impact could be even worse than any conceivable natural pandemic.
Probability: 0.01%
Nanotechnology
Ultra-precise manufacturing on an atomic scale could create materials with wonderful new properties, but they could also be used in frightening new weapons. There is also the “grey goo” scenario of self-replicating nanomachines taking over the planet.
Probability: 0.01%
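...............................................
For what it's worth, here is a quick back-of-envelope tally of the numbers above. This is a sketch only: it assumes the percentages are per-century figures (the excerpt never actually states the time horizon) and uses a simple sum rather than whatever aggregation the report itself applies.
[code]
import math

# Point estimates quoted in the FT summary (per cent). The 0-10% AI
# range and the three "n/a" risks are left out of the tally.
risks = {
    "Unknown consequences": 0.1,
    "Asteroid impact": 0.00013,
    "Supervolcano": 0.00003,
    "Extreme climate change": 0.01,
    "Nuclear war": 0.005,
    "Global pandemic": 0.0001,
    "Synthetic biology": 0.01,
    "Nanotechnology": 0.01,
}

# Simple sum -- a fair approximation for rare, roughly independent
# events, though the report may well aggregate differently.
print(f"Sum of point estimates: {sum(risks.values()):.5f}%")  # ~0.13526%

# Sanity check on the asteroid entry: a "once every 20 million years"
# event, modelled as Poisson, gives the chance of at least one hit
# in a 100-year window.
p_hit = 1 - math.exp(-100 / 20_000_000)
print(f"Raw 5km-impact chance per century: {p_hit * 100:.5f}%")  # ~0.00050%
# The report's 0.00013% is lower, presumably because it discounts for
# the deflection prospects mentioned in that same paragraph.
[/code]
The point estimates sum to roughly 0.135%, which is exactly why that 0-10% AI range towers over everything else on the list.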