5 Biggest Problems

General discussions of the systemic, societal and civilisational effects of depletion.

Re: 5 Biggest Problems

Unread post by dinopello » Tue 10 Feb 2015, 18:09:41

The list one might come up with really depends on how one interprets the question: the physical and time scales one assumes, whether one includes problems we can do something about, and so on.

Here in the US:

I think a lot of the problems are political, meaning how well we and our systems support wise group decision-making and policy.

1) Apathy at the local level, demonstrated by people not participating fully in local governance.
2) Structural problems caused by gerrymandering that make it nearly impossible for the center to hold.

Economic problems are certainly there, but I tend to discount the ones people bring up related to monetary issues. It's all made up anyway.

3) A big issue is wealth disparity, and the fact that it is moving in the wrong direction.
4) Another issue, especially under the most optimistic outlook, is what people are going to do in the society of the future and how they are going to get there from where they are now.

5) The information revolution has led people to gravitate toward their own facts rather than toward better opinions based on better access to the facts, and the fourth estate has failed us.

Generally, the psychological state of the average person has been taxed by feelings of hopelessness, powerlessness, and sheer overload.

Most of the problem, though, is the structure and operation of the current political system.

So I tried not to give the same old answers of population, environment, greed, delusion... although I wouldn't say those aren't problems.
dinopello
Light Sweet Crude
Posts: 6088
Joined: Fri 13 May 2005, 03:00:00
Location: The Urban Village

Re: 5 Biggest Problems

Unread post by Newfie » Tue 10 Feb 2015, 18:16:39

In my mind these are BIG, REALLY BIG questions.

You missed a bit of the context from where Pops posted it on the 15 billion thread.

But think very top-level, human existential questions.
Newfie
Forum Moderator
Posts: 18651
Joined: Thu 15 Nov 2007, 04:00:00
Location: Between Canada and Caribbean

Re: 5 Biggest Problems

Unread post by radon1 » Thu 12 Feb 2015, 00:39:59

I would take a diverging view and specify two main problems instead of those five:

1. Over the shorter term: nothing-doers (moochers).
2. Fundamentally: the state of human consciousness.

On the other hand, it may be completely off-topic, depending on how it is viewed.
radon1
Intermediate Crude
Posts: 2054
Joined: Thu 27 Jun 2013, 06:09:44

Re: 5 Biggest Problems

Unread post by Newfie » Sun 15 Feb 2015, 10:42:01

http://peakoil.com/generalideas/twelve- ... -could-end

That Oxford University group I referenced above seems to have put out a paper, although I could not find it at their site a couple of weeks ago.

Our front page has a link to a Financial Times article, which itself is a PITA to chase, so I'll just leave it as a self-referential link to the PO article. Thanks to whoever put that up.

This is an interesting article; not very useful, but interesting.

First, one needs to understand the question they are attempting to answer.

They are looking for "infinite impacts," which seems to mean things that might end humanity: extinction. They are not assessing things that would merely reduce our population, but things that would take it to zero. That is a somewhat different question from "what are our 5 biggest problems," but still related.

Also, they assign probabilities to nine of the 12. Artificial Intelligence (0 to 10%) stands above the rest.

Three have no numerical assessment: Ecological Collapse, Global System Collapse, and Bad Governance have too many unknowns to estimate. In my mind all of these are driven by overpopulation. It is pretty easy to see the link to ecological collapse and system collapse. For bad governance I would look to Asimov's infamous "Bathroom Metaphor" for an explanation.
http://en.m.wikipedia.org/wiki/Isaac_Asimov

Thus I think this is a cop-out. We damn well know where population is going, and that will drive ecological collapse and/or system collapse. PC wimps! :x

And I think their concern with Artificial Intelligence is misguided.


...............................................
Unknown consequences
A catch-all category to cover the unknown unknowns — an amalgamation of all the risks that we have not thought of or that seem ridiculously unlikely in isolation (such as sending signals to extraterrestrial civilisations that attract deadly alien attention). Together they represent a significant apocalyptic threat.
Probability: 0.1%

Asteroid impact
An asteroid at least 5km across — big enough to end civilisation, if not wipe out human life — hits Earth about once every 20 million years. But programs to map hazardous objects are making progress and, given enough warning, a concerted effort by the world’s space powers might succeed in deflecting an incoming asteroid on to a non-collision path.
Probability: 0.00013%

Artificial intelligence
AI is the most discussed apocalyptic threat at the moment. But no one knows whether there is a real risk of extreme machine intelligence taking over the world and sweeping humans out of their way. The study team therefore gives a very wide probability estimate.
Probability: 0-10%


Supervolcano
An eruption ejecting thousands of cubic kilometres of material into the atmosphere — far larger than anything experienced in human history — could lead to a “volcanic winter”, with effects similar to an asteroid impact or nuclear war. Such events are known from the geological record to have caused mass extinctions. And with today’s technology, there is not much we could do to prevent its effects.
Probability: 0.00003%

Ecological collapse
A full collapse of the global ecosystem, so that the planet could no longer sustain a population of billions, is one of the most complex risks in the study. Because so many unknowns would be involved, the team does not even guess at a probability.
Probability: n/a

Bad global governance
This category covers mismanagement of global affairs so serious that it is the primary cause of civilisation collapse (rather than a secondary response to other disasters). One example would be the emergence of an utterly incompetent and corrupt global dictatorship. The probability is impossible to estimate.
Probability: n/a

Global system collapse
This means economic and/or societal collapse, involving civil unrest and a breakdown of law and order that makes the continuation of civilised life impossible anywhere on Earth. There are too many unknowns to give a probability estimate.
Probability: n/a

Extreme climate change
Conventional modelling of climate change induced by human activity (adding carbon dioxide to the atmosphere) has focused on the most likely outcome: global warming by up to 4C. But there is a risk that feedback loops, such as the release of methane from Arctic permafrost, could produce an increase of 6C or more. Mass deaths through starvation and social unrest could then lead to a collapse of civilisation.
Probability: 0.01%

Nuclear war
A nuclear war between the US and Russia was the chief apocalyptic fear of the late 20th century. That threat may have reduced but, with proliferation of nuclear weapons, there is still a risk of a conflict serious enough to cause a “nuclear winter” as a pall of smoke in the stratosphere shuts out sunlight for months. That could put an end to civilised life regardless of the bombs’ material impact.
Probability: 0.005%

Global pandemic
An apocalyptic disease would combine incurability (like Ebola), lethality (like rabies), extreme infectiousness (like the common cold) and a long incubation period (like HIV/Aids). If such a virus spread around the world before people were aware of the danger, the international health system would have to move with unprecedented speed and resources to save mankind.
Probability: 0.0001%

Synthetic biology
Genetic engineering of new super-organisms could be enormously beneficial for humanity. But it might go horribly wrong, with the emergence and release, accidentally or through an act of war, of an engineered pathogen targeting humans or a crucial part of the global ecosystem. The impact could be even worse than any conceivable natural pandemic.
Probability: 0.01%

Nanotechnology
Ultra-precise manufacturing on an atomic scale could create materials with wonderful new properties but they could also be used in frightening new weapons. There is also the “grey goo” scenario of self-replicating nanomachines taking over the planet.
Probability: 0.01%
Newfie
Forum Moderator
Posts: 18651
Joined: Thu 15 Nov 2007, 04:00:00
Location: Between Canada and Caribbean
