
Re: The Singularity Summit 2008

Post by EnergyUnlimited » Sat 30 Aug 2008, 16:27:03

TWilliam wrote:
What I see as the biggest challenge to the concept of the Singularity is this: what if in fact the mystical traditions are correct and what we define as matter is a byproduct of Consciousness, rather than Consciousness being a byproduct of matter? Brain being a manifestation of mind rather than mind being a manifestation of brain, in other words? What then of the idea of conscious machines?

There are far more down-to-earth challenges, though.

For example, once you have difficulty getting enough food, struggle to pay your bills, get no R & D budget, and live in a collapsing society facing intractable environmental problems, would you still carry on working on some silly AI project, knowing that you are not going to get paid for it, that no one will need your work, and that your wife or girlfriend is keeping herself busy with someone more useful than you while you sit in a forgotten lab that enjoys three hours of electricity a day?
EnergyUnlimited
Light Sweet Crude
Posts: 7537
Joined: Mon 15 May 2006, 03:00:00

Re: The Singularity Summit 2008

Post by Carlhole » Sat 30 Aug 2008, 16:47:53

Global Catastrophic Risks Conference
Reason Online wrote:
This week the Future of Humanity Institute at Oxford University, headed by bioprogressive philosopher Nick Bostrom, is convening a conference on Global Catastrophic Risks. The Institute's work focuses on how radical technological developments such as nanotechnology, artificial intelligence, and life-extension treatments will affect the human condition. One of the Institute's research programs is global catastrophic risks, which mulls questions like: What are the biggest threats to global civilization and human well-being? Will the human species survive the 21st century?

Quote:
The whole cheery conference kicks off this evening with a talk by Sir Crispin Tickell entitled, "Humans: Past, Present and Future." Apparently Tickell buys into the whole litany of environmentalist doom. However, he thinks that doom can be avoided if we "radically change our thinking on global governance" and pursue some "interesting" technological options.

I found this article by following links from the Singularity Summit 2008 website. Apparently, the Oxford group responsible for the Global Catastrophic Risks Conference will be present in force at the SS08.
Carlhole
 

Re: The Singularity Summit 2008

Post by EnergyUnlimited » Sat 30 Aug 2008, 17:01:33

Carlhole wrote:
The people who are interested in the singularity idea are extremely bright people. They are not living under rocks. They are aware of all the arguments that the peak oil crowd and others have put forth about declining petroleum production and the limits of Earth's mineral and ecological resources...
And yet they still hope to get to this singularity, whatever it might be.

Not very bright, it seems... :)

Quote:
But they have said that adversity only strengthens the momentum of technological innovation. And as far as I can tell, the momentum of technological progress is showing absolutely NO SIGNS whatsoever of slowing down due to higher energy costs or other limitations.

The ultimate limits on progress would come in any case from restrictions dictated by materials science: a huge albeit finite number of applications can come out of the limited number of chemical elements and other forms of matter at our disposal.

Quote:
If you are making the claim that technological progress WILL slow down and ultimately stop, then it's not unreasonable for skeptics to ask you WHEN the slowdown will become evident. How long will it take for technological advancement to come to a dead stop? I don't see ANY signs of that happening.

For the reasons stated above it has to stop at some point, whether one likes it or not, but it will stop much earlier for far more down-to-earth reasons.

Quote:
When you predict a die-off, you are claiming that total energy availability will go SO LOW that people will be entirely unable to adapt. This does not look likely at all. It looks MORE likely that people will adapt to a different energy mix which has less growth associated with it but which is stable and more or less continuous.

A dieoff may come from a combination of reasons entirely unrelated to energy supply.

Quote:
A singularitarian would simply look at the amount of sunlight falling on the Earth and the percentage that could reasonably be captured. The singularitarian would also look at the trend of nanotechnologies that advance the efficiencies of solar power. This minimum level of energy availability PRECLUDES a major die-off. Plenty of other options exist apart from solar energy.

That won't help much if an airborne Ebola-like virus with a latent (and yet infective) period of two weeks appeared out of the blue in February next year. The same holds for a clathrate-gun activation within a decade from now, or for massive global wars related to various shortages.

Quote:
And, as long as there are no abrupt, unavoidable cataclysms, the trend of technological solutions to every imaginable problem will continue and thrive.

Yep. Within the next 100 years we will have a time machine.
Quote:
Human Beings are extraordinarily adaptive creatures. That's how Nature designed the beast.

And what if the only path to adaptation left also leads to a low-tech world?

Quote:
A big sudden decline of a small percentage of population could occur -- but that would not stop technology.

What about a slow drop in numbers, with extreme poverty for all still alive?

Quote:
The world will probably witness a qualitative increase in the health, intelligence and viability of the average human being during this time of population decline.

Increased health during a dieoff? :-D :-D :-D

Quote:
We will probably also see the advent of Machine Intelligence (powerful increasing intelligence is the most valuable thing in the Universe).

How could that help you?

BTW, a higher intelligence (if created) will in all probability not need you.

Quote:
This is something that will create an unimaginable revolution in how life on Planet Earth is lived.

It would, if created.

Quote:
Perhaps some immensely powerful consciousness will review the Life on the planet over a billion years and muse that a highly developed intelligence was what the Earth must have had in mind from the beginning. And that's why we had to have the 20th Century with all its population growth, consumerism, etc. It may all look quite organic in the perspective of the rear view mirror. How else to explain the progression of Life's complexity since single cells first appeared? We are not talking about anything new here.

Evolution of life has no purpose other than the facilitation of faster entropy increase.

Re: The Singularity Summit 2008

Post by Carlhole » Sat 30 Aug 2008, 17:25:07

Usually, in science, you posit a theory and then use it to make a prediction.

To say that technological progress MUST stop is mere gibberish until you make some concrete prediction about it.

Everyone can plainly see that the rate of technological progress is increasing as never before. Some really wild things are in a nascent stage. One only has to extrapolate a little to see that the direction the rapid technological advances are taking us is towards a mind-blowing future.

Until you can show that technology is slowing or stopping according to your own prognosis, I'm afraid you're completely full of sh*t.

Re: The Singularity Summit 2008

Post by pedalling_faster » Sat 30 Aug 2008, 18:22:58

like computers, AI is a useful tool.

there was something about the transition to DDR400 in 2003 (a kind of computer memory used in the 2003-2004 period) that really struck me... i look at a computer motherboard with DDR400 memory and i see AI: "this thing is going to be smarter than us people". in 10 years, 20 years, whatever.

so it is a subject for interesting conferences, provides employment at the Stanford AI center, and maybe will lead to phpBB forum software that loads the forums i usually read at PO.com.

but - so what ?

we do not have a shortage of energy, we have a shortage of cooperation.

the initial uses of AI are a reflection of one of its primary funding sources : the US military.

and giving the US military smarter weapons, given their recent track record (since 2003, since 1963, since 1898, however you count), does not strike me as a step forward.

the electronics industry has a caveman approach to controlling waste products from manufacturing - move production to Tijuana, dump the waste in the river.

as long as the electronic hardware which facilitates AI is manufactured in a way that makes the Earth less sustainable - how is AI helping us ?

it just highlights an irony - human beings are smart enough to make fast computers, but so stupid they make their living place unliveable with pollution.
http://www.LASIK-Flap.com/ ~ Health Warning about LASIK Eye Surgery
pedalling_faster
Permanently Banned
Posts: 1399
Joined: Sat 10 Dec 2005, 04:00:00

Re: The Singularity Summit 2008

Post by Carlhole » Sat 30 Aug 2008, 18:48:24

pedalling_faster wrote:
but - so what ? we do not have a shortage of energy, we have a shortage of cooperation.
the initial uses of AI are a reflection of one of its primary funding sources : the US military.
and giving the US military smarter weapons, given their recent track record (since 2003, since 1963, since 1898, however you count), does not strike me as a step forward.
the electronics industry has a caveman approach to controlling waste products from manufacturing - move production to Tijuana, dump the waste in the river. as long as the electronic hardware which facilitates AI is manufactured in a way that makes the Earth less sustainable - how is AI helping us?
it just highlights an irony - human beings are smart enough to make fast computers, but so stupid they make their living place unliveable with pollution.

This is one of the reasons for having a conference dedicated to anticipating the problems associated with the Singularity.

As I said before, members of the Global Catastrophic Risk Conference will be at the SS08 in numbers:

GCRC wrote:
Global catastrophic risks are risks that seriously threaten human well-being on a global scale. An immensely diverse collection of events could constitute global catastrophes: potential factors range from volcanic eruptions to pandemic infections, nuclear accidents to worldwide tyrannies, out-of-control scientific experiments to climatic changes, and cosmic hazards to economic collapse.

Global catastrophes have occurred many times in history, even if we only count disasters causing more than 10 million deaths. A very partial list of examples includes the An Shi Rebellion (756-763), the Taiping Rebellion (1851-1864), the famine of the Great Leap Forward in China, the Black Death in Europe, the Spanish flu pandemic, the two World Wars, the Nazi genocides, the famines in British India, Stalinist totalitarianism, and the decimation of the Native American population through smallpox and other diseases following the arrival of European colonizers. Many others could be added to this list.

Although the current and future risks are of various kinds, treating global catastrophic risk as a field for academic enquiry is a useful, coherent and important endeavour.

Re: The Singularity Summit 2008

Post by Carlhole » Sat 30 Aug 2008, 18:50:02

http://www.youtube.com/v/9PWXrnsSrf0

Ray Kurzweil (pt1of3) The Singularity Summit at Stanford 2006

Re: The Singularity Summit 2008

Post by Carlhole » Sat 30 Aug 2008, 20:10:07

Whatever it was that Golem said (he's on my 'ignore' list), I can guarantee that it was utter gibberish.

Golem... If you don't mind... Please stay out of my threads. Your blithering comments aren't welcome. I stay out of your threads so... fair enough.

Re: The Singularity Summit 2008

Post by outcast » Sat 30 Aug 2008, 21:15:12

Quote:
That is correct under the presumption that a dieoff would proceed in an environment where the doomed ones would curl up and die quietly, singing Kumbaya in the meantime. In the real world, however, a dieoff will take the shape of a string of worldwide genocidal wars, in which technology development centers will become primary targets of weapons of mass destruction and enemy scientists an important group of those to be exterminated.

Wow, just wow. So basically what you're saying is that we're all going to die?

Quote:
Don't you observe that the international order is already on the brink of collapse?

Actually this is the most peaceful time in history, as sad as that is. It just doesn't seem that way because we always have the media in our faces trying to scare the shit out of us.

Quote:
It is also a prerequisite of progress for the industrial base permitting it to exist and function well. So how are you going to preserve the existing industrial base through the coming era of collapse? Again, even if you preserved knowledge but lost all means to exercise it in practice, your knowledge is rendered useless.

RE consumerism and progress: Without consumerism there is no need to produce advanced items. No one will need these in sufficiently large quantities to warrant future research. Infrastructure required for their production and for further R & D will fall into disrepair.

This assumes that industrial civilization is going to collapse, and there isn't much evidence that it will.

Quote:
A culling might be scary for an individual to try to live through, but it wouldn't be scary for descendants of survivors remembering it via artificial memory (or whatever).

Unfortunately you will run out of the time and resources required to maintain industrial infrastructure before you can even think about creating such entities. Anyway, there is no reason to create them; they would compete with the remaining humans for dwindling resources.

The Club of Rome in the '70s said that something like that would happen by now... it didn't.
outcast
Tar Sands
Posts: 885
Joined: Mon 21 Apr 2008, 03:00:00

Re: The Singularity Summit 2008

Post by gampy » Sun 31 Aug 2008, 00:07:27

I am somewhat familiar with this idea of the "Singularity".

It's certainly an interesting premise. Shades of Terminator, notwithstanding.

I think some of the folk here are sceptical because they are thinking in terms of human intelligence.
Intelligence comes in various degrees, and flavours, I would say. Some of it is obvious, a lot of it is not.

This singularity may resemble a hive, or insect type of intelligence.

Or more likely, something so alien and inconceivable, because the human mind is just that. Human. It perceives reality, and other intelligences, through the prism of its evolutionary and biological constraints.

I can see it happening, but I wonder if our high-energy society has enough time left to make it a reality. I wonder what the Singularity would think of peak oil, environmental degradation, and climate change.

Probably what Agent Smith thought. Cancer. Humans are a cancer. A disease. Most likely terminal.

Time for some chemo, and radiation.
"Some people are like Slinky's. They don't serve a useful purpose, but they still bring a smile to your face when you push them down the stairs."
gampy
Tar Sands
Posts: 761
Joined: Fri 27 Oct 2006, 03:00:00
Location: Soviet Canada

Re: The Singularity Summit 2008

Post by Carlhole » Sun 31 Aug 2008, 00:33:46

gampy wrote:
I am somewhat familiar with this idea of the "Singularity".

It's certainly an interesting premise. Shades of Terminator, notwithstanding.

I think some of the folk here are sceptical because they are thinking in terms of human intelligence.
Intelligence comes in various degrees, and flavours, I would say. Some of it is obvious, a lot of it is not.

This singularity may resemble a hive, or insect type of intelligence.

Or more likely, something so alien and inconceivable, because the human mind is just that. Human. It perceives reality, and other intelligences, through the prism of its evolutionary and biological constraints.

I can see it happening, but I wonder if our high-energy society has enough time left to make it a reality. I wonder what the Singularity would think of peak oil, environmental degradation, and climate change.

Probably what Agent Smith thought. Cancer. Humans are a cancer. A disease. Most likely terminal.

Time for some chemo, and radiation.


Watch the YouTube Ray Kurzweil talk I posted a few posts back.

Also:

The Singularity

Quote:
The technological singularity is a theoretical future point of unprecedented technological progress, caused in part by the ability of machines to improve themselves using artificial intelligence.[1]

Statistician I. J. Good first wrote of an "intelligence explosion", suggesting that if machines could even slightly surpass human intellect, they could improve their own designs in ways unforeseen by their designers, and thus recursively augment themselves into far greater intelligences. The first such improvements might be small, but as the machine became more intelligent it would become better at becoming more intelligent, which could lead to an exponential and quite sudden growth in intelligence.

Vernor Vinge later called this event "the Singularity" as an analogy between the breakdown of modern physics near a gravitational singularity and the drastic change in society he argues would occur following an intelligence explosion. In the 1980s, Vinge popularized the singularity in lectures, essays, and science fiction. More recently, some prominent technologists such as Bill Joy, founder of Sun Microsystems, voiced concern over the potential dangers of Vinge's singularity (Joy 2000). Following its introduction in Vinge's stories, particularly Marooned in Realtime and A Fire Upon the Deep, the singularity has also become a common plot element throughout science fiction.

Others, most prominently Ray Kurzweil, define the singularity as a period of extremely rapid technological progress. Kurzweil argues such an event is implied by a long-term pattern of accelerating change that generalizes Moore's Law to technologies predating the integrated circuit and which he argues will continue to other technologies not yet invented. Critics of Kurzweil's interpretation consider it an example of static analysis, citing particular failures of the predictions of Moore's Law.

Robin Hanson proposes that multiple "singularities" have occurred throughout history, dramatically affecting the growth rate of the economy. Like the agricultural and industrial revolutions of the past, the technological singularity would increase economic growth between 60 and 250 times. An innovation which allowed a replacement for virtually all human labor could trigger this singularity.
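Good's recursive-improvement argument in the excerpt above can be sketched as a toy numeric model (a minimal sketch: the 10% gain per design cycle is an assumed illustrative number, not a figure from Good or Kurzweil):

```python
# Toy model of Good's "intelligence explosion": each design cycle the
# improvement step is proportional to current intelligence, so growth
# compounds instead of adding a fixed increment.
intelligence = 1.0        # "human level" in arbitrary units (assumption)
gain = 0.10               # assumed fractional self-improvement per cycle
for cycle in range(50):
    intelligence *= 1 + gain   # a better designer makes a bigger next step
print(round(intelligence, 1)) # compounding gives 1.1**50, roughly 117x
```

Whether real systems could sustain a constant fractional gain is exactly what the skeptics in this thread dispute; with any fixed ceiling the same loop flattens out instead of exploding.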

Re: The Singularity Summit 2008

Post by EnergyUnlimited » Sun 31 Aug 2008, 02:03:48

Carlhole wrote:
Usually, in science, you posit a theory and then use it to make a prediction.

To say that technological progress MUST stop is mere gibberish until you make some concrete prediction about it.

It is also gibberish to suggest that progress must continue forever.

I have already pointed out to you that constraints dictated by materials science (a limited number of available building blocks) imply a finite number of possible applications.

Another, more fundamental barrier takes the shape of the actual laws of physics, which you cannot "cheat" somehow, regardless of how intelligent you are.

To suggest otherwise is just silly.

Quote:
Everyone can plainly see that the rate of technological progress is increasing as never before. Some really wild things are in a nascent stage.

Once consumerism is gone, your wonder items will be of little use.
If corporations cannot sell enough to get an adequate return on investment, they will lose interest in new developments.

Quote:
One only has to extrapolate a little to see that the direction the rapid technological advances are taking us is towards a mind-blowing future.

Exponential extrapolations of current trends in a limited physical world lead to absurd predictions.
So why bother with them?
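The point about exponential extrapolation can be made concrete with a back-of-envelope sketch (all numbers here are rough assumptions for illustration): keep doubling a 2008-era transistor count every two years and see how soon the extrapolation demands more transistors than there are atoms in the chip.

```python
# Naive Moore's-Law extrapolation against a hard physical budget.
# Assumed numbers: ~1e9 transistors on a 2008 chip, doubling every
# 2 years, ~1e22 atoms in a few grams of silicon.
transistors = 1e9   # assumed 2008 starting point
atoms = 1e22        # assumed atom budget for one chip
years = 0
while transistors < atoms:
    transistors *= 2    # one doubling...
    years += 2          # ...every two years
print(years)  # 88 -- the extrapolation wants one transistor per atom within a century
```

The point is not the exact year but that any fixed-rate exponential crosses any finite physical bound in logarithmic time.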
Quote:
Until you can show that technology is slowing or stopping according to your own prognosis, I'm afraid you're completely full of sh*t.

Read above. I have already provided strong enough reasons (limits dictated by materials science and by the laws of physics themselves).

Until you show that the progress of technology cannot stop (or prove that the WTC attack was a G. W. Bush government conspiracy), I am afraid you're completely full of sh*t.

Just a silly little girl who thinks that because today she has an iPod, tomorrow she must have a time machine. :-D :-D :-D

Re: The Singularity Summit 2008

Post by EnergyUnlimited » Sun 31 Aug 2008, 02:12:17

Discussion of Carlhole's quote:

Quote:
Statistician I. J. Good first wrote of an "intelligence explosion", suggesting that if machines could even slightly surpass human intellect, they could improve their own designs in ways unforeseen by their designers, and thus recursively augment themselves into far greater intelligences.

This implies that humans (already intelligent) should be able to improve their intelligence ad infinitum.
Utterly absurd.

Quote:
The first such improvements might be small, but as the machine became more intelligent it would become better at becoming more intelligent, which could lead to an exponential and quite sudden growth in intelligence.

This flies in the face of constraints dictated by materials science (yet another absurdity).

Re: The Singularity Summit 2008

Post by Carlhole » Sun 31 Aug 2008, 02:15:40

EnergyUnlimited wrote:
Once consumerism is gone your wonder items will be of little use.

They're not my "wonder items", dude.

I'm posting about something that some very high-calibre people have gotten behind. These are people who are well aware of subjects such as peak oil theory, resource constraints, overpopulation, ecological destruction and so forth.

We are talking about people such as top management at Intel. An Oxford group called The Global Catastrophic Risks Conference will be presenting as well. It is their mission to identify, report on, and recommend solutions for various things which pose a threat to human civilization.

A lot of people who post here at PO.com can barely string sentences together, can barely think their way out of a wet paper bag, and have absolutely no credentials whatsoever other than some dumbass username -- and they think they are more credible than the people who will be presenting at The Singularity Summit?

Maybe you should go to the Singularity Summit 2008 and present your Doomer science to them. Straighten them out on a few things. But don't blame it on the messenger.

Re: The Singularity Summit 2008

Post by EnergyUnlimited » Sun 31 Aug 2008, 02:58:41

Carlhole wrote:
I'm posting about something that some very high-calibre people have gotten behind. These are people who are well aware of subjects such as peak oil theory, resource constraints, overpopulation, ecological destruction and so forth.

You are easily impressed by people who claim to be an "authority".

In fact they are a bundle of loons under the leadership of The Man Who Refused to Die (Ray Kurzweil).

Man's hubris has no limits.
Quote:
We are talking about people such as top management at Intel.

Do you really expect Intel executives to say that they see looming barriers to further chip development?
Shareholders would not be proud of them...

Quote:
...and they think they are more credible than the people who will be presenting at The Singularity Summit?

...or than those who are presenting at the Flat Earth Summit.

Quote:
Maybe you should go to the Singularity Summit 2008 and present your Doomer science to them. Straighten them out on a few things. But don't blame it on the messenger.

Why?
I am not concerned about the singularity. I don't see any prospect of delivering one.
And Ray Kurzweil will still die... even if he hates the idea. :-D

Re: The Singularity Summit 2008

Post by Carlhole » Sun 31 Aug 2008, 03:40:32

EnergyUnlimited wrote:
Carlhole wrote:
I'm posting about something that some very high-calibre people have gotten behind. These are people who are well aware of subjects such as peak oil theory, resource constraints, overpopulation, ecological destruction and so forth.

You are easily impressed by people who claim to be an "authority".

Quote:
We are talking about people such as top management at Intel.

Do you really expect Intel executives to say that they see looming barriers to further chip development?
Shareholders would not be proud of them...

Quote:
...and they think they are more credible than the people who will be presenting at The Singularity Summit?

...or than those who are presenting at the Flat Earth Summit.

Quote:
Maybe you should go to the Singularity Summit 2008 and present your Doomer science to them. Straighten them out on a few things. But don't blame it on the messenger.

Why?
I am not concerned about the singularity.
And Ray Kurzweil will still die... even if he hates the idea. :-D

All of that sounds very childish to me. It's a very narrow-minded person who does not believe that accomplished, credentialed people have any credibility. My impression of you is that you ARE narrow-minded and that you would prefer that people like me would not post ideas here on PeakOil.com that run contrary to the Kool-Aid you have previously drunk.

I imagine that someone who has reached the position of Chief Technology Officer of Intel Corporation is someone who is not willfully ignorant and someone who presents visionary ideas with honesty.

Here are some of the abstracts of presenters from The Oxford Global Catastrophic Risks Conference who will be presenting at The Singularity Summit 2008:

(Incidentally, I find that these people will often answer emails. So if you were to write an email with your objections to the Singularity idea, for example, you may just get a reply which is worth posting here. However, I bet you're more likely to content yourself with bitching at me about this news I've posted rather than contributing anything interesting or new.)

Some thoughtful person here on PeakOil.com might wonder to him/herself, "Why isn't the Global Catastrophic Risks Conference mentioning anything about Peak Oil or the expected terrible energy crunch which lies immediately ahead of us?".

This is certainly a question that has occurred to me. Has ASPO or Matthew Simmons ever applied for membership in the GCRC? Why not?

These people at GCRC have contact emails available. Write to them and find out why your ideas about the end of the world due by the end of the next decade are not being addressed head-on. Tell us what you learn from them. THAT would be interesting.

Global Catastrophic Risks Abstracts

GCRC wrote:
Sir Crispin Tickell
Humans: Past, Present and Future

In the history of life on Earth the human species is a latecomer. But the human impact on the Earth has slowly and then rapidly increased, most of all in the last 250 years, to what has been widely predicted as an unsustainable level within just a few generations.

The main factors are human population increase, degradation of land, consumption of resources, water pollution and supply, climate change, destruction of biodiversity and other species, the widening division between rich and poor, the risk of conflict, and the technological fix. Technology could hold the key to human survival or its destruction. Despite life on Earth being robust, human survival is not guaranteed. Technology may throw up some interesting options, but it is how we govern these options that will count.

There are solutions to most of the problems we have created, but we will have to radically change our thinking on global governance and the whole spectrum of international affairs.

Catastrophe, Social Collapse, and Human Extinction
Robin Hanson

Humans have slowly built more productive societies by slowly acquiring various kinds of capital, and by carefully matching them to each other. Because disruptions can disturb this careful matching, and discourage social coordination, large disruptions can cause a "social collapse," i.e., a reduction in productivity out of proportion to the disruption. For many types of disasters, severity seems to follow a power law distribution. For some of the types, such as wars and earthquakes, most of the expected harm is predicted to occur in extreme events, which kill most people on Earth. So if we are willing to worry about any war or earthquake, we should worry especially about extreme versions. If individuals varied little in their resistance to such disruptions, events a little stronger than extreme ones would eliminate humanity, and our only hope would be to prevent such events. If individuals vary a lot in their resistance, however, then it may pay to increase the variance in such resistance, such as by creating special sanctuaries from which the few remaining humans could rebuild society.
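Hanson's claim that most expected harm sits in the extreme tail can be illustrated with a toy simulation (a sketch under assumed parameters; the Pareto tail exponent and sample size are illustrative choices, not values from the talk):

```python
import random

# Draw disaster "severities" from a heavy-tailed power law and measure
# how much of the total harm the few worst events account for.
random.seed(1)                 # fixed seed for reproducibility
alpha = 1.1                    # assumed tail exponent (heavier tail as alpha -> 1)
severities = [random.paretovariate(alpha) for _ in range(100_000)]
total = sum(severities)
worst_10 = sorted(severities, reverse=True)[:10]
share = sum(worst_10) / total  # harm share of the 10 worst of 100,000 events
print(f"worst 10 events carry {share:.0%} of total harm")
```

With a tail this heavy, a tiny fraction of events carries a wildly disproportionate share of the harm (the 10 worst events are 0.01% of the sample but far more of the total), which is Hanson's argument for worrying especially about the extreme versions of wars and earthquakes.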

The Tragedy of the Uncommons
Jonathan Wiener

The classic "tragedy of the commons" phenomenon may plague many shared resources, but over time social institutions can learn from experience with resource depletion and pollution, and can develop effective remedies. By contrast, rare extreme catastrophic events -"tragedies of the uncommons" - pose a different and less tractable challenge. As societies solve the "commons" problems, new "uncommons" risks loom larger. Yet "uncommons" events do not present the kind of salient visible early warning signals that tend to galvanize popular politics to respond to "commons" problems. Can societies and their governance institutions devise ways to anticipate, prevent and survive such rare extreme catastrophic events?

Rationally Considering the End of the World
Eliezer Yudkowsky

The decision to think rationally is a preliminary challenge in any human endeavour. To land a spaceship on the Moon, you must solve problems of physics and engineering. But before you can even attempt such problems, you must think of the Moon as a goal to be attained, not a wondrous tale to be told; the physics must be a problem to be analyzed, not a mystery to be worshipped; the engineering must seem a challenge to creative ingenuity, rather than an invitation to trust to luck. The foundational requirement for humanity to successfully confront and resolve global catastrophic risks may be simply that we stay in a scientific/engineering frame of mind. I consider some common departures from this frame, and offer general principles for staying within it.

Disasters, Ecological Diversity, and the Future of Humanity
Christopher Wills

How quickly can the world’s biological systems recover their diversity in the aftermath of ecological catastrophes that cause widespread local and global extinctions of species? This question can be answered by examining the forces that maintain and increase ecological and genetic diversity at the present time. Here I look at two of these processes: those that increase human genetic diversity, and those that maintain and increase tropical forest diversity. I suggest that both have been aided by a particular type of natural selection known as negative frequency-dependent selection, and I conclude that this type of selection may aid in the recovery of diversity following catastrophes.
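Negative frequency-dependent selection (rare types gaining a fitness edge) can be sketched with a toy Wright-Fisher simulation. This is my own minimal illustration with arbitrary parameter choices, not a model taken from Wills's work:

```python
import random

random.seed(1)

# Two types, A and B; each type's fitness drops as it becomes common,
# so the rare type is favoured. Start with B nearly extinct and watch
# selection pull its frequency back toward an even mix.
N = 1000   # population size (assumed)
p = 0.01   # initial frequency of the rare type B
s = 0.5    # strength of frequency dependence (assumed)

for _ in range(200):
    w_b = 1 + s * (0.5 - p)        # B's fitness: high while B is rare
    w_a = 1 + s * (p - 0.5)        # A's fitness: low while A is common
    expected = p * w_b / (p * w_b + (1 - p) * w_a)
    # binomial resampling models genetic drift in a finite population
    p = sum(random.random() < expected for _ in range(N)) / N

print(f"frequency of the initially rare type after 200 generations: {p:.2f}")
```

The initially rare type recovers toward an even mix instead of vanishing, which is the mechanism the abstract credits with helping diversity recover after a catastrophe.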

Culture and the Credibility of Catastrophe
Steve Rayner

Prophesies of doom and the anticipation of cataclysmic events have been a perennial feature of human discourse in both oral and written traditions. Characteristically they involve a super- or extra-human force—gods, ancestors, or nature—which will intervene in human affairs. They also embody strong moral or ethical imperatives, either to reform current social behaviour so that catastrophe can be averted, or to establish a more ideal social order once the present dispensation has been swept away.
Carlhole
 

Re: The Singularity Summit 2008

Unread postby TWilliam » Sun 31 Aug 2008, 12:28:20

Carlhole wrote: Everyone can plainly see that the rate of technological progress is increasing as never before.


Can't lay my hands on it atm Carlhole, but I remember seeing an essay some time ago about this concept of accelerating 'technological progress', and one of the things I remember from it was a graph showing that the rate of innovation, i.e. the frequency of the development of truly new technologies, is actually decelerating. What we've been experiencing over the last century or so is not really an increasing rate of technological progress, but merely ongoing refinement of what we already have. Microchips, for example, are in essence just a refinement of vacuum tube arrays. This refinement process produces an appearance of 'progress', but in terms of genuine innovation, there's been less and less occurring...
"It means buckle your seatbelt, Dorothy, because Kansas? Is goin' bye-bye... "
TWilliam
Expert
 
Posts: 2591
Joined: Sun 28 Nov 2004, 04:00:00

Re: The Singularity Summit 2008

Unread postby EnergyUnlimited » Sun 31 Aug 2008, 13:47:44

Carlhole wrote: It's a very narrow-minded person who does not believe that accomplished, credentialed people have any credibility.

They do have some credibility, but commonly only in an extremely narrow area of knowledge.
E.g. they are often victims of another type of singularity, by knowing everything about nothing.

For example, I once attended a pharma conference where one of the speakers (a recognized academic from the area of neuroscience, nomen omen...) was found to believe that neurons communicate solely by electric impulses, and during question time it emerged that he was not aware that such a thing as a synapse exists at all.
He was not aware of the functions of neurotransmitters (like dopamine or serotonin), although he knew that they are there and "do something around neurons".
The audience was astounded.
You see, this poor guy was working in an extremely narrow area of neuroscience (he was investigating the workings of the sodium/potassium pump in impulse transmission) and was not aware of anything else.

Quote: My impression of you is that you ARE narrow-minded and that you would prefer that people like me would not post ideas here on PeakOil.com that run contrary to the Kool-Aid you have previously drunk.

I don't really mind whether you post or not.
However, if you *do* post some silly-willy stuff and I spot it, then I will comment on it, time permitting.

We already had:

- WTC conspiracy
- designed dieoff
- singularity silliness
etc.

However, you are still lagging behind Kylon. :-D

Quote: I imagine that someone who has reached the position of Chief Technology Officer of Intel Corporation is someone who is not willfully ignorant and someone who presents visionary ideas with honesty.

Again, he knows only an extremely tiny part of the physics related to the surrounding world.
On top of it, he knows next to nothing of the remaining physics, next to nothing of chemistry, next to nothing of biology, ecology, etc.

He also believes that gizmos of his design are the most important items in the Universe, and that people will carry on buying those gizmos and carry on investing in further, even more irrelevant gizmos, regardless of the fact that they cannot afford a doctor, a decent diet, bill payments, or fees for their children's education.
Quote: However, I bet you're more likely to content yourself with bitching at me about this news I've posted rather than contributing anything interesting or new.

As if you are contributing anything new...

We all know that there is some sort of singularity movement led by The Man Who Refused to Die (who consumes every day about a glass of various tablets and food-supplement extracts to help himself achieve that noble task :-D ), and it is reasonable to expect that they hold some meetings.
Quote: Some thoughtful person here on PeakOil.com might wonder to him/herself, "Why isn't the Global Catastrophic Risks Conference mentioning anything about Peak Oil or the expected terrible energy crunch which lies immediately ahead of us?"
And I bet, many do...

Peak oil is only one symptom of a far larger disease.

Taken separately, many symptoms of this disease could be addressed somehow, but unfortunately they are catching up with us in concerted fashion.
Quote: These people at GCRC have contact emails available. Write to them and find out why your ideas about the end of the world due by the end of the next decade are not being addressed head-on.
The world will not end (e.g. human extinction is unlikely in the extreme on that timescale), but the current paradigm of eternal economic growth will collapse and technological civilization will begin to decline.
Dieoff will also set in.
EnergyUnlimited
Light Sweet Crude

Re: The Singularity Summit 2008

Unread postby EnergyUnlimited » Sun 31 Aug 2008, 14:20:52

TWilliam wrote:
Carlhole wrote: Everyone can plainly see that the rate of technological progress is increasing as never before.

Can't lay my hands on it atm Carlhole, but I remember seeing an essay some time ago about this concept of accelerating 'technological progress', and one of the things I remember from it was a graph showing that the rate of innovation, i.e. the frequency of the development of truly new technologies, is actually decelerating. What we've been experiencing over the last century or so is not really an increasing rate of technological progress, but merely ongoing refinement of what we already have. Microchips, for example, are in essence just a refinement of vacuum tube arrays. This refinement process produces an appearance of 'progress', but in terms of genuine innovation, there's been less and less occurring...

Stewards of the Singularity will argue with you that it is the quantity, not the quality, of inventions that counts. :-D

Alternatively they will say that new inventions are so complex and specialized that you don't understand their importance, let alone the details, and that in fact this means the Singularity is already coming.
:-D :-D :-D

Well, perhaps it is, and it takes the form of an accelerating (exponential) loss of understanding of how everyday equipment works, regardless of one's level of education. :-D :-D :-D
EnergyUnlimited
Light Sweet Crude

Re: The Singularity Summit 2008

Unread postby comerbund » Sun 31 Aug 2008, 15:18:03

10 PRINT "Ask me any question."
20 INPUT A$
30 PRINT A$; " is too easy, next?"
40 PRINT "I am smarter than Einstein anyway."
50 PRINT "The answer to what is Pi is:"
60 I = 1
70 I = I + 1
80 PRINT (I * 7 / 22) + 12
90 IF I < 32147 THEN GOTO 70
100 GOTO 60


There is machine intelligence. You can make it complex, you can make it act smart, but no one will believe it until it becomes extremely complex, and even then it still won't be alive.

How do I know?

Because if they made an exact duplicate of your brain in New York and started it running, you would not know that they were doing it; therefore it is not you.

"Engrams stored" is just a lie; it won't be you.

What it will be is a lie to get you to believe it IS you, so they can put you down while you THINK you will be alive in a computer somewhere, when you won't be.

This whole topic is all about how big a shyster you can be.
comerbund
Wood
 
Posts: 23
Joined: Mon 18 Aug 2008, 03:00:00
