by Carlhole » Sun 31 Aug 2008, 00:33:46
gampy wrote: I am somewhat familiar with this idea of the "Singularity".
It's certainly an interesting premise. Shades of Terminator, notwithstanding.
I think some of the folk here are sceptical because they are thinking in terms of human intelligence.
Intelligence comes in various degrees and flavours, I would say. Some of it is obvious; a lot of it is not.
This singularity may resemble a hive or insect type of intelligence.
Or, more likely, something so alien and inconceivable that we cannot grasp it, because the human mind is just that: human. It perceives reality, and other intelligences, through the prism of its evolutionary and biological constraints.
I can see it happening, but I wonder if our high-energy society has enough time left to make it a reality. I wonder what the Singularity would think of peak oil, environmental degradation, and climate change.
Probably what Agent Smith thought: cancer. Humans are a cancer. A disease. Most likely terminal.
Time for some chemo and radiation.
Watch the YouTube Ray Kurzweil talk I posted a few posts back.
Also:
The Singularity:
The technological singularity is a theoretical future point of unprecedented technological progress, caused in part by the ability of machines to improve themselves using artificial intelligence.[1]
Statistician I. J. Good first wrote of an "intelligence explosion", suggesting that if machines could even slightly surpass human intellect, they could improve their own designs in ways unforeseen by their designers, and thus recursively augment themselves into far greater intelligences. The first such improvements might be small, but as the machine became more intelligent it would become better at becoming more intelligent, which could lead to an exponential and quite sudden growth in intelligence.
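Good's feedback loop can be sketched with a toy model (my own illustration, not from the article, and the proportional-gain assumption is purely hypothetical): if each round of self-redesign improves a machine's intelligence in proportion to its current intelligence, the result is exponential growth.

```python
# Toy model of Good's "intelligence explosion".
# Assumption (hypothetical, for illustration only): each redesign round
# adds improvement proportional to the machine's current intelligence.
def intelligence_explosion(start=1.0, gain=0.1, generations=50):
    """Return the intelligence level after each self-improvement round."""
    level = start
    history = [level]
    for _ in range(generations):
        level += gain * level  # a smarter machine makes bigger improvements
        history.append(level)
    return history

levels = intelligence_explosion()
# The early rounds barely move the needle, but growth compounds:
# after 50 rounds at 10% gain per round, intelligence has grown
# by a factor of (1.1)**50, i.e. over a hundredfold.
```

The point of the sketch is only the shape of the curve: small early gains, then sudden runaway growth, which is why Good called it an "explosion".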
Vernor Vinge later called this event "the Singularity" as an analogy between the breakdown of modern physics near a gravitational singularity and the drastic change in society he argues would occur following an intelligence explosion. In the 1980s, Vinge popularized the singularity in lectures, essays, and science fiction. More recently, some prominent technologists such as Bill Joy, founder of Sun Microsystems, voiced concern over the potential dangers of Vinge's singularity (Joy 2000). Following its introduction in Vinge's stories, particularly Marooned in Realtime and A Fire Upon the Deep, the singularity has also become a common plot element throughout science fiction.
Others, most prominently Ray Kurzweil, define the singularity as a period of extremely rapid technological progress. Kurzweil argues such an event is implied by a long-term pattern of accelerating change that generalizes Moore's Law to technologies predating the integrated circuit and which he argues will continue to other technologies not yet invented. Critics of Kurzweil's interpretation consider it an example of static analysis, citing particular failures of the predictions of Moore's Law.
Robin Hanson proposes that multiple "singularities" have occurred throughout history, dramatically affecting the growth rate of the economy. Like the agricultural and industrial revolutions of the past, the technological singularity would increase economic growth between 60 and 250 times. An innovation which allowed a replacement for virtually all human labor could trigger this singularity.