Is the technological singularity the savior from peak oil?
I found this site back in 2004 when oil prices hit... <gasp> $40 a barrel for the first time in my life. Every so often, when gasoline prices are high, I remember to check back on this site, and the concept of peak oil has never really faded from my mind. Today, as we hover near $80 a barrel, I revisit the site with a new proposal for the darkness that is approaching.
We will all agree that we need to find a solution (or solutions) to peak oil well before it happens, while we still have somewhat affordable and easily accessible energy with which to do the work. Spending even an hour researching the topic can prove heartbreaking: the quest to find replacements for fossil fuels is arguably one of the most difficult challenges humanity will ever face.
Could we beat this dark future with some help, possibly... artificial in nature?
I am an engineer, an electrical one at that. We are all familiar with Moore's law - the observation that transistor density doubles roughly every 18-24 months. This increase in transistor density usually reveals itself to the end consumer as a "faster" computer, or one capable of performing more tasks simultaneously.
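The compounding behind Moore's law is easy to underestimate, so here is a toy calculation (the 18-month doubling period and the starting density of 1.0 are illustrative assumptions, not real process data):

```python
# Toy illustration of Moore's law: relative density after repeated doublings.
# All figures here are illustrative assumptions, not real process data.

def density_after(years, doubling_period_years=1.5, start_density=1.0):
    """Relative transistor density after `years`, given a doubling period."""
    return start_density * 2 ** (years / doubling_period_years)

# A decade at an 18-month doubling period is about 6.7 doublings,
# i.e. roughly a 100x increase in density.
print(round(density_after(10)))  # prints 102
```

This is why even "slow" exponential progress overwhelms linear planning horizons: the same rule applied over two decades yields a factor of roughly 10,000, not 200.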
There are many in my field who believe this increase in performance will eventually trigger the event known as the “technological singularity”, or the Singularity, as I will refer to it from now on.
What is the Singularity? It is best described by statistician I.J. Good in 1965:
“Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then be an ‘intelligence explosion,’ and the intelligence of man would be left far behind. Thus the first ultraintelligent machine [the Singularity] is the last invention that man need ever make.”
In movies, books, and science fiction generally, there is little doubt that you have encountered the concept of the Singularity in one way or another. For example, consider the Terminator films. In them, in the not-so-distant future, humanity has created a computer network called “SKYNET” which eventually becomes “self-aware,” sees humanity as a threat, and uses its military control to try to wipe humans out. A similar scenario plays out in “The Matrix” films, where a highly advanced artificial intelligence has enslaved the human race.
For the purposes of this discussion, let us assume that any Singularity we create, and any subsequent machines, will be non-hostile to the human race. Could we use the Singularity to rid us of our dependence on fossil fuels? A better question might be: would it make more sense to invest our research effort and the remaining “cheap” fossil-fuel energy in developing the Singularity, rather than in alternatives, with the end goal of having “it” solve this problem (and many, many others) for us? Many engineers and scientists believe the Singularity is likely to happen before 2050 at our current rate of technological progress. Do we have enough fossil fuels to maintain the world's current economic growth and progress until the Singularity arrives? The answer is debatable, but probably not.
Another positive example of what the Singularity could make possible is this:
If humans ever discover a cure for cancer, that discovery will ultimately be traceable to the rise of human intelligence, so it is not absurd to ask whether a superintelligence could deliver a cancer cure in short order. If anything, creating superintelligence only for the sake of curing cancer would be swatting a fly with a sledgehammer. In that sense it is probably unreasonable to visualize a significantly smarter-than-human intelligence as wearing a white lab coat and working at an ordinary medical institute doing the same kind of research we do, only better, in order to solve cancer specifically as a problem. For example, cancer can be seen as a special case of the more general problem "The cells in the human body are not externally programmable." This general problem is very hard from our viewpoint – it requires full-scale nanotechnology to solve the general case – but if the general problem can be solved it simultaneously solves cancer, spinal paralysis, regeneration of damaged organs, obesity, many aspects of aging, and so on. Or perhaps the real problem is that the human body is made out of cells, or that the human mind is implemented atop a specific chunk of vulnerable brain – though calling these "problems" raises philosophical issues not discussed here.
Singling out "cancer" as the problem is part of our culture's particular outlook and technological level. But if cancer or any generalization of "cancer" is solved soon after the rise of smarter-than-human intelligence, then it makes sense to regard the quest for the Singularity as a continuation by other means of the quest to cure cancer. The same could be said of ending world hunger, curing Alzheimer's disease, or placing on a voluntary basis many things which at least some people would regard as undesirable: illness, destructive aging, human stupidity, short lifespans. Maybe death itself will turn out to be curable, though that would depend on whether the laws of physics permit true immortality. At the very least, the citizens of a post-Singularity civilization should have an enormously higher standard of living and enormously longer lifespans than we see today.
How can the Singularity happen?
One idea that is often discussed along with the Singularity is the proposal that, in human history up until now, it has taken less and less time for major changes to occur. Life first arose around three and a half billion years ago; it was only eight hundred and fifty million years ago that multi-celled life arose; only sixty-five million years since the dinosaurs died out; only five million years since the hominid family split off within the primate order; and less than a hundred thousand years since the rise of Homo sapiens in its modern form. Agriculture was invented ten thousand years ago; Socrates lived two and a half thousand years ago; the printing press was invented five hundred years ago; the computer was invented around sixty years ago. You can't set a speed limit on the future by looking at the pace of past changes, even if it sounds reasonable at the time; history shows that this method produces very poor predictions. From an evolutionary perspective it is absurd to expect major changes to happen in a handful of centuries, but today's changes occur on a cultural timescale, which bypasses evolution's speed limits. We should be wary of confident predictions that transhumanity will still be limited by the need to seek venture capital from humans, or that Artificial Intelligences will be slowed to the rate of their human assistants.
This concept can be visualized by compiling a list of key events (or paradigm shifts) in human history and plotting them on a logarithmic graph – the result shows an exponential trend, as in Figure 1 below (courtesy of Wikipedia).
Furthermore, this exponential trend continues if we examine computing technologies, from early electromechanical components all the way to today's modern transistor. Figure 2 plots calculations per second per $1,000 versus time (courtesy of Wikipedia).
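The reason such a plot looks like a straight line is a property of logarithms: exponential growth becomes linear on a log scale. A minimal sketch, using a made-up 10x-per-decade growth rate purely for illustration (the real Figure 2 data is noisier):

```python
import math

# If calculations-per-second-per-$1000 grows exponentially (here an assumed
# 10x per decade, purely for illustration), then its logarithm grows
# linearly - which is why a log-scale plot of it looks like a straight line.
years = [1960, 1970, 1980, 1990, 2000]
perf = [10 ** ((y - 1960) / 10.0) for y in years]   # hypothetical series

log_perf = [math.log10(p) for p in perf]
steps = [b - a for a, b in zip(log_perf, log_perf[1:])]
print(steps)  # every decade adds the same constant amount on the log scale
```

Constant steps on the log axis mean a constant *multiplicative* growth factor per decade, which is exactly the signature of an exponential trend.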
I believe it is a safe bet that the Singularity will occur – IF our fossil-fuel supplies can keep our world and economy running just a little while longer.
My proposal is this: maybe we should drop all the effort and fossil energy we are pouring into somewhat unrealistic “alternative” energy sources that we, as humans, can conceive of, and instead work, with what we have left, toward achieving the Singularity – our inevitable future anyway.
If there is interest in debating this, please contribute; I would enjoy discussing the subject further. If you are interested, I invite you to read more at:
http://en.wikipedia.org/wiki/Technological_singularity
http://www.singinst.org/overview/whywor ... ingularity