
Peak Computing: 2004?


In the bigger picture, Peak Computing is: (poll)

- Irrelevant -- no oil, no electricity, no computers. (5 votes)
- Irrelevant -- I'm already off-grid, off-net, and only make a trip into town to use the library computer, an original PC running MS-DOS. (0 votes)
- Irrelevant -- I already have all the computing power I'll ever need. (11 votes)
- One more nail in the coffin of civilization. (3 votes)
- Indicative of the greater problem of systems based on limitless growth. (10 votes)
- It ain't gonna happen -- don't worry, be happy! (8 votes)

Total votes: 37

Peak Computing: 2004?

Post by Bytesmiths » Tue 07 Dec 2004, 15:15:35

W. Wayt Gibbs, in Scientific American (Nov 2004, pp 96-101), wrote:
> It was never a question of whether, but only of when and why. When would microprocessor manufacturers be forced to slow the primary propulsive force of their industry... And would it be fundamental physics or simple economics that raised the barrier to further scaling? The answers are: in 2004, and for both reasons.


Gibbs goes on to detail the halt to increasing speed in computing, which architects are hoping to supplant with increased parallelism.

But unlike Peak Oil, demand also seems to be flagging:
> Today's computers are more than fast enough... Demand for speedier machines has already begun to flag.
:::: Jan Steinman, Communication Steward, EcoReality, a forming sustainable community. Be the change! ::::

Post by johnmarkos » Tue 07 Dec 2004, 15:55:00

Peak computing by what measurement? The industry tends to measure in terms of GHz per chip, which does seem to be flattening out. However, in terms of GHz/US$ or GHz/cubic foot, we're still advancing. What about GHz/kilowatt?

In terms of P.O., we would be most concerned with the last one.
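
Just to make those measures concrete, a toy calculation (the chip figures are invented for illustration only, not real specs):

Code:
# Invented figures, purely for illustration -- not real chip specs.
chips = {
    "chip A": {"clock_ghz": 3.4, "price_usd": 400.0, "volume_ft3": 0.02, "power_w": 100.0},
    "chip B": {"clock_ghz": 2.0, "price_usd": 180.0, "volume_ft3": 0.02, "power_w": 60.0},
}

for name, c in chips.items():
    per_dollar = c["clock_ghz"] / c["price_usd"]
    per_cubic_foot = c["clock_ghz"] / c["volume_ft3"]
    per_kilowatt = c["clock_ghz"] / (c["power_w"] / 1000.0)
    print("%s: %.4f GHz/$, %.0f GHz/cubic foot, %.0f GHz/kW"
          % (name, per_dollar, per_cubic_foot, per_kilowatt))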

Even when processor speed was advancing happily along with Moore's Law, things like disk I/O, RAM, and hard drive space limited the speed of a home or office computer. By far the greatest limiter of speed was software bloat. It took less time in seconds to boot up my Terak 8510/a from a floppy than it takes to boot my laptop running GNU/Linux. Yes, GNU/Linux is guilty of the bloat too -- not just Microsoft. But to their credit, free/open source operating systems allow programmers to write smaller, more efficient versions.

Post by Bytesmiths » Tue 07 Dec 2004, 16:17:36

johnmarkos wrote:
> Peak computing by what measurement? The industry tends to measure in terms of GHz per chip, which does seem to be flattening out. However, in terms of GHz/US$ or GHz/cubic foot, we're still advancing. What about GHz/kilowatt?
I'd suggest you read the article, which addresses these issues.

In particular, "GHz per watt" is a square-law metric that has been in decline ever since chips started getting faster: double the speed and you roughly quadruple the power, less some small difference due to increased efficiency.
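
A back-of-the-envelope sketch of that square law, using the standard CMOS dynamic-power relation (power is roughly activity x capacitance x voltage squared x frequency). The numbers are invented for illustration, and the assumption that the core voltage must rise about 40% to sustain a doubled clock is a simplification, not a spec of any real chip:

Code:
def dynamic_power(c_eff, volts, hertz, activity=1.0):
    # Approximate CMOS dynamic power: P ~ activity * C * V^2 * f
    return activity * c_eff * volts ** 2 * hertz

# Invented baseline: a 2 GHz core at 1.2 V with some effective capacitance.
C = 3e-8
p_base = dynamic_power(C, 1.2, 2e9)

# Double the clock, and assume the voltage must rise ~40% to keep up.
p_fast = dynamic_power(C, 1.2 * 1.4, 4e9)

print("baseline: %.0f W, doubled clock: %.0f W (%.1fx the power for 2x the speed)"
      % (p_base, p_fast, p_fast / p_base))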

As to having all the computing power you'll ever need, I guess that leaves out stuff like real-time holographic video feeds and such. I'm sure my great-grandparents had all the computing power they ever needed as well!

Post by johnmarkos » Tue 07 Dec 2004, 16:32:48

Bytesmiths wrote:
> As to having all the computing power you'll ever need, I guess that leaves out stuff like real-time holographic video feeds and such. I'm sure my great-grandparents had all the computing power they ever needed as well!


The computing measurement I'm most concerned with is interesting problems/time. By that measurement, computing topped out at a completely satisfactory level (no improvement needed) in 1981, when I started teaching myself Pascal on the aforementioned Terak 8510/a. The measure hasn't budged from its 1981 number but I don't mind.

Post by Bytesmiths » Tue 07 Dec 2004, 16:47:58

johnmarkos wrote:
> The computing measurement I'm most concerned with is interesting problems/time. By that measurement, computing topped out at a completely satisfactory level (no improvement needed) in 1981, when I started teaching myself Pascal on the aforementioned Terak 8510/a.
Then you are blessed with simple needs, indeed!

In 1981, there was no Internet. (Although I used the predecessor, called the Arpanet, at that time.) Networking of any kind was limited to 56kb/s synchronous connections, available only on minicomputers (or larger) costing hundreds of thousands of dollars and more.

So by your metric, you shouldn't even be experiencing this forum!

I would submit that you have benefitted from the growth of computing power to a similar extent that you have benefitted from the growth of cheap energy from fossil fuel -- and that you are just as much in denial of that fact as the person who drives their SUV four blocks to the 7-11 to buy one loaf of bread is in denial of Peak Oil.

Post by johnmarkos » Tue 07 Dec 2004, 16:57:57

Bytesmiths wrote:
> I would submit that you have benefitted from the growth of computing power to a similar extent that you have benefitted from the growth of cheap energy from fossil fuel -- and that you are just as much in denial of that fact as the person who drives their SUV four blocks to the 7-11 to buy one loaf of bread is in denial of Peak Oil.


Yes, I agree that I'm in denial. :)

No, I'm not as much in denial as that guy.

Post by johnmarkos » Tue 07 Dec 2004, 17:09:20

Bytesmiths wrote:
> In 1981, there was no Internet. (Although I used the predecessor, called the Arpanet, at that time.) Networking of any kind was limited to 56kb/s synchronous connections, available only on minicomputers (or larger) costing hundreds of thousands of dollars and more.
>
> So by your metric, you shouldn't even be experiencing this forum!


So if the development of computing power (in terms of processor speed) had ground to a halt in 1981, would the Internet have been impossible? Judging from the fact that the Internet *was* possible with the machines of a few years later, I tend to think it could have been built even with that limitation. Perhaps we'd be having this discussion on USENET instead of on the Web.

Post by Jack » Tue 07 Dec 2004, 17:32:30

There's a semi-recent paper to the same effect at http://www.sea.uni-linz.ac.at/conferenc ... sion_e.pdf

I suspect that we need to make advances in parallel processing first; transforming applications to run in parallel takes careful coding, and moving away from the present uniprocessor model will be substantial work. The software exists -- MPI, for example -- but converting existing uniprocessor applications is a serious undertaking.
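
As a toy illustration of the kind of conversion I mean, here is the same work written as a serial loop and as a parallel map. Python's multiprocessing module stands in for the real thing here; actual applications need far more care with data layout, communication, and load balancing than this shows:

Code:
from multiprocessing import Pool

def work(n):
    # Stand-in for a CPU-bound chunk of an application.
    return sum(i * i for i in range(n))

inputs = [200000] * 8

def serial():
    return [work(n) for n in inputs]

def parallel(workers=4):
    with Pool(workers) as pool:
        return pool.map(work, inputs)

if __name__ == "__main__":
    assert serial() == parallel()
    print("serial and parallel versions agree")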

There's also the question of what applications really need that additional power. Other than the gaming sector, most of us do a little surfing (with bandwidth being the limiting factor), some word processing, and some things with spreadsheets. I realize that Bytesmiths has more demanding applications; graphics have always been resource intensive. And yet, I'm none too sure that the current processors are the real bottleneck, even there. Most of us use chips with relatively small cache sizes - certainly nothing like the Itanium 2 with 9MB of L3 cache!

Our motherboards tend to be slow relative to the speed of the chip, we tend to have pathetic amounts of RAM (anything less than a gigabyte), and hard drives are a factor of a million slower than RAM.

So...before I'd worry too much about chip speed, I think I'd look at having RAIDs with striping, a faster board configured for a pair of Itanium 2's, and 10 gigs or so of RAM. 8O

Post by Bytesmiths » Tue 07 Dec 2004, 18:35:42

johnmarkos wrote:
> ...if the development of computing power (in terms of processor speed) had ground to a halt in 1981, would the Internet have been impossible?
"Impossible" is a strong word. Do you ever download music or movies? Do you like websites with pictures? Certainly it would be a very different place, with state-of-the-art networking being 300 baud modems!

"Processor speed" is not merely what you have sitting on your desk. It's the speed of your modem. It's the ability to play movies. It's computerized sound that is more complex than a simple beep.

The corollary of processing speed is chip size. In 1981, VCRs were the size of a suitcase and cost many thousands of dollars. Today, you have more processing power in your microwave oven than you had in an entire desktop computer in 1981.

johnmarkos wrote:
> Perhaps we'd be having this discussion on USENET instead of on the Web.
And if electricity largely goes away, we could have this conversation using ground-up and flattened wood fibers and rubbing them with charcoal. Reductio ad absurdum.

Things are not going back to 1981, but neither are they going to progress at the rate to which we've become accustomed.

Jack wrote:
> I suspect that we need to make advances in parallel processing first; transforming applications to run in parallel takes careful coding...
... and it may well be impossible in some circumstances. You can't always get nine women together to make a baby in a month!
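
One standard way to put a number on that limit is Amdahl's law: whatever fraction of a job must stay serial caps the speedup, no matter how many processors you throw at it. A minimal sketch (the 90% figure is just an example):

Code:
def amdahl_speedup(parallel_fraction, workers):
    # Upper bound on speedup when only part of a job can run in parallel.
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / workers)

# Even with 90% of the work parallelizable, the speedup flattens out fast.
for n in (2, 4, 16, 256):
    print("%3d workers: %.2fx" % (n, amdahl_speedup(0.9, n)))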

Jack wrote:
> I'm none too sure that the current processors are the real bottleneck...
Processors in desktop computers are merely the tip of the iceberg, the canary in the coal mine. The continual drop in price and increase in function of consumer goods is driven by growth in processor speed (or, equivalently, reduction in processor size). If that market "matures" (in the sense of Geoffrey Moore), limiting consumer choice to color and style rather than functionality, people might just figure out that they have enough "stuff" -- and then what will happen to our poor economy? :-)


Jack wrote:
> I think I'd look... for a pair of Itanium 2's, and 10 gigs or so of RAM.

Sorry, I don't do monopolies (Intel, Microsoft, fossil-fuel), and besides, those chips don't address more than 4GB directly. (I've got 8GB in my G5 dual right now, but as you alluded, I'm a bit of a technology snob. :-)

Post by bart » Tue 07 Dec 2004, 19:03:33

The effect of too much computer power is similar to the effect of too much energy.

Designs become bloated and complex. Bells and whistles turn a compact program into a multi-Megabyte bug-filled monster. Instead of thinking, people turn to expensive high-tech solutions.

During the first part of my years in high tech, the additional computing power was a boon. It enabled more functionality, better interfaces (my specialty), and more modular programming (didn't have to count every cycle). At a certain point, though, the increases just led to waste. Management seemed to lose about 30 IQ points. It was a dramatic example of Tainter's theories about societies inevitably becoming complex and dysfunctional.

I think good design requires constraints, whether the design is for a web interface, a house, or a civilization.

Post by 0mar » Tue 07 Dec 2004, 19:26:04

A good example here is the design of the chip, not the megahertz.

AMD processors have long been "slower" in GHz than Intel chips, but their performance shows no indication of it. In fact, AMD's new chips run at only 2.0-2.4 GHz, yet they routinely outperform Intel's 3.4-3.6 GHz parts.

It's not how many GHz you have, but how you use them :). The GHz "barrier" was really only felt by Intel, because the other companies saw that such a venture was untenable. AMD has long been championing dual-core chips and new sorts of logic. Added to that, the x86 instruction set is just about out of steam, and a new instruction set is due.
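
Put roughly in numbers, useful throughput is about clock rate times average instructions per cycle (IPC). The IPC figures below are invented purely for illustration, not benchmarks of any real chips:

Code:
def instructions_per_second(clock_ghz, ipc):
    # Rough model: throughput ~ clock rate * average instructions per cycle.
    return clock_ghz * 1e9 * ipc

# Invented numbers: a lower-clocked chip with better IPC can still win.
chip_a = instructions_per_second(3.4, 0.8)   # high clock, long pipeline
chip_b = instructions_per_second(2.2, 1.4)   # lower clock, more work per cycle

print("A: %.2e inst/s, B: %.2e inst/s, B is %.2fx A"
      % (chip_a, chip_b, chip_b / chip_a))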

Post by johnmarkos » Tue 07 Dec 2004, 19:32:54

Bytesmiths wrote:
> "Impossible" is a strong word. Do you ever download music or movies? Do you like websites with pictures?

I do, although not as much as many others. I tend to be a fairly text-oriented person so I'm not representative of the average user. Because graphics and sound can often be as much of a distraction as they are a benefit, I boot in text-only mode when I really want to concentrate.
Bytesmiths wrote:
> Certainly it would be a very different place, with state-of-the-art networking being 300 baud modems!

Agreed: 300 baud is too slow.
Bytesmiths wrote:
> "Processor speed" is not merely what you have sitting on your desk. It's the speed of your modem. It's the ability to play movies. It's computerized sound that is more complex than a simple beep.

Those things are nice. Nonetheless, I wish chip and hardware makers were focused on questions like, "How do we make a 1 GHz computer that uses only 10 watts?" and, "How do we use less energy to manufacture laptops?"

And I wish software and OS makers would focus on making better use of the power they have rather than always demanding more for bloated programs... but now I'm dreaming.

Energy efficiency & PC's

Post by Dvanharn » Tue 07 Dec 2004, 20:11:18

My two-year-old AMD Athlon (700 MHz) has three fans - one for the HUGE processor assembly, one for the power supply, and one for the case. I changed from a very noisy 10 GB IBM SCSI hard disk to a silent Seagate 40 GB Barracuda single-platter drive, but the system is still noisy as hell - it takes three fans to carry away the waste heat of the newer machines. I hope to get a "silent PC" next year, but having been out of the LAN/WAN world for over three years, I have lost touch with the details of the technology.

What I really want is a super-quiet, energy-efficient PC with an LCD monitor, a moderately fast (1.3-1.7 GHz) AMD processor, and a reasonably large cache.

A currently active network guy stopped in the store to look at solar toys, and we talked computers a bit. He suggested that I get a used IBM laptop with a 15" screen. A lot of 4-5 year old IBMs like the ones we used at Chevron's HQ are still in use and actively bought and sold.

I wonder what percentage of energy in developed countries is used for computers, and how much will be saved as lead-laden CRT monitors are hauled to landfills and replaced with LCDs? (Certainly not enough to stave off peak oil!)

Dave

Post by pilferage » Tue 07 Dec 2004, 20:31:09

It's hitting the wall because Intel just realized that at about 90nm you have trouble with electrons tunneling where they shouldn't, etc...
http://www.abxzone.com/forums/showthrea ... ge=2&pp=15
AMD is realizing the benefits of doing more operations per cycle, since their newer processors, running at roughly 1 GHz less, are beating Intel's best offerings:
http://www.tomshardware.com/cpu/2004031 ... 53-31.html
I've heard the PPC architecture can complete more instructions per cycle than AMD's stuff... so it's not just about speed, it's speed and operations per cycle. :)

Post by Carmiac » Tue 07 Dec 2004, 20:31:32

It sounds like what you really want is one of the new flatpanel iMacs. Check them out.

Post by holmes » Tue 07 Dec 2004, 20:54:27

My house will be 100% independent in two years -- solar and wind. It will be able to run a computer. Computers are one of the most important tools of the future; without them we will hit the dark ages. People need to build knowledge bases.

Post by savethehumans » Wed 08 Dec 2004, 01:56:04

I sure will miss posting on these boards. Also the wonderful access to the world's information. ALSO, e-mail--it's soooooooo quick and efficient!

That's why we all need to make the best use of them while we still have them!

Post by khebab » Wed 08 Dec 2004, 12:14:24

The current technology used to make chips is actually peaking, because the size of the transistor patterns is becoming too small relative to the wavelength used in the lithographic process. Much hope was placed on extreme ultraviolet lithography, but it was recently abandoned as too expensive and too complex. Other approaches have been explored, such as using water to obtain smaller patterns (because of the change in refractive index). The size of the transistors is not the only issue:
- heat is becoming tremendous and is increasing with clock speed (an Intel P4 radiates about as much heat as an iron on the cotton setting!);
- the circuitry is becoming the bottleneck (copper is not enough);
- quantum effects (e.g. tunneling) are becoming predominant as transistors shrink.

The current approach is to improve the architecture (Hyperthreading, multi-processor), but multi-tasking/parallelism, while theoretically a nice feature, is practically very difficult to implement and very application dependent. Other improvements are possible in the near future: spintronics, optoelectronics and so on.

Another aspect is that CPU speed is not the selling factor it was a decade ago. Most companies are actually satisfied with their computers and do not necessarily want to upgrade all their infrastructure for a less-than-certain profit.

Post by tmazanec1 » Wed 08 Dec 2004, 12:46:39

Well, my Windows XP SP2 (my current system) seems a little more stable than the earlier ones I used.
I agree, though... MIPS per milliwatt is the way to go, especially with laptops (and battery life) becoming more prominent. And when your electric bills start going up...

Post by Trab » Wed 08 Dec 2004, 15:03:35

Most home users of PCs have way more horsepower than they need for most tasks. Other than gaming or high-end audio/video editing, there are few tasks for which the average person would need the latest and greatest equipment.

One thing I have thought about off and on over the last several months is the concept of low-power computing. Most newer PCs have a 400W or higher power supply in them, which is overkill for the most common tasks (email, web, IM, etc.). Finding the right mix of components to give the average PC user a good computing experience while cutting unnecessary power consumption is a worthy task.

I'm not an expert in the area, but by using low-power chips and reducing the number of moving parts in a PC, it should be possible to reduce the amount of power needed by at least 50%, if not more.
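
A back-of-the-envelope version of what I mean; every wattage below is a rough guess for illustration only, not a measurement:

Code:
# Rough guesses for illustration only, not measurements.
typical_pc = {"CPU": 90, "graphics card": 60, "hard drive": 10,
              "optical drive": 5, "fans": 10, "motherboard and RAM": 35}
low_power_pc = {"low-power CPU": 25, "onboard graphics": 5,
                "notebook drive": 3, "fan": 3, "motherboard and RAM": 25}

def total_watts(build):
    return sum(build.values())

saving = 1.0 - float(total_watts(low_power_pc)) / total_watts(typical_pc)
print("typical: %d W, low-power: %d W, saving about %.0f%%"
      % (total_watts(typical_pc), total_watts(low_power_pc), saving * 100))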

Monitors are another story. :)
