CPUs: The Next Generation

CaryMG's picture
Offline
Last seen: 11 years 4 months ago
Joined: Dec 20 2003 - 10:38
Posts: 161
CPUs: The Next Generation

[color=darkblue]I read somewhere that CPUs are about to hit a wall.

Because of "Moore's Law" -- CPUs doubling in power every 18 months -- if CPUs have get any smaller, quantum tunneling problems will kick in.
If they stay the same size, but with just more transistors put on them, they'll need liquid nitrogen to keep from overheating.

My questions....
1] Why can't they just be made bigger?
Would an extra square centimeter/millimeter or 2 be that big of a deal?

2] I always read how Gallium Arsenide was supposed to replace silicon. What happened?
3] What's next? BioCPUs? HoloCPUs? NanoCPUs?

Later![/color]
:coolmac: :coolmac: :coolmac:

moros's picture
Offline
Last seen: 12 years 3 months ago
Joined: Jan 21 2005 - 08:16
Posts: 65
Answer to 1...

They have gotten bigger. Line a 6502 or an 8088 up against a G3 and you'll see what I mean.

Then again, I s'pose it could just be extra connectors for things like 32/64-bit address, data, and control buses. Does anyone know if this is the case?

Joel

moosemanmoo's picture
Offline
Last seen: 9 years 3 months ago
Joined: Aug 17 2004 - 15:24
Posts: 686
Current data suggests that CP

Current data suggests that CPUs made of silicon could go down to around 35nm and still work reasonably well. That's only a few years away, though. The problem with making a bigger CPU is that there are a lot of timing and latency problems involved with an enormous die: copper wiring only works well over so much distance. As for gallium arsenide, I've heard that it was extremely expensive to work with. That, coupled with the fact that it's poisonous in the first place, doesn't give it a lot of momentum.
Two proposed solutions to the upcoming silicon wall are doped artificial diamond transistors and carbon nanotubes. Artificially grown diamonds doped with certain chemicals can be made into transistors with a much higher heat tolerance and a smaller theoretical size than silicon allows. This method is possible with the technology we have today, and copper wiring would most likely be replaced with nanotube wiring. Nanotube transistors can be made as small as nanotubes can be made and coupled together with the required layers. The first transistor-like nanotube construction was made only recently, and the technology just isn't here yet for a mass-produced product. The way that CPUs are designed is changing, and in the future parallelism and efficiency will be extremely important.
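To put a rough number on the die-size point, here's a quick back-of-the-envelope sketch in Python. The 0.5c propagation speed is just an assumption for illustration; real copper interconnect, with its RC delays, is a good deal slower than this.

[code]
# Rough back-of-the-envelope: how far can an on-chip signal travel in one clock cycle?
# ASSUMPTION: signals move at about half the speed of light in vacuum. Real
# RC-limited copper wiring is considerably slower, so treat these as upper bounds.

C = 3.0e8          # speed of light in vacuum, metres per second
PROPAGATION = 0.5  # assumed fraction of c for an on-chip signal (illustrative)

for clock_ghz in (1.0, 3.0, 10.0):
    cycle_time = 1.0 / (clock_ghz * 1e9)           # seconds per clock cycle
    reach_mm = C * PROPAGATION * cycle_time * 1e3  # millimetres covered per cycle
    print(f"{clock_ghz:4.1f} GHz -> roughly {reach_mm:.0f} mm per cycle")
[/code]

Even under that optimistic assumption you only get a few tens of millimetres per cycle at multi-GHz clocks, so making the die much bigger means signals simply can't get across it in time.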

Offline
Last seen: 2 months 3 weeks ago
Joined: Aug 22 2005 - 10:32
Posts: 404
Cell processors

IBM is developing, and has a working prototype of, the "Cell" processor. That's what's going into the PlayStation 3, and it runs at speeds of up to around 4.2 GHz right now.

Jon
Jon's picture
Offline
Last seen: 12 years 10 months ago
Joined: Dec 20 2003 - 10:38
Posts: 2804
And don't fall into the Intel

And don't fall into the Intel marketing trap of MHz-MHz-MHz. The Pentium M can perform as well as or better than a P4 running at a much higher clock rate, while using about 1/5 the power. Efficient CPUs were mostly ignored in favor of high clock rates that could be used as marketing tools. Intel learned a hard lesson with the NetBurst architecture and is now backtracking and going down the road it should have been on all along. The P4 never looked that great to me, and now I'm being shown that my gut feelings were right. Intel missed the 4GHz mark and has abandoned the P4 line. It had been projected to go to 10GHz, remember? The main problem is: why? Who needs 10GHz when you have to lower the efficiency of the CPU just to get it to go that fast? "What's GHz got to do, got to do with it?"
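Rough arithmetic behind that: performance is roughly instructions-per-cycle (IPC) times clock rate. Here's a toy sketch in Python; the IPC figures are invented purely for illustration, not measured numbers for any real chip.

[code]
# Toy model: performance ~ IPC x clock rate.
# ASSUMPTION: the IPC values below are made up to illustrate the idea only.

chips = {
    "long-pipeline, high-clock chip (P4-style)": {"ipc": 1.0, "clock_ghz": 3.4},
    "efficient, shorter-pipeline chip (Pentium M-style)": {"ipc": 2.0, "clock_ghz": 1.8},
}

for name, spec in chips.items():
    # billions of instructions per second in this simplified model
    gips = spec["ipc"] * spec["clock_ghz"]
    print(f"{name}: ~{gips:.1f} billion instructions/sec")
[/code]

With those made-up numbers the 1.8 GHz part edges out the 3.4 GHz one, which is the whole point: the clock rate on its own tells you very little.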

Dr. Webster's picture
Offline
Last seen: 21 hours 23 min ago
Joined: Dec 19 2003 - 17:34
Posts: 1747
Re: And don't fall into the Intel

[quote]It had been projected to go to 10GHz, remember? The main problem is: why? Who needs 10GHz when you have to lower the efficiency of the CPU just to get it to go that fast? "What's GHz got to do, got to do with it?"[/quote]

There was an article on Slashdot a few months back about some dude who got a P4 overclocked to something like 7GHz using liquid nitrogen. It was a big story because he didn't just overclock it, jump into the CMOS setup, and snap a pic of it reading "7GHz"; he was actually able to boot into Windows and run benchmarks.

CaryMG's picture
Offline
Last seen: 11 years 4 months ago
Joined: Dec 20 2003 - 10:38
Posts: 161
Thanks, moosemanmoo !!

[quote]Current data suggests that CPUs made of silicon could go down to around 35nm and still work reasonably well. That's only a few years away, though. The problem with making a bigger CPU is that there are a lot of timing and latency problems involved with an enormous die: copper wiring only works well over so much distance. As for gallium arsenide, I've heard that it was extremely expensive to work with. That, coupled with the fact that it's poisonous in the first place, doesn't give it a lot of momentum.
Two proposed solutions to the upcoming silicon wall are doped artificial diamond transistors and carbon nanotubes. Artificially grown diamonds doped with certain chemicals can be made into transistors with a much higher heat tolerance and a smaller theoretical size than silicon allows. This method is possible with the technology we have today, and copper wiring would most likely be replaced with nanotube wiring. Nanotube transistors can be made as small as nanotubes can be made and coupled together with the required layers. The first transistor-like nanotube construction was made only recently, and the technology just isn't here yet for a mass-produced product. The way that CPUs are designed is changing, and in the future parallelism and efficiency will be extremely important.[/quote]

[color=blue]That answers my question!

Thanks again!!

Later![/color]
:coolmac: :coolmac: :coolmac:

Offline
Last seen: 3 months 1 week ago
Joined: Jan 19 2005 - 23:30
Posts: 700
booting into windows at 7ghz

Booting into Windows at 7GHz is nuts... but I bet some dual-core AMD stuff could do what that one did. Imagine if you overclocked that!

CaryMG's picture
Offline
Last seen: 11 years 4 months ago
Joined: Dec 20 2003 - 10:38
Posts: 161
Optical CPU Update

Here's what I thought I knew about optical CPUs > "Optical CPUs"

:coolmac: :coolmac: :coolmac:

The Czar's picture
Offline
Last seen: 13 years 2 weeks ago
Joined: Dec 20 2003 - 10:38
Posts: 287
Re: And don't fall into the Intel

[quote]And don't fall into the Intel marketing trap of MHz-MHz-MHz. The Pentium M can perform as well as or better than a P4 running at a much higher clock rate, while using about 1/5 the power. Efficient CPUs were mostly ignored in favor of high clock rates that could be used as marketing tools. Intel learned a hard lesson with the NetBurst architecture and is now backtracking and going down the road it should have been on all along. The P4 never looked that great to me, and now I'm being shown that my gut feelings were right. Intel missed the 4GHz mark and has abandoned the P4 line. It had been projected to go to 10GHz, remember? The main problem is: why? Who needs 10GHz when you have to lower the efficiency of the CPU just to get it to go that fast? "What's GHz got to do, got to do with it?"[/quote]

This is similar to what happened in the automotive industry over the last 50 years. The big three American auto manufacturers were hell-bent on making their cars go faster. Their answer? Increase engine displacement and put bigger carburetors on those engines. This remained the status quo until the 1970s, when the Japanese and European automakers really started to make inroads into the American market, and as such their innovations started to attract the attention of the Americans. Technologies such as multi-valve systems, fuel injection, weight-reduction techniques, etc. became alternatives to traditional thinking.

IMHO, this is what needs to happen in the computer market: Software needs to be refined to require fewer and fewer processor cycles. Processors need to move more and more bits per cycle. I think, most importantly, people need to realize that there is a limit to what computers can do.
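A trivial sketch of the "more bits per cycle" point: at a fixed clock, widening the datapath raises peak throughput. The widths and the 2 GHz clock below are arbitrary example values.

[code]
# Peak data throughput = (bits moved per cycle / 8) x clock rate.
# ASSUMPTION: the 2 GHz clock and the datapath widths are arbitrary examples.

clock_hz = 2.0e9  # example clock: 2 GHz

for width_bits in (32, 64, 128):
    gbytes_per_sec = width_bits / 8 * clock_hz / 1e9
    print(f"{width_bits:3d}-bit datapath at 2 GHz -> {gbytes_per_sec:.0f} GB/s peak")
[/code]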

People don't expect to drive 1000 mph in their Honda Civic, so why should they expect the equivalent from their PC?

Cheers,

The Czar

coius's picture
Offline
Last seen: 10 years 2 weeks ago
Joined: Aug 25 2004 - 13:56
Posts: 1975
Multiple-Core CPUs

The real big leap is not going to be in the speed of a single-core CPU, but in what each individual core on the chip can do. Theoretically, you could have 4-6 cores on a CPU in a cube-like chip, with a connector on each side wall. Kinda like a Borg Cube. Who knows, these could actually be light-based, as someone suggested. It might even use fibre-optic boards in the future. It might actually be that a whole new type of CPU gets used. My prediction is that there will be multiple CPUs on one chip pretty soon, running at lower speeds but able to achieve higher benchmark scores. Also, within the next 7 years I predict maybe 128-bit CPUs using cell-based processing. Now it is up to the OS to make it usable.
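A quick sketch of why the OS and the software have to cooperate here: extra cores only help if the work is split into independent pieces. This uses Python's multiprocessing module with an arbitrary CPU-bound workload, purely as an illustration.

[code]
# Sketch of the "lower clock, more cores" idea: a task only benefits from
# extra cores if it is split into independent chunks first.
# ASSUMPTION: the workload (naive prime counting) and chunk sizes are arbitrary.
from multiprocessing import Pool


def count_primes(limit):
    """Deliberately naive CPU-bound work: count the primes below `limit`."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count


if __name__ == "__main__":
    chunks = [20000] * 4  # four independent pieces of work

    # One core: run the chunks one after another.
    serial = [count_primes(c) for c in chunks]

    # Several slower cores can finish sooner because the chunks run at the
    # same time -- but only because the work was split up to begin with.
    with Pool(processes=4) as pool:
        parallel = pool.map(count_primes, chunks)

    assert serial == parallel
    print("primes per chunk:", parallel)
[/code]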

coius's picture
Offline
Last seen: 10 years 2 weeks ago
Joined: Aug 25 2004 - 13:56
Posts: 1975
actually...

I read a while ago that there are processors that actually "learn" as data is processed, to make the most of each CPU clock cycle. They generally involve a standard doped pathway that evolves additional pathways as data is processed, in order to gain efficiency. In effect, it is sort of like a brain: it starts off as a simple processor and "learns" the best way to calculate the data, reaching out through the die and making more connections. It repeats this until it is at its maximum potential, and then settles in to process the data.

I have seen the article on several sites, the first being /. (Slashdot) and then some other places. It is really cool, has a lot of potential, and I would love to see it implemented.
For instance, starting out on video it will give average performance, but maybe a minute into the calculation it will have built the best pathways and become far above standard for video en/decoding. When it switches to another data type (say, photos), it starts over, reverts to the standard die, and builds from there. This would actually eliminate the need for special media instruction sets, as the CPU would develop the best way to move the data through the chip on its own.

That, I believe, is the way CPUs will be 20 years from now. I have also heard of "organic" data processing, but I doubt that will take off in even 100 years, probably because some "bacteria rights" organization will step in.
