September 26, 2006 10:07 AM PDT

Intel pledges 80 cores in five years

SAN FRANCISCO--Intel has built a prototype of a processor with 80 cores that can perform a trillion floating-point operations per second.

CEO Paul Otellini held up a silicon wafer with the prototype chips before several thousand attendees at the Intel Developer Forum here Tuesday. The chips are capable of exchanging data at a terabyte a second, Otellini said during a keynote speech. The company hopes to have these chips ready for commercial production within a five-year window.

Intel uses its twice-yearly conference to educate developers on its long- and short-term plans. Over three days, hardware developers and partners get a chance to interact with Intel employees and take classes on new technologies.

Intel's 80-core chips

As expected, Intel announced plans to have quad-core processors ready for its customers in November. An extremely fast Core 2 Extreme processor with four cores will be released then, and the newly named Core 2 Quad processor for mainstream desktops will follow in the first quarter of next year, Otellini said.

The quad-core server processors are on a similar trajectory, with a faster Xeon 5300 processor scheduled for November and a low-power Xeon slated for the first quarter. Intel's first quad-core processors are actually two of its dual-core Core architecture chips combined into a multichip package.

"Performance matters again," Otellini said, disclosing that the quad-core desktop processor will deliver 70 percent faster integer performance than the Core 2 Duo, and the quad-core server processor will be 50 percent faster than the Xeon 5100 introduced in June.

Video: VW, Intel and your wireless future (car, driver and handheld device communicate)

Video: Intel's Otellini: Terabyte per second (the future of processors as Intel sees it)

One reason performance didn't matter to Intel during the last couple of years was that it was getting trounced on benchmarks by Advanced Micro Devices' Opteron and Athlon 64 server and desktop processors. That all changed with the introduction of the Core 2 Duo chips this year.

"With this new set of dual and quad-core processors, we've regained our leadership," Otellini told developers. The growing Internet video phenomenon, as evidenced by the spectacular rise of Web sites like YouTube, will keep these processors busy during intensive tasks like video editing, he said.

Road to Santa Rosa
Notebooks will get a face-lift next year with the Santa Rosa platform, which will provide notebooks with new technologies like 802.11n wireless and flash memory. Intel believes that it will be the first to add flash memory to a notebook motherboard, which will improve boot times and reduce power consumption, Otellini said.

System power consumption is only one part of the equation. During the next few years, Intel wants to improve the performance per watt of its transistors by 300 percent through new manufacturing technologies and designs, Otellini said. The next step on that road, Intel's 45-nanometer manufacturing technology, will enable the company to build chips that deliver a 20 percent improvement in performance with one-fifth the current leakage, he said.

But the ultimate goal, as envisioned by Intel's terascale research prototype, is to enable a trillion floating-point operations per second--a teraflop--on a single chip. Ten years ago, the ASCI Red supercomputer at Sandia National Laboratories became the first supercomputer to deliver 1 teraflop using 4,510 computing nodes.

Intel's prototype uses 80 floating-point cores, each running at 3.16GHz, Justin Rattner, Intel's chief technology officer, said in a speech following Otellini's address. To move data between individual cores and into memory, the company plans to use an on-chip interconnect fabric and stacked SRAM (static RAM) chips attached directly to the bottom of the chip, he said.
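The teraflop figure is easy to sanity-check against the stated core count and clock. The sketch below assumes, hypothetically, that each floating-point core retires about 4 FLOPs per cycle (for example, two fused multiply-add units); Intel did not disclose that breakdown, so the per-cycle figure is an assumption, not a reported specification.

```python
# Back-of-the-envelope check of the teraflop claim.
# Assumption (not from the article): ~4 FLOPs per cycle per core.
cores = 80
clock_hz = 3.16e9          # per-core clock from the article
flops_per_cycle = 4        # assumed, e.g. two fused multiply-adds
peak_flops = cores * clock_hz * flops_per_cycle
print(f"{peak_flops / 1e12:.2f} teraflops")  # prints 1.01 teraflops
```

Under that assumption the numbers line up: 80 cores at 3.16GHz land just over the one-teraflop mark.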

Intel's work on silicon photonics, including its recent announcement of a silicon laser, could help contribute toward the core-to-core connection challenge. Rattner and professor John Bowers of the University of California at Santa Barbara demonstrated Intel's newest breakthrough model of silicon laser, which was constructed using conventional techniques that are better suited to volume manufacturing than older iterations of the laser.

Many of the architectural nuances of the 80-core chip can be traced back to earlier research breakthroughs announced at previous IDFs. Connecting chips directly to each other through tiny vertical wires, a technique called through-silicon vias (TSVs) that Intel discussed in 2005, will give the chip an aggregate memory bandwidth of 1 terabyte per second.

Intel, meanwhile, began to discuss replacing wires with optical technology in computers and chips in 2001 and has come out with several experimental parts for enabling lasers and optical technology to replace wires.

The same year, Intel began to warn about the dangers of heat dissipation in processors. One of the solutions, the company said at the time, lay in producing chips with multiple cores.

CNET News.com's Michael Kanellos contributed to this report.

69 comments
Why would I need 80 cores to run IE or Word?
This sounds like total overkill!
Posted by bobby_brady (765 comments )
feature creep
Well, you know, feature creep.
Posted by satayboy (73 comments )
You Don't, But Other People Do
Technology marches forward not because you need the technology, but because someone does. If this were not true, I doubt that billions upon billions of dollars would be spent working on it.

Complex calculations for scientific research, video production, or even computer games need the horsepower, and the technology trickles down to you and me whether we need it or not. No one will force you to buy another computer, but you may not even be able to run your OS as it continues to evolve and expand over time.

Have a nice day!
Posted by lesfilip (496 comments )
Duh
You obviously have not noticed the trend of MS making their programs more and more memory/CPU intensive every build...80 processors will barely be able to handle opening Notepad in 5 years.
Posted by blueskydiver76 (7 comments )
BHO's?
When IE gets lots of BHO spyware installed, they really start slowing the machine down. Most are poorly written and really tax the CPU. With 80 cores, you could run 75 pieces of spyware and not have your computer slow down at all! Each piece of spyware gets its own core!
Posted by chris_d (195 comments )
Same as BG
You know the oft repeated quote: "640K ought to be enough for everybody." (Probably not an exact quote but close enough).

People wondered why I bought a 386/33 system with 16mb of ram once back in the day when 2-4mb was considered good.

And just for the wags jumping on MS, I noted a while back that current versions of Linux really, really stink when run on older systems too. It's not just MS making things that require more and faster hardware.

However, I do think that before we get to 80 core systems we'll find a point where people don't feel a need to upgrade any more but who knows, I don't.
Posted by aabcdefghij987654321 (1721 comments )
It's not all about you, believe it or not
We have these things called web servers; you might have heard of them. And database servers, and application servers, and MMORPG servers, and ... see a trend?
Posted by (402 comments )
getting ready for Vista
Obviously more processing power is going to be needed for Vista, and Intel is just preparing for the high demand!
Posted by yikes31 (71 comments )
RE: Why would I need 80 cores to run IE or Word?
You don't, but will it be enough for the next revision of Elder Scrolls?
Posted by Douglas Taylor (1 comment )
Virtual worlds maybe;-)
With THAT kind of horsepower, the "holodeck" might be feasible!

Fascinating.
Posted by technewsjunkie (1265 comments )
well
I'd be the first to buy a house
Posted by City_Of_LA (118 comments )
Part of the solution...
That might provide enough processing power for a reasonable image, but you still need the imaging system. Right now, the projectors needed to create what you see in a Star Trek holodeck are far larger than they would have to be to fit inside the same room-sized space they are imaging.
Posted by zaznet (1138 comments )
Egads!
And here I am plodding along with one core!
Posted by Christopher Hall (1205 comments )
How many FSB's?
How many front side buses, 40? I'm sure they won't want to give the flexibility to quickly adapt to new memory standards... yuck, yuck, yuck.
Posted by scdecade (329 comments )
Only One...
They will only have one; the FSB is where the CPU connects to the system board. It is a single chip, not 40 dual-core chips on one system board. Each core will have a direct connection to RAM, but this is before the chip hits the FSB. This is why the memory bandwidth is exceedingly faster than anything currently on the market.

I can't wait to see the first supercomputer built using these chips. :)
Posted by zaznet (1138 comments )
Haven't you seen the i7s? The FSB doesn't exist anymore.
Posted by Inurdaes (2 comments )
A DCers Dream!
I do A LOT of Distributed Computing. I built a 12GHz system just to do DC projects.

I will LOVE 80 cores!
Posted by CaptainMooseInc (69 comments )
Distributed Computing etc
I use TrueSpace 7, 3D Studio MAX and several other 3D animation and rendering applications. I can see where 80 cores would be very useful. Hollywood would love this for FX in movies and such. In fact, with that kind of horsepower (funny, we are talking about highly advanced technology and yet we still use the metaphor of a horse to conceptualize how much computing power it has), they really could make entire movies with nothing on the screen being real, and have it so realistic that no one would know the difference. They could probably do all their rendering in real time, with the human actors on a blank stage, no special costumes or props, and have what looks like the finished product for all the dailies.
Posted by skon (4 comments )
Nice way to distract folks....
I have to give Intel credit. They know how to distract people from the core issues plaguing them: lost customers, inventory issues, key talent escaping, bloated management and a lack of empowerment.

Why on earth would Dell, Lenovo, IBM and HP escape to AMD unless Intel's technology or roadmap were busted?

I figure if you cannot address the issues, then just pull out a laser beam or an 80-core CPU. Bravo, Intel.
Posted by phfm (4 comments )
Intel has largely addressed problems
Intel has largely addressed their previous shortcomings this year. They recently laid off 10,000 people. They have advanced their roadmap and are beating their own timeframes.

Intel's stock is up; AMD's is down right now.
Posted by js33 (12 comments )
A year ago
I would have agreed with you. Now you are just way off the mark.

IBM, Dell and HP finally went with AMD for three reasons, in this order: they had, at the time, a better product; as good or better pricing; and the PR of going with the underdog....to make fangirls like you happy.

Intel has let go of a lot of fat, sold off unprofitable groups and now has a better product, speed-, power- and heat-wise.

Let's see what happens in a year. It is all going to come down to a price war...which Intel can win if they want to.
Posted by Lindy01 (443 comments )
AMD seems to have distracted you....
And you are saying that AMD isn't distracting you with their loss of leadership? I guess you are saying that AMD can fiddle while the silicon burns as Intel regains their leadership with innovative ideas that will revolutionize the chip industry? Puhleeeeeeze......
Posted by dragonfly8610 (49 comments )
SIMD
I would bet that they are not talking about general instruction cores, but SIMD cores. Think of the cell processor on steroids.
Posted by ralfthedog (1589 comments )
Not a Cell processor on steroids!
Do not compare this with the Cell processor. Each Cell core is an SOC. This is quite the opposite intent of SIMD processors or the Intel Tera-Scale processors. The Cell processor is designed to be expressly non-specialized so that it can be used in many applications while providing a high level of efficiency and scalability. You could find a Cell processor in cellular phones, cordless phones, DVD players, cable tuners, video recorders, MP3 players, DVRs, PDAs, digital cameras, supercomputers, televisions and even personal computers. They are also very scalable, so that a phone for example might require 2 cores and an HD-TV might use 6 or more cores. The Cell will allow some flexibility in the manufacturing process to make a processor that meets the needs of the application it's being used in.

You can't even compare each core of the Cell to an existing Intel X86 chip on the market today. The Cell cores each do more work, and include more functionality than what is included in the X86 class chips. So the Cell isn't perfect for any one job, but it meets the needs of many hundreds of jobs in one chip design. This significantly reduces development costs for those companies using the Cell processor in their hardware.

The 80-core Intel demonstration chip is an entirely different effort altogether and is still 5 years from production.
Posted by zaznet (1138 comments )
I love these 5 year prediction news stories.
Articles like this are sooooo stupid. 5 years and we will all have Cray computer power!!! woooooooo
Posted by bigfeet123 (10 comments )
Me too
I agree. But Cray is just OK (they are not as good as they used to be). I prefer SGI. Besides that, guess what? We don't have the software to take advantage of all these cores.
Posted by domino360 (41 comments )
Not your desktop in 5 years...
These chips are paving the way for future generations of systems and software. You won't see one of these on your desk for a long while to come. You will probably see 8 or 10 core "core-duo" style chips in 5 years, but not this monster.
Posted by zaznet (1138 comments )
err, not really. 2 years ago they had mobile phones with specs that bash a Cray. Seriously. Go look at what Cray's realtime 3D graphics looked like in the mid-80s. Worse than Quake.

Regarding cores & efficiency, sure, getting more cores to add speed efficiently by cooperating is hard. To do efficiently, that is. You get diminishing returns by adding cores without being clever. That still doesn't discount the brute-force approach! Throw 1000s of cores at it, and it will be faster even if half of them are slacking or crashed/looping half the time.
Posted by bishopdante (2 comments )
Well - I believe it's possible!
Why will we need 80-cores? I don't know now - but Queen Victoria famously ridiculed a demonstration of a simple telephone at the 1851 Exhibition in London.
Posted by michaelbews (1 comment )
Not too excited
because I know as soon as I purchase the 80 core they're going to release the 81 core and I'm going to wish I would have waited....
Posted by Charleston Charge (362 comments )
You should be excited...
It's the software... :) These chips are built for software designed to take advantage of multi-core chips. Today's software has been recompiled to support some features, but new software will be written with multi-core in mind. That presents another step in performance improvement.
Posted by zaznet (1138 comments )
benefits of 80 cores
1) power savings if you can power down unneeded cores, while having huge "surge computing" capability when needed by enabling all 80 cores.
2) with power savings comes thermal management.
3) reliability. If 1 or more cores fail, just ignore them. You only lose 1.25% of your performance per core. If you can "map out" bad cores (similar to how you would shut them down for power & heat savings) you could lose 5-10 cores before you even noticed the performance hit.
4) for things like laptops, you could dynamically redistribute core loading to avoid local hot spots and run with less overall cooling.
5) certain operations (rendering, raytracing, FEA, CFD) would benefit hugely from multiple small cores over 1-4 monolithic cores.
Posted by baldwinl (11 comments )
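The arithmetic behind the reliability point is straightforward: with 80 identical cores, each one contributes 1/80 of peak throughput, so mapping out a dead core costs 1.25%. A minimal sketch of that calculation (assuming, as the comment does, that throughput scales linearly with core count):

```python
# Peak-throughput loss when bad cores are "mapped out" of an 80-core chip.
# Assumption: throughput scales linearly with active cores.
TOTAL_CORES = 80

def throughput_loss_pct(failed_cores):
    return failed_cores / TOTAL_CORES * 100

for failed in (1, 5, 10):
    print(f"{failed} dead cores -> {throughput_loss_pct(failed):.2f}% of peak lost")
```

Losing 5-10 cores, as in the comment's example, costs only 6.25-12.5% of peak.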
and how much will intel pledge to lay off in 5 years?
will they lay off another 10,000 employees again?
Posted by kthor12 (3 comments )
Jobs evolve, don't go away
Those same people will get other jobs. How many blacksmiths are sad that they don't have to make nails one at a time any more? Progress marches on.

Have a nice day!
Posted by lesfilip (496 comments )
How bout a spellling core?
Reading through these various prognostications it seems amazing that so many of you wizards are unable to spell, or spell-check.
Posted by kantan67 (1 comment )
Ur hdline B fnny
Ur hdline B fnny, and you are right.
Posted by lesfilip (496 comments )
lmfao
I totally agree!
Posted by facety (11 comments )
Sarah Connor is getting worried
Yeah, I went there.
Posted by clancifer (1 comment )
This is much bigger than C|Net realizes...
This is a super computer on a chip. This is not SIMD, this is not SOC or an upgrade to core-duo x40. This is a change to the design of a single processor to one that is intended to include multiple cores where the software is expecting multiple cores.

C|Net has focused on the memory per core as the big issue. They are right to be impressed by it, but they fail to see the big picture as to why this memory is there. Each core is expected to perform dozens, hundreds or thousands of operations on that set of data. Some of these operations may be to make that data available to instructions being performed by another core. Splitting the data is part of the design change, and software will need to know where it sends data in a multi-core chip and where its instructions are being run to optimize data access.

Some people complain that 20GB of RAM might not be enough. They say that the system will allow more RAM because that makes sense. No it won't! You will not want to pool all of your system RAM into one place accessed by any of these 80 cores. It is a performance hit to multi-core systems to share that RAM. Some of this will be seen with one CPU performing an operation on data held by RAM attached to another CPU. In this case only two CPUs are held up instead of all 80. Loading data from outside the CPU to have a core perform operations on will slow this system down in the same way that a page file hit slows a current Windows system. It's the difference in 2 vs. 80 operations.

Software that runs on Tera-Scale processors would know how to split tasks and memory between cores. The operating system will need to understand how to give a set of cores to an application so it has enough RAM and processing power to get its jobs done. You could see 20 cores doing the same operation on sets of data (SIMD). Then, at the same time, 10 cores doing a different set of instructions on a shared set of data. Each core can reach the RAM on another core, so those 10 would be able to read, write, and perform an operation on that RAM.

This chip combines some SIMD concepts along with parallel computing concepts and the new multi-core concepts. Intel discovered you really need a lot of RAM per core so each core can handle a lot of data by itself and the entire RAM needs of the applications will be met within the CPU. 20GB is what they show in a sample using SRAM. They will be switching to DRAM and with that switch may also increase the RAM per core. The key to RAM per Core is going to depend on some efficiencies in the CPU based on tasks and data. Intel will begin to discover their performance curve with this prototype.

This system is nothing near what you will see on your desk in 5 years. It will be used to pave the way for application development and system design for years to come. It is the most important stepping stone for the future of computing that we've seen in the last several years because it changes how the hardware and software are designed to work together.
Posted by zaznet (1138 comments )
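The locality argument above — keep each core working on its own slice of data rather than pooling everything in shared RAM — can be illustrated in miniature. This is a toy model only, using ordinary Python threads; it is not Intel's programming interface (which was never described in the article), and the function names are hypothetical:

```python
# Toy illustration of core-local data partitioning: each worker operates on
# its own contiguous slice, so no two workers touch the same region of data.
from concurrent.futures import ThreadPoolExecutor

def partition(data, n_workers):
    """Split data into n_workers contiguous, near-equal slices."""
    bounds = [i * len(data) // n_workers for i in range(n_workers + 1)]
    return [data[bounds[i]:bounds[i + 1]] for i in range(n_workers)]

def local_sum_of_squares(chunk):
    # Each worker's "core-local" computation on its own slice.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, n_workers=8):
    with ThreadPoolExecutor(max_workers=n_workers) as ex:
        return sum(ex.map(local_sum_of_squares, partition(data, n_workers)))

print(parallel_sum_of_squares(list(range(100))))  # prints 328350
```

The design choice mirrors the comment's point: partitioning up front means workers never contend for the same data, and only the final reduction crosses partition boundaries.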
It's about servers
In my last project, we had over 1200 dual-CPU machines handling user requests via a hardware load-balancer, a cluster of 12-CPU DB machines, and one DB machine with 48 CPUs (w/96 GB). Managing that many machines is a hassle (and that's small compared with Google, from what I hear). If we could divide by 80 it would be great. The one downside being that you do have a lot of eggs in each basket -- one machine down cuts out a lot of capacity.
Posted by Duke LaCrosse (1 comment )
I think Intel is just bragging to get attention.
I think their 80-core "prototype" is merely a marketing gimmick. They are nowhere near an 80-core chip. They are just bragging in an attempt to get a lot of attention.
The reason for my doubt is the power consumption of the new 80-core chip.
A 3.33 GHz Core 2 CPU consumes about 75-80 W but can reach over 130 W when put under heavy load. The 80-core chip is supposed to operate at 3.19 GHz.
So let's say that each core consumes 37.5 W at 3.2 GHz. Intel has also announced a new 35nm technology that will yield a 310% increase in performance per watt. So dividing 37.5 by 3.1 will yield what a core based on that enhanced technology will consume: 12.5 W. If we multiply that by 80, which is the number of cores on the chip, we get a power consumption of about 1000 W. And as if that were not enough, they are also going to sandwich stacked SRAM with a performance of terabytes/second attached directly to the bottom of the chip, which also consumes a tremendous amount of power. I would like to see an answer on how they are going to deal with all this heat of over 1000 W coming from that little chip before I'll take their statements about this chip seriously. We're talking about heat that is comparable to the face of the sun!
Posted by Aldenthe (2 comments )
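The commenter's back-of-envelope estimate is easy to reproduce. Every input below is the commenter's assumption, not an Intel figure, and the naive scaling ignores voltage/frequency scaling, different core designs, and process differences, so treat it as a sketch of the comment's reasoning rather than a real power projection:

```python
# Reproducing the comment's back-of-envelope power estimate (all inputs
# are the commenter's assumptions, not published Intel specifications).
dual_core_tdp_w = 75.0            # commenter's figure for a 3.33 GHz Core 2
per_core_w = dual_core_tdp_w / 2  # naive: half the package power per core
perf_per_watt_gain = 3.1          # the claimed 310% improvement
scaled_core_w = per_core_w / perf_per_watt_gain
total_w = scaled_core_w * 80
print(f"{scaled_core_w:.1f} W/core, ~{total_w:.0f} W for 80 cores")  # prints 12.1 W/core, ~968 W for 80 cores
```

That is where the comment's "about 1000 W" comes from: 37.5 / 3.1 ≈ 12.1 W per core, times 80 cores ≈ 968 W.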
Well, Moore's law says you get smaller or faster at a given cost. So it will clearly get smaller for such basic messaging tasks as text etc. However, full multimedia will be available with an incredible size-to-cost-to-power ratio. That's the point.

Umm, have you seen the Ultra High Definition media that the Japanese will be using for TV by 2015? That stuff makes 35mm film look fuzzy. And that runs to rather supercomputing-type terascale storage and processing requirements.

Somebody will have to make those programs, and also design those much more detailed sets, costumes and doubtless CGI special effects.

Yes, bring on my petaflops of processing, and Ultra High Def manga streaming to my OLED wall. Bring that right along over here.

You can keep your Win 98, MS Word, and BBC BASIC for your grey little tank-top world, thank you.

The iPhone already has the capacity to be the Library of Alexandria in your pocket.

You see where this is going? MS Word up yo yo.
Posted by bishopdante (2 comments )
I can has suparcomputar nao? :D
Posted by Inurdaes (2 comments )
Where is my 80 core processor?
Posted by LincolnCannon (1 comment )
well I gotta say I'm disappointed. I bought my Q6600 in late 2007, hoping we'd have 80 cores by the time I was due for an upgrade. Do not want to purchase another quad-core computer.
Posted by nimer55 (21 comments )
 
