April 3, 2007 4:00 AM PDT

Despite its aging design, the x86 is still in charge

This was part of the motivation behind Intel and Hewlett-Packard's EPIC project: a "clean-sheet" design that would remove many of x86's idiosyncrasies and support for legacy technologies, providing a modern foundation for the next 20 years.

Instead, EPIC became a lesson in how not to introduce a new instruction set. Software developers shied away from having to learn a new computing language, and early rollout problems hindered Intel and HP's chances of building a broad market for the processor. The warm embrace of AMD's Opteron x86-64 processor (later duplicated by Intel) was the final blow, relegating EPIC and Itanium to the high end of the server market, where it still makes sense to port applications to take advantage of Itanium's performance.

As with most things, it all came down to money. Billions of dollars have been invested in software written for x86. Even Intel--one of the most influential companies in the technology industry--couldn't convince software developers to move away from all those investments.

Is there an alternative?
Last year, Intel Chief Technology Officer Justin Rattner said the company had no plans to develop a new ISA in the foreseeable future. Microsoft's Rashid said his group doesn't have any projects that involve a rival instruction set, although Microsoft supported several different instruction sets as recently as 1999 with Windows NT 4.0.

So what might change the game? Performance is always one way to make software developers sit up and take notice, but there's nothing dramatic on the horizon. It's unlikely that any so-called "clean sheet" design would be able to produce more than a 10 percent improvement in performance or power consumption over the modern x86 ISA, Hester said.

A performance improvement that small isn't going to encourage a dramatic move away from x86, said Pat Gelsinger, a veteran chip designer and senior vice president and general manager of Intel's Digital Enterprise Group. "We're delivering 2x performance gains every year" with existing designs that can still run older applications.

The chip industry's ability to continue packing transistors onto its processors means that it dedicates fewer and fewer transistors--out of the whole--to keeping legacy code alive. "The burden of compatibility is there," Gelsinger said. "But the value of compatibility overwhelms the cost it brings with it."

One technology improvement that could be a wild card in the mix is the introduction of new chips with two or more processing cores. Chipmakers have settled on building chips with several lower-speed processor cores as a way of getting around power consumption problems caused by a single high-speed core. Right now, however, each core needs to use the same instruction set.

Some think a hybrid future is possible: smaller, more power-efficient cores using other ISAs could be built onto an x86 chip and dedicated to specific tasks, such as video processing, Arvind said.

IBM is doing something like this with its Cell processor design, found at the heart of Sony's PlayStation 3. Cell uses one PowerPC core in a sort of supervisory role over eight separate processing units. Further on down the road, chip companies could keep a basic x86 core to maintain backward compatibility and handle the next generation of complicated processing tasks with dedicated hardware--that may or may not run x86.
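
As a loose software analogy to that supervisory split (this is not IBM's Cell programming interface, and the task and names are invented for illustration), a minimal C sketch using POSIX threads shows one coordinating thread farming work out to eight workers and combining their results:

    /* One "supervisory" thread dispatches work to eight workers, the way
     * the article describes Cell's PowerPC core overseeing its eight
     * processing units.  Purely illustrative; compile with -pthread. */
    #include <pthread.h>
    #include <stdio.h>

    #define NUM_WORKERS 8
    #define CHUNK 1000

    static long partial[NUM_WORKERS];          /* one result slot per worker */

    static void *worker(void *arg)
    {
        long id = (long)arg;
        long sum = 0;
        /* Each worker handles its own slice of a simple summation task. */
        for (long i = id * CHUNK; i < (id + 1) * CHUNK; i++)
            sum += i;
        partial[id] = sum;
        return NULL;
    }

    int main(void)
    {
        pthread_t workers[NUM_WORKERS];

        /* The supervisory thread dispatches the work... */
        for (long id = 0; id < NUM_WORKERS; id++)
            pthread_create(&workers[id], NULL, worker, (void *)id);

        /* ...then collects and combines the results. */
        long total = 0;
        for (long id = 0; id < NUM_WORKERS; id++) {
            pthread_join(workers[id], NULL);
            total += partial[id];
        }
        printf("sum 0..%d = %ld\n", NUM_WORKERS * CHUNK - 1, total);
        return 0;
    }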

The earliest parts of this transition can be seen in efforts such as AMD's Fusion project, in which it plans to integrate a graphics processor onto a PC processor, McCarron said. By the next decade, processors with a mixture of cores using different ISAs could become a reality, he said.

But don't count on it.

"What has worked in (x86's) favor is that it's an evolutionary architecture, when problems come up it gets adapted," McCarron said. "This is ultimately the one that got picked. And for everything to work with each other, that's what we stick to."

CNET News.com's Stephen Shankland contributed to this report.

24 comments

The only way to beat X86
is to make a new chip that can emulate X86 instructions to run legacy applications.

The whole reason X86 is still in use is that people and organizations need it to run older software. Even Apple eventually switched to X86, and when it did, it made a better Mac in the process.
Posted by Orion Blastar (590 comments )
Emulation
The problem with emulation is performance: there is no way you're ever going to match the performance of x86 with another architecture emulating it. Intel themselves tried just that with IA-64 and failed miserably.

The only way you're going to beat x86 is for the PC as a whole to become obsolete, and even that might not do the trick.
Posted by Hoser McMoose (182 comments )
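
To make that overhead concrete, here is a toy C sketch of the fetch/decode/dispatch loop an instruction-level emulator runs for every guest instruction; the opcodes and their semantics are invented for illustration, not real x86 encodings, but the per-instruction bookkeeping is the tax the comment above describes:

    /* Toy sketch of why software emulation is slow: every guest instruction
     * costs a fetch, a decode, and a dispatch on the host.  Opcodes and
     * semantics here are invented, not real x86 encodings. */
    #include <stdint.h>
    #include <stdio.h>

    enum { OP_HALT = 0, OP_LOAD_IMM = 1, OP_ADD = 2 };   /* hypothetical opcodes */

    static void emulate(const uint8_t *code)
    {
        uint32_t regs[4] = {0};
        unsigned pc = 0;

        for (;;) {
            uint8_t op = code[pc++];                     /* fetch */
            switch (op) {                                /* decode + dispatch */
            case OP_LOAD_IMM: {                          /* reg, imm8 */
                uint8_t r = code[pc++];
                regs[r] = code[pc++];
                break;
            }
            case OP_ADD: {                               /* dst += src */
                uint8_t dst = code[pc++], src = code[pc++];
                regs[dst] += regs[src];
                break;
            }
            case OP_HALT:
                printf("r0 = %u\n", (unsigned)regs[0]);
                return;
            }
        }
    }

    int main(void)
    {
        /* r0 = 2; r1 = 3; r0 += r1; halt  ->  prints r0 = 5 */
        const uint8_t program[] = { OP_LOAD_IMM, 0, 2,
                                    OP_LOAD_IMM, 1, 3,
                                    OP_ADD, 0, 1,
                                    OP_HALT };
        emulate(program);
        return 0;
    }

Real hardware does this decoding in parallel with execution; a software emulator pays for it serially, instruction by instruction, which is where the performance gap comes from.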
Half-truths are still lies
"The problem is that deep inside Windows is code taken from the MS-DOS operating system of the early 1980s, and that code looks for certain instructions when it boots."
Posted by aabcdefghij987654321 (1721 comments )
what do you mean?
Your title makes no sense. The sentence is accurate, and you just added a bizarre title. Even through Win98, the OS loaded MS-DOS before the graphical shell. Only recently has there been a shift away from the command.com system.
Posted by ben::zen (127 comments )
It's the software....
We are developing (have developed) our updated General Purpose Parallel Processing Platform, and yes, we used the X86. But our platform will be modified to run the Cell and others. It's foundational; we will market it as Underware, and it will sit anywhere, within the CPU most likely.

We will run all legacy software, and we have in fact tweaked the X86 instruction set to work more efficiently in parallel operations. We do not thread; we CONCURRENTLY run all cores (no matter the number of cores), and the magic is all in the scheduler & memory use.

We compile all legacy code into our machine-level assembler code, so developers need not squirm; they will flock to get the full parallel processing advantage on the new multiple cores, regardless of the condition of their written code.

We have contacted Intel, but we have been taking it slow, crossing all t's and dotting all i's. But we are almost ready to release.

It's always been the software, and the fear of taking on Microsoft, but we didn't take them on. We went underneath them, at the foundation, where all the problems were in the first place. And they can't complain, because we will get Vista off the ground for them, and by playing below their expertise, they can claim it wasn't their problem in the first place.

The foundation needed fixing, and parallel processing with the new multiple cores is what will fix it. Our design was made for the business enterprise environment, but obviously the massively parallel environments will fully utilize the platform too.

And did you ever think that part of the heat and energy problem was because the software performed poorly? We don't move the data, not in the traditional way, which is a big generator of heat. More to come.
Posted by thecatch (49 comments )
Fascinating, so who are you?
Okay, that was interesting, but you talk about "we" without identifying yourself. Who the heck are you?
Posted by Razzl (1318 comments )
X86 chipset - underlying problems
Years ago, I was part of a team evaluating the x86 chipset, along with others, as a standard for some DOE laboratories. After careful consideration, we rated it NOT ACCEPTABLE because at the time (this may not be true anymore), it had only two hardware interrupt lines to the CPU chip, providing only 4 hardware interrupt states. To handle the large number of interrupts required by a modern operating system, the x86 chipset had to provide vectored interrupts: the OS had to prepare a table of interrupt addresses in memory (where it can be corrupted!), and to produce an interrupt, a device put the offset into that table on the bus and raised one of the hardware interrupt lines. The CPU would halt, grab the offset and jump to the address in the table to handle the interrupt. Other chip designs, such as RISC parts and the 68K series, had 3 to 4 hardware lines for interrupts and thus would not suffer interrupt overruns as can happen in a vectored system. If this has not been addressed in the chipset, it still represents a potentially fatal flaw when the design is used in a priority-interrupt-scheduled OS that does a lot of context switches. Our measurements showed that the x86 at the time could not support the number of context switches we believed necessary to support UNIX and other priority-interrupt-driven systems.
Posted by woo37830 (8 comments )
Not enough interrupt lines
For any processor with x hardwired interrupts there will be an application that requires x+1 discrete hardwired interrupts.

The simple way to prevent a vector table from being corrupted is to put it in ROM. If it cannot be altered, it cannot be corrupted. See the Commodore PET, B series & C series computer ROMs for an example of a vectored, incorruptible interrupt. Of course, some of those machines did allow the ROM to be replaced by RAM when the programmer needed to revector the interrupts, but that was a conscious design decision.
Posted by Fritzr_gc (19 comments )
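
For readers following the exchange above, here is a minimal C sketch of vectored-interrupt dispatch; the vector numbers and handler names are illustrative, and the const table stands in for the ROM placement the reply suggests as protection against corruption:

    /* Minimal sketch of vectored-interrupt dispatch.  Handler names and
     * vector numbers are invented for illustration, not from any real OS. */
    #include <stdio.h>

    #define NUM_VECTORS 256            /* x86 real mode defines 256 vectors */

    typedef void (*isr_t)(void);       /* an interrupt service routine */

    static void default_handler(void)  { puts("unhandled interrupt"); }
    static void timer_handler(void)    { puts("timer tick"); }
    static void keyboard_handler(void) { puts("key pressed"); }

    /* The in-memory table the comment warns can be corrupted; marking it
     * const (or placing it in ROM, as the reply suggests) prevents that. */
    static const isr_t vector_table[NUM_VECTORS] = {
        [0x08] = timer_handler,        /* vector numbers are illustrative */
        [0x09] = keyboard_handler,
    };

    /* What the CPU does on an interrupt: take the vector number supplied
     * by the device, index the table, and jump to the handler found there. */
    static void dispatch(unsigned vector)
    {
        isr_t handler = vector_table[vector % NUM_VECTORS];
        (handler ? handler : default_handler)();
    }

    int main(void)
    {
        dispatch(0x08);   /* simulate a timer interrupt */
        dispatch(0x10);   /* no handler registered -> default */
        return 0;
    }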
Silly
I'm supposed to put up with legacy hardware at the same time I'm being pressured to buy Vista (for example). Sounds like a salesman talking. Damn silly.
Posted by mpotter28 (130 comments )
Well...
Until we get away from 30-year-old technology, new chip designs that promise more speed, more performance, more anything will be marketing hype only. I can't count the number of times I've upgraded my computer to the latest, greatest Intel processor only to find there was no difference in speed at all.

It is time to junk the 30-year-old, cobweb-encrusted technology and move into the 21st century. But that is likely to happen on the day Bush gets an I.Q. above that of a dead rat.

Robert
Posted by Heebee Jeebies (632 comments )
Is this article about Big Oil?
We must wean ourselves from our foreign dependence...

The world has an addiction to _______.
Posted by redpop (2 comments )
YEAH!
Right now, I want to deliver the highest respects to our x86 engineers for focusing on backwards compatibility, scalability, and consolidation. Thinking ahead may take a lot of time, but it's obviously paying off.

Now, for what's ahead? I'm speechless... It sounds like we're in for a treat. Keep doing your part to overcome obstacles, whatever the cost.

...reducing power consumption at the same time, this is going to be better than I ever imagined.

- Jerry Merfeld

...Hurry up and get those multi-core laptops out... I don't even want to waste my time with this dual-core cr*p. :)
Posted by jerrymerfeld (13 comments )
aging x86 addiction causes end of world?
X86 has been in mainstream public use since the IBM PC, only 25 years, and x86-64 is only 3 years old. You ain't seen nuthin' yet! The next 5 years will see more changes than the last 100.

The question we should be asking ourselves in the industry is where we want the consumer computing industry to be in the next 25 years. Wouldn't it be great if we had an inexpensive handheld system that was 100% reliable (errors would be predicted and prevented; defects would be detected and automatically repaired), had almost zero power consumption (instant on), had a human-friendly interface (no more carpal tunnel or eye/neck/back strain), offered worldwide access to a free network, allowed no illegal or offensive misuse of technology, and carried information free of bias and misinformation?

Can we get there with x86? I for one doubt it. Way too much legacy baggage. Was Itanium the answer? No, but there must be a way to get there if the industry works together instead of fighting for its precious fraction of market share.

In the enterprise market, we have other challenges. What if the world were faced with a crisis that could only be solved with one enormous computer: we had to create a computing model of the earth and accurately predict the polar shift that will happen in 2012 and destroy 75% of the earth in 3 days? We had to create a model that would help us predict where the safest places on earth would be so we could survive and preserve the human species and knowledge for future generations; prepare a rebuilding plan; and develop a new system for producing food, housing and energy. Could we do that with existing computing technology?

OK, so what if the world does not end because we stick with x86? Fifty years from now, what will historians write about the early 21st-century industry leaders? It might be something like this:
"They had the capability to develop the technology to help solve the world hunger and energy crisis we have today, but chose to continue their childish fighting over who had the biggest toy instead of creating the technology we needed to solve the world's problems that they knew were going to happen."

I have been doing computer engineering since 1976. The first Intel processor I worked with was a 2-bit-slice i4002 that the Texas company I worked for used in an 8-bit microcontroller for a disk drive, the Diablo DS300 if my memory serves me, which had a whopping 330KB of data storage on a 14-inch rigid platter in a single-platter cabinet that weighed 100 pounds. My company sold it as part of a 16-bit microcomputing business platform with our own proprietary networked OS that cost about $100,000. I used to fly around the world providing engineering support. The division I worked for finally succumbed to the competition, mainly the PC, in 1992, and I embraced x86 for the rest of my career, until I retired in Texas last year.

I will be in Beijing in 2 weeks doing some consulting. I will be looking for you C-net.
Posted by mc4golf (1 comment )
I appreciate your knowledge and history....
The reason I believe we have created the next foundation for all platforms, but for the X86 first, is that the author of our software has been creating brilliant code since the mid-to-late 1950s, when he wrote some of the first OSes used in the early IBM machines; this was accomplished while working at RAND Corp.

Then he went on to TI, Docutel, LEER, HUGHES, USAF, IBM, North American Aviation and others.

I think it took a lifetime of experience with both systems, software & hardware, to have the institutional knowledge to create a platform that bridges the past with the present. This is the difference in what he created. Your post speaks to that difference.

And he too spent almost all of his beginnings in Texas, then went to LA during the Cold War years, and then back to Texas. Your post is interesting, very similar in attitude to that of my founder and CTO.
Posted by thecatch (49 comments )
x86 and BIOS
From what I've gathered, the BIOS found in most PCs is more of a legacy anchor than the x86 instruction set. Intel has been pushing EFI as a replacement, but Microsoft and/or OEMs (other than Apple) seem recalcitrant about adopting it, at least for 32-bit versions of Windows. Someone should do an article on that.
Posted by xeroply (4 comments )
X86 dominance
X86 is here to stay!
Posted by pentium4forever (192 comments )
Chicken and Egg scenario
Software won't be updated to a new architecture unless there is sufficient hardware to justify it. Hardware won't be updated unless there is sufficient software to justify it.
Posted by g8crapachino (220 comments )
 
