April 3, 2007 4:00 AM PDT

Despite its aging design, the x86 is still in charge

Few computing technologies from the late 1970s endure today, with one notable exception: the fundamental marching orders for the vast majority of the world's computers.

The x86 instruction set architecture (ISA), used today in more than 90 percent of the world's PCs and servers, hit the marketplace in 1978 as part of Intel's 8086 chip.

So when the worldwide Intel developer community gathers for its annual conference in Beijing later this month, attendees will spend much of their time talking about technology that was developed when Jimmy Carter was in the White House and the soundtrack for the John Travolta movie Saturday Night Fever was the best-selling album in the United States.

Other instruction sets--basically, lists of operations that a software program can use--do exist, of course. There's IBM's Power, Sun Microsystems' Sparc and Intel's own EPIC (explicitly parallel instruction computing) Itanium project, to name a few. But x86 continues to thrive and has no serious competitors on the horizon, because it provides "good enough" performance and because of the vast amount of software written for it over nearly three decades.


"If you look at the history of computing, big moves happen because there is a dramatic new requirement or change in the marketplace," said a professor of computer science and engineering at the Massachusetts Institute of Technology who uses the single name Arvind.

But x86 is apparently an exception to the rule. Whether the challenge was the invention of the browser or the low-cost network computers that were supposed to make PCs go away, the engineers behind x86 have found a way to adapt it to the situation.

Is that a problem?

Critics say x86 is saddled with the burden of supporting outdated features and software, and that improvements in energy efficiency and software development have been sacrificed to its legacy.

A comedian would say it all depends on what you think about disco.

Humble beginnings
The x86 ISA made its debut with Intel's 8086 processor in 1978. Even at the time, it wasn't considered the most elegant implementation on the market because of its segmented approach to addressing memory, said Dean McCarron, an analyst with Mercury Research. IBM chose a slightly different version--the 8088--for its new PC, and the x86 architecture started to gain traction.

"It was originally thought about as an eight-bit chip (Intel's and Advanced Micro Devices' current chips are 64-bits) designed to run spreadsheets," said Phil Hester, chief technology officer at AMD. Accordingly, the original design lacked support for, among other things, an appropriate number of general-purpose registers that would be needed for the modern computing era. Registers are essentially small holding stations for data as it awaits processing, and general-purpose registers are useful because they can store either data or an address where that data is stored.

"There's no reason whatsoever why the Intel architecture remains so complex."
--Simon Crosby
CTO, XenSource

As the number of people using PCs made by IBM and so-called clone manufacturers grew, the x86 became the irreplaceable heart of the PC market. In the mid-1990s, Intel's entry into the server market with x86 chips cemented the ISA's dominance. Today, more than 90 percent of all servers shipped in the world use an x86 processor from either Intel or AMD.

Intel and AMD have managed to keep x86 fresh by continually adding extensions to the ISA, such as Intel's MMX and SSE instructions in the late 1990s that improved graphics performance, and AMD's 64-bit extensions this decade that helped bypass the register issue. "We have seen a huge amount of change at the instruction level; we just keep calling it the same thing," said Rick Rashid, a senior vice president at Microsoft in charge of that company's research division.

But with each generation of extensions to the x86 ISA, more and more complexity is added to the chips, and support for the older features remains to guarantee software compatibility.

"There's no reason whatsoever why the Intel architecture remains so complex," said Simon Crosby, chief technology officer at virtualization software start-up XenSource. "There's no reason why they couldn't ditch 60 percent of the transistors on the chip, most of which are for legacy modes."

If a chipmaker declared its chip could run only software written past some date such as 1990 or 1995, you would see a dramatic decrease in cost and power consumption, Crosby said. The problem is that deep inside Windows is code taken from the MS-DOS operating system of the early 1980s, and that code looks for certain instructions when it boots.

Join the conversation!
The only way to beat X86
is to make a new chip that can emulate X86 instructions to run legacy applications.

The whole reason why x86 is still in use is that people and organizations need it to run older software. Even Apple eventually switched to x86, and when it did, it made a better Mac in the process.
Posted by Orion Blastar (590 comments )
The problem with emulation is performance; there is no way you're ever going to match the performance of x86 with another architecture emulating it. Intel itself tried just that with IA-64 and failed miserably.

The only way you're going to beat x86 is for the PC as a whole to become obsolete, and even that might not do the trick.
Posted by Hoser McMoose (182 comments )
Half-truths are still lies
"The problem is that deep inside Windows is code taken from the MS-DOS operating system of the early 1980s, and that code looks for certain instructions when it boots."
Posted by aabcdefghij987654321 (1721 comments )
what do you mean?
Your title makes no sense. The sentence is accurate, and you just added a bizarre title. Even through Win98, the OS loaded MS-DOS before the graphical shell. Only recently has there been a shift away from the command.com system.
Posted by ben::zen (127 comments )
It's the software....
We are developing (have developed) our updated General
Purpose Parallel Processing Platform, and yes, we used the X86.
But our platform will be modified to run the Cell and others. It's
foundational; we will market it as Underware, and it will sit
anywhere, within the CPU most likely.

We will run all legacy software, and we have in fact tweaked the
X86 instruction set, to work more efficiently in parallel
operations. We do not thread, we CONCURRENTLY run all cores,
(no matter the volume of cores), and the magic is all in the
scheduler & memory use.

We compile all legacy code into our machine-level assembler
code, so developers need not squirm; they will flock to get the
full parallel processing advantage on the new multiple cores,
regardless of the condition of their written code.

We have contacted Intel, but we have been taking it slow,
crossing all t's and dotting all i's. But we are almost ready to

It's always been the software, and the fear to take on Microsoft,
but we didn't. We went underneath them, at the foundation,
where all the problems were in the first place. And they can't
complain, because we will get Vista off the ground for them, and
by playing below their expertise, they can claim it wasn't their
problem in the first place.

The foundation needed fixing, and parallel processing with the
new multiple cores is what will fix it. Our design was made for
the business enterprise environment, but obviously the
massively parallel environments will fully utilize the platform.

And did you ever think that part of the heat and energy problem
was because the software performed poorly? We don't move the
data, not in the traditional way, which is a big generator of
heat. More to come.
Posted by thecatch (49 comments )
Fascinating, so who are you?
Okay, that was interesting, but you talk about "we" without identifying yourself. Who the heck are you?
Posted by Razzl (1318 comments )
X86 chipset - underlying problems
Years ago, I was part of a team to evaluate the x86 chipset along
with others as a standard for some DOE laboratories. After
careful consideration, we rated it as NOT ACCEPTABLE because
at the time (this may not be true anymore), it only had two
hardware interrupt lines to the cpu chip, thus providing only 4
hardware interrupt states. To handle the large number of
interrupts that are required on a modern operating system, the
x86 chip set had to provide vectored interrupts, i.e. the OS had
to prepare a table with interrupt address in memory (where it
can be corrupted!), and then to produce an interrupt, put the
interrupt offset into the table on the bus, and raise one of the
hardware interrupts. The CPU would halt, grab the offset and
jump to the address in the table to handle the interrupt. Other
chip designs, such as RISC and the 68K series had 3 to 4
hardware lines for interrupts and thus would not suffer interrupt
overruns as can happen in a vectored system. If this has not
been addressed in the chip set, it still represents a potentially
fatal flaw in the design when it is attempted to be used in a
priority interrupt scheduled OS that does a lot of context
switches. Our measurements showed that the x86 at the time
could not support the numbers of context switches that we
believed necessary to support UNIX and other priority interrupt
driven systems.
Posted by woo37830 (8 comments )
Not enough interrupt lines
For any processor with x hardwired interrupts there will be an application that requires x+1 discrete hardwired interrupts.

The simple solution to prevent a vectored solution from being corrupted is to put the vector table in ROM. If it cannot be altered, it cannot be corrupted. See the Commodore PET, B series & C series computer ROMs for an example of a vectored incorruptible interrupt. Of course some of the machines did allow the ROM to be replaced by RAM when the programmer had a need to revector the interrupts, but that was a conscious design decision.
Posted by Fritzr_gc (19 comments )
I'm supposed to put up with legacy hardware at the same time I'm being pressured to buy Vista (for example). Sounds like a salesman talking. Damn silly.
Posted by mpotter28 (130 comments )
Until we get away from 30-year-old technology, new chip designs that promise more speed, more performance, more anything will be marketing hype only. The last time I upgraded my computer to the latest, greatest Intel processor, I found there was no difference in speed at all.

It is time to junk the 30-year-old, cobweb-encrusted technology and move into the 21st century. But that is likely to happen on the day Bush gets an I.Q. above that of a dead rat.

Posted by Heebee Jeebies (632 comments )
Is this article about Big Oil?
We must wean ourselves from our foreign dependence...

The world has an addiction to _______.
Posted by redpop (2 comments )
Right now, I want to deliver the highest respects to our x86 engineers for focusing on backwards compatibility, scalability, and consolidation. Thinking ahead may take a lot of time, but it's obviously paying off.

Now, for what's ahead? I'm speechless... It sounds like we're in for a treat. Keep doing your part to overcome obstacles, whatever the cost.

...reducing power consumption at the same time; this is going to be better than I ever imagined.

- Jerry Merfeld

...Hurry up and get those multi-core laptops out. I don't even want to waste my time with this dual-core cr*p. :)
Posted by jerrymerfeld (13 comments )
aging x86 addiction causes end of world?
x86, in mainstream public use since the IBM PC, is only 25 years old, and x86-64 is only 3 years old. You ain't seen nuthin' yet! The next 5 years will see more changes than the last 100.

The question we should be asking ourselves in the industry is where we want the consumer computing industry to be in the next 25 years. Wouldn't it be great if we had an inexpensive handheld system that was 100% reliable (errors predicted and prevented, defects detected and automatically repaired); almost zero power consumption (instant on); a human-friendly interface (no more carpal tunnel or eye/neck/back strain); worldwide access to a free network; no illegal or offensive misuse of technology; and all information on the network free of bias and misinformation?

Can we get there with x86? I for one doubt it. Way too much legacy baggage. Was Itanium the answer? No, but there must be a way to get there if the industry works together instead of fighting for its precious fraction of a market share.

In the enterprise market, we have other challenges. What if the world was faced with a crisis that could only be solved with one enormous computer - We had to create a computing model of the earth and we had to accurately predict the polar shift that will happen in 2012 that will destroy 75% of the earth in 3-days? We had to create a model that would help us predict where the safest places on earth would be so we could survive, preserve the human species and knowledge for future generations; to prepare a rebuilding plan, develop a new system for producing food, housing and energy. Could we do that with the existing computing technology?

OK, so what if the world does not end because we stick with x86, but in 50 years from now, what will historians write about the early 21st century industry leaders? It might be something like this:
"They had the capability to develop the technology to help solve the world hunger and energy crisis we have today, but chose to continue their childish fighting over who had the biggest toy instead of creating the technology we needed to solve the world's problems that they knew were going to happen."

I have been doing computer engineering since 1976. The first Intel processor I worked with was a 2-bit-slice i4002 that the Texas company I worked for used in an 8-bit microcontroller for a disk drive that had a whopping 330KB of data storage on a 14-inch rigid platter, with a single-platter cabinet that weighed 100 pounds, called the Diablo DS300 if my memory serves me. My company sold it as part of a 16-bit micro-computing business platform with our own proprietary networked OS that cost about $100,000. I used to fly around the world providing engineering support. The division I worked for finally succumbed to the competition, mainly the PC, in 1992, and I embraced x86 for my career ever after, until I retired in Texas last year.

I will be in Beijing in 2 weeks doing some consulting. I will be looking for you C-net.
Posted by mc4golf (1 comment )
I appreciate your knowledge and history....
The reason why I believe we have created the next foundation
for all platforms, but the X86 first, is because the author of our
software has been creating brilliant code since the mid-to-late
1950s, when he wrote some of the first OSes used in the early
IBM machines; this was accomplished while working at RAND.

Then he went on to TI, Docutel, LEER, HUGHES, USAF, IBM, North
American Aviation and others.

I think it took a lifetime of experience of both systems, software
& hardware, to have the institutional knowledge to create a
platform that bridges the past with the present. This is the
difference in what he created. Your post speaks to that.

And he too spent almost all of his beginnings in Texas, then
went to LA during the Cold War years, and then back to Texas.
Interestingly, your post is very similar in attitude to my founder
and CTO.
Posted by thecatch (49 comments )
x86 and BIOS
From what I've gathered, the BIOS system found in most PCs is more of a legacy anchor than the x86 instruction set. Intel has been pushing EFI as a replacement, but Microsoft and/or OEMs (other than Apple) seem recalcitrant about adopting it, at least for the 32-bit versions of Windows. Someone should do an article on that.
Posted by xeroply (4 comments )
X86 dominance
X86 is here to stay!
Posted by pentium4forever (192 comments )
Chicken and Egg scenario
Software won't be updated to a new architecture unless there is sufficient hardware to justify it. Hardware won't be updated unless there is sufficient software to justify it.
Posted by g8crapachino (220 comments )
