Intel's checkered past on some large projects means the chipmaker must prove that Larrabee isn't a development flub that will simply be kept on life support for the next few years.
As reported last week, Intel's Larrabee graphics chip was killed as an initial product offering after protracted delays, demonstrating that as successful as Intel is, it's not immune to major product missteps.
If certain product histories are any indication, the challenge could be daunting. Intel's XScale processor for small devices--which was used in Compaq handhelds back in 2000--was sold off to Marvell in 2006 after an unsuccessful run. And its Itanium processor has been the object of perennial ridicule as a product hanging on for dear life after getting off to a very rocky start back in 1998. Sun Microsystems' former CEO Scott McNealy eventually dubbed the chip the "Itanic," a play on the Titanic.
"If you go back in history when they started down the Itanium path in the mid-90s, they said they were going to have a really whiz-bang product, but by the time they finally got it out, it was decidedly ho-hum or even worse," said Nathan Brookwood, principal analyst at Insight 64.
"They've learned that you don't ship a product the first time around so that when it finally does appear people go 'What was all the fuss about?'" he said.
Larrabee, as we know now, was not ready for prime time. "It was for all intents [and] purposes an Intel project--a test bed, some might say a paper tiger," said Jon Peddie, president of Jon Peddie Research, writing in a blog.
For better or worse, Intel is expected to forge ahead with Larrabee, though a real product isn't expected to appear until 2011, according to Brookwood, who suggests that Intel should begin fixing things by replacing Larrabee's aging processor core with the more power-efficient Atom technology used today in Netbooks. And that's not all. "They need to revisit practically every major architectural decision in order to compete with Nvidia and ATI," he said. ATI is Advanced Micro Devices' graphics unit.
If and when Larrabee reemerges, graphics chip software developers will have become schooled in technologies like OpenCL and DirectCompute--which speed computing tasks on Apple and Windows-based computers, respectively--using Nvidia or ATI chips. That makes Intel's plan for developers to flock to its tried-and-true "x86" platform next year doubtful. "Now this scenario will play out differently," according to Brookwood, who wrote at length about the Larrabee issue here.
There are other, more esoteric consequences, too. The Larrabee setback complicates planning for a next-generation Intel chip architecture called "Haswell," slated for 2012, according to Brookwood's post. Haswell was expected to incorporate a derivative of the Larrabee design, but that now seems less likely, given that Larrabee will remain an unproven design for the foreseeable future.
But no one is counting Intel out, of course. The world's largest chipmaker is one of the most formidable competitors on the planet. Intel will, at the very least, incorporate Larrabee's "parallelism"--the ability to use its many processing cores to accelerate computing tasks--into future products. And Larrabee could yet emerge as a stand-alone whiz-bang product. "Intel invested a lot in Larrabee in dollars, time, reputation, dreams and ambitions, and exploration. None of that is lost. It doesn't vanish. Rather, that work provides the building blocks for the next phase," Peddie said.
Intel declined to comment for this story.