May 25, 2007 12:39 PM PDT

Intel: Software needs to heed Moore's Law

SAN FRANCISCO--After years of delivering faster and faster chips that can easily boost the performance of most desktop software, Intel says the free ride is over.

Already, chipmakers like Intel and Advanced Micro Devices are delivering processors that have multiple brains, or cores, rather than single brains that run ever faster. The challenge is that most of today's software isn't built to handle that kind of advance.

"The software has to also start following Moore's law," Intel fellow Shekhar Borkar said, referring to the notion that chips offer roughly double the performance every 18 months to two years. "Software has to double the amount of parallelism that it can support every two years."

But it's a big challenge for the industry. Things are better on the server side, where machines are handling multiple simultaneous workloads. Desktop applications can learn something from the way supercomputers and servers have handled things, but another principle, Amdahl's Law, holds that there is only so much parallelism that programs can incorporate before they hit some inherently serial task.
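
To make the arithmetic concrete: Amdahl's Law says that if a fraction p of a program's work can be parallelized, the best possible speedup on n cores is 1 / ((1 - p) + p/n). A minimal sketch of that formula in C++ (our illustration, not Intel's or Amdahl's code):

```cpp
#include <cstdio>

// Amdahl's Law: best-case speedup on n cores when a fraction p
// of the program's work can run in parallel.
static double amdahl_speedup(double p, int n) {
    return 1.0 / ((1.0 - p) + p / n);
}

int main() {
    // Even a program that is 90% parallel falls well short of
    // linear scaling -- and can never exceed 10x, no matter how
    // many cores are added.
    std::printf("16 cores:  %.1fx\n", amdahl_speedup(0.90, 16));   // ~6.4x
    std::printf("256 cores: %.1fx\n", amdahl_speedup(0.90, 256));  // ~9.7x
    return 0;
}
```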

Speaking to a small group of reporters on Friday, Borkar said that there are other options. Applications can handle multiple distinct tasks, and systems can run multiple applications. Programs and systems can also both speculate on what tasks a user might want next and put processor performance to work that way. But what won't work is for the industry to just keep going with business as usual.

Microsoft has recently been sounding a similar warning. At last week's Windows Hardware Engineering Conference in Los Angeles, Chief Research and Strategy Officer Craig Mundie tried to spur the industry to start addressing the issue.

"We do now face the challenge of figuring out how to move, I'll say, the whole programming ecosystem of personal computing up to a new level where they can reliably construct large-scale applications that are distributed, highly concurrent, and able to utilize all this computing power," Mundie said in an interview there. "That is probably the single most disruptive thing that we will have done in the last 20 or 30 years."

Earlier this week, Microsoft's Ty Carlson said that the next version of Windows will have to be "fundamentally different" to handle the number of processing cores that will become standard on PCs. Vista, he said, is designed to handle multiple threads, but not the 16 or more that chips will soon be able to handle. And the applications world is even further behind.

"In 10 to 15 years' time we're going to have incredible computing power," Carlson said. "The challenge will be bringing that ecosystem up that knows how to write programs."

But Intel's Borkar said that Microsoft and other large software makers have known this shift is coming and have not moved fast enough.

"They talk; they talk a lot, but they are not doing much about it," he said in an interview following his discussion. "It's a big company (Microsoft) and so there is inertia."

He said that companies need to quickly adjust to the fact that they are not going to get the same kind of performance improvements they are used to without retooling the way they do things.

"This is a physical limit," he said, referring to the fact that core chip speed is not increasing.

Despite the concern, Borkar said he is confident that the industry can rise to the challenge. Competition, for one, will spur innovation.

"For every software (company) that doesn't buy this, there is another that will look at it as an opportunity," Borkar said.

He pointed to some areas where software has seen progress, such as in gaming. He also identified other areas that might be fruitful. In particular, specific tasks could have their own optimized languages; networking tasks, for example, could be handled by code written in a language optimized for networking.

Intel has also been releasing more of its own software tools aimed at harnessing multicore performance. Another of Intel's efforts is to work with universities to change the way programming is taught to focus more on parallelism; that way the next generation of developers will have such techniques in the forefront of their minds.

"You start with the universities," Borkar said. "Us old dogs, you cannot teach us new tricks."


29 comments

With all the DRM and encryption that is embedded inside Vista and other apps, it's no wonder that software is so bloated that it runs slow.
Posted by bobby_brady (765 comments)
That is part of the problem
That is part of the problem. There is so much DRM and encryption that some systems simply cannot handle it.

Now, I will be honest here: I run Vista on my parents' new notebook computer (which I have basically commandeered for my own use) because it came with the system.
It is no slower than XP and a lot faster in some areas. Some of that is because the chip in the notebook is about 100% faster than the one in my parents' old Media Center PC, but not all of it.

I also haven't seen any 'bloat' in the system thus far; everything that I don't want, except for a few core parts of Windows Vista, I can uninstall and have uninstalled.
Posted by Leria (585 comments)
Moore's Law does NOT exist
As a (true) journalist, you have a duty not to report false information. (Well, my opinion, actually...)

If you are, please stop mentioning "Moore's Law" because:
- it is not a law (as in physics)
- it is not a law (as in The Law)
- it has proven not to be accurate or effective, particularly these days

So, what is this so-called law?

Just an assumption from Gordon Moore about Intel's capability to improve its chip line. While it is true this assumption was SOMETIMES borne out, it is also true that reality had to be twisted from time to time to make it appear correct...

It is a shame that a CNET columnist is not better informed about this and in turn misinforms his readers.

Moore's Law? My @ss...

:(
Posted by masked dummy (3 comments)
Article is not misinformed
The article makes no attempt to pass off Moore's Law as scientific law. It uses terms such as "notion" or "principle" when referring to these adages, which is perfectly correct.

Do you get mad every time someone brings up Murphy's Law as well? It's commonplace for adages to be referred to as "laws." It's acceptable to do so. Get over it.
Posted by mkhecker (4 comments)
Moore's Law is the Name of a Trend
Moore's Law is the name given to the trend of computers doubling their capabilities every few years.

You're right, it is not a "real" law.

But it was never meant to be, any more than Murphy's Law ("anything that CAN go wrong WILL go wrong") is a "real" law.

It's more of an expression of an idea used in the English language. Perhaps if you study the language more you'll come to understand that.
Posted by Mergatroid Mania (8395 comments)
Good point
Should be Moore's Benchmark...
Posted by CapoNumen (19 comments)
As others have already said...
You totally took off on a "beside-the-point" tangent there. Not to mention, you're just inherently wrong in your argument. You're *inherently* wrong, because you don't understand what it means when someone says Moore's Law.

It is an estimate, at best. A speculative notion by Moore, and nothing more. Taken in this context, many people are charitable enough to allow that it has always been true. As long as technology is advancing, and advancing fast/well, Moore's Law holds true. (That's the charitable interpretation.)

Relax, sir, and enjoy the prosperity while it lasts.
Posted by ceosion (3 comments)
backwards
Processing power is rarely the limiting factor for anything people do on computers these days. It's much more about Internet bandwidth, security and trust, electricity consumption, user interface, etc.
Adding cores is very much a build-it-and-they-will-come mentality. We don't know if added processing power really will add much value in the future, i.e., if it will enable people to do new things on their computers that they otherwise would not be able to do. This has happened in the past, but history does not repeat itself.
There is plenty of innovation left in PCs, of course. A Wii-style interface would be great. But these innovations might not necessarily require more processing power.
Intel and co. don't have much choice, of course.
They have to keep selling more powerful chips or go bust. Good luck to them, I say, but I would never buy any of their stock.
Posted by karlengblom (22 comments)
Moore's Law Does Exist (In Software)
"Moore's Law" did not state that chip performance would double every two years; it stated that the number of transistors on a chip would double every two years. People have just interpreted that as doubling performance.
I for one think that the software giants have kept up with "Moore's Law" by doubling the lines of code every two years. I think that MS-DOS 1.0 had about 4,000 lines of code and Windows XP now has something like 40,000,000.
Posted by thinkermonkey (7 comments)
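
As a quick check of the commenter's own figures (the line counts are the commenter's, not verified): MS-DOS 1.0 shipped in 1981 and Windows XP in 2001, so growing from 4,000 lines to 40 million in those 20 years implies a doubling roughly every 18 months -- slightly faster than Moore's two-year clip. A sketch of the arithmetic:

```cpp
#include <cstdio>
#include <cmath>

int main() {
    double start = 4000.0;      // MS-DOS 1.0 (commenter's figure)
    double end   = 40000000.0;  // Windows XP (commenter's figure)
    double doublings = std::log2(end / start);  // ~13.3 doublings
    double years     = 2001 - 1981;             // release to release
    std::printf("%.1f doublings in %.0f years: one every %.1f years\n",
                doublings, years, years / doublings);  // ~1.5 years
    return 0;
}
```
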
Moore's Law
Not only has the OS kept up with Moore's Law, so have the memory and external storage requirements.

30 years ago, 4K of memory was considered huge - and external storage was often paper tape (which could be measured in bytes per inch).
Posted by Jim Harmon (329 comments)
Other requirements to better this technological movement
With the advent of high-end graphics chips, some of which are now being embedded, programmers would be wise to work with projects like CUDA to make better use of those processor types, and to use them to improve core processor management.
Another avenue that needs to develop is the sharing of certain code so as to improve overall performance for specialised applications.
Also, with the advent of smaller footprints there is, over time, more ability to specialise units to better perform certain applications.
One would be wise to take note of these developments, as doing so will allow, over the next 25 years or so, better adaptation to non-binary technologies, which look set to eventually enter the market fully.
Some tips for doing this: work more on an organic software model alongside other businesses.
Also, shift from proprietary business models to better sharing of resources.
Although you can hit math walls in terms of how far you can spread butterfly arrays for a lot of operations, there is becoming more to play with to improve your R&D.
Another piece of advice is to work with others so as to keep a close eye on non-digital/non-binary technologies, to make sure you know how useful these are to you as they come in, following a sensibly open research model.
Posted by wildchild_plasma_gyro (296 comments)
Parallelism has limits
If parallelism is the only tool in the box, is every problem then sped up by parallelism?
This is too simplistic for the real world.
Posted by CapoNumen (19 comments)
Shekhar Borkar has got it all wrong...
Software developers are perfectly capable of writing parallelized software. Everything my coworkers and I write runs on multi-CPU machines, and we take advantage of the multiple CPUs as well as multiple servers in a cluster. It's a myth that it's hard to write software in a parallel fashion. The software that can be parallelized, for the most part, already has been. For example, Eclipse compiles in the background and launches builds and tests in threads or separate processes, yada yada yada.

The problem is that most desktop applications are serial by their very nature, and the desktop chips built 8 years ago could handle their needs just fine (I am typing this from an 8-year-old PC). Intel would love it if developers would somehow rewrite all of our simple applications so that they required 8 cores just to get a good experience, but it's not going to happen.

Multiple CPUs and multiple cores will always have a place for power users who run servers or lots of background tasks. They will also come in handy in networked homes, since you could have one central server and multiple thin-terminals throughout the house.
Posted by sonofagunn (2 comments)
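
A minimal sketch of the task-level parallelism the commenter describes -- independent jobs, such as a background compile and a test run, each on its own thread. POSIX threads are used here, and the job names are hypothetical:

```cpp
#include <pthread.h>
#include <cstdio>

// Two independent jobs can occupy two cores with no clever
// parallel algorithm at all.
void* compile_job(void*) { std::printf("compiling...\n");     return 0; }
void* test_job(void*)    { std::printf("running tests...\n"); return 0; }

int main() {
    pthread_t t1, t2;
    pthread_create(&t1, 0, compile_job, 0);
    pthread_create(&t2, 0, test_job, 0);
    pthread_join(t1, 0);  // wait for both jobs to finish
    pthread_join(t2, 0);
    return 0;
}
```
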
Sloppy code
It looks to me like developers have been counting on ever-increasing horsepower to make up for their sloppy coding.

Just look at the amount of processing power and memory needed just to run MS Office on Vista at a decent speed. For Pete's sake, guys, how about some tight coding for a change?

I rebuilt an old Windows 98 PC for a friend this week and ran it for a while before applying patches, antivirus, antispyware, a firewall, and all the other crap that is needed to make any version of Windows marginally secure.

It's an old PIII with 96MB of RAM, and it booted faster than my 3GHz, 2GB RAM XP machine. Office 97 launched faster, as did email, etc. For the average home user, who wants word processing, spreadsheets, mail, and a browser, it has equal or better performance than my new Dell.

IMO, what they say is true:

"Intel giveth and Microsoft taketh away"
Posted by rcrusoe (1305 comments)
So true
"Intel giveth and Microsoft taketh away"

If you want to see fast boot times... load that PIII system with DOS 6.0.
Posted by Jim Harmon (329 comments)
Yes and No - depends on OS.
"[i]It looks to me like developers have been counting on ever
increasing horsepower to offset for their sloppy coding.[/i]"

Umm, that should say "Windows developers" for the most part,
followed somewhat by "Apple Developers", with "Linux
developers" way the hell back from that statement.

It's a matter of degrees:

* You have the .NET codemonkeys who like to call themselves programmers, but are often not much more than plug-n-play API jugglers, and the often fat and bloated results show it (Hello? Adobe? Symantec? I'm talkin' to YOU when I say that).

* On the Apple side of things, you have to do some actual innovation and optimization (*nix and C++ are nowhere near as forgiving), but Apple helps cushion the blow somewhat, and community pressure to open the source code isn't as large, so you can still hide your kludges.

* Linux? You either know what you're doing, or two things will happen - the code will run poorly, and you will get laughed at by all the real programmers if/when you show the source code, because it's all open to public inspection should you use an open source license. OSS is a great way to keep programmers on their toes and striving to be the best in their field. That, and there is no real cushion for bad/sloppy programmers. OTOH, the absolute best programming tools (Qt and GTK, ferinstance) can be found and are best supported there.

/P
Posted by Penguinisto (5042 comments)
Improved software performance
This is unlikely to happen before programming languages are capable of taking advantage of new hardware capabilities. (Who writes in machine code anymore?)

Then after that happens, the operating system also needs to be optimized for the hardware (since so much of the basic program operation relies on OS code).

Doesn't sound very likely any time soon.
Posted by Jim Harmon (329 comments)
The Real Software Problem vs. Intel's Whining
Shekhar Borkar said that "software has to double the amount of parallelism that it can support every two years."

This is so infuriating. That's not the problem with software. The nastiest problem in the computer industry is not speed but software unreliability. Unreliability imposes an upper limit on the complexity of our systems and keeps development costs high. As I've repeatedly mentioned on this blog, we could all be riding in self-driving vehicles (and prevent over 40,000 fatal accidents every year in the US alone) but concerns over safety, reliability and costs will not allow it. The old ways of doing things don't work so well anymore. We have been using the same approach to software/hardware construction for close to 150 years, ever since Lady Ada Lovelace wrote the first algorithm for Babbage's analytical engine.

The industry is ripe for a revolution. The market is screaming for it. And what the market wants, the market will get. It is time for a non-algorithmic, synchronous approach. That's what Project COSA is about. Intel would not be complaining about software not being up to par with their soon-to-be-obsolete CPUs (ahahaha...) if they would only get off their ***** and revolutionize the way we write software and provide revolutionary new CPUs for the new paradigm. Maybe AMD will get the message.
Posted by eightwings (32 comments)
blind men and the elephant?
While I believe we're all pointed in somewhat the same direction, this is the comment nearest to what I support, in that it recasts the problem.

If software doesn't drive hardware as stated here, a variation would be that we must create the ability to rapidly morph software (transform it) to fit the new hardware. The guy or company that does this will have a well-worn path to their door.
Posted by deeppow (3 comments)
Excuses Excuses
I quote from Intel's website itself: "Moore's Law states that the number of transistors on a chip doubles about every two years." Kind of hard for software to be involved with that.

Yes, changes to parallelism in software will also help speed things up. But this statement sounds like a company out of ideas, starting to make excuses (just keeping up with their Wintel brothers, I assume)!
Posted by godsoe (1 comment)
Way off base
No matter how well you can write code, you can never achieve the theoretical speedup of multi-core processors. If you can get a 3/4 speedup, you are doing very well. But you still have to have an algorithm that is inherently parallel.

There is too much shared memory and there are too many dependencies between the processors. Not to mention the fact that no software can run in parallel constantly. There are as many points in software that branch as there are points that merge and depend on prior results. When optimizing, assembly will not save you, just like parallelism will not save you.

Secondly, the problem with software today is that it is bloated. Software houses use the excuse that hardware is cheap to try and validate poor coding.

Hardware may be cheap, but if you can do the same thing on 30% less hardware and don't, you are part of the problem.

Operating Systems, especially but not exclusively from MS, should be able to do the same things but on less hardware. The same goes for Office Suites, video games, and almost everything.

It takes hard work and skill to optimize code, and many people simply do not have this skill. Also, ignorant people with MBAs have entirely too much control over this industry. Penny-pinchers are worrying about costs, even though this causes crappy software.

This problem is analogous to the security problems in software today. Something that isn't widely publicized outside of technical circles is that very few developers have any knowledge of security issues, much less how to solve them. Which is why we have the reactive approach to security, which always fails.

The optimization issue is another dirty secret of the programming field.
Posted by MSSlayer (1074 comments)
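
The inherently serial case the commenter points to is easy to exhibit: any loop whose next step consumes the previous result cannot be split across cores, no matter how the code is written. A small sketch (the recurrence is arbitrary, chosen only to illustrate the dependency):

```cpp
#include <cstdio>

// A loop-carried dependency: iteration i+1 needs the result of
// iteration i, so the chain is serial and extra cores cannot help.
int main() {
    double x = 0.5;
    for (int i = 0; i < 1000; ++i)
        x = 4.0 * x * (1.0 - x);  // each step consumes the last result
    std::printf("%f\n", x);
    return 0;
}
```
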
There is a law, and it's Amdahl's
Circa 1967; check Wikipedia.

fortune cookie: one cannot bake a cake faster by placing flour in oven before breaking egg.

'nuff said
Posted by spudboyzfromthehood (1 comment)
Run its course
Programming approaches that involve humans have run their course - constructing large multi-organisational applications that contain complex and interlinked end-to-end processes is simply beyond the current software development model. An application design in total can be interpreted through an algorithmic 'black box' and generated. Applications developed this way are far superior and cannot be matched by human programming teams. The IT industry will go through the same evolution as manufacturing - machines will replace humans and the value will be in the design capabilities.
Posted by Nexuszzz (6 comments)
A red herring ?
Talk about software lagging behind hardware advances is easy to criticize. That software extracts maximum juice from a single core is well known and has been well accepted for eons. But does the same happen with multiple processors/cores? Good question... or is it?

The world of hardware has been blessed with Moore's Law. At least so far. When they find they are reaching some limits there, pointing to the fact that parallel programming has not really taken off sounds suspiciously like a red herring.

Well, there are no well-accepted paradigms for parallel programming that the "ecosystem of software developers" would jump upon with glee, should such a hardware platform present itself. Nor is it likely that such a situation (in the software world) will arise overnight.

Well, the actual problem seems to be one of attitude - "Here's the hardware, now get it to work, and work in parallel" simply sounds stupid. Why should the concept of multi-core (or multiprocessor) be *the* concept going forward? Yes, Moore's Law is probably facing limits in the hardware world, but do we then accept horizontal growth as a compromise solution to limited vertical growth? Are there not other solutions possible?

Firstly, modularity (or, if you turn it around into the problem, bloatware). Why do we need a PC six times as powerful as the one of six years ago to do some of the same tasks we did six years ago? E.g., creating and editing Word documents. Not all of us always run programs that *really* need the extra horsepower that was absent in the six-year-old PC.
In other words, "modular systems" or "pay as you go" can be a solution - though I doubt its acceptance by the hardware guys, simply because we would then stop paying them annually. The same holds for the software guys. Yes, maintaining working systems breaks the business models of both the hardware and software industries. And so, the clarion call to get new software to work on new hardware so that both get some new money into their salary accounts.

Secondly, new concepts. That the software guys haven't come up with great concepts for parallelization is as valid a criticism as the fact that the hardware guys (or material scientists, or whatever they are called) have not come up with a means to break the limits to Moore's Law - or any other such limitation they have in the world of hardware. So, dear Intel, the pot is as black as the kettle.

Rounding up: throwing a multicore processor "over the wall" and asking the software guys to max it out is no solution. A genuinely fresh approach to computing is probably what we lack. Or it is something that is already cooking somewhere...

- vinay
Posted by vinay_rd (1 comment)
The Multi-core Revolution
Data volumes have been exploding for years, and the time windows given for processing that data are shrinking simultaneously. Business users are demanding more intelligent analytics; all of this is current news. Fried captures the core (pun intended) issue: software developers have to have a way to harness the power of these new chip architectures.

Our engineering group came across this concept several years ago. They started programming in anticipation of the hardware emerging today. We have shipped a multi-core-aware product for two years (Pervasive DataProfiler TM: http://ww2.pervasive.com/Integration/Products/Pages/PervasiveDataProfiler.aspx). I share this not as a marketing ploy, although any publicity is good publicity, but more to make the point that what Borkar says here is true; some companies will get it and some will not.

Driving the media frenzy is the fact that Intel and AMD are shipping multiple cores as standard offerings. It is difficult to find a PC today that does not ship with at least two cores. But the revolution is larger than that; we recently tested with an OEM that allowed us to run on a box with over 300 cores! This was their small offering. Again, not a marketing ploy but a point-maker: we do not have to wait for hundreds of cores, they are already here. And we do not have to wait for multi-core frameworks; they too are already here.

Bottom line: we at Pervasive are doing work in the field; we decided not to wait for the other software vendors. Our work is currently being made available for free in the form of a 100% Java framework called Pervasive DataRush TM. Our framework is aimed at data-intensive problems, using a dataflow methodology to leverage what we call hyper-parallelism. Feel free to check out the framework and other work at the Pervasive DataRush web site: PervasiveDataRush.com (http://www.pervasivedatarush.com)
Posted by bjacaruso (2 comments)
It is a nice attempt...
but it's not a complete package.

And you're talking about something along the lines of the Mitosis guessing game that Intel's Israel team is developing; but this is more of a waiting-and-assigning game. You don't want to have to wait in the process.

Having 300 cores running that can't overcome Amdahl's Law is a waste of raw power, because Amdahl's Law kicks in when the third core is added. Meaning, all additional cores only provide a 1.5% gain. Meaning you would have to break the flow every third core to overcome the law. And if you can't overcome the law, the gain is nice, but not out of this world.

I didn't give our name, and I won't, but we have everything. The language, the tools, the new needed file system and database manager, as well as the scheduler (critical to keep all cores active, all the time), and security is dramatically improved too. Because we are at the foundation, where all security problems start.

We have researched your offering from afar, and we are talking with Intel, but many can run with this type of product.
Posted by thecatch (49 comments)
How do AJAX-based apps fit into this picture?
This article focuses on traditional desktop applications.

How do AJAX-based apps fit into this picture?

As broadband networking advances, maybe our client computers will simply be cheap display devices. The compute-intensive operations will occur on liquid-cooled, massively parallel supercomputers.

Like the article said, servers are in a much better position to take advantage of parallelism. And servers can use exotic techniques to get the MIPS up, e.g., liquid cooling and exotic materials.
Posted by ahalsey (7 comments)
A new platform is on the way...
and yes, people do still write in Assembler; better yet, Assembler with a twist.

I'll keep it short, because last time I was here I was yelled at. Actually, we were talking about Virtualization vs. Parallelization; I was just trying to point out that True Parallel Processing will lessen the need for VMs.

And we too agree that it's kind of funny hearing the hardware gang blame the software gang for not keeping up. Maybe that WINTEL thing had something to do with keeping the dog down, discouraging new software advancements. 'Can we be involved, can we get any help with these instruction sets?'

Well, those days are gone; now they have come begging.

We first had to design for Windows too. Thank god for multiple cores and Microsoft's own Vista, the parallel-hungry OS.

Foundational Software will be the tag line, and it will aid all offerings, all applications.

And you can't teach old dogs new tricks, but in our case an old dog wrote the code. His lifetime of experience provided that institutional knowledge of not only software, but systems and hardware too.

That was the bridge that kept growing (hardware & systems development separated from software development), which got us into this mess in the first place.

Discrete memory and scheduling are key, but so was the working knowledge of the creator.

Again, I will not advertise a name here; never have and never will. But give us until January and you will be hearing about us. It is going to get fun. And one hint of a forecast to come: some laws will be proven wrong.

The Catch
Posted by thecatch (49 comments)
Microcrust is following Moore's Law
They are continuing to develop more demanding programs to use up the increased capability. Not better or faster, just more bungling and demanding! When Windows was bogged down, BeOS beat them hands down. Guess what, Microsoft bought them. They are still around; ever heard of them? Why innovate? Just dominate! That is their motto.
Posted by crasch48 (4 comments)
 
