It's a question we all face: with chips getting more processing cores instead of more gigahertz, is your next computer going to actually run your software faster?
Microsoft is one of the companies that feels the pressure most acutely when it comes to putting those cores to work. Though it doesn't pretend to have the problem licked, Microsoft does believe Windows 7 provides a better foundation for using multicore systems than earlier versions of the operating system.
One key part of solving the PC's multicore problems draws from the world of big iron, and Windows 7 can support much bigger iron--servers with as many as 256 processor cores compared with 64 for its predecessor. Now a few years into the multicore era, even today's laptops are able to juggle as many tasks as reasonably powerful servers from just a few years ago. Intel's new Core i7 "Clarksfield" processor for mobile computers has four cores that manage a total of eight separate "threads" of work.
"One dimension is support for a much larger number of processors and getting good linear scaling on that change from 64 to 256 processors," said Jon DeVaan, senior vice president of Microsoft's Windows Core Operating System Division. "There's all kinds of depth in that change."
Linear scaling means that doubling the number of processors doubles performance--something rarely achieved in real-world computing. But what does 256 or even 64 processors have to do with a PC with four or eight cores? In short, updating the Windows plumbing to support bigger servers also helps work run more smoothly on smaller multicore machines, for example by ensuring data cached in memory is close at hand to the processor core that needs it, DeVaan said.
It's crucial that Microsoft help solve multicore issues. The company is responsible not just for the most widely used personal-computer operating system but also for the programming tools many use to create the software that runs on it. That's why another broad attempt to ease multicore pains takes place within Visual Studio 2010, the upcoming version of Microsoft's programming tools.
"People have been working on this for a long time. So far there haven't been any magic bullets," DeVaan said. "The commercial reality is creating a lot more urgency now, so I think we'll see a lot more approaches taken."
Unlocking multicore power is a point of competition, too: Apple's newest version of Mac OS X, Snow Leopard, adds a facility called Grand Central Dispatch to centralize management of all the various threads of programs as they run on a system.
Intel and Advanced Micro Devices bear responsibility, too, since they embraced multicore designs once heat problems put an end to the clock-frequency race, but Microsoft has much more clout in developer relations.
Multicore designs help most easily when people are running many separate programs, or when a program is "embarrassingly parallel"--in other words, when a task has many naturally independent subtasks, such as rendering each of a video's many frames. But many programs, built today as a single sequence of steps, won't easily make the jump to a parallel design.
"An operating system is never going to be able to take an application that isn't already parallel and make it so. Developers still need to multi-thread their apps," said Evans Data analyst Janel Garvin.
Visual Studio 2010
So it's good Microsoft is working on parallel programming aids within Visual Studio.
"Microsoft has done surprisingly little until recently to help developers write parallel applications, except for their alliance with Intel to promote Parallel Studio," an Intel collection of programming tools for parallel programming, Garvin said. "However, in the last year they've made some announcements and promises for Visual Studio 2010 about enhanced tools for parallel programming. It's likely that the success of Parallel Studio has impressed upon them the importance of providing Windows developers with the tools they need to remain competitive going into the future when manycore will be the standard."
Eventually, programmers will have to embrace parallel programming to be competitive, Garvin said. Parallel Studio helped bring the concepts to a much more mainstream audience, she said, and Evans Data's spring 2009 global developer survey found 40 percent of programmers are working on multithreaded applications today and another 15 percent plan to in the next year.
"Parallel programming is complex, difficult, and labor-intensive for even the most skilled developers, which has led developers to avoid writing parallel programs, leaving many CPU cycles unused," according to Steve Teixeira, Microsoft's principal product unit manager of parallel computing. The company's attempt to improve the situation comes not just in Visual Studio 2010 but also in another future product, version 4 of the company's .Net Framework.
Parallel programming tools
Among the new features:
The Task Parallel Library, which lets .Net programmers write parallel code in familiar terms. For example, programmers are used to "for loops" that repeat a particular task a specific number of times; the library lets the steps of the loop happen simultaneously instead of sequentially.
The Asynchronous Agents Library permits separate threads of execution to pass messages among themselves. That's useful in cases where threads need to head off problems such as race conditions, in which two threads try to change the same data at the same time.
Parallel Language Integrated Query (PLINQ) technology lets programmers perform some operations with data in parallel rather than sequentially.
The Parallel Pattern Library is designed to make parallel programming easier for those using the C++ language.
Microsoft knows none of this is truly easy, though. DeVaan wonders about cases when existing software is being parallelized--is each step in a parallel for loop really independent of the others? He sees "a lot of hand-waving" around the computing industry that glosses over the true difficulties.
"As an industry, we're going to be working hard to make it work better and working with a broad set of developers to target (multicore programming) without undue work," DeVaan said. "Will these approaches really accomplish it? That's an open question."