Some ideas are so obvious that when announced, one's first reaction is to ask, "Wait, weren't we already doing it that way?" Such is the case with laptops featuring both integrated and discrete graphics processors, and Nvidia's new system for juggling them, called Optimus.
Common sense dictates that laptops with both a discrete GPU and standard integrated graphics should be able to switch between the two at will. After all, why waste battery life powering a GeForce card when you're just surfing the Web or sending an e-mail? Many laptops currently offer this option, commonly called switchable graphics, including select MacBook Pro models and systems from Asus, Sony, and others.
Unfortunately, until now this required manually flipping a switch to turn the discrete GPU off or on. Sometimes this was a software switch, sometimes an actual physical switch on the laptop. This kludgey system presents several problems. First, you have to remember to activate the GPU before launching a game or other graphics-intensive task, then turn it off afterward (or risk killing your battery). Second, many mainstream users may not even know they have switchable graphics, and will simply leave the GPU permanently off or on, defeating the purpose altogether.
Some laptops label the two modes with vague names such as "high performance" or "better battery life," which don't spell out exactly what you're doing when you switch. Even worse, some laptops require you to close all your apps and log out to change modes (we're looking at you, MacBook Pro).
With Nvidia's new Optimus technology, that basic task of turning the discrete GPU on and off when appropriate has been automated. The company describes the process, saying, "Users can now experience the full performance benefits of a discrete GPU with the battery life of an integrated graphics solution. Nvidia Optimus automatically, instantaneously, and seamlessly optimizes the notebook to offer the best performance or best battery life depending on the application."
The concept is simple: the system uses its integrated graphics by default, and when an app launches that requires the discrete GPU, it seamlessly switches over, then turns the GPU off when it's no longer required.
This seems like such an obvious feature that one might wonder why we ever had to switch graphics modes manually in the first place. Nvidia says the problem was that the integrated and discrete graphics shared a multiplexer (or mux) connection to the display, which made switching on the fly impossible. The Optimus solution is to route the discrete GPU's output through the integrated graphics processor (IGP), so there's only a single point of connection between the graphics hardware and the display.
Our test system, an Asus UL50, included a test tool in the form of a pop-up window that indicated when the discrete GeForce G210M graphics were on or off. When launching a game or playing an HD video file, we could see the GPU indicator turn on, showing us that the system had switched into the more powerful, but more power-hungry, graphics mode. In real-world terms, that means we could get 70-plus frames per second in our Unreal Tournament III test at 1,366x768, while also getting nearly six hours of battery life in our video playback battery drain test.
The Optimus system will control its own switching by default, and the Nvidia drivers will maintain an updated list of preferred settings for apps and games. But you can also set your own preferences to override the defaults if needed, or set preferences for a program the Nvidia control panel doesn't recognize.
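For the technically curious, that precedence scheme can be sketched in a few lines of Python. To be clear, everything here (the function name, the app names, the profile data) is invented for illustration; Nvidia's actual driver logic is proprietary. The sketch only captures the behavior described above: a user override wins, then the driver's profile list applies, and integrated graphics is the fallback for anything unrecognized.

```python
# Hypothetical sketch of Optimus-style profile-based GPU selection.
# All names and data below are illustrative, not Nvidia's real profiles.

# Driver-maintained defaults: which apps warrant the discrete GPU.
DRIVER_PROFILES = {
    "ut3.exe": "discrete",       # e.g. a game that needs the GeForce GPU
    "browser.exe": "integrated", # light tasks stay on integrated graphics
}

# User preferences override the driver's list and cover unknown apps.
USER_OVERRIDES = {
    "myindiegame.exe": "discrete",
}

def pick_gpu(app_name: str) -> str:
    """Return which GPU an app should run on, defaulting to integrated."""
    if app_name in USER_OVERRIDES:  # an explicit user choice wins
        return USER_OVERRIDES[app_name]
    # Otherwise fall back to the driver's profile, then to integrated.
    return DRIVER_PROFILES.get(app_name, "integrated")
```

The key design point is the fallback order: unknown programs land on integrated graphics, which is why battery life, not performance, is the default state.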
In our time with an Optimus-powered laptop, the technology worked as advertised, which is to say you'd never notice it in action if you weren't looking for it. That's exactly the point, so we look forward to seeing it in upcoming laptops from several PC makers, although the Asus UL50, and a handful of other Asus models, are the only confirmed systems right now.