Back in mid-August, Intel's Nick Knupffer made a promise to casual 3D gamers by suggesting that a new driver update would give systems that use the Intel G965 integrated graphics chipset a much-needed performance boost. I was highly skeptical, but I wanted to put the claim to the test.
To test Intel's claims, I chose an Acer TravelMate 4720-6727 laptop, which uses a 2GHz Core 2 Duo T7300 CPU, 1GB of RAM, and an integrated 965GM Express graphics chipset with 384MB of memory allocated, and runs Windows XP Professional SP2. While this hardware combination is not the fastest available, I feel it comes close to representing a typical configuration for a midrange laptop with integrated graphics. I tested the laptop with CNET Labs' regular suite of 3D gaming benchmarks, using the originally installed driver (14.29) as well as the updated version (14.31.1).
First up on the list o' games was Quake 4. At a resolution of 640x480, using the low-quality video settings, and with vertical sync and antialiasing (AA) both turned off, the result was an average of 20.2fps with the updated driver. With the video quality changed to high and AA set to 4x, the average framerate decreased by only 2 percent. I continued to up the ante to 1,024x768 at high quality with AA set to 8x, ending at 12.2fps. Compared with the older (14.29) driver, I consistently saw only a 3 percent to 3.5 percent performance increase--which actually falls within our acceptable margin of error.
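For readers curious how that percentage comparison works, here's a minimal sketch of the arithmetic behind a driver-to-driver framerate comparison. The fps values below are illustrative stand-ins, not CNET Labs' raw data, and the helper function is hypothetical.

```python
def percent_change(old_fps, new_fps):
    """Percent change from the old driver's average fps to the new driver's."""
    return (new_fps - old_fps) / old_fps * 100.0

# Illustrative numbers: 12.2fps is the new-driver result from the
# 1,024x768 high-quality run; 11.8fps for the old driver is assumed
# purely to demonstrate a ~3.4 percent gain.
old = 11.8  # driver 14.29 (hypothetical value)
new = 12.2  # driver 14.31.1

print(f"{percent_change(old, new):.1f}% change")  # prints "3.4% change"
```

A gain that small is easily swallowed by run-to-run variance, which is why it falls within the margin of error rather than counting as a real improvement.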
Next up was F.E.A.R. At 640x480, using fairly conservative performance settings but with Enable Shadows turned on, it reached 22fps. By turning off the Enable Shadows feature, however, it achieved a more respectable 43fps. F.E.A.R. was actually the only game I tested that benefited dramatically from the driver update, which nearly doubled its framerate.
To throw something a little more current (at least as of last year, anyway) into the mix, I tried Company of Heroes, using settings specifically suggested by Intel for an "optimal experience." Following Intel's suggestions, it reached only 12.1fps. I then dropped the Physics, Terrain Detail, and Effect Fidelity settings to Medium and generated 13.3fps. The only way I was able to reach 20fps or more was by dropping all the quality settings as low as they go, except for Shader Quality and Model Quality, which I left set to High.
The minimum acceptable framerate for PC gamers falls into the 30 to 40fps range--but only when you're forced to trade image quality for performance. Otherwise, the typical preferred minimum framerate is 60 to 70fps. I didn't hit framerates close to either of these ranges, with the exception of F.E.A.R., which is the least taxing of the games I used for testing. In addition to benchmarking the above titles, I also played a few other games, such as Doom 3. Consistently, in both the benchmarks and anecdotal gaming, I found that I had to make too many sacrifices to image quality just to approach acceptable framerates. Even then, most of the time those sacrifices still weren't enough to get there.
It appears that my skepticism was well-founded, as Intel did not deliver on its promise. CNET Labs has yet to see an integrated graphics solution from Intel that is robust enough to meet the expectations of even the most casual 3D gamer. To think that a mere driver update would provide the necessary boost would be naive. But don't give up on Intel quite yet; its recent acquisition of Havok indicates that Intel is willing to take gaming seriously, and leaves the door open for what the future may bring.