OFFICIAL Nvidia GeForce 2 GTS Preview (Page 5/8)


Posted: April 26, 2000
Written by: Dan "Tweak Monkey" Kennedy

More Benchmarks - Athlon @ 824 MHz

We decided to use an Athlon 800+ MHz CPU to test the GeForce 2 GTS's scalability (or lack thereof), high-end performance, and of course, compatibility with AMD CPUs. Here are the rig's settings:

AMD Athlon @ 824 MHz (103 MHz x 8)
Asus K7-M motherboard
128 MB SDRAM
Nvidia 5.16 reference drivers
Vsync disabled for all tests

Once again, Quake3 was the only benchmark available at the time of testing that could give this card a real challenge and still provide accurate results, so demo 1 was run at two resolutions: 1024x768 and 1600x1200. Anything lower than these resolutions isn't even worth benchmarking with this card, because the frame rates are simply too high to mean anything.
Notes about the modes used to test (a rough console-settings sketch follows this list):
-"Normal" mode is Quake3's "Normal" preset as selected in the system settings. Only the resolution was adjusted, and color depth was set to 16-bit.
-"Very High 16-bit" is the result of setting the game to "High Quality", then turning texture quality and geometric detail all the way up. Texture color depth was left at 32-bit, but for this test, screen color depth was set to 16-bit.
-"Very High 32-bit" is the result of setting the game to "High Quality", then turning texture quality and geometric detail all the way up, with color depth left at 32-bit.



The difference between the GeForce and GeForce 2 GTS at 1024x768 is much more significant than it was with the Pentium II 450, and the frame rates are much higher. Even with the highest settings, the 16-bit color Very High mode still cranked out nearly 100 FPS. 32-bit color dropped substantially but still cruised along with no problems. The increases over the GeForce 256 in these tests were 25% in Normal, 28% in Very High 16-bit, and 20% in Very High 32-bit. Not as high as the fill rates might indicate, but not too shabby...



The performance at 1600x1200 is not as great as one might expect, only surpassing the Pentium II 450 benchmarks by a small margin. This may actually back up Nvidia's advertised "CPU independence": T&L really does pull a significant amount of work off the CPU, so a faster processor adds little once the card itself becomes the limit. Still, the increase over the GeForce 256 is nice: 62% in Normal, 44% in Very High 16-bit, and 26% in Very High 32-bit. The frame rate drops so sharply with 32-bit color going from even 1024x768 to 1280x1024 (yet still playable) that it is quite obvious there is simply not enough video memory to accommodate 32-bit color at the higher resolutions.

With this system, Unreal Tournament flew by at nearly 100 frames per second at 1600x900 with 16-bit color, yet dropped to way below 30 FPS with 32-bit rendering (and no texture compression). Once again: these cards need either texture compression or more memory if you plan to run 1600x900+ with 32-bit rendering!

