Posted: October 11, 1999
Written by: Dan "Tweak Monkey" Kennedy

Introduction

By now, nobody should be unfamiliar with NVIDIA or its products. In fact, long-time Tweak3D readers may remember that it was the original Riva 128 Tweak Guide that started this site. Throughout the last two years, we have brought you numerous TNT and TNT2 reviews, along with several articles pertaining to this company. At last, here is the latest: an official NVIDIA GeForce 256 preview.

Company Background

I know, you're already begging me to get to the specs and features of this new chip from NVIDIA, but first you should consider the company behind the product. Although NVIDIA was founded in 1993, it had a rather slow start and consequently did not make a huge impact on the 3D hardware world until the Riva 128 was released in 1997. This fine product was touted by many to be a powerful chip, and it did a great job compared to others in its price range, including 3dfx's much-loved Voodoo Graphics chipset. A year later, NVIDIA again shocked the world with the incredible TNT chipset. TNT, or TwiN Texel, offered advances that few had ever seen: it combined wonderful visual quality with an amazing feature set, as well as great performance. The only thing that killed the TNT chipset was the TNT2, released a year later, which basically stretched the TNT design to offer even better performance. Even with NVIDIA's recent success, the company is rather small, with a staff of 300 hand-selected individuals.

What's in a Name?

If you've read the speculation (rumors, unofficial previews, etc.), you were probably surprised to see the name of this product. So, for future reference, keep in mind that NVIDIA's new chipset is called GeForce 256. Also keep in mind that this is pronounced JEE-Force, not G-E-Force or GEE-Force. The name was derived from two terms: one being "g-force", and the other being "geometry". The 256 added to the end of the name represents the 256-bit architecture that I will explain later. As for the name NV10? Well, that's just the common style of naming an NVIDIA product. For example, the first product NVIDIA made was the NV1. As NVIDIA's co-founder, president, and CEO Jen-Hsun Huang said, "It sucked." Riva 128 was the project known as NV3, Riva 128ZX was NV3.5, and so on. Now, on to more important topics...

What the Industry Needs

To be honest, I thought the computer graphics industry did not need much improvement. Few people would complain about the state of computer graphics and the impact the industry has made on modern games. Thanks to higher resolutions, frame rates, rendering techniques, and filtering, games are looking more and more like real life, or better. Nearly perfect, realistic worlds are being created for our imaginations, and for game developers to indulge in. However, as I learned upon analyzing the industry carefully, some things are still lacking. The most important is that only those with the best and most expensive computers can enjoy these luxuries. 32-bit color, high resolutions, and high frame rates are only myths to most people, who have to play Quake3 at 640x480x16bpp just to maintain a consistent frame rate. And although the graphics in most of these games are good, they lack realism. Detail aside, lighting is the area that needs the most work. Luckily, lighting techniques have improved vastly over the last few years, but once again, it's a luxury only a few gamers can enjoy.
Will the NVIDIA GeForce 256 fix these issues? Read on to find out, as I explore the features and unique aspects of this chip.

Update - BENCHMARKS!

Finally, the NDA has been lifted and I can post benchmarks of the GeForce 256. And guess what? My board is DDR, not SDR! So, why didn't I just write a whole review? Well, first of all, most of the information I would include is already here... why write another massive 10-page article? Also, I don't consider this a final product yet, and therefore it shouldn't be reviewed. Sure, it may be stable, but I want to wait until more games can utilize the card before I write a review.

Test System
Quake 2 - Default Settings
Q3 Test - "Normal" Setting
Q3 Test - "High Quality" Setting
Doesn't something seem wrong here? The 1280x960 and 1600x1200 High Quality benchmarks are extremely low, and the pattern doesn't fit. The culprit? I believe the problem is memory or bandwidth. I played with 3D ExerciZer and set the texture memory to 46.8 MB in a scene; the frame rate hovered around 60 FPS. When I upped the texture memory to an even 48 MB, the frame rate dropped to 3 FPS. Yes, 3 FPS! WTF!???

Want to comment on this? Please do so in the message board; I'm interested to read what people think. Hopefully we'll have this all figured out soon. =)
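For anyone curious how a scene racks up that much texture memory, here's a rough back-of-the-envelope sketch. It's only an illustration (the texture counts, sizes, and 32-bit color depth are my assumptions, not values pulled from the board or from 3D ExerciZer), but it shows how quickly mipmapped textures add up toward the ~48 MB mark where the frame rate fell off a cliff:

```c
#include <stdio.h>

/* Rough texture-memory estimate: a full mipmap chain adds roughly 1/3 on
 * top of the base level (1 + 1/4 + 1/16 + ... ~= 4/3). Everything below is
 * an illustrative assumption, not data reported by the card or the driver. */
static double texture_mb(int width, int height, int bytes_per_texel)
{
    double base_bytes = (double)width * height * bytes_per_texel;
    return base_bytes * 4.0 / 3.0 / (1024.0 * 1024.0);
}

int main(void)
{
    /* Hypothetical scene: 100 textures at 256x256 and 10 at 512x512,
     * all 32-bit (4 bytes per texel). */
    double used_mb = 100 * texture_mb(256, 256, 4)
                   +  10 * texture_mb(512, 512, 4);

    /* ~48 MB is where the 3D ExerciZer test fell from 60 FPS to 3 FPS. */
    const double cliff_mb = 48.0;

    printf("Estimated texture memory: %.1f MB (cliff observed near %.0f MB)\n",
           used_mb, cliff_mb);
    return 0;
}
```

With those made-up numbers the scene lands at roughly 46.7 MB, right around where my 46.8 MB test still ran smoothly, which is why I suspect something memory- or bandwidth-related kicks in just past that point.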
Q3 Test - Very High Quality

In addition to "High Quality", the following settings were used to simulate "very high quality":

When you compare results, it looks like lower resolutions take a more significant performance hit from the extra detail.

Comments

I have been drooling over the GeForce 256 since I first saw it in action at NVIDIA's HQ. These benchmarks prove that the card has potential... for sure. First, consider that the drivers I used are not totally optimized. Remember the good ol' Riva 128 days? Even after the ICD went non-beta, performance almost always increased when new drivers were released. The same went for TNT/TNT2 cards; NVIDIA is good at tweaking drivers. Next, consider that the GeForce 256 has barely even been tapped (no pun intended) by developers. In the future, more and more games will boost performance with T&L and the other features that make the GeForce 256 stand out as one of the coolest video cards to date.

Overclocking

The 3.48 reference drivers (and hopefully all future GeForce 256 drivers) include the option to enable/disable VSYNC. And better yet, they include an overclocking utility. YES!!! Finally! The core speed is a mere 120 MHz, which is fine considering the card renders 4 pixels per clock. But come on, we're all tweak monkeys at some time or another, and we're all going to try to overclock a GeForce 256. So... how does it do? How about the sweet DDR SGRAM that calculates to 300 MHz? I will have information and overclocking results online later this week.
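To put those clock speeds in perspective, here's the quick math behind them. The 120 MHz core and 4-pixels-per-clock figures come straight from the card; the 128-bit memory bus width is my assumption for the reference design, so treat the bandwidth number as a ballpark:

```c
#include <stdio.h>

int main(void)
{
    /* Core: 120 MHz with 4 pixel pipelines -> theoretical fill rate. */
    double core_mhz = 120.0;
    double pixels_per_clock = 4.0;
    double fill_mpixels = core_mhz * pixels_per_clock;            /* 480 Mpixels/s */

    /* Memory: DDR SGRAM at 300 MHz effective, assumed 128-bit (16-byte) bus. */
    double mem_mhz_effective = 300.0;
    double bus_bytes = 128.0 / 8.0;
    double bandwidth_gbs = mem_mhz_effective * 1e6 * bus_bytes / 1e9;  /* ~4.8 GB/s */

    printf("Theoretical fill rate: %.0f Mpixels/s\n", fill_mpixels);
    printf("Theoretical bandwidth: %.1f GB/s\n", bandwidth_gbs);
    return 0;
}
```

Those are theoretical peaks, of course, and any overclocking gains would scale them roughly linearly with the core and memory clocks.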
Stay tuned for more GeForce 256 information and benchmarks in the next few days. Continue reading my GeForce 256 Preview by clicking the Next Page link below.