GeForce 3 - Eye Candy Aplenty posted 4/1 at 10:16:35 PM PST by Mason McCuskey
For me, one of the highlights of the GDC this year was attending a press briefing for NVIDIA's GeForce 3 video card.
To everyone who believes the GeForce 3 is too good to actually exist: I have seen it. I have witnessed the glory of vertex and pixel shaders running in hardware. I have felt the power of true, Z-correct bump mapping and Phong lighting. I've gotta tell you, seeing a demo powered by the GeForce 3 triggers a lot of deep, powerful questions inside you. Questions like "If I grab that box now and make a break for it, can I make it out of the Expo before the GDC / NVIDIA police gun me down?"

Before I saw the GeForce 3, I was a little concerned about the hype level. PC Gamer basically said that this card walked on water. Rumor has it one guy at the GDC noticed that Microsoft was giving out a small number of the cards; all you had to do was collect some stamps in your DirectX passport, then go to the prize counter and claim your just deserts. This guy got the stamps, then camped out in front of the Expo doors. He didn't get a card - someone beat him to it when the doors opened.

After seeing the GeForce 3, however, I have no hesitation in saying that the card really is worth trampling your fellow game developers on your way to the prize counter. The GeForce 3, with its nfiniteFX engine, stands poised and ready to take 3D gaming to the next level. Vertex and pixel shaders represent the dawn of a new era; anyone who tells you it's just hype obviously hasn't seen the card in action.

Dave and I watched the NVIDIA guys put the card through its paces in several demos. First, we saw a chameleon walking along a stick - a great example of the GeForce 3's pixel shader capabilities. Then we saw a dinosaur transform, followed by several of the vertex and pixel shader samples you've probably already seen and downloaded the code for.

But the card isn't just about vertex and pixel shaders. Some less publicized features include the Lightspeed Memory Architecture, which helps the card achieve its stunning 7.36 GB/second memory bandwidth figure; HRAA (High-Resolution Antialiasing); dual cube environment mapping; true-color hardware cursor support; 2D quad buffering; and HDTV/DVD playback.

I'll end by humiliating myself with the following incredibly stupid but irresistible pun: the GeForce 3 is a (G-)force to be reckoned with.

Mason McCuskey
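P.S. For anyone wondering what all the shader fuss actually boils down to, here's a rough CPU-side sketch of the kind of per-vertex math a minimal vertex shader performs: transform the position by a 4x4 matrix and compute an N.L diffuse lighting term. To be clear, this is not NVIDIA's API or actual DirectX 8 shader code - the types and names here (Vec3, Vec4, transform) are made up purely for illustration - it's just the underlying math, in plain C++.

// Rough sketch of what a minimal vertex shader computes per vertex:
// a position transform plus an N.L diffuse term. Not NVIDIA's API or
// DirectX 8 code; all names here are invented for illustration.
#include <algorithm>
#include <cstdio>

struct Vec3 { float x, y, z; };
struct Vec4 { float x, y, z, w; };

// Row-major 4x4 matrix times a 4-component vector.
static Vec4 transform(const float m[4][4], const Vec4& v) {
    Vec4 r;
    r.x = m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3]*v.w;
    r.y = m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3]*v.w;
    r.z = m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z + m[2][3]*v.w;
    r.w = m[3][0]*v.x + m[3][1]*v.y + m[3][2]*v.z + m[3][3]*v.w;
    return r;
}

static float dot(const Vec3& a, const Vec3& b) {
    return a.x*b.x + a.y*b.y + a.z*b.z;
}

int main() {
    // Trivial inputs so the output is easy to check by hand:
    // an identity "world-view-projection" matrix, and a normal
    // pointing straight at the light.
    const float wvp[4][4] = {
        {1, 0, 0, 0}, {0, 1, 0, 0}, {0, 0, 1, 0}, {0, 0, 0, 1},
    };
    Vec4 position = { 1.0f, 2.0f, 3.0f, 1.0f };
    Vec3 normal   = { 0.0f, 0.0f, 1.0f };
    Vec3 lightDir = { 0.0f, 0.0f, 1.0f };  // unit vector toward the light

    // The two operations a minimal vertex shader performs:
    Vec4 clipPos  = transform(wvp, position);               // position transform
    float diffuse = std::max(0.0f, dot(normal, lightDir));  // N.L lighting, clamped

    std::printf("clip pos = (%g, %g, %g, %g), diffuse = %g\n",
                clipPos.x, clipPos.y, clipPos.z, clipPos.w, diffuse);
    return 0;
}

Build it with any C++ compiler and it prints the transformed position and the diffuse term. The point of the GeForce 3 is that its nfiniteFX engine runs this sort of program - and far fancier ones - in hardware, for every vertex in the scene, instead of making your CPU do it.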