NVIDIA's GeForce3: "NV20" Revealed
by Anand Lal Shimpi on February 27, 2001 9:16 AM EST
Finally a true GPU
When the GeForce 256 was announced, NVIDIA touted it as the world's first Graphics Processing Unit (GPU). We hesitated to call it a GPU simply because doing so drew a parallel between the GeForce 256 and a CPU that we were not willing to make. CPUs like the AMD Athlon and Intel Pentium III were much more useful than NVIDIA's first GPU simply because they were truly programmable entities. They had an instruction set (x86), and programmers could write code to harness the CPU's power in whatever manner they saw fit.
The GPU, however, was not as advanced a chip. Developers were stuck with the feature set NVIDIA's engineering team implemented and could not control the chip in the same manner in which x86 programmers can manipulate the Athlon or the Pentium III. NVIDIA introduced a whole new list of hardware-supported features with the GeForce2 GTS and its shading rasterizer; however, if a developer's desired effect didn't match a function exactly as it was provided for in hardware, that effect was useless in the eyes of the GPU. There was a severe lack of flexibility in this revolutionary GPU.
One of NVIDIA's most highly touted features was the hardware transform and lighting (T&L) engine, designed to offload transform and lighting calculations from the host CPU to the GPU in an attempt to increase overall performance. Unfortunately, even today very few games can take advantage of this powerful T&L engine; Quake III Arena can make use of the GPU's hardware-accelerated transformation engine, but its own lighting engine renders the GPU's other function useless. And that was the best case: games like Unreal Tournament could not take advantage of even the GPU's transformation engine. With all due respect to the engineers at NVIDIA, they are not game developers, and there is no way they could know the best way to implement a lot of these features in hardware so that everyone is happy.
The solution to this ongoing problem was to take a page from the books of desktop CPU manufacturers. If developers are constantly asking for features to be implemented in hardware, why not place the burden on them and let them write the code necessary to manipulate the hardware NVIDIA manufactures?
The GeForce3 is thus the first true GPU from NVIDIA. It still features the same T&L engine as before; however, now it is fully programmable. The GeForce3 features what NVIDIA calls the nfiniteFX engine, made up of the hardware T&L unit we are used to plus an nfiniteFX Vertex processor and an nfiniteFX Pixel processor. This is where the bulk of the GPU's transistor count comes from.
The instruction set the GeForce3 understands is known as the Vertex Shader Instruction Set. This is the equivalent of the x86 instruction set in the PC processor world, albeit tailored specifically to the needs of the GeForce3.
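To give a feel for what code targeting this instruction set looks like, below is a minimal sketch of a DirectX 8-style vertex program held as a string in C++, the way a developer might store it before assembling it and handing it to the driver. The register usage here (v0 for position, v5 for diffuse color, c0-c3 for the transform matrix) is just an assumed convention for this example, not anything mandated by the hardware.

```cpp
// A minimal sketch of a DirectX 8-style vertex program (vs.1.1).
// Register usage is an assumed convention for illustration:
// v0 = vertex position, v5 = diffuse color, c0-c3 = transform matrix.
const char* kBasicVertexProgram =
    "vs.1.1\n"
    // Transform the position: each dp4 is a four-component dot product
    // of the input vertex against one row of the matrix in c0-c3.
    "dp4 oPos.x, v0, c0\n"
    "dp4 oPos.y, v0, c1\n"
    "dp4 oPos.z, v0, c2\n"
    "dp4 oPos.w, v0, c3\n"
    // Pass the per-vertex diffuse color through untouched.
    "mov oD0, v5\n";
```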
The Vertex processor handles the initial geometry and setup math necessary to produce the 3D scene being rendered. When you're dealing with polygons (and obviously their vertices, hence the name), the Vertex processor is your one-stop shop for all your 3D calculation needs. The transformation and part of the lighting stages occur at this point in the rendering process, as the scene is being set up. The polygons are transformed from their mathematical representation stored in memory into the 3D counterparts that will shortly be rendered to the frame buffer and dumped onto the screen.
Operations such as vertex lighting, morphing, keyframe animation and vertex skinning are all functions that developers can implement in custom programs run by the Vertex processor. You've already heard of many of these operations from our ATI Radeon review, except that now, instead of being limited to what the hardware engineers decided should occur when a vertex skinning operation is initiated, the developer is free to control the outcome on their own.
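To make this concrete, here's a plain C++ sketch (our own illustrative code, with hypothetical names throughout) of the arithmetic behind one of these operations, two-bone vertex skinning: the vertex is transformed by both bones' matrices and the two results are blended by a per-vertex weight. On the GeForce3 this math would live inside a vertex program; the logic is the same.

```cpp
// Sketch of two-bone vertex skinning in plain C++.
struct Vec3 { float x, y, z; };
struct Mat4 { float m[4][4]; };  // row-major; last column holds translation

// Transform a point by a matrix (w assumed to be 1).
static Vec3 transform(const Mat4& M, const Vec3& p) {
    return { M.m[0][0]*p.x + M.m[0][1]*p.y + M.m[0][2]*p.z + M.m[0][3],
             M.m[1][0]*p.x + M.m[1][1]*p.y + M.m[1][2]*p.z + M.m[1][3],
             M.m[2][0]*p.x + M.m[2][1]*p.y + M.m[2][2]*p.z + M.m[2][3] };
}

// Blend the vertex between the two bone transforms by weight0, so a
// joint (an elbow, a knuckle) bends smoothly instead of cracking apart.
Vec3 skinVertex(const Vec3& p, const Mat4& bone0, const Mat4& bone1,
                float weight0) {
    Vec3 p0 = transform(bone0, p);
    Vec3 p1 = transform(bone1, p);
    float w1 = 1.0f - weight0;
    return { p0.x*weight0 + p1.x*w1,
             p0.y*weight0 + p1.y*w1,
             p0.z*weight0 + p1.z*w1 };
}
```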
Examples of what the Vertex processor is able to produce include things such as facial expressions. Gone are the days when a developer uses a single blocky texture to represent a hand; now things like individual fingers moving across a keyboard can be easily represented. Using a combination of polygons and programmable lighting effects, some very realistic models and actions can be produced.
An example of vertex skinning from the ATI Radeon card
The GeForce3 also supports hardware acceleration of curved and other higher-order surfaces, which can be useful since it's easier to represent an arc with a quadratic function than with a bunch of small triangles. Describing a surface with a handful of control points rather than many polygons also saves bandwidth over the AGP bus.
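As a sketch of why this saves bandwidth (again, our own example, not NVIDIA's implementation): a quadratic curve is fully described by three control points, and the chip can then generate as many points along the arc as the desired smoothness requires, rather than the host shipping every tiny triangle across the bus.

```cpp
// Only the three control points of the curve need to cross the AGP
// bus; points along the arc can be generated on the fly.
struct Vec3 { float x, y, z; };

// Evaluate a quadratic Bezier curve at parameter t in [0, 1].
Vec3 quadraticBezier(const Vec3& p0, const Vec3& p1, const Vec3& p2,
                     float t) {
    float u = 1.0f - t;
    float a = u * u;          // weight of the first control point
    float b = 2.0f * u * t;   // weight of the middle control point
    float c = t * t;          // weight of the last control point
    return { a*p0.x + b*p1.x + c*p2.x,
             a*p0.y + b*p1.y + c*p2.y,
             a*p0.z + b*p1.z + c*p2.z };
}
```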
The next step in the process is the finalization of the lighting process and the actual rendering of the pixels present in the scene for final display. This is obviously handled by the Pixel processor. One of the basic concepts of 3D acceleration is the idea of using textures. In the early days of 3D gaming a character could be represented by a few polygons covered with two-dimensional textures. The textures would store only a few bits of data, mainly pertaining to the color of the texture.
With the advent of the technology behind the GeForce3's GPU, the texture becomes far more useful. Instead of holding just color values, textures can now hold variables, direction vectors and lighting vectors; all of these values can actually be encoded into the texture.
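A sketch of the encoding trick (the exact texture formats are the hardware's business; the names and mapping here are ours): each component of a unit direction vector is squeezed from the range [-1, 1] into an 8-bit color channel, so a texture can carry per-pixel vectors in the same storage it used for color.

```cpp
// Pack a unit direction vector into a texel by remapping each
// [-1, 1] component into the 8-bit range [0, 255].
struct Vec3  { float x, y, z; };
struct Texel { unsigned char r, g, b; };

Texel encodeVector(const Vec3& n) {
    auto pack = [](float c) {
        return static_cast<unsigned char>((c * 0.5f + 0.5f) * 255.0f);
    };
    return { pack(n.x), pack(n.y), pack(n.z) };
}
```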
What's the benefit of this? After the Vertex processor has set up the polygons in a scene and the Pixel processor comes in to apply textures, the textures not only tell the Pixel processor what color they will be, but also how they will react if certain events occur. Using the direction and lighting vectors encoded for a pixel, pixels can now realistically reflect light through a dot product calculation. And the manner in which a pixel reflects light will change according to the direction vector of the light source.
Imagine a wrinkle on a character's face. That wrinkle now contains much more information than simply its color and general orientation. If a light source shines upon the wrinkle, it will now cast a realistic shadow through a series of vector multiplication operations. Through a series of pixel shader programs, developers can now make their games completely unique. Doom 3 will look completely different from Duke Nukem Forever not because id Software uses different textures than 3D Realms, but because their vertex and pixel programs are completely different.
Remember the Environment Mapped Bump Mapping (EMBM) that Matrox preached with the release of the G400? Instead of having another texture create the illusion of bumps on a surface, the GeForce3 can actually calculate in real time effects such as ripples in water and how they interact with surrounding objects like rocks. If you ever noticed, in a game that supported EMBM you never had a situation where something protruded out of the bump-mapped water and had the water interact with the object. In the real world, waves change according to the rocks they are interacting with; in the EMBM world, they didn't. This is because with EMBM all you had was another texture, on top of the regular textures, representing the water and its ripples.
The GeForce3 will instead take advantage of Dot3 bump mapping, which again uses dot products of direction vectors to produce the resulting ripple effects in water. This isn't limited to water alone; walls and other surfaces can receive the same treatment. Dot3 has been around for a while but was rarely used because of either poor implementations in hardware or a lack of flexibility for developers. EMBM was rarely used because the penalty of rendering another texture was often too great for the cards that supported it. With the GeForce3's extremely flexible GPU, developers can truly take the power into their own hands, and you can finally expect more than just ugly-looking 2D textures on walls and completely unrealistic water.
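The "Dot3" name describes the math directly. Continuing the encoding sketch from earlier (once more, our own illustrative code): the per-pixel vector is decoded back out of the texel and dotted with the light's direction vector, and the result is the pixel's brightness.

```cpp
#include <algorithm>  // std::max

struct Vec3  { float x, y, z; };
struct Texel { unsigned char r, g, b; };

// Undo the [0, 255] -> [-1, 1] packing from the encoding sketch.
static Vec3 decodeVector(const Texel& t) {
    return { t.r / 127.5f - 1.0f,
             t.g / 127.5f - 1.0f,
             t.b / 127.5f - 1.0f };
}

// Dot3 diffuse lighting: brightness is the dot product between the
// pixel's decoded surface vector and the light's direction vector.
float pixelBrightness(const Texel& t, const Vec3& lightDir) {
    Vec3 n = decodeVector(t);
    float d = n.x*lightDir.x + n.y*lightDir.y + n.z*lightDir.z;
    return std::max(d, 0.0f);  // pixels facing away from the light stay dark
}
```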