Contents
 Introduction
 Vertex Shaders in the Pipeline
 Why use Vertex Shaders?
 Vertex Shader Tools
 Vertex Shader Architecture
 High Level View of Vertex Shader Programming
 Conclusion

The Series
 Fundamentals of Vertex Shaders

How This Course is Organized

We'll work from the fundamentals up to a more advanced level across five lessons, first for vertex shaders and later for pixel shaders. So our road map looks like this:

  • Fundamentals of Vertex Shaders
  • Programming Vertex Shaders
  • Fundamentals of Pixel Shaders
  • Programming Pixel Shaders
  • Shader FX

Let's start by examining the place of vertex shaders in the Direct3D pipeline ...

Vertex Shaders in the Pipeline

The following diagram shows the Source (Polygon), Vertex and Pixel Operations levels of the Direct3D pipeline in a very simplified way:

On the source data level, the vertices are assembled and tessellated. This is the so-called higher-order primitive module, which tessellates higher-order primitives such as N-Patches (supported in the device driver on the ATI RADEON and in hardware on the ATI RADEON 8500), quintic Béziers, B-splines, and rectangular and triangular (RT) patches.

It looks like NVIDIA's new 21.81 drivers no longer support RT-patches on the GeForce3.

A GPU that supports RT-Patches breaks higher-order lines and surfaces down into triangles and vertices. A GPU that supports N-Patches generates so-called control points, depending on the angles of the normals of each triangle of an object: the more the normals are bent away from the middle of the triangle, the more control points the GPU will build. A Bézier surface consisting of triangles is then built from these control points with the help of a special algorithm.
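
On DirectX 8 hardware that exposes N-Patches, tessellation is switched on through a render state rather than through a shader. The following is only a rough sketch, assuming a valid IDirect3DDevice8 pointer named pDevice and a device that reports N-Patch support in its caps:

    // Enable N-Patch tessellation; D3DRS_PATCHSEGMENTS expects a float value
    // passed through the DWORD render-state parameter.
    float fSegments = 4.0f;     // subdivisions per patch edge
    pDevice->SetRenderState(D3DRS_PATCHSEGMENTS, *((DWORD*)&fSegments));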

In this course, we're most interested in the vertex operations level. With Direct3D, you have two different ways of processing vertices:

  1. The "fixed-function" pipeline. This is the standard Transform & Lighting pipeline, where the functionality is essentially fixed. You can play with T & L by setting render states, matrices, and lighting and material parameters.
  2. Vertex Shaders: this is the new mechanism introduced in DirectX 8. Instead of setting parameters to control the pipeline, you write a vertex shader program that operates nearly without an API abstraction layer directly with the graphics hardware.
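
To make the difference between the two paths concrete, here is a minimal DirectX 8 set-up sketch. It assumes d3d8.h and d3dx8.h, a valid device pointer pDevice and an assembly source string szShader (both placeholder names), and it omits the HRESULT checks that real code would need:

    // Path 1: fixed-function T&L - configured through states, matrices and an FVF code.
    pDevice->SetTransform(D3DTS_WORLD, &matWorld);
    pDevice->SetRenderState(D3DRS_LIGHTING, TRUE);
    pDevice->SetVertexShader(D3DFVF_XYZ | D3DFVF_DIFFUSE);   // an FVF code selects the fixed-function path

    // Path 2: programmable pipeline - declare the inputs, assemble the shader, create a handle.
    DWORD dwDecl[] =
    {
        D3DVSD_STREAM(0),
        D3DVSD_REG(0, D3DVSDT_FLOAT3),      // position      -> v0
        D3DVSD_REG(5, D3DVSDT_D3DCOLOR),    // diffuse color -> v5
        D3DVSD_END()
    };
    LPD3DXBUFFER pCode   = NULL;
    DWORD        hShader = 0;
    D3DXAssembleShader(szShader, strlen(szShader), 0, NULL, &pCode, NULL);
    pDevice->CreateVertexShader(dwDecl, (DWORD*)pCode->GetBufferPointer(), &hShader, 0);
    pDevice->SetVertexShader(hShader);      // a shader handle selects the programmable path

Note that SetVertexShader() accepts either an FVF code or a shader handle; that single call decides which of the two paths the pipeline takes.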

Our focus lies on the second method. It is obvious from this simplified diagram that Face Culling, User Clip Planes, Frustum Clipping, the Homogeneous Divide and Viewport Mapping operate on the vertex level after the vertex shader, so these operations can't be controlled by a vertex shader.

So what kind of operations does a vertex shader do for you? Every vertex shader accepts one input vertex and generates one output vertex, so it is not able to create new vertices. It can perform the following operations on such a vertex:

  • 3D coordinates: procedural geometry, blending, morphing, deformations
  • Colors (true color, pseudo color)
  • Texture coordinates
  • Fog (elevation based, volume based)
  • Point size

A processed vertex will consist of at least a clip-space position (x, y, z and w) and optionally color, texture coordinate, fog intensity and point size information.
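
As an illustration, a minimal vs.1.1 shader that produces exactly this kind of output, a clip-space position plus a diffuse color, might look like the following sketch. It assumes that the transposed clip matrix has been loaded into c0-c3 and that position and color arrive in v0 and v5:

    vs.1.1                      ; vertex shader version 1.1
    dp4 oPos.x, v0, c0          ; transform the position in v0 into clip space,
    dp4 oPos.y, v0, c1          ; one dot product per row of the transposed
    dp4 oPos.z, v0, c2          ; clip matrix stored in c0-c3
    dp4 oPos.w, v0, c3
    mov oD0, v5                 ; pass the vertex color through unchanged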

A vertex shader does not:

  • perform polygon-based operations
  • do face culling, apply user clip planes, clip to the frustum, divide by w or map to the viewport
  • write to vertices other than the one it currently "shades"
  • create vertices

To summarize this abstract overview of vertex shader functionality: the vertex shader can only replace the hardwired transformation and lighting capabilities; the pipeline stages that follow it remain fixed-function.

So the developer is able to write their own transformation and lighting engine this way. But why would someone do this?
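
In practice, "writing your own transformation and lighting engine" means the application only hands the shader its constants, and the shader defines what happens with them. A per-frame constant upload might look like this sketch; the matrix and device names are again placeholders:

    // Concatenate world, view and projection and transpose the result, so that each
    // constant register c0-c3 holds one matrix row suitable for dp4 in the shader.
    D3DXMATRIX matClip = matWorld * matView * matProj;
    D3DXMatrixTranspose(&matClip, &matClip);
    pDevice->SetVertexShaderConstant(0, &matClip, 4);     // c0 - c3

    // A directional light in c4 - the shader, not the API, defines the lighting model.
    D3DXVECTOR4 vLight(0.0f, -1.0f, 0.0f, 0.0f);
    pDevice->SetVertexShaderConstant(4, &vLight, 1);      // c4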





Next : Why use Vertex Shaders?