Texture Mapping
by Thierry Tremblay
http://frogengine.net-connect.net

Texture coordinate representation

I developed a texture coordinate representation that is quite useful for a portal engine. The basic idea is not mine; I took it from the Crystal Space engine. The mathematics behind this representation is heavily inspired by a text file on their web page.

This method may seem counterintuitive to some people, particularly those who are not familiar with the rasterization of texture mapped polygons. I don't intend to explain what texture mapping is and how it works; there is plenty of information about that on the net. What I do intend to do is explain how the texture representation in Frog works.


Mathematics

The core of any 3D engine is mathematics, and to be more specific, linear algebra. So if you are not used to playing with vectors and matrices, I suggest you go read a book about it. There is also plenty of information on the web. It is very important to understand the basics of linear algebra if you are going to write a 3D game anyway.

We start with the matrix representation for the texture coordinates on a polygon (in object space, of course). This matrix maps from object space to texture space:

Mot = | Ux Vx Wx |
      | Uy Vy Wy |
      | Uz Vz Wz |

In this matrix you will recognize three vectors: U, V and W. The first two (U, V) are what you think they are: the U and V axes of the texture. The vector W faces away from the texture plane and is parallel to the polygon's normal. Basically, you can see this matrix as a rotation matrix orienting the texture's U and V axes on the texture plane (which has the same orientation as the polygon's plane).

One thing is missing: the translation component. You want to be able to put the texture plane origin anywhere on the polygon, so you need a translation vector. In object space, I call it Vot.

So if we have a vertex on the polygon in object space (Vo) and want to compute its texture coordinates (Vt), we apply the following equation:

Vt = Mot * (Vo - Vot)
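
To make the mapping concrete, here is a minimal sketch of that equation in C++. The Vec3 and Mat3 types, the row-major layout and the function name are illustration-only assumptions, not the actual Frog classes:

struct Vec3 { float x, y, z; };
struct Mat3 { float m[3][3]; };   // m[0][0] is m11, m[0][1] is m12, and so on

// Vt = Mot * (Vo - Vot): texture coordinates of an object-space vertex.
// Vot is the texture plane origin expressed in object space.
Vec3 ObjectToTexture(const Mat3& Mot, const Vec3& Vot, const Vec3& Vo)
{
    const float dx = Vo.x - Vot.x;
    const float dy = Vo.y - Vot.y;
    const float dz = Vo.z - Vot.z;
    Vec3 Vt;
    Vt.x = Mot.m[0][0] * dx + Mot.m[0][1] * dy + Mot.m[0][2] * dz;   // U
    Vt.y = Mot.m[1][0] * dx + Mot.m[1][1] * dy + Mot.m[1][2] * dz;   // V
    Vt.z = Mot.m[2][0] * dx + Mot.m[2][1] * dy + Mot.m[2][2] * dz;   // along W, unused for texturing
    return Vt;
}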

We can transform the (Mot, Vot) pair so that it maps world space to texture space:

Vt = Mwt * (Vw - Vwt)

And from view space to texture space:

Vt = Mvt * (Vv - Vvt)

That was the simple part; now for the more complex things.

The plane equation of the polygon in view space is given by:

A * x + B * y + C * z + D = 0  (1)

where x, y and z refer to the components of Vv.

Perspective equations mapping the view-space vertex (x,y,z) to screen space (sx,sy):

sx = x / z
sy = y / z

Rearranging:

x = sx * z  (2)
y = sy * z  (3)

Substituting (2) and (3) into equation (1) yields:

A * sx * z + B * sy * z + C * z = - D

Divide both sides by (-D * z):

( -A / D ) * sx + ( -B / D ) * sy + ( -C / D ) = 1 / z   (4)

Does this look familiar? No? Look at the 1 / z term. What the equation says is that 1 / z is a linear function of the screen coordinates (sx, sy), so anything proportional to 1 / z can be interpolated linearly in screen space. This is what perspective correct texture mapping is all about.

Let's define three terms to simplify equation (4):

M = - A / D
N = - B / D
O = - C / D

And let's rename 1/z to 1/W

So we have:

M * sx + N * sy + O = 1 / W   (5)
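
As a small sketch (the struct and function names are assumptions, not anything from Frog), computing M, N and O from the view-space plane and evaluating 1 / W at a screen position could look like this:

struct PlaneGradients { float M, N, O; };

// A, B, C, D are the view-space plane coefficients of the polygon.
// D is non-zero as long as the plane does not pass through the eye.
PlaneGradients ComputeOneOverWGradients(float A, float B, float C, float D)
{
    PlaneGradients g;
    g.M = -A / D;
    g.N = -B / D;
    g.O = -C / D;
    return g;
}

// 1 / W at any screen position (sx, sy) covered by the polygon.
float OneOverWAt(const PlaneGradients& g, float sx, float sy)
{
    return g.M * sx + g.N * sy + g.O;
}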

We can now easily compute 1 / W at every point of the polygon in screen space. But we also need the texture coordinates! Now, I know a lot of people are used to using U,V coordinates for this, and it is the intuitive way of doing things.

Let's define:

S = U / W  (6)
T = V / W  (7)

The reason is that a rasterizer needs S and T coordinates, not U and V. Direct3D uses U and V? Sure... but not the hardware behind it. I have two examples to back this up:

  • I happen to have worked at Matrox, and I'm the guy who wrote the Direct3D drivers for the Millennium and the Mystique series. I also developed the prototype driver for the MGA-G200 series. The hardware uses S and T coordinates, so we have to perform one division per vertex behind the scenes (sketched below). Why we can't just specify S,T coordinates in Direct3D is beyond me.
  • Look at Glide. You actually have to use S and T coordinates... This is no accident. I'm pretty sure the 3dfx Direct3D driver has to perform that division for each vertex. The new Window Coordinate system in Glide 3 supports this assumption: the documentation says that future 3dfx chips will perform that division.
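
For illustration, here is the kind of per-vertex division such a driver ends up doing when the API hands it U and V but the hardware wants S and T. The vertex layout is hypothetical:

struct Vertex
{
    float sx, sy;   // screen position
    float u, v;     // texture coordinates given by the application
    float z;        // view-space depth (the W of the equations above)
};

// S = U / W, T = V / W: one divide (or reciprocal) per vertex.
void ToHardwareCoordinates(const Vertex& vtx, float& s, float& t, float& oneOverW)
{
    oneOverW = 1.0f / vtx.z;
    s = vtx.u * oneOverW;
    t = vtx.v * oneOverW;
}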

Now the power of this texture representation comes to light: you can compute S and T for any point on the polygon plane given its screen coordinates.


More mathematics

Recall that we had this equation to map view space to texture space:

Vt = Mvt * (Vv - Vvt)  (8)

From (8) we get:

U = m11 * (x-v1) + m12 * (y-v2) + m13 * (z-v3)
V = m21 * (x-v1) + m22 * (y-v2) + m23 * (z-v3)

where v1, v2 and v3 refer to the components of Vvt.

Rewritten:

U = m11 * x + m12 * y + m13 * z - ( m11 * v1 + m12 * v2 + m13 * v3 )
V = m21 * x + m22 * y + m23 * z - ( m21 * v1 + m22 * v2 + m23 * v3 )

Let's define:

P = - ( m11 * v1 + m12 * v2 + m13 * v3 )
Q = - ( m21 * v1 + m22 * v2 + m23 * v3 )

So we now have:

U = m11 * x + m12 * y + m13 * z + P
V = m21 * x + m22 * y + m23 * z + Q

Using the perspective equations (2) and (3), we get:

U = m11 * sx * z + m12 * sy * z + m13 * z + P
V = m21 * sx * z + m22 * sy * z + m23 * z + Q

Divide by z:

U / z = m11 * sx + m12 * sy + m13 + P / z
V / z = m21 * sx + m22 * sy + m23 + Q / z

Using equations (5), (6) and (7) for substitutions:

S = m11 * sx + m12 * sy + m13 + P * ( M * sx + N * sy + O )
T = m21 * sx + m22 * sy + m23 + Q * ( M * sx + N * sy + O )

Rewritten:

S = ( m11 + P * M ) * sx + ( m12 + P * N ) * sy + ( m13 + P * O )
T = ( m21 + Q * M ) * sx + ( m22 + Q * N ) * sy + ( m23 + Q * O )

Let's define:

J1 = m11 + P * M
J2 = m12 + P * N
J3 = m13 + P * O
K1 = m21 + Q * M
K2 = m22 + Q * N
K3 = m23 + Q * O

We then have the following three equations:

1 / W    = M * sx + N * sy + O
S        = J1 * sx + J2 * sy + J3
T        = K1 * sx + K2 * sy + K3

So we can compute S,T and 1 / W for any point on the polygon.
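
Putting the whole derivation together, a sketch of the setup and of the per-point evaluation could look like the following. The names are illustrative; only the equations come from the text above:

struct TextureGradients
{
    float M, N, O;      // 1 / W = M * sx + N * sy + O
    float J1, J2, J3;   // S     = J1 * sx + J2 * sy + J3
    float K1, K2, K3;   // T     = K1 * sx + K2 * sy + K3
};

// m holds the first two rows of Mvt (m11..m13 and m21..m23),
// (v1, v2, v3) is Vvt and (A, B, C, D) is the polygon plane in view space.
TextureGradients ComputeGradients(const float m[2][3],
                                  float v1, float v2, float v3,
                                  float A, float B, float C, float D)
{
    TextureGradients g;
    g.M = -A / D;
    g.N = -B / D;
    g.O = -C / D;

    const float P = -(m[0][0] * v1 + m[0][1] * v2 + m[0][2] * v3);
    const float Q = -(m[1][0] * v1 + m[1][1] * v2 + m[1][2] * v3);

    g.J1 = m[0][0] + P * g.M;
    g.J2 = m[0][1] + P * g.N;
    g.J3 = m[0][2] + P * g.O;
    g.K1 = m[1][0] + Q * g.M;
    g.K2 = m[1][1] + Q * g.N;
    g.K3 = m[1][2] + Q * g.O;
    return g;
}

// Per point: evaluate the three linear equations, then divide to get U and V back.
void TextureCoordinatesAt(const TextureGradients& g, float sx, float sy, float& u, float& v)
{
    const float oneOverW = g.M * sx + g.N * sy + g.O;
    const float s = g.J1 * sx + g.J2 * sy + g.J3;
    const float t = g.K1 * sx + g.K2 * sy + g.K3;
    u = s / oneOverW;   // U = S * W
    v = t / oneOverW;   // V = T * W
}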


Implementation

Now that the theory has been laid out, let's see how I implemented this in Frog.

Matrix representation

The matrix representation can be optimized. First, there is that "W" vector: it is simply the polygon's normal, so it doesn't make sense to store it twice. Second, we know that U, V and W form a right-handed coordinate system, so we don't need to keep all three vectors: only two are needed through the computations. Just before rendering, we can compute the missing vector.

The texture plane

A texture plane contains the three needed components:

  • The U vector. This vector determines the U axis orientation of the texture space. It is NOT necessarily a unit vector, and as such it also provides a scaling factor for the U axis.
  • The O vector. This is a translation to apply; it represents the origin of texture space and is used to align a texture on a polygon.
  • The V / U scale factor. Since we are no longer keeping the V vector, we need this to define the scaling of the V axis.

Algorithm

Here is the algorithm used to find the texture coordinates:

  • transform U and O from object->world->view space
  • compute the missing V vector (V = U × N * scaleVU, where N is the polygon's normal and × is the cross product)
  • compute P and Q
  • compute the factors needed for 1 / W, S and T

All of this is in the FTTexturePlane class; take a look for more details. The W vector holds (M, N, O), the S vector holds (J1, J2, J3) and the T vector holds (K1, K2, K3).
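
Here is a compact sketch of those four steps under the same assumptions as the earlier snippets; none of these names are the actual FTTexturePlane interface, and the two transform helpers are assumed to exist elsewhere in the engine:

struct Vec3 { float x, y, z; };

Vec3 Cross(const Vec3& a, const Vec3& b)
{
    Vec3 r;
    r.x = a.y * b.z - a.z * b.y;
    r.y = a.z * b.x - a.x * b.z;
    r.z = a.x * b.y - a.y * b.x;
    return r;
}

Vec3 Scale(const Vec3& v, float s)
{
    Vec3 r = { v.x * s, v.y * s, v.z * s };
    return r;
}

// Assumed helpers: a direction is only rotated, a point gets the full transform.
extern Vec3 TransformDirectionToView(const Vec3& dir);    // object -> world -> view
extern Vec3 TransformPointToView(const Vec3& point);      // object -> world -> view

void SetupTexturePlane(const Vec3& U_obj, const Vec3& O_obj, float scaleVU,
                       const Vec3& N_view,                 // polygon normal in view space
                       Vec3& U_view, Vec3& V_view, Vec3& O_view)
{
    // 1) transform U and O from object space into view space
    U_view = TransformDirectionToView(U_obj);
    O_view = TransformPointToView(O_obj);

    // 2) recover the missing V axis from U and the polygon normal
    V_view = Scale(Cross(U_view, N_view), scaleVU);

    // 3) and 4) P, Q and the 1/W, S, T coefficients then follow exactly as in
    //    the ComputeGradients() sketch above, with U_view and V_view forming
    //    the first two rows of Mvt and O_view playing the role of Vvt.
}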


