
Texture Mapping
## Texture coordinates representation

I developed a texture coordinates representation that is quite useful for a portal engine. The basic idea is not mine; I took it from the Crystal Space engine, and the mathematics behind this representation are heavily inspired by a text file on their web page. This method may seem counter-intuitive to some people, particularly those who are not familiar with the rasterization of texture-mapped polygons. I don't intend to explain what texture mapping is and how it works; there is plenty of information about that on the net. What I intend to do is explain how the texture representation in Frog works.

## Mathematics

The core of any 3D engine is mathematics, and to be more specific, linear algebra. So if you are not used to playing with vectors and matrices, I suggest you go read a book about it; there is also plenty of information on the web. It is very important to understand the basics of linear algebra if you are to write a 3D game anyway.

We start with the matrix representation for the texture coordinates on a polygon (in object space, of course). This matrix maps from object space to texture space:

    Mot = | Ux Vx Wx |
          | Uy Vy Wy |
          | Uz Vz Wz |

In this matrix you will recognize three vectors: **U**, **V** and **W**, the axes of texture space expressed in object space (**W** is normal to the texture plane).

One thing is missing: the translation component. You want to be able to put the texture plane origin anywhere on the polygon, so you need a translation vector; in object space, I call it Vot. So if we have a vertex Vo on the polygon in object space, its texture coordinates are:

    Vt = Mot * (Vo - Vot)

We can transform the matrix and the translation vector into world space:

    Vt = Mwt * (Vw - Vwt)

And from view space to texture space:

    Vt = Mvt * (Vv - Vvt)

That's the simple part; now the complex things.
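The mapping above is just a subtraction followed by a 3x3 matrix multiply. Here is a minimal sketch of `Vt = Mot * (Vo - Vot)`; the type and function names are mine for illustration, not taken from the Frog source.

```c
#include <assert.h>

typedef struct { float x, y, z; } Vec3;
typedef struct { float m[3][3]; } Mat3;

static Vec3 vec3_sub(Vec3 a, Vec3 b) {
    Vec3 r = { a.x - b.x, a.y - b.y, a.z - b.z };
    return r;
}

static Vec3 mat3_mul(const Mat3 *m, Vec3 v) {
    Vec3 r = {
        m->m[0][0] * v.x + m->m[0][1] * v.y + m->m[0][2] * v.z,
        m->m[1][0] * v.x + m->m[1][1] * v.y + m->m[1][2] * v.z,
        m->m[2][0] * v.x + m->m[2][1] * v.y + m->m[2][2] * v.z,
    };
    return r;
}

/* Texture coordinates of an object-space vertex Vo:
 * translate by -Vot, then map through Mot. */
static Vec3 object_to_texture(const Mat3 *Mot, Vec3 Vot, Vec3 Vo) {
    return mat3_mul(Mot, vec3_sub(Vo, Vot));
}
```

The world-space and view-space versions are the same function with `Mwt`/`Vwt` or `Mvt`/`Vvt` substituted in.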
The plane equation of the polygon in view space is given by:

    A * x + B * y + C * z + D = 0    (1)

where (A, B, C) is the plane normal and D is the plane distance.

The perspective equations map a view-space vertex (x, y, z) to screen coordinates (sx, sy):

    sx = x / z
    sy = y / z

Rearranging:

    x = sx * z    (2)
    y = sy * z    (3)

Substituting (2) and (3) in equation (1) yields:

    A * sx * z + B * sy * z + C * z = - D

Divide both sides by -D * z:

    A * sx     B * sy     C      1
    ------  +  ------  + ----  = ---    (4)
      -D         -D       -D      z

Sounds familiar? No? Look at the right-hand side: equation (4) gives 1 / z as a linear function of the screen coordinates.

Let's define three terms to simplify the representation of equation (4):

    M = - A / D
    N = - B / D
    O = - C / D

And let's rename z as W. So we have:

    M * sx + N * sy + O = 1 / W    (5)

We can now easily compute 1 / W at any screen coordinate (sx, sy). Let's define:

    S = U / W    (6)
    T = V / W    (7)

The reason is that a rasterizer actually needs **S**, **T** and **1 / W**, not **U** and **V**:

- I happen to have worked at Matrox, and I'm the guy who wrote the Direct3D drivers for the Millennium and the Mystique series. I also developed the prototype driver for the MGA-G200 series. The hardware uses **S** and **T** coordinates, so we have to perform one division per vertex behind the scenes. Why we can't just specify **S**, **T** coordinates in Direct3D is beyond me.
- Look at Glide. You actually have to use **S** and **T** coordinates... This is no accident. I'm pretty sure the 3dfx Direct3D driver has to perform that division for each vertex. The new window coordinate system in Glide 3 supports this assumption: the documentation says that future 3dfx chips will perform that division.
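Equation (5) is cheap to evaluate: M, N and O are computed once per polygon, and 1 / W then costs two multiplications and two additions per screen coordinate. A sketch, with my own names (and assuming D != 0, i.e. the plane does not pass through the eye point):

```c
#include <assert.h>

/* Per-polygon constants for equation (5): 1/W = M*sx + N*sy + O. */
typedef struct { float M, N, O; } InvWPlane;

/* Precompute M, N, O from the view-space plane A*x + B*y + C*z + D = 0. */
static InvWPlane invw_setup(float A, float B, float C, float D) {
    InvWPlane p = { -A / D, -B / D, -C / D };
    return p;
}

/* 1/W at screen coordinate (sx, sy). */
static float invw_at(const InvWPlane *p, float sx, float sy) {
    return p->M * sx + p->N * sy + p->O;
}

/* Tolerant float comparison used by the examples. */
static int nearly(float a, float b) {
    float d = a - b;
    return d < 1e-5f && d > -1e-5f;
}
```

As a sanity check, for any view-space point on the plane the result must equal 1 / z of that point, since W is just a renamed z.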
Now the power of this texture representation comes to light: you can compute **1 / W**, **S** and **T** directly from screen coordinates.

## More mathematics

Recall that we had this equation to map view space to texture space:

    Vt = Mvt * (Vv - Vvt)    (8)

From (8) we get:

    U = m11 * (x - v1) + m12 * (y - v2) + m13 * (z - v3)
    V = m21 * (x - v1) + m22 * (y - v2) + m23 * (z - v3)

where mij are the elements of Mvt, (x, y, z) is the view-space vertex Vv and (v1, v2, v3) is the translation vector Vvt.

Rewritten:

    U = m11 * x + m12 * y + m13 * z - ( m11 * v1 + m12 * v2 + m13 * v3 )
    V = m21 * x + m22 * y + m23 * z - ( m21 * v1 + m22 * v2 + m23 * v3 )

Let's define:

    P = - ( m11 * v1 + m12 * v2 + m13 * v3 )
    Q = - ( m21 * v1 + m22 * v2 + m23 * v3 )

So we now have:

    U = m11 * x + m12 * y + m13 * z + P
    V = m21 * x + m22 * y + m23 * z + Q

Using the perspective equations (2) and (3):

    U = m11 * sx * z + m12 * sy * z + m13 * z + P
    V = m21 * sx * z + m22 * sy * z + m23 * z + Q

Divide by z:

    U / z = m11 * sx + m12 * sy + m13 + P / z
    V / z = m21 * sx + m22 * sy + m23 + Q / z

Using equations (5), (6) and (7) for substitutions:

    S = m11 * sx + m12 * sy + m13 + P * ( M * sx + N * sy + O )
    T = m21 * sx + m22 * sy + m23 + Q * ( M * sx + N * sy + O )

Rewritten:

    S = ( m11 + P * M ) * sx + ( m12 + P * N ) * sy + ( m13 + P * O )
    T = ( m21 + Q * M ) * sx + ( m22 + Q * N ) * sy + ( m23 + Q * O )

Let's define:

    J1 = m11 + P * M
    J2 = m12 + P * N
    J3 = m13 + P * O
    K1 = m21 + Q * M
    K2 = m22 + Q * N
    K3 = m23 + Q * O

We then have the following three equations:

    1 / W = M * sx + N * sy + O
    S = J1 * sx + J2 * sy + J3
    T = K1 * sx + K2 * sy + K3

So we can compute **1 / W**, **S** and **T** for any screen coordinate (sx, sy), each with just two multiplications and two additions.

## Implementation

Now that the theory has been laid out, let's see how I implemented this in Frog.
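The three final equations can be checked numerically. The sketch below (my naming, not Frog's) precomputes M, N, O, P, Q and the J/K factors from the first two rows of Mvt, the translation Vvt and the view-space plane, then recovers U and V at a screen coordinate by dividing S and T by 1 / W:

```c
#include <assert.h>

typedef struct {
    float M, N, O;    /* 1/W = M*sx + N*sy + O    */
    float J1, J2, J3; /* S   = J1*sx + J2*sy + J3 */
    float K1, K2, K3; /* T   = K1*sx + K2*sy + K3 */
} TexGrad;

/* mvt: first two rows (m11..m23) of Mvt; v: the translation Vvt;
 * A,B,C,D: the polygon's view-space plane. */
static TexGrad texgrad_setup(const float mvt[2][3], const float v[3],
                             float A, float B, float C, float D) {
    TexGrad g;
    float P = -(mvt[0][0] * v[0] + mvt[0][1] * v[1] + mvt[0][2] * v[2]);
    float Q = -(mvt[1][0] * v[0] + mvt[1][1] * v[1] + mvt[1][2] * v[2]);
    g.M = -A / D;  g.N = -B / D;  g.O = -C / D;
    g.J1 = mvt[0][0] + P * g.M;
    g.J2 = mvt[0][1] + P * g.N;
    g.J3 = mvt[0][2] + P * g.O;
    g.K1 = mvt[1][0] + Q * g.M;
    g.K2 = mvt[1][1] + Q * g.N;
    g.K3 = mvt[1][2] + Q * g.O;
    return g;
}

/* Recover (U, V) at screen coordinate (sx, sy): U = S*W, V = T*W. */
static void texgrad_uv(const TexGrad *g, float sx, float sy,
                       float *u, float *v) {
    float invw = g->M * sx + g->N * sy + g->O;
    float s = g->J1 * sx + g->J2 * sy + g->J3;
    float t = g->K1 * sx + g->K2 * sy + g->K3;
    *u = s / invw;
    *v = t / invw;
}
```

A real rasterizer would interpolate 1 / W, S and T linearly across the polygon and only do the two divisions where it needs actual U, V (per pixel, or per span for approximations).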
The matrix representation can be optimized. First, there is that third vector, **W**: we never actually use the **W** texture coordinate, so the corresponding part of the matrix can be dropped. And since the **V** vector can be rebuilt from the **U** vector and the polygon normal, it does not have to be stored either.
A texture plane contains the three needed components:

- The **U** vector. This vector determines the **U** axis orientation of texture space. It is NOT necessarily a unit vector, and as such it also provides a scaling factor for the **U** axis.
- The **O** vector. This is the translation to apply; it represents the origin of texture space and is used to align a texture on a polygon.
- The **V / U** scale factor. Since we are not keeping the **V** vector anymore, we need this to define the scaling of the **V** axis.
Here is the algorithm to find texture coordinates:

- transform **U** and **O** from object -> world -> view space
- compute the missing **V** vector (**V = U * N * scaleVU**, where **N** is the polygon normal and **U * N** is a cross product)
- compute **P** and **Q**
- compute the factors needed for **1 / W**, **S** and **T**
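The middle steps can be sketched as follows. One detail left implicit above is how the stored **U** vector and origin **O** turn into the rows of Mvt; a natural choice, assumed here (along with all the names), is rows U / |U|^2 and V / |V|^2, so that stepping by one full **U** (or **V**) vector advances the texture coordinate by exactly 1. This is my reconstruction under that assumption, not the Frog source.

```c
#include <assert.h>

typedef struct { float x, y, z; } Vec3;

/* Stored per-polygon texture plane: the U axis, the texture-space
 * origin O, and the V/U scale factor. */
typedef struct { Vec3 U; Vec3 O; float scaleVU; } TexturePlane;

static Vec3 cross(Vec3 a, Vec3 b) {
    Vec3 r = { a.y * b.z - a.z * b.y,
               a.z * b.x - a.x * b.z,
               a.x * b.y - a.y * b.x };
    return r;
}

static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

static Vec3 scale(Vec3 a, float s) {
    Vec3 r = { a.x * s, a.y * s, a.z * s };
    return r;
}

/* Steps 2-3 of the algorithm, assuming U and O are already transformed
 * into view space and n is the unit polygon normal: rebuild V, derive
 * the matrix rows, compute P and Q, and return the (U, V) texture
 * coordinates of a view-space point p as a check. */
static void texcoords_at(const TexturePlane *tp, Vec3 n, Vec3 p,
                         float *u, float *v) {
    Vec3 V = scale(cross(tp->U, n), tp->scaleVU);   /* V = U x N * scaleVU */
    Vec3 rowU = scale(tp->U, 1.0f / dot(tp->U, tp->U));
    Vec3 rowV = scale(V, 1.0f / dot(V, V));
    float P = -dot(rowU, tp->O);    /* P and Q, as defined earlier */
    float Q = -dot(rowV, tp->O);
    *u = dot(rowU, p) + P;
    *v = dot(rowV, p) + Q;
}
```

The last step, building the J/K screen-space factors from these rows and the view-space plane, follows the equations of the previous section unchanged.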
All of this is in the
© 1999-2011 Gamedev.net. All rights reserved. |