Applying the Texture Mapping Technique
by Joshua Cantrell
equations last updated: 6-17-1998
last updated: 6-16-1998

One day, when considering the reuse of my old code, I had a disturbing discovery! I didn't remember how I derived one of the equations in my code from the general texture mapping equation! In my quest to minimize steps and reduce repeated calculations, I had obscured the general form of the equation and had neglected to write comments on everything I had done! In this case, I had comments for everything except this particularly puzzling step. As a safeguard, I've decided to briefly describe how I organized and designed my part of the program.


Describing the textured 3D polygon

There are intrinsic features of a textured 3D polygon that never change. To be efficient, we'll want to calculate these numbers only once, when first defining the textured 3D polygon. These features are the image to be used on the 3D polygon (not strictly unchangeable, but static in most cases), the direction vectors that describe the orientation of the texture on the 3D polygon, and an offset vector for moving the texture around on the 3D polygon.

The textured image is defined in whatever way you wish. Typically it consists of a 2D array of pixels. The texture coordinates calculated in the previous paper on texture mapping equations index into this array and return the corresponding color. In my code, I define its type as TexImage (for texture image).
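To make this concrete, here is a rough C++ sketch of what such a texture image type could look like. The field names and the lookup function are only illustrative assumptions of this sketch, not the TexImage definition from my project.

#include <cstdint>
#include <vector>

// Illustrative texture image: a 2D array of pixels stored row by row.
struct TexImage
{
    int width  = 0;
    int height = 0;
    std::vector<uint32_t> pixels;   // width * height packed colors

    // Return the color stored at integer texture coordinates (tx, ty).
    uint32_t lookup(int tx, int ty) const
    {
        return pixels[ty * width + tx];
    }
};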

The direction vectors are the u, v, and n vectors described in the texture mapping equation. u is the vector that points in the direction of the texture image's x-axis. v is the vector that points in the direction of the texture image's y-axis. n is a vector that is normal to both u and v. As an optimization based on the final form of the equation, I decided that by scaling the u and v vectors, we could scale the texture on the 3D polygon. This means u and v should have magnitudes equal to the number of texture pixels per world unit. The offset vector can be represented as precalculated texture coordinates for each point of the 3D polygon. These precalculated points come in handy when setting up the actual drawing of the polygon.

Let points_in_poly = The number of points that define the 3D polygon.
Let Pn = Point n in the 3D polygon.
Let Tn = Texture point corresponding to point n in the 3D polygon.
Let tex_per_world_units = The number of texture pixels per world unit.
Let xoff = X offset of the texture map.
Let yoff = Y offset of the texture map.

struct TexInfo
{
    TexImage image;
    Vector3D u;
    Vector3D v;
    Vector3D n;
    Ce2dPoint points[points_in_poly];   // Equivalent of Tn
}

u'  = P2 - P1
u'' = u' / ||u'||                  // Normalized u vector
u   = tex_per_world_units * u''    // Scaled u vector
n'  = (P1 - P3) X u
n   = n' / ||n'||                  // Normalized n vector
v'  = n X u''
v'' = v' / ||v'||                  // Normalized v vector
v   = tex_per_world_units * v''    // Scaled v vector

Tn = ([u,(Pn - P0)] - xoff, [v,(Pn - P0)] - yoff)
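The same steps can be written out in code. The C++ sketch below is only an illustration under assumed types (a small Vec3/Vec2 with dot, cross, and normalize helpers); it is not the Vector3D class or TexInfo setup from my project, but it follows the equations above step by step.

#include <cmath>
#include <vector>

struct Vec3 { double x, y, z; };
struct Vec2 { double x, y; };

// Basic vector helpers assumed for this sketch.
static Vec3   sub(Vec3 a, Vec3 b)     { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3   scale(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }
static double dot(Vec3 a, Vec3 b)     { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3   cross(Vec3 a, Vec3 b)   { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
static Vec3   normalize(Vec3 a)       { return scale(a, 1.0 / std::sqrt(dot(a, a))); }

struct TexBasis { Vec3 u, v, n; std::vector<Vec2> tex_pts; };

// Build the texture basis (u, v, n) and the per-point texture coordinates Tn.
// The article's P1, P2, P3 correspond to P[0], P[1], P[2] here (zero-based),
// and the texture origin is taken to be the first point.
TexBasis build_tex_basis(const std::vector<Vec3>& P,
                         double tex_per_world_units,
                         double xoff, double yoff)
{
    TexBasis tb;
    Vec3 u_dir = normalize(sub(P[1], P[0]));            // u'' : direction of texture x-axis
    tb.u = scale(u_dir, tex_per_world_units);           // u   : scaled u vector
    tb.n = normalize(cross(sub(P[0], P[2]), tb.u));     // n   : polygon normal
    Vec3 v_dir = normalize(cross(tb.n, u_dir));         // v'' : direction of texture y-axis
    tb.v = scale(v_dir, tex_per_world_units);           // v   : scaled v vector

    // Precompute a texture coordinate for every polygon point.
    for (const Vec3& p : P)
    {
        Vec3 d = sub(p, P[0]);
        tb.tex_pts.push_back({ dot(tb.u, d) - xoff, dot(tb.v, d) - yoff });
    }
    return tb;
}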

In my code, I also rotate u by some angle, but that's done using a simple rotation matrix. After it's rotated, I again make sure that v is still orthogonal to both n and u.
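As an illustration of that step (not my project's code), the sketch below rotates u about n and then rebuilds v from n X u so the three vectors stay orthogonal. Rodrigues' rotation formula stands in for an explicit rotation matrix just for brevity; the Vec3 helpers are the same assumed ones as before, repeated so the sketch stands alone.

#include <cmath>

struct Vec3 { double x, y, z; };

static Vec3   add(Vec3 a, Vec3 b)     { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3   scale(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }
static double dot(Vec3 a, Vec3 b)     { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3   cross(Vec3 a, Vec3 b)   { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
static Vec3   normalize(Vec3 a)       { return scale(a, 1.0 / std::sqrt(dot(a, a))); }

// Rotate u about the unit normal n by 'angle' radians (Rodrigues' formula),
// then recompute v = n x u'' so that u, v, n remain mutually orthogonal.
void rotate_texture_axes(Vec3& u, Vec3& v, const Vec3& n,
                         double angle, double tex_per_world_units)
{
    double c = std::cos(angle), s = std::sin(angle);
    u = add(add(scale(u, c), scale(cross(n, u), s)),
            scale(n, dot(n, u) * (1.0 - c)));
    v = scale(normalize(cross(n, normalize(u))), tex_per_world_units);
}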

Setup for drawing the polygon

Just before you start drawing the polygon, there are some constants that need to be found. Two of the constants are [U,C] and [V,C]. There's also a constant scalar, and constant scalings of u, v, and n. The vectors u, v, and n need to be rotated at the beginning of the calculations and used in their rotated form. They must be rotated using the same rotation matrix that rotated the points of the 3D polygon.

struct TexDrawInfo
{
    TexImage image;     // The image to use in texture mapping.
    Vector3D u;         // Vector on the polygon's plane that points in the
                        // direction of increasing x in the texture's space.
    Vector3D v;         // Vector on the polygon's plane that points in the
                        // direction of increasing y in the texture's space.
    Vector3D n;         // A vector that is orthogonal to the polygon's plane.

    // Let distance = The distance of the focus to the viewport.
    real uz_distance;   // A precalculated value for Uz * distance.
    real vz_distance;   // A precalculated value for Vz * distance.
    real nz_distance;   // A precalculated value for Nz * distance.
    real u_off;         // The [U,C] constant offset as shown in the discussion
                        // on texture mapping.
    real v_off;         // The [V,C] constant offset as shown in the discussion
                        // on texture mapping.
    real scalar;        // The [N,P] constant scalar as shown in the discussion
                        // on texture mapping.
}

Let (x', y') = The screen coordinates.
Let (x, y, z) = The coordinates for the position of the texture point.
Let P = Any point on the polygon's plane.
Let T = Texture point in texture coordinates.
Let C = Point on the polygon to be the texture map's origin.
Let N = Normalized vector that is orthogonal to the polygon's plane.
Let U = Normalized vector on the polygon's plane that points in the direction of increasing x in the texture space.
Let V = Normalized vector on the polygon's plane that points in the direction of increasing y in the texture space.

Tx = [U,(x, y, z)]
Ty = [V,(x, y, z)]

Tx = (([N,P] / [N,(x', y', d)]) * [U,(x', y', d)]) - [U,C]
Ty = (([N,P] / [N,(x', y', d)]) * [V,(x', y', d)]) - [V,C]

Let n = The starting point number.
Let (Txn, Tyn) = Texture point corresponding to point n in the 3D polygon.
Let (xn', yn') = Screen coordinates of 3D polygon point n.
Let (xn, yn, zn) = 3D polygon coordinates of point n.
Let uz_distance = A precalculated value for Uz * distance.
Let vz_distance = A precalculated value for Vz * distance.
Let nz_distance = A precalculated value for Nz * distance.
Let u_off = The [U,C] constant offset.
Let v_off = The [V,C] constant offset.
Let scalar = The [N,P] constant scalar.

uz_distance = Uz * d
vz_distance = Vz * d
nz_distance = Nz * d
scalar = [N,(xn, yn, zn)]
u_off = ((scalar / [N,(xn', yn', d)]) * [U,(xn', yn', d)]) - Txn
v_off = ((scalar / [N,(xn', yn', d)]) * [V,(xn', yn', d)]) - Tyn
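For clarity, here is a small C++ sketch of how these constants could be filled in. The names mirror the TexDrawInfo fields above, but the vector type and the function signature are assumptions of this sketch, not my actual setup code. Note that u, v, and n are expected to already be rotated into the same space as the polygon's points.

struct Vec3 { double x, y, z; };
struct Vec2 { double x, y; };

static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

struct TexDrawConsts
{
    double uz_distance, vz_distance, nz_distance;   // Uz*d, Vz*d, Nz*d
    double u_off, v_off;                            // [U,C] and [V,C]
    double scalar;                                  // [N,P]
};

// p        : any (rotated) point on the polygon, e.g. the starting point n
// (sx, sy) : that point's screen coordinates (xn', yn')
// t        : its precalculated texture point (Txn, Tyn)
// d        : distance from the focus to the viewport
TexDrawConsts setup_tex_draw(const Vec3& u, const Vec3& v, const Vec3& n,
                             const Vec3& p, double sx, double sy,
                             const Vec2& t, double d)
{
    TexDrawConsts c;
    c.uz_distance = u.z * d;
    c.vz_distance = v.z * d;
    c.nz_distance = n.z * d;
    c.scalar      = dot(n, p);                      // [N,(xn, yn, zn)]

    Vec3 ray { sx, sy, d };                         // (xn', yn', d)
    double k = c.scalar / dot(n, ray);
    c.u_off = k * dot(u, ray) - t.x;                // [U,C]
    c.v_off = k * dot(v, ray) - t.y;                // [V,C]
    return c;
}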

Finding the pixel in the texture given a point on the screen

Now, using the constants defined in the previous section, it's easy to determine the position of the pixel in a texture given the screen coordinates. To make the value more useful, the remainder of the texture coordinate divided by the width of the texture is often used so that the texture repeats, producing a looping effect.

Let temp = Intermediate value used for both answers.

temp = scalar / [N,(x', y', d)]
Tx = (temp * [U,(x', y', d)]) - u_off
Ty = (temp * [V,(x', y', d)]) - v_off
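A possible C++ version of that inner-loop step is sketched below, using the precalculated uz_distance, vz_distance, and nz_distance so the z term never has to be recomputed, and wrapping the result with a remainder so the texture repeats. The function and parameter names are illustrative, not taken from my project.

#include <cstdint>
#include <vector>

struct Vec3 { double x, y, z; };

// Map one screen pixel (sx, sy) back to a texel and return its color.
// u, v, n are the rotated texture vectors; the remaining parameters are the
// constants computed in the setup step.
uint32_t sample_texture(const std::vector<uint32_t>& pixels, int tex_w, int tex_h,
                        const Vec3& u, const Vec3& v, const Vec3& n,
                        double uz_distance, double vz_distance, double nz_distance,
                        double u_off, double v_off, double scalar,
                        double sx, double sy)
{
    // temp = scalar / [N,(x', y', d)], with Nz*d folded into nz_distance.
    double temp = scalar / (n.x * sx + n.y * sy + nz_distance);
    int tx = int(temp * (u.x * sx + u.y * sy + uz_distance) - u_off);
    int ty = int(temp * (v.x * sx + v.y * sy + vz_distance) - v_off);

    // Wrap the coordinates so the texture loops across the polygon.
    tx = ((tx % tex_w) + tex_w) % tex_w;
    ty = ((ty % tex_h) + tex_h) % tex_h;
    return pixels[ty * tex_w + tx];
}

In a real scanline loop you would hoist everything that depends only on the current row outside the inner loop, but the sketch keeps the math one pixel at a time for readability.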

Where is this technique used?

This information was stripped from the code I used in my cs184 project. I put it into a simpler, more condensed form, so that people might get an idea of how to use my resultant equation without having to look at the code. If you wish to see the code and program used in my cs184 project, just follow this link to The Improved Maze Runner's Homepage.



Date this article was posted to GameDev.net: 7/5/2000
(Note that this date does not necessarily correspond to the date the article was written)

See Also:
Texture Mapping
