The first example uses an illumination map, which holds the diffuse and specular values. I got the idea of using an illumination map this way from an article written by Kenneth Hurley [Hurley].
His article also helped me a lot in building up the illumination map with Paint Shop Pro. Below are the color and alpha channels of the illumination map:
This map stores the diffuse reflection value in the color channel and the specular reflection value in the alpha channel. To get a per-pixel specular reflection, all the following examples use a gloss map stored in the alpha component of the color map:
Although the screenshot looks like a plain black-and-white image, the gloss map uses the full 8-bit alpha channel to store a range of values. The data for the point light is stored in pointlight.dds. This texture stores the function
f(x, y) = x * x + y * y
in the color channel and the function
g(z) = z * z
in the alpha channel.
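To make the encoding concrete, the following sketch computes the values a texel of such an attenuation texture would hold. The helper names are mine, not from the sample; it assumes the texture coordinates in [0..1] are mapped back to light-vector components in [-1..1], matching the remapping done later in the vertex shader.

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical helpers: the value a texel of pointlight.dds would hold.
// u, v, w are texture coordinates in [0..1]; the shader maps the
// tangent-space light vector from [-1..1] into this range.
static uint8_t EncodePointLightColor(float u, float v)
{
    float x = 2.0f * u - 1.0f;
    float y = 2.0f * v - 1.0f;
    float f = x * x + y * y;            // f(x, y) = x * x + y * y
    if (f > 1.0f) f = 1.0f;             // clamp to the representable range
    return static_cast<uint8_t>(f * 255.0f + 0.5f);
}

static uint8_t EncodePointLightAlpha(float w)
{
    float z = 2.0f * w - 1.0f;
    float g = z * z;                    // g(z) = z * z
    return static_cast<uint8_t>(g * 255.0f + 0.5f);
}
```

At the center of the texture (u = v = w = 0.5) both functions are zero, which is why the attenuation is strongest there.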
Both pictures show that the attenuation values are stored as bidirectional values, because a point light, unlike a spot light, doesn't emit along a single direction. The right picture in figure 4 shows the attenuation gradient.
This example needs two passes to run the necessary vertex and pixel shaders. The first shader pair (PerPixelPointLight.vsh / PointLight.psh) is responsible for the point light effect and the second shader pair (SpecDot3Pix.vsh / SpecDot3.psh) is responsible for the diffuse and specular reflection. Below is the relevant source from Render():
// first pass: attenuation
m_pd3dDevice->SetTexture(0, m_pPointLightTexture);
m_pd3dDevice->SetTexture(1, m_pPointLightTexture);

// set the pixel shader
m_pd3dDevice->SetPixelShader(m_dwPixShaderPointLight);

// set the vertex shader
m_pd3dDevice->SetVertexShader(m_dwVertShaderPointLight);

m_pd3dDevice->SetStreamSource(0, m_pVertices, sizeof(ShaderVertex));
m_pd3dDevice->SetIndices(m_pIndexBuffer, 0);
m_pd3dDevice->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, m_iNumVertices,
                                   0, m_iNumTriangles);

m_pd3dDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
// SrcColor * 0 + DestColor * SrcColor
m_pd3dDevice->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_ZERO);
m_pd3dDevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_SRCCOLOR);

// second pass: diffuse and specular reflection
m_pd3dDevice->SetTexture(0, m_pColorTexture);
m_pd3dDevice->SetTexture(1, m_pNormalMap);
m_pd3dDevice->SetTexture(3, m_pIllumMap);
m_pd3dDevice->SetPixelShader(m_dwPixShaderDot3);
m_pd3dDevice->SetVertexShader(m_dwVertexSpecular);
m_pd3dDevice->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, m_iNumVertices,
                                   0, m_iNumTriangles);

m_pd3dDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, FALSE);
The most important section in this source snippet is the alpha blending part that multiplies the two effects with each other:
// SrcColor * 0 + DestColor * SrcColor
m_pd3dDevice->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_ZERO);
m_pd3dDevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_SRCCOLOR);
Every pass calls the same DrawIndexedPrimitive() function, and the vertex and index buffers are set only once. Using alpha blending this way makes it possible to spread the lighting calculation across multiple passes without losing the ability to render multiple overlapping lights.
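The effect of this blend setup can be sketched as a one-line function (per color channel, colors in [0..1]); the name is mine, not from the sample:

```cpp
#include <cassert>

// final = SrcColor * D3DBLEND_ZERO + DestColor * D3DBLEND_SRCCOLOR
//       = dest * src
// The attenuation written by the first pass (dest) simply scales the
// lighting result of the second pass (src) per pixel.
static float BlendModulate(float dest, float src)
{
    return dest * src;
}
```

Because the result is a pure per-pixel product, the order of the two passes does not matter mathematically; only the blend states have to be set between them.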
First Pass: Point Light Effect
The first shader pair is responsible for the point light effect. You can find a similar implementation of a per-pixel point light in several NVIDIA and ATI examples. Using a point light with an attenuation map is extensively discussed in articles by Sim Dietrich [Dietrich], by Dan Ginsburg/Dave Gosselin [Ginsburg/Gosselin], by Kelly Dempski [Dempski] and others.
Sim Dietrich shows in his article that the following function encoded in an attenuation map delivers good results:
attenuation = 1 - d * d // d = distance
which stands for
attenuation = 1 - (x * x + y * y + z * z)
To store this formula in a 2D texture, it is split up into two functions (f(x, y) = x * x + y * y and g(z) = z * z). The first function is stored in the three components of the color channel of the texture and the second function is stored in the alpha channel. Dan Ginsburg/Dave Gosselin and Kelly Dempski divide the squared distance by a range constant, which defines the distance over which the point light attenuation takes effect:
attenuation = 1 - ((x/r)^2 + (y/r)^2 + (z/r)^2)
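A CPU-side sketch of this range-scaled attenuation term (the function name is mine) would look like this; the result is clamped so that pixels beyond the light range receive no light instead of a negative value:

```cpp
#include <cassert>

// attenuation = 1 - ((x/r)^2 + (y/r)^2 + (z/r)^2), clamped to [0..1].
// x, y, z: light vector components; range: the range constant r.
static float PointLightAttenuation(float x, float y, float z, float range)
{
    float xr = x / range, yr = y / range, zr = z / range;
    float a = 1.0f - (xr * xr + yr * yr + zr * zr);
    return a < 0.0f ? 0.0f : a;
}
```

At the light position the attenuation is 1, and it falls off quadratically to 0 at distance r.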
The division of the light vector by the range is calculated in the vertex shader named PerPixelPointLight.vsh:
vs.1.1

; position in clip space
m4x4 oPos, v0, c8

; position in world space
m4x4 r2, v0, c0

; get light vector
add r10, r2, c12

; divide each component by the range value
mul r10, r10, c33.x

; multiply by 0.5 and add 0.5 to get all values into the range [0..1]
mad r10, r10, c33.yyyy, c33.yyyy

; map the x and y components into the first texture
mov oT0.xy, r10.xy

; map the z component into the second texture
mov oT1.x, r10.z
Because the light vector is in tangent space, its x, y and z values can be used to access a 2D/1D texture. Therefore the x and y values of the light vector are stored as texture coordinates in oT0 to access the rgb color values of the point light texture (see figure 4) in the pixel shader. The z value of the light vector is stored in oT1 to access the alpha value of the same texture in the pixel shader:
ps.1.1

tex t0                ; f(x, y) from the color channel
tex t1                ; g(z) from the alpha channel

add r0, 1-t0, -t1.a   ; (1 - t0) - t1.a
Although it is the same texture, we had to set and access it twice to be able to fetch its rgb and alpha values with different texture coordinate values.
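Putting the vertex and pixel shader together, the per-pixel attenuation can be traced on the CPU with a small sketch (my helper name, not from the sample). It assumes ps.1.1 clamps the result of the add to [0..1], and shows that the remap into [0..1] and back cancels out, leaving exactly 1 - (x² + y² + z²) over the range:

```cpp
#include <cassert>

// x, y, z: tangent-space light vector; range: point light range.
static float PerPixelAttenuation(float x, float y, float z, float range)
{
    // vertex shader: divide by the range, then * 0.5 + 0.5
    float u = (x / range) * 0.5f + 0.5f;
    float v = (y / range) * 0.5f + 0.5f;
    float w = (z / range) * 0.5f + 0.5f;

    // texture contents: f(x, y) in the color channel, g(z) in alpha
    float xs = 2.0f * u - 1.0f, ys = 2.0f * v - 1.0f, zs = 2.0f * w - 1.0f;
    float t0  = xs * xs + ys * ys;   // f(x, y)
    float t1a = zs * zs;             // g(z)

    // pixel shader: add r0, 1-t0, -t1.a (result saturated to [0..1])
    float r0 = (1.0f - t0) - t1a;
    if (r0 < 0.0f) r0 = 0.0f;
    if (r0 > 1.0f) r0 = 1.0f;
    return r0;
}
```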
Second Pass: Diffuse and Specular Reflection
The pixel shader in the file SpecDot3.psh calculates the diffuse and specular reflection with the help of the illumination map and the gloss value in the alpha channel of the color map. It looks like this:
ps.1.1

tex t0                 ; color map in t0.rgb + gloss map in t0.a
tex t1                 ; normal map
texm3x2pad t2, t1_bx2  ; u = t1 dot (t2) light vector
texm3x2tex t3, t1_bx2  ; v = t1 dot (t3) half vector
                       ; fetch the illumination map (stage 3) at (u, v):
                       ; t3.rgb = (N dot L)
                       ; t3.a   = (N dot H)^16

; r0 = (diffuse * color) + ((N dot H)^16 * gloss value)
mul r1.rgb, t3, t0     ; (N dot L) * base texture
+mul r1.a, t0.a, t3.a  ; (N dot H)^16 * gloss value
add r0, r1.a, r1
The light and the half vector in (t2) and (t3) are calculated and normalized in the vertex shader in the file SpecDot3Pix.vsh. Together they form a 3x2 matrix that is multiplied by the normal vector to fetch the color value from the illumination map in t3.
This color value from the illumination map has stored the diffuse reflection value in the color channels and the specular reflection value in the alpha channel as shown above in figure 2.
The arithmetic instructions multiply the diffuse reflection value with the color from the color map t0, multiply the specular value from the illumination map with the gloss value from the alpha channel of the color map and add the results from both operations together.
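The arithmetic part of the shader can be sketched per channel as follows (the function name is mine; it assumes ps.1.1 saturates the final add to [0..1]):

```cpp
#include <cassert>

// diffuse, specular: from the illumination map lookup (t3.rgb, t3.a)
// color, gloss:      from the color map (t0.rgb, t0.a)
static float CombineLighting(float diffuse, float color,
                             float specular, float gloss)
{
    float r1  = diffuse * color;    // mul r1.rgb, t3, t0
    float r1a = specular * gloss;   // +mul r1.a, t0.a, t3.a
    float r0  = r1 + r1a;           // add r0, r1.a, r1
    return r0 > 1.0f ? 1.0f : r0;   // saturated result
}
```

Note that a zero gloss value switches the specular term off entirely, which is what makes the per-pixel gloss map effective.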
Compared to the pixel shader used in RacorX9, this shader eats up all texture coordinate registers available on ps.1.1 hardware. Unfortunately, the light vector can't be stored in one of the color value registers, because the texm* instruction pair doesn't accept input registers other than texture coordinate registers.
There is a much more clever method shown by [Beaudoin/Guardado]. They calculate a specular value by approximating a non-integer power function on a ps.1.1 pixel shader.
Compared to a point light calculated in the vertex shader with the dst instruction, the attenuation function used in this example is much simpler, but it works on a per-pixel basis and therefore looks much better.
This example and the following examples are based on a slightly different framework than the examples in the Introduction, because a precision error in D3DXComputeTangent() showed up when I normalized the light vector with a cube normalization map. Therefore I use the CreateBasisVectors() and FindAndFixDegenerateVertexBasis() functions from the NVIDIA EffectsBrowser/Cg Browser. You can find them in the file Dot3_Util.cpp. Please check the code comments in LoadXFile() for more info. I have to thank Tim Johnson [Johnson], who pointed me to that problem in the Microsoft DirectX forum.