This example shows how to use a cube map as a normalization map for the light vector.
Using a cube normalization map helps prevent the following problem: as the light gets closer to the polygon surface, the interpolated light vector becomes more and more unnormalized (it is shortened). As a result, the surface is actually less illuminated as the light approaches it than when the light is farther away. The cube normalization map is designed so that, given a texture coordinate representing a 3D vector, the output is always the normalized vector.
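The shortening effect is easy to demonstrate: linearly interpolating between two unit vectors, as the rasterizer does across a triangle, yields a vector shorter than unit length. A minimal sketch (the vector values are made up for illustration):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

float Length(const Vec3& v)
{
    return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
}

// Linear interpolation as performed by the rasterizer between two vertices.
Vec3 Lerp(const Vec3& a, const Vec3& b, float t)
{
    return { a.x + t * (b.x - a.x),
             a.y + t * (b.y - a.y),
             a.z + t * (b.z - a.z) };
}
```

With a light close to the surface, the per-vertex light vectors point in very different directions, so the interpolated vector in the middle of the triangle is noticeably shorter than 1 and the diffuse term is correspondingly too dark.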
Cube maps are made up of 6 square textures of the same size, representing a cube centered at the origin. Each cube face represents a set of directions along each major axis (+x, -x, +y, -y, +z, -z).
The normalization cube map is centered about the origin of the earth object. Each texel on the cube represents the unit light vector pointing from this origin through that texel.
A function that produces such a cube normalization map, CreateNormalizationCubeMap(), can be found in the file nvtex.cpp of the NVIDIA EffectsBrowser/Cg Browser source.
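The idea behind that function can be sketched for a single face. The helper below is a hypothetical stand-in, not the nvtex.cpp code: it computes the unit direction through each texel of the +x face, following the Direct3D cube map face convention (x is the major axis; s and t run along -z and -y):

```cpp
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

// Build one face (+x) of a normalization cube map: each texel stores the
// unit-length direction from the cube center through that texel.
// Hypothetical helper, modeled on what CreateNormalizationCubeMap() does.
std::vector<Vec3> BuildPositiveXFace(int size)
{
    std::vector<Vec3> face(size * size);
    for (int v = 0; v < size; ++v) {
        for (int u = 0; u < size; ++u) {
            // Map the texel center to [-1, 1] on the cube face.
            float s = 2.0f * (u + 0.5f) / size - 1.0f;
            float t = 2.0f * (v + 0.5f) / size - 1.0f;
            // Direction through this texel on the +x face.
            float x = 1.0f, y = -t, z = -s;
            float len = std::sqrt(x * x + y * y + z * z);
            face[v * size + u] = { x / len, y / len, z / len };
        }
    }
    return face;
}
```

Before uploading, each component is typically scaled and biased from [-1, 1] into the [0, 255] texel range, which is why the pixel shader later unbiases the fetched value with the _bx2 modifier.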
Because a ps.1.1 pixel shader cannot calculate both the diffuse reflection with a cube-map-normalized light vector and the specular reflection in a single pass, the effect is drawn onto the object in three passes. The first pass uses the cube normalization map to normalize the light vector and calculates the diffuse reflection, the second pass calculates the specular reflection, and the third pass draws the point light effect into the frame buffer. This all happens in Render():
// first pass: diffuse(cubemap) * color
m_pd3dDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
// SrcColor * 1 + DestColor * 1
m_pd3dDevice->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_ONE);
m_pd3dDevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_ONE);
m_pd3dDevice->SetTexture(0, m_pColorTexture);
m_pd3dDevice->SetTexture(1, m_pNormalMap);
m_pd3dDevice->SetTexture(2, m_pCubeTexture);
m_pd3dDevice->SetPixelShader(m_dwPixShaderDot3);
m_pd3dDevice->SetVertexShader(m_dwVertexSpecular);
m_pd3dDevice->SetStreamSource(0, m_pVertices, sizeof(ShaderVertex));
m_pd3dDevice->SetIndices(m_pIndexBuffer, 0);
m_pd3dDevice->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, m_iNumVertices,
                                   0, m_iNumTriangles);

// second pass: specular
m_pd3dDevice->SetTexture(3, m_pIllumMap);
m_pd3dDevice->SetPixelShader(m_dwPixelSpecular);
m_pd3dDevice->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, m_iNumVertices,
                                   0, m_iNumTriangles);

// third pass: attenuation
// SrcColor * 0 + DestColor * SrcColor
m_pd3dDevice->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_ZERO);
m_pd3dDevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_SRCCOLOR);
m_pd3dDevice->SetTexture(0, m_pPointLightTexture);
m_pd3dDevice->SetTexture(1, m_pPointLightTexture);
// set the pixel shader
m_pd3dDevice->SetPixelShader(m_dwPixShaderPointLight);
// set the vertex shader
m_pd3dDevice->SetVertexShader(m_dwVertShaderPointLight);
m_pd3dDevice->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, m_iNumVertices,
                                   0, m_iNumTriangles);
m_pd3dDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, FALSE);
Alpha blending is used to blend the results of the different passes together. The following code snippet shows how the results of the first and second passes are added:
// SrcColor * 1 + DestColor * 1
m_pd3dDevice->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_ONE);
m_pd3dDevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_ONE);
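The two blend configurations used across the three passes reduce to simple per-channel arithmetic in the frame buffer. A sketch of both equations on 0..255 channel values:

```cpp
#include <algorithm>

// Passes 1 and 2 (additive): ONE * src + ONE * dst, clamped at white.
int BlendAdditive(int src, int dst)
{
    return std::min(src + dst, 255);
}

// Pass 3 (modulate): ZERO * src + SRCCOLOR * dst.
// The attenuation value only darkens what is already in the frame buffer.
int BlendModulate(int src, int dst)
{
    return src * dst / 255;
}
```

So the first two passes accumulate diffuse plus specular, and the third pass scales that sum by the point light attenuation.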
It is important to note that the same vertex shader is used for the first and the second pass, although it is explicitly set only in the first pass. Using the same vertex shader in two passes should lead to a small performance gain, because the vertex shader does not have to be uploaded to the graphics card a second time. The gain is probably even bigger with a software vertex shader implementation.
First Pass: Normalization of Light Vector/Calculation of Diffuse Reflection
The shader pair that uses the cube normalization map and calculates the diffuse reflection effect can be found in diffCubeMap.vsh and diffCubeMap.psh. The pixel shader source is pretty short:
ps.1.1
tex t0                  ; color map
tex t1                  ; normal map
tex t2                  ; cube map
dp3 r1, t2_bx2, t1_bx2  ; diffuse
mul r0, t0, r1
The light vector is stored as a texture coordinate in t2. The cube map is set to texture stage 2. By accessing the cube map with the light vector as the texture coordinate, the normalized light vector is fetched.
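Per fragment, the shader's effect can be sketched in C++, with the cube-map fetch replaced by an explicit normalization (which is exactly what the map delivers):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Stand-in for the cube-map fetch: returns the normalized input vector.
Vec3 NormalizeFetch(const Vec3& v)
{
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// dp3 r1, t2_bx2, t1_bx2 followed by mul r0, t0, r1, per color channel.
float DiffuseChannel(float color, const Vec3& normal, const Vec3& lightVec)
{
    Vec3 l = NormalizeFetch(lightVec);
    float nDotL = normal.x * l.x + normal.y * l.y + normal.z * l.z;
    if (nDotL < 0.0f) nDotL = 0.0f; // negative results end up clamped to 0
    return color * nDotL;
}
```

Note that a shortened interpolated light vector produces the same result as the full-length one, which is the whole point of the normalization map.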
Another way to normalize a vector, without a cube map, is shown in "Fundamentals of Pixel Shaders": the normalization can be approximated with arithmetic instructions directly in the pixel shader.
Compared to the previous example, the diffuse reflection value is calculated via a dp3 instruction in the pixel shader, without using the illumination map. The illumination map is still used in the second pass to calculate the specular reflection.
Second Pass: Specular Reflection
The pixel shader specular.psh, which is used to calculate the specular reflection, is nearly identical to the one used in RacorX8.
ps.1.1
tex t0                    ; color map + gloss map
tex t1                    ; normal map
texm3x2pad t2, t1_bx2     ; u = t1 dot t2 (half vector)
texm3x2tex t3, t1_bx2     ; v = t1 dot t3 (half vector)
                          ; fetch texture stage 3 at (u, v)
                          ; t3.a = (N dot H)^16
mul r0, t0.a, t3.a        ; (N dot H)^16 * gloss value
This shader now handles only the specular reflection, so all code relating to the calculation of the diffuse reflection is omitted. It is also important to note that using the texm3x2pad/texm3x2tex pair to load a value from a specular map is inefficient, because a single lookup into the illumination map with the dot product of the half vector and the normal should be sufficient. Using the texm3x2tex instruction alone is not possible, however, because this instruction must always be paired with a texm3x2pad instruction.
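What this pass computes per fragment can be sketched in C++, assuming the illumination map returns (N dot H)^16 in its alpha channel as the shader comments state:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Specular term of the second pass: the illumination map lookup is
// replaced by an explicit power function for illustration.
float SpecularChannel(float gloss, const Vec3& normal, const Vec3& half)
{
    float nDotH = normal.x * half.x + normal.y * half.y + normal.z * half.z;
    if (nDotH < 0.0f) nDotH = 0.0f;
    float spec = std::pow(nDotH, 16.0f); // value encoded in the map's alpha
    return gloss * spec;                 // mul r0, t0.a, t3.a
}
```

The gloss value from the color map's alpha channel masks the highlight, so shiny and dull regions of the surface can coexist in one pass.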
A more elegant solution is possible by using the texdp3tex instruction together with a 1D specular map, which is shown in the next example.
Third Pass: Point Light Effect
The vertex and pixel shader for the point light effect are the same as in RacorX10.