Transparency in D3D Immediate Mode
by Nathan Davies

Please visit my website, Alamar's Domain...Thanks!

The purpose of this tutorial is to teach you, the reader, what steps are necessary to draw primitives in Direct3D's Immediate Mode with textures containing transparency. What I mean by this is that each pixel/texel of the texture is either visible or not visible; in other words, the alpha value is either full or zero. This method has the same effect as using source color keys in DirectDraw, but isn't nearly as easy to do in Direct3D.
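To make the "all or nothing" alpha idea concrete, here is a small standalone sketch (no DirectX required; PackARGB1555 is a name I made up for illustration) that packs a colour into the 16-bit ARGB 1555 format used later in this tutorial, where one bit decides whether the texel is visible:

```cpp
#include <cstdint>

// Pack an 8-bit-per-channel colour into a 16-bit ARGB 1555 texel.
// The single alpha bit (0x8000) makes the texel either fully visible
// or fully transparent -- the "full or zero" alpha described above.
uint16_t PackARGB1555(uint8_t r, uint8_t g, uint8_t b, bool visible)
{
    uint16_t texel = (uint16_t)(((r >> 3) << 10) | ((g >> 3) << 5) | (b >> 3));
    if (visible)
        texel |= 0x8000;   // set the 1-bit alpha
    return texel;
}
```

With only one alpha bit there is no "partially see-through"; a texel drawn with alpha 0 behaves exactly like a color-keyed pixel in DirectDraw.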

This tutorial is NOT here to teach you how to set up Direct3D or how to render primitives. I assume you already know how to do these things. I have also removed most error-checking code because I assume you have your own methods already. In a perfect world you should check the return value of every function call and react accordingly.

There are five main steps involved in rendering textures in this way:

1. Enumerating Texture Formats
2. Creating the Texture
3. Loading the Image
4. Setting up the Transparency
5. Rendering the Texture

Enumerating Texture Formats

Just like all the other DirectX enumeration functions, EnumTextureFormats' first parameter is a pointer to your callback function. The second parameter is a void pointer; in this case we will be sending the address of a DDPIXELFORMAT variable. The EnumTextureFormats function is called using your LPDIRECT3DDEVICE3 pointer. A good place for this code is during DirectX initialization, right after QueryInterface'ing for the D3DDevice pointer.

DDPIXELFORMAT TexturePixelFormat;
ZeroMemory( &TexturePixelFormat, sizeof( DDPIXELFORMAT ));  // so dwSize stays 0 if nothing is found
pD3DDevice->EnumTextureFormats(( LPD3DENUMPIXELFORMATSCALLBACK )EnumTextures, ( void* )&TexturePixelFormat );
// If a texture format was NOT found, use the current back surface format instead
if( TexturePixelFormat.dwSize != sizeof( DDPIXELFORMAT ))
    pBackSurface->GetPixelFormat( &TexturePixelFormat );

The LPD3DENUMPIXELFORMATSCALLBACK type is a pointer to a function that takes two parameters. The first parameter is of type LPDDPIXELFORMAT and holds the pixel format information for the current enumeration; the second is the void pointer you passed to EnumTextureFormats. The following code is an example of how you might write this function. This version will only accept 16-bit textures with a single bit of alpha. The only two texture formats like this that I am aware of are RGBA 5551 and ARGB 1555. If the desired texture format is found, the enumeration is stopped and the supplied DDPIXELFORMAT value is filled in. If the desired format is not found, the format is set from the current primary/back buffer format as above.

HRESULT CALLBACK EnumTextures( LPDDPIXELFORMAT DDPixelFormat, LPVOID pDDDesiredPixelFormat )
{
    if(( DDPixelFormat->dwFlags & DDPF_ALPHAPIXELS ) && DDPixelFormat->dwRGBBitCount == 16 )
    {
        // Alpha in the low bit = RGBA 5551; alpha in the high bit = ARGB 1555
        if( DDPixelFormat->dwRGBAlphaBitMask == 0x0001 || DDPixelFormat->dwRGBAlphaBitMask == 0x8000 )
        {
            memcpy( pDDDesiredPixelFormat, DDPixelFormat, sizeof( DDPIXELFORMAT ));
            return D3DENUMRET_CANCEL;
        }
    }
    return D3DENUMRET_OK;
}
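The filtering logic in that callback can be exercised on its own. The sketch below is a DirectX-free mock (the struct and function names are mine, and the struct only stands in for the DDPIXELFORMAT fields the callback actually reads):

```cpp
#include <cstdint>

// Minimal stand-in for the DDPIXELFORMAT fields the callback inspects.
// (Field names echo the DirectDraw structure; this is not the real header.)
struct PixelFormat
{
    bool     hasAlphaPixels;   // stands in for the DDPF_ALPHAPIXELS flag
    uint32_t rgbBitCount;      // dwRGBBitCount
    uint32_t alphaBitMask;     // dwRGBAlphaBitMask
};

// The same test EnumTextures applies: 16-bit, one bit of alpha, laid out
// either as RGBA 5551 (mask 0x0001) or ARGB 1555 (mask 0x8000).
bool IsOneBitAlphaFormat(const PixelFormat& pf)
{
    return pf.hasAlphaPixels && pf.rgbBitCount == 16 &&
           (pf.alphaBitMask == 0x0001 || pf.alphaBitMask == 0x8000);
}
```

A 4444 format, for example, has an alpha mask of 0xF000 and is rejected, because its four alpha bits would allow partial transparency we are not handling here.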

Creating the Texture

Creating the texture involves creating a surface and QueryInterface'ing it for the texture. For this you'll need a DDSURFACEDESC2 variable initialized with the usual information. You will also need to supply the pixel format from the section above. You then create the surface using this information and query for the texture.

DDSURFACEDESC2 Desc;
ZeroMemory( &Desc, sizeof( DDSURFACEDESC2 ));
Desc.dwSize = sizeof( Desc );
Desc.dwFlags = DDSD_CAPS | DDSD_WIDTH | DDSD_HEIGHT | DDSD_PIXELFORMAT;
Desc.dwWidth = Width;
Desc.dwHeight = Height;
Desc.ddsCaps.dwCaps = DDSCAPS_TEXTURE;
Desc.ddsCaps.dwCaps2 = DDSCAPS2_TEXTUREMANAGE;
Desc.ddpfPixelFormat = TexturePixelFormat;
pDD->CreateSurface( &Desc, &TheSurface, NULL );
TheSurface->QueryInterface( IID_IDirect3DTexture2, ( void** )&TheTexture );

Loading the Image

Loading the image, or setting up the surface, requires opening a BMP file and copying its contents to the surface. The following code uses a DC for this purpose, since BitBlt will convert from the 24-bit BMPs I use to the 16-bit format of the surface. You could of course write all this yourself, or better yet load from an image file that already has an alpha channel in it, but that's not the purpose of this tutorial. That and I haven't done it myself :)

// hBM is an HBITMAP you have already loaded (with LoadImage, for example)
HDC hDCImage, hDC;
BITMAP BM;
GetObject( hBM, sizeof( BM ), &BM );
hDCImage = CreateCompatibleDC( NULL );
SelectObject( hDCImage, hBM );
if( SUCCEEDED( TheSurface->GetDC( &hDC )))
{
    BitBlt( hDC, 0, 0, TextureWidth, TextureHeight, hDCImage, 0, 0, SRCCOPY );
    TheSurface->ReleaseDC( hDC );
}
DeleteDC( hDCImage );
DeleteObject( hBM );
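One detail the GetDC/BitBlt route hides is worth knowing before the next section: a locked surface is not a flat array. Each row starts lPitch bytes after the previous one, and the pitch may be larger than width times bytes-per-pixel. This sketch (my own helper, not a DirectDraw call) copies a tightly packed 16-bit image into such a pitched buffer:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Copy a tightly packed 16-bit image into a surface whose rows are
// separated by 'pitch' bytes. Advancing by pitch rather than by width
// is exactly what the transparency loop below has to do as well.
void CopyToPitchedSurface(void* surface, long pitch,
                          const uint16_t* image,
                          unsigned width, unsigned height)
{
    for (unsigned y = 0; y < height; y++)
    {
        uint8_t* row = (uint8_t*)surface + y * pitch;   // advance by pitch, not width
        std::memcpy(row, image + y * width, width * sizeof(uint16_t));
    }
}
```

If you index the surface as a plain width*height array instead, every row after the first lands in the wrong place whenever the pitch has padding.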

Setting up the Transparency

Setting up the transparency is a simple matter of locking the texture's surface, scanning through it one colour at a time (2 bytes per texel for a 16-bit surface) and setting the alpha value to 0 or 1. There is an example very similar to this in the SDK framework files. The code below locks the surface, then retrieves the alpha mask and the RGB masks. Since the two possible formats are 1555 or 5551, AlphaMask | RGBMask will always equal 0xFFFF. The code then steps through each pixel in the surface. If the pixel is black, it is cleared (just in case it already contained alpha information); otherwise the pixel is OR'd with the AlphaMask to make it visible. This ensures that anything in the texture that was black is now invisible. If you want other colors to be invisible, you need only compare each pixel to your color. The problem with this, however, is that colors have different values in different modes. One way to get around this is to make the first or last pixel in any bitmap your transparent color, then check each pixel against that color.

DDSURFACEDESC2 Desc;
ZeroMemory( &Desc, sizeof( Desc ));
Desc.dwSize = sizeof( Desc );
TheSurface->Lock( NULL, &Desc, DDLOCK_WAIT, NULL );
WORD AlphaMask = ( WORD )Desc.ddpfPixelFormat.dwRGBAlphaBitMask;
WORD RGBMask = ( WORD )( Desc.ddpfPixelFormat.dwRBitMask | Desc.ddpfPixelFormat.dwGBitMask | Desc.ddpfPixelFormat.dwBBitMask );
WORD* SurfPtr;
for( DWORD y = 0; y < Desc.dwHeight; y++ )
{
    SurfPtr = ( WORD* )(( BYTE* )Desc.lpSurface + y * Desc.lPitch );
    for( DWORD x = 0; x < Desc.dwWidth; x++ )
    {
        if(( *SurfPtr & RGBMask ) == 0 )
            *SurfPtr = 0;            // black: clear it, alpha = 0 (invisible)
        else
            *SurfPtr |= AlphaMask;   // anything else: set the alpha bit (visible)
        SurfPtr++;
    }
}
TheSurface->Unlock( NULL );
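The same keying pass can be run on a plain 16-bit buffer, which makes it easy to check. This is a DirectX-free sketch of the loop above (the function name is mine; the masks in the test assume ARGB 1555):

```cpp
#include <cstdint>

// The transparency pass from above on a flat buffer of 16-bit texels:
// any texel whose colour bits are all zero (black) is cleared to alpha 0;
// every other texel gets the alpha bit OR'd in.
void ApplyBlackColourKey(uint16_t* texels, unsigned count,
                         uint16_t rgbMask, uint16_t alphaMask)
{
    for (unsigned i = 0; i < count; i++)
    {
        if ((texels[i] & rgbMask) == 0)
            texels[i] = 0;            // black: fully transparent
        else
            texels[i] |= alphaMask;   // anything else: fully visible
    }
}
```

Note that a texel that is black but already has its alpha bit set (0x8000 in ARGB 1555) still counts as black, because the alpha bit is masked off before the comparison; that is the "just in case it already contained alpha information" case from the text.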

Rendering the Texture

Rendering the texture is done the same way as any other texture, using one of the DrawPrimitive functions. Alpha blending does not need to be on, since this is not blending. However, unless your transparent textures are only going to be used as overlays (like cursors or menu interfaces), you do need to turn on alpha testing. Alpha testing compares each pixel's alpha value against a reference value and discards pixels that fail the comparison, so those pixels never write to the Z-buffer. With it disabled, drawing a rectangle updates the Z-buffer across the whole rectangle, so even though you can see through the transparent section of the texture, it can still chop off part of an object behind it. For a simple example, render a rectangle with a texture whose center is transparent, and render a cube in the center of the transparent section. You will only see the front half of the cube. To fix this, just set the render states accordingly:

pD3DDevice->SetRenderState( D3DRENDERSTATE_ALPHATESTENABLE, TRUE );
pD3DDevice->SetRenderState( D3DRENDERSTATE_ALPHAFUNC, D3DCMP_GREATEREQUAL );
pD3DDevice->SetRenderState( D3DRENDERSTATE_ALPHAREF, 0x02 );
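What the alpha test buys you can be illustrated in software. This sketch (purely illustrative, not DirectX code; all names are mine) mimics a GREATEREQUAL alpha test in front of an ordinary depth test -- a failing fragment is discarded before it can touch either buffer:

```cpp
#include <cstdint>

// One destination pixel: its colour and its current Z-buffer depth.
struct Pixel { uint16_t colour; float depth; };

// Write a fragment through an alpha test (GREATEREQUAL against alphaRef)
// followed by a depth test. Returns true if the fragment was written.
bool AlphaTestWrite(Pixel& dest, uint16_t srcColour, uint8_t srcAlpha,
                    float srcDepth, uint8_t alphaRef)
{
    if (srcAlpha < alphaRef)          // fails the alpha test:
        return false;                 // discarded, Z-buffer untouched
    if (srcDepth < dest.depth)        // ordinary depth test
    {
        dest.colour = srcColour;
        dest.depth  = srcDepth;
        return true;
    }
    return false;
}
```

Without the alpha-test branch, a transparent texel at depth 0.5 would still set dest.depth to 0.5 and later clip anything behind the "hole" -- which is exactly the half-cube artifact described above.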

It's usually a good idea to turn alpha testing off when it's not needed:

pD3DDevice->SetRenderState( D3DRENDERSTATE_ALPHATESTENABLE, FALSE );
Well I hope this information was in some way useful. If you find any problems or typos, please let me know at

Copyright Feb 15, 1999 Nathan Davies


Date this article was posted: 10/13/1999
(Note that this date does not necessarily correspond to the date the article was written)
