Title: Lighting for Games
1. Lighting for Games
Kenneth L. Hurley
2. Agenda
- Introduction to Lighting
- What is Radiosity?
- Lightmaps
- Per-Pixel Lighting
- High Dynamic Range Images
- Low Dynamic Range Images
- BRDFs
3. Introduction to Lighting
- Ambient Lighting
- I = Ia x Ka
4. Introduction to Lighting
- Diffuse Lighting
- I = Ip x Kd x (N . L)
5. Introduction to Lighting
- Phong Shading
- I = Ks x (R . V)^n
- Reflection Calculation
- R = (2 x N x (N . L)) - L
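To tie slides 3-5 together, here is a minimal CPU-side sketch of the ambient, diffuse and Phong specular terms for a single light. The Vec3 type, the helper functions and the parameter names are illustrative assumptions, not part of the original slides:

#include <cmath>

// Minimal sketch of the lighting terms on slides 3-5.
struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b)    { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3  scale(Vec3 v, float s) { return { v.x*s, v.y*s, v.z*s }; }
static Vec3  sub(Vec3 a, Vec3 b)    { return { a.x-b.x, a.y-b.y, a.z-b.z }; }

// N = surface normal, L = unit vector to the light, V = unit vector to the viewer
float phongIntensity(Vec3 N, Vec3 L, Vec3 V,
                     float Ia, float Ka,   // ambient light intensity and coefficient
                     float Ip, float Kd,   // point-light intensity and diffuse coefficient
                     float Ks, float n)    // specular coefficient and shininess exponent
{
    float ambient = Ia * Ka;                                   // I = Ia x Ka

    float NdotL   = dot(N, L);
    float diffuse = (NdotL > 0.0f) ? Ip * Kd * NdotL : 0.0f;   // I = Ip x Kd x (N . L)

    Vec3  R        = sub(scale(N, 2.0f * NdotL), L);           // R = (2 x N x (N . L)) - L
    float RdotV    = dot(R, V);
    float specular = (RdotV > 0.0f) ? Ks * std::pow(RdotV, n)  // Ks x (R . V)^n
                                    : 0.0f;

    return ambient + diffuse + specular;
}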
6. Radiosity
- What is Radiosity?
- Objects reflect light at different wavelengths
- Can create a scattered lighting effect
- Lightmaps are determined from radiosity solutions
- Ray tracing with diffuse reflection calculations is usually used to determine radiosity
7. Lightmaps
- Encodes a diffuse lighting solution in a separate texture
- Think of an interior building wall
- A brick surface pattern may be common to many walls and highly repeated
- The diffuse lighting solution is different for each wall, but typically low resolution
- Light maps decouple surface texture from the diffuse lighting contribution
- http://hcsoftware.sourceforge.net/RadiosGL/RadiosGL.html
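To make the decoupling concrete, one way to combine the repeated decal texture with its low-resolution lightmap on DX8 fixed-function hardware is a two-stage modulate, sketched below. The device and texture pointers, the function name, and the use of texture coordinate set 1 for the lightmap UVs are assumptions for illustration:

#include <d3d8.h>

// Sketch: stage 0 outputs the decal, stage 1 modulates it by the lightmap.
void setupLightmapBlend(IDirect3DDevice8* pDevice,
                        IDirect3DTexture8* pDecalTex,
                        IDirect3DTexture8* pLightmapTex)
{
    // Stage 0: the repeated decal texture, using UV set 0
    pDevice->SetTexture(0, pDecalTex);
    pDevice->SetTextureStageState(0, D3DTSS_TEXCOORDINDEX, 0);
    pDevice->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    pDevice->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_SELECTARG1);

    // Stage 1: modulate by the low-resolution lightmap, which has its own UV set
    pDevice->SetTexture(1, pLightmapTex);
    pDevice->SetTextureStageState(1, D3DTSS_TEXCOORDINDEX, 1);
    pDevice->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    pDevice->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);
    pDevice->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_MODULATE);
}

This is the same decal x lightmap (modulate) combination shown on the next slide.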
8. Lightmaps in Quake2
[Figure: decal texture only, lightmap only, and the combined (modulated) scene]
9. Gloss Map Example
[Figure: diffuse lighting contribution (per-vertex lighting), gloss map texture modulating the specular lighting contribution (per-vertex lighting), and the final combined result]
10. Per-Pixel Lighting Overview
- Introduction to per-pixel lighting
- Normal maps
- How to create them
- Tangent or surface-local space
- Why we need it
- How to use it
- Things to watch out for
- Animation and other topics
11. Per-Pixel Lighting
- Per-pixel lighting is the next leap in visual quality after simple multi-texturing
- It allows more apparent surface detail than would be possible with triangles alone
- DX7 HW with DOT3 was a huge leap in per-pixel capability
- DX8 HW increases performance again, and adds completely new capabilities
12. Examples
Simple geometry, high detail
Reflective bumps
A single quad lit per-pixel
13. Per-Pixel Lighting / Bump Mapping
- Bump mapping is a subset of per-pixel lighting
- These slides will discuss them interchangeably
- Most older bump mapping examples only performed diffuse directional lighting
- Bump mapping / per-pixel lighting can also be used for diffuse and/or specular point lights, spotlights and volumetric lights
14. Normal Maps are Bump Maps
- Height maps are popular (3DS Max, Maya, ...)
- Normal maps are better
[Figure: a height map and the corresponding normal map]
15. Creating Normal Maps
- Normal maps are easy to create from height maps
- Find the slope along each axis: dHeight/dU, dHeight/dV
- The cross product of the slopes gives the normal vector
- Convert the normal vector (X,Y,Z) in [-1,1] to an R,G,B color in [0,1]
- X → R, Y → G, Z → B
- Z is up, out of the image plane
- RGB (0.5, 0.5, 1.0) corresponds to XYZ (0, 0, 1)
- XYZ (0, -1, 0) → RGB (0.5, 0.0, 0.5)
- Surface normals mostly point up out of the bump map plane, so normal maps are mostly blue
[Figure: normal-map texels and the simulated surface they represent]
16. Creating Normal Maps From Height Maps
- Simplest: use the 4 nearest neighbors
- dz/du = (B.z - A.z) / 2.0f // U gradient
- dz/dv = (D.z - C.z) / 2.0f // V gradient
- Normal = Normalize( (dz/du) x (dz/dv) )
- x denotes cross-product
[Figure: neighbor samples A and B along U, C and D along V, around the center texel]
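A CPU-side sketch of the 4-neighbor approach on this slide, including the [-1,1] → RGB encoding from slide 15, might look like the following. The function name, the row-major float height layout and the bump-strength scale parameter are assumptions:

#include <cmath>

// Sketch of 4-neighbor normal-map generation from a height map in [0,1].
// heights is row-major, width*height floats; rgb receives 3 bytes per texel.
void heightToNormalMap(const float* heights, unsigned char* rgb,
                       int width, int height, float scale)
{
    for (int y = 0; y < height; ++y)
    {
        for (int x = 0; x < width; ++x)
        {
            // 4 nearest neighbors, clamped at the borders
            float A = heights[y * width + (x > 0 ? x - 1 : x)];           // left
            float B = heights[y * width + (x < width - 1 ? x + 1 : x)];   // right
            float C = heights[(y > 0 ? y - 1 : y) * width + x];           // up
            float D = heights[(y < height - 1 ? y + 1 : y) * width + x];  // down

            float dzdu = (B - A) / 2.0f;   // U gradient
            float dzdv = (D - C) / 2.0f;   // V gradient

            // Normal = Normalize( (1,0,s*dzdu) x (0,1,s*dzdv) ) = Normalize(-s*dzdu, -s*dzdv, 1)
            float nx = -scale * dzdu;
            float ny = -scale * dzdv;
            float nz = 1.0f;
            float len = std::sqrt(nx*nx + ny*ny + nz*nz);
            nx /= len; ny /= len; nz /= len;

            // Map [-1,1] to [0,255]: X -> R, Y -> G, Z -> B (mostly blue, as slide 15 notes)
            unsigned char* p = &rgb[(y * width + x) * 3];
            p[0] = (unsigned char)((nx * 0.5f + 0.5f) * 255.0f);
            p[1] = (unsigned char)((ny * 0.5f + 0.5f) * 255.0f);
            p[2] = (unsigned char)((nz * 0.5f + 0.5f) * 255.0f);
        }
    }
}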
17. Creating Normal Maps From Height Maps
- Make sure your height map uses the full range of gray values
- Get smoother results by sampling a larger area around each point: 3x3, 5x5, ...
- NVIDIA provides three tools
- Normal Map Generation Tool (best sampling)
- BumpMaker (simple 2-neighbor sampling)
- Photoshop plug-in
18. Creating Normal Maps From Geometry
- More esoteric approach
- Can be done in a DCC app
- Model surface detail in 3D
- Create detail up from a flat surface
- Render the surface with red, green, and blue directional lights, one color for each 3D axis
- Need negative lights as well as positive
- Orthographic projection
19. Creating Normal Maps From Geometry
- 5 lights, positive and negative
- Ambient = ( ½ , ½ , ½ )
[Figure: the five light directions: B pointing up out of the surface, R and -R along one axis, G and -G along the other]
20. Normal Map Applied to Geometry
- We now have a normal vector for each pixel of the object
- Use the normal in the standard N . L and N . H lighting equations
- The normal map vector is relative to the flat triangle it is on. It is NOT a normal in world or object space!
- N . L must have the Normal and Light Vector in the same coordinate system!
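As a sketch of what the hardware does per pixel, the following decodes one normal-map texel back to [-1,1] (the same expansion the _bx2 modifier performs later in the deck) and dots it with a light vector that is assumed to already be in the same texture-space coordinate system:

// Per-pixel N . L on the CPU, for clarity only.
float perPixelDiffuse(unsigned char r, unsigned char g, unsigned char b,  // normal-map texel
                      float Lx, float Ly, float Lz)                       // texture-space light vector (unit length)
{
    // Expand the stored [0,255] color back to a [-1,1] normal
    float Nx = r / 255.0f * 2.0f - 1.0f;
    float Ny = g / 255.0f * 2.0f - 1.0f;
    float Nz = b / 255.0f * 2.0f - 1.0f;

    float NdotL = Nx * Lx + Ny * Ly + Nz * Lz;
    return (NdotL > 0.0f) ? NdotL : 0.0f;   // clamp negative values, as the blend stages do
}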
21. The Light Vector
- With vertex lighting, we had
- Normal vector per vertex
- Light vector per vertex
- So far, we've got
- Normal vector per pixel
- We need a light vector for every pixel!
- Start with vector to light at each vertex
- HW iterates that vector across each triangle
- Iterated color or texture coordinate
22. Interpolated Vector -- Watch Out!
- We're interpolating between vectors linearly
- The interpolated vector is not normalized
- It can be shorter than unit length
- Only noticeable when the light is close to the object
[Figure: the normalized vectors at the two endpoints and the shorter, un-normalized interpolated vector between them]
23. Solution: Re-Normalize the Vector
- Do this only if you have to
- Only if the distance from the tri to the light is less than the longest edge of the tri, or some other metric
- What if you don't?
- Highlights are dimmer
- In rare cases you will notice a darkening near the light
- Use a normalization cube map
- Pixel Shaders: use one step of the Newton-Raphson technique to re-normalize
- Developed by Scott Cutler at NVIDIA
24. Normalization Cube Map
- Access the cube map with an un-normalized vector (U,V,W)
- The result is an RGB-encoded normalized vector in the same direction
- Input (0, 0, 0.8) → RGB (127, 127, 255)
- which is a normalized vector for per-pixel lighting
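Filling the cube map is straightforward: each texel stores the normalized direction that points through it, encoded as RGB. A sketch for a single face is shown below; the exact sign convention for mapping (u,v) to a direction on each face depends on the API's cube-map layout, so treat that part as an assumption:

#include <cmath>

// Sketch of filling one face (here +Z) of a normalization cube map.
void fillPositiveZFace(unsigned char* rgb, int size)
{
    for (int y = 0; y < size; ++y)
    {
        for (int x = 0; x < size; ++x)
        {
            // Texel center mapped to [-1,1] on the face
            float u = ((x + 0.5f) / size) * 2.0f - 1.0f;
            float v = ((y + 0.5f) / size) * 2.0f - 1.0f;

            // Direction through this texel on the +Z face
            // (the orientation of u and v follows the API's cube-map convention)
            float dx = u, dy = -v, dz = 1.0f;

            float len = std::sqrt(dx*dx + dy*dy + dz*dz);
            dx /= len; dy /= len; dz /= len;

            // Encode the normalized direction as RGB in [0,255]
            unsigned char* p = &rgb[(y * size + x) * 3];
            p[0] = (unsigned char)((dx * 0.5f + 0.5f) * 255.0f);
            p[1] = (unsigned char)((dy * 0.5f + 0.5f) * 255.0f);
            p[2] = (unsigned char)((dz * 0.5f + 0.5f) * 255.0f);
        }
    }
}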
25. Normalization Cube Map
- The cube map doesn't need to be huge
- 32x32x8
- 64x64x16
- www.nvidia.com/Developer
- Simple Dotproduct3 Bump Mapping demo
26. Newton-Raphson Re-Normalization
- One step of a numerical technique for normalizing a vector
- DX8 Pixel Shaders (or OGL Register Combiners)
- Faster than the normalization cube map
- Numerical method
- Normalize(V) ≈ V/2 x (3 - V . V)
- when V is close to unit length
- Great when the angle between a tri's interpolated vectors is no more than about 40º
- That's a big difference, so this is valid for most models and circumstances
27. Newton-Raphson in DX8
- Approximate V/2 x (3 - V . V)
- V/2 x (3 - V . V) = 1.5V - 0.5V(V . V) = V + 0.5V - 0.5V(V . V) = V + 0.5V(1 - (V . V))
- Pixel Shader code, V = t0 vector
- def c0, 0.5, 0.5, 0.5, 0.5
- mul r0, t0, c0 // 0.5 * V
- dp3 r1, t0, t0 // V DOT V
- mad r0, 1-r1, r0, t0 // 0.5V * (1 - V DOT V) + V
28. N . L Per-Pixel
- Can visualize the light vector (x,y,z) as an RGB color
- Same [-1,1] → [0,1] conversion as for the normal vector
[Figure: the normal map, the light vector L visualized as color, and the resulting per-pixel lighting]
29. What Coordinate System?
- The normal vector is expressed relative to each triangle
- This is surface-local space, aka texture space
- It's a 3D basis consisting of three axis vectors: S, T, S x T (x = cross product)
- Texture space depends on
- Geometric positions of the vertices
- U,V coordinates of the vertices, which determine how the normal map is applied
[Figure: the S, T and SxT basis vectors shown on two triangles]
30. How to Calculate Texture Space
- NVIDIA sample code!
- The D3DX utility library for DX8.1 will do it for you!
- If you must know:
- For each tri, find the derivatives of the U and V texture coordinates with respect to X, Y, and Z
- S vector = (dU/dX, dU/dY, dU/dZ)
- T vector = (dV/dX, dV/dY, dV/dZ)
- Then take S x T
- Now we have an S, T, SxT texture space basis for each triangle
- S, T, SxT is a transform from object space into texture space
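A per-triangle sketch of this computation is shown below, using the common edge/UV-delta formulation of those derivatives (S comes out as dP/dU and T as dP/dV). The Vec3 type and helpers are illustrative, and the degenerate-UV guard is an assumption:

// Sketch of a per-triangle texture-space basis.
struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b)    { return { a.x-b.x, a.y-b.y, a.z-b.z }; }
static Vec3 scale(Vec3 v, float s) { return { v.x*s, v.y*s, v.z*s }; }
static Vec3 cross(Vec3 a, Vec3 b)  { return { a.y*b.z - a.z*b.y,
                                              a.z*b.x - a.x*b.z,
                                              a.x*b.y - a.y*b.x }; }

void triangleBasis(Vec3 P0, Vec3 P1, Vec3 P2,
                   float u0, float v0, float u1, float v1, float u2, float v2,
                   Vec3& S, Vec3& T, Vec3& SxT)
{
    Vec3 E1 = sub(P1, P0), E2 = sub(P2, P0);     // object-space edges
    float du1 = u1 - u0, dv1 = v1 - v0;          // UV deltas along E1
    float du2 = u2 - u0, dv2 = v2 - v0;          // UV deltas along E2

    float det = du1 * dv2 - du2 * dv1;           // near zero for degenerate UVs
    float r   = (det != 0.0f) ? 1.0f / det : 0.0f;

    S   = scale(sub(scale(E1, dv2), scale(E2, dv1)), r);   // dP/dU
    T   = scale(sub(scale(E2, du1), scale(E1, du2)), r);   // dP/dV
    SxT = cross(S, T);

    // Per-vertex basis (next slide): accumulate each triangle's S, T, SxT into
    // its three vertices, average, and re-normalize -- analogous to building
    // vertex normals from face normals.
}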
31. Resultant Texture Space
- Express texture space per-vertex
- For each vertex's S vector, average the S vectors of the tris it belongs to
- Same for the T and SxT vectors
- Analogous to computing vertex normals from face normals!
[Figure: a vertex's S vector averaged from the S vectors of the surrounding triangles]
32. Add It to Your Geometry
- Add the S, T, SxT vectors to your vertex format (FVF)
- We can now transform the object space Light Vector into texture space
- This puts L in the same space as our normal map vectors, so N . L lighting will work
- DX7: must transform the light vector in SW
- Stuff it into the Diffuse or Specular color for iteration
- or a 3D texture coord for the Normalization Cube Map
- DX8: use a Vertex Shader to transform the light vector at each vertex
- Put it into a color or texture coord for iteration
33. DX7 vs. DX8 Hardware Implementation
- DX7 hardware
- Write the light vector to a color for iteration
- TextureStageState setup example
- COLORARG0 = D3DTA_DIFFUSE // light vec
- COLORARG1 = D3DTA_TEXTURE // normal map
- COLOROP = D3DTOP_DOTPRODUCT3
- DX8 hardware
- Write the light vector to a texture coord for iteration
- Various Pixel Shader program approaches
- tex t0 // normal map
- texcoord t1 // light vector
- dp3 r0, t0_bx2, t1 // _bx2 expands unsigned vals to [-1,1]
34. GeForce I, II Details
- Remember: under DX8, GeForce I & II have a new temporary result register
- Also new triadic ops with a 3rd argument
- D3DTOP_MULTIPLYADD, D3DTOP_LERP
- VertexBuffer->Lock(), write the light vector to a color or texture coord, VertexBuffer->Unlock()
- N . L * BaseTexture
- 0, COLORARG0 = D3DTA_DIFFUSE // light vec
- 0, COLORARG1 = D3DTA_TEXTURE // normal map
- 0, COLOROP = D3DTOP_DOTPRODUCT3
- 1, COLORARG0 = D3DTA_CURRENT // dot3 result
- 1, COLORARG1 = D3DTA_TEXTURE // base tex
- 1, COLOROP = D3DTOP_MODULATE
35. GeForce 3 Approach
- FVF: pos, nrm, diffuse, t0, S, T, SxT
- Declare vertex shader: S → v4, T → v5, SxT → v6
- SetVertexShaderConst( C_L, vLightPosObjSpace .. )
- vs.1.1
- dp3 oD1.x, v4, c[C_L]
- dp3 oD1.y, v5, c[C_L]
- dp3 oD1.z, v6, c[C_L]
- mov oD1.w, c[CV_ONE]
- ps.1.1
- tex t0 // base tex
- tex t1 // normal map
- dp3 r0, t1_bx2, v1_bx2
- mul r0, r0, t0
- // plenty of slots left if you want to do normalization
36. Animation
- Keyframe
- Don't blend between radically different keys
- Interpolate S, T, SxT and re-normalize (VShader)
- Matrix Palette Skinning
- Animate the S, T, SxT vectors with the same transform as for the normal vector
- A Vertex Shader program makes this trivial
- Try using the vertex Normal in place of SxT if you need room
37. Final Bump Map Thoughts
- Once you have texture space, you're all set for many other effects
- Normal maps can be created and modified on the fly very quickly with DX8 hardware!
- Normal Map + Detail Normal Map for added detail
- Similar to texture + detail texture
- Per-pixel lighting adds tremendous detail
38. High Dynamic Range Images
- Developed by Paul E. Debevec and Jitendra Malik
- http://www.debevec.org
- Radiance can vary beyond the precision of 8 bits
- Encodes radiance in floating-point values
- Demo at the site uses a GeForce2
- Commercial licensing required
39. Low Dynamic Range Images
- Simply lighting encoded in a cubemap
- Low precision, but can be effective for diffuse lighting
- Take high-resolution photographs of a mirrored ball from as many as 6 angles
40. Low Dynamic Range Images
- Align the images into cubemap faces.
41. Low Dynamic Range Images
- Run the cubemap through a diffuse convolution filter
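A sketch of that filter for one output direction is shown below: the diffuse value for a normal N is the cosine-weighted average of the source cubemap over the hemisphere around N. The EnvSample structure and the idea of pre-flattening the cubemap texels into a sample list are assumptions for illustration:

#include <vector>

// One environment sample: a cubemap texel's direction, color and solid angle.
struct EnvSample { float dir[3]; float rgb[3]; float solidAngle; };

// Diffuse convolution for a single normal direction N (unit length).
void diffuseConvolve(const std::vector<EnvSample>& env,
                     const float N[3], float outRgb[3])
{
    float sum[3] = { 0, 0, 0 };
    float weight = 0.0f;

    for (const EnvSample& s : env)
    {
        // cos(theta) between the normal and this environment direction
        float c = N[0]*s.dir[0] + N[1]*s.dir[1] + N[2]*s.dir[2];
        if (c <= 0.0f) continue;                 // only the hemisphere above N

        float w = c * s.solidAngle;              // cosine-weighted contribution
        sum[0] += s.rgb[0] * w;
        sum[1] += s.rgb[1] * w;
        sum[2] += s.rgb[2] * w;
        weight += w;
    }

    for (int i = 0; i < 3; ++i)
        outRgb[i] = (weight > 0.0f) ? sum[i] / weight : 0.0f;
}

Running this once per texel of the output cubemap, with N set to that texel's direction, produces the diffusely convolved lighting cubemap.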
42. Low Dynamic Range Images
43. BRDFs
- Principles of BRDF Lighting
44. What is a BRDF?
- BRDF stands for Bi-directional Reflectance Distribution Function
- A BRDF is a function of the incoming light direction and the outgoing view direction at a surface
- In 3D, a direction D can be represented in spherical coordinates (θD, φD)
- A BRDF is a 4D function: BRDF( θL, φL, θV, φV )
45. Multi-Texture BRDF Approximations
- Basic idea
- Approximate the 4D function with lower-dimensional functions
- Separate the BRDF into products of simpler functions
- BRDF(L,V) ≈ G1(L)H1(V) + G2(L)H2(V)
- Minnaert reflections are a little easier
- Only encode (L . N) and (V . N)
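A sketch of evaluating such a separable approximation with two pairs of 2D lookup tables (which would be 2D textures combined with multiplies and an add on the GPU) is shown below. The Table2D type, the two-term split and the remapping of the angles to [0,1] are assumptions:

// Minimal 2D lookup table standing in for a 2D texture.
struct Table2D
{
    int w, h;
    const float* data;                        // w*h values
    float sample(float s, float t) const      // s,t in [0,1], nearest sample
    {
        int x = (int)(s * (w - 1) + 0.5f);
        int y = (int)(t * (h - 1) + 0.5f);
        return data[y * w + x];
    }
};

// thetaL,phiL index the light direction; thetaV,phiV the view direction,
// all already remapped to [0,1] before lookup.
float approxBRDF(const Table2D& G1, const Table2D& H1,
                 const Table2D& G2, const Table2D& H2,
                 float thetaL, float phiL, float thetaV, float phiV)
{
    // BRDF(L,V) ~= G1(L) * H1(V) + G2(L) * H2(V)
    return G1.sample(thetaL, phiL) * H1.sample(thetaV, phiV)
         + G2.sample(thetaL, phiL) * H2.sample(thetaV, phiV);
}

On multi-texture hardware each product maps to two texture lookups modulated together, with the second term accumulated in another stage or pass.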
46. BRDF Examples
47. References
- Computer Graphics at University of Leeds, http://www.comp.leeds.ac.uk/cuddles/hyperbks/Rendering/index.html
- Paul E. Debevec and Jitendra Malik. Recovering High Dynamic Range Radiance Maps from Photographs. In SIGGRAPH 97, August 1997.
48. Questions?
www.nvidia.com/Developer