1
Computer Graphics - The Rendering Pipeline Review
  • CO2409 Computer Graphics
  • Week 13

2
Lecture Contents
  • The Rendering Pipeline
  • Input Geometry
  • Vertex Data / Primitive Data
  • Vertex Processing
  • Matrix Transformations
  • Lighting
  • Geometry Processing
  • Pixel Processing
  • Textures
  • Pixel Rendering

3
The Rendering Pipeline
  • We looked at the rendering pipeline of DirectX
  • Have seen each stage now
  • Except tessellation, which is not commonly used
  • Will now review each stage in turn

4
Vertex Data
  • Initially identified the input vertex data as the
    vertex coordinates of our geometry
  • (X,Y,Z) position for each vertex
  • Other parts of the pipeline also need per-vertex
    data
  • Vertex normals
  • if using lighting to find colour at vertex
  • Vertex colours
  • if not using lighting
  • Texture coordinates / UVs
  • if we use textures
  • Multiple sets if we blend multiple textures

5
Vertex Data Formats
  • Input vertex data is more than just a simple list
    of vertex positions
  • Data required depends on how we render
  • With textures, lighting etc.
  • Use custom vertex data formats for flexibility,
    e.g.

// Vertex coord + colour
// Fixed colour polys with no lighting or texture
struct CUSTOMVERTEX1
{
    FLOAT vx, vy, vz;
    DWORD colour;
};

// Vertex coord, normal + texture coord
// Lit and textured polys
struct CUSTOMVERTEX2
{
    FLOAT vx, vy, vz;
    FLOAT nx, ny, nz;
    FLOAT u, v;
};
6
Vertex Buffers
  • A vertex buffer is just an array of vertices
  • Defining the set of vertices in our geometry
  • Each vertex uses a custom format as above
  • The buffer is managed by DirectX
  • We must ask DirectX to create and destroy them
  • and to access the data inside (called locking)
  • Stored in main memory or video card memory
  • We have some control when we create them
  • First saw vertex buffers in the week 5 lab - a
    creation/locking sketch follows below
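
As an illustrative sketch only (not the lab code), creating and
locking a vertex buffer in Direct3D 9 style might look like this;
g_pDevice, vertices and numVerts are assumed names:

// Assumes d3d9.h, an initialised device pointer g_pDevice and
// an array vertices of numVerts CUSTOMVERTEX2 structures
IDirect3DVertexBuffer9* pVB = NULL;
g_pDevice->CreateVertexBuffer( numVerts * sizeof(CUSTOMVERTEX2),
                               0,                // usage flags
                               0,                // FVF code (0 with vertex declarations)
                               D3DPOOL_MANAGED,  // DirectX manages the memory
                               &pVB, NULL );

void* pData = NULL;
pVB->Lock( 0, 0, &pData, 0 );   // "locking" - ask DirectX for access
memcpy( pData, vertices, numVerts * sizeof(CUSTOMVERTEX2) );
pVB->Unlock();                  // must unlock before rendering

pVB->Release();                 // later, ask DirectX to destroy it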

7
Primitive Data / Index Buffers
  • Primitive data defines the triangles in the
    geometry
  • Or other primitives, such as lines or points
  • The vertex data itself can define the primitives
  • Each triplet of vertices defining a triangle
  • More often we use an index buffer
  • An array of integers that index the vertex buffer
  • Triplet of indices for each triangle
  • No duplication of vertices
  • Saw index buffers in week 9 - example below
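
For example (values illustrative), a quad in the CUSTOMVERTEX1
format needs only four shared vertices when an index buffer
supplies the two triangles:

// Four shared vertices - without indices the quad would need
// six, duplicating two of them
CUSTOMVERTEX1 vertices[4] =
{
    { 0.0f, 0.0f, 0.0f, 0xFFFF0000 },  // 0: bottom-left  (red)
    { 0.0f, 1.0f, 0.0f, 0xFF00FF00 },  // 1: top-left     (green)
    { 1.0f, 1.0f, 0.0f, 0xFF0000FF },  // 2: top-right    (blue)
    { 1.0f, 0.0f, 0.0f, 0xFFFFFFFF },  // 3: bottom-right (white)
};

// Triplet of indices per triangle - vertices 0 and 2 are reused
WORD indices[6] =
{
    0, 1, 2,   // first triangle
    0, 2, 3,   // second triangle
};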

8
Vertex Processing
  • Vertex and primitive data define the 3D geometry
  • Next step is to convert this data into 2D
    polygons
  • Key operation is a sequence of matrix transforms
  • Transforming vertex coordinates from 3D model
    space to 2D viewport space
  • This step can also include calculation of
    lighting (using the per-vertex normals)
  • And animation, which we haven't covered yet
  • Some vertex data (e.g. UVs) is not used at this
    stage and is simply passed through to later
    stages
  • We looked at vertex shaders for vertex processing
    - the core transform is sketched below
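
A sketch of the key operation using the D3DX utility helpers; the
matrix names are assumptions, and in practice this work runs in
the vertex shader:

// Combine the three matrices once per model, then transform
// each model-space vertex position through them
D3DXMATRIX matWVP = matWorld * matView * matProj;

D3DXVECTOR3 modelPos( vx, vy, vz );
D3DXVECTOR4 projected;
D3DXVec3Transform( &projected, &modelPos, &matWVP );  // treats pos as (x,y,z,1)
// Dividing projected.x/y/z by projected.w gives the 2D position
// that DirectX scales to viewport pixels; UVs etc. pass through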

9
From Model to World Space
  • Mesh vertices are stored in model space
  • A local space with a convenient origin and
    orientation
  • Each model has a world matrix
  • That transforms the model geometry from model
    space to world space
  • This matrix defines position and orientation for
    the model
  • Has a special form containing the local axes of
    the model
  • These axes are often extracted from the matrix,
    as sketched below
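
A hedged sketch using D3DX helpers; the position and rotation
values are assumed model settings:

D3DXMATRIX matRot, matTrans, matWorld;
D3DXMatrixRotationY( &matRot, rotY );                 // orientation
D3DXMatrixTranslation( &matTrans, posX, posY, posZ ); // position
matWorld = matRot * matTrans;                         // rotate, then translate

// The rows of the special form hold the local axes and position:
D3DXVECTOR3 localX( matWorld._11, matWorld._12, matWorld._13 );
D3DXVECTOR3 localZ( matWorld._31, matWorld._32, matWorld._33 );
D3DXVECTOR3 worldPos( matWorld._41, matWorld._42, matWorld._43 );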

10
From World to Camera Space
  • We view the models through a camera
  • Which is part of the scene
  • A camera has a view matrix
  • Transforms the world space geometry into camera
    space
  • Camera space defines the world as seen by the
    camera
  • Camera looks down its Z axis
  • The view matrix is the inverse of a normal world
    matrix
  • But has a similar form and can be used in a
    similar way - see the sketch below
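
That inverse relationship can be expressed directly with D3DX;
matCameraWorld is an assumed camera world matrix:

// If the camera is positioned with an ordinary world matrix,
// its view matrix is simply the inverse of that matrix
D3DXMATRIX matView;
D3DXMatrixInverse( &matView, NULL, &matCameraWorld );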

11
Camera Space to Viewport
  • Each camera has a second matrix, the projection
    matrix
  • Defining how the camera space geometry is
    projected into 2D
  • Defines the camera viewport, near clip distance
    and field of view
  • Final step is to scale the projected 2D geometry
    into viewport pixel coordinates
  • Performed internally by DirectX
  • Covered this material in weeks 6-9 - a projection
    sketch follows below
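
A sketch of building a perspective projection matrix with D3DX;
the field of view, aspect ratio and clip distances are assumed
camera settings:

D3DXMATRIX matProj;
D3DXMatrixPerspectiveFovLH( &matProj,
                            D3DX_PI / 4,   // vertical field of view (45 degrees)
                            aspectRatio,   // viewport width / height
                            1.0f,          // near clip distance
                            1000.0f );     // far clip distance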

12
Lighting
  • We looked at the mathematical lighting models
    used to illuminate 3D geometry (week 11)
  • We showed how lighting can be calculated while
    vertices are being transformed
  • In the same vertex shader
  • The calculations need a normal for each vertex
  • Discarded after vertices are converted to 2D
  • Several effects combined for final vertex colour
  • Ambient, diffuse, specular - the diffuse term is
    sketched below
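
A minimal per-vertex diffuse sketch (Lambert's law); all the
names are illustrative, and the real calculation runs in the
vertex shader:

// Brightness depends on the angle between the vertex normal
// and the direction to the light
D3DXVECTOR3 toLight = lightPos - vertexPos;
D3DXVec3Normalize( &toLight, &toLight );

float diffuse = D3DXVec3Dot( &normal, &toLight );  // cos of angle to light
if (diffuse < 0.0f) diffuse = 0.0f;                // facing away: unlit

// ambient + diffuse terms (specular omitted for brevity)
D3DXVECTOR3 colour = ambientColour + diffuse * lightColour;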

13
Geometry Processing
  • This stage processes 2D polys before rendering
  • Off-screen polygons are discarded (culled)
  • Partially off-screen polygons are clipped
  • Back-facing polygons are culled if required
    (determined by clockwise/anti-clockwise order of
    viewport vertices)
  • These steps occur with minimal programming
  • This stage also scans through the pixels of the
    polygons (called rasterising / rasterizing)
  • The input data is interpolated and passed to the
    later pixel-based stages
  • 2D coordinates, colours, UVs etc. - see the
    interpolation sketch below
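
A sketch of the interpolation done while rasterising: each
per-vertex value (a colour channel, U or V, etc.) is blended
linearly between vertices; t is the pixel's fractional position:

float Lerp( float a, float b, float t )
{
    return a + t * (b - a);
}

// e.g. the U texture coordinate partway along a triangle edge
float u = Lerp( v0.u, v1.u, t );
// (real hardware also corrects this interpolation for perspective)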

14
Textures and Pixel Processing
  • Next, each pixel is processed to produce a final
    colour
  • The texture and UV data has been passed through
    from the previous steps
  • The UVs map the textures onto the 2D polygons
  • Textures can be filtered (texture sampling) to
    improve their look
  • Textures covered in week 12
  • The texture colours are combined with the polygon
    colour to produce final pixel colours
  • The polygon colour came from the original vertex
    data or from the lighting calculations - the
    combine is sketched below
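
A sketch of the common "modulate" combine - multiply the sampled
texture colour by the lit polygon colour, channel by channel.
SampleTexture is a hypothetical helper and colours are assumed to
be in the 0-1 range:

D3DXVECTOR3 texColour = SampleTexture( u, v );
D3DXVECTOR3 finalColour( texColour.x * polyColour.x,    // red
                         texColour.y * polyColour.y,    // green
                         texColour.z * polyColour.z );  // blue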

15
Pixel Rendering
  • The final step in the rendering pipeline is the
    rendering of the pixels to the viewport
  • This involves blending the final polygon colour
    with the existing viewport pixel colours
  • Haven't covered this in 3D yet, but it is similar
    to sprite blending
  • Also the depth buffer values are tested / written
  • We saw the depth buffer in the week 8 lab
  • Will see them in more detail shortly - a
    conceptual sketch follows below
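
A conceptual sketch only - the hardware performs this step, and
the buffer names are illustrative. A pixel is written only if it
passes the depth test, and its colour is blended with the colour
already in the viewport using its alpha:

if (pixelDepth < depthBuffer[y][x])           // nearer than existing pixel?
{
    depthBuffer[y][x] = pixelDepth;           // write new depth
    viewport[y][x] = srcColour * srcAlpha +   // blend, as with sprites
                     viewport[y][x] * (1.0f - srcAlpha);
}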

16
Where Next?
  • Modern apps use the programmable pipeline
  • Vertex processing is performed by a Vertex Shader
  • Matrix transformations, lighting
  • Pixel processing by a Pixel Shader
  • Combining texture and polygon colours
  • There is a fixed pipeline that avoids shaders
  • Little flexibility, rarely used nowadays
  • Shaders allow a much wider range of pipeline
    behaviour for advanced techniques and effects
  • In the next few weeks we will look at some of
    these advanced shader techniques