Transcript and Presenter's Notes

Title: LOD Metrics


1
LOD Metrics
  • Jonathan Blow
  • Bolt Action Software
  • jon@bolt-action.com

2
Motivation
  • Spent a lot of time on terrain research
  • Goal: spend energy wisely
  • Goal: maximize simplicity and benefit
  • Communicate what I see in the academic papers,
    from being really far into this stuff
  • Disclose the things you won't read in anyone's
    LOD paper

3
Lecture Structure
  • Goal of LOD
  • Choosing a metric to meet that goal
  • See how implementation details get in the way
  • Look deeper at what our algorithms are doing
  • Examine the role of analysis in formulating these
    algorithms.

4
I. The Goal of LOD
5
LOD is about reducing resource usage
  • Usually triangles, but also textures, etc.
  • We substitute cheaper models where people
    wouldn't notice the difference
  • This is not a very formal definition

6
No Objective Quality Goal
  • Scientific method: test a hypothesis against the
    world to determine truth or falsehood
  • LOD researchers are pulling this stuff out of
    their ass (e.g. the metric)
  • The more complex the algorithm, the more
    ass-pulling is involved → the more likely it is
    wrong

7
Objective Goal Analogy - PSNR
  • Used widely in image processing
  • Definition: derived from the L2 norm of the
    N-vector difference between two images (a
    minimal computation is sketched below)
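A minimal PSNR computation in Python/NumPy, for reference; the
peak value of 255 assumes 8-bit images, and the function name is
illustrative rather than from the talk:

```python
import numpy as np

def psnr(a: np.ndarray, b: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio between two same-shaped images.

    Flattens each image into one long N-vector; the squared L2 norm
    of the difference, divided by N, is the mean squared error.
    """
    diff = a.astype(np.float64) - b.astype(np.float64)
    mse = np.mean(diff ** 2)
    if mse == 0.0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak * peak / mse)
```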

8
Drawbacks of PSNR
  • Doesn't match the human visual system
  • e.g. it treats each pixel as independent
  • People are working on replacements; nobody
    agrees on one

9
The Order in whichBugs Get Fixed
  • Things that don't compile
  • Things that crash
  • Obvious functional errors
  • Subtle functional errors that require careful
    analysis to diagnose and that still leave the
    software pretty much working.

10
II. Choosing a metric and using it
11
Metrics in the small vs. in the large
  • A common metric is projected pixel error
  • We don't have anything in the large (an
    analogue of PSNR); a sketch of the in-the-small
    metric follows this list
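A sketch of the in-the-small metric, assuming a symmetric
perspective projection; the function name and parameters are
illustrative, not from the talk:

```python
import math

def projected_pixel_error(delta: float, z: float,
                          screen_height_px: float,
                          fov_y_rad: float) -> float:
    """Screen-space size, in pixels, of a world-space geometric
    error `delta` seen at view depth `z`.

    pixels_per_unit is the focal length implied by the vertical
    field of view, expressed in pixels.
    """
    pixels_per_unit = screen_height_px / (2.0 * math.tan(fov_y_rad / 2.0))
    return (delta / z) * pixels_per_unit
```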

12
To be fast, algorithms degrade tessellation
efficiency
  • EQS or progressive meshes - mega conservative
  • regular samples for a BTT vs. a TIN
  • the structure of a BTT forces extra splits
    (crack fixing)
  • top-down view-dependent algorithms give up
    tessellation efficiency (often without
    realizing it!)

13
Measurement of projected pixel error is arbitrary
  • GH: along the normal
  • LK: along the z direction
  • Why not measure terrain along the normal? Or a
    general mesh along local verticality? What about
    texture popping? Etc. (both conventions are
    sketched below)
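A sketch of the two measurement conventions contrasted above; the
function names are illustrative, and index 2 is assumed to be the
heightfield's vertical axis:

```python
import numpy as np

def error_along_normal(p_orig: np.ndarray, p_simplified: np.ndarray,
                       normal: np.ndarray) -> float:
    """Displacement of a removed vertex measured along the local
    surface normal (the GH-style convention, as the talk frames it)."""
    n = normal / np.linalg.norm(normal)
    return abs(float(np.dot(p_orig - p_simplified, n)))

def error_along_z(p_orig: np.ndarray, p_simplified: np.ndarray) -> float:
    """Displacement measured along the vertical axis only (the
    LK-style convention for heightfield terrain)."""
    return abs(float(p_orig[2] - p_simplified[2]))
```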

14
Metric as simplifier
  • breaks down lots of complex spatial relations
    into, e.g., a scalar
  • algorithms try to use vertex correlation to
    speed things up
  • they usually use the scalar and forget that all
    this info is available
  • Instead, use isosurfaces of the metric - big
    speed improvement (see the sketch below)
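One way to read the isosurface suggestion, assuming the δ/z metric
developed on the later slides: for a fixed pixel tolerance τ, the
isosurface δ/z = τ is a sphere around the node, so the runtime
test collapses to a squared-distance comparison. The class and
field names here are illustrative:

```python
import numpy as np

class LodNode:
    """Per-node precomputed 'error sphere'.

    For the metric delta/z, the isosurface delta/z == tau is a
    sphere of radius delta/tau centered on the node; viewpoints
    inside it need the node split. Precomputing that radius turns
    metric evaluation into one squared-distance compare.
    """
    def __init__(self, center: np.ndarray, delta: float, tau: float):
        self.center = center
        self.split_radius_sq = (delta / tau) ** 2

    def needs_split(self, eye: np.ndarray) -> bool:
        d = eye - self.center
        return float(d @ d) < self.split_radius_sq
```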

15
III. Implementation details get in the way
16
Icky non-cooperation of different data types
(tri, tex, light)
  • Changes in vertex positions → geometry
  • Changes in normals → texture, irradiance
  • All of these → how the scene looks
17
How might this be simplified?
  • Example of an LOD'd voxel space
  • A galactic armada of signal processing

18
Bump mapping (and other anisotropy, e.g. BRDFs)
screws us
  • Our hardware and API implementations give us
    less flexibility with this kind of lighting than
    with old-school Lambertian stuff
  • tangent frames can only be pinned to vertices
  • This sucks ass when combined with LOD

19
IV. Looking deeper at our algorithms' processes.
20
Edge collapse is a linear interpolation between
samples.
  • When we look at this as a filter, what does it
    tell us?
  • Translate the bilinear filter into a sequence of
    fundamental signal-processing operations.
  • What this does to frequency content: fold, then
    mirror, then scale by the transfer function
    ½ + ½·cos(x) (checked numerically below)
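A quick NumPy check of that transfer-function claim, treating edge
collapse to the midpoint as the two-tap averaging filter [½, ½]
applied to the 1-D sample sequence along the edge (an assumption
about the slide's intent):

```python
import numpy as np

# DTFT of the two-tap averaging filter [0.5, 0.5] that midpoint
# edge collapse applies to the samples along an edge.
omega = np.linspace(0.0, np.pi, 512)      # normalized frequency
H = 0.5 * (1.0 + np.exp(-1j * omega))
power = np.abs(H) ** 2                    # power transfer function

# The power response is exactly 1/2 + 1/2*cos(omega): high
# frequencies are attenuated, DC passes through untouched.
assert np.allclose(power, 0.5 + 0.5 * np.cos(omega))
```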

21
V. Better planning of future algorithms
22
Metric defines system behavior
  • So we can tell a lot about what a system will do
    by thinking about the metric and the data it
    operates on.
  • This can help us understand where to best focus
    our effort.

23
Example: What it means to limit projected pixel
error
  • Metric: projected pixel error
  • Algorithm: keep projected error near some
    constant
  • Effect: screen-space triangles are roughly the
    same size (a refinement sketch follows this
    list)
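A minimal top-down refinement sketch of that algorithm; the node
fields (.center, .delta, .children) and the project() helper are
assumptions layered on the earlier projected_pixel_error sketch:

```python
import numpy as np

def refine(node, eye, tau_px, project, out):
    """Split while a node's projected error exceeds the pixel
    tolerance tau_px, so rendered error stays near that constant.

    Assumed node fields: .center (np.ndarray), .delta (world-space
    error bound), .children (list; empty at the finest level).
    project(delta, z) maps world error at depth z to pixels.
    """
    z = float(np.linalg.norm(eye - node.center))
    if node.children and project(node.delta, z) > tau_px:
        for child in node.children:
            refine(child, eye, tau_px, project, out)
    else:
        out.append(node)  # rendered at this resolution this frame
```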

24
In an LOD'd scene, polygons tend to be roughly
the same size in screen pixels.
25
A large percentage of polygons are small and
close (50%? 60%?)
26
Why polygons tend to be the same size in screen
pixels.
  • Projected size of any delta value is roughly
    constant (the stabilized point of the
    algorithm's action): δ/z ≈ k
  • Big delta values tend to be attached to big
    edges, small deltas to small edges:
    δ ≈ m·d, where d is edge length and m is some
    constant

27
Why polygons tend to be the same size in screen
pixels.
  • δ/z ≈ k  ⇒  δ₁/z₁ ≈ k ≈ δ₂/z₂
  • δ ≈ m·d  ⇒  m·d₁/z₁ ≈ k ≈ m·d₂/z₂
  • Screen area ≈ ½(d/z)² ≈ ½(k/m)²
  • This is a constant (checked numerically below).
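A numeric sanity check of the derivation, with illustrative values
of k and m:

```python
import numpy as np

# If the metric stabilizes at delta/z ~ k and error scales with
# edge length as delta ~ m*d, projected triangle area ~ (1/2)(d/z)^2
# comes out to the depth-independent constant (1/2)(k/m)^2.
k, m = 2.0, 0.1                     # illustrative constants
z = np.linspace(10.0, 1000.0, 5)    # a spread of view depths
delta = k * z                       # errors the algorithm settles on
d = delta / m                       # edge lengths carrying those errors
area = 0.5 * (d / z) ** 2
assert np.allclose(area, 0.5 * (k / m) ** 2)  # same at every depth
```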

28
Conclusions
  • Push the metric in the small into the realm of
    the frame-buffer sample? (Video cards already
    screw the pooch at this scale, so maybe we would
    just be hiding all our error in there.)
  • Thank you and good night.