Instance-Based Learning - PowerPoint PPT Presentation

Transcript and Presenter's Notes
1
Instance-Based Learning
  • By Dong Xu
  • State Key Lab of CADCG, ZJU

2
Overview
  • Learning Phase
  • Simply store the presented training examples.
  • Query Phase
  • Retrieve similar related instances.
  • Construct local approximation.
  • Report the function value at query point.

Key Idea: Inference from neighbors
3
Perspective
  • Nearby instances are related.
  • Nearby: short distance under some distance metric.
  • Related: the function value at one can be
    estimated from the other.
  • Distance Metric
  • Euclidean distance between points
  • Feature Vector
  • Function Approximation
  • Lazy: k-Nearest Neighbor (kNN), Locally Weighted
    Regression, Case-Based Reasoning
  • Eager: Radial Basis Functions (RBFs)
  • In essence, all these methods are local.

4
Three Methods
  • k-Nearest Neighbor
  • Discrete-valued functions (Voronoi diagram)
  • Continuous-valued functions (Distance-Weighted)
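Both variants can be sketched in a few lines of Python (an illustrative sketch, not from the slides; the inverse-squared-distance weighting is one common choice for the weighted vote/average):

```python
import math

def knn_predict(train, query, k=3, discrete=False):
    """Distance-weighted k-nearest-neighbor prediction.

    train: list of (feature_vector, value) pairs (the stored instances).
    query: feature vector to predict for.
    """
    # Sort stored instances by Euclidean distance to the query point.
    by_dist = sorted(train, key=lambda xv: math.dist(xv[0], query))
    neighbors = by_dist[:k]
    weighted = []
    for x, v in neighbors:
        d = math.dist(x, query)
        if d == 0:
            return v  # exact hit: return the stored value outright
        weighted.append((1.0 / d ** 2, v))  # inverse-squared-distance weight
    if discrete:
        # Discrete-valued target: weighted vote over class labels.
        votes = {}
        for w, v in weighted:
            votes[v] = votes.get(v, 0.0) + w
        return max(votes, key=votes.get)
    # Continuous-valued target: weighted average of neighbor values.
    total = sum(w for w, _ in weighted)
    return sum(w * v for w, v in weighted) / total
```

For discrete targets each training point effectively "owns" a cell of the Voronoi diagram when k = 1; the weighted vote smooths those cell boundaries.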

5
Three Methods
  • Locally Weighted Regression
  • Locally Weighted Linear Regression
  • Linear Approximation Function
  • Choose Weights by Energy Minimization
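The energy-minimization step reduces to weighted least squares at each query point. A 1-D sketch (the Gaussian kernel and closed-form normal-equations solve are illustrative assumptions, not prescribed by the slides):

```python
import math

def lwlr_predict(xs, ys, x_query, bandwidth=1.0):
    """Locally weighted linear regression at a single query point (1-D inputs).

    Each training point gets a Gaussian kernel weight based on its distance
    to x_query; we then solve the 2x2 weighted least-squares normal
    equations for the local line y = a + b*x.
    """
    w = [math.exp(-((x - x_query) ** 2) / (2 * bandwidth ** 2)) for x in xs]
    # Weighted sums appearing in the normal equations.
    S = sum(w)
    Sx = sum(wi * x for wi, x in zip(w, xs))
    Sxx = sum(wi * x * x for wi, x in zip(w, xs))
    Sy = sum(wi * y for wi, y in zip(w, ys))
    Sxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    det = S * Sxx - Sx * Sx
    b = (S * Sxy - Sx * Sy) / det
    a = (Sy - b * Sx) / S
    # Evaluate the local linear approximation at the query point.
    return a + b * x_query
```

Note the local fit is recomputed for every query, which is why these lazy methods defer nearly all work to the query phase.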

6
Three Methods
  • Radial Basis Functions
  • Target Function
  • Kernel Function
  • Two-Stage Learning Process
  • Learn the Kernel Function
  • Learn the Weights
  • The two stages are trained separately, which is
    more efficient.
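The two-stage process might be sketched as follows (illustrative Python: Gaussian kernels with fixed, pre-chosen centers and width stand in for stage one; the linear solve for the output weights is stage two):

```python
import math

def rbf_fit(xs, ys, centers, width=1.0):
    """Stage 2 of RBF learning: with kernel centers/width already fixed
    (stage 1), solve linearly for the weights in
    f(x) = sum_j w_j * exp(-(x - c_j)^2 / (2*width^2))."""
    def phi(x, c):
        return math.exp(-((x - c) ** 2) / (2 * width ** 2))

    # Design matrix: kernel j evaluated at training point i.
    Phi = [[phi(x, c) for c in centers] for x in xs]
    n = len(centers)
    # Normal equations Phi^T Phi w = Phi^T y, solved by Gaussian elimination.
    A = [[sum(Phi[i][r] * Phi[i][c] for i in range(len(xs))) for c in range(n)]
         for r in range(n)]
    b = [sum(Phi[i][r] * ys[i] for i in range(len(xs))) for r in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (b[r] - sum(A[r][c] * w[c] for c in range(r + 1, n))) / A[r][r]
    return w

def rbf_predict(x, centers, w, width=1.0):
    """Evaluate the global target function: a linear combination of kernels."""
    return sum(wj * math.exp(-((x - c) ** 2) / (2 * width ** 2))
               for wj, c in zip(w, centers))
```

Because the centers are fixed before the weights are solved, the weight fit is a plain linear problem; this separation is what makes the eager RBF approach efficient.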

7
Remarks
  • How to choose the feature vectors so as to avoid
    the curse of dimensionality?
  • Stretch the axes (weight each attribute
    differently, suppressing the impact of irrelevant
    attributes).
  • Q: How to stretch? A: Use a cross-validation
    approach.
  • Efficient Neighbor Searching
  • kd-tree (Bentley 1975, Friedman et al. 1977)
  • Q: How to decide k (the number of neighbors)?
    A: Can use range searching instead.
  • Kernel Function Selection
  • Constant, Linear, Quadratic, etc.
  • Weighting Function
  • Nearest, Constant, Linear, Inverse Square of
    Distance, Gaussian, etc.
  • Represent global target function as a linear
    combination of many local kernel functions (local
    approximations).
  • Query-sensitive: the query phase may be
    time-consuming.
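Choosing the axis stretch by cross-validation can be sketched as a grid search over candidate per-axis scale vectors, scored by leave-one-out 1-NN error (an illustrative toy, not a procedure from the slides):

```python
def loo_error(data, scales):
    """Leave-one-out 1-NN classification error under a given axis stretch.

    data: list of (feature_tuple, label) pairs.
    scales: per-axis multipliers on squared distance (0 drops an axis).
    """
    errs = 0
    for i, (x, y) in enumerate(data):
        best, best_d = None, float("inf")
        for j, (x2, y2) in enumerate(data):
            if i == j:
                continue  # leave the query point out of its own neighbors
            d = sum(s * (a - b) ** 2 for s, a, b in zip(scales, x, x2))
            if d < best_d:
                best_d, best = d, y2
        errs += (best != y)
    return errs / len(data)

def stretch_axes(data, candidates):
    """Pick the candidate stretch with the lowest leave-one-out error."""
    return min(candidates, key=lambda s: loo_error(data, s))
```

In the test below, axis 0 carries the class signal and axis 1 is noise, so cross-validation correctly prefers the stretch that suppresses axis 1.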

8
  • Take a break,
  • then let's turn to our example.

9
Image Analogies
  • Aaron Hertzmann et al., SIGGRAPH 2001.

Problem (IMAGE ANALOGIES): Given a pair of
images A and A′ (the unfiltered and filtered
source images, respectively), along with some
additional unfiltered target image B, synthesize
a new filtered target image B′ such that
A : A′ :: B : B′
10
Questions
  • How to achieve Image Analogies?
  • How to choose the feature vector?
  • How many neighbors need to be considered?
  • How to avoid curse of dimensionality?

11
Outline
  • Relationships need to be described
  • Unfiltered image and its respective filtered
    image
  • The source pair and the target pair.
  • Feature Vector (Similarity Metric)
  • Based on an approximation of a Markov random
    field model.
  • Sample joint statistics of small neighborhoods
    within the image.
  • Using raw pixel values and, optionally, responses
    of oriented derivative filters.
  • Algorithm
  • Multi-scale autoregression algorithm, based on
    previous texture synthesis methods [Wei and
    Levoy 2000] and [Ashikhmin 2001].
  • Applications
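The "joint statistics of small neighborhoods" above can be sketched as a feature vector built by concatenating raw pixel values around a query pixel (illustrative Python; the clamped square window and `radius` parameter are assumptions, and the real system also concatenates neighborhoods across pyramid levels):

```python
def neighborhood_feature(img, x, y, radius=2):
    """Feature vector for pixel (x, y): the raw values of its square
    neighborhood, concatenated in scan-line order. Border pixels are
    clamped so the vector always has (2*radius + 1)**2 entries."""
    h, w = len(img), len(img[0])
    feat = []
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            yy = min(max(y + dy, 0), h - 1)
            xx = min(max(x + dx, 0), w - 1)
            feat.append(img[yy][xx])
    return feat
```

Matching such vectors between source and target is the MRF-style approximation: two pixels are "similar" when their surrounding neighborhoods are.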

12
Feature Vector
  • Why RGB?
  • Intuitive, easy to implement.
  • Works for many examples.
  • Why luminance?
  • RGB can't work for images with dramatic color
    differences.
  • Clever hack: luminance remapping.
  • Why steerable pyramid?
  • Luminance alone still can't work for line art.
  • Need to strengthen orientation information.
  • Acceleration
  • Feature Vector: PCA (dimension reduction)
  • Search Strategies: ANN (Approximate Nearest
    Neighbor), TSVQ
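The luminance-remapping hack is simple to state: shift and scale the source luminances so their mean and standard deviation match the target's. A sketch (the list-of-floats interface is an assumption; real code applies this per pixel to the luminance channel):

```python
def luminance_remap(src_lum, tgt_lum):
    """Remap source luminances so their mean and standard deviation
    match the target's: Y <- (sd_t / sd_s) * (Y - mu_s) + mu_t.
    This lets images with very different brightness statistics match."""
    n, m = len(src_lum), len(tgt_lum)
    mu_s = sum(src_lum) / n
    mu_t = sum(tgt_lum) / m
    sd_s = (sum((v - mu_s) ** 2 for v in src_lum) / n) ** 0.5
    sd_t = (sum((v - mu_t) ** 2 for v in tgt_lum) / m) ** 0.5
    return [(v - mu_s) * (sd_t / sd_s) + mu_t for v in src_lum]
```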

13
Algorithm (1)
  • Initialization
  • Multi-scale (Gaussian pyramid) construction
  • Feature vector selection
  • Search-structure build-up (kd-tree for ANN)
  • Data Structure
  • A(p): array, source point p → feature vector
  • A′(p): array, source point p → feature vector
  • B(q): array, target point q → feature vector
  • B′(q): array, target point q → feature vector
  • s(q): array, target point q → source point
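The multi-scale (Gaussian pyramid) construction might be sketched in 1-D (illustrative; the binomial blur kernel and border clamping are common choices, not mandated by the slides, and image code does the same per row and column):

```python
def gaussian_pyramid(signal, levels):
    """Build a Gaussian pyramid of a 1-D signal: each level is the previous
    one blurred with a binomial approximation to a Gaussian, then
    downsampled by a factor of 2."""
    kernel = [1 / 16, 4 / 16, 6 / 16, 4 / 16, 1 / 16]
    pyramid = [list(signal)]
    for _ in range(levels - 1):
        prev = pyramid[-1]
        blurred = []
        for i in range(len(prev)):
            acc = 0.0
            for k, w in enumerate(kernel):
                # Clamp indices at the borders instead of wrapping.
                j = min(max(i + k - 2, 0), len(prev) - 1)
                acc += w * prev[j]
            blurred.append(acc)
        pyramid.append(blurred[::2])  # keep every other sample
    return pyramid
```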

14
Algorithm (2)
  • Synthesis
  • function CREATEIMAGEANALOGY(A, A′, B):
  •   Compute Gaussian pyramids for A, A′, and B
  •   Compute features for A, A′, and B
  •   Initialize the search structures (e.g., for ANN)
  •   for each level l, from coarsest to finest, do:
  •     for each pixel q ∈ B′_l, in scan-line order, do:
  •       p ← BESTMATCH(A, A′, B, B′, s, l, q)
  •       B′_l(q) ← A′_l(p)
  •       s_l(q) ← p
  •   return B′_L
  • function BESTMATCH(A, A′, B, B′, s, l, q):
  •   p_app ← BESTAPPROXIMATEMATCH(A, A′, B, B′, l, q)
  •   p_coh ← BESTCOHERENCEMATCH(A, A′, B, B′, s, l, q)
  •   d_app ← ‖F_l(p_app) − F_l(q)‖²
  •   d_coh ← ‖F_l(p_coh) − F_l(q)‖²
  •   if d_coh ≤ d_app · (1 + 2^(l−L) · κ) then
  •     return p_coh
  •   else
  •     return p_app

κ – coherence parameter (L is the finest pyramid level)
15
Algorithm (3)
  • Best Approximate Match
  • The nearest pixel within the whole source image.
  • Search strategies: ANN, TSVQ, with PCA (dimension
    reduction).
  • Best Coherence Match
  • Return the best pixel that is coherent with some
    already-synthesized portion of B′_l adjacent to q,
    which is the key insight of [Ashikhmin 2001].
  • The BESTCOHERENCEMATCH procedure simply returns
    s(r★) + (q − r★), where
  • r★ = arg min_{r ∈ N(q)} ‖F_l(s(r) + (q − r)) − F_l(q)‖²
  • and N(q) is the neighborhood of already-synthesized
    pixels adjacent to q in B′_l.
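The coherence-versus-approximate decision inside BESTMATCH can be sketched as follows (illustrative Python; the dict-based feature lookup and the default κ value are assumptions, and in the real algorithm the candidate pixels come from BESTAPPROXIMATEMATCH and BESTCOHERENCEMATCH):

```python
def best_match(F_q, p_app, p_coh, F, level, L, kappa=2.0):
    """Choose between the approximate-NN match and the coherence match:
    prefer the coherent pixel unless its feature distance exceeds the
    ANN match's by more than a factor (1 + 2**(level - L) * kappa).
    F maps a candidate pixel to its feature vector; F_q is the query's."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    d_app = dist2(F[p_app], F_q)
    d_coh = dist2(F[p_coh], F_q)
    if d_coh <= d_app * (1 + 2 ** (level - L) * kappa):
        return p_coh
    return p_app
```

The 2^(l−L) factor relaxes the coherence requirement at coarse levels, where features are less reliable, and tightens it at the finest level L.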

16
Algorithm (4)
Figure: Neighborhood Matching.
17
Algorithm (5)
Figure: Coherent Matching.
18
Applications (1)
  • Traditional image filters

19
Applications (2)
  • Improved texture synthesis

20
Applications (3)
  • Super-resolution

21
Applications (4)
  • Texture transfer

22
Applications (5)
  • Line arts

23
Applications (6)
  • Artistic filters

24
Applications (7)
  • Texture-by-numbers

25
Conclusion
  • Provides a very natural means of specifying image
    transformations.
  • A typical application of Instance-Based Learning:
    a kNN approach.
  • Does NOT construct a local approximation. Would
    this be possible?
  • More analogies?

26
Resource
  • Autoregression Analysis (AR)
  • http://astronomy.swin.edu.au/pbourke/analysis/ar/
  • Image Analogies Project Page
  • http://www.mrl.nyu.edu/projects/image-analogies/
  • Reconstruction and Representation of 3D Objects
    with Radial Basis Functions
  • Carr et al., SIGGRAPH 2001

27
Thank you