Transcript and Presenter's Notes

Title: Modelling Stereo Vision Systems: 3D Mapping
1
Modelling Stereo Vision Systems: 3D Mapping
2
Contents
  • 1. Introduction
  • 2. Camera Modelling and Calibration
  • 3. Stereo Vision and the Epipolar Geometry
  • 4. 3D Mapping Results
  • 5. Related Publications

3
1.1 Objectives
  • Building a 3D map from an unknown environment
    using a stereo camera system
  • Localization of the robot in the map
  • Providing a new useful sensor for the robot
    control architecture

GRILL Mobile robot with a stereo camera system
4
1.2 Hardware requirements
5
1.3 The whole application
Camera
Image Acquisition
A/D
Remove Distortion
Calibration
Remove Distortion
Calibration
Image Processing Low Level
Gradients
Filtering
Feature Extraction
Correspondence Problem
Correspondence Problem
Image Processing High Level
3D Information
3D Information
Motion Estimation
Motion Estimation
Tracking
Description Level
Localization
Localization
Map Building
Map Building
6
Contents
  • 1. Introduction
  • 2. Camera Modelling and Calibration
  • 2.1 Calibration Introduction
  • 2.2 Camera Model
  • 2.3 Calibration Methods
  • 2.4 Accuracy Evaluation
  • 2.5 Experimental Results
  • 2.6 Conclusions
  • 3. Stereo Vision and the Epipolar Geometry
  • 4. 3D Mapping Results
  • 5. Related Publications

7
2.1 Calibration Introduction
  • Some applications of this capability include

  • Dense reconstruction
  • Visual inspection
  • Object localization
  • Camera localization
8
2.1 Calibration Introduction
(Figure: image plane and focal point of a camera, with the
projection expressed in metric units and in pixels.)
9
2.1 Calibration Introduction
Modelling: find the function G(X)
Calibration: find the value of X
  • Modelling
  • Determine the equation that approximates the
    camera behaviour.
  • Define the set of unknowns in the equation
    (camera parameters).
  • The camera model is an approximation of the
    physical optics of the camera.
  • Calibration
  • Get the numeric value of every camera parameter.

10
2.2 Camera Model
Image coordinate system
Image plane
Camera coordinate system
World coordinate system
11
2.2 Camera Model (Step 1 World to Camera)
Image plane
Camera coordinate system
World coordinate system
12
2.2 Camera Model (Step 2 Projection)
Image plane
Camera coordinate system
World coordinate system
13
2.2 Camera Model (Step 3 Lens Distortion)
Image plane
Camera coordinate system
World coordinate system
14
2.2 Camera Model (Step 3 Lens Distortion)
Radial Distortion
Radial distortion effect: (a) negative, (b)
positive
15
2.2 Camera Model (Step 3 Lens Distortion)
Radial and Tangential Distortion
16
2.2 Camera Model (Step 4 Camera to Image)
Image coordinate system
Image plane
Camera coordinate system
World coordinate system
17
2.3 Calibration Methods (I)
  • Method of Hall
  • Linear method
  • Transformation matrix
  • Method of Faugeras-Toscani
  • Linear method
  • Obtaining camera parameters
  • Method of Faugeras-Toscani with distortion
  • Iterative method
  • Radial distortion
  • Method of Tsai
  • Iterative method
  • Radial distortion
  • Focal distance estimation
  • Method of Weng
  • Iterative method
  • Radial and tangential distortion

18
2.3.1. The Method of Hall
Assume light is captured on the image plane by a
linear projection.
The matrix is defined up to a scale factor, which
gives multiple solutions. Fixing one component to
unity yields a unique solution.
19
2.3.1. The Method of Hall
20
2.3.1. The Method of Hall
There are 11 unknowns, and every 2D point gives
two equations, so at least 6 points are needed.
More points lead to a more accurate solution, and
the pseudoinverse gives a unique least-squares
solution.
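The procedure above can be sketched in Python (a minimal sketch, not the original implementation; NumPy's least-squares solver plays the role of the pseudoinverse, and `calibrate_hall` is a name introduced here):

```python
import numpy as np

def calibrate_hall(world_pts, image_pts):
    """Estimate the 3x4 projection matrix with Hall's linear method.

    A34 is fixed to 1, leaving 11 unknowns; each 2D point gives two
    linear equations, so at least 6 points are needed.  With more
    points the system is solved in the least-squares sense.
    """
    rows, rhs = [], []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z])
        rhs.append(u)
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z])
        rhs.append(v)
    m, *_ = np.linalg.lstsq(np.asarray(rows, float),
                            np.asarray(rhs, float), rcond=None)
    return np.append(m, 1.0).reshape(3, 4)
```

With noiseless data the matrix is recovered exactly (up to the fixed A34 = 1).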
21
2.3.2. The Method of Faugeras-Toscani
  • Extrinsic parameters Model the position and
    orientation of the camera with respect to a world
    co-ordinate system.
  • Intrinsic parameters Model the behaviour of the
    internal geometry and the optical characteristics
    of the camera.

22
2.3.2. The Extrinsic Parameters
(Figure: camera co-ordinate system (Xc, Yc, Zc) at Oc, world
co-ordinate system (Xw, Yw, Zw) at Ow, the retinal plane, and a
3D point Pw.)
23
2.3.2. The Intrinsic Parameters Ideal Projection
(Figure: ideal perspective projection of the camera point CPw
onto CPu on the image plane, at focal distance f from the optical
centre Oc.)
24
2.3.2. The Intrinsic Parameters Pixel Conversion
25
2.3.2. The Intrinsic Parameters Principal Point
26
2.3.2. The Pinhole Model
Image coordinate system
Image plane
Step 4
Step 3
Step 2
Camera coordinate system
World coordinate system
Step 1
27
2.3.2. The Pinhole Model
(Xw, Yw, Zw)  3D object point with respect to the world co-ordinate system
    ↓ Affine transformation. Modelled parameters: R, T
(Xc, Yc, Zc)  3D object point with respect to the camera co-ordinate system
    ↓ Perspective transformation. Modelled parameter: f
(Xu, Yu)  Ideal projection on the retinal plane
    ↓ Pixel adjustment. Modelled parameters: ku, kv
(Xp, Yp)  Real projection on the image plane
    ↓ Adaptation to the computer image buffer. Modelled parameters: u0, v0
(Xi, Yi)  Real projection on the image plane
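The four steps can be composed into one projection function; a minimal NumPy sketch (`pinhole_project` is a name introduced here, with parameters named as on the slides):

```python
import numpy as np

def pinhole_project(Pw, R, T, f, ku, kv, u0, v0):
    """Project a 3D world point through the four-step pinhole model."""
    Pc = R @ Pw + T                  # step 1: world -> camera (R, T)
    Xu = f * Pc[0] / Pc[2]           # step 2: perspective projection (f)
    Yu = f * Pc[1] / Pc[2]
    Xp, Yp = ku * Xu, kv * Yu        # step 3: pixel adjustment (ku, kv)
    return Xp + u0, Yp + v0          # step 4: image buffer offset (u0, v0)
```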
28
2.3.2. The Pinhole Model
29
2.3.2. The Pinhole Model
Intrinsics
Extrinsics
30
2.3.2. Calibrating the Pinhole Model
31
2.3.2. Calibrating the Pinhole Model
32
2.3.2. Calibrating the Pinhole Model tz
33
2.3.2. Calibrating the Pinhole Model The
Intrinsics
34
2.3.2. Calibrating the Pinhole Model The
Extrinsics
35
2.3.2. Calibrating the Pinhole Model The
Extrinsics
36
2.3.2. The Method of Faugeras-Toscani with
distortion
Radial distortion effect
Tangential distortion effect
Radial distortion is the most important and
usually the only one considered in calibration.
37
2.3.2. The Method of Faugeras-Toscani with
distortion
k1 is the most important component and is usually
sufficient in most applications.
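As an illustration, a sketch of the one-parameter radial model (assuming the common convention Xd = Xu(1 + k1 r^2); since the inverse has no closed form, undistortion is done by fixed-point iteration):

```python
def distort_radial(Xu, Yu, k1):
    """Apply one-parameter radial distortion to an ideal projection."""
    r2 = Xu * Xu + Yu * Yu
    s = 1.0 + k1 * r2
    return Xu * s, Yu * s

def undistort_radial(Xd, Yd, k1, iters=20):
    """Invert the radial model by fixed-point iteration."""
    Xu, Yu = Xd, Yd
    for _ in range(iters):
        r2 = Xu * Xu + Yu * Yu       # radius of the current estimate
        Xu = Xd / (1.0 + k1 * r2)
        Yu = Yd / (1.0 + k1 * r2)
    return Xu, Yu
```

The iteration converges quickly for the small distortions typical of real lenses.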
38
2.3.2. The Method of Faugeras-Toscani with
distortion
Image coordinate system
Image plane
Step 5
Step 4
Step 3
Step 2
Camera coordinate system
World coordinate system
Step 1
39
2.3.2. The Method of Faugeras-Toscani with
distortion
(Xw, Yw, Zw)  3D object point with respect to the world co-ordinate system
    ↓ Affine transformation. Modelled parameters: R, T
(Xc, Yc, Zc)  3D object point with respect to the camera co-ordinate system
    ↓ Perspective transformation. Modelled parameter: f
(Xu, Yu)  Ideal projection on the retinal plane
    ↓ Radial lens distortion. Modelled parameter: k1
(Xd, Yd)  Real projection on the retinal plane
    ↓ Pixel adjustment. Modelled parameters: ku, kv
(Xp, Yp)  Real projection on the image plane
    ↓ Adaptation to the computer image buffer. Modelled parameters: u0, v0
(Xi, Yi)  Real projection on the image plane
40
2.3.2. The Method of Faugeras-Toscani with
distortion
  • Iterative minimisation
  • Newton-Raphson
  • Levenberg-Marquardt

The model is NON-LINEAR
41
2.3 Calibration Methods (II)
Transformation matrix
Step 1 World2camera Transformation with three
rotation angles and tx, ty and tz
Step 2 Projection Projection with f
Step 3 Lens Distortion Undistorted, radial
distortion with k1, or multiple distortion k1,
g1, g2, g3, g4
Step 4 Camera2image Transformation with u0, v0,
ku and kv, or with u0, v0 and sx
42
2.4 Accuracy Evaluation
  • 3D Measurement
  • Distance with respect to the optical ray
  • Normalized Stereo Calibration Error
  • 2D Measurement
  • Accuracy of distorted image coordinates
  • Accuracy of undistorted image coordinates

43
2.5 Experimental Results Synthetic Images (I)
44
2.5 Experimental Results Synthetic Images (II)
Normalized Stereo Calibration Error
45
Computing Time
2.5 Experimental Results Synthetic Images (III)
  • Method 160 points 1800 points
  • Hall 1 ms 70 ms
  • Faugeras 1 ms 70 ms
  • Faugeras with distortion 10 ms 380 ms
  • Tsai 10 ms 530 ms
  • Weng 51 ms 4216 ms
  • Times measured on a Pentium III at 1 GHz.

46
2.5 Experimental Results Real Images (I)
Image of the calibrating pattern
47
2.5 Experimental Results Real Images (II)
Stereo camera over a mobile robot
Image of the calibration pattern
48
2.6 Conclusions
  • Implementation of 5 of the most used camera
    calibration methods
  • Notation was unified
  • The methods were compared
  • Model
  • Accuracy
  • The accuracy of non-linear methods is better than
    that of linear methods
  • Modelling radial distortion alone is sufficient
    even when high accuracy is required
  • The different accuracy measures give similar
    results when the methods are compared relatively

49
Contents
  • 1. Introduction
  • 2. Camera Modelling and Calibration
  • 3. Stereo Vision and the Epipolar Geometry
  • 3.1 Shape from X
  • 3.2 Stereo Vision Introduction
  • 3.3 Triangulation Principle and Constraints
  • 3.4 Epipolar Geometry
  • 3.5 Computing Fundamental Matrix
  • 3.6 Accuracy Evaluation
  • 3.7 Experimental Results
  • 3.8 Conclusions
  • 4. 3D Mapping Results
  • 5. Related Publications

50
3.1 Shape from X
  • Techniques based on
  • Modifying the intrinsic camera parameters
  • i.e. Depth from Focus/Defocus and Depth from
    Zooming
  • Considering an additional source of light onto
    the scene
  • i.e. Shape from Structured Light and Shape from
    Photometric Stereo
  • Considering additional surface information
  • i.e. Shape from Shading, Shape from Texture and
    Shape from Geometric Constraints
  • Multiple views
  • i.e. Shape from Stereo and Shape from Motion

54
3.2 Stereo Vision Introduction
55
3.3 Triangulation Principle
56
3.3 Triangulation Principle
The slide's MATLAB code, transcribed and cleaned
up (i indexes one matched point pair):

T21 = [R1(:,1:3) T1]; invT21 = inv(T21);
P2Dw1 = invT21*[Xu1; Yu1; f1; 1]; Ocw1 = invT21(:,4);
T22 = [R2(:,1:3) T2]; invT22 = inv(T22);
P2Dw2 = invT22*[Xu2; Yu2; f2; 1]; Ocw2 = invT22(:,4);

pq = Ocw2(1:3) - Ocw1(1:3);
u = P2Dw1(1:3,i) - Ocw1(1:3);
v = P2Dw2(1:3,i) - Ocw2(1:3);
alpha = (pq'*v - (pq'*u)*norm(v)^2/(u'*v)) / ((u'*v) - norm(u)^2*norm(v)^2/(u'*v));
beta = (-pq'*u + alpha*norm(u)^2) / (u'*v);
r = Ocw1(1:3) + alpha.*u;
s = Ocw2(1:3) + beta.*v;
P3Dstereo = (r+s)./2;
disterror = norm(r-s);

(Figure: the 3D point WP3D is reconstructed from the image points
WP2D1, WP2D2 and the optical centres WOc1, WOc2, all expressed in
the world frame W.)
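The same midpoint construction can be written by solving for the two ray parameters as a small least-squares system; a Python sketch equivalent to the closed-form alpha/beta above (`triangulate_midpoint` is a name introduced here):

```python
import numpy as np

def triangulate_midpoint(O1, u, O2, v):
    """Midpoint triangulation of two optical rays.

    The rays are r(a) = O1 + a*u and s(b) = O2 + b*v; solving
    [u  -v] [a b]' = O2 - O1 in the least-squares sense gives the
    closest points of the two rays, and the 3D estimate is their
    midpoint.
    """
    (a, b), *_ = np.linalg.lstsq(np.column_stack((u, -v)), O2 - O1,
                                 rcond=None)
    r = O1 + a * u
    s = O2 + b * v
    return (r + s) / 2.0, np.linalg.norm(r - s)   # point and ray gap
```

The gap between the two closest points (disterror on the slide) is a useful sanity check on the calibration and matching.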
57
3.3 Constraints in Stereo Vision

(Figure: 3D reconstruction of a point M from its projections m in
the two image planes; each camera is characterised by its optics
and internal geometry and by its pose.)

  • Constraints
  • The Correspondence Problem
  • Active systems: neither the camera position nor
    its orientation is static
  • The Epipolar Geometry
58
3.4 Epipolar Geometry (I)
OW coincides with OC
Intrinsic
Extrinsic
59
3.4 Epipolar Geometry (II)
(Figure: epipolar geometry of Camera 1 and Camera 2, showing the
epipoles, the epipolar lines and the correspondence points, with
zooms of Areas 1 and 2.)
60
3.4 Epipolar Geometry (III)
(Figure: the epipole in each image.)
61
3.5 Computing F The Eight Point Method
The epipolar geometry is defined by the constraint
m2' F m1 = 0.
Expanding this product gives one linear equation
in the entries of F per correspondence.
62
3.5 Computing F The Eight Point Method
The trivial solution F = 0 is not wanted. F is
defined up to a scale factor, so we can fix one of
its components to 1. Let us fix F33 = 1. Then the
remaining eight unknowns are obtained by least
squares.
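A minimal sketch of this estimation (each correspondence contributes one linear equation in the eight remaining unknowns once F33 = 1; `eight_point` is a name introduced here):

```python
import numpy as np

def eight_point(pts1, pts2):
    """Estimate F from >= 8 correspondences with F33 fixed to 1.

    The epipolar constraint m2' F m1 = 0 becomes, for each pair,
    [u2*u1, u2*v1, u2, v2*u1, v2*v1, v2, u1, v1] . f = -1.
    """
    A, b = [], []
    for (u1, v1), (u2, v2) in zip(pts1, pts2):
        A.append([u2 * u1, u2 * v1, u2, v2 * u1, v2 * v1, v2, u1, v1])
        b.append(-1.0)
    f, *_ = np.linalg.lstsq(np.asarray(A), np.asarray(b), rcond=None)
    return np.append(f, 1.0).reshape(3, 3)
```

Fixing F33 = 1 assumes the true F33 is non-zero; the eigen-analysis formulation surveyed below avoids that restriction.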
63
3.5 Computing the Fundamental Matrix A Survey
LS Least-Squares Eig Eigen Analysis AML
Approximate Maximum Likelihood
64
3.6 Accuracy Evaluation
Image plane camera 1
Image plane camera 2
65
3.7 Experimental Results Synthetic Images (I)
Linear methods Good results if the points are
well located and there are no outliers

Methods 1.- 7-Point 2.- 8-Point with
Least-Squares 3.- 8-Point with Eigen Analysis
4.- Rank-2 Constraint
Mean and Std. in pixels
66
3.7 Experimental Results Synthetic Images (I)
Iterative methods Can cope with noise but
inefficient in the presence of outliers

Methods 5.- Iterative Linear 6.- Iterative
Newton-Raphson 7.- Minimization in parameter
space 8.- Gradient using LS 9.- Gradient
using Eigen 10.- FNS 11.- CFNS
Mean and Std. in pixels
67
3.7 Experimental Results Synthetic Images (I)
Robust methods Cope with both noise and outliers
Methods 12.- M-Estimator using LS 13.-
M-Estimator using Eigen 14.- M-Estimator
proposed by Torr 15.- LMedS using LS 16.-
LMedS using Eigen 17.- RANSAC 18.- MLESAC
19.- MAPSAC.
Mean and Std. in pixels
68
3.7 Experimental Results Synthetic Images (II)
Linear
Iterative
Robust
1.- 7-Point 2.- 8-Point with Least-Squares 3.-
8-Point with Eigen Analysis 4.- Rank-2
Constraint 5.- Iterative Linear 6.- Iterative
Newton-Raphson 7.- Minimization in parameter
space 8.- Gradient using LS 9.- Gradient using
Eigen 10.- FNS 11.- CFNS 12.- M-Estimator
using LS 13.- M-Estimator using Eigen 14.-
M-Estimator proposed by Torr 15.- LMedS using
LS 16.- LMedS using Eigen 17.- RANSAC 18.-
MLESAC 19.- MAPSAC.
69
3.7 Experimental Results Real Images (I)
70
3.7 Experimental Results Real Images (II)

Methods 1.- 7-Point 2.- 8-Point with
Least-Squares 3.- 8-Point with Eigen Analysis
4.- Rank-2 Constraint
Methods 5.- Iterative Linear 6.- Iterative
Newton-Raphson 7.- Minimization in parameter
space 8.- Gradient using LS 9.- Gradient
using Eigen 10.- FNS 11.- CFNS
Methods 12.- M-Estimator using LS 13.-
M-Estimator using Eigen 14.- M-Estimator
proposed by Torr 15.- LMedS using LS 16.-
LMedS using Eigen 17.- RANSAC 18.- MLESAC
19.- MAPSAC.
Mean and Std. in pixels
71
3.8 Conclusions
  • Survey of 15 methods of computing F and up to 19
    different implementations
  • Description of the estimators from an algorithmic
    point of view
  • Conditions Gaussian noise, outliers and real
    images
  • Linear methods Good results if the points are
    well located and the correspondence problem
    previously solved (without outliers)
  • Iterative methods Can cope with noise but
    inefficient in the presence of outliers
  • Robust methods Cope with both noise and outliers
  • Least-squares is worse than eigen analysis and
    approximate maximum likelihood
  • Rank-2 matrices are preferred if a good geometry
    is required
  • Better results are obtained when the data are
    previously normalized

72
Contents
  • 1. Introduction
  • 2. Camera Modelling and Calibration
  • 3. Stereo Vision and the Epipolar Geometry
  • 4. 3D Mapping Results
  • 4.1 Data Flow Diagram
  • 4.2 Example
  • 4.3 Experimental Results
  • 5. Related Publications

73
4.1 Data Flow Diagram (I)
Sequence A
Sequence B
Camera A
Camera B
2D Image Processing
2D Points
3D Image Processing
3D Points
Map Building and Localization
3D Points
Position
Image Flow 2D Points Flow 3D Points Flow Position
Flow
3D Map
Trajectory
74
4.1 Data Flow Diagram (II)
2D Image Processing
3D Image Processing
Map Building and Localization
75
4.2 Example Input Sequence
  • Cameras are calibrated
  • Both stereo images are obtained simultaneously

76
4.2 Example RGB to I
  • Description
  • Converting a color image to an intensity image
  • Input
  • Color image (RGB)
  • Output
  • Intensity image
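A minimal conversion sketch; the slides do not state which weighting is used, so the standard ITU-R BT.601 luma weights are assumed here:

```python
import numpy as np

def rgb_to_intensity(rgb):
    """Weighted sum of the R, G, B channels (BT.601 luma weights)."""
    return rgb[..., 0] * 0.299 + rgb[..., 1] * 0.587 + rgb[..., 2] * 0.114
```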

77
4.2 Example Remove Distortion
  • Description
  • Removing the distortion from an image using the
    camera calibration parameters
  • Input
  • Distorted image
  • Output
  • Undistorted image

78
4.2 Example Corners
  • Description
  • Detection of corners using a variant of the
    Harris corner detector
  • Input
  • Undistorted image
  • Output
  • Corners list

Corners Detected
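The response map behind such a detector can be sketched in plain NumPy (a generic Harris response with a 3x3 box window; the presentation's exact variant is not specified):

```python
import numpy as np

def harris_response(img, k=0.04):
    """Harris corner response: det(M) - k*trace(M)^2 per pixel."""
    Iy, Ix = np.gradient(img.astype(float))    # image gradients
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    def box(a):                                # 3x3 window sum
        out = np.zeros_like(a)
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                out += np.roll(np.roll(a, dy, 0), dx, 1)
        return out

    Sxx, Syy, Sxy = box(Ixx), box(Iyy), box(Ixy)
    return Sxx * Syy - Sxy * Sxy - k * (Sxx + Syy) ** 2
```

Corners give a large positive response, edges a negative one; thresholding and non-maximum suppression then yield the corners list.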
79
4.2 Example Spatial Cross Correlation
  • Description
  • Spatial cross correlation using fundamental
    matrix obtained from camera calibration
    parameters
  • Input
  • Undistorted image A
  • Corners list A
  • Undistorted image B
  • Corners list B
  • Output
  • Spatial points list
  • Spatial matches list

Points and matches list
80
4.2 Example Temporal Cross Correlation
Points and matches list
  • Description
  • Temporal cross correlation using small windows
    search
  • Input
  • Previous undistorted image
  • Previous corners list
  • Current undistorted image
  • Current corners list
  • Output
  • Temporal points list
  • Temporal matches list

81
4.2 Example Stereo Reconstruction
  • Description
  • Stereo reconstruction by triangulation using
    camera calibration parameters
  • Input
  • Spatial points list
  • Spatial matches list
  • Output
  • 3D points list

82
4.2 Example 3D Tracker
  • Description
  • Tracking 3D points using temporal cross
    correlation
  • Input
  • 3D points list
  • Temporal points list A
  • Temporal matches list A
  • Temporal point list B
  • Temporal matches list B
  • Output
  • 3D points history
  • Points history A
  • Matches history A
  • Points history B
  • Matches history B

83
4.2 Example Outliers Detection
  • Description
  • Detection of outliers comparing distance between
    current and previous 3D points list
  • Input
  • Odometry position
  • Current 3D points list
  • Previous 3D points list
  • Output
  • Outliers list

Current and Previous 3D points without outliers
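One way to realise this comparison, assuming the odometry supplies a predicted displacement between frames (an assumption; the slides do not specify how the odometry position enters):

```python
import numpy as np

def detect_outliers(prev_pts, curr_pts, odometry_motion, threshold):
    """Flag tracked 3D points whose position, after applying the
    motion predicted by odometry, is too far from the current one."""
    predicted = prev_pts + odometry_motion      # translation-only prediction
    dist = np.linalg.norm(curr_pts - predicted, axis=1)
    return dist > threshold                     # boolean outlier mask
```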
84
4.2 Example Local Localization
  • Description
  • Computing the absolute position from the map by
    minimizing the distance between the projections
    of the 3D map points into the cameras and the
    current 2D points and matches.
  • Input
  • Odometry position as initial guess
  • Previous 3D points list from the map
  • Current 2D points list
  • Current 2D matches list
  • Output
  • Absolute position

85
4.2 Example Global Localization
  • Description
  • Computing the trajectory performed by the robot
  • Input
  • Local position
  • Output
  • Global position

86
4.2 Example Build 3D Map
  • Description
  • Building the 3D map from the 3D points with a
    history longer than n frames
  • Input
  • Global position
  • Current 3D points list
  • Previous 3D points list
  • Output
  • 3D map
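The history-length filter can be sketched as follows (hypothetical data layout: a dict mapping each tracked point id to its list of 3D observations):

```python
def build_map(histories, n):
    """Keep only points observed in more than n frames, using the
    most recent 3D estimate of each."""
    return {pid: obs[-1] for pid, obs in histories.items() if len(obs) > n}
```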

87
4.3 Experimental Results
88
5. Related Publications
  • Journals
  • J. Salvi, X. Armangué and J. Batlle. A
    Comparative Review of Camera Calibrating Methods
    with Accuracy Evaluation. Pattern Recognition,
    PR, pp. 1617-1635, Vol. 35, Issue 7, July 2002.
  • X. Armangué and J. Salvi. Overall View Regarding
    Fundamental Matrix Estimation. Image and Vision
    Computing, IVC, pp. 205-220, Vol. 21, Issue 2,
    February 2003.
  • X. Armangué, H. Araújo and J. Salvi. A Review on
    Egomotion by Means of Differential Epipolar
    Geometry Applied to the Movement of a Mobile
    Robot. Accepted to be published in Pattern
    Recognition.
  • Conferences
  • X. Armangué, H. Araújo and J. Salvi. Differential
    epipolar constraint in mobile robot egomotion
    estimation. International Conference on Pattern
    Recognition, ICPR 2002, Québec, Canada, August
    2002.
  • J. Salvi, X. Armangué, J. Pagès. A survey
    addressing the fundamental matrix estimation
    problem. IEEE International Conference on Image
    Processing, ICIP 2001, Thessaloniki, Greece,
    October 2001.
  • More information: http://eia.udg.es/qsalvi/