Machine Vision lecture 4 (transcript and presenter's notes)
1
Machine Vision, lecture 4
  • Topics
  • Feature extraction
  • Classification
  • Pattern recognition

2
Fundamental Steps in Computer Vision
  • Today
  • Representation
  • Recognition

3
Machine Vision
  • Improve image quality
  • Binarize image
  • Remove noise
  • Extract BLOBs
  • Calculate features

Today
[Figure: today's steps feed a feature extraction stage that outputs one feature vector per BLOB, e.g. (2, 1, ..., 3) and (4, 7, ..., 0)]
4
Machine vision applications
  • Pose estimation
  • Pick and place applications
  • Bin-picking
  • Classification
  • Quality control

[Figure: HOF and THOR objects on a conveyor belt]
5
Agenda
  • BLOBs
  • Finding the pixels for each object
  • Extracting features
  • Calculate features
  • For example size and shape

[Figure: each BLOB is reduced to a feature vector, e.g. (2, 1, ..., 3) and (4, 7, ..., 0)]
6
BLOB: what is it?
  • Blob versus BLOB
  • BLOB = Binary Large Object
  • Also known as a Particle

7
Isolating a BLOB
  • What we want
  • For each object in the image, a list with its
    pixels
  • How do we get that?
  • Connected component analysis
  • (Region growing)
  • Define connectivity
  • Who are my neighbors?
  • 4-connected
  • 8-connected (neighbour offsets for both are sketched below)
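As a concrete illustration of the two connectivity definitions, here is a minimal Python sketch, assuming pixels are addressed as (row, column); the names NEIGHBORS_4, NEIGHBORS_8 and neighbors are illustrative, not from the lecture.

    # Neighbour offsets for the two standard connectivity definitions.
    # 4-connected: only pixels sharing an edge (up, down, left, right).
    # 8-connected: edge neighbours plus the four diagonal neighbours.
    NEIGHBORS_4 = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    NEIGHBORS_8 = NEIGHBORS_4 + [(-1, -1), (-1, 1), (1, -1), (1, 1)]

    def neighbors(row, col, connectivity=4):
        """Yield the neighbour coordinates of (row, col) for 4- or 8-connectivity."""
        offsets = NEIGHBORS_4 if connectivity == 4 else NEIGHBORS_8
        for dr, dc in offsets:
            yield row + dr, col + dc
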

8
Connected component analysis
  • Binary image (0,1)
  • Seed point: where do we start?
  • Grassfire concept
  • Delete (burn) the pixels we visit
  • Visit all CONNECTED (4 or 8) neighbors

9
Connected component analysis
[Figure: grassfire example on a pixel grid, burning outwards from the seed point]
10
Connected component analysis
[Figure: grassfire example, continued]
11
Connected component analysis
[Figure: grassfire example, continued]
  • Recursive algorithm
  • (show blob, TH, analyze (set outlines))
  • Region growing (an iterative sketch follows below)
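The grassfire / region-growing idea can also be written iteratively with an explicit stack, which avoids the deep recursion of a naive recursive implementation. A minimal sketch, assuming the binary image is a NumPy array with 1 for object pixels and 0 for background; the function name grassfire is illustrative.

    import numpy as np

    def grassfire(binary, connectivity=4):
        """Return a list of BLOBs, each BLOB being a list of (row, col) positions."""
        img = binary.copy()                      # work on a copy so pixels can be "burned"
        offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
        if connectivity == 8:
            offsets += [(-1, -1), (-1, 1), (1, -1), (1, 1)]
        rows, cols = img.shape
        blobs = []
        for r in range(rows):
            for c in range(cols):
                if img[r, c] == 0:
                    continue                     # background or already burned
                stack = [(r, c)]                 # seed point
                img[r, c] = 0                    # burn the seed
                pixels = []
                while stack:
                    pr, pc = stack.pop()
                    pixels.append((pr, pc))
                    for dr, dc in offsets:
                        nr, nc = pr + dr, pc + dc
                        if 0 <= nr < rows and 0 <= nc < cols and img[nr, nc]:
                            img[nr, nc] = 0      # burn before pushing to avoid revisits
                            stack.append((nr, nc))
                blobs.append(pixels)
        return blobs

Each returned pixel list is exactly the per-object list asked for on slide 7.
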

12
Extracting features
13
Why feature extraction?
  • Generate some entities (numbers) that can be used
    for classification
  • For example size and shape
  • From image operations to mathematical operations
  • Input: a list of pixel positions
  • Output: a feature vector
  • First step: remove too small, too big, and border
    BLOBs (see the filtering sketch below)
  • (show AuPbSn40, TH, analyze (set outlines,
    size, border))

[Figure: the remaining BLOBs are mapped to feature vectors, e.g. (2, 1, ..., 3) and (4, 7, ..., 0)]
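A sketch of the first step named above (removing too small, too big, and border BLOBs), assuming each BLOB is a list of (row, col) positions as produced by the connected component analysis; the area limits are placeholder values.

    def filter_blobs(blobs, image_shape, min_area=50, max_area=10000):
        """Keep BLOBs whose pixel count lies in [min_area, max_area]
        and that do not touch the image border."""
        rows, cols = image_shape
        kept = []
        for pixels in blobs:
            if not (min_area <= len(pixels) <= max_area):
                continue                                    # too small or too big
            touches_border = any(r in (0, rows - 1) or c in (0, cols - 1)
                                 for r, c in pixels)
            if not touches_border:
                kept.append(pixels)
        return kept
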
14
Features
15
Features
One BLOB
  • Area (number of pixels)
  • Used to remove noise (small/big objects)
  • Number of holes in the object
  • Area of the holes
  • Total area = area + area of the holes
  • Perimeter = length of the contour
  • Bounding box
  • Upper left corner
  • Height and width of the bounding box (a sketch of
    these features follows below)
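A sketch of some of these features computed directly from a BLOB's pixel list; the perimeter is approximated here by counting contour pixels (object pixels with at least one 4-connected background neighbour), which is one common simplification rather than the lecture's exact definition.

    def basic_features(pixels):
        """Area, bounding box and an approximate perimeter for one BLOB."""
        pixel_set = set(pixels)
        area = len(pixels)                                  # number of pixels
        row_coords = [r for r, _ in pixels]
        col_coords = [c for _, c in pixels]
        top, left = min(row_coords), min(col_coords)        # upper left corner
        height = max(row_coords) - top + 1
        width = max(col_coords) - left + 1
        # Contour pixels: object pixels with a 4-connected background neighbour.
        perimeter = sum(
            1 for r, c in pixels
            if any((r + dr, c + dc) not in pixel_set
                   for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)])
        )
        return {"area": area, "bbox": (top, left, height, width), "perimeter": perimeter}
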

16
Features
  • Center of mass (xm,ym)
  • Compactness
  • Minimum for a disc
  • Minimum for a rectangle
  • Circularity
  • Longest distance (Feret's diameter)
  • Orientation of the object
  • Orientation of Feret's diameter (a sketch of some of
    these features follows below)
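A sketch of the center of mass, two shape measures and Feret's diameter. The exact formulas are assumptions based on common textbook conventions (circularity as 4*pi*area / perimeter^2, equal to 1 for an ideal disc; compactness as the fraction of the bounding box that the BLOB fills); the slides only name the features.

    import math

    def shape_features(pixels, perimeter, bbox):
        """Center of mass, circularity and compactness for one BLOB.
        'bbox' is (top, left, height, width); 'perimeter' is the contour length."""
        area = len(pixels)
        xm = sum(c for _, c in pixels) / area               # mean column
        ym = sum(r for r, _ in pixels) / area               # mean row
        top, left, height, width = bbox
        circularity = 4.0 * math.pi * area / (perimeter ** 2)
        compactness = area / (height * width)               # fraction of bounding box filled
        return {"center_of_mass": (xm, ym),
                "circularity": circularity,
                "compactness": compactness}

    def feret_diameter(contour_pixels):
        """Longest distance between any two contour pixels (brute force, O(n^2))."""
        pts = list(contour_pixels)
        return max((math.dist(p, q) for i, p in enumerate(pts) for q in pts[i + 1:]),
                   default=0.0)
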

17
Features
  • Shapes
  • Bounding box ratio = Height / Width
  • Says something about the elongation
  • How well does the object fit a rectangle
  • Can be derived from Area and Perimeter
  • How well does the object fit an ellipse
  • Can be derived from Area and Perimeter
  • Moments (see the elongation sketch below)
  • Many other features exist
  • see the notes for today and be creative!

Degree of elongation
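The degree of elongation can be read out of the central second-order moments, which is one way to make the 'Moments' bullet concrete; the slide only names moments, so this particular route is an assumption.

    import math

    def elongation_from_moments(pixels):
        """Elongation and orientation of a BLOB from its central second-order moments."""
        n = len(pixels)
        ym = sum(r for r, _ in pixels) / n
        xm = sum(c for _, c in pixels) / n
        mu20 = sum((c - xm) ** 2 for _, c in pixels) / n
        mu02 = sum((r - ym) ** 2 for r, _ in pixels) / n
        mu11 = sum((c - xm) * (r - ym) for r, c in pixels) / n
        # Eigenvalues of the 2x2 covariance matrix give the squared axis lengths.
        root = math.sqrt(4 * mu11 ** 2 + (mu20 - mu02) ** 2)
        lam1 = (mu20 + mu02 + root) / 2                         # major axis
        lam2 = (mu20 + mu02 - root) / 2                         # minor axis
        orientation = 0.5 * math.atan2(2 * mu11, mu20 - mu02)   # radians
        elongation = math.sqrt(lam1 / lam2) if lam2 > 0 else float("inf")
        return elongation, orientation
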
18
Classification
19
Classification
[Figure: learned THOR and HOF models]
  • Extract some features from each incoming object
  • Compare it with learned models
  • How?
  • Measure the DISTANCE from each BLOB to the model
    and pick the model with the shortest distance,
    i.e. the model most similar to the BLOB
  • How do we define a distance?

20
Classification
  • Distance in feature-space
  • Feature 1: Area
  • Feature 2: Circularity
  • 2-dimensional feature space
  • BLOB
  • Models

[Figure: a BLOB and the learned models plotted in the 2D feature space (Feature 1 vs. Feature 2); the nearest model is the match]
21
Classification: Distance
  • Euclidean distance (see the sketch below)

[Figure: the Euclidean distance between a BLOB and a model in the 2D feature space (Feature 1 vs. Feature 2)]
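A minimal sketch of the matching rule on the two slides above: compute the Euclidean distance between the BLOB's feature vector and each learned model and pick the closest. The model names and numbers in the example are invented.

    import math

    def classify(feature_vector, models):
        """Return (name, distance) of the model closest to the feature vector."""
        best_name, best_dist = None, float("inf")
        for name, model_vector in models.items():
            d = math.dist(feature_vector, model_vector)     # Euclidean distance
            if d < best_dist:
                best_name, best_dist = name, d
        return best_name, best_dist

    # Example with a 2D feature space (area, circularity); the values are invented.
    models = {"THOR": [5200, 0.45], "HOF": [3100, 0.80]}
    print(classify([5050, 0.50], models))                   # -> ('THOR', ...)

Note how the raw area values dominate the distance here, which is exactly the problem addressed on the next slide.
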
22
Classification
  • Problem
  • Area is measured in 1000s and circularity in [0, 1]
  • That means that the area will dominate the
    distance measure completely!
  • Solutions
  • Use ratios, so W/H is a better feature than W or
    H
  • Normalize all features to the same interval, e.g.,
    [0, 1] (see the sketch below)
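A sketch of the second solution, min-max normalizing every feature into [0, 1] using limits taken from training data; how the limits are stored (a list of (min, max) pairs here) is an assumption.

    def normalize(feature_vector, feature_ranges):
        """Scale each feature into [0, 1] using (min, max) pairs from training data."""
        normed = []
        for value, (lo, hi) in zip(feature_vector, feature_ranges):
            normed.append((value - lo) / (hi - lo) if hi > lo else 0.0)
        return normed

    # Example: area observed in [0, 10000], circularity already roughly in [0, 1].
    ranges = [(0.0, 10000.0), (0.0, 1.0)]
    print(normalize([5050, 0.50], ranges))                  # -> [0.505, 0.5]
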

23
Classification
  • What if we trust some features more than others?
  • For example, with small objects the perimeter
    might be uncertain
  • Solution: weighting (a weighted-distance sketch follows below)
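A sketch of the weighting idea: a weighted Euclidean distance where a less trusted feature (for example a perimeter-based one on small objects) gets a smaller weight. How the weights are chosen is left open, as on the slide.

    import math

    def weighted_distance(features, model, weights):
        """Euclidean distance with a per-feature weight on each squared difference."""
        return math.sqrt(sum(w * (f - m) ** 2
                             for f, m, w in zip(features, model, weights)))

    # Trust feature 1 fully, feature 2 (say, perimeter-based) only half as much.
    print(weighted_distance([0.5, 0.9], [0.4, 0.2], weights=[1.0, 0.5]))
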

24
What to remember
  • BLOB = Binary Large OBject
  • Find a list of pixels for each object
  • Connectivity: 4- versus 8-connected
  • Connected component analysis
  • Features (characteristics) of an object
  • Many exist
  • Many more can be derived
  • Feature matching
  • Make a model of each object to be classified
  • Euclidean distance
  • Normalize
  • Weight

25
Advanced classification
26
Advanced classification
  • Distance in feature-space
  • Feature 1: Area
  • Feature 2: Circularity
  • 2-dimensional feature space
  • BLOB
  • Model types

[Figure: a BLOB and the model types plotted in the 2D feature space (Feature 1 vs. Feature 2); the nearest model type is the match]
(blackboard notes 1-3)
27
What to remember
  • Advanced classification
  • Based on statistics: the Bayes classifier
  • Include the variance (covariance), since data
    will never be static
  • Use enough samples when learning the models!
    (see the mean/covariance sketch below)
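A sketch of what "use enough samples" amounts to in practice: estimating a mean vector and a covariance matrix per class from its training feature vectors. NumPy's estimators are used here as one obvious choice; the sample values are invented.

    import numpy as np

    def learn_model(training_vectors):
        """Estimate the mean and covariance of one class from its feature vectors."""
        data = np.asarray(training_vectors, dtype=float)    # shape: (samples, features)
        mean = data.mean(axis=0)
        cov = np.cov(data, rowvar=False)                    # features vary along columns
        return mean, cov

    # Example: normalized (area, circularity) samples for one class.
    samples = [[0.51, 0.44], [0.49, 0.47], [0.55, 0.42], [0.50, 0.46]]
    mean, cov = learn_model(samples)
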

28
Machine vision applications
29
Machine vision applications
  • Pose estimation
  • Pick and place applications
  • Bin-picking
  • Classification
  • Quality control

[Figure: HOF and THOR objects on a conveyor belt]
30
Quality control
  • Measure one or more features
  • How do we set the limits for acceptance?
  • Absolute limits for parts that are to be assembled
  • Otherwise, train the limits!
  • Find the distributions for both the OK and the
    ERROR parts (the ERROR distribution is not necessarily well-behaved)
  • Use enough samples!

Points: 1) two types of errors. 2)
(blackboard notes 4-5)
31
What to remember
  • Quality control
  • Learn the thresholds between OK and ERROR!
  • Set the threshold according to the false negatives
    and false positives (see the sketch below)
  • Learn the variance of the features
  • Learn the covariances
  • The red region shows the objects that will be
    wrongly accepted if the covariance is ignored
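A sketch of one way to set the threshold from labeled samples: count false negatives (OK parts rejected) and false positives (ERROR parts accepted) and minimize their sum. Weighting the two error types differently, or using the learned covariance, would follow the same pattern; the assumption that OK parts have lower feature values is only for the example.

    def choose_threshold(ok_values, error_values):
        """Pick the acceptance threshold minimizing false negatives + false positives,
        assuming OK parts tend to have lower feature values than ERROR parts."""
        best_t, best_cost = None, float("inf")
        for t in sorted(set(ok_values) | set(error_values)):
            false_neg = sum(1 for v in ok_values if v > t)       # OK parts rejected
            false_pos = sum(1 for v in error_values if v <= t)   # ERROR parts accepted
            if false_neg + false_pos < best_cost:
                best_t, best_cost = t, false_neg + false_pos
        return best_t
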

32
Extra
33
Classification
  • What if we trust some features more than others?
  • For example, with small objects the perimeter
    might be uncertain
  • Solution: weighting
  • Advanced classification
  • Use more samples to represent a model, giving a mean
    and a covariance
  • Pattern recognition
  • Gaussian distribution
  • Mahalanobis distance (see the sketch below)
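A sketch of the Mahalanobis distance named above: the difference to the class mean is weighted by the inverse covariance, so directions with large variance (or strong correlation) count less than in the plain Euclidean distance. The (mean, covariance) model is assumed to come from training samples, as in the learning sketch above.

    import numpy as np

    def mahalanobis(feature_vector, mean, cov):
        """Mahalanobis distance from a feature vector to a (mean, covariance) model."""
        diff = np.asarray(feature_vector, dtype=float) - mean
        inv_cov = np.linalg.inv(cov)                # assumes a non-singular covariance
        return float(np.sqrt(diff @ inv_cov @ diff))

    # Classification: compute this distance to every class model and pick the smallest.
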