Objective POD Estimation
Transcript and Presenter's Notes
1
Objective POD Estimation
  • The Development of a Standard Method for Gathering and Using Detection Data
  • R. Quincy Robe and Jack Frost

2
Presentation Outline
  • Define and describe a detectability index
  • Show how it is used with other data to estimate POD
  • Describe a procedure for conducting detection experiments to determine a detectability index

3
The Detection Process
  • A series of glimpses as the searcher moves
    through the environment containing the object.
  • Detection with any one glimpse depends on the:
  • Search Object (size, color, contrast, etc.)
  • Environment (weather, terrain, vegetation, etc.)
  • Search Resource (sensor and platform)
  • Distance from the Resource to the Object

4
What is Probability of Detection (POD)?
  • Applies to some amount of area (e.g., a segment)
  • Probability of detecting an object if present
  • POD is a function of:
  • Effort (Resources, Search Speed, Time)
  • Size of the Area covered
  • Search object detectability

5
What is Effort?
  • Total Distance traveled by searchers while
    searching in the segment
  • Effort = searcher speed × time × number of searchers (see the sketch below)
  • What is Area covered?
  • Size of the area over which the searching effort
    is approximately uniformly spread
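
A minimal numeric sketch of the effort calculation. All input values are illustrative assumptions, not numbers from the presentation:

```python
# Effort is the total track length searchers put into the segment.
# All input values below are illustrative assumptions.

searcher_speed_kmh = 2.0    # assumed average search speed
hours_searched = 3.0        # assumed time on task
number_of_searchers = 5     # assumed team size

# Effort = searcher speed x time x number of searchers
effort_km = searcher_speed_kmh * hours_searched * number_of_searchers
print(f"Effort = {effort_km:.0f} km of searcher track")  # 30 km
```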

6
What is Detectability?
  • How can one measure or quantify how easy or hard
    it will be to detect a particular object with a
    particular type of resource (sensor) in a
    particular environment?

7
What about Maximum Detection Range?
  • Easy to measure directly.
  • Measures how far from the sensor an object can be
    detected by an alerted searcher who knows where
    to look.
  • Does not address whether the object will be
    detected within that range.
  • Does not measure how much detecting can be
    expected from a searcher (sensor).
  • No simple, predictable correlation with detection
    performance.

8
What about direct estimation?
  • Humans are very poor at estimating probabilities
    of any kind.
  • Compare:
  • How many of 10 objects would you have found?
  • How many of 10 objects could you have missed?
  • There is no one-size-fits-all POD for everything from small clues to large objects.
  • Direct estimation ≈ wild guess

9
Effective Sweep Width (Koopman)
  • Cannot be measured directly
  • Is an objective measure of detectability
  • Large value → easy to detect
  • Small value → hard to detect
  • Depends on the characteristics of:
  • Searcher/Sensor (What we are searching with.)
  • Search Object (What we are searching for.)
  • Environment (What we are searching in.)
  • Terrain, Vegetation, Weather, etc.
  • Has units of length (feet, meters, miles, etc.)

10
A Uniform Random Distribution

11
Effective Sweep Width
(Unrealistic Ideal Sensor Making a Clean Sweep)

Number detected = 40. Number missed within sweep width = 0. Number detected outside sweep width = 0.
12
Effective Sweep Width
(More Typical Sensor)

Number detected = 40. Number missed within sweep width = 16. Number detected outside sweep width = 16.
13
Effective Sweep Width Notes
  • In both of the previous examples, there were:
  • The same object density (number of objects per unit area),
  • The same length of searcher track, and
  • The same number of objects detected (40).
  • Therefore,
  • The effective sweep widths are also the same.
  • Effective sweep width represents the expected amount of detection (see the sketch below).

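A small sketch of the counting relationship these examples illustrate: for uniformly distributed objects, expected detections = object density × track length × effective sweep width, so the sweep width can be recovered from the counts. The density and track length values are assumptions, not data from the two scatter examples:

```python
# Expected detections = density x track length x effective sweep width,
# so W can be estimated from counts. Input values are assumptions.

object_density = 50.0     # objects per square kilometer (assumed)
track_length_km = 4.0     # length of searcher track (assumed)
detections = 40           # number of objects detected

# Solve for W: W = detections / (density x track length)
sweep_width_km = detections / (object_density * track_length_km)
print(f"Effective sweep width W = {sweep_width_km * 1000:.0f} m")  # 200 m
```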
14
Lateral Range (Koopman)
  • Distance to right or left of sensor at the
    closest point of approach (CPA)
  • Lateral range curve: detection probability as a function of lateral range at CPA (see the sketch below)

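In Koopman's formulation, the effective sweep width is the area under the lateral range curve, W = ∫ p(x) dx, where p(x) is the probability of detecting an object whose lateral range at CPA is x. A minimal numerical sketch, using an assumed bell-shaped curve rather than any measured data:

```python
import numpy as np

# W = area under the lateral range curve p(x).
# The Gaussian-shaped curve below is an assumed illustration only.

x = np.linspace(-150.0, 150.0, 301)     # lateral range in meters
p = np.exp(-(x / 60.0) ** 2)            # assumed detection probability p(x)

dx = x[1] - x[0]
W = float(np.sum((p[:-1] + p[1:]) / 2.0) * dx)   # trapezoid rule
print(f"Effective sweep width W = {W:.0f} m")    # about 106 m
```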
15
Effective Sweep Width
  • Key to Improved Search Planning and Evaluation
  • Improves POD Estimation
  • Allows us to Objectively Relate POD to Effort
    Expenditure
  • Has both Predictive and Retrospective Value
  • More Accurate and Reliable than Subjective
    Estimates
  • Based on Observable Factors
  • Improves Effort Allocation
  • Makes known, proven (mathematical) techniques
    available
  • Improves conceptualization of the search problem

16
Southern California
17
Southern California
18
Western Washington State
19
Western Washington State
20
Objective POD Estimation (For a Searched Segment)
  • Effort z = Total Distance Searchers Cover = search speed × time × number of searchers
  • Effective Sweep Width W comes from detection experiments
  • Area Effectively Swept = z × W
  • Coverage C = (Area Effectively Swept) ÷ (Area of Searched Segment)
  • POD = 1 − e^(−C) (Koopman)
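
A short sketch putting the slide's formulas together. Only the formulas come from the slide; the input values are illustrative assumptions:

```python
import math

# Koopman's exponential detection model for one searched segment.
# Input values are illustrative assumptions.

effort_km = 30.0           # z: total distance searchers covered
sweep_width_km = 0.04      # W: 40 m, from a detection experiment (assumed)
segment_area_km2 = 1.0     # area of the searched segment

area_swept = effort_km * sweep_width_km      # z x W
coverage = area_swept / segment_area_km2     # C
pod = 1.0 - math.exp(-coverage)              # POD = 1 - e^(-C)

print(f"Coverage C = {coverage:.2f}, POD = {pod:.0%}")  # C = 1.20, POD = 70%
```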
21
POD vs. Coverage Graph (Koopman)
22
Uncorrected Effective Sweep Widths in Nautical Miles
for Aerial Search Over Land (IAMSAR Manual)

                                    Meteorological Visibility (Nautical Miles)
Search Object    Altitude (Feet AGL)     3      5      10     15     20
Person                  500             0.4    0.4    0.5    0.5    0.5
                       1000             0.4    0.4    0.5    0.5    0.5
Vehicle                 500             0.9    1.3    1.3    1.3    1.3
                       1000             1.0    1.4    1.4    1.5    1.5
Small Aircraft          500             1.0    1.4    1.4    1.4    1.4
                       1000             1.0    1.5    1.5    1.6    1.6
Large Aircraft          500             1.2    2.0    2.2    2.2    2.2
                       1000             1.8    2.7    3.0    3.0    3.0
23
Effective Sweep Width Correction Factors
for Aerial Search Over Land (IAMSAR Manual) (Multipliers)

Search Object     15-60% vegetation    60-85% vegetation     Over 85%
                  or hilly             or mountainous        vegetation
Person                  0.5                  0.3                 0.1
Vehicle                 0.7                  0.4                 0.1
Small Aircraft          0.7                  0.4                 0.1
Large Aircraft          0.8                  0.4                 0.1
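
A sketch of how the two tables above combine: look up the uncorrected sweep width, then multiply by the terrain/vegetation correction factor. Only a few table entries are reproduced, and the dictionary keys are an illustrative encoding, not IAMSAR notation:

```python
# (object, altitude ft AGL, visibility NM) -> uncorrected sweep width, NM
UNCORRECTED_NM = {
    ("Person", 500, 10): 0.5,
    ("Vehicle", 1000, 10): 1.4,
}
# (object, terrain/vegetation class) -> correction multiplier
CORRECTION = {
    ("Person", "60-85% vegetation or mountainous"): 0.3,
    ("Vehicle", "60-85% vegetation or mountainous"): 0.4,
}

w = UNCORRECTED_NM[("Vehicle", 1000, 10)]
w_corrected = w * CORRECTION[("Vehicle", "60-85% vegetation or mountainous")]
print(f"Corrected sweep width = {w_corrected:.2f} NM")  # 0.56 NM
```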
24
Sweep Width Issues for Ground Search
  • Too many different types and combinations of terrain, vegetation, and search objects exist for a universal set of sweep width tables.
  • Each locale needs sweep widths only for its area of responsibility, typical search objects, etc.
  • Solution: Develop a standard, practical, and scientifically based procedure for local resources to use when developing sweep width estimates.

25
The Logan, West Virginia Demonstration Project

26
Project Support
  • Sponsored by the U. S. National Search and Rescue
    Committee (NSARC)
  • Funded by Department of Defense (NSARC member)
  • Contract administered by the U.S. Coast Guard (NSARC Chair) via the USCG Research and Development Center; performed by Potomac Management Group
  • Endorsed by NASAR and U. S. Air Force RCC
  • Hosted by Logan Emergency Ambulance Service
    Authority

27
Demonstration Project
  • Principal Investigator: R. Quincy Robe
  • Location: Chief Logan State Park, Logan, WV
  • Host: Roger Bryant, Director, Logan Emergency Ambulance Service Authority (LEASA)
  • Participants: Attendees at Logan SAR Weekend on 15-16 June 2002
  • Outstanding support and hospitality!

28
Demonstration Project Objectives
  • Design Practical Detection Experiment Procedures
    to determine Effective Sweep Width values for
    ground wilderness/rural searches.
  • Supervise a Demonstration of the Procedures Using
    Ground SAR Personnel.
  • Describe Method for Objectively Estimating POD
    from Effective Sweep Width, Effort, and Area.
  • Report Results and Describe Future Work required
    to generalize their application.

29
Concept of Operations (Preparation)
  • Select a typical area and typical search object
    types (no more than 3 types)
  • Select track(s) for searchers to follow (at least 1 hour to complete; longer is better)
  • Choose date, select participants, make logistic arrangements, set up schedule
  • Obtain/construct search objects (about 10 of each)

30
Concept of Operations (Execution)
  • Place objects at random locations along the track
    and random distances on either side
  • Send searcher/data recorder pairs along the track
    at timed intervals (to ensure separation)
  • Searchers move at normal search speed and report
    all sightings of search objects
  • Data recorders record searcher sighting reports
    and other pertinent data
  • Collect and analyze the recorded data

31
Chief Logan State Park
32
Select Search Track
33
Search Objects
Orange Glove
Garbage Bag
34
Determining Object Locations
  • Useful range of distances off track
  • Too close → insufficient data at longer ranges
  • Too far → wasted detection opportunities
  • Useful range of distances along track
  • Too close → frequent reinforcement → heightened alertness
  • Too far → track too long for a reasonable time
  • Use the Average Maximum Detection Range (AMDR)

35
Average Maximum Detection Range
36
Select Object Placement
  • Randomize
  • Distances along the track
  • Distances off track
  • Right or Left of track
  • Object types
  • Determine locations based on the largest AMDR (see the sketch below)
  • Average separation along track of 3 × AMDR
  • Off track up to 1.5 × AMDR
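
A minimal sketch of these placement rules, matching the zone layout on the next slide (200 m zones starting every 300 m for AMDR = 100 m). The track length is an assumption:

```python
import random

# Randomized object placement: zones averaging 3 x AMDR apart along the
# track, offsets up to 1.5 x AMDR, random side and object type.
AMDR = 100.0             # meters; largest AMDR among the object types
TRACK_LENGTH = 1500.0    # meters (assumed)

placements = []
zone_start = AMDR        # first zone begins 100 m in (as on the next slide)
while zone_start + 2 * AMDR <= TRACK_LENGTH:
    along = random.uniform(zone_start, zone_start + 2 * AMDR)
    offset = random.uniform(0.0, 1.5 * AMDR)    # cross-track distance
    side = random.choice(["left", "right"])
    obj_type = random.choice(["A", "B"])
    placements.append((round(along), round(offset), side, obj_type))
    zone_start += 3 * AMDR                      # next zone 300 m later

for p in placements:
    print(p)
```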

37
Example of Object Locations (AMDR = 100 m)

Location        Track Interval      Along-Track    Cross-Track    Object Type
Location 1       100 to 300 m          241 m       122 m right         B
Location 2       400 to 600 m          442 m        47 m left          A
Location 3       700 to 900 m          886 m        69 m right         A
Location 4      1000 to 1200 m        1033 m        22 m left          B
Location 5      1300 to 1500 m        1420 m        45 m left          A
Additional locations follow the same pattern to the end of the track.
38
Search Object Location Zones

39
What is a Detection Opportunity?
  • For the purposes of a detection experiment, a
    detection opportunity is defined as one complete
    pass by the search object.
  • If there are 15 identical search objects of a given type and 30 searchers in an experiment, then there are a total of 15 × 30 = 450 detection opportunities for that type.
  • Each detection opportunity has one of two results: detection or non-detection.

40
Important Notes
  • When performing a detection experiment, it is important to understand that:
  • The relationship between the searcher (sensor)
    and the search object during the window of
    detection opportunity must be captured, and
  • Knowing when non-detection occurs is just as
    important as knowing when detection occurs.

41
Important Notes
  • The experiment is NOT a competitive event
  • The experiment does NOT measure individual
    searcher proficiency
  • Do NOT tell searchers how many objects are present or how far off track they may be, and do NOT give any other hints
  • DO Collect additional data (e.g., weather, time
    of day, terrain and vegetation descriptions,
    searcher training/experience data, etc.) for
    later analysis

42
Perform Experiment
  • Secretly Place Objects at Selected Locations
  • Send Searcher/Data Recorder Pairs along the
    Selected Track at Timed Intervals
  • Collect Completed Detection Data Forms
  • Remove Objects at the Experiment's Conclusion (discard the data for any objects that cannot be accounted for)
  • Compile, Sort and Analyze the Detection Data

43
Detection Log
44
Calculate Sweep Width
  • Use the following property of sweep width:
  • The number of detections outside a swath one sweep width wide, centered on the searcher's track, equals the number of missed detections inside that swath.
  • Equivalently, the number of detections at lateral ranges greater than one-half the sweep width equals the number of missed detections at lateral ranges less than one-half the sweep width.
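
A sketch of applying that balancing rule to experiment data: scan candidate half-widths and pick the one where detections beyond W/2 balance misses inside W/2. The (lateral range, detected) pairs are made-up illustration data, not Logan results:

```python
# Each tuple is (lateral range in meters, detected?). Illustrative data.
data = [(5, True), (12, True), (18, False), (22, True), (30, False),
        (35, True), (48, False), (55, True), (70, False), (90, False)]

def imbalance(half_width):
    """Detections outside half_width minus misses inside half_width."""
    detections_outside = sum(1 for r, hit in data if hit and r > half_width)
    misses_inside = sum(1 for r, hit in data if not hit and r <= half_width)
    return detections_outside - misses_inside

# Coarse 1 m scan for the balance point (imbalance closest to zero).
half_w = min(range(0, 101), key=lambda h: abs(imbalance(h)))
print(f"Effective sweep width W = {2 * half_w} m")  # 60 m with this data
```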

45
Logan Demonstration Statistics
  • 32 Searchers Participated
  • 12 Orange Gloves were placed
  • Glove AMDR = 19 meters
  • 32 × 12 = 384 Detection Opportunities
  • 9 Black Garbage Bags were placed
  • Bag AMDR = 25 meters (1.5 × 25 = 37.5 meters)
  • 32 × 9 = 288 Detection Opportunities

46
Consolidated Detection Data
47
Orange Glove Sweep Width
(AMDR = 25 m) (12 Gloves, 32 Searchers)
48
Orange Glove Half Lateral Range Curve
49
Orange Glove Modified Sweep Width
50
Orange Glove Modified Half LRC
51
Black Bag Sweep Width
(AMDR = 25 m) (9 Bags, 32 Searchers)
52
Black Bag Lateral Range Curve
53
Lessons Learned
  • AMDR did not work well
  • Poor choice of location?
  • Poor technique by investigators?
  • Should have been repeated several times in
    different locations
  • May need to use the maximum, rather than the average maximum, detection range
  • Need a steady flow of searcher/data recorder pairs

54
Future Work
  • Validate and refine detection experiment
    procedures in 3 different venues with different
    SAR groups and personnel during the next year.
  • Publish the refined procedures and make them
    available upon request.
  • Extend techniques to include aerial search over land (CAP, CASARA, etc.).
  • Develop more advanced search planning methods
    appropriate for the land SAR community.

55
Future Work (continued)
  • Develop functional requirements for software
    tools to support land SAR search planning.
  • Survey existing software packages for synergistic
    opportunities.
  • Develop software (modules) to support land SAR
    search planning functions.

56
Conclusions
  • A practical detection experiment procedure is
    feasible.
  • Effective sweep width results make scientifically
    proven search planning methods available for use
    in land SAR.
  • Objective, accurate, reliable POD estimation is possible.
  • More nearly optimal resource allocation can be done:
  • Increase probability of success (POS) at maximum
    rate.
  • Minimize mean time to find survivors.
  • Save more lives.
  • Minimize risks to searchers through reduced
    exposure times.
  • Minimize costs through shorter searches on
    average.

57
Conclusions (continued)
  • Effort needed is comparable to a SAREX.
  • No special skills, tools or equipment required
    (although some items would be helpful).
  • Data should be archived at a central site.
  • Additional data gathered will support later analyses of important secondary effects:
  • For example, correction factors to extend
    usability of effective sweep width data to
    situations other than those of the experiments.

58
THANK YOU!
Potomac Management Group, Inc.
510 King Street, Suite 200
Alexandria, VA 22314
Attn: J. R. Frost
703-836-1037 or 202-267-6702 (USCG)
E-mail: jfrost@potomacmgmt.com or jfrost@comdt.uscg.mil