Transcript and Presenter's Notes

Title: Final Report Alpha Squad Seven


1
Final Report: Alpha Squad Seven
  • Members
  • Erik Bass
  • Benjamin Carter
  • Rahul Kapoor
  • Steven Koegler
  • Jared Schlicher
  • Matthew Werner
  • Coach
  • Alex Hsieh
  • Date
  • 3/15/07

2
Software Architecture
3
Camera
  • Controlling the camera used ptzProxy()
  • Creating an image ran into the following problems:
  • captureframe: undefined jpeg; header file not accessible
  • cameraproxy: undefined jpeg
  • framgrabber: undefined built-in function; downloaded header not
    compiling
4
Point Translation and Boundaries
  • Use the robot's direction and robot-relative points to find global
    points (a minimal sketch follows this list)
  • Robot-relative points come from the lidar
  • Robot direction is based on Δx and Δy
  • Global points are checked against the boundary conditions
  • Boundaries are the GPS coordinates of the walls
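
A minimal sketch of this translation and boundary check, assuming a rectangular boundary and hypothetical names (Pose, translate_point, in_bounds); the team's actual code is not shown on the slides:

  /* Hypothetical sketch: rotate a robot-relative lidar point by the
     robot's heading, add the robot's global position, and test the
     result against a rectangular boundary. */
  #include <math.h>
  #include <stdbool.h>

  typedef struct { double x, y, heading; } Pose;   /* global position and heading (rad) */
  typedef struct { double x, y; } Point;

  /* Heading estimated from successive position fixes: atan2(dy, dx). */
  double heading_from_delta(double dx, double dy) { return atan2(dy, dx); }

  /* Rotate a robot-relative lidar point into the global frame. */
  Point translate_point(Pose robot, Point rel)
  {
      Point g;
      g.x = robot.x + rel.x * cos(robot.heading) - rel.y * sin(robot.heading);
      g.y = robot.y + rel.x * sin(robot.heading) + rel.y * cos(robot.heading);
      return g;
  }

  /* Check a global point against rectangular boundary coordinates
     (here assumed to come from the GPS coordinates of the walls). */
  bool in_bounds(Point g, double min_x, double max_x, double min_y, double max_y)
  {
      return g.x >= min_x && g.x <= max_x && g.y >= min_y && g.y <= max_y;
  }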

5
LIDAR Function
  • ID 1: Object Avoidance
  • Filters out objects that fall outside the box
  • Returns the y-distance (R) of the closest object
  • The y-dimension of the box is a variable
  • The x-dimension is hard-coded at 0.7 m (see the sketch after this
    list)
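
A minimal sketch of the ID 1 box filter, assuming the lidar points are already in the robot frame and treating the 0.7 m figure as the half-width of the box; the function and variable names are illustrative:

  #include <stddef.h>
  #include <math.h>

  #define HALF_WIDTH_X 0.7   /* hard-coded x-dimension of the box, metres (assumed half-width) */

  typedef struct { double x, y; } Point;

  /* Return the y-distance R of the closest object inside the box, or
     -1.0 if nothing lies inside it.  y_max is the variable y-dimension. */
  double closest_in_box(const Point *pts, size_t n, double y_max)
  {
      double r = -1.0;
      for (size_t i = 0; i < n; ++i) {
          if (fabs(pts[i].x) > HALF_WIDTH_X)      continue;  /* outside box width  */
          if (pts[i].y < 0.0 || pts[i].y > y_max) continue;  /* outside box length */
          if (r < 0.0 || pts[i].y < r)
              r = pts[i].y;                                  /* new closest object */
      }
      return r;
  }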

6
LIDAR Function
  • ID 2: Intersection
  • Scans 180 degrees
  • Translates the points of detected objects
  • Compares them with the intersection coordinates
  • Decides wait/go (a minimal sketch follows this list)
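
A minimal sketch of the wait/go decision, assuming the intersection coordinates can be treated as a rectangle in the global frame; the types and names are illustrative:

  #include <stdbool.h>
  #include <stddef.h>

  typedef struct { double x, y; } Point;
  typedef struct { double min_x, max_x, min_y, max_y; } Box;  /* intersection coordinates */

  /* Return true (wait) if any translated object point lies inside the
     intersection box; otherwise return false (go). */
  bool must_wait(const Point *global_pts, size_t n, Box in)
  {
      for (size_t i = 0; i < n; ++i) {
          if (global_pts[i].x >= in.min_x && global_pts[i].x <= in.max_x &&
              global_pts[i].y >= in.min_y && global_pts[i].y <= in.max_y)
              return true;   /* object inside the intersection: wait */
      }
      return false;          /* intersection clear: go */
  }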

7
Way-point Following
  • The Initial Problem
  • The Solution (Look-Up Table)
  • The Implementation
  • The Calibration

8
The Initial Problem
  • The way-point program initially had problems when the robot was not
    aligned properly with the first way-point.
  • The robot would swerve back and forth across the ideal line from one
    way-point to the next.
  • However, if the robot was aligned properly with the first way-point,
    it could navigate the given seven way-points along a somewhat smooth
    line.

9
The Solution
  • Create a look-up table based on two values: the angle and the
    distance to the way-point.
  • The look-up table's angle was incremented in units of fifteen
    degrees until it reached seventy-five degrees.
  • The distance started at 0.5 m and was incremented by a full meter
    until it reached 4.5 m.
  • With five distance states and six angle states, the table gives a
    total of thirty steering inputs.

10
The Implementation
  • The look-up table was implemented in C.
  • Six stacked if-else conditional statements for
    the angle check.
  • Nested if-else statement for the distance check.
  • Within the appropriate if-else statement, the steering angle is
    assigned (a minimal sketch follows this list).
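
A minimal sketch of this structure, following the slide's description (stacked if-else on the angle, nested if-else on the distance); the MAX_STEER value and the gain fractions are placeholders, since the tuned values are not given on the slides:

  #define MAX_STEER 1.0   /* maximum steering command; placeholder value and units */

  /* angle: magnitude of the heading error to the way-point, in degrees
     dist:  distance to the way-point, in metres
     Returns a steering magnitude; the caller applies the sign. */
  double steering_lookup(double angle, double dist)
  {
      double gain;   /* fraction (0 to 1) of MAX_STEER, set per state during calibration */

      if (angle < 15.0) {                       /* first of six angle bands */
          if (dist < 1.5)      gain = 0.10;     /* five distance states per band */
          else if (dist < 2.5) gain = 0.08;
          else if (dist < 3.5) gain = 0.06;
          else if (dist < 4.5) gain = 0.05;
          else                 gain = 0.04;
      } else if (angle < 30.0) {                /* second angle band */
          if (dist < 1.5)      gain = 0.30;
          else if (dist < 2.5) gain = 0.25;
          else if (dist < 3.5) gain = 0.20;
          else if (dist < 4.5) gain = 0.15;
          else                 gain = 0.12;
      } else {
          /* The remaining angle bands (up to seventy-five degrees) follow
             the same pattern, each with its own tuned gains. */
          gain = 1.0;
      }
      return gain * MAX_STEER;
  }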

11
The Calibration
  • First, every steering input was set to the maximum steering.
  • The steering for each table entry was then adjusted by multiplying
    the maximum steering by a value from zero to one, essentially taking
    a percentage of the maximum available steering depending on the
    state.
  • A guess was entered for each state, and the steering was tuned
    through trial and error on way-point following.
  • The table was then applied to the intersection problem, the
    lane-change problem, and the object-avoidance problem, with
    adjustments made where needed.

12
Object Avoidance
  • Use the lidar to scan for an object
  • When an object is found, check its distance
  • Ignore large distances
  • If the distance is too small, stop
  • If the distance is in between:
  • If the link has 1 lane, stop
  • If the link has 2 lanes, switch lanes (see the sketch after this
    list)
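
A minimal sketch of this decision tree; the two distance thresholds and the action names are placeholders, not values from the slides:

  typedef enum { KEEP_DRIVING, STOP, SWITCH_LANE } Action;

  #define IGNORE_DIST 20.0   /* metres: objects farther than this are ignored (placeholder) */
  #define STOP_DIST    3.0   /* metres: objects closer than this force a stop (placeholder) */

  /* object_dist: lidar distance to the detected object (negative if none)
     lanes_on_link: number of lanes on the current link */
  Action avoid_decision(double object_dist, int lanes_on_link)
  {
      if (object_dist < 0.0 || object_dist > IGNORE_DIST)
          return KEEP_DRIVING;               /* nothing relevant detected */
      if (object_dist < STOP_DIST)
          return STOP;                       /* too close to do anything else */
      /* in-between range: the behaviour depends on the link */
      return (lanes_on_link >= 2) ? SWITCH_LANE : STOP;
  }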

13
Lane Switch Logic
  • Find the current lane position
  • Create a line based on opposite-lane points
  • The line is found using the two closest opposite-lane points
  • Once the line is found, create a new way-point in the other lane
    (sketched after this list)
  • Once past the object, switch back into the original lane
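
A minimal sketch of placing the new way-point on the opposite-lane line; projecting the robot position onto the line through the two closest opposite-lane points is an assumption about where on that line the way-point goes:

  typedef struct { double x, y; } Point;

  /* a, b: the two closest opposite-lane points; robot: current position.
     Returns the point on the line through a and b nearest the robot,
     used here as the temporary lane-change way-point. */
  Point lane_change_waypoint(Point robot, Point a, Point b)
  {
      double dx = b.x - a.x, dy = b.y - a.y;
      double len2 = dx * dx + dy * dy;
      if (len2 == 0.0)
          return a;                          /* degenerate case: points coincide */
      double t = ((robot.x - a.x) * dx + (robot.y - a.y) * dy) / len2;
      Point wp = { a.x + t * dx, a.y + t * dy };
      return wp;
  }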

14
Intersection Problem
  • How to determine if we are at an intersection
  • How to determine if there is a stop sign for the
    intersection

15
Reading the Network File
  • The intersection problem is solved by reading the network file
  • The network file contains node and stop-sign information
  • The code reads this information to determine if the robot is at the
    end of a link
  • Then the code determines if there is a node at the end of the link
    the robot is on
  • Finally, the code determines if that node contains a stop sign (a
    minimal sketch follows this list)
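
A minimal sketch of these checks, assuming the network file has already been parsed into simple Node and Link structures; the actual file layout and field names are not shown on the slides:

  #include <stdbool.h>
  #include <stddef.h>

  typedef struct { int id; bool has_stop_sign; double x, y; } Node;
  typedef struct { int end_node_id; } Link;

  /* True when the robot is within a threshold of the link's end node. */
  bool at_end_of_link(double robot_x, double robot_y, const Node *end_node,
                      double threshold)
  {
      double dx = end_node->x - robot_x, dy = end_node->y - robot_y;
      return dx * dx + dy * dy <= threshold * threshold;
  }

  /* Find the node at the end of the link and report whether it carries
     a stop sign. */
  bool stop_sign_at_link_end(const Link *link, const Node *nodes, size_t n_nodes)
  {
      for (size_t i = 0; i < n_nodes; ++i)
          if (nodes[i].id == link->end_node_id)
              return nodes[i].has_stop_sign;
      return false;   /* node not found: treat as no stop sign */
  }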

16
Navigating an Intersection
  • If there is a stop sign, stop for a set time
  • During the stop, use lidar data to determine if an object is within
    the defined intersection
  • If an object is present, pause for more time than the initial stop
  • If no object is present, proceed after the initial pause
  • If the node has no stop sign, continue driving (a minimal timing
    sketch follows this list)
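
A minimal timing sketch of this behaviour; the two pause durations are placeholders, and object_in_intersection would come from a lidar check like the one sketched under the ID 2 LIDAR function:

  #include <stdbool.h>

  #define INITIAL_STOP_S  3.0   /* seconds to hold at the stop sign (placeholder) */
  #define EXTENDED_STOP_S 6.0   /* longer pause while an object occupies the intersection (placeholder) */

  /* elapsed_s: seconds already spent stopped at this intersection.
     Returns the remaining wait in seconds; 0 means proceed. */
  double intersection_wait(bool has_stop_sign, bool object_in_intersection,
                           double elapsed_s)
  {
      if (!has_stop_sign)
          return 0.0;                        /* no stop sign: continue driving */
      double target = object_in_intersection ? EXTENDED_STOP_S : INITIAL_STOP_S;
      return (target > elapsed_s) ? target - elapsed_s : 0.0;
  }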

17
Conclusion
  • Control Issues
  • Point Translation and Boundary Conditions
  • Driving Control
  • LiDAR Function
  • Obstacle Avoidance
  • Intersection Navigation

18
Questions