JPL Technical Report Server

Fully self-contained vision-aided navigation and landing of a micro air vehicle independent from external sensor inputs

dc.contributor.author Brockers, Roland
dc.contributor.author Susca, Sara
dc.contributor.author Zhu, David
dc.contributor.author Matthies, Larry
dc.date.accessioned 2013-02-25T21:49:19Z
dc.date.available 2013-02-25T21:49:19Z
dc.date.issued 2012-04-26
dc.identifier.citation SPIE Symposium on Defense, Security, and Sensing, Baltimore, Maryland, April 23-27, 2012 en_US
dc.identifier.clearanceno 12-2072
dc.identifier.uri http://hdl.handle.net/2014/42781
dc.description.abstract Direct-lift micro air vehicles have important applications in reconnaissance. In order to conduct persistent surveillance in urban environments, it is essential that these systems be able to perform autonomous landing maneuvers on elevated surfaces that provide high vantage points, without the help of any external sensor and with a fully contained on-board software solution. In this paper, we present a micro air vehicle that uses vision feedback from a single down-looking camera to navigate autonomously and detect an elevated landing platform as a surrogate for a rooftop. Our method requires no special preparation (labels or markers) of the landing location. Rather, leveraging the planar character of urban structure, the landing platform detection system uses a planar homography decomposition to detect landing targets and produce approach waypoints for autonomous landing (an illustrative sketch of this step follows the record below). The vehicle control algorithm uses a Kalman-filter-based approach for pose estimation, fusing visual SLAM (PTAM) position estimates with IMU data to correct for high-latency SLAM inputs and to increase the position estimate update rate, thereby improving control stability. Scale recovery is achieved using inputs from a sonar altimeter. In experimental runs, we demonstrate a real-time implementation running on board a micro air vehicle that is fully self-contained and independent of any external sensor information. With this method, the vehicle is able to search autonomously for a landing location and perform precision landing maneuvers on the detected targets. en_US
dc.description.sponsorship NASA/JPL en_US
dc.language.iso en_US en_US
dc.publisher Pasadena, CA : Jet Propulsion Laboratory, National Aeronautics and Space Administration, 2012. en_US
dc.subject micro air vehicles (MAVs) en_US
dc.subject autonomous landing en_US
dc.title Fully self-contained vision-aided navigation and landing of a micro air vehicle independent from external sensor inputs en_US
dc.type Preprint en_US
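
The abstract's landing-platform detection rests on a planar homography decomposition. As a rough illustration of that general technique only (not the authors' implementation: the OpenCV calls, the plane-selection heuristic, and the use of the altimeter reading as the plane distance are all assumptions), one could estimate and decompose a homography between two frames from the down-looking camera as follows:

    # Illustrative sketch (not the paper's code): detecting a dominant planar
    # surface via homography decomposition, assuming OpenCV (cv2) is available
    # and feature matching has already produced point correspondences.
    import cv2

    def detect_planar_surface(pts_prev, pts_cur, K, altitude_m):
        """Estimate the dominant plane seen by a down-looking camera.

        pts_prev, pts_cur : Nx2 float32 arrays of matched pixel coordinates
        K                 : 3x3 camera intrinsic matrix
        altitude_m        : metric scale hint, e.g. a sonar altimeter reading
        """
        # Robustly fit a homography; RANSAC inliers are assumed to lie on a
        # single plane (the rooftop surrogate / landing platform).
        H, inlier_mask = cv2.findHomography(pts_prev, pts_cur, cv2.RANSAC, 3.0)
        if H is None:
            return None

        # Decompose the homography into candidate rotations, scaled
        # translations (t / d, with d the distance to the plane), and normals.
        num, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)

        # Keep the candidate whose plane normal points back toward the
        # down-looking camera, i.e. a roughly horizontal surface below the MAV.
        best = max(range(num), key=lambda i: -normals[i][2, 0])
        n = normals[best].ravel()
        t_over_d = translations[best].ravel()

        # The decomposition recovers translation only up to the plane distance
        # d; a metric altitude measurement can resolve the scale (assumption:
        # the altimeter reading approximates d for a near-nadir camera).
        t_metric = t_over_d * altitude_m

        return {"normal": n,
                "translation_m": t_metric,
                "inlier_ratio": float(inlier_mask.mean())}

The sketch stops at plane estimation; the paper additionally derives approach waypoints from the detected elevated surface and fuses PTAM pose estimates with IMU data in a Kalman filter for control, which is not shown here.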

