JPL Technical Report Server

Using arm and hand gestures to command robots during stealth operations

dc.contributor.author Stoica, Adrian
dc.contributor.author Assad, Chris
dc.contributor.author Wolf, Michael
dc.contributor.author You, Ki Sung
dc.contributor.author Pavone, Marco
dc.contributor.author Huntsberger, Terry
dc.contributor.author Iwashita, Yumi
dc.date.accessioned 2013-02-25T21:48:40Z
dc.date.available 2013-02-25T21:48:40Z
dc.date.issued 2012-04-23
dc.identifier.citation SPIE Symposium on Defense, Security, and Sensing, Baltimore, Maryland, April 23-27, 2012 en_US
dc.identifier.clearanceno 12-1465
dc.identifier.uri http://hdl.handle.net/2014/42780
dc.description.abstract Command of support robots by the warfighter requires intuitive interfaces to quickly communicate high degree-of-freedom (DOF) information while leaving the hands unencumbered. Stealth operations rule out voice commands and vision-based gesture interpretation techniques, as they often entail silent operations at night or in other low visibility conditions. Targeted at using bio-signal inputs to set navigation and manipulation goals for the robot (say, simply by pointing), we developed a system based on an electromyography (EMG) "BioSleeve", a high density sensor array for robust, practical signal collection from forearm muscles. The EMG sensor array data is fused with inertial measurement unit (IMU) data. This paper describes the BioSleeve system and presents initial results of decoding robot commands from the EMG and IMU data using a BioSleeve prototype with up to sixteen bipolar surface EMG sensors. The BioSleeve is demonstrated on the recognition of static hand positions (e.g. palm facing front, fingers upwards) and on dynamic gestures (e.g. hand wave). In preliminary experiments, over 90% correct recognition was achieved on five static and nine dynamic gestures. We use the BioSleeve to control a team of five LANdroid robots in individual and group/squad behaviors. We define a gesture composition mechanism that allows the specification of complex robot behaviors with only a small vocabulary of gestures/commands, and we illustrate it with a set of complex orders. en_US
dc.description.sponsorship NASA/JPL en_US
dc.language.iso en_US en_US
dc.publisher Pasadena, CA : Jet Propulsion Laboratory, National Aeronautics and Space Administration, 2012. en_US
dc.subject human robot interfaces en_US
dc.subject gesture recognition en_US
dc.subject electromyography en_US
dc.subject EMG sensor arrays en_US
dc.subject stealth operations en_US
dc.title Using arm and hand gestures to command robots during stealth operations en_US
dc.type Preprint en_US
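The abstract describes decoding gestures from fused EMG and IMU data and composing a small gesture vocabulary into complex squad-level orders. The Python sketch below illustrates one plausible structure for such a pipeline; the windowing, per-channel RMS features, nearest-centroid classifier, and composition mappings are assumptions for illustration only and are not taken from the paper, which reports the sensor count (up to sixteen bipolar EMG channels) and recognition rates but not its decoding algorithm.

    import numpy as np

    N_EMG = 16      # bipolar surface EMG channels, per the paper
    WINDOW = 200    # samples per analysis window (assumed value)

    def extract_features(emg_window, imu_window):
        """emg_window: (WINDOW, N_EMG); imu_window: (WINDOW, 3) roll/pitch/yaw.

        Per-channel RMS captures muscle activation level; the mean IMU
        orientation captures gross arm pose. Concatenated, they form a
        single feature vector fusing both sensor streams.
        """
        rms = np.sqrt(np.mean(emg_window ** 2, axis=0))
        orientation = imu_window.mean(axis=0)
        return np.concatenate([rms, orientation])

    class NearestCentroidDecoder:
        """Stand-in classifier; the paper does not name its decoder."""

        def fit(self, X, labels):
            labels = np.asarray(labels)
            self.classes = sorted(set(labels))
            self.centroids = np.stack(
                [X[labels == c].mean(axis=0) for c in self.classes])
            return self

        def predict(self, feature_vec):
            dists = np.linalg.norm(self.centroids - feature_vec, axis=1)
            return self.classes[int(np.argmin(dists))]

    # Gesture composition: a short sequence of recognized gestures expands
    # into a complex order, so a small vocabulary covers many behaviors.
    # These mappings are illustrative, not drawn from the paper.
    COMPOSED_ORDERS = {
        ("select_squad", "point", "advance"): "squad moves to pointed location",
        ("select_robot", "halt"): "selected robot stops in place",
    }

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Synthetic training data: two gesture classes, 10 windows each.
        emg = rng.normal(size=(20, WINDOW, N_EMG))
        imu = rng.normal(size=(20, WINDOW, 3))
        X = np.stack([extract_features(e, i) for e, i in zip(emg, imu)])
        y = ["wave"] * 10 + ["halt"] * 10
        decoder = NearestCentroidDecoder().fit(X, y)
        print(decoder.predict(X[0]))   # typically "wave" on this synthetic data

A richer model, such as an SVM for the static hand positions or a hidden Markov model for the dynamic gestures, could be substituted behind the same fit/predict interface without changing the feature extraction or composition layers.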

