NREC Developing an Intuitive Natural Interface for LS3

A natural, hands-free interface for LS3 will make the robot easier to use and speed its adoption. (Photo courtesy of DARPA.)
NREC researchers are working on a natural, hands-free user interface for the LS3, the successor to Boston Dynamics’ famous four-legged walking robot, BigDog.  They’re partnering with user interface and human factors gurus Dezudio and Kicker Studio to document a set of voice, gesture, and haptic (touch/tactile) commands for controlling the LS3 that will replace a hand-held control device.

Essentially, the LS3 is a robotic pack mule.  It is designed to carry heavy loads and autonomously follow a person around.  Using arm motions, spoken commands, or even touch to interact with it is a natural fit.  The project’s overall goal is to make using the LS3 easier and more intuitive. The Robotics Technology Consortium is underwriting this six-month effort.

Virtually all mobile robots are operated with a control device – everything from bulky, suitcase-sized units to small, hand-held game controllers.  While these devices support an enormous range of functions, they are often difficult to use.  A control device adds extra weight to the heavy loads that dismounted warfighters already must carry.  It’s one more item to keep track of.  Worse, a warfighter often must concentrate on operating the robot to the exclusion of all else.  This is a big problem in theater, where warfighters need to focus on their missions and their surroundings.  They cannot afford to devote a significant amount of time and attention to a robot, no matter how useful it might be. 

A well-designed operator interface that uses natural human inputs instead of a control device avoids these problems.  Warfighters can make full use of the robot without having to lug a controller around.  Interacting with a robot through gesture, voice, or other natural inputs is simpler and much less distracting than using a hand-held device.  It can reduce training time and improve the robot’s adoption rate.

NREC and its partners are performing extensive field studies of warfighters in familiar mission scenarios.  This user-centered research will form the basis of an intuitive, naturalistic interface for controlling the LS3.  It will specify voice, gesture, and haptic commands that are easy to remember and carry out.  NREC is also recommending sensors and developing control software to detect, interpret, and execute these commands.    
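To make the detect-interpret-execute idea concrete, the sketch below shows one rough way a multimodal command pipeline could be structured. It is an illustrative assumption only: the class names, input channels, and command vocabulary are hypothetical and do not come from NREC's actual software or the commands being specified in this study.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class Command(Enum):
    """Commands a warfighter might issue by voice, gesture, or touch (illustrative)."""
    FOLLOW_ME = auto()
    STOP = auto()
    SIT = auto()
    COME_HERE = auto()


@dataclass
class SensorEvent:
    """A detected input from one channel: speech, vision, or haptic sensing."""
    channel: str   # "voice", "gesture", or "haptic"
    payload: str   # e.g. a recognized phrase, gesture label, or touch pattern


def interpret(event: SensorEvent) -> Optional[Command]:
    """Map a detected natural input to a robot command; None if unrecognized."""
    mapping = {
        ("voice", "follow me"): Command.FOLLOW_ME,
        ("voice", "stop"): Command.STOP,
        ("gesture", "raised_fist"): Command.STOP,
        ("gesture", "beckon"): Command.COME_HERE,
        ("haptic", "double_tap_flank"): Command.SIT,
    }
    return mapping.get((event.channel, event.payload))


def execute(command: Command) -> None:
    """Placeholder for handing the command to the robot's motion-control layer."""
    print(f"Executing {command.name}")


# Example: a spoken command and a gesture arrive from the sensing layer.
for event in [SensorEvent("voice", "follow me"), SensorEvent("gesture", "raised_fist")]:
    cmd = interpret(event)
    if cmd is not None:
        execute(cmd)
```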

Distribution Statement “A” (Approved for Public Release, Distribution Unlimited)