Applied Bionics and Biomechanics
Volume 2, Issue 3-4, Pages 171-178

Sensory Integration with Articulated Motion on a Humanoid Robot

J. Rojas and R. A. Peters II

Center for Intelligent Systems, Vanderbilt University, Nashville TN, USA

Copyright © 2005 Hindawi Publishing Corporation. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


This paper describes the integration of articulated motion with auditory and visual sensory information that enables a humanoid robot to achieve certain reflex actions that mimic those of people. Reflexes such as reach-and-grasp behavior enable the robot to learn, through experience, its own state and that of the world. A humanoid robot with binaural audio input, stereo vision, and pneumatic arms and hands exhibited tightly coupled sensory-motor behaviors in four different demonstrations. Each successive demonstration increased in complexity to show that the reflexive sensory-motor behaviors combine to perform progressively more difficult tasks. The humanoid robot executed these tasks effectively and established the groundwork for the further development of hardware and software systems, sensory-motor vector-space representations, and coupling with higher-level cognition.