Abstract

The way that natural systems navigate their environments with agility, intelligence and efficiency is an inspiration to engineers. Biological attributes such as modes of locomotion, sensory modalities, behaviours and physical appearance have been used as design goals. While methods of locomotion allow robots to move through their environment, the addition of sensing, perception and decision making is necessary to perform this task autonomously. This paper contrasts how two separate sensing modalities – tactile antennae and non-contact sensing – combined with a capable, low-computation microcontroller allow a biologically abstracted mobile robot to make insect-inspired decisions when encountering a shelf-like obstacle, to navigate a cluttered environment without collision, and to seek vision-based goals while avoiding obstacles.