Abstract
The way that natural systems navigate their environments with agility, intelligence and efficiency is an inspiration to engineers. Biological attributes such as modes of locomotion, sensory modalities, behaviours and physical appearance have been used as design goals. While methods of locomotion allow robots to move through their environment, the addition of sensing, perception and decision making is necessary for robots to navigate with autonomy. This paper contrasts how two separate sensing modalities – tactile antennae and non-contact sensing – together with a capable, low-computation microcontroller allow a biologically abstracted mobile robot to make insect-inspired decisions when encountering a shelf-like obstacle, to navigate a cluttered environment without collision, and to seek vision-based goals while avoiding obstacles.
Acknowledgement
This material is based upon work supported by the National Science Foundation (NSF) under grant no. 0516587 and IGERT training grant DGE 9972747, by the AFOSR under grants FA9550-07-1-0149 and F08630-03-1-0003, and by the NASA Langley Research Center under grant NNL06AA19G.