Linked local navigation for visual route guidance

Smith, Lincoln, Philippides, Andrew, Graham, Paul, Baddeley, Bart and Husbands, Philip (2007) Linked local navigation for visual route guidance. Adaptive Behavior, 15 (3). pp. 257-271. ISSN 1059-7123



Insects are able to navigate reliably between food and nest using only visual information. This behavior has inspired many models of visual landmark guidance, some of which have been tested on autonomous robots. The majority of these models work by comparing the agent's current view with a view of the world stored when the agent was at the goal. The region from which agents can successfully reach home is therefore limited to the goal's visual locale, that is, the area around the goal where the visual scene is not radically different from that at the goal position. Ants are known to navigate over large distances using visually guided routes consisting of a series of visual memories. Taking inspiration from such route navigation, we propose a framework for linking together local navigation methods. We implement this framework on a robotic platform and test it in a series of environments in which local navigation methods alone fail. Finally, we show that the framework is robust to environments of varying complexity.
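The general idea described in the abstract, homing on a stored snapshot by greedy view matching, then chaining snapshots into a route, can be sketched in a few lines. This is a minimal illustrative toy, not the paper's implementation: the landmark-distance "views", the grid world, the greedy-descent homing rule, and the match threshold are all assumptions made for the sketch.

```python
import math

# Hypothetical fixed landmarks defining the visual world (illustrative only).
LANDMARKS = [(0.0, 5.0), (8.0, 0.0), (10.0, 10.0), (-3.0, 7.0)]

def view(pos):
    # A stand-in "panoramic view": the vector of distances to each landmark.
    return [math.dist(pos, lm) for lm in LANDMARKS]

def view_difference(a, b):
    # Sum of squared differences between two views; zero when they match.
    return sum((x - y) ** 2 for x, y in zip(a, b))

MOVES = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)]

def local_homing_step(pos, snapshot):
    # Local navigation: move to the neighbouring grid cell whose view best
    # matches the stored snapshot (greedy descent on view difference).
    return min(((pos[0] + dx, pos[1] + dy) for dx, dy in MOVES),
               key=lambda p: view_difference(view(p), snapshot))

def follow_route(start, waypoints, match_threshold=0.5, max_steps=200):
    # Linked local navigation: home on each stored snapshot in turn,
    # switching to the next once the current view matches well enough.
    pos = start
    for snap in (view(w) for w in waypoints):
        for _ in range(max_steps):
            if view_difference(view(pos), snap) <= match_threshold:
                break
            pos = local_homing_step(pos, snap)
    return pos

# Follow a two-snapshot route from (0, 0) via (3, 4) to (7, 2).
final = follow_route((0, 0), [(3, 4), (7, 2)])
```

Each local homing controller only works within its snapshot's visual locale, but because consecutive waypoints lie within each other's locales, the chain extends the reachable range far beyond any single snapshot, which is the point of the linking framework.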

Item Type: Article
Schools and Departments: School of Engineering and Informatics > Informatics
Depositing User: Douglas Lincoln Smith
Date Deposited: 06 Feb 2012 19:07
Last Modified: 12 Apr 2012 13:20