Journal of Robotics
Volume 2012, Article ID 797063, 13 pages
Research Article

Visual Odometry through Appearance- and Feature-Based Method with Omnidirectional Images

Departamento de Ingeniería de Sistemas y Automática, Universidad Miguel Hernández, Avda. de la Universidad s/n, 03202 Elche (Alicante), Spain

Received 30 March 2012; Revised 16 July 2012; Accepted 31 July 2012

Academic Editor: Bojan Nemec

Copyright © 2012 David Valiente García et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


In the field of mobile autonomous robots, visual odometry entails the retrieval of the motion transformation between two consecutive poses of the robot solely by means of a camera sensor. Visual odometry provides essential information for trajectory estimation in problems such as localization and SLAM (Simultaneous Localization and Mapping). In this work we present a motion estimation method based on a single omnidirectional camera. We exploit the maximized horizontal field of view provided by this camera, which allows us to encode a large amount of scene information into a single image. The motion transformation between two poses is computed incrementally, since only the processing of two consecutive omnidirectional images is required. In particular, we exploit the versatility of the information gathered by omnidirectional images to implement both an appearance-based and a feature-based method for visual odometry. We carried out a set of experiments in real indoor environments to test the validity and suitability of both methods. The data used in the experiments consist of large sets of omnidirectional images captured along the robot's trajectory in three different real scenarios. Experimental results demonstrate the accuracy of the estimations and the capability of both methods to work in real time.
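The incremental scheme described in the abstract can be illustrated with a minimal sketch (not the authors' implementation): each relative transformation estimated between two consecutive images is composed onto the previous global pose to build the trajectory. The poses, relative motions, and numeric values below are hypothetical placeholders for such per-image estimates.

```python
import math

def compose(pose, delta):
    """Compose a global planar pose (x, y, theta) with a relative
    motion (dx, dy, dtheta) expressed in the robot's local frame."""
    x, y, th = pose
    dx, dy, dth = delta
    return (x + dx * math.cos(th) - dy * math.sin(th),
            y + dx * math.sin(th) + dy * math.cos(th),
            th + dth)

# Accumulate a trajectory from successive relative estimates,
# as visual odometry would produce one per consecutive image pair.
trajectory = [(0.0, 0.0, 0.0)]  # start at the origin
relative_motions = [(1.0, 0.0, math.pi / 2),   # 1 m forward, turn 90 deg
                    (1.0, 0.0, math.pi / 2)]   # 1 m forward, turn 90 deg
for delta in relative_motions:
    trajectory.append(compose(trajectory[-1], delta))
```

Because each estimate is chained onto the last, errors accumulate over time, which is why visual odometry is typically combined with localization or SLAM back-ends, as the abstract notes.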