Journal of Sensors
Volume 2017, Article ID 7342931, 14 pages
Research Article

Method for SLAM Based on Omnidirectional Vision: A Delayed-EKF Approach

Department of Computer Science, CUCEI, University of Guadalajara, Guadalajara, Mexico

Correspondence should be addressed to Rodrigo Munguía; rodrigo.munguia@upc.edu

Received 4 October 2016; Accepted 4 January 2017; Published 19 February 2017

Academic Editor: Luis Payá

Copyright © 2017 Rodrigo Munguía et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


This work presents a method for implementing a visual simultaneous localization and mapping (SLAM) system using omnidirectional vision data, with application to autonomous mobile robots. In SLAM, a mobile robot operates in an unknown environment, using only its on-board sensors to simultaneously build a map of its surroundings and track its position within that map. SLAM is perhaps one of the most fundamental problems in robotics: solving it is a prerequisite for building truly autonomous mobile robots. The visual sensor used in this work is an omnidirectional vision sensor, which provides a wide field of view, an advantage for a mobile robot performing autonomous navigation tasks. Since this sensor is monocular, a method to recover the depth of the observed features is required; to estimate the unknown depth, we propose a novel stochastic triangulation technique. The proposed system can be applied in indoor or cluttered environments to perform visual-based navigation when a GPS signal is not available. Experiments with synthetic and real data are presented in order to validate the proposal.
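As background for the delayed-EKF approach named in the title, the two core steps of any extended Kalman filter, prediction and measurement update, can be sketched as follows. This is a minimal, generic illustration only: the state, motion model, measurement model, and noise terms here are placeholders, not the paper's actual omnidirectional-camera formulation or its stochastic triangulation technique.

```python
import numpy as np

def ekf_predict(x, P, f, F, Q):
    """EKF prediction: propagate the state mean and covariance
    through the (possibly nonlinear) motion model f with Jacobian F."""
    x_pred = f(x)
    P_pred = F @ P @ F.T + Q  # process noise Q inflates uncertainty
    return x_pred, P_pred

def ekf_update(x, P, z, h, H, R):
    """EKF update: correct the state with a measurement z,
    given the measurement model h, its Jacobian H, and noise R."""
    y = z - h(x)                     # innovation
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Toy 1-D example: a stationary target measured directly.
# Uncertainty grows in prediction, then shrinks after the update.
x = np.array([0.0]); P = np.array([[1.0]])
f = lambda s: s; F = np.eye(1); Q = 0.01 * np.eye(1)
h = lambda s: s; H = np.eye(1); R = 0.1 * np.eye(1)
x, P = ekf_predict(x, P, f, F, Q)
x, P = ekf_update(x, P, np.array([0.5]), h, H, R)
```

In a delayed initialization scheme such as the one the abstract describes, a newly detected feature is not added to the state vector on its first observation; it is tracked until enough parallax has accumulated for its depth to be estimated, and only then inserted into the map with a well-conditioned covariance.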