ISRN Signal Processing
Volume 2011 (2011), Article ID 425621, 12 pages
http://dx.doi.org/10.5402/2011/425621
Research Article

Camera-Based Motion Recognition for Mobile Interaction

1Machine Vision Group, Department of Electrical and Information Engineering, University of Oulu, 90570 Oulu, Finland
2Centre for Vision, Speech and Signal Processing, Faculty of Engineering and Physical Sciences, University of Surrey, Surrey GU2 7XH, UK

Received 22 February 2011; Accepted 18 March 2011

Academic Editors: L.-L. Wang, N. Younan, and L. Zhang

Copyright © 2011 Jari Hannuksela et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Multiple built-in cameras and the small size of mobile phones are underexploited assets for creating novel applications that are ideal for pocket-sized devices but may not make much sense on laptops. In this paper, we present two vision-based methods for controlling mobile user interfaces through motion tracking and recognition. In the first, motion is extracted by estimating the movement of the device held in the user's hand. In the second, it is obtained by tracking the motion of the user's finger in front of the device. In both cases, sequences of motion are classified using Hidden Markov Models. The classification results are filtered using a likelihood ratio and the velocity entropy to reject possibly incorrect sequences. Our hypothesis is that incorrect measurements are characterised by a higher entropy of their velocity histogram, indicating more random movement by the user. We also show that the same filtering criteria can be used to control unsupervised Maximum A Posteriori adaptation. Experiments on a recognition task involving simple control gestures for mobile phones clearly demonstrate the potential of our approaches and may provide ingredients for new user interface designs.
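To illustrate the rejection step summarised above, the following is a minimal sketch, not the authors' implementation, of filtering an HMM classification result with a likelihood-ratio test combined with the entropy of the velocity histogram. The function names, the use of velocity magnitudes, the bin count, and both thresholds are illustrative assumptions rather than values taken from the paper.

```python
import numpy as np

def velocity_entropy(velocities, num_bins=16):
    """Shannon entropy of the velocity-magnitude histogram of a motion sequence.

    `velocities` is an (N, 2) array of per-frame motion vectors (vx, vy).
    Using magnitudes and 16 bins are illustrative choices, not taken from the paper.
    """
    magnitudes = np.linalg.norm(velocities, axis=1)
    hist, _ = np.histogram(magnitudes, bins=num_bins)
    p = hist / max(hist.sum(), 1)   # normalise to a probability distribution
    p = p[p > 0]                    # drop empty bins (0 * log 0 := 0)
    return -np.sum(p * np.log2(p))

def accept_sequence(loglik_best, loglik_second, velocities,
                    ratio_threshold=2.0, entropy_threshold=3.0):
    """Keep a sequence only if the best HMM is clearly more likely than the
    runner-up and the velocity histogram is not too random (low entropy).
    Both thresholds are placeholders to be tuned on validation data.
    """
    ratio_ok = (loglik_best - loglik_second) > np.log(ratio_threshold)
    entropy_ok = velocity_entropy(velocities) < entropy_threshold
    return ratio_ok and entropy_ok
```

In this sketch, the same accept/reject decision could also gate which sequences are fed to unsupervised MAP adaptation, mirroring the filtering role described in the abstract.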