Advances in Human-Computer Interaction
Volume 2012 (2012), Article ID 251384, 10 pages
http://dx.doi.org/10.1155/2012/251384
Research Article

Testing Two Tools for Multimodal Navigation

1The Interactive Institute, Acusticum 4, 941 28 Piteå, Sweden
2University of Oulu, PL 8000, Oulun Yliopisto, 90014 Oulu, Finland
3University of Lapland, P.O. Box 122, 96101 Rovaniemi, Finland

Received 27 December 2011; Accepted 18 May 2012

Academic Editor: Kiyoshi Kiyokawa

Copyright © 2012 Mats Liljedahl et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The latest smartphones, with GPS, electronic compasses, directional audio, touch screens, and so forth, hold potential for location-based services that are easier to use and that let users focus on their activities and the environment around them. Rather than interpreting maps, users can search for information by pointing in a direction, and database queries can be created from GPS location and compass data. Users can also be guided to locations through point and sweep gestures, spatial sound, and simple graphics. This paper describes two studies testing two applications with multimodal user interfaces for navigation and information retrieval. The applications allow users to search for information and get navigation support using combinations of point and sweep gestures, nonspeech audio, graphics, and text. The tests show that participants appreciated both applications for their ease of use and for letting them interact directly with the surrounding environment.
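The core of such a point-to-query interaction can be sketched as follows: the device's GPS gives the user's position, the compass gives a heading, and a point of interest matches the query if it lies within a narrow sector around that heading. This is a minimal illustrative sketch, not the authors' implementation; the function names and the ±15° sector width are assumptions chosen for the example.

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing, in degrees (0-360), from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def in_pointed_sector(user_lat, user_lon, heading, poi_lat, poi_lon, half_angle=15.0):
    """True if the POI lies within +/- half_angle degrees of the compass heading.

    The angular difference is wrapped so that, e.g., headings of 359 and 1
    degrees are treated as 2 degrees apart.
    """
    diff = abs(bearing_to(user_lat, user_lon, poi_lat, poi_lon) - heading)
    return min(diff, 360.0 - diff) <= half_angle
```

A location-based service would run `in_pointed_sector` as a filter over its database of points of interest, returning only those the user is currently pointing at.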