Research Article | Open Access
Ramiro Velázquez, Edwige Pissaloux, Aimé Lay-Ekuakille, "Tactile-Foot Stimulation Can Assist the Navigation of People with Visual Impairment", Applied Bionics and Biomechanics, vol. 2015, Article ID 798748, 9 pages, 2015. https://doi.org/10.1155/2015/798748
Tactile-Foot Stimulation Can Assist the Navigation of People with Visual Impairment
Background. Tactile interfaces that stimulate the plantar surface with vibrations could represent a step toward wearable, inconspicuous, unobtrusive, and inexpensive assistive devices for people with visual impairments. Objective. To study how people understand information through their feet and to maximize the capabilities of tactile-foot perception for assisting human navigation. Methods. Based on the physiology of the plantar surface, three prototypes of electronic tactile interfaces for the foot were developed. With important technological improvements between them, all three prototypes essentially consist of a set of vibrating actuators embedded in a foam shoe-insole. Perceptual experiments involving direction recognition and real-time navigation in space were conducted with a total of 60 voluntary subjects. Results. All three prototypes proved capable of transmitting tactile information that is easy and fast to understand. Average direction recognition rates were 76%, 88.3%, and 94.2% for subjects wearing the first, second, and third prototype, respectively. The third prototype, which exhibits significant advances in tactile-foot stimulation, was further evaluated in navigation tasks. Results show that subjects were capable of following directional instructions useful for navigating spaces. Conclusion. Footwear providing tactile stimulation can be considered for assisting the navigation of people with visual impairments.
Designing navigational assistive interfaces for people with visual impairment involves two major challenges. (1) Information must be presented in a nonvisual form and in a simple, intuitive, and fast to understand manner. (2) The interface must be lightweight and inconspicuous and ensure an autonomy of at least a regular walking journey.
Many navigational assistive interfaces for the visually impaired can be found in the literature. Many of them use an acoustic presentation method [1–3]. However, a major drawback of such systems is that acoustic feedback distracts users and interferes with their normal hearing activities. Other interfaces use tactile feedback precisely to avoid this auditory distraction [4–6]. A major drawback of touch stimulation systems is that the information displayed can be complex and slow to understand. Readers are referred to [7, 8] for two comprehensive and reasonably up-to-date surveys on electronic interfaces for the visually impaired.
Most of the interfaces that exploit tactile stimulation target the fingertips and palms. They are usually used in conjunction with the primary aids (white cane and guide dog). Unfortunately, because they require constant hand interaction, they quickly cause fatigue and user rejection.
In recent years, we have been exploring the feasibility of providing tactile feedback via the feet. We have developed three prototypes of tactile interfaces for the foot that allow hands-free interaction and that are worn rather than carried by the user. With important improvements between them, all three prototypes essentially consist of a set of actuators embedded in a foam shoe-insole that provide vibrotactile stimulation to the plantar surface. These inexpensive devices address the two aforementioned challenges of assistive interfaces: (1) they present vibrating patterns that are simple and fast to understand and (2) they can be inserted into a shoe, thus becoming inconspicuous, visually unnoticeable assistive devices.
Experimental perceptual studies with the first prototype have already been conducted with sighted and visually impaired users. Our results indicate that people do understand information displayed to the plantar surface of the foot. This paper briefly recalls those results and reports our progress in tactile-foot stimulation with two new, improved devices. In particular, the prototypes are evaluated and compared in direction rendering, which is the basis for real-world navigation.
The rest of the paper is organized as follows: Section 2 presents background information on tactile-foot stimulation. Section 3 introduces the first prototype developed and its evaluation in direction rendering. Section 4 presents a technologically improved second prototype and evaluates the pertinence of these improvements in the same task. Section 5 presents the third prototype and its evaluation and compares the results with those obtained with the two previous prototypes. Real-time navigation of test participants is shown to demonstrate the effectiveness of the third prototype. Finally, Section 6 concludes summarizing the paper’s main contributions as well as future work perspectives.
Tactile paving is the most representative example of a tactile-foot stimulation system. It consists of regularly textured ground surface indicators in the form of patterns of raised domes or bars. Its purpose is to assist visually impaired pedestrians with their navigation (bar pattern) and alert them of hazards or obstacles in the immediate location (dome pattern).
Literature addressing electronic interfaces that display tactile information to the feet can be found in several domains.
In man-machine interaction, tactile-foot stimulation can be found in certain user control interfaces: car pedals, fitness gear, dental equipment, musical instruments, and so forth. These provide vibrating cues to alert users of well-defined machine situations.
In human-computer interaction, tactile-foot stimulation is mostly oriented towards recreating virtual environments. Papetti et al. proposed a combined audio-tactile system that provides the sensation of walking over different grounds. Okamoto et al. developed a footstep display that recreates the sensation of stepping on fragile structures, such as paper, aluminum, and polypropylene. Jayakumar et al. introduced a haptic foot-based interface providing the sensation of mechanical impacts during virtual walking. Nordahl et al. explored the use of tactile-foot stimulation, together with visual and auditory stimuli, to generate vertical illusory self-motion during exposure to a virtual environment depicting an elevator.
In biomedical engineering, tactile-foot stimulation has been explored mostly in the control of human posture and standing balance [18, 19] as well as locomotion. To our knowledge, it has not yet been explored in assistive devices.
3. First Prototype
Based on the physiology of the plantar surface of the foot, which indicates that the great majority of the bioreceptors sensitive to vibrotactile stimulation are located on the medial and lateral areas, a first tactile interface consisting of a 16-point array of actuators was conceived (Figure 1(a)).
(a) Design concept. Inset: distribution of bioreceptors sensitive to vibrotactile stimulation in the foot sole. Receptive field sizes range from 5.8 to 333.6 mm2
(b) Prototype. This device was featured in IEEE Spectrum Online.
(c) Drive components all connected by cables
The prototype integrated all 16 actuators in an inexpensive commercial foam shoe-insole with 10 mm pin spacing (Figure 1(b)). The pins’ contact surface with the skin was 7 mm2, targeting the bioreceptors with the smallest receptive fields (Figure 1(a) inset). Actuators provided axial forces up to 13 mN and vibrating frequencies between 10 and 55 Hz. Each vibrator was independently controlled with a specific vibrating frequency command. This first prototype was meant to be used on the right foot and was controlled by a computer through an electronic unit. All subsystems were connected by cables (Figure 1(c)).
3.2.1. Study Participants and Experimental Procedure
Twenty undergraduate students (10 men and 10 women) at Panamericana University participated voluntarily in the experiment. All gave their consent in agreement with the university ethics guidelines. They were selected on availability alone. All participants were healthy, sighted, and had no known impairments in tactile sensory or cognitive functions. Their ages ranged from 18 to 24 years, with an average age of 20.5.
During the experiment, the subjects were seated wearing the tactile interface on the right foot (Figure 1(c)). For hygiene, all subjects were requested to wear socks. Subjects were naive to all aspects of the test and were given only general instructions concerning the task. A short familiarization time was granted prior to the tests. During this time, the subjects tried different vibration frequencies and chose a preferred one. All 20 subjects chose 55 Hz, the maximum vibration frequency of the actuators.
3.2.2. Direction Recognition Experiment
The purpose of this test was to determine whether the subjects could recognize directions with the first prototype.
Method. A dynamic straight line was presented to the 20 subjects. Four patterns were chosen: North N (a line moving from the last row to the first one), South S (the inverse), West W (a line moving from the last column to the first one), and East E (the inverse) (Figure 2). Note that these tactile inputs refer to relative directions where North is always associated with the direction the subject is heading. They do not necessarily imply actual cardinal directions.
A set of 14 directions was presented to the subjects in one trial: S-N-E-W-S-E-N-W-E-S-W-N-S-E. This set takes into account all possible transitions between directions. Subjects were asked to report the direction perceived with no time restriction. Upon request, they could have the vibrating pattern refreshed on the interface.
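The four sweep patterns can be sketched as successive on/off frames on the 4 × 4 actuator array. The following Python sketch is illustrative only; the frame representation and function name are ours, not part of the original control software.

```python
def line_frames(direction, size=4):
    """Yield successive size x size on/off frames for a sweeping line.

    N: line moves from the last row toward the first; S is the inverse.
    W: line moves from the last column toward the first; E is the inverse.
    A value of 1 means the actuator at that grid position vibrates.
    """
    frames = []
    for step in range(size):
        frame = [[0] * size for _ in range(size)]
        if direction == "N":            # row sweep, last row to first
            frame[size - 1 - step] = [1] * size
        elif direction == "S":          # row sweep, first row to last
            frame[step] = [1] * size
        elif direction == "W":          # column sweep, last column to first
            for r in range(size):
                frame[r][size - 1 - step] = 1
        elif direction == "E":          # column sweep, first column to last
            for r in range(size):
                frame[r][step] = 1
        frames.append(frame)
    return frames
```

Each direction thus takes four frames to render, one per row or column of the array.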
Results. Subjects required 8 to 10 min of training to get used to the patterns. The results are presented in the confusion matrix in Table 1. The average recognition rates were 71.92%, 71.05%, 80.24%, and 80.7% for N, S, W, and E, respectively. The average accuracy of the confusion matrix (Table 1) is 76%, an overall good performance.
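The reported average accuracy is, to rounding, the mean of the four per-direction recognition rates; a quick consistency check against the Table 1 values (assuming an unweighted mean):

```python
# Per-direction recognition rates reported for prototype 1 (Table 1), in %.
rates = {"N": 71.92, "S": 71.05, "W": 80.24, "E": 80.7}

# Unweighted mean over the four directions.
average = sum(rates.values()) / len(rates)
print(round(average, 2))   # 75.98, reported as 76%
```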
During the test, 12 of the 20 subjects requested to have at least one direction of the set refreshed. We noticed that users felt the vibrations correctly but were sometimes unable to locate them precisely on the foot sole.
4. Second Prototype
With the aim of improving user perception, a second prototype was envisaged (Figure 3(a)). Note that this design reduced the number of vibrators from 16 to 4 while still stimulating the same plantar areas: medial and lateral.
(c) Fully wearable device with wireless connection
This design modification addressed the observations made while testing prototype 1: people do understand information displayed on the plantar surface of the foot. However, the foot is not capable of precise discrimination; that is, people cannot accurately distinguish which actuator is actually vibrating. From a technological point of view, it is therefore not worthwhile to integrate a large number of actuators in the interface; the work must instead be done at the tactile rendering level.
Figure 3(b) shows the second prototype of tactile interface for the foot. Vibrators were arranged in a diamond-like shape with 35 mm side-length. Their characteristics are identical to those used in the previous prototype (13 mN, 10–55 Hz). However, the pins’ contact surface with the foot-sole was increased to 133 mm2 to enclose more small receptive field size bioreceptors and stimulate them together with medium receptive field size bioreceptors. This design was meant to be used on the left foot.
This device is completely wearable. Figure 3(c) illustrates how it is worn by a user. The battery, RF (radio-frequency) transmission module, and control circuitry are all embedded in an electronic module that the user carries comfortably attached to the ankle. Experimental tests showed 6 h of continuous operation and a 100 m communication range with a computer.
4.2.1. Study Participants and Experimental Procedure
A group of 20 undergraduate students (14 men and 6 women) were invited to participate in the experiment. None of them had previously participated in evaluating prototype 1. All subjects were healthy, sighted, and had no known impairments in tactile sensory or cognitive functions. Their ages ranged from 19 to 23 years, with an average age of 20.2. All gave their consent in agreement with the university ethics guidelines.
The same procedure was followed: subjects were seated wearing socks, with the interface now on the left foot. They had no prior knowledge of any aspect of the experiment. A short familiarization time with the interface was granted. Vibrating actuators worked at 55 Hz.
4.2.2. Direction Recognition Experiment
The purpose of this test was to determine whether the subjects could obtain better recognition rates upon the use of the second prototype and new vibrating patterns.
Method. Each of the four contact pins of the interface was set to represent a cardinal point (again, not necessarily the actual one). A direction is encoded in a five-step sequence as follows: three consecutive short vibrations in the corresponding contact pin, then a short vibration in the opposite contact pin, and again a short vibration in the correct contact pin.
Figure 4 shows, for example, the codification for North. Note that the contact pin N vibrates three times, then S once, and again N. The same set of 14 directions was presented to the subjects in one trial. All 20 subjects were asked to report the direction perceived with no time restriction. Upon request, they could have the direction pattern refreshed on the interface.
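The five-step code can be sketched in a few lines of Python (names are illustrative; the actual controller implementation is not described in the paper):

```python
# Opposing-pin pairs on the 4-pin interface.
OPPOSITE = {"N": "S", "S": "N", "E": "W", "W": "E"}

def direction_code(direction):
    """Return the ordered list of pins to pulse for a direction:
    three pulses on the target pin, one on the opposite pin,
    then a final pulse on the target pin again."""
    opp = OPPOSITE[direction]
    return [direction] * 3 + [opp] + [direction]

print(direction_code("N"))   # ['N', 'N', 'N', 'S', 'N'], as in Figure 4
```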
Results. Subjects required 3 to 6 min of training to get used to the patterns. The results are presented in the confusion matrix in Table 2. The average recognition rates were 91.65%, 91.25%, 78.75%, and 91.65% for N, S, W, and E, respectively. The average accuracy of the confusion matrix (Table 2) is 88.3%. Note that recognition rates improved for N, S, and E, while for W it was practically the same as with prototype 1.
The tactile patterns used with the second prototype are easier to recognize: the opposite direction is now also displayed, providing a reliable reference for identifying the points of vibration. As the recognition rates show, displaying both the correct and the opposite direction in the same tactile pattern eases recognition and does not confuse the user.
5. Third Prototype
Figure 5(a) shows the conceptual representation of the third prototype. The only modification is that contact pin S now stimulates the tibial plantar area (Figure 1(a) inset). The reason for this modification can be deduced from the results in Table 2: for W, most incorrect answers point to N and S, which suggests that these pins are too close to pin W for precise discrimination. Figure 5(b) shows the prototype developed. Its characteristics are identical to those of prototype 2. This version is also wearable and is again meant for the right foot.
5.2.1. Study Participants and Experimental Procedure
A new group of 20 undergraduate students (15 men and 5 women) participated in the experiment. None of them had tried any of the two previous prototypes. All gave their consent in agreement with the university ethics guidelines. Subjects were healthy sighted with no known impairments in tactile sensory or cognitive functions. Their ages ranged from 20 to 22 years with an average age of 20.8.
The same procedure was followed: subjects were seated wearing socks, with the interface on the right foot. They had no prior knowledge of any aspect of the experiment. A short familiarization time with the interface was granted prior to the test. Vibrations were at 55 Hz.
5.2.2. Direction Recognition Experiment
The purpose of this test was to determine whether the subjects could obtain even better recognition rates using the third prototype.
Method. The same encoding scheme was tested with prototype 3: three consecutive short vibrations in the corresponding contact pin, then a short vibration in the opposite contact pin, and again a short vibration in the correct contact pin (Figure 4).
As with the two previous prototypes, the set of 14 directions was presented in one trial. Subjects were asked to report the direction perceived with no time restriction. Upon request, they could have the direction pattern refreshed on the interface.
Results. Subjects required 3 to 5 min of training to get used to the patterns. The results are presented in the confusion matrix in Table 3. The average recognition rates were 100%, 97.78%, 88.89%, and 90% for N, S, W, and E, respectively. The average accuracy of the confusion matrix (Table 3) is 94.2%. Note that recognition rates improved for N, S, and W, while for E it was practically the same as with prototype 2. Moving pin S to the tibial area greatly improved user perception.
Figure 6 compares the results obtained between prototypes and tactile patterns. Note that both technological and rendering improvements led progressively to better recognition rates.
5.2.3. Navigation in Space
This experiment aims to determine whether the subjects could actually navigate in a structured environment using the directions provided by the third prototype.
Method. A camera-based tracking platform was set for this experiment. It consisted of a camera placed 4 m above the ground surface that recorded RGB video. The acquired video was later processed in a PC for subject tracking.
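The paper does not detail the tracking algorithm; as a rough, hypothetical sketch, each video frame can be reduced to a binary mask of "subject" pixels (e.g., by color thresholding, not shown) and the subject's position taken as the centroid of that mask:

```python
def centroid(mask):
    """Return the (row, col) centroid of the 1-pixels in a binary mask,
    or None if the mask is empty (subject not visible in this frame)."""
    rows = cols = count = 0
    for r, line in enumerate(mask):
        for c, px in enumerate(line):
            if px:
                rows += r
                cols += c
                count += 1
    if count == 0:
        return None
    return (rows / count, cols / count)

# A trajectory is then the sequence of centroids over successive frames.
frame = [[0, 0, 0],
         [0, 1, 1],
         [0, 1, 1]]
print(centroid(frame))   # (1.5, 1.5)
```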
Fifteen (13 men and 2 women) of the 20 subjects of the last group remained available three weeks after the direction recognition test and were invited to participate in this experiment. They had already tried prototype 3 and were familiar with the directional patterns.
The following intuitive protocol was used: North for moving forward, South for moving backward, West for turning left, and East for turning right. A fifth pattern consisting of two consecutive short vibrations, a pause, and then two more consecutive short vibrations (the typical SMS alert pattern in mobile phones) was used to indicate stop. Directions were provided by a computer located outside the navigation environment.
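Because the commands are relative to the walker's heading (Section 3.2.2), a turn changes heading without advancing. The protocol can be illustrated with a toy grid simulator, which is entirely hypothetical: the study used human walkers, not software agents.

```python
# Headings as (dx, dy) unit vectors: index 0 = north, 1 = east,
# 2 = south, 3 = west on the grid.
HEADINGS = [(0, 1), (1, 0), (0, -1), (-1, 0)]

def walk(commands, pos=(0, 0), heading=0):
    """Apply a sequence of tactile commands and return the final
    (position, heading index). N = step forward, S = step backward,
    W = turn left in place, E = turn right in place."""
    x, y = pos
    for cmd in commands:
        if cmd == "N":
            dx, dy = HEADINGS[heading]
            x, y = x + dx, y + dy
        elif cmd == "S":
            dx, dy = HEADINGS[heading]
            x, y = x - dx, y - dy
        elif cmd == "E":
            heading = (heading + 1) % 4
        elif cmd == "W":
            heading = (heading - 1) % 4
        # "STOP" halts motion; it changes no state in this sketch.
    return (x, y), heading

# Forward, turn right, forward: ends one cell north and one cell east,
# facing east.
print(walk(["N", "E", "N"]))   # ((1, 1), 1)
```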
During the test, subjects were blindfolded so that no cue from sight could be obtained. Four different navigational environments (E-1 to E-4) were proposed to the subjects who were totally naive about their structure prior to and during the test.
Subjects were asked to move according to the pattern felt. They had no time restriction to complete the test and, upon request, they could have the directional instruction refreshed on the interface. The navigation time was recorded for each participant.
Results. All subjects were capable of following the navigational instructions and successfully completed the task. Figure 7 shows a representative example of subject performance in each proposed environment. In E-1, E-2, and E-4, subjects made no errors following the instructions. In E-3, one subject mistook E for S and moved backward instead of turning right before being corrected by the computer. Note that, without a visual reference, most sighted subjects fail to walk in a straight line: in E-2, despite following the instructions correctly, one subject bumped into the obstacles once.
Figure 8 compares the navigation times for the four environments. Each environment was tested by four subjects (only one subject participated twice, navigating E-1 and E-4). Median navigation times were 76, 75, 115, and 149 s for E-1, E-2, E-3, and E-4, respectively. Note that E-1 and E-2 involve the same number of directions and were completed in similar times. The structures of E-3 and E-4 were more complex and required longer navigation times.
Results are undoubtedly encouraging: they suggest that it is feasible to exploit tactile-foot stimulation for directional navigation in space. However, the navigation times reported in Figure 8 may raise questions about the practicality of the approach. Note that the proposed environments have a maze-like structure and require a large number of navigational instructions to complete: 13 instructions were provided to navigate E-1 and E-2, 16 for E-3, and 19 for E-4 (example for E-1: forward - stop - turn left - forward - stop - turn right - forward - stop - turn right - forward - stop - turn left - forward). Understanding this amount of information and acting upon it in the reported times seems reasonable. Testing with sighted subjects who have no experience in nonvisual travel must also be considered: they move more slowly and with some apprehension. Outdoor navigation by visually impaired users is expected to be much faster: fewer instructions (only when needed) and natural, fearless walking.
This paper has reported our work and progress on wearable electronic interfaces for the foot. Using vibrating motors, three simple, low-cost designs that are easy and fast to assemble and maintain have been proposed to stimulate the plantar surface and transmit tactile information to the user.
The paper also presented the evolution and technological improvements between designs and prototypes. Lessons learned from the two previous prototypes led to the design of a third optimized version that exhibits significant advances in tactile-foot stimulation.
The pertinence of these advances was evaluated through a perceptual experiment involving direction recognition. The rates show that people understand directional information more easily when the medial, lateral, and tibial plantar areas are stimulated and when the patterns provide reference points rather than motion.
Real-time navigation in space based on directions was also evaluated. Results show that people are capable of navigating environments using the directions provided by our tactile interfaces for the foot.
The potential of tactile-foot stimulation and vibrotactile interfaces for the foot is particularly attractive for assisting people with visual impairment: an unobtrusive, inconspicuous, inexpensive, and fully wearable device that provides intuitive navigational information. Such a system is not intended to replace the primary aids but to enhance independent travel by providing directions useful for reaching a destination more easily and quickly. Our future work aims to integrate the third interface with commercially available GPS devices, opening the possibility of testing in outdoor scenarios.
An important concern in tactile-foot stimulation is cognitive load. Many subjects reported that the vibrating patterns were easy and intuitive but required a certain level of concentration to be fully and quickly understood. This could be a limitation for real-life outdoor use: (1) a continuous state of concentration will eventually fatigue users and (2) a number of environmental variables will surely distract the user from the device. Perception and navigational decisions may be affected or slowed down.
Current work focuses on the design and use of new tactile patterns that represent more complex ideas than directions alone. These patterns are intended to provide situational awareness assistance during navigation. Figure 9 illustrates this concept. A blindfolded subject navigates a structured environment as described in Section 5.2.3. At point A, the interface displays “obstacle to your left” and “chair to your right” (Figure 9(a)). The subject acts accordingly (Figure 9(b)).
(a) Navigation in a structured environment. The broken yellow line represents the trajectory followed by the test subject
(b) Action upon assistance provided by the interface
Perceptual studies with voluntary subjects who are visually impaired have already been conducted with our first prototype. Results showed that tactile-foot feedback seems easier for this population to understand. Even better results can be expected with the second and third optimized prototypes presented in this paper.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
References
[1] J. R. Marston, J. M. Loomis, R. L. Klatzky, and R. G. Golledge, “Nonvisual route following with guidance from a simple haptic or auditory display,” Journal of Visual Impairment & Blindness, vol. 101, no. 4, pp. 203–211, 2007.
[2] L. A. Guerrero, F. Vasquez, and S. F. Ochoa, “An indoor navigation system for the visually impaired,” Sensors, vol. 12, no. 6, pp. 8236–8258, 2012.
[3] S. Holland, D. R. Morse, and H. Gedenryd, “AudioGPS: spatial audio navigation with a minimal attention interface,” Personal and Ubiquitous Computing, vol. 6, no. 4, pp. 253–259, 2002.
[4] W. Heuten, N. Henze, S. Boll, and M. Pielot, “Tactile wayfinder: a non-visual support system for wayfinding,” in Proceedings of the 5th Nordic Conference on Human-Computer Interaction: Building Bridges (NordiCHI '08), pp. 172–181, Lund, Sweden, October 2008.
[5] S. Schätzle, T. Ende, T. Wüsthoff, and C. Preusche, “VibroTac: an ergonomic and versatile usable vibrotactile feedback device,” in Proceedings of the IEEE International Symposium on Robot and Human Interactive Communication, pp. 670–675, September 2010.
[6] J. S. Zelek, S. Bromley, D. Asmar, and D. Thompson, “A haptic glove as a tactile-vision sensory substitution for wayfinding,” Journal of Visual Impairment & Blindness, vol. 97, no. 10, pp. 621–632, 2003.
[7] B. Andò, “Electronic sensory systems for the visually impaired,” IEEE Instrumentation and Measurement Magazine, vol. 6, no. 2, pp. 62–67, 2003.
[8] R. Velázquez, “Wearable assistive devices for the blind,” in Wearable and Autonomous Biomedical Devices and Systems for Smart Environment: Issues and Characterization, A. L. Ekuakille and S. C. Mukhopadhyay, Eds., vol. 75 of Lecture Notes in Electrical Engineering, pp. 331–349, Springer, Berlin, Germany, 2010.
[9] R. Velázquez, O. Bazán, J. Varona, C. Delgado-Mata, and C. A. Gutiérrez, “Insights into the capabilities of tactile-foot perception,” International Journal of Advanced Robotic Systems, vol. 9, pp. 1–11, 2012.
[10] M. Mulder, M. van Paassen, S. Kitazaki, S. Hijikata, and E. Boer, “Car-following support with haptic gas pedal feedback,” in Proceedings of the IFAC Symposium on Analysis, Design, and Evaluation of Human-Machine Systems, 2004.
[11] D. Bial, T. Appelmann, E. Rukzio, and A. Schmidt, “Improving cyclists training with tactile feedback on feet,” in Haptic and Audio Interaction Design: Proceedings of the 7th International Conference, HAID 2012, Lund, Sweden, August 23-24, 2012, vol. 7468 of Lecture Notes in Computer Science, pp. 41–50, Springer, Berlin, Germany, 2012.
[12] K. Lint, J. Witmer, and J. Reagan, “System including a wireless dental instrument and universal wireless foot controller,” International Patent WO 2011130236 A1, 2011.
[13] T. Michailidis and S. Berweck, “Tactile feedback tool: approaching the foot pedal problem in live electronic music,” in Proceedings of the International Computer Music Conference (ICMC '11), pp. 661–664, 2011.
[14] S. Papetti, F. Fontana, M. Civolani, A. Berrezag, and V. Hayward, “Audio-tactile display of ground properties using interactive shoes,” in Haptic and Audio Interaction Design: Proceedings of the 5th International Workshop, HAID 2010, Copenhagen, Denmark, September 16-17, 2010, vol. 6306 of Lecture Notes in Computer Science, pp. 117–128, 2010.
[15] S. Okamoto, S. Ishikawa, H. Nagano, and Y. Yamada, “Spectrum-based vibrotactile footstep-display for crinkle of fragile structures,” in Proceedings of the IEEE International Conference on Robotics and Biomimetics, pp. 2459–2464, 2011.
[16] R. P. Jayakumar, S. K. Mishra, J. F. Dannenhoffer, and A. M. Okamura, “Haptic footstep display,” in Proceedings of the IEEE Haptics Symposium (HAPTICS '12), pp. 425–430, IEEE, Vancouver, Canada, March 2012.
[17] R. Nordahl, N. C. Nilsson, L. Turchet, and S. Serafin, “Vertical illusory self-motion through haptic stimulation of the feet,” in Proceedings of the IEEE Virtual Reality Workshop on Perceptual Illusions in Virtual Environments (PIVE '12), pp. 21–26, March 2012.
[18] H. Asai, K. Fujiwara, H. Toyama, T. Yamashina, K. Tachino, and I. Nara, “The influence of foot soles cooling on standing postural control analyzed by tracking the center of foot pressure,” in Posture and Gait: Control Mechanisms, vol. 2, pp. 151–154, 1992.
[19] C. Maurer, T. Mergner, B. Bolha, and F. Hlavacka, “Human balance control during cutaneous stimulation of the plantar soles,” Neuroscience Letters, vol. 302, no. 1, pp. 45–48, 2001.
[20] S. Gravano, Y. P. Ivanenko, G. Maccioni, V. Macellari, R. E. Poppele, and F. Lacquaniti, “A novel approach to mechanical foot stimulation during human locomotion under body weight support,” Human Movement Science, vol. 30, no. 2, pp. 352–367, 2011.
[21] P. M. Kennedy and J. T. Inglis, “Distribution and behaviour of glabrous cutaneous receptors in the human foot sole,” Journal of Physiology, vol. 538, no. 3, pp. 995–1002, 2002.
[22] A. Corley, “Navigation by the soles of your feet,” IEEE Spectrum Online, 2014, http://spectrum.ieee.org/robotics/home-robots/navigation-by-the-soles-of-your-feet-and-the-seat-of-your-pants.
Copyright © 2015 Ramiro Velázquez et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.