Advances in Human-Computer Interaction
Volume 2009 (2009), Article ID 901707, 6 pages
Research Article

CyARM: Haptic Sensing Device for Spatial Localization on Basis of Exploration by Arms

1Department of Info.Sys.Eng., Kanazawa University, Kakuma, Kanazawa, Ishikawa 920-1192, Japan
2International Young Researcher Empowerment Center, Shinshu University, 3-15-1 Tokida, Ueda, Nagano 386-8567, Japan
3Department of Media Architecture, 116-2 Kamedanakano, Hakodate, Hokkaido 041-8655, Japan
4Graduate School of Information Science and Technology, Hokkaido University, Kita 14, Nishi 9, Kita-ku, Sapporo, Hokkaido 060-0814, Japan

Received 10 June 2009; Accepted 25 October 2009

Academic Editor: Sunil K. Agrawal

Copyright © 2009 Junichi Akita et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


We introduce a new type of perception aid device based on the user's exploration action, named CyARM (an acronym of “Cyber Arm”). The user holds this device in one hand; the extension of the arm is controlled by tension in a wire attached to the user's body according to the distance to the object. This user interface has the unique characteristic of giving users the illusion of an imaginary arm that extends to existing objects. We describe the implementations of CyARM and two experiments investigating its efficiency and effectiveness. The results confirm that CyARM can be used to recognize the presence of an object in front of the user and to perceive the relative distance to the object.

1. Introduction

We as humans can avoid obstacles while walking or stop along our way when blooming flowers catch our attention. Our brain and sensory organs bear a large part of the responsibility for controlling such spontaneous behaviors [1]. Here, a question arises: are we able to perceive the surrounding environment as it really is? For example, insects perceive their environment with compound eyes, while bats do so with an ultrasonic transmitter and receptor. One can therefore say that all animals have developed sensory organs and methods of environmental perception uniquely suited to their ways of living, and that they consequently perceive the same environment differently.

The sensory organs of humans can be roughly divided into the following two groups [2]:

(i) peripheral receptors that detect faraway objects, for example, the eyes, ears, or nose;
(ii) contiguous receptors that detect nearby surroundings through sensations received through the skin, membranes, or muscles, that is, the tactile senses.

However, these categories are not strict; for example, although radiant heat is not generally perceived via a tactile receptor, this receptor can interpret that stimulus in some cases. Therefore, a person may perceive a given stimulus using a variety of receptors; in other words, some sensory organs can compensate for others. For example, visually impaired people develop their auditory functions to perceive space in their environment using the relationship between direct and reflected sound waves. We believe it is also possible to use organs that are not originally sensory, and/or our physical actions, as a new kind of sensory organ.

In our study, we focused on physical action as a way of compensating for the sensory organs, and here we describe a new interactive device, called CyARM (an abbreviation of “Cyber Arm”), that is designed to provide users with a unique and intuitive interface for perceiving their living space. In this paper, Section 2 describes the concept of CyARM, and Section 3 describes its actual implementation. Sections 4 and 5 describe the experiments conducted to evaluate CyARM, and Section 6 presents the discussion and conclusions.

2. Concept of CyARM

2.1. Conventional Visual Aid Devices

Currently, numerous visual aid devices have been developed, especially devices for the visually impaired that replace the white cane. These devices can be divided into two groups with respect to the modality of their user interfaces. The first group uses auditory signals. Specifically, information about the surrounding environment is gathered by ultrasonic sensors and transformed into audible sounds that users can hear (e.g., a high-pitched sound means the object is close). For example, Sonicguide [3] and TriSensor (KASPA) are devices that generate an audible sound depending on the distance to the object measured by an ultrasonic distance sensor. The second group exploits the tactile modality. Information obtained by distance sensors is conveyed to the user through tactile stimuli such as vibrations. Some of these methods have already been applied in commercial products.

Up to now, these devices have had some critical problems. For example, the devices in the first group, which transform distance information into audible sounds, force users to create and use a kind of mental mapping between distance and sound pitch, and this can cause cognitive or mental distress. Moreover, these audible sounds sometimes mask external sounds that are quite important for blind persons in comprehending their current situation; blind users have reported that such devices cannot be used in crowded environments. The devices in the second group also force the user to create a mental mapping between distance and vibration frequency, and again users can have difficulty determining the exact distance from the vibration. In addition, continuous exposure to these vibrations decreases the users' sensitivity to them.

2.2. Concept

A great deal of research in sensor technology has been undertaken in recent years. If this sophisticated technology was successfully coupled with new interface methods, it could be used to create sensing devices that would aid users in perceiving their surroundings. Since humans cannot intuitively recognize artificial signals (e.g., the meanings of vibrations mentioned in the previous section), proposing an intuitive method of transforming such signals into sensory information is essential to make these signals easy for users to understand.

We have designed a new sensory aid device named CyARM (an abbreviation of “Cyber Arm”) that allows users to perceive distance and other spatial information without interference from natural sound sources and without cognitive or mental load [4–7]. Suppose that you tried walking with your eyes closed: you would attempt to investigate your environment by extending your arms in front of you. This kind of behavior is similar to the function of insects' antennae or the white cane of the visually impaired. Similar to the use of those antennae or canes, when the extended arm touches an object, one bends one's arm at the elbow and stops exploring. On the other hand, if no object is found in front of one, the arm naturally extends, as illustrated in Figure 1. Here, the physical motions of bending and extending the arm can be considered a kind of sensory receptor.

Figure 1: Metaphor of physical arm motion. The position of the arm in the user's exploration (extending) action corresponds to the distance to the object.

CyARM was developed by focusing on this intuitive metaphor of using physical arm motion as a sensory receptor. Specifically, we assume that users grasp CyARM in the hand and investigate a desired direction by pointing the device, as shown in Figure 2. CyARM is connected to the user's body by a wire and measures the distance to an object with ultrasonic waves. The tension of the wire is controlled according to the measured distance. If the object is a short distance away, CyARM pulls the wire tight so that users feel stronger tension and their arms are forced to bend; this indicates that the object can be reached by just extending the arm. On the other hand, if the object is far away, CyARM gives just enough slack to the wire that users can extend their arm and feel almost no tension; this means that the object cannot be reached. In this way, users can search for objects in any direction by holding this device. There have been many studies on force-representation devices using wire tension in the area of virtual reality (VR), such as the SPIDER system [8–10]. These devices aim to present forces according to information from a controller, whereas the operation of CyARM is directly based on the user's exploration action.
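The distance-to-wire mapping described above can be sketched as a small function. This is a minimal illustration, not the authors' implementation; the function name and the clamping behavior are our assumptions, while the coefficient and sensing range are taken from the prototypes described later in the paper.

```python
def wire_length(distance_m, k=0.27, min_d=0.3, max_d=3.0):
    """Map a measured distance (meters) to the length of wire released.

    A short distance leaves little slack, so the user's arm is forced to
    bend; a long distance releases enough wire that the arm extends
    freely. Linear mapping and clamping are assumptions for illustration.
    """
    d = min(max(distance_m, min_d), max_d)  # clamp to the sensor's range
    return k * d                            # linear distance-to-wire mapping

# A nearby object (0.5 m) yields a short wire: strong tension, bent arm.
# A distant object (2.5 m) yields a long wire: slack, extended arm.
near = wire_length(0.5)
far = wire_length(2.5)
```

Under this sketch, the perceived arm posture itself encodes the distance, which is exactly the metaphor of Figure 1.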

Figure 2: Concept of CyARM

3. Implementation

3.1. First CyARM Prototype

The mechanical architecture of the first CyARM prototype is illustrated in Figure 3. Ultrasonic sensors measure the distance to an object (the measurement range is from 0.3 to 3.0 m), and a geared motor releases or tightens the wire. The wire position is regulated by a proportional (P) gain. When an object is found by the ultrasonic sensors, the wire is rewound to the appropriate position determined from the measured distance. When users attempt to extend their arms, the device detects a slight displacement of the reel caused by the wire tension and regulates the rewinding of the wire accordingly.
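The proportional regulation of the reel position can be sketched as a simple control step. The gain value and the function shape here are assumptions; the paper states only that the wire position is regulated by a P gain at a 20 Hz cycle and that retraction speed is capped near 1.0 m/s (figures given later in this section).

```python
def p_control_step(target_len, current_len, kp=4.0, max_speed=1.0, dt=0.05):
    """One 50 ms control step for the reel position (proportional gain only).

    kp is a hypothetical gain; max_speed and dt follow the prototype's
    stated retraction-speed cap and 20 Hz measurement cycle.
    """
    error = target_len - current_len              # positive: release more wire
    speed = max(-max_speed, min(max_speed, kp * error))  # saturate motor speed
    return current_len + speed * dt               # reel position after dt

# Repeated steps converge the wire length toward the target position.
length = 0.0
for _ in range(100):
    length = p_control_step(0.4, length)
```

Because arm motions are slow relative to the 50 ms cycle, even this simple proportional loop tracks the target position stably.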

Figure 3: Mechanical architecture of first prototype of CyARM.

Figure 4 is a photograph of the first CyARM prototype we developed. Ultrasonic sensors were placed on the front of its body to facilitate easy aiming. A hook was attached near the wire release site, allowing the device to be attached to the user's body (e.g., on a belt loop). This prototype weighs 500 grams and its dimensions are 15 cm in height, 10 cm in width, and 10 cm in depth. The distance to the object is measured every 50 ms (i.e., at a measuring frequency of 20 Hz), and the wire position is controlled in the same period. Since the motions of users' arms are quite slow relative to this measurement cycle, the wire can be controlled stably. The maximum speed of wire retraction is approximately 1.0 m/s. The coefficient of wire length against measured distance is currently configured as 0.17 (50 mm of wire against a distance of 300 mm).

Figure 4: First developed prototype of CyARM.
3.2. Second CyARM Prototype

The first CyARM prototype was too heavy to be used continuously for long periods. Therefore, we developed a second prototype to reduce the weight of the part that users carry in their hands. Figure 5 shows the architecture of the second CyARM prototype. The distance sensor and the motor were separated so that the hand-held component contains only the distance sensor. The other parts, for example, the reel, motor, controllers, and battery, are placed in a second component that users carry on their bodies. These two components are connected by the retraction wire and by signal cables that transmit the measured distance and supply power to the sensor circuitry.

Figure 5: Mechanical architecture of the second prototype of CyARM.

Figure 6 is a photograph of the second CyARM prototype, and Figure 7 shows the device in practical use. The basic functions are identical to those of the first prototype. We used a brushless motor to reduce free torque; this effectively reduced the force required when the users extend the wire by extending their arms.

Figure 6: Second developed prototype of CyARM.
Figure 7: Practical use of CyARM.

The coefficient of wire length against measured distance is currently configured as 0.27 (27 mm of wire against a distance of 100 mm). This coefficient is expected to depend greatly on the user's body size and should be configured at first use according to that user's body size and the purpose for which the device is used.
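The per-user configuration step suggested above could be sketched as follows. This calibration formula is our assumption, not the authors' procedure: it simply derives the coefficient from the wire travel a user's arm can comfortably cover and the farthest distance the device should represent.

```python
def calibrate_coefficient(max_wire_travel_m, max_distance_m):
    """Hypothetical calibration: derive the wire-length coefficient from
    a user's comfortable arm travel and the farthest represented distance.
    """
    return max_wire_travel_m / max_distance_m

# Second prototype setting: 27 mm of wire per 100 mm of distance -> 0.27.
k = calibrate_coefficient(0.027, 0.100)
```

A taller user with a longer reach would obtain a larger coefficient, so the full bend-to-extend range of the arm maps onto the same sensing range.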

In the next sections, we describe two experiments to investigate the efficiency and effectiveness of the second prototype of the CyARM. The first experiment was to confirm its accuracy in detecting the presence of an object, and the second was to confirm the accuracy of perceiving the distance to an object by using CyARM.

4. Experiment  1: Target Presence Detection

4.1. Overview

Four sighted persons with blindfolds and one completely blind person participated in this experiment. First, an experimenter gave the participants brief instructions about CyARM, for example, the concept of the device and how to use it. After the instructions, they practiced with CyARM for a few minutes. All participants were then asked to stand and hold CyARM while wearing headphones playing white noise to prevent them from hearing external sounds, and the four sighted participants were blindfolded during the experiment.

The participants' task was to determine, using CyARM, whether a static object was present in front of them. A whiteboard, one meter wide and two meters high, was placed as the static object at a distance of about two meters in front of the participants. Participants were asked to report whether they felt the presence or absence of the object. Each participant experienced 20 trials; in 10 of the 20 trials the object was present, while in the other 10 it was not, and the sequence of presence and absence was randomized. Each trial took less than about 10 seconds, so each participant spent about 40 minutes on this experiment, including receiving the instructions and practicing with CyARM.

4.2. Results

The experimental results are summarized in Table 1. The mean percentage of successful detection among the five participants was 90.0% for the object's presence and 96.0% for its absence. The results of a chi-square test showed that the participants succeeded in detecting the presence or absence of the object ( , ). The percentage of successful detection was almost constant across each participant's trials after the practice conducted prior to the experiment.
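The paper's chi-square values did not survive in this copy, but the shape of such a test can be illustrated from the summary figures. The counts below are reconstructed from the reported percentages and trial numbers (5 participants, 20 trials each, half presence and half absence) and serve only as a sketch of testing detection accuracy against a 50% chance level.

```python
def chi_square_vs_chance(correct, total):
    """Chi-square statistic (1 degree of freedom) for correct/incorrect
    counts against a 50% chance level."""
    expected = total / 2.0
    incorrect = total - correct
    return ((correct - expected) ** 2 / expected
            + (incorrect - expected) ** 2 / expected)

# 90% correct on 50 presence trials (45) and 96% on 50 absence trials (48)
chi2 = chi_square_vs_chance(45 + 48, 100)
```

A statistic of this size far exceeds the 1-d.o.f. critical value of 3.84 at p = .05, which is consistent with the paper's conclusion that detection was well above chance.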

Table 1: Results of experiment for recognizing the existence of objects (with standard deviations in parentheses).

Although we could not statistically analyze the difference in detection rates between the sighted and blind participants due to the limited number of participants, there seemed to be no difference between them. The results thus suggest that CyARM is an efficient device for detecting the presence of objects near users.

5. Experiment  2: Perceiving the Distance

5.1. Overview

Next, we conducted an experiment to evaluate the efficacy of CyARM for perceiving the distance to an object. Twenty-three sighted persons (seventeen men and six women) who had not participated in Experiment 1 took part. As in Experiment 1, after being given brief instructions and practicing with CyARM, all participants were blindfolded and asked to stand and hold CyARM while wearing headphones playing white noise. The participants' task was to report the perceived distance to the object using CyARM. The object was the same whiteboard used in Experiment 1.

First, we conducted a training phase in which the participants experienced CyARM's actual behavior with the object placed at either 50 cm or 150 cm in front of them. The distance from the participants to the object was measured from the participants' toes. After this training phase, the actual experimental procedure started: the object was placed in front of the participants at random distances from 50 cm to 150 cm in steps of 10 cm, and the participants were asked to report the perceived distance. The participants were informed that the object would be placed in 10 cm steps from 50 cm to 150 cm. Each participant experienced two sets of 11 trials, and the order of the distances was counterbalanced among participants.

5.2. Results

Figure 8 shows the relation between the actual distance and the mean reported distance in this experiment; the crosses in Figure 8 correspond to the pairs of presented distances and the mean distances reported by the participants.

Figure 8: Experimental result of distance to the object and the mean of reported distance

The regression line fitted to the participants' reported distances showed a significantly high product-moment correlation between the actual and reported distances. Given this high correlation, it can be said that the participants could reasonably recognize the distance to the object. However, these results also showed that smaller errors occurred at short distances and larger errors at long distances.
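The regression-and-correlation analysis described above can be reproduced with standard least squares. The data below are synthetic (the paper's fitted values were lost in this copy); they mimic the reported pattern of slight underestimation that grows with distance, purely to illustrate the computation.

```python
def linear_fit(xs, ys):
    """Least-squares line y = a*x + b and the Pearson product-moment
    correlation coefficient r."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    syy = sum((y - my) ** 2 for y in ys)
    a = sxy / sxx                      # slope
    b = my - a * mx                    # intercept
    r = sxy / (sxx * syy) ** 0.5       # correlation coefficient
    return a, b, r

# Actual distances: 50..150 cm in 10 cm steps, as in the experiment.
xs = list(range(50, 151, 10))
# Synthetic reported distances: underestimation growing with distance.
ys = [x - 0.05 * (x - 50) for x in xs]
a, b, r = linear_fit(xs, ys)
```

A slope below 1 with a high r would correspond to the paper's finding: distances are recognized reliably but compressed at the far end of the range.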

We assume that the posture of the arm holding CyARM caused this slight tendency to perceive a smaller distance than the actual one. When the participants perceived the object as far away, they had to extend their arms; when they perceived it as near, they had to bend their arms at the elbow. This means that CyARM was far from the body in the former case and near it in the latter. As mentioned above, the distance from the participants to the object was measured from the participants' toes. Therefore, when the participants perceived a far object, the reported distance should be shorter than the actual one because CyARM was held away from the body, whereas for a near object the reported distance should be accurate because CyARM was very close to the body. Considering this behavior, it can be said that CyARM enables users to recognize the distance to an object quite precisely.

6. Discussion and Conclusions

In this paper, we focused on physical actions for compensating for the sensory organs and described a new interactive device called CyARM (an acronym of “Cyber Arm”). CyARM provides users with a unique and intuitive interface for comprehending the surrounding space through body actions, specifically bending or extending their arms. We described two implementations of CyARM, the first prototype and the improved second prototype, and the results of two experiments evaluating it: one on detecting the existence of an object and the other on perceiving the distance to the object. From these two experiments, we confirmed the following.

(i) Participants holding CyARM could detect whether an object was present or absent in front of them.
(ii) Participants holding CyARM could accurately perceive the distance to the object.

These results show that CyARM is effective for perceiving both the existence of an object and the distance to it. It can thus be said that CyARM provides users with a kind of imaginary arm that extends to distant objects. Moreover, participants with just a few minutes of training could perceive an object's presence and its distance, which is one of the strong advantages of this device. As one of our subsequent studies, we are planning an experiment to investigate whether users can detect the shape of an object by means of CyARM. Because users can freely move and sweep CyARM, they can perceive different distances to the object according to the position and direction of the holding arm. We therefore assume that this pattern of differing distances would lead to perception of the object's shape, and we expect that CyARM might enable users to identify a shape by moving the device as if tracing the object's surface.

Further detailed evaluation, especially in practical usage situations, will be conducted and reported in future work. We are also planning practical improvements to CyARM, such as improving its accuracy and range of distance measurement, developing a more compact body, providing longer battery life, and enhancing its ease of use. The experiments on recognizing object shapes with CyARM mentioned above are also planned, and their results will be reported in future work.


  1. T. Yoshimoto, Here Is the Finger?, Hokkaido University Press, Sapporo, Japan, 1979.
  2. E. T. Hall, The Hidden Dimension, Anchor Books, New York, NY, USA, 1976.
  3. C. Carter and K. A. Ferrell, The Implementation of Sonicguide with Visually Impaired Infants and School Children, Sensory Aids Corporation, Bensenville, Ill, USA, 1980, D. W. Campbell, Ed.
  4. J. Akita, T. Takagi, M. Okamoto et al., “CyARM: environment sensing device using non-visual modality,” in Proceedings of the International Conference on Technology (CSUN '04), Los Angeles, Calif, USA, March 2004.
  5. M. Okamoto, J. Akita, K. Ito, T. Ono, and T. Takagi, “CyARM: interactive device for environment recognition using a non-visual modality,” in Proceedings of the International Conference on Computers Helping People with Special Needs (ICCHP '04), pp. 462–467, Paris, France, July 2004.
  6. K. Ito, M. Okamoto, J. Akita, T. Ono, I. Gyobu, and T. Takagi, “CyARM: an alternative aid device for blind persons,” in Proceedings of the Conference on Human Factors in Computer Systems (CHI '05), p. 36, Portland, Ore, USA, April 2005.
  7. J. Akita, K. Ito, T. Komatsu et al., “CyARM: direct perception device by dynamic touch,” in Proceedings of the 13th International Conference on Perception and Action, pp. 87–90, Monterey, Calif, USA, July 2005.
  8. T. Iwamoto, M. Tatezono, T. Hoshi, and H. Shinoda, “Airborne ultrasound tactile display,” in Proceedings of the 35th International Conference on Computer Graphics and Interactive Techniques (SIGGRAPH '08), pp. 11–15, Los Angeles, Calif, USA, August 2008.
  9. Y. Hirata and M. Sato, “3-dimensional interface device for virtual work space,” in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '92), vol. 2, pp. 889–896, Raleigh, NC, USA, July 1992.
  10. A. Yamamoto, T. Ishii, and T. Higuchi, “Electrostatic tactile display for presenting surface roughness sensation,” in Proceedings of the IEEE International Conference on Industrial Technology (ICIT '03), vol. 2, pp. 680–684, Maribor, Slovenia, 2003.