BioMed Research International
Volume 2016, Article ID 8163098, 9 pages
http://dx.doi.org/10.1155/2016/8163098
Research Article

Multisensory Integration in the Virtual Hand Illusion with Active Movement

1Advanced Production Systems Engineering Course, National Institute of Technology, Gunma College, Maebashi, Japan
2College of Information Science and Engineering, Ritsumeikan University, Kusatsu, Japan
3Gungin System Service Co., Ltd., Maebashi, Japan

Received 2 June 2016; Revised 21 September 2016; Accepted 29 September 2016

Academic Editor: Jaehyo Kim

Copyright © 2016 Woong Choi et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Improving the sense of immersion is one of the core issues in virtual reality. Perceptual illusions of ownership can be perceived over a virtual body in a multisensory virtual reality environment. Rubber Hand and Virtual Hand Illusions showed that body ownership can be manipulated by applying suitable visual and tactile stimulation. In this study, we investigate the effects of multisensory integration in the Virtual Hand Illusion with active movement. A virtual xylophone playing system which can interactively provide synchronous visual, tactile, and auditory stimulation was constructed. We conducted two experiments regarding different movement conditions and different sensory stimulations. Our results demonstrate that multisensory integration with free active movement can improve the sense of immersion in virtual reality.

1. Introduction

The mutual interaction of sensory signals is a critical aspect of human perception and cognition. Recently, with the development of virtual reality (VR) technology, an increasing number of studies have projected multisensory information onto virtual representations of the actual body. Ideally, the virtual representation in VR space should be identical to the actual body. However, in practice, physical differences arise due to spatial limitations when constructing the VR space. The real-time representation of multisensory information plays a pivotal role in an immersive VR environment.

The shifts between physical stimuli and human perceptions are known as illusions. Using these sensory distortions, more realistic perceptions can be represented despite the physical limitations of sensory display interfaces.

The body ownership illusion, which is typically induced by VR, has been widely studied in the past few decades. Self-recognition is necessary for human cognition to adapt to changes in the environment [1]. The mental representation of one's own body, called the body image, is not limited to the sense of body ownership but also comprises multisensory perceptions such as visual and tactile information [2]. In addition, the body image can be extended to an object or an artificial limb attached to the human body. Therefore, the body image can be intentionally manipulated by displaying coherent multisensory information.

A famous illusion of body ownership is the Rubber Hand Illusion (RHI) [3–7]. In the RHI, subjects viewing a rubber hand being stroked synchronously with their own unseen hand feel that the rubber hand is part of their own body [8].

The displacement of body ownership has also been observed in a VR environment, in which a virtual hand was displayed as the visual stimulation [9, 10]. This illusion is called the Virtual Hand Illusion (VHI). As reported by IJsselsteijn et al., the VR environment produced a weaker illusion than the RHI but a more convincing subjective illusion than a mixed reality environment [11]. Furthermore, connectivity of the virtual hand with the rest of the virtual body has a significant effect on the subjective illusion of ownership [12].

It is relatively easy to evoke illusory ownership by displaying synchronous visual and tactile stimulation with a multisensory display interface. Therefore, with proper integration and display of multisensory stimulation, a more realistic experience can be elicited in an immersive virtual reality environment.

Many researchers have investigated the effect of synchrony of visual, tactile, and auditory stimulation. The results showed that synchronous conditions led to stronger illusions of body ownership than asynchronous conditions [4, 12–14]. In other words, synchronous stimulation is critical for inducing the VHI. Most studies on the synchrony of multisensory stimulation have focused on the minimum conditions necessary to induce the illusion of body ownership. However, considering that synchronous stimulation is the basis of multisensory integration, we aimed to investigate the most effective combination for inducing the illusion of ownership over a virtual hand in a 3D virtual reality environment.

In practice, a human being moves his/her body parts at will and receives subsequent multisensory feedback. Therefore, investigating active movement in the VHI is necessary for answering the question of how to properly extend the body image in VR space. Recent research showed that the change of body image is affected not only by passive sensory stimulation but also by the feedback from active movement. When visual and felt movements are synchronized, active movement gives rise to a stronger illusion than passive movement [15]. It has been confirmed that illusory body ownership persists during active virtual games in an immersive VR space [16]. There is also evidence that the VHI can be induced by synchronous visual and motor stimulation alone, in the absence of tactile stimulation [14].

In a VR environment, active movement and multisensory feedback produce the sense of ownership and the sense of agency [17]. Most research on the VHI with active movement using VR representation has focused on visual and tactile stimulation. It was reported that the inclusion of a sound cue heightened the effects of the illusion and caused participants to more readily accept the rubber hand into the body schema [18]. Under the invisible hand illusion, the influence of visual-tactile integration on proprioceptive updating can be modified by irrelevant auditory cues merely through the temporal correspondence between the visual-tactile and auditory events [19]. However, the effects of visual, tactile, and auditory integration in the VHI have not yet been studied.

The purpose of our study was to investigate the effects of multisensory integration in the VHI with active movement. In this paper, we constructed a VR system that interactively generates synchronous visual, tactile, and auditory stimulation. Our system enables participants to perform active movement in a VR environment. We conducted two experiments: (1) the VHI in different active movement conditions and (2) multisensory integration in the VHI. The effect of the visual presentation of the virtual hand was also evaluated.

2. Materials and Methods

2.1. Participants

Twenty participants (19 males) with mean age (SD) were recruited for the experiments. All participants had normal or corrected-to-normal vision and were right-handed. None had previously participated in similar studies.

All of the participants gave written informed consent prior to their participation. The protocol was approved by the ethics committee of Gunma National College of Technology.

2.2. Experimental Setup

The virtual reality setup consisted of a stereoscopic three-dimensional (3D) display (LG FLATRON W2363D), a six-degree-of-freedom haptic device (SensAble Technologies Phantom Omni), and a pair of stereo speakers (Figure 1). The display had a resolution of 1,920 × 1,080 and was synchronized with active shutter 3D glasses (NVIDIA 3D Vision) at 120 Hz. The haptic device provided force feedback and positional sensing in a  mm workspace with a 0.88 N continuous exertable force and a 3.3 N maximum exertable force at the nominal position. A purpose-built frame with the display mounted on its top was constructed. The haptic device and the speakers were hidden inside the frame with a curtain during the experiments.

Figure 1: Configuration of the virtual reality system.

A virtual xylophone system was designed for this study. The system allowed active movement and interactively provided synchronous visual, tactile, and auditory stimulation in real time. In the xylophone playing task, participants saw a horizontally placed 3D virtual xylophone with a mallet (Figure 2). A 3D virtual right hand matched to each participant's hand size was also displayed in certain experimental conditions. Participants could operate the virtual hand and play the xylophone with their own unseen hand by holding and moving the pen-shaped tool of the haptic device. Tactile feedback with the resilience of a rubber-headed mallet was provided when participants virtually struck the xylophone, and a synchronous sound was played through the stereo speakers. The pitch and the volume of the sound were determined by the struck bar and the striking speed, respectively. The CHAI3D, OpenGL, and OpenAL libraries were used to build the system.
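The bar-to-pitch and speed-to-volume mapping described above can be sketched as follows. This is an illustration only, not the authors' implementation: the diatonic scale, base frequency, and speed normalization constant are assumptions made for the example.

```python
# Illustrative sketch of the sound mapping: pitch is set by which bar is
# struck, volume by the striking speed. The C-major scale, base frequency
# (middle C), and speed cap are assumed values, not from the paper.
C_MAJOR_SEMITONES = [0, 2, 4, 5, 7, 9, 11, 12]  # one ascending octave

def strike_to_sound(bar_index: int, strike_speed: float,
                    base_freq: float = 261.63, max_speed: float = 1.0):
    """Return (frequency_hz, volume) for a strike on bar `bar_index`.

    Frequency follows equal temperament from the base pitch; volume
    scales linearly with strike speed and is clamped to [0, 1].
    """
    semitone = C_MAJOR_SEMITONES[bar_index]
    frequency = base_freq * 2 ** (semitone / 12)
    volume = min(strike_speed / max_speed, 1.0)
    return frequency, volume
```

In an OpenAL-based system such as the one described, the returned values would typically drive a source's pitch and gain attributes.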

Figure 2: Virtual xylophone, mallet, and the virtual hand rendered and displayed in the experiments.
2.3. Procedure

Two experiments were performed in a quiet and dimly lit laboratory room.

2.3.1. Experiment 1: The VHI in Different Active Movement Conditions

This experiment aimed to investigate the VHI under different active movement conditions. Participants were seated in front of the system with their right sleeve rolled up. They were directed to hold the pen-shaped tool of the haptic device correctly with their right hand while the device remained unseen. They were then asked to look at the display through the 3D glasses. The virtual hand (without the xylophone and the mallet) was displayed as the visual stimulation (Figure 3). Participants were asked to perform left/right, forward/backward, up/down, rotatory, and free movements. The duration of each condition was 30 seconds. The order of the five conditions was counterbalanced across participants.
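One common way to counterbalance condition order across participants is a cyclic Latin square, in which each condition appears in each serial position equally often. The sketch below is an assumption for illustration; the paper does not specify which counterbalancing scheme was used.

```python
# Illustrative sketch: cyclic Latin square of condition orders.
# Row i is the condition list rotated by i, so every condition
# occupies every serial position exactly once across the rows.
def latin_square_orders(conditions):
    n = len(conditions)
    return [[conditions[(i + j) % n] for j in range(n)] for i in range(n)]

orders = latin_square_orders(["L/R", "F/B", "U/D", "rotatory", "free"])
# Participant k would follow orders[k % 5].
```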

Figure 3: Experimental setup in experiment 1.

After the experiment, participants filled in an 11-item questionnaire in Japanese as shown below.

Questionnaire for Experiment 1
(Q-a1) Sometimes it seemed as if my hand were located where I saw the virtual hand.
(Q-a2) Sometimes I felt as if the virtual hand were my hand.
(Q-a3) Sometimes I felt as if my hand were made by computer graphics.
(Q-a4) At some moments, it seemed as if the virtual hand began to resemble my own hand.
(Q-a5) Sometimes it seemed as if I might have more than one right hand.
(Q-a6) Sometimes I felt as if my hand existed in the virtual environment.
(Q-a7) I had the sensations in questions (Q-a1) to (Q-a6) during left/right movements.
(Q-a8) I had the sensations in questions (Q-a1) to (Q-a6) during forward/backward movements.
(Q-a9) I had the sensations in questions (Q-a1) to (Q-a6) during up/down movements.
(Q-a10) I had the sensations in questions (Q-a1) to (Q-a6) during rotatory movements.
(Q-a11) I had the sensations in questions (Q-a1) to (Q-a6) during free movements.

Each question was scored on a 7-point Likert scale, with 1 indicating strongly disagree and 7 strongly agree. Questions (Q-a1) to (Q-a6) were partially adapted and modified from [3]. Questions (Q-a7) to (Q-a11) were introduced to cover the different movement conditions. For questions (Q-a7) to (Q-a11), participants were also asked to give the reasons for their ratings.

Measurement of the displacement of the perceived hand position (proprioceptive drift) is a major way to evaluate the strength of the feeling of ownership in the RHI [20, 21]. However, most of these studies were carried out in passive movement conditions, in which the participant was asked to rest his/her hand on a table. The perceived hand positions were recorded before and after the participant’s hand was tapped or stroked passively by the experimenter.

The task in our study involved active movement and was designed to resemble a typical dynamic operation in a VR environment. Measuring proprioceptive drift as in the traditional RHI studies is difficult here because the active hand is constantly moving. Therefore, we used the questionnaire to evaluate the strength of the illusory ownership.

2.3.2. Experiment 2: Multisensory Integration in the VHI

This experiment aimed to investigate the effect of multisensory integration in the VHI with active movement. The experimental setup was similar to that of experiment 1. In this experiment, eight experimental conditions with different combinations of visual, tactile, and auditory stimulation were designed (Table 1). Note that, for visual stimulation, the xylophone and the mallet were displayed in all conditions, whereas the virtual hand was displayed only in conditions C1, C3, C5, and C7. Visual, tactile, and auditory stimulation were synchronized in all conditions. The order of the eight conditions was counterbalanced across participants. Figure 4 illustrates the experimental setup of conditions C1, C3, C5, and C7 as an example.

Table 1: The eight experimental conditions.
Figure 4: Experimental setup in experiment 2 (conditions C1, C3, C5, and C7).

Participants were directed to move their hand to the reference point to begin each condition. They were asked to strike each bar of the virtual xylophone once in an ascending scale. They were then asked to play the xylophone freely for 15 seconds. After each condition, participants filled in a 5-item, 7-point Likert-scale questionnaire in Japanese as shown below.

Questionnaire for Experiment 2: Common Questions
(Q-b1) It seemed as if I were playing the xylophone with the mallet.
(Q-b2) It seemed as if I were holding the mallet.
(Q-b3) I could move the mallet to any position at my will.
(Q-b4) I felt as if I were striking an object in the virtual environment.
(Q-b5) I felt an increasing virtual reality experience during the experiment.

For conditions C1, C3, C5, and C7, five more questions that referred to the virtual hand were included as shown below.

Questionnaire for Experiment 2: Questions for Conditions C1, C3, C5, and C7
(Q-c1) Sometimes I felt as if the virtual hand were my hand.
(Q-c2) I felt as if my real hand were located at the virtual hand.
(Q-c3) Sometimes I felt as if my hand were made by computer graphics.
(Q-c4) At some moments, it seemed as if the virtual hand began to resemble my own hand.
(Q-c5) Sometimes it seemed as if I might have more than one right hand.

For conditions C3, C4, C7, and C8, one more question that referred to auditory stimulation was included. For conditions C5–C8, one more question that referred to tactile stimulation was included as shown below.

Questionnaire for Experiment 2: Questions for Conditions C3–C8
(Q-d1) It seemed as if the sound were coming from the xylophone bars where I struck.
(Q-d2) It seemed as if the force were coming from the xylophone bars where I struck.

3. Results

Figure 5 shows the mean scores and standard deviations for questions (Q-a1) to (Q-a6) in experiment 1. Figure 6, for questions (Q-a7) to (Q-a11), shows that the free movement condition (Q-a11) had a significantly higher mean score than the other conditions (paired t-test).
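The paired t-test used in the comparisons above operates on matched per-participant score pairs. The sketch below illustrates the statistic; the data in the test are hypothetical and do not reproduce the study's actual values.

```python
# Illustrative sketch: paired t statistic for two matched samples,
# e.g. each participant's Likert score under two movement conditions.
# Hypothetical example, not the study's data.
from statistics import mean, stdev
from math import sqrt

def paired_t(a, b):
    """Paired t statistic: mean of the pairwise differences divided by
    the standard error of those differences (n - 1 degrees of freedom)."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n))
```

In practice a library routine (e.g. a paired-samples t-test in a statistics package) would also return the p-value for the computed statistic.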

Figure 5: Questionnaire results for (Q-a1) to (Q-a6) in experiment 1.
Figure 6: Questionnaire results for (Q-a7) to (Q-a11) in experiment 1.

Figure 7 shows the results of questions (Q-b1) to (Q-b5) for conditions C1 and C2 (visual stimulation only) in experiment 2. For conditions C1 and C2, the existence of the virtual hand showed a significant effect in (Q-b2), (Q-b4), and (Q-b5).

Figure 7: Questionnaire results of (Q-b1) to (Q-b5) for conditions C1 and C2 in experiment 2.

Figure 8 shows the results of questions (Q-b1) to (Q-b5) for conditions C3 and C4 (visual and auditory stimulation) in experiment 2. For conditions C3 and C4, the existence of the virtual hand showed a significant effect in all questions, (Q-b1) to (Q-b5).

Figure 8: Questionnaire results of (Q-b1) to (Q-b5) for conditions C3 and C4 in experiment 2.

Figure 9 shows the results of questions (Q-b1) to (Q-b5) for conditions C5 and C6 (visual and tactile stimulation) in experiment 2. For conditions C5 and C6, the existence of the virtual hand showed a significant effect in all questions, (Q-b1) to (Q-b5).

Figure 9: Questionnaire results of (Q-b1) to (Q-b5) for conditions C5 and C6 in experiment 2.

Figure 10 shows the results of questions (Q-b1) to (Q-b5) for conditions C7 and C8 (visual, auditory, and tactile stimulation) in experiment 2. For conditions C7 and C8, the existence of the virtual hand showed a significant effect in (Q-b1), (Q-b2), (Q-b3), and (Q-b4).

Figure 10: Questionnaire results of (Q-b1) to (Q-b5) for conditions C7 and C8 in experiment 2.

Figure 11 shows the results of questions (Q-c1) to (Q-c5) for conditions C1, C3, C5, and C7 (with the virtual hand) in experiment 2.

Figure 11: Questionnaire results of questions (Q-c1) to (Q-c5) for conditions C1, C3, C5, and C7 in experiment 2.

For question (Q-c1), there were significant differences between conditions C1 and C5, conditions C1 and C7, conditions C1 and C3, and conditions C3 and C7.

For question (Q-c2), there were significant differences between conditions C1 and C7, conditions C1 and C5, and conditions C3 and C7.

For question (Q-c3), there were significant differences between conditions C1 and C5, conditions C1 and C7, conditions C3 and C5, and conditions C3 and C7.

For question (Q-c4), there were significant differences between conditions C1 and C7, conditions C3 and C7, conditions C1 and C5, and conditions C5 and C7.

For question (Q-c5), there were significant differences between conditions C1 and C5 and between conditions C1 and C7.

Figures 12 and 13 show the results of questions (Q-d1) and (Q-d2). For (Q-d1), the conditions with the virtual hand (conditions C3 and C7) had significantly higher mean scores than those without the virtual hand (conditions C4 and C8). For conditions C7 and C8, the existence of the virtual hand also showed a significant effect in (Q-d2). No significant difference was observed between conditions C5 and C6 in (Q-d2).

Figure 12: Questionnaire results of (Q-d1) in experiment 2.
Figure 13: Questionnaire results of (Q-d2) in experiment 2.

4. Discussion

We investigated the effects of multisensory integration in the VHI with active movement.

Experiment 1 examined the VHI under different active movement conditions.

The experimental results showed that, among the translational movements (movements without rotation), left/right and forward/backward movements yielded less illusion than up/down movements. Because depth is relatively difficult for humans to perceive, the perception gap between the virtual space and the real space was smaller for vertical (up/down) movements than for horizontal (left/right and forward/backward) movements [22, 23].

Participants reported that they had a strong illusion at the near side and felt a sense of incongruity during forward/backward movements. This result might have been caused by the discrepancy between the camera view in the virtual space and the participant's view in the real space, which increased during forward/backward movements. As shown in Figure 6, the questionnaire results agreed that forward/backward movement yielded the least illusion.

Rotatory movements had a higher mean score than translational movements. Participants experienced a stronger illusion because they felt less incongruity of spatial coordinates between the virtual and real spaces, especially during rotatory movements performed at the near side. Not only objective spatial parameters but also subjective impressions informed by previous experience have an effect on the VHI [24]. Rotatory hand movements, with which participants had less prior experience than translational hand movements, evoked a greater illusion in our experiment.

Participants felt significantly stronger illusion in the condition of free movement. This result indicates that free-willed active movement can enhance the illusion of body ownership.

Previous studies suggested that the strongest illusion is reported when the rubber hand and the real hand are in the closest positions [25]. Furthermore, the efference copy and the sensory feedback must coincide in time for the sense of agency to arise [26]. Neurons in the parietal lobe related to the sense of agency function as mirror neurons [27]; they fire both when an action is performed and when the same action is observed being performed by another. In the parietal lobe, the visual feedback is compared with the predicted sensory feedback generated from the efference copy [28]. During the free movements in our experiments, when the two feedbacks matched, participants felt as if their real hand were moving in the virtual space and experienced a stronger illusion.
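The comparator account above can be caricatured in a few lines: agency is attributed when the feedback predicted from the efference copy matches the observed visual feedback within some tolerance. This is a toy illustration of the idea only; the tolerance value and the sampled-trajectory representation are assumptions.

```python
# Toy sketch of the comparator model: the sense of agency is attributed
# when predicted feedback (from the efference copy) and observed visual
# feedback agree within a tolerance at every sample. The tolerance is an
# assumed value for illustration.
def agency(predicted, observed, tol=0.05):
    """True when every predicted/observed sample pair agrees within tol."""
    return all(abs(p - o) <= tol for p, o in zip(predicted, observed))
```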

Experiment 2 examined the effects of multisensory integration in the VHI with active movement.

We assumed that the VHI in VR space can be enhanced by applying multisensory integration to the virtual hand.

In experiment 2, conditions C1 and C2 (visual stimulation only) had lower mean scores than the other six conditions. In addition, only three of the five items showed a significant effect of the existence of the virtual hand (Figure 7). In contrast, the existence of the virtual hand showed a significant effect in all items of the visual-auditory (Figure 8) and visual-tactile (Figure 9) conditions. This indicates that, with visual stimulation alone, it is relatively difficult to induce the illusion of body ownership despite the existence of the virtual hand. Note that, although conditions C7 and C8 (visual, tactile, and auditory conditions) had the highest mean scores, the virtual hand showed less importance in one item (Figure 10). We consider that a strong illusion was induced by multisensory integration even without the visual existence of the virtual hand. Figure 11 shows that multisensory integration enhanced the strength of the illusory ownership. Furthermore, except for question (Q-c1), visual-auditory stimulation did not show a significant advantage over visual-only stimulation. However, visual-tactile stimulation showed a significantly greater effect on the illusion than visual-only stimulation. This result shows that the tactile signal is more critical than the auditory signal in inducing the illusion of body ownership.

Figure 12 shows a significant effect of the existence of the virtual hand, indicating that the auditory stimulation was enhanced by the visually displayed virtual hand. The integration of visual and action-related auditory signals is one of the most important cues for human spatial position perception [29].

In contrast, no significant effect of the virtual hand was observed in the visual-tactile conditions (Figure 13, C5 and C6). Iriki et al. studied behavioral effects of tool use in humans and monkeys [30, 31]. The results indicated that body representation in the brain can change following tool use: body image is extended to the tool. Studies on the VHI also reported that the illusion of body ownership can be extended to noncorporeal objects by synchronous movements [32]. Note that a wooden stick-shaped object without movement was reported to be incapable of inducing the illusion [21]. In our experiments, participants extended their body image to the virtual mallet instead of the virtual hand by performing active movements with synchronous visual-tactile stimulation. The strength of the illusory ownership was not reduced because the virtual mallet substituted for the virtual hand.

We conclude that not only visual stimulation but also multisensory integration with active movement is important to induce a strong illusion of body ownership in VR space. Furthermore, a stronger sense of immersion can be expected by performing a free movement task before the operation in VR space.

5. Conclusion

In this study, we constructed a VR system that provided interactive feedback of visual, tactile, and auditory stimulation. We investigated the effects of different hand moving conditions and multisensory integration in the illusion of body ownership with active movement. We designed a virtual xylophone playing task for the VHI experiments. The VR system provided synchronous visual, tactile, and auditory stimulation when the participants played the xylophone in VR environment. Furthermore, we evaluated the effect of the visual existence of the virtual hand under different sensory stimulation conditions.

The experiments showed that (1) free movement yielded the strongest illusion among the active movement conditions, (2) tactile stimulation had a more significant influence than auditory stimulation on the VHI, and (3) multisensory integration of visual, tactile, and auditory signals induced the strongest illusion. We conclude that free active movement with multisensory feedback is the most effective way to induce illusory ownership in VR space. This study suggests that the sense of immersion in VR space can be improved by providing multisensory feedback and performing a set of free active movements before the formal operation.

We also expect that our study can improve the sense of immersion in VR-based clinical applications, such as treatment for phantom limb pain [33] and pain relief during acupuncture [34]. A network of multisensory and homeostatic brain areas was reported to be responsible for maintaining a “body-matrix” [35]. We consider that, by using multisensory integration in VR space, training with a virtual limb can be an effective therapeutic method for the phantom limb pain experienced by amputees. The VR system used in this study can also be extended to virtual rehabilitation training for patients' recovery after stroke. In further studies, experiments investigating the effectiveness of different multisensory synchrony conditions will be carried out. The changes in electromyography (EMG) [36] and in the position of the arm/hand [37] will be measured and analyzed quantitatively. The spatial information of the arm/hand during active movement can be obtained by using a motion capture system, which we previously used to develop a gesture-based VR system [38].

Competing Interests

The authors declare no competing interests.

Authors’ Contributions

Woong Choi, Liang Li, Satoru Satoh, and Kozaburo Hachimura conceived and designed the experiments. Woong Choi and Satoru Satoh performed the experiments. Woong Choi, Liang Li, and Satoru Satoh analyzed the data. Woong Choi, Liang Li, and Satoru Satoh wrote the paper.

References

  1. M. Jeannerod, “The mechanism of self-recognition in humans,” Behavioural Brain Research, vol. 142, no. 1-2, pp. 1–15, 2003.
  2. S. Gallagher, “Philosophical conceptions of the self: implications for cognitive science,” Trends in Cognitive Sciences, vol. 4, no. 1, pp. 14–21, 2000.
  3. M. Botvinick and J. Cohen, “Rubber hands ‘feel’ touch that eyes see,” Nature, vol. 391, no. 6669, p. 756, 1998.
  4. H. H. Ehrsson, N. P. Holmes, and R. E. Passingham, “Touching a rubber hand: feeling of body ownership is associated with activity in multisensory brain areas,” The Journal of Neuroscience, vol. 25, no. 45, pp. 10564–10573, 2005.
  5. M. Samad, A. J. Chung, and L. Shams, “Perception of body ownership is driven by Bayesian sensory inference,” PLoS ONE, vol. 10, no. 2, Article ID e0117178, 2015.
  6. X. Fuchs, M. Riemer, M. Diers, H. Flor, and J. Trojan, “Perceptual drifts of real and artificial limbs in the rubber hand illusion,” Scientific Reports, vol. 6, article 24362, 2016.
  7. F. Pavani, C. Spence, and J. Driver, “Visual capture of touch: out-of-the-body experiences with rubber gloves,” Psychological Science, vol. 11, no. 5, pp. 353–359, 2000.
  8. M. Costantini and P. Haggard, “The rubber hand illusion: sensitivity and reference frame for body ownership,” Consciousness and Cognition, vol. 16, no. 2, pp. 229–240, 2007.
  9. M. Slater, D. Perez-Marcos, H. H. Ehrsson, and M. V. Sanchez-Vives, “Towards a digital body: the virtual arm illusion,” Frontiers in Human Neuroscience, vol. 2, article 6, 2008.
  10. F. Short and R. Ward, “Virtual limbs and body space: critical features for the distinction between body space and near-body space,” Journal of Experimental Psychology: Human Perception and Performance, vol. 35, no. 4, pp. 1092–1103, 2009.
  11. W. A. IJsselsteijn, Y. A. W. De Kort, and A. Haans, “Is this my hand I see before me? The rubber hand illusion in reality, virtual reality, and mixed reality,” Presence: Teleoperators and Virtual Environments, vol. 15, no. 4, pp. 455–464, 2006.
  12. D. Perez-Marcos, M. V. Sanchez-Vives, and M. Slater, “Is my hand connected to my body? The impact of body continuity and arm alignment on the virtual hand illusion,” Cognitive Neurodynamics, vol. 6, no. 4, pp. 295–305, 2012.
  13. H. H. Ehrsson, C. Spence, and R. E. Passingham, “That's my hand! Activity in premotor cortex reflects feeling of ownership of a limb,” Science, vol. 305, no. 5685, pp. 875–877, 2004.
  14. M. V. Sanchez-Vives, B. Spanlang, A. Frisoli, M. Bergamasco, and M. Slater, “Virtual hand illusion induced by visuomotor correlations,” PLoS ONE, vol. 5, no. 4, article e10381, 2010.
  15. T. Dummer, A. Picot-Annand, T. Neal, and C. Moore, “Movement and the rubber hand illusion,” Perception, vol. 38, no. 2, pp. 271–280, 2009.
  16. Y. Yuan and A. Steed, “Is the rubber hand illusion induced by immersive virtual reality?” in Proceedings of the IEEE Virtual Reality Conference (VR '10), pp. 95–102, Boston, Mass, USA, March 2010.
  17. N. David, A. Newen, and K. Vogeley, “The ‘sense of agency’ and its underlying cognitive and neural mechanisms,” Consciousness and Cognition, vol. 17, no. 2, pp. 523–534, 2008.
  18. B. O'Mera, Auditory information in the form of a scratching sound enhances the effects of the rubber hand illusion [Honors thesis], University of Dayton, Dayton, Ohio, USA, 2014.
  19. G. Darnai, T. Szolcsányi, G. Hegedüs et al., “Hearing visuo-tactile synchrony—sound-induced proprioceptive drift in the invisible hand illusion,” British Journal of Psychology, 2016.
  20. M. Rohde, M. Luca, and M. O. Ernst, “The rubber hand illusion: feeling of ownership and proprioceptive drift do not go hand in hand,” PLoS ONE, vol. 6, no. 6, Article ID e21659, 2011.
  21. M. Tsakiris and P. Haggard, “The rubber hand illusion revisited: visuotactile integration and self-attribution,” Journal of Experimental Psychology: Human Perception and Performance, vol. 31, no. 1, pp. 80–91, 2005.
  22. V. Interrante, B. Ries, and L. Anderson, “Distance perception in immersive virtual environments, revisited,” in Proceedings of the IEEE Virtual Reality Conference, pp. 3–10, IEEE, Albuquerque, NM, USA, March 2006.
  23. L. Kaufman and J. H. Kaufman, “Explaining the moon illusion,” Proceedings of the National Academy of Sciences of the United States of America, vol. 97, no. 1, pp. 500–505, 2000.
  24. J. Zhang, K. Ma, and B. Hommel, “The virtual hand illusion is moderated by context-induced spatial reference frames,” Frontiers in Psychology, vol. 6, article 1659, 2015.
  25. D. M. Lloyd, “Spatial limits on referred touch to an alien limb may reflect boundaries of visuo-tactile peripersonal space surrounding the hand,” Brain and Cognition, vol. 64, no. 1, pp. 104–109, 2007.
  26. E. von Holst and H. Mittelstaedt, “Das Reafferenzprinzip—wechselwirkungen zwischen zentralnervensystem und peripherie,” Die Naturwissenschaften, vol. 37, no. 20, pp. 464–476, 1950.
  27. G. Rizzolatti and L. Craighero, “The mirror-neuron system,” Annual Review of Neuroscience, vol. 27, no. 1, pp. 169–192, 2004.
  28. A. Murata and H. Ishida, “Representation of bodily self in the multimodal parieto-premotor network,” in Representation and Brain, S. Funahashi, Ed., pp. 151–176, Springer, Berlin, Germany, 2007.
  29. A. Tajadura-Jiménez, A. Väljamäe, I. Toshima, T. Kimura, M. Tsakiris, and N. Kitagawa, “Action sounds recalibrate perceived tactile distance,” Current Biology, vol. 22, no. 13, pp. R516–R517, 2012.
  30. A. Maravita and A. Iriki, “Tools for the body (schema),” Trends in Cognitive Sciences, vol. 8, no. 2, pp. 79–86, 2004.
  31. A. Iriki and M. Taoka, “Triadic (ecological, neural, cognitive) niche construction: a scenario of human brain evolution extrapolating tool use and language from the control of reaching actions,” Philosophical Transactions of the Royal Society B: Biological Sciences, vol. 367, no. 1585, pp. 10–23, 2012. View at Publisher · View at Google Scholar · View at Scopus
  32. K. Ma and B. Hommel, “Body-ownership for actively operated non-corporeal objects,” Consciousness and Cognition, vol. 36, pp. 75–86, 2015. View at Publisher · View at Google Scholar · View at Scopus
  33. C. D. Murray, E. Patchick, S. Pettifer, F. Caillette, and T. Howard, “Immersive virtual reality as a rehabilitative technology for phantom limb experience: a protocol,” CyberPsychology & Behavior, vol. 9, no. 2, pp. 167–170, 2006. View at Publisher · View at Google Scholar
  34. D.-S. Chang, Y.-J. Kim, S.-H. Lee et al., “Modifying bodily self-awareness during acupuncture needle stimulation using the rubber hand illusion,” Evidence-based Complementary and Alternative Medicine, vol. 2013, Article ID 849602, 7 pages, 2013. View at Publisher · View at Google Scholar · View at Scopus
  35. G. L. Moseley, A. Gallace, and C. Spence, “Bodily illusions in health and disease: physiological and clinical perspectives and the concept of a cortical ‘body matrix’,” Neuroscience and Biobehavioral Reviews, vol. 36, no. 1, pp. 34–46, 2012. View at Publisher · View at Google Scholar · View at Scopus
  36. T. Tsuji, H. Yamakawa, A. Yamashita et al., “Analysis of electromyography and skin conductance response during rubber hand illusion,” in Proceedings of the IEEE Workshop on Advanced Robotics and Its Social Impacts (ARSO '13), pp. 88–93, Tokyo, Japan, November 2013. View at Publisher · View at Google Scholar · View at Scopus
  37. I. Olivé and A. Berthoz, “Combined induction of rubber-hand illusion and out-of-body experiences,” Frontiers in Psychology, vol. 3, article 128, 2012. View at Publisher · View at Google Scholar · View at Scopus
  38. J. Minagawa, W. Choi, S. Tsurumi et al., “Supporting system with hand gesture for location-based augmented reality,” in Proceedings of the IEEE 4th Global Conference on Consumer Electronics (GCCE '15), pp. 479–480, Osaka, Japan, October 2015. View at Publisher · View at Google Scholar