Neural Plasticity / 2016 / Article ID 5260671

Special Issue: Neuroplastic Mechanisms Underlying Perceptual and Cognitive Enhancement

Review Article | Open Access

M. S. Houde, S. P. Landry, S. Pagé, M. Maheu, F. Champoux, "Body Perception and Action Following Deafness", Neural Plasticity, vol. 2016, Article ID 5260671, 7 pages, 2016.

Body Perception and Action Following Deafness

Academic Editor: Xiaoming Zhou
Received: 09 Sep 2015
Revised: 13 Nov 2015
Accepted: 16 Nov 2015
Published: 12 Jan 2016


The effect of deafness on sensory abilities has been the topic of extensive investigation over the past decades. These investigations have mostly focused on visual capacities. We are only now starting to investigate how the deaf experience their own bodies and body-related abilities. Indeed, a growing corpus of research suggests that auditory input could play an important role in body-related processing. Deafness could therefore disturb such processes. It has also been suggested that many unexplained daily difficulties experienced by the deaf could be related to deficits in this underexplored field. In the present review, we propose an overview of the current state of knowledge on the effects of deafness on body-related processing.

1. Introduction

In recent years, increasing attention has been paid to sensory changes in individuals who have undergone sensory deprivation. Amongst these investigated populations are the deaf. Deaf individuals can provide unique insight into the effects of sensory deprivation, as some have regained partial hearing through the use of a cochlear implant (CI), a neuroprosthetic device that can restore some level of hearing. The deaf provide opportunities to better understand not only the neuroplasticity underlying sensory deprivation but also the adaptive and maladaptive plasticity that can occur upon recovery of a sensory modality. Current research on the effects of deafness on the perception of the external world suggests that a prolonged period of deafness can lead to significant alterations in sensory processing (for a review, see [1, 2]). Given the link between perception of the environment and the ability to act in it, several unexplained day-to-day difficulties observed in the deaf have been proposed to be related to deficits in body-related processing (e.g., [3, 4]). Furthering our understanding of the effects of deafness on these processes could thus not only provide insight into the fundamental processes of sensory deprivation but also be of great benefit to individuals living with deafness. The objective of this review is to examine the current state of knowledge on the effects of deafness on body-related processes. In order to provide a well-defined interpretation of the literature, we specifically surveyed nonvisual processing in the deaf, namely, somatosensory, motor, and posture processing. As such, processes that are directly related to the visual system (e.g., facial recognition and eye movements) were not included in this review, even though they bear a relation to the body.

2. Body Image, Body Schema, and the Role of Auditory Inputs

The perception-action model proposes that perception of the environment is directly related to the ability to act in it [5]. Influenced by this model, Paillard [6] suggested a distinction between “knowing where” and “knowing how to get there,” implying a difference between the body image for perception and the body schema for action. A body image consists of the perceptions of one’s body (i.e., judgment of bodily properties), while body schema consists of the sensory-motor capacities in which information necessary for movements is integrated, such as for body posture. Thus, in the same way perception differs from movement, body image differs from body schema.

Our bodily experience of perception and action is not limited exclusively to the somatosensory system but is accompanied by a variety of body-related inputs. Indeed, there exists no single set of peripheral receptors that informs the brain of the location or self-identity of body parts. The experience of our own body has therefore been shown to be constructed within the central nervous system by the integration of several information sources including somatosensory signals (e.g., [7–10]), visual inputs (e.g., [11–14]), auditory signals (e.g., [15]), and vestibular input (e.g., [16]).

Over the years, several well-known tasks have been developed to directly assess the multiple features related to body image and schema. The investigation of these different body-related processes also includes task-dependent effects of bodily illusions. It is understood that the brain’s resolution of the sensory conflict induced by bodily illusions provides a measure of the plasticity and flexibility of the underlying body-related processing [17].

Numerous data suggest a significant role for the auditory system in body-related processing. In normally developing individuals, auditory inputs have been shown to interact with the tactile and motor systems during speech processing [18, 19], motor behaviour (e.g., [20–24]), posture and balance (e.g., [25–27]), and the general initiation of motor action [28–31]. Finally, the influence of auditory inputs on tactile body perception has also often been demonstrated using multisensory tasks (e.g., [32, 33]). In these audiotactile interaction tasks, the manipulation of auditory input alters tactile perception of either palmar dryness or the number of perceived tactile stimulations.

Evidence suggesting a role of auditory inputs in body-related processing raises important questions about the impact of sensory deprivation [3, 4, 34]. Indeed, considering this evidence, it is reasonable to expect that deaf individuals would perceive their own bodies differently than hearing individuals. If so, according to the perception-action model, the deaf would also have a fundamentally altered perception of their environment.

3. Body Image for Perception in the Deaf

3.1. Body Sensations

Sight, hearing, taste, smell, and touch are the five traditionally recognized senses. Unlike sight, hearing, smell, and taste, which are all located in specific parts of the body, the sense of touch is much less centralized. Indeed, touch (or the peripheral somatosensory system) is very hard to localize because tactile sensory information enters the nervous system from every area of the body. The sense of touch can provide sufficient information for an individual to determine the numerous features related to a specific object. In this sense, touch allows an individual to learn about the proximal environment and adapt behaviour accordingly. Numerous standardized tasks have been developed to examine human tactile perception. These tasks allow for the examination of detection, resolution, and discrimination capabilities (e.g., static two-point discrimination [35], tactile sensitivity thresholds using Semmes-Weinstein monofilaments [36], and tactile resolution using a grating orientation task [37]).

Similar to studies revealing highly specific changes to visual processing (for a review, see [1, 2, 62, 63]), findings in the tactile domain are often inconsistent depending on the specificity of the tasks used and/or the characteristics of the participants (e.g., congenitally deaf, hearing impaired, or CI users), suggesting that deafness does not lead to uniform alterations in tactile perception.

Tactile detection and discrimination have been examined in the deaf, with no statistically significant differences from normally hearing individuals (e.g., [43, 44]). However, a positive correlation between hearing and tactile acuity has been suggested [64]. Additionally, no significant differences were found for tactile detection and discrimination abilities in deaf CI users [45, 46]. More targeted tactile abilities have also been investigated, and no significant differences were found between early deaf and control groups for spatial sensitivity [48], temporal onset-offset-order discrimination [44], frequency discrimination [49], object identification [38], or tactile discrimination of a rhythmic pattern [42]. However, there is compelling evidence that deafness can result in changes in tactile perception under some specific conditions. For instance, data suggest superior vibrotactile frequency change sensitivity [49] and haptic orientation processing [47] in congenitally deaf humans. Congenitally deaf CI users were found to have faster reaction times in response to tactile stimuli [40]. However, this reaction-time advantage was not found in congenitally deaf individuals without CI [39] or in late deaf CI users [40, 41]. The tactile abilities altered by deafness are not exclusively improvements, as reduced tactile temporal sensitivity has been revealed in congenitally deaf individuals [48]. These results suggest that, for tactile abilities, plasticity following deafness does not lead to uniform behavioural improvements and can even lead to maladaptive behavioural compensation in specific conditions.

3.2. Multisensory Interactions Involving Touch

The sense of touch can be altered through the simultaneous stimulation of another sense. Interaction between the senses can enhance overall perceptual accuracy and saliency through cooperative advantages in certain congruent situations (e.g., [65, 66]). Body-related multisensory interactions can be examined through multiple tasks in which the information coming from the two modalities is either congruent or incongruent. The presentation of conflicting multisensory information can result in an illusory percept. We can gain insight into the ability to integrate multisensory information following deafness by studying alterations to this illusory percept.

3.2.1. Integration of Congruent Auditory and Tactile Information

The interaction between congruent auditory and tactile information has recently been examined in deaf individuals with CI. Nava et al. [40] showed that both congenitally and late deaf CI users were able to integrate congruent audiotactile stimuli in a reaction time task as effectively as control group members. These results suggest that neither congenital nor acquired deafness prevents the development or recovery of this form of basic multisensory processing. However, the authors also found that congenitally deaf CI users (but not late deaf CI users) benefited significantly less from redundancy gains in the presence of multisensory stimulation than their matched controls. This may be explained by a change in tactile perception in those individuals, as reviewed in the previous section.

3.2.2. Segregation and Integration of Incongruent Auditory and Tactile Information

Two of the most robust examples of auditory-somatosensory illusions are the “audiotactile illusory flash effect” [33] in the temporal domain and the “parchment skin illusion” [32] in the spectral domain. Both of these tasks are examples of cross-modal interactions. The “audiotactile illusory flash effect” is a nonspeech illusory percept in which the simultaneous presentation of a single somatosensory stimulus with two consecutive sounds can lead to the perception of two distinct tactile sensations in normally hearing individuals. The “parchment skin illusion” is also a nonspeech illusory percept in which amplifying or reducing the high-frequency content of the sound generated by rubbing one’s hands together alters the perceived palmar dryness/moistness. Our research team recently used these two tasks to investigate whether a period of deafness disturbs the segregation or integration of incongruent temporal and spectral audiotactile information in deaf adults using CI [4, 46]. In both tasks, normally hearing individuals effectively integrated auditory and tactile information in the context of an illusory audiotactile percept, whereas CI users did not. Considering the fundamental nature of the stimuli involved in these tasks, the failure to segregate or integrate multisensory information could not be explained by the use of the CI.

4. Body Schema for Action in the Deaf

4.1. Body Movements

Savelsbergh et al. [51] suggested that the absence of early auditory input could contribute to motor delays in deaf children. This hypothesis was later tested, and the results indeed suggested that hearing children performed significantly better than deaf children in various evaluations of motor development [50]. More specifically, several studies of motor capacities in deaf children have reported deficits in general dynamic coordination, balance, and ball catching abilities, as well as slower reaction times and movement execution speeds [34, 52, 67, 68]. Interestingly, studies of motor coordination combining deaf individuals and CI users do not report significant differences between deaf and hearing abilities [50].

Several findings suggest that profound deafness may result in disturbances to nonauditory abilities related to serial-order information [54, 56]. In particular, Conway et al. [54] reported deficits in the implicit learning abilities of deaf children with CI on a color-sequence task. These authors proposed that exposure to sound, a temporally arrayed signal, provides important experience with learning sequential patterns in the environment. A lack of experience with sound at a young age may therefore delay the development of domain-general skills for processing sequential patterns, including nonauditory abilities [55]. In terms of motor sequencing specifically, Schlumberger et al. [53] found that deaf children showed delays in the development of sequential limb movement production. Another recent investigation of deaf children with CI by Conway et al. [54] revealed disturbances in the ability to perform a simple fingertip-tapping task. Our research team recently investigated the procedural learning skills of deaf adults with and without CI [56]. The serial reaction time task (SRTT [69]), a task sensitive to both explicit and implicit learning, was administered to investigate possible motor alterations subsequent to auditory deprivation. The results revealed statistically significant differences between the deaf and control groups in sequence-specific learning, with deaf subjects being less efficient than controls at acquiring sequence-specific knowledge. These results further support impaired sequential learning abilities in the deaf [54, 55].

4.2. Body Posture

Researchers have known for more than a century that changes in limb posture (such as crossing the hands) can impair performance in temporal order judgment tasks involving tactile stimuli presented to either hand (e.g., [70]). This crossed-hands deficit has been attributed to a conflict between externally anchored (i.e., visual and auditory) and anatomically anchored (i.e., somatosensory) reference systems when people localize tactile stimuli [71–73]. Considering this, it has been suggested that such posture-driven modulation of touch perception could be impaired in individuals deprived of one external sensory system, such as deaf or blind individuals [71]. Indeed, the performance of congenitally blind adults does not seem to be affected by crossing the hands, unlike that of sighted individuals [71]. This provides insight into the critical role of visual inputs in modulating the perception of touch, which may arise from the emergence of specific crossmodal links during development. However, the role of auditory inputs in the development and maintenance of this crucial processing remains unexplored.

Body posture has, however, been evaluated during balance tasks with a force platform in participants with sensorineural hearing loss. The results suggest that participants with sensorineural hearing loss have poorer balance than normally hearing participants [57–60] and tend to depend mostly on visual and somatosensory inputs [57, 59] to maintain their balance. No significant change in body posture has been revealed for deaf participants with unilateral or bilateral CI [61].

5. Discussion

The objective of this review was to survey the existing corpus of research on the effect of deafness on body-related abilities. We also considered studies of body-related abilities in CI users, since sensory deprivation, even temporary, can have an effect on the remaining senses. Multiple investigations have examined the effects of sensory deprivation on the remaining senses: the effects of deafness on visual abilities have received considerable attention [62, 63], but body-related abilities have garnered considerably less. Nevertheless, the effect of deafness on body-related processing has important repercussions, as it is suggested to be a contributing factor in the daily difficulties observed in the deaf (e.g., [3, 4]).

Auditory inputs are believed to play an important role in the development of body-related processing in the hearing (e.g., [15, 18–33]), and it has been suggested that deafness could have a dramatic impact on these processes [3, 4, 34].

There does not appear to be a global trend in the effects of deafness on body-related processes (for an overview of the reviewed articles, see Table 1). The variability between the studies surveyed in this review highlights the existing debate over the identity of the altered systems and the mechanisms that mediate adaptive or maladaptive neuroplastic changes following deafness. As shown in Table 1, comparing results between studies is particularly difficult in the deaf due to the multiple confounding factors involved in deafness. Beyond the categorization of early and late deaf and cochlear implantation, factors such as duration of deafness [74], communication strategy [75], onset of deafness [74, 76, 77], hearing aid use [78], and duration of CI use [46, 79–82] can all influence performance in the deaf. Comparing investigations across studies is complicated by this large set of variables, which are often left unreported. Moreover, the factors that may constrain or promote performance in body-related processing following deafness are still unknown.


Table 1: Overview of the reviewed articles.

Body sensations
  Object identification: ED = H [38]
  Reaction time: ED = H [39]; EDCI > H [40]; LDCI = H [40, 41]
  Discrimination of rhythmic pattern: ED = H [42]
  Sensitivity: D = H [43, 44]; CI = H [4, 45, 46]
  Orientation detection: ED > H [47]; CI = H [4, 46]
  Temporal sensitivity: ED < H [48]
  Spatial sensitivity: ED = H [48]
  Temporal onset-offset-order discrimination: ED = H [44]
  Frequency discrimination: ED = H [49]; CI = H [4, 46]
  Frequency change detection: ED > H [49]

Multisensory interactions involving touch
  Audiotactile reaction time: CI = H [40]
  Audiotactile segregation: CI ≠ H [4]
  Audiotactile integration: CI ≠ H [46]

Body movement
  Motor coordination: D = H [50]; D < H [34, 51, 52]
  Sequential limb movement: D < H [53]
  Serial-order learning: CI < H [54, 55]; D < H [56]

Body posture
  Posture: D < H [57–61]

ED: early deaf; LD: late deaf; CI: cochlear implant users; EDCI: early deaf cochlear implant users; LDCI: late deaf cochlear implant users; D: deaf and cochlear implant users confounded; H: hearing.
D = H: no population difference; D > H: deaf group demonstrating enhanced body-related abilities compared to hearing group; D < H: deaf group displaying worse body-related abilities compared to hearing group; D ≠ H: deaf group displaying significantly altered abilities compared to hearing group.

Future research on deafness and body-related processes could help further identify the role of auditory experience, whether in early or late life, in modulating such processes. These investigations will deepen our knowledge not only of the neuroplastic effects of deafness on body-related processes but also of the effects of auditory restoration. More specifically, such understanding will help to identify the systems that are altered and the mechanisms and factors that mediate adaptive or maladaptive changes following deafness. The results from these investigations will provide information complementary to the existing research examining the role of auditory input on the processing of the external world following deafness (for a review, see [1, 2]). Moreover, further investigations in this burgeoning field of research will provide additional understanding of the daily difficulties observed in the deaf. Much of our understanding of our surroundings occurs in a multisensory environment in which sensory-motor and auditory cues are present. Identifying behavioural changes in deaf individuals and CI users has direct and significant implications for recognizing the difficulties experienced in day-to-day life. Knowledge stemming from such research will allow more effective patient counselling and expectation management and enable more individualized postimplantation rehabilitation strategies.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.


References

1. D. Bavelier and H. J. Neville, “Cross-modal plasticity: where and how?” Nature Reviews Neuroscience, vol. 3, no. 6, pp. 443–452, 2002.
2. O. Collignon, F. Champoux, P. Voss, and F. Lepore, “Sensory rehabilitation in the plastic brain,” Progress in Brain Research, vol. 191, pp. 211–231, 2011.
3. S. M. Nasir and D. J. Ostry, “Speech motor learning in profoundly deaf adults,” Nature Neuroscience, vol. 11, no. 10, pp. 1217–1222, 2008.
4. S. P. Landry, J.-P. Guillemot, and F. Champoux, “Temporary deafness can impair multisensory integration: a study of cochlear-implant users,” Psychological Science, vol. 24, no. 7, pp. 1260–1268, 2013.
5. A. D. Milner and M. A. Goodale, The Visual Brain in Action, Oxford University Press, Oxford, UK, 1995.
6. J. Paillard, “Body schema and body image: a double dissociation in deafferented patients,” in Motor Control: Today and Tomorrow, G. N. Gantchev, S. Mori, and J. Massion, Eds., Academic Publishing House, Sofia, Bulgaria, 1999.
7. J. R. Lackner, “Some proprioceptive influences on the perceptual representation of body shape and orientation,” Brain, vol. 111, no. 2, pp. 281–297, 1988.
8. V. S. Ramachandran and W. Hirstein, “The perception of phantom limbs. The D. O. Hebb lecture,” Brain, vol. 121, no. 9, pp. 1603–1630, 1998.
9. E. Naito, P. E. Roland, and H. H. Ehrsson, “I feel my hand moving: a new role of the primary motor cortex in somatic perception of limb movement,” Neuron, vol. 36, no. 5, pp. 979–988, 2002.
10. H. H. Ehrsson, T. Kito, N. Sadato, R. E. Passingham, and E. Naito, “Neural substrate of body size: illusory feeling of shrinking of the waist,” PLoS Biology, vol. 3, no. 12, article e412, 2005.
11. V. I. Petkova and H. H. Ehrsson, “If I were you: perceptual illusion of body swapping,” PLoS ONE, vol. 3, no. 12, Article ID e3832, 2008.
12. H. H. Ehrsson, “The experimental induction of out-of-body experiences,” Science, vol. 317, no. 5841, article 1048, 2007.
13. H. H. Ehrsson, C. Spence, and R. E. Passingham, “That's my hand! Activity in premotor cortex reflects feeling of ownership of a limb,” Science, vol. 305, no. 5685, pp. 875–877, 2004.
14. M. S. A. Graziano, D. F. Cooke, and C. S. R. Taylor, “Coding the location of the arm by sight,” Science, vol. 290, no. 5497, pp. 1782–1786, 2000.
15. M. S. A. Graziano, L. A. J. Reiss, and C. G. Gross, “A neuronal representation of the location of nearby sounds,” Nature, vol. 397, no. 6718, pp. 428–430, 1999.
16. C. Pfeiffer, A. Serino, and O. Blanke, “The vestibular system: a spatial reference for bodily self-consciousness,” Frontiers in Integrative Neuroscience, vol. 8, article 31, 2014.
17. M. P. M. Kammers, I. J. M. van der Ham, and H. C. Dijkerman, “Dissociating body representations in healthy individuals: differential effects of a kinaesthetic illusion on perception and action,” Neuropsychologia, vol. 44, no. 12, pp. 2430–2436, 2006.
18. S. M. Nasir and D. J. Ostry, “Auditory plasticity and speech motor learning,” Proceedings of the National Academy of Sciences of the United States of America, vol. 106, no. 48, pp. 20470–20475, 2009.
19. T. Ito and D. J. Ostry, “Speech sounds alter facial skin sensation,” Journal of Neurophysiology, vol. 107, no. 1, pp. 442–447, 2012.
20. E. Gherri, J. Driver, and M. Eimer, “Eye movement preparation causes spatially-specific modulation of auditory processing: new evidence from event-related brain potentials,” Brain Research, vol. 1224, pp. 88–101, 2008.
21. A. Garg, D. Schwartz, and A. A. Stevens, “Orienting auditory spatial attention engages frontal eye fields and medial occipital cortex in congenitally blind humans,” Neuropsychologia, vol. 45, no. 10, pp. 2307–2321, 2007.
22. C. Rorden and J. Driver, “Does auditory attention shift in the direction of an upcoming saccade?” Neuropsychologia, vol. 37, no. 3, pp. 357–377, 1999.
23. K.-P. Schaefer, K.-J. Süss, and E. Fiebig, “Acoustic-induced eye movements,” Annals of the New York Academy of Sciences, vol. 374, pp. 674–688, 1981.
24. T. J. Van Grootel and A. J. Van Opstal, “Human sound-localization behaviour after multiple changes in eye position,” European Journal of Neuroscience, vol. 29, no. 11, pp. 2233–2246, 2009.
25. Z. Kapoula, Q. Yang, T.-T. Lê et al., “Medio-lateral postural instability in subjects with tinnitus,” Frontiers in Neurology, vol. 2, article 35, 2011.
26. S. H. Park, K. Lee, T. Lockhart, and S. J. Kim, “Effects of sound on postural stability during quiet standing,” Journal of NeuroEngineering and Rehabilitation, vol. 8, article 67, 2011.
27. R. G. Kanegaonkar, K. Amin, and M. Clarke, “The contribution of hearing to normal balance,” Journal of Laryngology and Otology, vol. 126, no. 10, pp. 984–988, 2012.
28. J. Valls-Solé, A. Solé, F. Valldeoriola, E. Muñoz, L. E. Gonzalez, and E. S. Tolosa, “Reaction time and acoustic startle in normal human subjects,” Neuroscience Letters, vol. 195, no. 2, pp. 97–100, 1995.
29. L. B. Oude Nijhuis, L. Janssen, B. R. Bloem et al., “Choice reaction times for human head rotations are shortened by startling acoustic stimuli, irrespective of stimulus direction,” Journal of Physiology, vol. 584, no. 1, pp. 97–109, 2007.
30. A. Queralt, V. Weerdesteyn, H. J. R. van Duijnhoven, J. M. Castellote, J. Valls-Solé, and J. Duysens, “The effects of an auditory startle on obstacle avoidance during walking,” Journal of Physiology, vol. 586, no. 18, pp. 4453–4463, 2008.
31. R. F. Reynolds and B. L. Day, “Fast visuomotor processing made faster by sound,” The Journal of Physiology, vol. 583, no. 3, pp. 1107–1115, 2007.
32. V. Jousmäki and R. Hari, “Parchment-skin illusion: sound-biased touch,” Current Biology, vol. 8, article 190, 1998.
33. K. Hötting and B. Röder, “Hearing cheats touch, but less in congenitally blind than in sighted individuals,” Psychological Science, vol. 15, no. 1, pp. 60–64, 2004.
34. P. H. Wiegersma and A. Van der Velde, “Motor development of deaf children,” Journal of Child Psychology and Psychiatry, vol. 24, no. 1, pp. 103–111, 1983.
35. D. Warwick, R. Dunn, E. Melikyan, and J. Vadher, Hand Surgery, Oxford University Press, Oxford, UK, 2009.
36. J. A. Bell-Krotoski and E. Tomancik, “Repeatability of testing with the Semmes-Weinstein monofilaments,” Journal of Hand Surgery, vol. 12, no. 1, pp. 155–161, 1987.
37. R. W. Van Boven and K. O. Johnson, “The limit of tactile spatial resolution in humans: grating orientation discrimination at the lip, tongue, and finger,” Neurology, vol. 44, no. 12, pp. 2361–2366, 1994.
38. W. Schiff and R. S. Dytell, “Tactile identification of letters: a comparison of deaf and hearing childrens' performances,” Journal of Experimental Child Psychology, vol. 11, no. 1, pp. 150–164, 1971.
39. B. Heimler and F. Pavani, “Response speed advantage for vision does not extend to touch in early deaf adults,” Experimental Brain Research, vol. 232, no. 4, pp. 1335–1341, 2014.
40. E. Nava, D. Bottari, A. Villwock et al., “Audio-tactile integration in congenitally and late deaf cochlear implant users,” PLoS ONE, vol. 9, no. 6, Article ID e99606, 2014.
41. N. Hauthal, S. Debener, S. Rach, P. Sandmann, and J. D. Thorne, “Visuo-tactile interactions in the congenitally deaf: a behavioral and event-related potential study,” Frontiers in Integrative Neuroscience, vol. 8, article 98, 2015.
42. J. Rosenstein, “Tactile perception of rhythmic patterns in normal, blind, deaf and aphasic children. Independent studies and capstones,” Paper 393, Program in Audiology and Communication Sciences, Washington University School of Medicine, St. Louis, Mo, USA, 1956.
43. A. M. Donahue and T. Letowski, “Vibrotactile performance by normal and hearing-impaired subjects using two commercially available vibrators,” International Journal of Audiology, vol. 24, no. 5, pp. 362–373, 1985.
44. T. M. Moallem, C. M. Reed, and L. D. Braida, “Measures of tactual detection and temporal order resolution in congenitally deaf and normal-hearing adults,” Journal of the Acoustical Society of America, vol. 127, no. 6, pp. 3696–3709, 2010.
45. C. M. Conway, J. Karpicke, E. M. Anaya, S. C. Henning, W. G. Kronenberger, and D. B. Pisoni, “Nonverbal cognition in deaf children following cochlear implantation: motor sequencing disturbances mediate language delays,” Developmental Neuropsychology, vol. 36, no. 2, pp. 237–254, 2011.
46. S. P. Landry, J.-P. Guillemot, and F. Champoux, “Audiotactile interaction can change over time in cochlear implant users,” Frontiers in Human Neuroscience, vol. 8, article 316, 2014.
47. R. Van Dijk, A. M. L. Kappers, and A. Postma, “Superior spatial touch: improved haptic orientation processing in deaf individuals,” Experimental Brain Research, vol. 230, no. 3, pp. 283–289, 2013.
48. N. Bolognini, C. Cecchetto, C. Geraci, A. Maravita, A. Pascual-Leone, and C. Papagno, “Hearing shapes our perception of time: temporal discrimination of tactile stimuli in deaf people,” Journal of Cognitive Neuroscience, vol. 24, no. 2, pp. 276–286, 2012.
49. S. Levänen and D. Hamdorf, “Feeling vibrations: enhanced tactile sensitivity in congenitally deaf humans,” Neuroscience Letters, vol. 301, no. 1, pp. 75–77, 2001.
50. F. Gheysen, G. Loots, and H. Van Waelvelde, “Motor development of deaf children with and without cochlear implants,” Journal of Deaf Studies and Deaf Education, vol. 13, no. 2, pp. 215–224, 2008.
51. G. J. P. Savelsbergh, J. B. Netelenbos, and H. T. A. Whiting, “Auditory perception and the control of spatially coordinated action of deaf and hearing children,” Journal of Child Psychology and Psychiatry and Allied Disciplines, vol. 32, no. 3, pp. 489–500, 1991.
52. E. Hartman, U. Houwen, and C. Visscher, “Motor skill performance and sports participation in deaf elementary school children,” Adapted Physical Activity Quarterly, vol. 28, no. 2, pp. 132–145, 2011.
53. E. Schlumberger, J. Narbona, and M. Manrique, “Non-verbal development of children with deafness with and without cochlear implants,” Developmental Medicine and Child Neurology, vol. 46, no. 9, pp. 599–606, 2004.
54. C. M. Conway, D. B. Pisoni, E. M. Anaya, J. Karpicke, and S. C. Henning, “Implicit sequence learning in deaf children with cochlear implants,” Developmental Science, vol. 14, no. 1, pp. 69–82, 2011.
55. C. M. Conway, D. B. Pisoni, and W. G. Kronenberger, “The importance of sound for cognitive sequencing abilities: the auditory scaffolding hypothesis,” Current Directions in Psychological Science, vol. 18, no. 5, pp. 275–279, 2009.
56. J. Lévesque, H. Théoret, and F. Champoux, “Reduced procedural motor learning in deaf individuals,” Frontiers in Human Neuroscience, vol. 8, article 343, 2014.
57. H. Suarez, S. Angeli, A. Suarez, B. Rosales, X. Carrera, and R. Alonso, “Balance sensory organisation in children with profound hearing loss and cochlear implants,” International Journal of Pediatric Otorhinolaryngology, vol. 71, no. 4, pp. 629–637, 2007.
58. S. L. Cushing, R. Chia, A. L. James, B. C. Papsin, and K. A. Gordon, “A test of static and dynamic balance function in children with cochlear implants: the vestibular olympics,” Archives of Otolaryngology—Head & Neck Surgery, vol. 134, no. 1, pp. 34–38, 2008.
  59. M.-W. Huang, C.-J. Hsu, C.-C. Kuan, and W.-H. Chang, “Static balance function in children with cochlear implants,” International Journal of Pediatric Otorhinolaryngology, vol. 75, no. 5, pp. 700–703, 2011. View at: Publisher Site | Google Scholar
  60. A. M. M. de Sousa, J. de França Barros, and B. M. de Sousa Neto, “Postural control in children with typical development and children with profound hearing loss,” International Journal of General Medicine, vol. 5, pp. 433–439, 2012. View at: Publisher Site | Google Scholar
  61. M. E. Eustaquio, W. Berryhill, J. A. Wolfe, and J. E. Saunders, “Balance in children with bilateral cochlear implants,” Otology and Neurotology, vol. 32, no. 3, pp. 424–427, 2011. View at: Publisher Site | Google Scholar
  62. D. Bavelier, M. W. G. Dye, and P. C. Hauser, “Do deaf individuals see better?” Trends in Cognitive Sciences, vol. 10, no. 11, pp. 512–518, 2006. View at: Publisher Site | Google Scholar
  63. F. Pavani and D. Bottari, “Visual abilities in individuals with profound deafness a critical review,” in The Neural Bases of Multisensory Processes, M. M. Murray and M. T. Wallace, Eds., CRC Press, Boca Raton, Fla, USA, 2012. View at: Google Scholar
  64. H. Frenzel, J. Bohlender, K. Pinsker et al., “A genetic basis for mechanosensory traits in humans,” PLoS Biology, vol. 10, no. 5, Article ID e1001318, 2012. View at: Publisher Site | Google Scholar
  65. G. A. Calvert and T. Thesen, “Multisensory integration: methodological approaches and emerging principles in the human brain,” Journal of Physiology Paris, vol. 98, no. 1–3, pp. 191–205, 2004. View at: Publisher Site | Google Scholar
  66. B. E. Stein and T. R. Stanford, “Multisensory integration: current issues from the perspective of the single neuron,” Nature Reviews Neuroscience, vol. 9, no. 4, pp. 255–266, 2008. View at: Publisher Site | Google Scholar
  67. G. W. Gayle and R. L. Pohlman, “Comparative study of the dynamic, static, and rotary balance of deaf and hearing children,” Perceptual and Motor Skills, vol. 70, no. 3, pp. 883–888, 1990. View at: Publisher Site | Google Scholar
  68. J. C. Siegel, M. Marchetti, and J. S. Tecklin, “Age-related balance changes in hearing-impaired children,” Physical Therapy, vol. 71, no. 3, pp. 183–189, 1991. View at: Google Scholar
  69. M. A. Perez, S. P. Wise, D. T. Willingham, and L. G. Cohen, “Neurophysiological mechanisms involved in transfer of procedural knowledge,” Journal of Neuroscience, vol. 27, no. 5, pp. 1045–1053, 2007. View at: Publisher Site | Google Scholar
  70. C. T. Burnett, “Studies on the influence of abnormal position upon the motor impulse,” Psychological Review, vol. 11, no. 6, pp. 370–394, 1904. View at: Publisher Site | Google Scholar
  71. B. Röder, F. Rösler, and C. Spence, “Early vision impairs tactile perception in the blind,” Current Biology, vol. 14, no. 2, pp. 121–124, 2004. View at: Publisher Site | Google Scholar
  72. B. Röder and F. Rösler, “Compensatory plasticity as a consequence of sensory loss,” in Handbook of Multisensory Processing, G. Calvert, C. Spence, and B. E. Stein, Eds., MIT Press, Cambridge, Mass, USA, 2004. View at: Google Scholar
  73. B. Röder, J. Föcker, K. Hötting, and C. Spence, “Spatial coordinate systems for tactile spatial attention depend on developmental vision: evidence from event-related potentials in sighted and congenitally blind adult humans,” European Journal of Neuroscience, vol. 28, no. 3, pp. 475–483, 2008. View at: Publisher Site | Google Scholar
  74. D. S. Lee, J. S. Lee, S. H. Oh et al., “Cross-modal plasticity and cochlear implants,” Nature, vol. 409, no. 6817, pp. 149–150, 2001. View at: Google Scholar
  75. S. Hirano, Y. Naito, H. Kojima et al., “Functional differentiation of the auditory association area in prelingually deaf subjects,” Auris Nasus Larynx, vol. 27, no. 4, pp. 303–310, 2000. View at: Publisher Site | Google Scholar
  76. Y. Naito, S. Hirano, I. Honjo et al., “Sound-induced activation of auditory cortices in cochlear implant users with post- and prelingual deafness demonstrated by positron emission tomography,” Acta Oto-Laryngologica, vol. 117, no. 4, pp. 490–496, 1997. View at: Publisher Site | Google Scholar
  77. A. L. Giraud, C. J. Price, J. M. Graham, and R. S. J. Frackowiak, “Functional plasticity of language-related brain areas after cochlear implantation,” Brain, vol. 124, no. 7, pp. 1307–1316, 2001. View at: Publisher Site | Google Scholar
  78. M. M. Shiell, F. Champoux, and R. J. Zatorre, “Reorganization of auditory cortex in early-deaf people: functional connectivity and relationship to hearing aid use,” Journal of Cognitive Neuroscience, vol. 27, no. 1, pp. 150–163, 2014. View at: Publisher Site | Google Scholar
  79. J. G. Nicholas and A. E. Geers, “Effects of early auditory experience on the spoken language of deaf children at 3 years of age,” Ear and Hearing, vol. 27, no. 3, pp. 286–298, 2006. View at: Publisher Site | Google Scholar
  80. C. Pantev, A. Dinnesen, B. Ross, A. Wollbrink, and A. Knief, “Dynamics of auditory plasticity after cochlear implantation: a longitudinal study,” Cerebral Cortex, vol. 16, no. 1, pp. 31–36, 2006. View at: Publisher Site | Google Scholar
  81. G. W. J. A. Damen, A. J. Beynon, P. F. M. Krabbe, J. J. S. Mulder, and E. A. M. Mylanus, “Cochlear implantation and quality of life in postlingually deaf adults: long-term follow-up,” Otolaryngology—Head and Neck Surgery, vol. 136, no. 4, pp. 597–604, 2007. View at: Publisher Site | Google Scholar
  82. M. C. Allen, T. P. Nikolopoulos, and G. M. O'Donoghue, “Speech intelligibility in children after cochlear implantation,” American Journal of Otology, vol. 19, no. 6, pp. 742–746, 1998. View at: Google Scholar

Copyright © 2016 M. S. Houde et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.