Advances in Human-Computer Interaction
Volume 2012, Article ID 210507, 8 pages
Research Article

Exploring Sensor Gloves for Teaching Children Sign Language

Kirsten Ellis and Jan Carlo Barca

Faculty of Information Technology, Monash University, Clayton Campus, VIC 3800, Australia

Received 17 March 2012; Accepted 20 June 2012

Academic Editor: Armando Bennet Barreto

Copyright © 2012 Kirsten Ellis and Jan Carlo Barca. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

References

  1. G. R. Karlan, “Manual communication with those who can hear,” in Manual Communication: Implications for Education, H. Bornstein, Ed., pp. 151–185, Gallaudet University Press, Washington, DC, USA, 1990.
  2. K. Ellis and K. Blashki, “Children, Australian sign language and the web: the possibilities,” pp. 281–287.
  3. K. Ellis and K. Blashki, “The digital playground: kindergarten children learning sign language via multimedia,” AACE Journal, vol. 15, no. 3, pp. 225–253, 2007.
  4. Gizmag, “The AcceleGlove—capturing hand gestures in virtual reality,” September 2003,
  5. J. L. Hernandez-Rebollar, N. Kyriakopoulos, and R. W. Lindeman, “A new instrumented approach for translating American sign language into sound and text,” in Proceedings of the 6th IEEE International Conference on Automatic Face and Gesture Recognition, pp. 547–552, May 2004.
  6. G. Grimes, Digital Data Entry Glove Interface Device, AT&T Bell Labs, 1983.
  7. S. S. Fels and G. E. Hinton, “Glove-Talk—a neural-network interface between a data-glove and a speech synthesizer,” IEEE Transactions on Neural Networks, vol. 4, no. 1, pp. 2–8, 1993.
  8. J. Kramer and L. Leifer, “The talking glove: an expressive and receptive ‘verbal’ communication aid for the deaf, deaf-blind, and nonvocal,” in Proceedings of the Computer Technology, Special Education, and Rehabilitation Conference, pp. 335–340, 1987.
  9. J. L. Hernandez-Rebollar, N. Kyriakopoulos, and R. W. Lindeman, The AcceleGlove: A Whole-Hand Input Device for Virtual Reality, ACM Press, 2002.
  10. D. Sturman and D. Zeltzer, “A survey of glove-based input,” IEEE Computer Graphics and Applications, vol. 14, no. 1, pp. 30–39, 1994.
  11. M. Mohandes and S. Buraiky, “Automation of the Arabic sign language recognition using the PowerGlove,” AIML Journal, vol. 7, no. 1, pp. 41–46, 2007.
  12. VRLogic, “CyberGlove,” March 2009,
  13. R. M. McGuire, J. Hernandez-Rebollar, T. Starner, V. Henderson, H. Brashear, and D. S. Ross, “Towards a one-way American sign language translator,” in Proceedings of the 6th IEEE International Conference on Automatic Face and Gesture Recognition (FGR '04), pp. 620–625, May 2004.
  14. J. L. Hernandez-Rebollar and E. Mendez, “Interactive American sign language dictionary,” in Proceedings of the ACM SIGGRAPH International Conference on Computer Graphics and Interactive Techniques, p. 26, Los Angeles, Calif, USA, August 2004.
  15. S. Simon and K. Johnson, “Improving the efficacy of motion analysis as a clinical tool through artificial intelligence techniques,” in Pediatric Gait: A New Millennium in Clinical Care and Motion Analysis Technology, pp. 23–29, 2000.
  16. Infusion Systems, “Infusion systems,” June 2012,
  17. K. Ellis, M. Quigley, and M. Power, “Experiences in ethical usability testing with children,” Journal of Information Technology Research, vol. 1, no. 3, pp. 1–13, 2007.