International Journal of Computer Games Technology
Volume 2018, Article ID 8734540, 14 pages
https://doi.org/10.1155/2018/8734540
Research Article

Automated Analysis of Facial Cues from Videos as a Potential Method for Differentiating Stress and Boredom of Players in Games

Fernando Bevilacqua,1,2 Henrik Engström,1 and Per Backlund1

1University of Skövde, Skövde, Sweden
2Federal University of Fronteira Sul, Chapecó, SC, Brazil

Correspondence should be addressed to Fernando Bevilacqua; fernando.bevilacqua@his.se

Received 5 December 2017; Revised 22 January 2018; Accepted 30 January 2018; Published 8 March 2018

Academic Editor: Michael J. Katchabaw

Copyright © 2018 Fernando Bevilacqua et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
