Journal of Sensors
Volume 2018, Article ID 9054758, 10 pages
https://doi.org/10.1155/2018/9054758
Research Article

Biological-Signal-Based User-Interface System for Virtual-Reality Applications for Healthcare

1Department of Newmedia, Seoul Media Institute of Technology, 99 Hwagok-ro 61-gil, Gangseo-gu, Seoul, Republic of Korea
2HCI and Robotics, University of Science and Technology, 216 Gajeong-ro, Yuseong-gu, Daejeon, Republic of Korea
3Graduate School of Game, Gachon University, 1342 Seongnam Daero, Sujeong-Gu, Seongnam-Si, Gyeonggi-Do 461-701, Republic of Korea

Correspondence should be addressed to Jung Yoon Kim; kjyoon79@gmail.com

Received 7 February 2018; Accepted 10 June 2018; Published 29 July 2018

Academic Editor: Banshi D. Gupta

Copyright © 2018 Sang Hun Nam et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Biosignal interfaces provide important data that reveal the physical status of a user, and they are used in the medical field for patient health-status monitoring, medical automation, and rehabilitation services. Biosignals can also be used, in conjunction with virtual reality, to develop new contents, and they are important factors for extracting user emotion or measuring user experience. A biological-signal-based user-interface system was designed, composed of sensor devices, a user-interface system, and an application, that can extract biological-signal data from multiple biological-signal devices and be used by content developers. A network-based protocol was adopted for unconstrained use of the devices, so that biological signals can be freely received via USB, Bluetooth, WiFi, or an internal system module. So that developers of healthcare contents based on virtual-reality technology can easily use biological signals, a system was developed that can extract data from multiple biological-signal devices and simultaneously extract and analyze data from a virtual-reality-specific eye-tracking device.

1. Introduction

Recently, virtual-reality technology has been effectively used in education, medicine, virtual experiments, and games, following the development of related hardware and software technologies [1, 2]. The development of real-time simulation allows virtual reality to be perceived as similar to the real world, and it is utilized in various research studies in combination with user movements [3]. In addition, gamification has been successfully applied in areas other than games because it encourages active participation of users by increasing their motivation. Exercise concentration and persistence can be increased by immersing the exerciser in the contents and adjusting the difficulty level of the exercise according to the person's ability [4]. Gamification applied to health and fitness has been found to influence user behavior changes in real life [5]. Because of the increase in the elderly population and in medical expenses, interest in monitoring health in everyday life and making exercise a part of the daily routine has been increasing [6]. The combination of virtual- and augmented-reality technologies has made this an exciting area, adding a social aspect of health and elements of fun and immersion [7]. Recently, various healthcare contents that measure user movements and exercise methods have been developed, because information on user movements can be obtained following the development of game controllers. Nintendo launched Wii Fit, a game that uses body balance via a special controller, and this technology is used in areas such as balance training, exercise, posture correction, and rehabilitation of the elderly [8, 9]. Microsoft Kinect can be installed in the user's external environment and is used for healthcare monitoring [10] because its depth sensors can receive data even at night and extract user skeletal data.
Because the mobile environment, in which smartphones are always carried, has developed, various wearable devices have been produced. Users are naturally led to a healthy life by obtaining physical information directly from wearable devices, updating health information on the smartphone, and measuring the amount of exercise. Recently, interaction with users through the five senses has become possible using virtual-reality technology, and rapid simulation of virtual information has become possible by synchronizing with user movements. Because contents applying virtual-reality technology are reinforced with social and physical interaction elements, they are recognized as a new content area and offer the possibility of unlimited development.

Biosignal interfaces provide important data that reveal the physical status of a user, and they are used not only in the medical field but also in various other areas. In the medical field, they are used in monitoring systems for early detection of dangerous situations and diseases by monitoring the patient's health status, and in medical-automation systems that provide continuous treatment or rehabilitation services. Methods that use biological signals for automatic measurement of stress and objective data collection have achieved practical results [11, 12]. The biological-signal interfaces used in traditional medical systems are now available to the general public following the development of wearable biological-signal devices. Biological-signal interfaces are used in rehabilitation, fitness, and sports training, such as practicing breathing control or training body balance [13, 14]. The application of biological-signal systems has expanded beyond medical services to various other areas, such as education, information security, and human-computer interaction (HCI), as the Internet and mobile devices have become ubiquitous. In the HCI field, research on user experience has become important. HCI experts are investigating user content experience, and the industry is measuring the user experience, researching methods of interpreting the collected data, and designing methods of evaluating contents [15, 16]. Measurement of physiological or physical performance data not only improves our understanding of physical health but also helps us better understand the experience of users by supplementing the results of other methods.

The technology that enables the measurement of biological signals is a fundamental method for creating healthcare contents that regard the biological signals of the user as important. The biological signal is then used as important data for measuring the response of users who use the contents. The present study designed a biological-signal interface system that can be employed in various applications that use biological signals. In particular, consideration was given to the special situations encountered in healthcare contents in which virtual-reality equipment, such as a head-mounted display (HMD), which has been significantly improved in recent years, is used. The rest of the present report is organized as follows. In Section 2, relevant studies that used biological signals are discussed. In Section 3, the architecture and components of the user-interface system are explained. In Section 4, the sensors and data used in the biological-signal interface system are explained. In Section 5, conclusions are drawn for the proposed system, and new research topics and directions are discussed.

2. Related Works

Biofeedback can generally be classified as direct or indirect. Direct biofeedback is a bodily response, such as body or eye movement, that can be controlled according to personal intention. Indirect biofeedback is a bodily response, such as heart rate (HR) or galvanic skin response (GSR), that cannot be altered by human intervention. Human-behavior analysis technology is widely used for HCI, security surveillance, sports engineering, and intelligent assistance for the elderly. Determining how people move is a fundamental and essential technology for monitoring their health. Sensors that measure human movements are also used in the field of human activity recognition, and two measurement methods are used: a vision-sensor-based method and a method that uses wearable devices. Even though vision-based technology accurately reflects the movements of a user, it suffers from a spatial restriction in that external cameras have to be installed. User movements can be measured because inertial sensors can easily be integrated into wearable devices such as smartphones, following the recent development of microelectromechanical-systems technology. Although the accuracy and the locations where movements can be measured vary depending on the number and placement of sensors, the technology has lately developed to the level that user motions can be captured [17]. Research on hands and upper limbs and on gait analysis for rehabilitation purposes has been conducted [18, 19]. Research on interactive interfaces using gestures for handicapped people has also been carried out. Eye tracking refers to tracking the user's eye movements or the point of gaze. It can help physically handicapped people through technologies for controlling wheelchairs, telepresence, and teleoperation, and it has been used in medical and psychological research as a tool to record and study visual behavior [20].
Eye tracking collects important physical information that allows identification of user intention and is used in wide-ranging areas such as psychological research, medical diagnosis, and user and interactive gaze-control applications [21]. Recent tracking devices are noninvasive and perform tracking based on the image and the light reflected from the cornea. In addition to gaze direction and eye-movement patterns, measurements of pupil size and microsaccades may contribute to the interpretation of the emotional and cognitive status of the user. In contrast to visual and movement data, which change according to personal intention, biological-signal information such as body temperature and HR can provide not only user health information but also bodily responses that change according to environmental changes [22]. Because noninvasively detecting, collecting, and processing various types of body-related data, such as the electrical, heat, and optical signals produced by the body, have recently become possible, medical expenses can be greatly reduced by enabling the prevention and remote detection of health issues [23]. The extraction of noninvasive biological signals has been interlinked with miniaturized sensor technology, leading to the development of wearable devices, and has been integrated with artificial-intelligence technology; as a result, the health-monitoring field is developing significantly. In the past, patient biological signals were monitored and analyzed using equipment installed in the hospital. Many systems that can continuously monitor health have been developed following the development of various types of wearable sensor technology.
Because the method of measuring brainwaves through contact with the head has developed sufficiently to conveniently collect a user's brainwave information through wireless electroencephalography headsets, the application of such user interfaces has been expanding not only in the medical field but also in various other areas, including entertainment and virtual reality [24]. Using biological signals as a user interface, research on controlling a computer cursor by measuring the electromyography (EMG) and electrocardiography (ECG) data of a patient is being conducted [25]. In addition, the sympathetic nerves of the autonomic nervous system increase the HR in response to sudden, strenuous exercise or to fear and anger in emergency situations, whereas the parasympathetic nerves decrease the HR in the resting state. The autonomic nervous system is closely related to the biological signals of heartbeat, blood pressure, respiration, emotion, and body temperature.

Virtual-reality technology is leading to the creation of a new healthcare environment through interfaces with medical technology; this is because virtual-reality technology can be used to visualize three-dimensional (3D) data and enable user interaction in various ways by combining auditory, touch, and haptic technology. Healthcare data such as computed tomography (CT) images, magnetic resonance imaging (MRI), cryosection images, and confocal-microscopy images are 3D images; thus, they can be analyzed by visualizing them on a monitor [26]. Furthermore, 3D data are easily transmitted to a virtual-reality system and can be visualized and analyzed using 3D-display equipment such as an HMD. These 3D data are also utilized in medical education in areas including surgery, dentistry, and nursing, where they are used to design interactive environments using various equipment [27]. Virtual-reality technology has also provided a new direction in biomechanics and rehabilitation, as realistic sensory experiences are delivered and scenarios are recreated using virtual-reality-based analysis methods with existing hardware that measures a user's movement [28]; in particular, such technologies are integrated with wearable equipment, IoT technology, artificial intelligence, and cloud services to develop smart healthcare services that can automatically monitor users' health in the spaces they use, such as homes and surrounding environments.

3. Biological User-Interface System Design

Here, we explain the architecture of the biological-signal system and the roles and limitations of its layers. As Figure 1 shows, the architecture has three layers: sensors, system, and applications. Each layer provides various services according to the application using the system and its requirements.

Figure 1: Biological-signal-based user-interface system architecture.

The first layer is composed of various types of sensor devices that can detect electrical, heat, chemical, and other signals from the user's body. Most of these sensors, for example, ECG and EMG sensors, can directly collect biological signals. In addition, some sensors, such as accelerometers, collect raw data from which health-related information can be extracted, and by combining these data with the user's research purpose or with other biological signals, the intention behind the biological signals can be more accurately predicted. Table 1 lists the data and provides a brief explanation of the sensors supported by the proposed system. Most sensor devices are equipped with microprocessors and functions to convert raw biological-signal data. Devices such as eye trackers process the data while connected to a computer. The sensor-device layer transmits data to the biological-signal processing system over a wired or wireless connection. The second layer is the biological-signal system. To adapt to a research environment in which a considerable number of diverse virtual-reality wearable devices are released, the system was designed so that devices measuring different vital signs can be added or removed depending on the purpose of the application. In addition, to preserve the independence of the sensor-device layer, the system includes an interface in which two-way data transmission occurs between the network and the modules controlling these devices, thus bridging the sensor layer to the biological-signal system.

Table 1: Biological signal.

These systems can be constructed in various forms, such as on smartphones and computers, and support network connections such as cellular, IEEE 802.11 wireless, and Bluetooth interfaces [29]. A virtual-reality healthcare application must support wireless communication as well, because it should be able to measure vital signs even when the subject is moving around freely. The biological-signal system transmits biological signals to the application layer, but it can also process existing information and transmit the processed data. For example, even though the eye-tracking module transmits camera image data, the system provides processed data produced by tracking the pupil or measuring its size in the image data. The application layer can be classified into a real-time method and a long-term data storage and analysis method, according to how the biological signals are used. The real-time method is used when the user's health status is measured and analyzed from the perspective of UI/UX, or when appropriate interactions should be regenerated in real time in the virtual-reality application based on the user's health status. The long-term data-analysis method allows the creation of various biological-signal-based applications, mainly with the application developer collecting information transmitted to a cloud server through the Internet or supported in the form of the Web.
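The two-way, network-based socket interface between the client module and applications can be sketched as follows. The wire format below (a one-byte sensor identifier plus a one-byte start/stop command, with two-byte integer samples in responses) is an assumption for illustration only; the paper does not publish the actual protocol, and the sensor id used is hypothetical.

```python
import struct

# Hypothetical wire format for the client module's socket interface:
#   request  = 1-byte sensor id + 1-byte command (1 = start stream, 0 = stop)
#   response = 1-byte sensor id + 2-byte little-endian signed sample
SENSOR_GSR = 0x05  # assumed identifier for the GSR sensor

def encode_request(sensor_id: int, start: bool) -> bytes:
    """Pack a start/stop request for one sensor stream."""
    return struct.pack("<BB", sensor_id, 1 if start else 0)

def decode_sample(packet: bytes) -> tuple:
    """Unpack a (sensor_id, value) sample sent by the client module."""
    sensor_id, value = struct.unpack("<Bh", packet)
    return sensor_id, value
```

Because the interface is an ordinary socket, the same request/response code works whether the client module runs on the local machine or on a remote system reachable over the network.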

4. Biological System

In the past, biological-signal monitoring was essential only in critical situations in which intensive care was necessary for the survival and recuperation of a patient. However, with advances in sensor technology, it is now utilized in other fields, such as psychology, kinematics, and modern physiology. The present study explains the technology for obtaining and analyzing biological signals that can be used for health-management, rehabilitation, and training contents integrated with virtual-reality technology. The biological-signal data supported by the proposed system are listed in Table 1, and additional biological signals can be obtained in connection with the system, provided the network protocols are supported. Here, the medical uses of the biological-signal measurements supported by the system and the method of using contents linked to virtual-reality technology are explained.

4.1. Biological Data

Biological signals were collected and utilized as data. For biological-signal measurement, the MySignals Hardware Development Platform was used. As Figure 2 shows, 11 biological-signal sensors could be used, and an emergency (rescue) signal from the user could be received. Each sensor was connected using an Arduino UNO and the MySignals Shield, and the corresponding data were collected. The data collected from each sensor are transmitted to the client module on the PC by serial communication using a Bluetooth low-energy module, and the client receives the sensor data it requested, as shown in Figure 3. The structure of the sensor data is defined differently according to the characteristics of each biological sensor. The client modules are designed and implemented based on the network scheme and the specific interface of the biological hardware, as shown in Figure 1. However, the client module provides a network-based socket interface so that the same interface can be used internally on the same system or on a remote system connected to the network.

Figure 2: MySignals Hardware Development Platform.
Figure 3: Configuration of a biometric sensor network system.

Transmitting all sensor data in real time was not possible owing to the hardware limitations of the Arduino UNO and the MySignals Shield. Only the requested data were transmitted, by activating only the sensors required by the client, as shown in Figure 4. Thus, transmission of the essential data was ensured and was not interrupted by the transmission of unnecessary data, which prevented computing resources from being wasted on unnecessary computation. In addition, the number of bytes of sensor data transmitted was minimized. For example, in the case of the GSR sensor, floating-point data were obtained from the conductance value. If these data were transmitted as they were, a minimum of four bytes would have to be transmitted; however, when the data were converted into integer values using a conversion equation, only two bytes needed to be transmitted. Thus, transmission efficiency was increased by reducing the amount of data transmitted, converting the floating-point values into integers using a conversion rule.
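The four-byte-to-two-byte conversion described above can be sketched as follows. The scale factor of 100 (i.e., 0.01 µS resolution for the conductance) is an illustrative assumption; any fixed-point scale that keeps the value within a 16-bit range would serve.

```python
import struct

# The GSR conductance arrives as a floating-point value (microsiemens).
# Sending it as a raw float costs 4 bytes; scaling it to a 16-bit integer
# halves the payload. SCALE = 100 (0.01 uS resolution) is an assumption.
SCALE = 100

def pack_gsr(conductance_us: float) -> bytes:
    """Convert a conductance reading to a 2-byte integer payload."""
    return struct.pack("<H", round(conductance_us * SCALE))

def unpack_gsr(payload: bytes) -> float:
    """Recover the conductance value on the receiving side."""
    return struct.unpack("<H", payload)[0] / SCALE

payload = pack_gsr(3.27)
assert len(payload) == 2   # versus 4 bytes for struct.pack("<f", 3.27)
assert unpack_gsr(payload) == 3.27
```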

Figure 4: Sequence diagram for biosignal data request.

The body-position data enable analysis of the movements caused by specific diseases, such as sleep apnea and restless-leg syndrome, and they help determine sleep patterns through the analysis of movements during sleep. Body-position sensors can also help detect syncope or falls of elderly, infirm, or handicapped people [30]. The data input from the sensor can indicate six states ((1) supine, (2) left, (3) right, (4) prone, (5) standing or sitting, and (6) undefined), and by acquiring acceleration data from the built-in triple-axis accelerometer, additional research can be conducted, as shown in Figure 5.

Figure 5: Body-position tracking and 3-axis acceleration signals.
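A minimal sketch of how the six body-position states can be derived from a triple-axis accelerometer reading (in g, dominated by gravity when the body is still): the dominant axis and its sign select the state. The axis convention here is an assumption for illustration, with the sensor worn on the chest, +x toward the head and +z out of the chest; the actual sensor classifies the states internally.

```python
# Map a triple-axis accelerometer reading to one of the six states.
# Assumed orientation: chest-worn sensor, +x toward head, +z out of chest.
STATES = ("supine", "left", "right", "prone", "standing or sitting", "undefined")

def body_position(ax: float, ay: float, az: float, thresh: float = 0.6) -> str:
    if ax > thresh:
        return "standing or sitting"   # gravity along the body's long axis
    if az > thresh:
        return "supine"                # lying face up
    if az < -thresh:
        return "prone"                 # lying face down
    if ay > thresh:
        return "left"                  # lying on the left side
    if ay < -thresh:
        return "right"                 # lying on the right side
    return "undefined"                 # no dominant axis, e.g., movement
```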

Monitoring skin and body temperature is medically very important because many diseases are accompanied by changes in body-temperature characteristics. Body temperature is an important measurement factor in research areas such as cardiac surgery and sleep and circadian rhythms, and it is used for monitoring patient status and predicting risks [31, 32]. As shown in Figure 6, body temperature can be measured using various sensors, such as infrared (IR) sensors, thermistors, and thermocouples, and it can be monitored without inconveniencing the user's daily life when a wearable device is used [33].

Figure 6: Body temperature signals.

EMG is a diagnostic technique that measures and records the electrical activity produced by skeletal muscles. An electromyograph detects the electric potentials created by muscle cells when they are electrically or neurologically activated and produces electromyograms, as shown in Figure 7. By analyzing the electromyogram, medical abnormalities or movement biomechanics can be analyzed. EMG is used in various clinical and biomedical tests, such as diagnosing neuromuscular diseases, and as a research tool in exercise science. Because of the development of EMG detection technology, comfortable data collection without attaching electrodes to the body, for example with Myo products, has become possible. As a contactless interface, numerous applications of the technology are expected [34, 35].

Figure 7: EMG signals.

ECG is one of the medical tests most commonly used in modern medicine. Because ECG can measure HR in beats per minute and represent heart activity exactly in waveforms, as shown in Figure 8, accurate monitoring of HR variability is possible [36, 37]. ECG plays a very important role in the diagnosis of cardiac diseases, from myocardial ischemia and infarction to syncope and palpitations. An abnormal change in respiration or breathing rate is one of the broad indicators of major physiological instability, as shown in Figure 9. Hypoxemia and apnea symptoms can be diagnosed by monitoring the patient's condition using the breathing rate.

Figure 8: ECG signals.
Figure 9: Air flow signal graph.
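The HR variability mentioned above is commonly quantified from the intervals between successive R peaks of the ECG waveform (RR intervals). The two time-domain measures sketched below, SDNN and RMSSD, are standard in the HRV literature; the paper does not specify which measures its system computes, so this is illustrative.

```python
import math

def sdnn(rr_ms):
    """Standard deviation of the RR intervals (in ms)."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((r - mean) ** 2 for r in rr_ms) / len(rr_ms))

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences (in ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))

rr = [812, 790, 805, 821, 798]   # example RR intervals in milliseconds
print(round(sdnn(rr), 1), round(rmssd(rr), 1))
```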

GSR is a method of measuring the electrical conductivity of the skin, and the result varies depending on the level of moisture as shown in Figure 10. Because sweat glands are controlled by the sympathetic nervous system and a moment of strong emotions changes the electrical resistance of the skin, skin conductivity is used as an indicator of psychological or physiological arousal.

Figure 10: GSR signal graph.

Blood pressure is the pressure of the blood in the arteries as the blood circulates through the body with each heartbeat. Blood pressure is recorded as two numbers: systolic and diastolic pressure. Because high blood pressure can lead to serious problems such as cardiac arrest, cerebral stroke, or renal disease, it is one of the important physical indicators that must be regularly checked [38]. Pulse oximetry is a noninvasive measurement method that shows the arterial oxygen saturation of hemoglobin, as shown in Figure 11. It measures the amount of oxygen dissolved in the blood using two wavelengths with different absorption coefficients, namely, 660 nm (red light spectrum) and 940 nm (IR light spectrum), based on the detection of oxyhemoglobin and deoxyhemoglobin. Pulse-oximeter sensors are required for analyzing the treatment, surgery, and recovery of critically ill patients and for assessing the oxygen uptake and the effects of supplemental oxygen on emergency patients. Pulse oximetry is less accurate than ECG but is effective for long-term monitoring and is widely applied because it can be used in smart watches and wristbands [39].

Figure 11: SpO2 signals.
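The two-wavelength principle above is typically reduced to a "ratio of ratios" R of the pulsatile (AC) to steady (DC) absorbance at 660 nm and 940 nm. The linear calibration SpO2 = 110 - 25R used below is a widely quoted classroom approximation, not the calibration of any particular device; real oximeters use empirically fitted, device-specific curves.

```python
# Pulse-oximetry sketch: estimate SpO2 from the red and IR AC/DC components.
# The linear calibration 110 - 25*R is an illustrative approximation only.
def spo2_estimate(ac_red, dc_red, ac_ir, dc_ir):
    r = (ac_red / dc_red) / (ac_ir / dc_ir)   # "ratio of ratios"
    return max(0.0, min(100.0, 110.0 - 25.0 * r))

# Equal pulsatile fractions at both wavelengths (R = 1) give 85%:
print(spo2_estimate(0.02, 1.0, 0.02, 1.0))
```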

Glucometer sensors can help manage the blood glucose of diabetic patients by measuring the blood-sugar level. Diabetic patients should manage their blood glucose through periodic measurement. Spirometry is the most common pulmonary-function test; it examines lung function by measuring the amount and speed of air that can be inhaled and exhaled. It measures breathing capacity, which is useful for evaluating conditions such as asthma, pulmonary fibrosis, cystic fibrosis, and chronic obstructive pulmonary disease. Snoring is a major symptom of obstructive sleep apnea. To analyze snoring, data must be collected at a high sampling rate, and a large amount of data is processed in most studies because snoring is analyzed using the acoustic characteristics of sound collected through a microphone. The system designed in the present study employed a hidden Markov model- (HMM-) based analysis method that detects snoring using a piezo sensor attached to the neck, as shown in Figure 12. The short-term Fourier transform and the short-term energy are calculated and input to the HMM, and the results are classified into snoring, noise, and silence. Although it is not a biological-signal measurement, preparation and support for emergencies can be ensured by having patients wear an alarm and an emergency button as a pendant or on their wrists.

Figure 12: Snoring signals.
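The feature-extraction step feeding the HMM can be sketched as follows: the piezo signal is split into short frames, and each frame yields a short-term energy value and a low-frequency spectral magnitude (here a single DFT bin, standing in for the short-term Fourier transform). The frame length and bin choice are illustrative assumptions; the paper does not give these parameters.

```python
import math

def frame_features(signal, frame_len=64, bin_k=2):
    """Per-frame (short-term energy, |DFT bin k|) observations for an HMM."""
    feats = []
    for start in range(0, len(signal) - frame_len + 1, frame_len):
        frame = signal[start:start + frame_len]
        energy = sum(x * x for x in frame) / frame_len
        # magnitude of DFT bin k of this frame
        re = sum(x * math.cos(2 * math.pi * bin_k * n / frame_len)
                 for n, x in enumerate(frame))
        im = -sum(x * math.sin(2 * math.pi * bin_k * n / frame_len)
                  for n, x in enumerate(frame))
        feats.append((energy, math.hypot(re, im)))
    return feats
```

Each (energy, magnitude) pair becomes one observation in the HMM sequence; silence frames cluster near (0, 0), while snoring frames show high energy concentrated at low frequencies.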
4.2. Eye-Tracking Data

Assessing human interest and interpreting eye movements as an input mode have been proposed as advanced methods of interaction between humans and computers. Previous case studies in Web usability, marketing, medicine, video games, psychology, and neurology suggest that an eye tracker can be used. In the case of virtual-reality contents, eye-tracking modules are installed inside the HMD, as shown in Figure 13, because they use the same display devices as the HMD. Further, because head-mounted eye-tracking devices provide higher accuracy in measuring ocular fixations, saccades, and pupil dilation, they are widely used in psychology and cognitive-science research [40].

Figure 13: Eye-tracking camera module in HMD.

Figure 14 shows how the pupil of a user is extracted from the IR images transmitted from the cameras. The first step of the pupil-extraction algorithm finds dark regions in the IR image using binarization. Dark regions appear in several places, depending on the state of the image acquired inside the HMD. The next step extracts the contours of the candidate regions. Because an extracted contour does not show a perfect pupil shape, owing to distortions such as image noise, ellipse fitting is performed using the prior information that the pupil is approximately elliptical. Small ellipses are removed to eliminate outliers. A heuristic algorithm is then applied, based on the assumptions that the pupil cannot be located at the edge of the image, that the pupil is large, and that it is located near the center of the image. The pupil data are back projected using the focal length of the cameras and the extracted ellipse, and the 3D direction vector is calculated through this back-projection process. The user's gaze direction can be measured using the 3D pupil data, and the pupil size can be monitored in real time using the 3D ellipse of the pupil.

Figure 14: Pupil extraction from the IR camera images.
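The binarize-then-fit pipeline can be illustrated on a synthetic IR image (a 2-D list of grayscale values): threshold to find dark pixels, then approximate the pupil "ellipse" by the centroid and per-axis spread of the dark region. This is a simplified stand-in for the contour tracing and ellipse fitting described in the text (which would typically use a library such as OpenCV); the threshold value is an assumption.

```python
def find_pupil(image, dark_thresh=50):
    """Return (cx, cy, rx, ry) for the dark region, or None if none found."""
    pts = [(x, y) for y, row in enumerate(image)
                  for x, v in enumerate(row) if v < dark_thresh]
    if not pts:
        return None
    cx = sum(x for x, _ in pts) / len(pts)   # centroid = ellipse center
    cy = sum(y for _, y in pts) / len(pts)
    rx = max(abs(x - cx) for x, _ in pts)    # semi-axis along x
    ry = max(abs(y - cy) for _, y in pts)    # semi-axis along y
    return (cx, cy, rx, ry)

# 8x8 bright image with a dark 2x2 "pupil" at rows 3-4, columns 3-4
img = [[200] * 8 for _ in range(8)]
for y in (3, 4):
    for x in (3, 4):
        img[y][x] = 10
print(find_pupil(img))   # centered at (3.5, 3.5)
```

In the real pipeline, the fitted ellipse center and axes would then be back projected with the camera focal length to obtain the 3D gaze direction.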

Figure 15 shows the application developed to track the user's eye movements and monitor photoplethysmogram (PPG) information. A green dot superimposed on the image shown in the HMD represents the user's gaze point; simultaneously, the PPG signal is monitored. Figure 16 shows the application that analyzes the biological signals from the user. The top portion shows the changes in the eyeball position transmitted from the user, with the size of the circle representing changes in pupil size. The bottom portion shows the changes in the user's PPG. The user can simultaneously analyze many biological signals extracted under a specific condition through the timeline function.

Figure 15: Eye tracking and PPG capturing application.
Figure 16: Eye tracking and PPG analysis application.

5. Conclusion

Biological-signal-based systems are widely distributed along with wearable devices, and the range of their applications is also wide. However, recent biological-signal modules support only the wearable devices supplied by each company, and, to secure and protect data, the trend is not to provide researchers with real-time biological-signal data. Investigators conducting research with biological-signal data can control biological-signal hardware by themselves, but doing so is not easy for people creating contents. A biological-signal-based user-interface framework was therefore designed, developed, and tested for those who want to create contents using biological signals. Because it is designed to support many biological hardware devices through a network-based internal interface, this system supports researchers who investigate the relationships among many biological signals using various complementary sensor devices and who require analysis of biological signals. In particular, this system can be extended to the creation of virtual-reality contents, which are influencing the medical field, various forms of entertainment, and the education field.

Immersive virtual-reality contents maximize user involvement, users can participate while moving around, and new forms of contents can be developed using the biological-signal-based user-interface system. Biological-signal feedback can be obtained from the user in real time while the user is engaged with the contents, and the learning effect can be increased by changing the contents after analyzing their elements and evaluating the biological signals. This study provides a new opportunity for understanding the interaction between humans and computers by designing a system that provides biological signals; in a future study, we will attempt to apply it to user experience by analyzing the relationship between the user's biological signals and emotions. The traditional method of user evaluation has been interviews or questionnaires. Because virtual-reality contents are experienced while users move around wearing the HMD, the sense of visual and behavioral immersion increases, and greater concentration and physical strength may be consumed. Because there are large psychological and physical differences between before and after experiencing virtual-reality contents, this can affect the evaluation process. User evaluation based on the user's biological signals can instead be performed while the contents are being experienced.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there is no conflict of interest regarding the publication of this paper.

Acknowledgments

This research was supported by the Ministry of Culture, Sports, and Tourism (MCST) and the Korea Creative Content Agency (KOCCA) in the Culture Technology (CT) Research and Development Program 2017 (R2017050041_00000001).
