Review Article | Open Access

Rasha M. Al-Eidan, Hend Al-Khalifa, Abdul Malik Al-Salman, "A Review of Wrist-Worn Wearable: Sensors, Models, and Challenges", Journal of Sensors, vol. 2018, Article ID 5853917, 20 pages, 2018.

A Review of Wrist-Worn Wearable: Sensors, Models, and Challenges

Academic Editor: Evangelos Hristoforou
Received: 01 Jun 2018
Revised: 13 Oct 2018
Accepted: 13 Nov 2018
Published: 19 Dec 2018


Wearable technology impacts the daily life of its users. Wearable devices are defined as devices embedded within clothes, watches, or accessories. Wrist-worn devices, as one type of wearable device, have gained particular popularity: they allow quick access to vital information, and they are suitable for many applications. This paper presents a comprehensive survey of wearable computing as a research field and provides a systematic review of recent work specifically on wrist-worn wearables. We focus on wrist-worn wearable studies because systematic literature reviews in this area are lacking. This study reviewed journal and conference articles from 2015 to 2017, with some studies from 2014 and 2018, resulting in a selection of 54 studies that met the selection criteria. The literature shows that research on wrist-worn wearables spans three domains, namely, user interface and interaction studies, user studies, and activity/affect recognition studies. Our study concludes with challenges and open research directions.

1. Introduction

Wearable technology impacts the daily life of its users. On a daily basis, humans perform many physical and cognitive activities, such as decision-making, eating, studying, walking, and communication with others. New technologies are involved in many aspects of our lives, such as communication (through social networks) or shopping (through e-commerce websites). In 1995, a new field of research called affective computing, which considers human affects [1], was introduced by Picard.

Wearable technology, as a type of affective computing, is mainly used for activity recognition [2, 3] and feeling or affect detection [4, 5]. Wristwear device technology has been studied more recently, for example, the WearWrite system for smartwatches [6]. Tomo is an example of an ad hoc wristwear system that uses hand gesture recognition [7]. Some other studies expand the interface of commercial devices, such as [8], where the interface of a wristwear device is extended to the user’s skin. In this article, we aim to provide a review of the studies based on only wrist-worn devices (WWDs).

The field of wearable computing has spawned many conferences and research groups. The Conference on Human Factors in Computing Systems, the International Symposium on Wearable Computers, and the Enterprise Wearable Technology Summit are examples of high impact conferences. Also, popular research groups exist at Carnegie Mellon, Columbia University, Georgia Tech, MIT, Bremen University, Darmstadt University, ETH Zurich, Lancaster University, University of South Australia, and NARA in Japan.

The structure of this paper is as follows: Initially, in Section 2, we provide background information on wearable computing (definitions, fields, devices, etc.). Then, in Section 3, we explain the methodology of this literature review, which includes search strategy and inclusion criteria. Section 4 presents a general overview of recent review studies of wearable computing. Section 5 focuses on experimental papers that are based on wrist-worn wearables. Sections 6 and 7 present discussion, challenges, and open directions. At the end of this review, Section 8 presents our conclusion and future work.

2. Background

Wearable computers are any devices that can be worn on the body. There is no specific definition of wearable computers, but they can be defined by their distinct characteristics [9]. Rhodes in 1997 and Hendrik Witt in 2008 defined wearable computers by describing many of their properties, such as portability, limited capability, context awareness, operational constancy, and hands-free or limited use of hands. In 2014, Genaro Motti et al. gave a simple definition of wearable computers as body-worn devices, such as clothing and accessories, that integrate computational capabilities to provide specific features to users [10]. The term wearables, as well as the terms wearable technology and wearable devices, is indicative of consumer electronics technology that is based on embedded computer hardware that is built into products that are worn on the outside of one’s body [11].

In 1980, Professor Steve Mann built a prototype of a wearable personal computer-imaging system [12]. It consisted of a lens, a mirror, a partly silvered mirror, reflections off eyeglasses, and two antennas for communication. Over 16 years, from 1980 to 1997, his system passed through many iterations, eventually becoming a prototype consisting of eyeglasses, a handheld control, and a computer worn on the back under the shirt.

The first report on a wearable computer was authored by Thad Starner in 1995 and was called “The Cyborgs are Coming” [9]. His concern was with wearable computer interfaces, and he identified two main characteristics: persistence and constancy [9]. Persistence describes the permanent availability of wearable computers and the ability to use them while simultaneously performing other tasks. Constancy describes how one wearable computer can be used in every situation.

In 1998, Professor Kevin Warwick had a sensor implanted in the median nerve of his left arm [9]. This work was applied to controlling a wheelchair and an artificial hand by measuring the transmitted signals and creating artificial sensations through electrodes on the arm.

Wearable computing is not an independent research area; research questions from different disciplines must be raised depending on the goal of a study. As shown in Figure 1, the three significant fields that contribute to wearable computing are computer science, electrical engineering, and psychology [9]. Artificial intelligence, human-computer interaction (HCI), and hardware design are branches of computer science. HCI is strongly related to psychology, while hardware design has its roots in electrical engineering. Wearable computer interfaces are essentially related to HCI and psychology rather than electrical engineering [9].

Wearable computers are related to the Internet of things (IoT). The IoT is the concept that everything that can be connected will be connected [13]; it evolved from both ubiquitous computing and pervasive computing [13]. Wearable devices provide this connectivity: they are IoT devices in the sense that they are always connected to the Internet, even if only through an intermediary device such as a phone or a tablet. Many people already own or plan to purchase wearables for fitness or medical reasons, and eventually, wearables will become essential work tools.

Wearable hardware devices include smartwatches, smart glasses, textiles (also called smart fabrics), hats, caps, shoes, socks, contact lenses, earrings, headbands, hearing aids, and jewelry, such as rings, bracelets, and necklaces [11].

Over the course of our literature review, we observed that the most investigated wearable devices are as follows:
(i) Smartwatches
(ii) Smart eyewear (e.g., smart glasses and head-mounted displays) [14]
(iii) Egocentric vision devices [15]
(iv) Light-based devices (e.g., LED) [16–24]
(v) Fabrics, textiles, and skin-based devices [25–28]
(vi) Tactile gloves [29]
(vii) Hair- and nail-based devices [30]
(viii) Magnetic inputs (e.g., Google Cardboard) [31, 32]

The main challenges for these wearable devices are networking, power and heat, display, and mobile input. They should be affordable for low-income earners, small in size, and frugal with battery power [33]. They can communicate via wireless technologies such as Wi-Fi, Bluetooth, Zigbee, and NFC [33].

3. Methodology

In this research, we review studies on wearables and focus specifically on WWD studies. In order to find the most recent and representative papers, our search strategy and inclusion criteria are as follows.

3.1. Search Strategy

This section describes our method for obtaining literature on wearable computers from books, theses, and recent journal and conference articles, with a particular focus on WWDs. As can be seen in Figure 2, our search method can be divided into two categories: organization/individual-based and keyword-based. Next, we explain each in detail.

3.1.1. Organization/Individual-Based Search

This search included university websites, conferences, research groups, and staff homepages. For university websites, we obtained the top universities in computer science as ranked by the Top Universities website and the Complete University Guide. Within each university website, we searched for research groups using terms such as HCI, ubiquitous computing, people and technology, wearable, and IoT. Then, we searched group publications, staff pages, and any related links. For example, Georgia Tech hosts the People and Technology research group, which develops technologies related to health care, modern society, education, and community [34] and contains “interdisciplinary teams of computer scientists, system scientists, and engineers partnered with psychologists, sociologists, architects, designers, economists, medical professionals, government officials, and others to develop technologies that empower people in all walks of life” [34]. Similarly, for each conference, we examined all papers presented at the venue, using the venue’s website or the ACM conference proceedings, for example, ISWC 2016 [35], ISWC 2015 [36], and UbiComp 2016 [37].

3.1.2. Keyword-Based Search

This search used certain search terms on popular online libraries and portals. We considered the following websites: ACM, IEEE, ScienceDirect, Springer, and Google Scholar. The following general terms were used for searching: wearable computer, wearable, smart textile, smart light, smart watch, sensor devices, mobile and wearable, child and wearable, elderly and wearable, activity recognition and wearable, and wrist-worn. Specific terms used for the health domain as an example were wearable and health, mental health, wearable and bipolar disorder, and wearable and dementia. Our systematic reviewing protocol is illustrated in Figure 2.

3.2. Inclusion and Exclusion Criteria

After the searching process, we followed certain criteria in order to select the appropriate articles for review. Figure 3 shows a flow diagram for the inclusion and exclusion criteria based on the PRISMA statement.

This study is aimed at reviewing recent work; therefore, we searched for journal and conference articles published between 2015 and 2017, although a few studies from 2014 and 2018 are also included. Once duplicate records were removed, we screened each paper by reading its abstract and skimming the text. Then, we grouped the papers by type: review papers and experimental papers.

Review papers included all papers that review wearables in any domain. We discuss this classification in detail in the next section. After finding the relevant review papers, we observed that few of them were based solely on wrist-worn wearables.

Therefore, for experimental papers, we excluded any papers that did not discuss wrist-worn wearables, such as ones on textiles or eyeglasses only. Finally, we read each article in full and classified them based on the classification schema that we introduce in the next section. Figure 4 shows how we clustered papers in our systematic review. In the first phase, we grouped papers in terms of their type. Then, we further classified each paper based on study domain or topic.
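As an illustration, the screening steps described in this section (deduplication, year filtering, and grouping by paper type) can be sketched as a small filtering pipeline. The record fields, the strict year window, and the sample data below are hypothetical and not drawn from the actual review corpus.

```python
# Hypothetical sketch of a PRISMA-style screening pipeline: deduplicate
# records, restrict by publication year, then split by paper type.

def screen(records, year_min=2015, year_max=2017):
    # Remove duplicates by (title, year).
    seen, unique = set(), []
    for r in records:
        key = (r["title"].lower(), r["year"])
        if key not in seen:
            seen.add(key)
            unique.append(r)
    # Keep papers in the target window (the actual review also retained a few
    # 2014/2018 studies by hand; this sketch applies the window strictly).
    in_window = [r for r in unique if year_min <= r["year"] <= year_max]
    # Group by type for the next classification phase.
    reviews = [r for r in in_window if r["type"] == "review"]
    experimental = [r for r in in_window if r["type"] == "experimental"]
    return reviews, experimental

records = [
    {"title": "A", "year": 2016, "type": "review"},
    {"title": "A", "year": 2016, "type": "review"},       # duplicate record
    {"title": "B", "year": 2013, "type": "experimental"}, # outside window
    {"title": "C", "year": 2017, "type": "experimental"},
]
reviews, experimental = screen(records)
print(len(reviews), len(experimental))  # 1 1
```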

4. Review Papers

After reviewing the literature on wearable computing, we found that many studies are from different disciplines. For the most recent studies from 2015 to 2018, we summarized thirteen of them, as shown in Table 1, which gives a comparison of studies according to year, number of studies, domain, and outcomes.

Study | Domain | Number of reviewed studies | Objectives and outcomes

2015 [38] | Health (rehabilitation and impairment) | Not available (N/A) | (i) Wearable haptic devices are implemented for different clinical applications, including rehabilitation, prosthetics, vestibular loss, osteoarthritis, vision loss, and hearing loss. (ii) Haptic wearables need development based on clinical needs, multimodal haptic displays, low battery requirements, and long-term usage.
2015 [39] | General health | N/A | (i) Reviews recent developments and applications of low-power technologies in wearable telecare and telehealth systems, dividing approaches into hardware-based and firmware-based. (ii) These systems first need to be realized in the wild; then power efficiency can be increased. (iii) Low-power technologies will benefit people’s daily lives.
2015 [40] | General health | N/A | (i) Discusses opportunities and challenges of wearable applications, particularly for health care and behavior change.
2015 [41] | General health | N/A | (i) Overview of current methods used within wearable applications to monitor and support positive health and wellbeing in an individual. (ii) Highlights issues and challenges outlined by previous studies and describes future directions of work.
2015 [42] | Activity | N/A | (i) Lightweight physiological sensors need to be developed so that comfortable wearable devices can monitor a wide range of inhabitants’ activities.
2015 [43] | Activity tracker | 22 | (i) A systematic review of 22 studies evaluating the validity and reliability of popular consumer wearable activity trackers (Fitbit and Jawbone). (ii) Determines the trackers’ ability to estimate steps, distance, physical activity, energy expenditure, and sleep. (iii) Results: high validity for steps, few studies on distance and physical activity, and lower validity for energy expenditure and sleep; high interdevice reliability for steps, distance, energy expenditure, and sleep for certain Fitbit models.
2016 [44] | General | 793 historical and 103 current | (i) Two-phase survey of the application space of wearable technology, assessing applications observed in research or industrial activities in two time periods: (a) historical (up to 2014) and (b) current (2014-2015). (ii) Explores and assesses product types, application categories, and the placement of wearable applications on the body surface. (iii) Considers differences in product price and gender. (iv) Discusses the effects of changes between the two time periods. (v) Wrist-worn devices stand out in the current trends.
2016 [15] | Elderly people | 133 | (i) Explores frameworks and sensor systems of AALS relative to care and clinical systems. (ii) Most systems focus on activity monitoring for immediate risks only. (iii) There is a lack of long-term care systems, which must add environmental factors for analytics and decision-making. (iv) Distributed storage and access for wearable devices and sensors need further exploration. (v) Social issues must be taken into account: acceptability and usability. (vi) Privacy and cybersecurity issues need to be considered.
2016 [45] | Health (Parkinson’s disease) | 113 | (i) Use of accelerometer and gyroscope data. (ii) Battery life, movement sensors, and information technology must improve to create a long-use clinical device.
2016 [46] | Education | N/A | (i) More research is required to understand the needs of wearable technology for education.
2016 [47] | Biometric recognition | N/A | (i) Reviews and categorizes wearable sensors useful for capturing biometric signals. (ii) Considers the computational cost of the different signal processing techniques. (iii) Reviews and classifies recent studies in the field of wearable biometrics.
2017 [48] | Wearable haptics | N/A | (i) Reviews wearable haptic systems for the fingertip and hand only. (ii) Summarizes the main characteristics of these systems and discusses the main development challenges.
2018 [49] | Elderly people | 13 | (i) Provides a framework for fall detection assessment systems that focuses on three factors: sensor placement, task, and feature category. (ii) Summarizes trends in wearable inertial sensor features and provides statistical analysis and meta-analysis for these features.

We can see from Table 1 and Figure 5 that the two main domains of wearable review studies are health and activity recognition. Few studies review wearables for the education domain [46] or biometric recognition systems. Each study can be described as having one of two approaches: general or specific.

Regarding the health domain, a specific health approach is one that targets a specific chronic disease, population, or medical specialty, such as rehabilitation, impairment, Parkinson’s disease [45], and Ambient Assisted Living systems (AALS) for elderly people [15]. Activity recognition studies address either wearable activity recognition in general [42] or a specific kind of activity recognition, such as activity trackers [43].

In contrast, only one study reviews wearables in general [44] across all domains. It reviewed the current trends of wearable technology and found that wrist-worn devices have gathered much attention recently. To the best of our knowledge, no other studies review wearable wrist-worn devices; therefore, the next section focuses on wrist-worn wearable studies.

5. Wrist-Worn Wearable Studies

WWDs have gained more popularity than other wearables [44]. They allow quick access and are more suitable for many applications than other wearables. WWDs can be categorized as commercial or ad hoc devices. Commercial devices have three types: smart watch, fitness tracker, and armband [50]. Each of them has many applications from gesture recognition to authentication [51].

The literature on WWDs can be categorized into three types based on its goal: user interface and interaction studies, user studies, and activity/affect recognition studies. Next, we will explain each category in detail.

5.1. User Interface and Interaction Studies

Studies [5255] in this category are aimed at supporting the interaction of users with WWDs. They depend on a wrist-worn interface (WWI), either for input or output techniques.

There are two main challenges ahead for WWIs: their physical limitations and the contexts of their use. To overcome these drawbacks, two novel wrist-worn interaction paradigms were proposed by Motti and Caine in 2015 [56]. The first paradigm is microinteractions, which enable users to complete tasks in less than four seconds, favoring many small tasks over one big task to minimize the cognition and attention required. Examples of these microinteractions are audio, gesture, graphic, tactile, and vibratory wrist-worn interfaces. The second paradigm is the multidimensional graphical user interface for both input and output, for example, a virtual extension of the graphical user interface. The study’s main user interface recommendations are very brief text, displaying content or navigation, and using only short actions to complete tasks.

Table 2 gives a comparison of user interface–based studies regarding their goal, sensor used, devices, subjects, algorithms, interactions, and their outcome.

Study | Application | Sensors | Devices | Subjects (M : F) | Method | Interactions | Outcome

2016 [52] | WatchOut: three new gesture families | Inertial measurement unit (IMU) sensors at 200–250 Hz | LG Urbane smartwatch and Sony SmartWatch 3 | 7 subjects to build 2 classification models; test: 12 subjects (8 : 4), 6698 gesture samples collected | SVM with 192 features | Side, bezel, and band of the watch | 88.7% to 99.4%
2016 [53] | WatchMI: augments normal touch input on a smartwatch | IMU with sensor fusion: an Invensense M651 6-axis accelerometer and gyroscope and an Asahi Kasei AK8963 3-axis compass at 100 Hz | LG Urbane smartwatch | User study: 1152 valid trials (12 participants × 24 regions × 4 repetitions); experiment: 12 volunteers (9 : 3, 3 left-handed) aged 20–36 (M: 25.8, SD: 5.2) | Real-time analysis | Omnidirectional pressure touch, bidirectional twist, and omnidirectional panning | 98.4%
2016 [54] | Circular selection: a list selection user interface designed for small round touchscreens | Touchscreen | Android Wear platform displayed on a Motorola Moto 360 smartwatch | User study 1: 24; user study 2: 15 | — | Circular selection list with 3 selection methods: fixed ring, movable ring, and jump-back ring | Outperforms traditional smartwatch list interfaces for user preference and task completion time (66% for large lists and 45% for small lists)
2016 [55] | Whoosh: uses nonvoice acoustic input for microinteractions on smartwatches | Microphone at 48 kHz | LG G Watch (Android) with a single monophonic microphone and a Motorola Droid Turbo (Android) smartphone to explore multidevice interaction | 8 participants in the laboratory and 4 participants in the wild | SVM: 52 features plus 26 features based on the deltas of the MFCC coefficients | Blow, sip-and-puff, and directional air swipes; “sip” on the watch and “puff” on the phone: (A) short blow, (B) double blow, (C) long blow, (D) swipe up, (E) swipe down, (F) clockwise blow, (G) shoosh, (H) open exhale, and (I)-(J) sip-and-puff | Unmodified watch: 90.5% with a single classifier; instrumented watch case with 14 additional interactions: 91.3% with per-user cross-validation

5.2. User Studies

These studies [51, 57–62] focus on understanding several issues related to WWDs from users’ viewpoints and concerns. Therefore, they analyze users’ reviews, answers, and usage in order to derive recommendations and limitations. The recent methods in WWD user studies can be categorized as one of the following:
(i) Review analysis [51, 58]
(ii) Online survey [57, 60]
(iii) Interview [59, 61]
(iv) Record of usage [59]

Table 3 shows a comparison of these studies.


Study | Goal | Devices | Subjects | Method | Outcomes

[57] 2015 | Study 1: examine usage behaviors; study 2: analyze battery usage data | Smartwatch | Study 1: 59 smartwatch users; study 2: 17 Android Wear smartwatch users | Online survey | (1) Many users are satisfied with current battery life. (2) The drain rate of the smartwatch battery is relatively low compared to that of the smartphone, even with very frequent interactions. (3) Users usually recharge their smartwatch once a day.
[51] 2015 | Understand users’ concerns regarding interaction with WWDs | 11 devices, including fitness trackers, armbands, and smartwatches | 1349 comments between May 2014 and November 2014 from 59 online sources (e-commerce websites and company forums) | Analysis of online reviews | 297 key user concerns emerged from a bottom-up content analysis involving both the input and the output of data; 10 design recommendations that can help improve interaction design in novel WWDs.
[58] 2015 | How the context of use impacts the user experience and interaction with WWDs | 10 popular WWDs | 545 comments between 2010 and 2016 from Amazon | Analysis of online comments; qualitative analysis (coding) combined with a quantitative approach (frequencies of occurrences) | Analysis of 31 interaction problems related to the users’ contexts (platform, computational, and technological issues); most problems are classified as significant or catastrophic, leading to both user frustration and task interruptions; proposes design implications: customization, adaptation, and personalization are essential in the UX design process for WWDs.
[59] 2016 | How smartwatches are used, what for, and in what contexts | Two wearable cameras and an Apple Watch | 12 participants aged 23–36 over 34 days | Recordings and interviews: wearable cameras recorded daily smartwatch use, with a small “sensor bag” capturing the user’s body and wrist | 1009 watch uses; the most prominent uses, in turn: timekeeping, notifications, activity tracking, and applications (third-party apps: Instagram, Twitter, and Nike Plus).
[60] 2016 | Study of real-life experiences with three wearable activity trackers | Fitbit, Jawbone Up, and Nike+ FuelBand | 133 responses from activity-tracker users in the US (median 30 years old, 35% female) on Amazon Mechanical Turk | Online survey | Revealed that the user experience derives from the needs of physical thriving or relatedness.
[61] 2015 | Explore current smartwatch use | 2 Samsung Galaxy Gears, 2 Pebbles, and a Moto 360 | 5 participants from San Francisco: 2 females and 3 males (ages 18–51) | Semistructured interview | Designers should take as much care in designing a beautiful app that will be “worn” as in designing the device.
[62] 2015 | Effectiveness of a smartphone application and wearable device for weight loss | Standard diet intervention, smartphone application, and wearable device | 70 primary care patients aged 18 or older; control group and experimental group; duration: 12 months | Experimental group versus control group | Change in body weight at 3, 6, and 12 months in the experimental group compared with the control group; effectiveness of the most popular free apps and wearable devices for weight loss.

5.3. Activity and Affect Recognition Studies

Studies of this type are aimed at providing extra abilities to WWDs so they can recognize different activities or affects by proposing different algorithms and using different sensors.

The aspects that should be considered to develop WWDs with activity or affect recognition are sensors and devices, modeling techniques, testing duration, sampling rate, elicitation methods, extracted features, and experimental setup; next, we will discuss these aspects in more detail.

5.3.1. Sensors and Devices

The first step in recognizing activities or affects is to select the appropriate sensors and signals for measuring them. Sensing data can be classified as either direct or indirect.
(i) Direct sensing means “tracking the parameters that are related to the human subject themselves” [15]. Examples of direct sensing are sound capture, video cameras, motion sensors, and wearable body sensors.
(ii) Indirect sensing “focuses on identifying environmental conditions and spatial features” [15].

For direct sensing, ambient intelligence techniques can be used to embed sensing into the environment. Such techniques can be classified as remote, mobile, or wearable sensing [63].
(i) Remote sensing is used for visual analysis, such as a webcam recognizing facial expressions or blood flow under the skin. Its main advantage is that it is easy to deploy in the community without any mobile or wearable devices [15]. However, it cannot sense data away from a desk or in a remote area [63].
(ii) Mobile sensing collects data from mobile phones.
(iii) Wearable sensing employs wearable devices to sense data close to the body; accordingly, the availability of wearable sensing devices has grown faster than that of mobile sensing devices. Any wearable sensor consists of three main components: sensor, processor, and display [42]. Wearable devices capture the sensor’s data and send it to the processor, and any actions are output through the display unit. If the sensor uses wireless technology, the sensor data can be sent by a transceiver to a central station for storage and processing, or the processing can be completed in the on-device processor.
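The sensor, processor, and display components of a wearable sensing node, with optional wireless offload to a central station, might be sketched as follows. All class and method names here are illustrative assumptions, not taken from any real device SDK, and the sensor reading is a hard-coded stand-in.

```python
# Illustrative sketch of the sensor -> processor -> display pipeline of a
# wearable sensing device, with optional offload to a central station.

class WearableSensorNode:
    def __init__(self, offload=None):
        self.offload = offload  # e.g., a Bluetooth transceiver callback

    def read_sensor(self):
        # Stand-in for sampling a real sensor (e.g., a 3-axis accelerometer).
        return {"accel": (0.1, 0.0, 9.8)}

    def process(self, sample):
        # On-device processing: here, just the acceleration magnitude.
        x, y, z = sample["accel"]
        return (x * x + y * y + z * z) ** 0.5

    def display(self, value):
        # Stand-in for the display unit.
        print(f"|a| = {value:.2f} m/s^2")

    def step(self):
        sample = self.read_sensor()
        if self.offload is not None:
            # Send raw data to a central station instead of processing locally.
            self.offload(sample)
            return None
        value = self.process(sample)
        self.display(value)
        return value

node = WearableSensorNode()
magnitude = node.step()
```

Passing an `offload` callback switches the node from local processing to raw-data transmission, mirroring the two options described above.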

Tables 4 and 5 summarize the signals used in previous studies for activity and affect recognition by wearable sensing (wrist-worn devices only).

Sensor types | Activity studies

Accelerometer | [2, 3]
3-axis accelerometer | [64–67]
Accelerometer and geomagnetic | [68]
Inertial measurement units ((i) accelerometers, (ii) gyroscopes, and sometimes (iii) magnetometers) |
Electric potential sensor | [70]
Multimodal sensors ((i) activity: accelerometer and gyroscope; (ii) ambient environment: temperature, atmospheric pressure, and humidity; (iii) location context: Bluetooth message reception) | [63, 71]
Piezoresistive sensors and triaxial accelerometers | [72]
Ambient light | [16]
Heart rate (HR) | [67]

Measurements | Affect studies

Body movement | [4, 5]
Prior physical activity (steps and workouts) | [63]
Electrodermal activity (EDA) = galvanic skin response (GSR) = skin conductivity (SC) | [5, 16, 73–75]
Electrocardiogram (ECG) | [76–78]
Body/skin temperature | [5, 16, 79]
Heart rate (HR) | [5, 16, 63, 77, 79–81]
Heart rate variability (HRV) | [5]
Respiration (RSP) | [5, 75, 82]
Electromyogram (EMG) | [75, 78]
Electroencephalogram (EEG) | [74, 76, 81]
Electrooculogram (EOG), magnetoencephalography (MEG), and near-infrared facial videos | [78]
Photoplethysmogram (PPG) | [73]
Self-reports | [74, 78, 81, 83, 84]
Location | [63, 74, 78, 83]
Time | [78, 83]
Semantic description | [78]
Ambient temperature, atmospheric pressure, humidity, light, posture, social interaction, and type of interaction | [5]

The selection of sensors depends on the type of activity to be recognized. Many studies use only one sensor; for example, an accelerometer, an electric potential sensor, and an ambient light sensor have each been used as the sole sensor type in a study. A multisensor approach is used when the aim is to recognize multiple activities at the same time. Choosing the best wearable device is important for capturing the desired signals in an accurate, comfortable, and affordable way. As mentioned before, WWDs can be classified as commercial or customized. As shown in Figure 6, commercial WWDs were used far more often (77%) than customized devices (23%).
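As a concrete illustration of the single-sensor approach mentioned above, a minimal step detector can operate on the accelerometer magnitude alone by counting upward threshold crossings. The 11.0 m/s² threshold and the synthetic trace are invented for illustration and do not come from any of the reviewed studies.

```python
# Minimal single-sensor step detector on accelerometer magnitude; the
# 11.0 m/s^2 threshold is an assumed illustrative value.

def count_steps(magnitudes, threshold=11.0):
    steps = 0
    above = False
    for m in magnitudes:
        if m > threshold and not above:
            steps += 1       # rising edge: signal crossed the threshold upward
            above = True
        elif m <= threshold:
            above = False
    return steps

# Synthetic trace: gravity baseline (~9.8 m/s^2) with three walking peaks.
trace = [9.8, 9.9, 12.1, 10.0, 9.7, 12.4, 9.9, 9.8, 12.0, 9.8]
print(count_steps(trace))  # 3
```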

Tables 6 and 7 list the WWDs used in activity and affect recognition studies in addition to information about smartphones if they were used in the study. Furthermore, Table 8 lists the commercial WWD name, picture, price, their goal affect or activity, and sensors.


Study | Activity | Wrist-worn device | Smartphone and other devices

[64] | Falling | Shimmer platform with triaxis accelerometer MMA7461 | —
[65] | Step | Customized WWD; for comparison: Samsung Gear S, Motorola Moto 360, SKT Smart Band, and Xiaomi Mi Band | iPhone 6 for comparison
[69] | Step | SparkFun 9DoF Razor IMU | Smartphone for data registering
[71] | 22 complex fine-grained activity contexts | Customized WWD | Samsung Galaxy S4 for data collection; Bluetooth beacon location tags in the physical environment
[16] | Two computing activities: (i) keystrokes and (ii) web browsing | Moto 360 smartwatch (Android 5.1) | Nexus 5X smartphone (Android 6.0); Dell XPS laptop with 15-inch screen running Linux Debian 8
[68] | Holding a mobile phone on the left or right | Galaxy Gear Live smartwatch | Samsung Galaxy Note 4
[2] | Cardiopulmonary resuscitation (CPR), 30/2 (30 compressions, 2 rescue breaths) | LG G Watch R Smart | —
[70] | Two habitual movements: (i) hair touch and (ii) restless leg movement | Data-logging platform and the electric potential sensor extension board; can be integrated into a smartwatch | —
[3] | Two eating activities: (i) hand-to-mouth gestures when eating and (ii) unique head motions during chewing (chewing range: 1–2.5 Hz) | Pebble Watch and Google Glass with a preinstalled app developed to collect accelerometer data | —
[66] | 4 ambulation activities: walking, standing, sitting, and lying | Smartwatch, a fitness band, or a clip-on sensor | —
[6] | Useful computing tasks: writing a research paper | Smartwatch | Mobile and Google server
[72] | Mood recognition | PSYCHE wearable monitoring platform | —
[67] | Sitting, standing, household activities, and stationary cycling at two intensities | A PulseOn CLOUD PPG-based heart rate monitoring wristband (PulseOn, Espoo, Finland) with an embedded triaxial accelerometer | One mobile phone


Study | Wrist-worn device | Smartphone and other devices

[4] | Customized, on right wrist and ankle | —
[80] | Mio LINK wristband on left wrist | Motorola Moto G (2nd generation)
[77] | Shimmer 2R with flour electrocardiogram, sampling rate 150 Hz | Smartphone to collect data
[83] | Pebble watch, sampling rate 25 Hz | Smartphone
[76] | Customized: 3 electrocardiogram electrodes (right and left wrists and left leg) and a 19-channel electroencephalogram | —
[79] | Toshiba Silmee Bar Type (left wrist) and Toshiba Silmee W20/W21 wristbands (right wrist) | App on smartphone to collect data
[75] | Customized: SC with 2 electrodes (index and ring fingers), RSP (placed tightly in the abdominal area above the navel), and 2-channel EEG | —
[81] | Apple Watch Sport Edition (heart rate), sampling rate 12 samples per minute | iPad to play arcade and puzzle games
[74] | Wrist device, electrocardiogram device, GSR, and pulse sensor | Smartphone
[82] | E4 wristband | —
[85] | Affectiva Q sensor | —
[84] | Google Glass and Samsung Galaxy Gear smartwatch, constant sampling rate 256 Hz | Samsung Galaxy S4 smartphone
[5] | Samsung Galaxy Gear smartwatch, Affectiva Q smartwatch, Narrative Clip (torso), BioPatch (torso), and Google Glass | Samsung Galaxy S4 smartphone

Name | Price $ (name of the resource website of price) | For affect/activity recognition | Sensors

Shimmer IMU | 444 (shimmer) | Activity | ACCEL, GYRO, MAGNET, ALTI
Motorola Moto 360 2nd gen | — | — | —
Motorola Moto 360 Sport | — | — | —
Xiaomi MI Band | 23–32 (does not ship to SA) | — | —
SparkFun 9DoF Razor IMU | 49.95 | Activity | ACCEL, GYRO, MAGNET
Samsung Gear Live Smartwatch | — | Activity + affect | ACCEL, GYRO, HR, COMPASS
LG G Watch R Smart | 153.59 | — | —
Pebble Watch Black | — | — | —
Pebble 2 + Heart Rate Smart Watch | — | Activity + affect | MICROPHONE

5.3.2. Modeling Techniques

The main step of a recognition system is classification. Many classification methods are used in WWD activity recognition systems; we can divide them into two approaches: machine learning-based and threshold-based. The most common and accurate machine learning algorithm is the support vector machine (SVM); naïve Bayes and the J48 decision tree are also popular. A deep neural network was used in only one WWD activity recognition system and gave promising results. Although machine learning-based studies are numerous, threshold-based classification has also been used many times. Table 9 lists the modeling techniques used in activity and affect recognition studies, and Figure 7 compares them in a line chart for both kinds of study. Affect recognition studies preferred various kinds of machine learning algorithms, while most activity recognition studies preferred threshold-based algorithms.

Modeling technique | Activity studies | Affect studies

Threshold-based | [2, 16, 64, 65, 69, 70] | —
SVM | [3, 66, 68] | [4, 5]
Decision tree | [66, 68] | [4, 79]
k-Nearest neighbors | [66] | [75, 79, 85]
Multilayer perceptron | [66] | [73, 75, 76]
Data mining | — | [63, 77, 81]
Random forest | [66] | [4]
Fuzzy logic | — | [73, 78]
Naïve Bayes | [66] | —
Random tree | — | [4]
Linear discriminant analysis | — | [75]
Deep learning neural network | [71] | —
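To make the contrast between the two approaches concrete, the following minimal sketch classifies windowed wrist-acceleration features with a fixed threshold and with a 1-nearest-neighbour classifier, two of the techniques listed in Table 9. The synthetic data and the 0.20 g threshold are our own illustrative assumptions, not values taken from any reviewed study.

```python
# Sketch: threshold-based vs. machine-learning classification of
# synthetic wrist-acceleration windows (all values are assumptions).
import numpy as np

rng = np.random.default_rng(0)
# Feature per window: standard deviation of acceleration magnitude (g).
walking = rng.normal(0.40, 0.05, 200)   # dynamic activity -> high variance
sitting = rng.normal(0.05, 0.02, 200)   # static activity  -> low variance
X = np.concatenate([walking, sitting])
y = np.concatenate([np.ones(200), np.zeros(200)])  # 1 = walking

# Threshold-based: one hand-tuned critical point.
pred_thr = (X > 0.20).astype(float)

# 1-nearest-neighbour: label each test point by its closest training sample.
train_idx = rng.permutation(400)[:300]
test_idx = np.setdiff1d(np.arange(400), train_idx)
d = np.abs(X[test_idx, None] - X[train_idx, None].T)
pred_knn = y[train_idx][d.argmin(axis=1)]

print("threshold accuracy:", (pred_thr == y).mean())
print("1-NN accuracy:     ", (pred_knn == y[test_idx]).mean())
```

On well-separated data like this, both approaches perform equally well; the trade-off the reviewed studies face appears only when context shifts move the class boundary, which a fixed threshold cannot follow.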

5.3.3. Testing Duration

Testing duration is essential in evaluating the performance of a system. Table 10 shows the testing time per user for each study, which ranges from two seconds up to 14 weeks. The testing time for activity recognition is usually seconds or minutes per user, whereas affect studies range from minutes to over two months; affect recognition systems therefore take longer to test than activity recognition systems. In particular, health diagnosis systems for a specific disease, such as bipolar disorder [72], require longer testing to provide accurate results.

Testing duration | Activity studies | Affect studies

2 seconds | [64] | —
5.5 seconds | [66] | —
5 minutes | — | [4]
10 minutes | [68] | —
45 minutes | [67, 71] | —
3 days | [16] | —
5 days | — | [5]
1 week | [6] | —
Twice a week for 14 days | — | [72]
30 days | — | [85]
75 days | — | [84]

5.3.4. Extracted Features

WWD studies depend on sensory data. After the preprocessing phase, the desired features are extracted from the raw sensor data. Extracted features fall into three main types: time domain, frequency domain, and discrete domain [86]. Time domain features have a lower computational cost than frequency domain features [87]. The time-domain mean is the feature most commonly used in WWD activity recognition systems; the statistical measures of standard deviation, minimum, and maximum are also common.
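The common time-domain features named above (mean, standard deviation, minimum, maximum) can be sketched as a sliding-window computation over an accelerometer stream; the window length, step, and synthetic 20 Hz signal below are illustrative assumptions rather than parameters from any reviewed study.

```python
import numpy as np

def time_domain_features(signal, win=128, step=64):
    """Mean, std, min, max per sliding window (the common time-domain
    features noted in the text); win/step values are assumptions."""
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        feats.append([w.mean(), w.std(), w.min(), w.max()])
    return np.array(feats)

# Synthetic 1-axis accelerometer trace: 10 s at 20 Hz.
t = np.arange(0, 10, 1 / 20)
accel = 1.0 + 0.3 * np.sin(2 * np.pi * 2 * t)  # gravity + 2 Hz arm swing

F = time_domain_features(accel)
print(F.shape)  # one 4-feature row per window
```

Each row of `F` would then be fed to a classifier; frequency-domain features would instead require an FFT per window, which is the extra computational cost the text refers to.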

5.3.5. Sampling Rate

The sampling rate determines the range of activity frequencies that can be captured, so it is important to set it correctly. Accuracy, power consumption, and the features that can be selected are all affected by the sampling rate [87]. The appropriate range of sampling rates depends on the type of activity being detected. For example, step and fall detection usually use a 20 Hz sampling rate, because human movement lies in the range 0-20 Hz [87]. In addition, using a small sampling rate reduces power consumption [69]. This review found sampling rates for WWD activity recognition ranging from 1 to 1000 Hz; the rates used, along with their related studies, are shown in Table 11 and Figure 8.

Sampling rate (Hz) | Studies

1 to 2.25 | [3]
10 | [69]
20 | [64, 65, 69]
30 | [66]
25 | [67, 69]
50 | [69]
100 | [69]
200 | [71]
250 | [68]
1000 | [70]
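The link between sampling rate, captured frequency content, and power can be made concrete with a small sketch. The Nyquist criterion requires sampling at least twice the highest movement frequency of interest; the 100 Hz source rate, 2 Hz "walking" signal, and naive stride-based downsampling below are illustrative assumptions (a real system would low-pass filter before decimating).

```python
import numpy as np

def min_sampling_rate(f_max_hz):
    """Nyquist criterion: sample at least twice the highest
    movement frequency that must be captured."""
    return 2 * f_max_hz

# Capturing content up to 10 Hz needs >= 20 Hz sampling.
print(min_sampling_rate(10))  # -> 20

# Downsampling a 100 Hz trace to 20 Hz by keeping every 5th sample,
# giving 5x fewer samples to process, store, and transmit.
fs_high, fs_low = 100, 20
t = np.arange(0, 5, 1 / fs_high)
steps = np.sin(2 * np.pi * 2 * t)     # ~2 Hz walking cadence
down = steps[:: fs_high // fs_low]
print(len(t), len(down))              # 500 100
```

This is why the step-counting studies above could run at 20 Hz or lower: the gait frequencies of interest sit far below the Nyquist limit, and every discarded sample saves power.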

5.3.6. Elicitation Method Used for WWD Affect Recognition

In order to collect the data, an emotion must be induced by a stimulus. Table 12 shows the different kinds of elicitation methods used in previous studies. Many recent studies aim to collect data naturally, without any elicitation, by continuously monitoring participants over their daily lives.


Elicitation method | Studies

Image from the International Affective Picture System | [75, 76]
Video | [4, 73, 78, 80]
None (continuous monitoring in a natural environment) | [5, 63, 74, 78, 79, 83]

5.3.7. Experimental Setup

To evaluate the performance of recognition systems, representative datasets must be collected via experiments. Such a dataset covers a number of participants wearing the WWDs and the target activities or affective states of the subjects. The participants might be hospital patients, elderly people, students, etc.; the participant demographic depends on the study domain: activity tracking, home monitoring, health, security, education, etc. Most affect studies were applied to participants during their regular daily and work lives, and there is a lack of affect recognition studies that deal with diseases.

Experiments can be conducted in different places. Few experiments were run under real-world conditions because of the difficulty of dealing with changes and noise; most were implemented in a laboratory environment. For example, affect recognition studies have been conducted in a quiet room, an office, a “natural” environment, and a real-life stress environment [5].

The number of participants for WWD activity recognition studies ranged from 1 to 41. In contrast, affect recognition studies had up to 123 participants.

Moreover, the classes used to recognize the type and level of an affect vary between studies. There is no agreed-upon way to best categorize different kinds of emotion [1]; for affective computing research, the best choice is whatever suits the application [1]. More details about the participant demographics are summarized in Table 13.

Study | Goal | Dataset (M : F) | Domain

[64] | Falling | 12 (3 groups of 4 people of different ages) | Elderly people living
[65] | Step | 1; commercial devices: walk on the treadmill at 4.5 km/h; prototype experiments: walk 120 seconds on the treadmill | Activity tracker (sport)
[69] | Step | 1; straight-line walks of 30 steps at different paces | Activity tracker (sport)
[71] | 22 complex fine-grained activity contexts: (i) locomotive (walk indoors and run indoors); (ii) semantic (use refrigerator, clean utensil, cooking, sit and eat, use bathroom sink, standing, and talking); (iii) transitional (indoor to outdoor, outdoor to indoor, walk upstairs, and walk downstairs); (iv) postural/relatively stationary (just stand, stand and lean on wall, lying on bed, sit on bed, sit on desk chair, lying on floor, sit on floor, lying on sofa, sit on sofa, and sit on commode) | 2; two separate home environments; user 1: 22 activities, user 2: 19 activities; each user's series of selected activities averaged 45 minutes of sensor data collection | Home living
[16] | Keystrokes and web browsing | 1; activity 1: 60 key presses per character, repeated over 3 days; activity 2: capturing the lux readings for 10 popular websites, one minute each | —
[68] | Holding mobile phone on left or right | 24; 14: control study, 10 minutes each; 10: user study to receive feedback | Adaptive user interface: one-handed interaction
[2] | Cardiopulmonary resuscitation (CPR) | 41 (24 : 17), age 24-70 (average 37) | Health: training
[70] | Hair touch and restless leg movement | 1; two different floorings: carpet and vinyl | Health: consumer care product research
[3] | Two eating activities | 10 (7 : 3), age 15-52 (average 28.7); four food groups with different chewing motions due to different food textures | Health: weight management
[66] | Walking, standing, sitting, and lying | 3; triaxial acceleration data from the Activity Recognition Challenge dataset [16], 5.5 seconds; 300,000 records or 2300 windows (window length 128) for each wrist of a subject | Home living
[6] | Writing a research paper | 7 (2 users, 5 crowd workers), full week | Education
[72] | Mood recognition | 14 patients | Health: mood changes of bipolar disorder
[67] | Sitting, standing, household activities, and stationary cycling with two intensities | 25 healthy people | Home living
[4] | 3 (neutral, happy, and angry) | 123 (45 : 78) | Daily living
[80] | 7 (amusement, sadness, anger, fear, disgust, surprise, and neutral) | 14 (5 : 9), age 20-28 | Recommendation and sharing
[77] | Computed dental radiography (CDR) | 40 (15 : 25), average age (25 : 29) | Health
[83] | 8 (upset, stressed, tense, excited, happy, bored, tired, and relaxed) | 18 (2 : 16) (10 university students, 6 researchers/staff, 1 software engineer, 1 professor) | Daily living
[73] | 16 (pride, elation, joy, satisfaction, relief, hope, interest, surprise, sadness, fear, shame, guilt, envy, disgust, contempt, and anger) | DEAP dataset | Social living
[76] | 4 (calm, happy, fear, sad); valence, arousal, and dominance (VAD) | 12 | Daily living
[79] | 8 (excitement, happiness, calmness, tiredness, boredom, sadness, stress, and anger) | 4 | Office living
[75] | VAD (2: A: +/−; 3: V: +/−/0; 5: VA: 0−/++/−+/+−/−−; 10: VAD); A: 0 (very calm) to 4 (very aroused); V: −2 (unpleasant) to 2 (very pleasant) | 20 (9 : 11), age 22–76 (average 47.4); right-handed, healthy, with normal or corrected-to-normal vision | Work living
— | 7 (amusement, anger, disgust, excitement, fear, fun, and shock) | 30: train, DECAF dataset (VAD); 30: test, 600 individual records | Daily living
[81] | 2 (frustration and satisfaction) | 3 (friends and family), ages 26, 24, and 44; 176 minutes of sampled data, 10,560 seconds of raw data, and electroencephalogram data with 5 million lines | Daily living
[63] | N/A | N/A | Daily living
[74] | 2 (mania and depression) | N/A | Mental health
[82] | Proposal only | Proposal only | Child care
[85] | Happy-sad | 68 | Daily living
[84] | VAD | 15 (7 : 8) | Daily living
[5] | Low stress and high stress | 15 | Daily living

6. Discussion

In the previous sections, we reviewed 54 studies on wearables with a particular focus on WWDs. The WWD studies take various directions, so an absolute comparison of them is not possible in this discussion. Even studies that follow the same direction are not easily comparable; within the activity recognition studies, for example, the activities being recognized varied, which hinders comparison. This discussion therefore highlights and describes the significant challenges faced in studying wearables.

We found that wearable computing in general has been researched many times and in different domains, such as sport, health, education, and security. Most WWD studies were conducted in the domains of health [38–41, 45], activity tracking [43], and home monitoring [15, 49], and there is a lack of studies related to education [46], security [47], and child care.

Long-term usage is considered to be the main challenge of wearable technology [40]. This challenge can be overcome by considering the issues of battery life, user acceptance, safety, privacy, weight, and fault tolerance. Ambient intelligence is an important concept to implement in WWDs.

The recent WWD studies were divided, as shown in the previous section, based on three categories: user interface and interaction studies, user studies, and activity/affect recognition studies.

Regarding WWIs, we found many limitations that must be managed or resolved by researchers in order to improve the user interface and interaction. Microinteractions and multidimensional interfaces are newly proposed solutions for WWIs [56].

On the other hand, conducting a user study is very important to build the theoretical framework of WWDs as well as understanding user requirements. This would be beneficial in terms of providing sufficient ideas for the design of better interfaces and for implementing the appropriate applications.

We have noticed that many WWD researchers preferred to use a single sensor for activity recognition in order to reduce power consumption and increase WWD simplicity. In contrast, some studies used multisensor data to recognize more activities for intelligent living, such as elderly care within smart homes.

Features extracted from sensor data are varied. Although the sensitivity of features is important to recognize the target activity, many factors should be considered when selecting features themselves. These factors include battery consumption and availability of feature resources.

In the classification phase, most studies used a machine learning-based approach; however, some studies still used threshold-based approaches [2, 16, 64, 65, 69, 70]. A threshold-based approach must determine the critical points for classification, and these points are affected by changes in context and by long-term usage. Consequently, most studies recognizing a single activity, such as falling [64], steps [65, 69], computing activity [16], CPR [2], and hair touch detection [70], used the threshold-based approach, whereas recognizing more complex, context-sensitive activities, such as ambulation [66], eating [3], and mood [72], required machine learning. The machine learning approach was also used in the study on bipolar disorder diagnosis, which requires long-term usage. Machine learning techniques show promising results for user interface and activity recognition systems, and deep learning algorithms have recently provided even more efficient results [71].
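The single-activity, threshold-based pattern described above can be sketched in a few lines: a step counter that registers one step per upward crossing of a critical point on the acceleration magnitude. The 1.2 g threshold and the synthetic 20 Hz walking trace are illustrative assumptions, not the method of any cited study.

```python
import numpy as np

def count_steps(accel_mag, threshold=1.2):
    """Threshold-based step counter: one step per upward crossing of a
    critical point on acceleration magnitude (in g).
    The 1.2 g threshold is an illustrative assumption."""
    above = accel_mag > threshold
    # A step is registered on each below->above transition.
    return int(np.sum(~above[:-1] & above[1:]))

# Synthetic 20 Hz trace: 10 s of walking at 2 steps per second.
fs = 20
t = np.arange(0, 10, 1 / fs)
accel = 1.0 + 0.5 * np.sin(2 * np.pi * 2 * t)  # gravity + arm swing

print(count_steps(accel))  # -> 20
```

The sketch also shows the weakness the text points to: the fixed 1.2 g critical point works only while signal amplitude and baseline stay as assumed, which is exactly what changes of context and long-term usage disturb.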

Collecting data is mandatory in order to train a classifier in the machine learning approach or to test a system. Data should ideally be collected in a real-world environment rather than in the laboratory environment that most studies used.

7. Challenges and Open Directions

After reviewing the literature on wearable computing, we found many critical challenges and issues. As mentioned, the main technical challenge faced by wearable devices is long-term usage, which relates to battery life, user acceptance, safety, weight, fault tolerance, and privacy concerns. There are also challenges in establishing common standards for wearable systems and in building relationships with the commercial sector to ensure installation and process completion. In this section, we suggest solutions for overcoming each challenge in future work.

7.1. Weight

We could use energy harvesting technology to remove the need for a battery, which would decrease the total weight.

7.2. Battery Life

This challenge can be addressed through hardware, software, or the interface. Hardware could use energy harvesting technology, which extracts power from the surroundings via solar, kinetic, and electromagnetic emission energy [42]. In software, we could reduce the power consumption of the system firmware using techniques categorized as event-driven operation, duty cycling, feature selection, or sensor selection. At the interface level, the microinteractions used to complete a task should take less than 3 seconds.
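The duty-cycling technique mentioned above can be quantified with a back-of-envelope model: if the sensor draws one current while sampling and another while asleep, the average draw scales linearly with the duty cycle. The current and battery figures below are assumptions chosen purely for illustration.

```python
def average_current_ma(i_active_ma, i_sleep_ma, duty_cycle):
    """Average draw under duty cycling: the device samples for a
    fraction `duty_cycle` of the time and sleeps for the rest.
    All current figures passed in here are assumed example values."""
    return duty_cycle * i_active_ma + (1 - duty_cycle) * i_sleep_ma

# Assumed figures: 10 mA while sampling, 0.1 mA asleep, 100 mAh battery.
always_on = average_current_ma(10, 0.1, 1.0)   # 10.0 mA
cycled    = average_current_ma(10, 0.1, 0.1)   # 1.09 mA
print(100 / always_on, "h vs", 100 / cycled, "h of battery life")
```

Under these assumptions, a 10% duty cycle stretches the same battery from roughly 10 hours to over 90 hours, which is why duty cycling is listed alongside feature and sensor selection as a firmware-level power technique.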

7.3. Lack of Standards

This challenge makes wearable technology unstable and difficult to adapt and use. Al-Shaqi et al. [15] noted regarding this issue: “Adaptability of different system components from sensors, communication protocol, decision support, and subject interaction method or language. Availability of standards will help system designers to integrate efforts and provide the market with the necessary devices and systems to meet the subject defined requirements.” There is also a lack of commercial engagement in establishing relationships between academia and the commercial sector.

7.4. Safety

A novel nonlinear optimization framework has been presented that considers safety and sustainability requirements depending on human physiology and derives system-level design parameters for wearable sensor applications. Reliability of data is another challenge; data collection should therefore use accurate thresholds with low or no fault tolerance. Fault tolerance of devices is also a concern in terms of resistance to impact, heat, cold, and water.

7.5. User Acceptance

User acceptance is a key property of wearable devices, and it raises challenges with respect to personalization, interface, design, and data. Personalization is the ability to support a person's lifestyle, for example, allowing users to create the form that they want. The focus should also be on interfaces that are accessible, simple, and easy to use. In addition, devices need customization and adaptivity to different settings based on user requirements, such as in ALS.

7.6. Design

Wearable devices must be designed with the following properties. First, they should be unobtrusive and not conspicuous to others. Second, they should not disturb the users' daily activities and must be comfortable for frequent wear. Finally, they must sit close to the body in order to sense the required measurements accurately.

7.7. Data

Data should be meaningful, displaying ambient feedback rather than showing the user many raw numerical values.

8. Conclusions

Many review studies on wearable technology have been conducted; however, to the best of our knowledge, there were no review studies on WWDs that consider multiple aspects. This paper therefore presented a review of previous wearable computing research, discussed the different kinds of WWD studies, highlighted important issues, and suggested future work.

The next step is to investigate further studies on a specific domain or topic that could be addressed with wearable technology, and then to explore and define problems with the aid of a domain expert in order to determine requirements.

Conflicts of Interest

The authors declare that there is no conflict of interest regarding the publication of this paper.


Acknowledgments

The authors would like to thank the Deanship of Scientific Research for funding and supporting this research through the initiative of DSR Graduate Students Research Support (GSR). The authors also thank the Deanship of Scientific Research and RSSU at King Saud University for the technical support.


References

  1. R. W. Picard, Affective Computing, MIT Press, Cambridge, MA, USA, 1997.
  2. A. Gruenerbl, G. Pirkl, E. Monger, M. Gobbi, and P. Lukowicz, “Smart-watch life saver: smart-watch interactive-feedback system for improving bystander CPR,” in Proceedings of the 2015 ACM International Symposium on Wearable Computers, pp. 19–26, Osaka, Japan, September 2015.
  3. X. Ye, G. Chen, and Y. Cao, “Automatic eating detection using head-mount and wrist-worn accelerometers,” in 2015 17th International Conference on E-Health Networking, Application Services (HealthCom), pp. 578–581, Boston, MA, USA, October 2015.
  4. Z. Zhang, Y. Song, L. Cui, X. Liu, and T. Zhu, “Emotion recognition based on customized smart bracelet with built-in accelerometer,” PeerJ, vol. 4, article e2258, 2016.
  5. J. Hernandez Rivera, Towards Wearable Stress Measurement, Massachusetts Institute of Technology, 2015.
  6. M. Nebeling, A. To, A. Guo et al., “WearWrite: crowd-assisted writing from smartwatches,” in Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 3834–3846, San Jose, CA, USA, May 2016.
  7. Y. Zhang and C. Harrison, “Tomo: wearable, low-cost electrical impedance tomography for hand gesture recognition,” in Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology, pp. 167–173, Charlotte, NC, USA, November 2015.
  8. Y. Zhang, J. Zhou, G. Laput, and C. Harrison, “SkinTrack: using the body as an electrical waveguide for continuous finger tracking on the skin,” in Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 1491–1503, San Jose, CA, USA, May 2016.
  9. H. Witt, User Interfaces for Wearable Computers: Development and Evaluation, Vieweg and Teubner, Germany, 2008.
  10. V. Genaro Motti, S. Kohn, and K. Caine, “Wearable computing: a human-centered view of key concepts, application domains, and quality factors,” in Proceedings of the 16th International Conference on Human-Computer Interaction with Mobile Devices & Services, pp. 563-564, Toronto, ON, Canada, September 2014.
  11. W. Barfield and T. Caudell, Fundamentals of Wearable Computers and Augmented Reality, L. Erlbaum Associates Inc., Hillsdale, NJ, USA, 2000.
  12. S. Mann, “Wearable computing: a first step toward personal imaging,” Computer, vol. 30, no. 2, pp. 25–32, 1997.
  13. D. C. Ruiz and A. Goransson, Professional Android Wearables, Wrox Press Ltd., Birmingham, UK, 1st edition, 2015.
  14. A. Nassani, H. Bai, G. Lee, and M. Billinghurst, “Tag it!: AR annotation using wearable sensors,” in SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications, pp. 12:1–12:4, Kobe, Japan, November 2015.
  15. R. Al-Shaqi, M. Mourshed, and Y. Rezgui, “Progress in ambient assisted systems for independent living by the elderly,” SpringerPlus, vol. 5, no. 1, p. 624, 2016.
  16. A. Holmes, S. Desai, and A. Nahapetian, “LuxLeak: capturing computing activity using smart device ambient light sensors,” in Proceedings of the 2nd Workshop on Experiences in the Design and Implementation of Smart Objects, pp. 47–52, New York City, NY, USA, October 2016.
  17. N. T. Ly, R. Tscharn, J. Preßler et al., “Smart lighting in dementia care facility,” in Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct, pp. 1636–1639, Heidelberg, Germany, September 2016.
  18. R. Tscharn, N. Ly-Tung, D. Löffler, and J. Hurtienne, “Ambient light as spatial attention guidance in indoor environments,” in Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct, pp. 1627–1630, Heidelberg, Germany, September 2016.
  19. H. Yoon, S.-H. Park, and K.-T. Lee, “Lightful user interaction on smart wearables,” Personal and Ubiquitous Computing, vol. 20, no. 6, pp. 973–984, 2016.
  20. N. Zhao and J. A. Paradiso, “HALO: wearable lighting,” in Adjunct Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers, pp. 601–606, Osaka, Japan, September 2015.
  21. D. Aliakseyeu and J. Mason, “Tap sensor: evaluating a new physical user interface for connected lighting,” in Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct, pp. 1614–1619, Heidelberg, Germany, September 2016.
  22. Y. Choi, K. Yoo, S. J. Kang, B. Seo, and S. K. Kim, “Development of a low-cost wearable sensing glove with multiple inertial sensors and a light and fast orientation estimation algorithm,” The Journal of Supercomputing, vol. 74, no. 8, pp. 3639–3652, 2018.
  23. Z. Cochran, B. Tomlinson, D.-W. Chen, and K. Patel, “LightWeight: wearable resistance visualizer for rehabilitation,” in Proceedings of the Adjunct Publication of the 27th Annual ACM Symposium on User Interface Software and Technology, pp. 101-102, Honolulu, HI, USA, October 2014.
  24. M. Caon, L. Angelini, O. A. Khaled, D. Aliakseyeu, J. Mason, and E. Mugellini, “Tangible interaction with light in the IoT,” in Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct, pp. 1646–1651, Heidelberg, Germany, September 2016.
  25. J. Haladjian, K. Bredies, and B. Brügge, “Interactex: an integrated development environment for smart textiles,” in Proceedings of the 2016 ACM International Symposium on Wearable Computers, pp. 8–15, New York, NY, USA, September 2016.
  26. G. Valenza, L. Citi, C. Gentili, A. Lanatá, E. P. Scilingo, and R. Barbieri, “Characterization of depressive states in bipolar patients using wearable textile technology and instantaneous heart rate variability assessment,” IEEE Journal of Biomedical and Health Informatics, vol. 19, no. 1, pp. 263–274, 2015.
  27. K. Ueda, T. Terada, and M. Tsukamoto, “Input interface using wrinkles on clothes,” in Proceedings of the 2016 ACM International Symposium on Wearable Computers, pp. 56-57, New York, NY, USA, September 2016.
  28. M. Chen, Y. Ma, J. Song, C.-F. Lai, and B. Hu, “Smart clothing: connecting human with clouds and big data for sustainable health monitoring,” Mobile Networks and Applications, vol. 21, no. 5, pp. 825–845, 2016.
  29. A. Jylhä, Y.-T. Hsieh, V. Orso, S. Andolina, L. Gamberini, and G. Jacucci, “A wearable multimodal interface for exploring urban points of interest,” in Proceedings of the 2015 ACM International Conference on Multimodal Interaction, pp. 175–182, New York, NY, USA, November 2015.
  30. K. Vega and H. Fuks, Beauty Technology: Designing Seamless Interfaces for Wearable Computing, Springer International Publishing, 2016.
  31. C. Lee, Simple Magnetic Trigger Hack for Google Cardboard, 2016.
  32. K. Lyons, “2D input for virtual reality enclosures with magnetic field sensing,” in Proceedings of the 2016 ACM International Symposium on Wearable Computers, pp. 176–183, New York, NY, USA, September 2016.
  33. A. Nag and S. C. Mukhopadhyay, “Wearable electronics sensors: current status and future opportunities,” in Wearable Electronics Sensors. Smart Sensors, Measurement and Instrumentation, S. Mukhopadhyay, Ed., vol. 15, pp. 1–35, Springer, Cham, 2015.
  34. “Georgia Institute of Technology,” 2016.
  35. Proceedings of the 2016 ACM International Symposium on Wearable Computers, ISWC’16, ACM, New York, NY, USA, 2016.
  36. Proceedings of the 2015 ACM International Symposium on Wearable Computers, ISWC’15, ACM, New York, NY, USA, 2015.
  37. Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, UbiComp’16, New York, NY, USA, 2016.
  38. P. B. Shull and D. D. Damian, “Haptic wearables as sensory replacement, sensory augmentation and trainer – a review,” Journal of NeuroEngineering and Rehabilitation, vol. 12, no. 1, p. 59, 2015.
  39. C. Wang, W. Lu, M. R. Narayanan, S. J. Redmond, and N. H. Lovell, “Low-power technologies for wearable telecare and telehealth systems: a review,” Biomedical Engineering Letters, vol. 5, no. 1, pp. 1–9, 2015.
  40. V. Genaro Motti and K. Caine, “An overview of wearable applications for healthcare,” in Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers (UbiComp '15), pp. 635–641, New York, NY, USA, September 2015.
  41. K. Hänsel, N. Wilde, H. Haddadi, and A. Alomainy, “Challenges with current wearable technology in monitoring health data and providing positive behavioural support,” in Proceedings of the 5th EAI International Conference on Wireless Mobile Communication and Healthcare, ICST, pp. 158–161, Brussels, Belgium, December 2015.
  42. S. C. Mukhopadhyay, “Wearable sensors for human activity monitoring: a review,” IEEE Sensors Journal, vol. 15, no. 3, pp. 1321–1330, 2015.
  43. K. R. Evenson, M. M. Goto, and R. D. Furberg, “Systematic review of the validity and reliability of consumer-wearable activity trackers,” International Journal of Behavioral Nutrition and Physical Activity, vol. 12, no. 1, p. 159, 2015.
  44. M. E. Berglund, J. Duvall, and L. E. Dunne, “A survey of the historical scope and current trends of wearable technology applications,” in Proceedings of the 2016 ACM International Symposium on Wearable Computers (ISWC '16), pp. 40–43, New York, NY, USA, September 2016.
  45. C. Ossig, A. Antonini, C. Buhmann et al., “Wearable sensor-based objective assessment of motor symptoms in Parkinson’s disease,” Journal of Neural Transmission, vol. 123, no. 1, pp. 57–64, 2016.
  46. B. Sandall, “Wearable technology and schools: where are we and where do we go from here?” Journal of Curriculum, Teaching, Learning and Leadership in Education, vol. 1, no. 1, 2016.
  47. J. Blasco, T. M. Chen, J. Tapiador, and P. Peris-Lopez, “A survey of wearable biometric recognition systems,” ACM Computing Surveys, vol. 49, no. 3, pp. 1–35, 2016.
  48. C. Pacchierotti, S. Sinclair, M. Solazzi, A. Frisoli, V. Hayward, and D. Prattichizzo, “Wearable haptic systems for the fingertip and the hand: taxonomy, review, and perspectives,” IEEE Transactions on Haptics, vol. 10, no. 4, pp. 580–600, 2017.
  49. L. Montesinos, R. Castaldo, and L. Pecchia, “Wearable inertial sensors for fall risk assessment and prediction in older adults: a systematic review and meta-analysis,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 26, no. 3, pp. 573–582, 2018.
  50. V. G. Motti and K. Caine, “Smart wearables or dumb wearables?: understanding how context impacts the UX in wrist worn interaction,” in Proceedings of the 34th ACM International Conference on the Design of Communication (SIGDOC '16), pp. 10:1–10:10, New York, NY, USA, September 2016.
  51. B. Lowens, V. Motti, and K. Caine, “Design recommendations to improve the user interaction with wrist worn devices,” in 2015 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops), pp. 562–567, St. Louis, MO, USA, March 2015.
  52. C. Zhang, J. Yang, C. Southern, T. E. Starner, and G. D. Abowd, “WatchOut: extending interactions on a smartwatch with inertial sensing,” in Proceedings of the 2016 ACM International Symposium on Wearable Computers (ISWC '16), pp. 136–143, New York, NY, USA, September 2016.
  53. H.-S. Yeo, J. Lee, A. Bianchi, and A. Quigley, “WatchMI: applications of watch movement input on unmodified smartwatches,” in Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct (MobileHCI '16), pp. 594–598, New York, NY, USA, September 2016.
  54. K. Plaumann, M. Müller, and E. Rukzio, “CircularSelection: optimizing list selection for smartwatches,” in Proceedings of the 2016 ACM International Symposium on Wearable Computers, pp. 128–135, New York, NY, USA, September 2016.
  55. G. Reyes, W. Keith Edwards, D. Zhang et al., “Whoosh: non-voice acoustics for low-cost, hands-free, and rapid input on smartwatches,” in Proceedings of the 2016 ACM International Symposium on Wearable Computers, pp. 120–127, New York, NY, USA, September 2016.
  56. V. G. Motti and K. Caine, “Micro interactions and multi dimensional graphical user interfaces in the design of wrist worn wearables,” Proceedings of the Human Factors and Ergonomics Society Annual Meeting, vol. 59, no. 1, pp. 1712–1716, 2015.
  57. C. Min, S. Kang, C. Yoo et al., “Exploring current practices for battery use and management of smartwatches,” in Proceedings of the 2015 ACM International Symposium on Wearable Computers (ISWC '15), pp. 11–18, New York, NY, USA, September 2015.
  58. K. Lyons, “What can a dumb watch teach a smartwatch?: informing the design of smartwatches,” in Proceedings of the 2015 ACM International Symposium on Wearable Computers (ISWC '15), pp. 3–10, New York, NY, USA, September 2015.
  59. S. Pizza, B. Brown, D. McMillan, and A. Lampinen, “Smartwatch in vivo,” in Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI '16), pp. 5456–5469, New York, NY, USA, May 2016.
  59. S. Pizza, B. Brown, D. McMillan, and A. Lampinen, “Smartwatch in vivo,” in CHI '16 Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 5456–5469, New York, NY, USA, May 2016. View at: Publisher Site | Google Scholar
  60. E. Karapanos, R. Gouveia, M. Hassenzahl, and J. Forlizzi, “Wellbeing in the making: peoples’ experiences with wearable activity trackers,” Psychology of Well-Being, vol. 6, no. 1, p. 4, 2016. View at: Publisher Site | Google Scholar
  61. S. Schirra and F. R. Bentley, “It’s kind of like an extra screen for my phone,” in Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems - CHI EA '15, pp. 2151–2156, New York, NY, USA, 2015. View at: Google Scholar
  62. E. Granado-Font, OBSBIT Study Group, G. Flores-Mateo et al., “Effectiveness of a smartphone application and wearable device for weight loss in overweight or obese primary care patients: protocol for a randomised controlled trial,” BMC Public Health, vol. 15, no. 1, p. 531, 2015. View at: Publisher Site | Google Scholar
  63. K. Hänsel, A. Alomainy, and H. Haddadi, “Large scale mood and stress self-assessments on a smartwatch,” in Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing Adjunct - UbiComp '16, pp. 1180–1184, New York, NY, USA, 2016. View at: Publisher Site | Google Scholar
  64. S. Zhou, J. Chen, X. Wang, L. Zhou, B. Zhen, and J. Cui, “Inclination gradient-based fall detection algorithm for wrist-worn device,” in 2015 IEEE International Conference on Consumer Electronics, pp. 148-149, Taiwan, June 2015. View at: Google Scholar
  65. Y. Cho, H. Cho, and C. M. Kyung, “Design and implementation of practical step detection algorithm for wrist-worn devices,” IEEE Sensors Journal, vol. 16, no. 21, pp. 1–7730, 2016. View at: Publisher Site | Google Scholar
  66. M. Nguyen, L. Fan, and C. Shahabi, “Activity Recognition Using Wrist-Worn Sensors for Human Performance Evaluation,” in 2015 IEEE International Conference on Data Mining Workshop (ICDMW), pp. 164–169, November 2015. View at: Publisher Site | Google Scholar
  67. S. Mehrang, J. Pietilä, and I. Korhonen, “An activity recognition framework deploying the random forest classifier and a single optical heart rate monitoring and triaxial accelerometer wrist-band†,” Sensors, vol. 18, no. 3, 2018. View at: Publisher Site | Google Scholar
  68. H. Lim, G. An, Y. Cho, K. Lee, and B. Suh, “WhichHand: automatic recognition of a smartphone’s position in the hand using a smartwatch,” in Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct, pp. 675–681, New York, NY, USA, 2016. View at: Google Scholar
  69. L. E. Diez, A. Bahillo, A. D. Masegosa et al., “Signal processing requirements for step detection using wrist-worn IMU,” in 2015 International Conference on Electromagnetics in Advanced Applications (ICEAA), pp. 1032–1035, September 2015. View at: Publisher Site | Google Scholar
  70. A. Yazdan, R. Prance, H. Prance, and D. Roggen, “Wearable electric potential sensing: a new modality sensing hair touch and restless leg movement,” in UbiComp’16 Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct, pp. 846–850, New York:, 2016. View at: Google Scholar
  71. P. Vepakomma, D. De, S. K. Das, and S. Bhansali, “A-Wristocracy: deep learning on wrist-worn sensing for recognition of user complex activities,” in 2015 IEEE 12th International Conference on Wearable and Implantable Body Sensor Networks (BSN), pp. 1–6, June 2015. View at: Publisher Site | Google Scholar
  72. G. Valenza, M. Nardelli, A. Lanata et al., “Predicting mood changes in bipolar disorder through heartbeat nonlinear dynamics,” IEEE Journal of Biomedical and Health Informatics, vol. 20, no. 4, pp. 1034–1043, 2016. View at: Publisher Site | Google Scholar
  73. J. A. Rincon, Â. Costa, P. Novais, V. Julian, and C. Carrascosa, “Using non-invasive wearables for detecting emotions with intelligent agents,” in International Joint Conference SOCO’16-CISIS’16-ICEUTE’16, pp. 73–84, Springer, Cham, 2016. View at: Google Scholar
  74. F. Gravenhorst, A. Muaremi, J. Bardram et al., “Mobile phones as medical devices in mental disorder treatment: an overview,” Personal and Ubiquitous Computing, vol. 19, no. 2, pp. 335–353, 2015. View at: Publisher Site | Google Scholar
  75. L. Zhang, S. Rukavina, S. Gruss, H. C. Traue, and D. Hazer, “Classification analysis for the emotion recognition from psychobiological data,” in Proceedings of International Symposium on Companion-Technology, pp. 149–154, 2015. View at: Google Scholar
  76. A. M. AlzeerAlhouseini, I. F. T. Alshaikhli, A. Rahman, A. Wahab, and M. A. Dzulkifli, “Emotion detection using physiological signals EEG & ECG,” International Journal of Advancements in Computing Technology, vol. 8, no. 3, pp. 103–112, 2016. View at: Google Scholar
  77. R. Gravina and G. Fortino, “Automatic methods for the detection of accelerative cardiac defense response,” IEEE Transactions on Affective Computing, vol. 7, no. 3, pp. 286–298, 2016. View at: Publisher Site | Google Scholar
  78. A. Cinquepalmi and U. Straccia, “An ontology-based affective computing approach for passenger safety engagement on cruise ships”. View at: Google Scholar
  79. A. Zenonos, A. Khan, G. Kalogridis, S. Vatsikas, T. Lewis, and M. Sooriyabandara, “HealthyOffice: mood recognition at work using smartphones and wearable sensors,” in 2016 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops), pp. 1–6, March 2016. View at: Publisher Site | Google Scholar
  80. J. Vermeulen, L. MacDonald, J. Schöning, R. Beale, and S. Carpendale, “Heartefacts: augmenting mobile video sharing using wrist-worn heart rate sensors,” in Proceedings of the 2016 ACM Conference on Designing Interactive Systems, pp. 712–723, New York, NY, USA, 2016. View at: Google Scholar
  81. N. Costadopoulos, “Emotional intelligence via wearables: a method for detecting frustration,” Information Technology in Industry, vol. 4, no. 1, pp. 19–25, 2016. View at: Google Scholar
  82. S. Ramesh, “Using wearable technology to gain insight into children’s physical and social behaviors,” Massachusetts Institute of Technology, 2016. View at: Google Scholar
  83. Z. Zhu, H. F. Satizabal, U. Blanke, A. Perez-Uribe, and G. Troster, “Naturalistic recognition of activities and mood using wearable electronics,” IEEE Transactions on Affective Computing, vol. 7, no. 3, pp. 272–285, 2016. View at: Publisher Site | Google Scholar
  84. J. Hernandez, D. McDuff, C. Infante, P. Maes, K. Quigley, and R. Picard, “Wearable ESM: differences in the experience sampling method across wearable devices,” in Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services, pp. 195–205, New York, NY, USA, 2016. View at: Google Scholar
  85. A. Sano, A. Z. Yu, A. W. McHill et al., “Prediction of happy-sad mood from daily behaviors and previous sleep history,” in 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), pp. 6796–6799, August 2015. View at: Publisher Site | Google Scholar
  86. D. Figo, P. C. Diniz, D. R. Ferreira, and J. M. P. Cardoso, “Preprocessing techniques for context recognition from accelerometer data,” Personal and Ubiquitous Computing, vol. 14, no. 7, pp. 645–662, 2010. View at: Publisher Site | Google Scholar
  87. M. Shoaib, S. Bosch, O. Incel, H. Scholten, and P. Havinga, “A survey of online activity recognition using mobile phones,” Sensors, vol. 15, no. 1, pp. 2059–2085, 2015. View at: Publisher Site | Google Scholar

Copyright © 2018 Rasha M. Al-Eidan et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
