Computational and Mathematical Methods in Medicine

Research Article | Open Access

Volume 2016 |Article ID 4073584 | 17 pages | https://doi.org/10.1155/2016/4073584

Human Activity Recognition in AAL Environments Using Random Projections

Academic Editor: Ezequiel López-Rubio
Received: 08 Feb 2016
Revised: 29 Apr 2016
Accepted: 19 May 2016
Published: 20 Jun 2016

Abstract

Automatic human activity recognition systems aim to capture the state of the user and his or her environment by exploiting heterogeneous sensors attached to the subject's body, permitting continuous monitoring of numerous physiological signals that reflect the state of human actions. Successful identification of human activities can be immensely useful in healthcare applications for Ambient Assisted Living (AAL) and for automatic, intelligent activity monitoring systems developed for elderly and disabled people. In this paper, we propose a method for activity recognition and subject identification based on random projections from a high-dimensional feature space to a low-dimensional projection space, in which the classes are separated using the Jaccard distance between the probability density functions of the projected data. Two tasks from the HAR domain are considered: activity identification and subject identification. Experimental results obtained by applying the proposed method to Human Activity Dataset (HAD) data are presented.

1. Introduction

Societies in developed countries are rapidly aging. In 2006, almost 500 million people worldwide were 65 years of age or older. By 2030, the total number of older people is projected to increase to 1 billion. The most rapid increase in the aging population is occurring in the developing countries, which will see a jump of 140% by 2030 [1]. Moreover, the world's population is expected to reach 9.3 billion by 2050 [2], and people above 60 years of age will make up 28% of that population. Dealing with this situation will require huge financial resources to support the ever-increasing cost of living, as human life expectancy is expected to reach 81 years by 2100.

As older people may have disorders of body functions or suffer from age-related diseases, the need for smart health assistance systems increases each year. A common method of monitoring geriatric patients is physical observation, which is costly, requires substantial staff, and is increasingly infeasible in view of the massive population aging expected in the coming years. Many Ambient Assisted Living (AAL) applications such as care-providing robots, video surveillance systems, and assistive human-computer interaction technologies require human activity recognition. While the primary users of AAL systems are the senior (elderly) people, the concept also applies to mentally and physically impaired people, to people suffering from diabetes and obesity who may need assistance at home, and to people of any age interested in personal fitness monitoring. As a result, sensor-based real-time monitoring systems to support independent living at home have been the subject of many recent research studies in the human activity recognition (HAR) domain [3–10].

Activity recognition can be defined as the process of interpreting sensor data to classify a set of human activities [11]. HAR is a rapidly growing area of research that can provide valuable information on the health, wellbeing, and fitness of monitored persons outside a hospital setting. Daily activity recognition using wearable technology plays a central role in the field of pervasive healthcare [12]. HAR has gained increased attention in the last decade due to the arrival of affordable and minimally invasive mobile sensing platforms such as smartphones. Smartphones are innovative platforms for HAR because of the availability of different wireless interfaces, unobtrusiveness, ease of use, high computing power and storage, and the availability of sensors, such as accelerometer, compass, and gyroscope, which meet the technical and practical hardware requirements for HAR tasks [13–15]. Moreover, possibilities for developing other applications, including virtual reality systems, continue to arise. Therefore, these devices present a great opportunity for developing innovative technology dedicated to AAL systems.

One of the key motivating factors for using mobile phone-based human activity recognition in AAL systems is the correlation between a person's level of physical activity and level of wellbeing. Recording and analysing precise information on a person's activities is beneficial for tracking the progress and status of a disease (or mental condition), can potentially improve the treatment of the person's conditions and diseases, and can decrease the cost of care. Recognizing indoor and outdoor activities such as walking, running, or cycling can be useful for providing feedback to the caregiver about the patient's behaviour. By following the daily habits and routines of users, one can easily identify deviations from those routines, which can assist doctors in diagnosing conditions that would not be observed during a routine medical examination. Another key enabler of HAR technology is the possibility of providing independent living for the elderly as well as for patients with dementia and other mental pathologies, who could be monitored to prevent undesirable consequences of abnormal activities. Furthermore, by using persuasive techniques and gamification, HAR systems can be designed to interact with users to change their behaviour and lifestyles towards more active and healthier ones [16].

Recently, various intelligent systems based on mobile technologies have been constructed. HAR using smartphones or other types of portable or wearable sensor platforms has been used for assessing movement quality after stroke [17], such as upper extremity motion [18], for assessing gait characteristics of human locomotion for rehabilitation and diagnosis of medical conditions [19], for postoperative mobilization [20], for detecting Parkinson’s disease, back pain, and hemiparesis [21], for cardiac rehabilitation [22], for physical therapy, for example, checking whether a user is correctly doing the exercises recommended by a physician [23, 24], for detecting abnormal activities arising due to memory loss for dementia care [25, 26], for dealing with Alzheimer’s disease [27] and neurological diseases such as epilepsy [28], for assessment of physical activity for children and adolescents suffering from hyperlipidaemia, hypertension, cardiovascular disease, and type 2 diabetes [29], for detecting falls [30, 31], for addressing physical inactivity when dealing with obesity [32], for analysing sleeping patterns [33], for estimating a person’s energy expenditure to assess his/her healthy daily lifestyle [34], and for recognizing the user’s intent in the domain of rehabilitation engineering, such as smart walking support systems that assist motor-impaired persons and the elderly [35].

In this paper, we propose a new method for offline recognition of daily human activities, based on reducing feature dimensionality using random projections [36] to a low-dimensional feature space and on using the Jaccard distance between kernel probability density estimates as the decision function for classifying human activities.
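To make the idea concrete, here is a minimal Python sketch of this scheme on synthetic data. The class distributions, projection dimension, kernel bandwidth, and all names are illustrative assumptions, not the configuration used in the experiments of this paper:

```python
import numpy as np

# Illustrative sketch (synthetic data, not the authors' exact pipeline):
# random projection to a low-dimensional space, per-class kernel density
# estimation, and classification by the smaller Jaccard distance.
rng = np.random.default_rng(0)

def random_projection_matrix(d, k, rng):
    """Gaussian random projection matrix from d to k dimensions."""
    return rng.normal(size=(d, k)) / np.sqrt(k)

def kde_on_grid(samples, grid, bandwidth=2.0):
    """Gaussian kernel density estimate of 1-D samples, evaluated on a grid."""
    diff = (grid[:, None] - samples[None, :]) / bandwidth
    return np.exp(-0.5 * diff**2).mean(axis=1) / (bandwidth * np.sqrt(2.0 * np.pi))

def jaccard_distance(p, q):
    """Jaccard distance between two densities on a common grid:
    1 - integral(min(p, q)) / integral(max(p, q))."""
    return 1.0 - np.minimum(p, q).sum() / np.maximum(p, q).sum()

# Two synthetic "activity" classes in a 99-dimensional feature space
# (99 matches the feature count used later in the paper; the class
# distributions are made up for illustration).
X_walk = rng.normal(0.0, 1.0, size=(200, 99))  # hypothetical class 1
X_run = rng.normal(0.0, 3.0, size=(200, 99))   # hypothetical class 2

R = random_projection_matrix(99, 1, rng)       # project to one dimension
grid = np.linspace(-100.0, 100.0, 801)
dens_walk = kde_on_grid((X_walk @ R).ravel(), grid)
dens_run = kde_on_grid((X_run @ R).ravel(), grid)

# New samples drawn from class 1 should be closer, in Jaccard distance,
# to the class-1 density than to the class-2 density.
X_new = rng.normal(0.0, 1.0, size=(50, 99))
dens_new = kde_on_grid((X_new @ R).ravel(), grid)
d_walk = jaccard_distance(dens_new, dens_walk)
d_run = jaccard_distance(dens_new, dens_run)
print(d_walk < d_run)
```

Note that the Jaccard distance here compares whole estimated densities rather than individual points, so classes that overlap pointwise can still be separated by the shape of their distributions.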

The structure of the remaining parts of the paper is as follows. Section 2 presents the overview of related work in the smartphone-based HAR domain with a particular emphasis on the features extracted from the sensor data. Section 3 describes the proposed method. Section 4 evaluates and discusses the results. Finally, Section 5 presents the conclusions and discusses future work.

2. Related Work

All tasks in the HAR domain require correct identification of human activities from sensor data, which, in turn, requires that the features derived from sensor data be properly categorized and described. Next, we present an overview of features used in the HAR domain.

2.1. Features

While numerous features can be extracted from physical activity signals, increasing the number of features does not necessarily increase classification accuracy, since features may be redundant or may not be class-specific:

(i) Time-domain features (such as mean, median, variance, standard deviation, minimum, maximum, and root mean square, applied to the amplitude and time dimensions of a signal) are typically used in many practical HAR systems because they are computationally inexpensive; thus, they can be easily extracted in real time.

(ii) Frequency-domain features can distinguish between different human activities but require higher computational cost; thus, they may not be suitable for real-time AAL applications.

(iii) Physical features are derived from a fundamental understanding of how a certain human movement produces a specific sensor signal. They are usually extracted from multiple sensor axes, based on the physical parameters of human movements.
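As an illustration of the first group, a minimal time-domain feature extractor for one window of a single sensor axis could look as follows (the window values and the selection of features are illustrative):

```python
import numpy as np

def time_domain_features(window):
    """Compute common time-domain features for one window of samples
    from a single accelerometer or gyroscope axis."""
    w = np.asarray(window, dtype=float)
    return {
        "mean": w.mean(),
        "median": float(np.median(w)),
        "variance": w.var(),
        "std": w.std(),
        "min": w.min(),
        "max": w.max(),
        "rms": float(np.sqrt(np.mean(w ** 2))),  # root mean square
    }

# A toy 5-sample window of accelerometer readings (arbitrary values).
feats = time_domain_features([0.1, 0.4, -0.2, 0.9, 0.3])
print(feats["mean"], feats["max"])
```

Because each feature is a single pass over the window, such extractors run comfortably in real time on a smartphone.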

Based on an extensive analysis of the literature and of the features used by other authors (especially Capela et al. [17], Mathie et al. [37], and Zhang and Sawchuk [38]), we have extracted 99 features from the data, which are detailed in Table 1.


Table 1

Feature number  Description
4–6  Acceleration (x-, y-, and z-axes)
7–9  Gyroscope (x-, y-, and z-axes)
10–15  Moving variance of 100 samples of acceleration and gyroscope data
16–17  Movement intensity of acceleration and gyroscope data
18  Movement intensity of difference between acceleration and gyroscope data
19–21  Moving variance of 100 samples of movement intensity data
22–24  Polar coordinates of acceleration data
25–27  Polar coordinates of gyroscope data
28–30  Polar coordinates of difference between acceleration and gyroscope data
31  Simple moving average of acceleration data
32  Simple moving average of gyroscope data
33  Simple moving average of difference between acceleration and gyroscope data
34  First eigenvalue of moving covariance of acceleration data
35  First eigenvalue of moving covariance of gyroscope data
36  First eigenvalue of moving covariance of difference between acceleration and gyroscope data
37–42  Moving energy of acceleration and gyroscope data
43–48  Difference between moving maximum and moving minimum of acceleration and gyroscope data
49  Moving correlation between x- and y-axes of acceleration data
50  Moving correlation between x- and z-axes of acceleration data
51  Moving correlation between y- and z-axes of acceleration data
52  Moving correlation between x- and y-axes of gyroscope data
53  Moving correlation between x- and z-axes of gyroscope data
54  Moving correlation between y- and z-axes of gyroscope data
55–57  Projection of gyroscope data onto acceleration data
58  Moving mean of orientation vector of acceleration data
59  Moving variance of orientation vector of acceleration data
60  Moving energy of orientation vector of acceleration data
61–63  Moving energy of difference between acceleration and gyroscope data
64  Moving energy of difference between x- and y-axes of acceleration data
65  Moving energy of difference between x- and z-axes of acceleration data
66  Moving energy of difference between y- and z-axes of acceleration data
67  Moving mean of orientation vector of difference between acceleration and gyroscope data
68  Moving variance of orientation vector of difference between acceleration and gyroscope data
69  Moving energy of orientation vector of difference between acceleration and gyroscope data
70  Moving mean of orientation vector of gravity data
71  Moving variance of orientation vector of gravity data
72  Moving energy of orientation vector of gravity data
73  Moving mean of orientation vector of difference between acceleration and gravity data
74  Moving variance of orientation vector of difference between acceleration and gravity data
75  Moving energy of orientation vector of difference between acceleration and gravity data
76–81  Moving cumulative sum of acceleration and gyroscope data
82  Simple moving average of cumulative sums of acceleration data
83  Simple moving average of cumulative sums of gyroscope data
84  Simple moving average of cumulative sums of difference between accelerometer and gyroscope data
85–90  Moving 2nd-order cumulative sum of acceleration and gyroscope data
91–93  Moving 2nd-order cumulative sum of differences between cumulative sums of acceleration and gyroscope data
94–96  Polar coordinates of moving cumulative sum of acceleration data
97–99  Polar coordinates of moving cumulative sum of gyroscope data
100–102  Polar coordinates of moving cumulative sum of differences between acceleration and gyroscope data
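As an illustration of how several of the feature families in Table 1 can be computed, the sketch below implements a moving variance, the movement intensity, polar (spherical) coordinates, and the moving cumulative sum for a window of synthetic tri-axial data. The window length, variable names, and random data are assumptions for illustration, and the sliding statistics are written in their simplest form:

```python
import numpy as np

# Synthetic stand-in for one window of tri-axial accelerometer samples
# (shape: n samples x 3 axes); real data would come from the sensor.
rng = np.random.default_rng(1)
acc = rng.normal(0.0, 1.0, size=(300, 3))

def moving_variance(x, w=100):
    """Moving variance over a sliding window of w samples
    (in the spirit of features 10-15)."""
    return np.array([x[i:i + w].var() for i in range(len(x) - w + 1)])

def movement_intensity(samples):
    """Euclidean norm of each tri-axial sample
    (in the spirit of features 16-17)."""
    return np.linalg.norm(samples, axis=1)

def to_polar(samples):
    """Spherical coordinates (r, theta, phi) of each tri-axial sample
    (in the spirit of features 22-24); assumes r > 0."""
    x, y, z = samples[:, 0], samples[:, 1], samples[:, 2]
    r = np.sqrt(x**2 + y**2 + z**2)
    theta = np.arccos(np.clip(z / r, -1.0, 1.0))
    phi = np.arctan2(y, x)
    return np.column_stack([r, theta, phi])

mv = moving_variance(acc[:, 0])
mi = movement_intensity(acc)
polar = to_polar(acc)
csum = acc.cumsum(axis=0)  # moving cumulative sum (features 76-81 style)
print(mv.shape, mi.shape, polar.shape, csum.shape)
```

The sliding statistics above recompute each window from scratch; a production implementation on a smartphone would update them incrementally to stay within a real-time budget.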